Article

# A Note on Upper Tail Behavior of Liouville Copulas

by Lei Hua
Division of Statistics, Northern Illinois University, DeKalb, IL 60115, USA
Risks 2016, 4(4), 40; https://doi.org/10.3390/risks4040040
Received: 15 September 2016 / Revised: 4 October 2016 / Accepted: 3 November 2016 / Published: 8 November 2016

## Abstract

The family of Liouville copulas is defined as the survival copulas of multivariate Liouville distributions, and it covers the Archimedean copulas constructed by Williamson’s d-transform. Liouville copulas provide a very wide range of dependence in the upper tails, ranging from positive to negative dependence, and they can be useful in modeling tail risks. In this article, we study the upper tail behavior of Liouville copulas through their upper tail orders. We first derive the tail orders of a more general scale mixture model that covers Liouville distributions, and then derive the tail order functions and tail order density functions of Liouville copulas. Concrete examples are given after the main results.

## 1. Introduction

Recently, the notion of Liouville copula has been introduced in [1]; it is defined as the survival copula of a Liouville distribution (see [2]). The Liouville copula includes the Archimedean copulas constructed by Williamson’s d-transform ψ as special cases (see [3]), and it inherits the dependence structure of a multivariate Liouville distribution, which can be represented as a scale mixture model.
As a copula model for the dependence of risks, Liouville copulas can account for various strengths of dependence in the upper tail, ranging from tail negative dependence to the usual tail dependence case. We refer to [4] for the tail behavior of those Archimedean copulas that are special cases of Liouville copulas, and to [5] for the conditions that lead to a wide range of strengths of dependence in the upper tail of such Archimedean copulas, with applications in modeling tail negative dependence between loss severity and loss frequency. Most recently, Raymond-Belzile [6] studied the extreme value copulas of Liouville copulas when asymptotic dependence is present.
In this paper, we study the upper tail behavior of Liouville copulas in terms of the following two asymptotic expressions. Studying the lower tail of a copula is equivalent to studying the upper tail of its survival copula; so, in what follows, in order to study the upper tail of Liouville copulas, we consider the lower tail of survival Liouville copulas. Let C be a survival Liouville copula and c its copula density function; then, for any $w_1,\dots,w_d>0$, we are interested in the following:
$C(uw_1,\dots,uw_d) \sim u^{\kappa}\,\ell(u)\,b(w_1,\dots,w_d),\quad u\to0^+,\ \kappa\ge1; \qquad c(uw_1,\dots,uw_d) \sim u^{\kappa-d}\,\ell(u)\,\lambda(w_1,\dots,w_d),\quad u\to0^+,\ \kappa\ge1,$
where $\ell(u)$ is a slowly varying function, κ is referred to as the (lower) tail order of the copula C, $b(w_1,\dots,w_d)$ is the (lower) tail order function, and $\lambda(w_1,\dots,w_d)$ is referred to as the (lower) tail order density function of C. We refer to [7,8] for details about the notion of tail orders. The tail order of a copula corresponds to $1/\eta$, with η proposed in [9] as a “coefficient of tail dependence” when the univariate marginals follow the unit Fréchet distribution. We refer to [9,10,11,12,13] for some relevant theories and applications.
Depending on the value of κ and the limit of $\ell(u)$ as $u\to0^+$, various tail dependence patterns can be captured. For the bivariate case, $\kappa=1$ with $\ell(u)\to\lambda>0$ coincides with the usual tail dependence defined in [14]; $\kappa=1$ with $\ell(u)\to0$, $1<\kappa<2$, or $\kappa=2$ with $\ell(u)\to\infty$ is referred to as intermediate tail dependence; $\kappa=2$ with $\ell(u)\to\lambda$, $0<\lambda<\infty$, is referred to as tail quadrant independence; and $\kappa=2$ with $\ell(u)\to0$, or $\kappa>2$, is referred to as tail negative dependence. For example, a bivariate Gaussian copula has both upper and lower tail orders equal to $\kappa=2/(1+\rho)$, where $-1<\rho<1$ is the correlation coefficient of the Gaussian copula. Clearly, $0<\rho<1$ leads to intermediate tail dependence, $\rho=0$ leads to tail quadrant independence, and $-1<\rho<0$ leads to tail negative dependence. Parametric models that have a wide range of tail orders are important for statistical inference on dependence in the tails. For instance, in [15], a regression analysis has been conducted based on a copula that has full-range upper tail dependence, where the tail order is linked to covariates so that dynamic upper tail dependence patterns can be appropriately captured.
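The Gaussian benchmark can be made concrete numerically. The following sketch (illustrative only; it assumes NumPy and SciPy are available, and the parameter choices are ours) estimates the lower tail order of a bivariate Gaussian copula from $\log C(u,u)/\log u$ at a small u, which should approach $2/(1+\rho)$:

```python
# Illustrative check (assumes NumPy/SciPy; parameters are ours): the lower
# tail order of a bivariate Gaussian copula satisfies kappa = 2/(1 + rho),
# and log C(u,u) / log u approximates kappa for small u.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def gaussian_copula_diag(u, rho):
    """C(u,u) = P[X <= z, Y <= z] for a standard bivariate normal, z = Phi^{-1}(u)."""
    z = norm.ppf(u)
    # P[X <= z, Y <= z] = int_{-inf}^{z} phi(x) * Phi((z - rho*x)/sqrt(1 - rho^2)) dx
    f = lambda x: norm.pdf(x) * norm.cdf((z - rho * x) / np.sqrt(1.0 - rho ** 2))
    return quad(f, -40.0, z, epsabs=0.0, epsrel=1e-10, limit=200)[0]

u = 1e-4
kappa_hat = np.log(gaussian_copula_diag(u, 0.5)) / np.log(u)    # target 2/1.5 = 4/3
kappa_indep = np.log(gaussian_copula_diag(u, 0.0)) / np.log(u)  # target 2 (rho = 0)
```

The estimate at ρ = 0.5 sits slightly above 4/3 because of the slowly varying factor $\ell(u)$; it tightens as u decreases.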
Since the Liouville copula is the survival copula of the scale mixture representation $\mathbf X \stackrel{d}{=} R\,\mathbf S$, where R is the scaling/radial random variable and $\mathbf S$ follows a Dirichlet distribution, the tail behavior of Liouville copulas is largely determined by the interaction between the tail behavior of R and that of the $S_i$’s. We will first consider a more general scale mixture model in which $\mathbf S$ does not necessarily follow a Dirichlet distribution. Conditions leading to different tail orders of such scale mixture models are derived. Then, as a special case, Liouville copulas are studied in detail: tail order functions and tail order densities are further derived for them. The results contribute to the understanding of the upper tails of this large family of copulas.
The paper is organized as follows. In Section 2, notation and basic concepts will be introduced. Section 3 studies the tail order of a more general scale mixture model, and tail order functions of Liouville copulas are derived in Section 4. Section 5 contains results on tail order density functions of Liouville copulas. Finally, Section 6 concludes the paper.

## 2. Notation

A measurable function f is said to be regularly varying at $t_0\in[-\infty,+\infty]$ with index α if $\lim_{t\to t_0} f(xt)/f(t) = x^{\alpha}$ for any $x>0$; this is denoted as $f\in \mathrm{RV}_{\alpha}(t_0)$, and when $t_0=\infty$, it is simplified as $f\in \mathrm{RV}_{\alpha}$. If $\alpha=0$, then such an f is said to be slowly varying at $t_0$, and $\ell$ is used to denote a slowly varying function. For a random variable X, if its survival function $\bar F\in \mathrm{RV}_{-\alpha}$ with $\alpha>0$, then we write $X\in \mathrm{RV}_{-\alpha}$. Two random variables X and Y supported on $[0,\infty)$ are said to be tail equivalent if $\bar F_X(t)\sim\bar F_Y(t)$ as $t\to\infty$, where the notation $g(t)\sim h(t)$, $t\to t_0$, means that $\lim_{t\to t_0} g(t)/h(t)=1$. Some partial derivatives of copula functions are defined as follows: $C_{2|1}(u,v):=D_u C(u,v)$, $C_{1|2}(u,v):=D_v C(u,v)$, and $c(u,v):=D_{uv}C(u,v)$. For a copula C, its lower tail order is denoted as $\kappa_L(C)$. For two real constants x and y, $x\wedge y:=\min\{x,y\}$ and $x\vee y:=\max\{x,y\}$.

## 3. Tail Order of a Scale Mixture Model

Copula functions can be constructed by inverting the univariate cdfs of a random vector. Let $X : = ( X 1 , ⋯ , X d )$ be a random vector with univariate marginal cdf $F i$ and joint cdf F, then a copula C can be induced by $X$ as $C ( u 1 , ⋯ , u d ) = F ( F 1 - 1 ( u 1 ) , ⋯ , F d - 1 ( u d ) )$. Now, we assume that $X$ has a scale mixture representation as follows:
$\mathbf X \stackrel{d}{=} R\,\mathbf S = R\,(S_1,\dots,S_d),$
where the radial/scaling random variable $R\in(0,\infty)$, and $(S_1,\dots,S_d)$ is a nonnegative random vector. This representation is commonly used for constructing multivariate models. For example, for Archimedean copulas constructed by Williamson’s d-transforms (see [3]), $\mathbf S$ follows the uniform distribution on the unit simplex; for Archimedean copulas constructed by Laplace transforms of positive random variables, the $1/S_i$’s are independent standard exponential random variables; and for Liouville copulas, $\mathbf S$ follows a Dirichlet distribution [1].
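A minimal simulation sketch of the scale mixture (1) follows (illustrative; it assumes NumPy, and the Gamma radial part and Dirichlet $\mathbf S$ are our choices for concreteness):

```python
# Illustrative sketch (assumes NumPy; parameter choices are ours): draw from
# the scale mixture X = R * S of Equation (1) and form copula-scale
# pseudo-observations of the induced copula by rank transformation.
import numpy as np

rng = np.random.default_rng(2016)
n, d = 5000, 3
R = rng.gamma(2.0, 1.0, size=n)          # radial variable R ~ Gamma(2, 1)
S = rng.dirichlet(np.ones(d), size=n)    # uniform on the unit simplex (||S||_1 = 1)
X = R[:, None] * S                       # scale mixture sample

# Empirical-rank pseudo-observations of the copula induced by X.
U = (np.argsort(np.argsort(X, axis=0), axis=0) + 1) / (n + 1.0)
```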
Proposition 1.
Let a random vector $X$ be defined as (1), and the induced copula be C. Suppose that $1 / R ∈ RV - α , α > 0$, $1 / S i$’s are tail equivalent with $1 / S i ∈ RV - ξ , ξ > 0$, and there exists an $ϵ > 0$ such that $S * ≥ ϵ$, where $S * : = max { S 1 , ⋯ , S d }$. Then, $κ L ( C ) = max { 1 , α / ξ }$.
Proof.
Let $F i$ be the univariate cdf for $X i$, $i = 1 , ⋯ , d$ and F be the joint cdf. It suffices to study the lower tail of $X$. Due to Equation (1) of [8], the lower tail order κ of $X$ can be written as
$\kappa = \lim_{u\to0^+}\frac{\log P[X_1\le F_1^{-1}(u),\dots,X_d\le F_d^{-1}(u)]}{\log P[X_i\le F_i^{-1}(u)]}.$
Let $y : = 1 / x : = 1 / F i - 1 ( u )$, $T : = 1 / R$, and $t : = s y$ in what follows.
If $\alpha<\xi$, then $1/S_i\in \mathrm{RV}_{-\xi}$ implies that there exists $\delta>0$ such that $E[(1/S_i)^{\alpha+\delta}]<\infty$. By Breiman’s theorem [16],
$P [ T / S i > y ] = P [ 1 / X i > y ] ∼ E [ ( 1 / S i ) α ] P [ T > y ] , y → ∞ .$
If $\alpha>\xi$, then similarly, $P[1/X_i>y]\sim E[T^{\xi}]\,P[(1/S_i)>y]$ as $y\to\infty$.
If $\alpha=\xi$, then $1/X_i\in \mathrm{RV}_{-\alpha}$; see [17].
Letting $s * = max { s 1 , ⋯ , s d }$, we have
$P [ X 1 ≤ x , ⋯ , X d ≤ x ] = P [ R S 1 ≤ x , ⋯ , R S d ≤ x ] = ∫ P [ T ≥ s * / x ] F S ( d s 1 , ⋯ , d s d ) .$
Since $P [ T ≥ · ] ∈ RV - α$, there exists a slowly varying function $ℓ T ( · )$ such that $P [ T ≥ t ] = t - α ℓ T ( t )$. Also, the condition $s * ≥ ϵ > 0$ implies that, as $y → ∞$, $P [ T > s * y ] / P [ T > y ] → s * - α$ uniformly in $s * ∈ [ ϵ , ∞ )$. Therefore,
$-\infty < \lim_{y\to\infty}\log\int \frac{P[T\ge s^* y]}{P[T>y]}\,F_{\mathbf S}(ds_1,\dots,ds_d) = \log\int (s^*)^{-\alpha}\,F_{\mathbf S}(ds_1,\dots,ds_d) \le -\alpha\log\epsilon < \infty.$
Then,
$\kappa := \lim_{x\to0^+}\frac{\log(P[X_1\le x,\dots,X_d\le x])}{\log(P[X_i\le x])} = \lim_{y\to\infty}\frac{\log P[T>y] + \log\int \frac{P[T\ge s^* y]}{P[T>y]}\,F_{\mathbf S}(ds_1,\dots,ds_d)}{\log(P[1/X_i>y])} = \lim_{y\to\infty}\frac{-\alpha\log y + \log(\ell_T(y))}{\log(P[1/X_i>y])} = \begin{cases} 1, & \alpha\le\xi;\\ \alpha/\xi, & \alpha>\xi,\end{cases}$
due to the fact that $lim y → ∞ log ( ℓ ( y ) ) / log ( y ) = 0$ based on Proposition 1.3.6 (i) of [18], which completes the proof.  ☐
Remark 1.
In Proposition 1, $S * = max { S 1 , ⋯ , S d }$ is required to be bounded away from 0. The condition is actually very mild, and it is satisfied as long as the point $0$ is excluded from the support of $S$. For example, Dirichlet distributions satisfy the condition, and a uniform distribution on the surface of a unit ball truncated to be within the positive orthant satisfies the condition as well. The condition leading to $κ L ( C ) = 1$ can be further relaxed. We only need to require that $P [ 1 / S i > y ] = o ( P [ T > y ] )$ as $y → ∞$, and $P [ 1 / S i > · ]$ is not necessarily regularly varying; see Lemma 4.1 of [19].
Remark 2.
In Proposition 1 and what follows throughout the paper, all univariate marginals are assumed to be tail equivalent in the sense that $F ¯ i ( t ) ∼ F ¯ j ( t )$ and $f i ( t ) ∼ f j ( t )$ as $t → ∞$ or $t → 0 +$, depending on the context. Otherwise, the tail order κ can not be calculated as in Equation (2). For cases where univariate marginals are not tail equivalent, one needs to first transform them into those that are tail equivalent, and then similar techniques here can be applied.
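Proposition 1 can be sanity-checked numerically in dimension $d=2$ (illustrative sketch; it assumes SciPy, and the choices of α, ξ, and the evaluation point are ours). With $R\sim\mathrm{Gamma}(\alpha,1)$ and $\mathbf S=(S_1,1-S_1)$, $S_1\sim\mathrm{Beta}(\xi,\xi)$, the ratio $\log P[X_1\le x, X_2\le x]/\log P[X_1\le x]$ should approach $\max\{1,\alpha/\xi\}$ as $x\to0^+$:

```python
# Numerical sanity check of Proposition 1 for d = 2 (illustrative; assumes
# SciPy): with 1/R in RV_{-alpha} (R ~ Gamma(alpha, 1)) and S1 ~ Beta(xi, xi),
# log P[X1<=x, X2<=x] / log P[X1<=x] should approach max{1, alpha/xi}.
import numpy as np
from scipy.stats import gamma, beta
from scipy.integrate import quad

def kappa_hat(alpha, xi, x):
    # Joint: P[X1<=x, X2<=x] = E[F_R(x / max(S1, 1-S1))]; note max(S1, 1-S1) >= 1/2.
    joint = quad(lambda s: gamma.cdf(x / max(s, 1.0 - s), alpha) * beta.pdf(s, xi, xi),
                 0.0, 1.0, points=[0.5], epsabs=0.0, epsrel=1e-9, limit=200)[0]
    # Marginal: P[X1<=x] = E[F_R(x / S1)]; integrate in log s to resolve s ~ x.
    marg = quad(lambda v: gamma.cdf(x / np.exp(v), alpha) * beta.pdf(np.exp(v), xi, xi) * np.exp(v),
                np.log(x) - 15.0, 0.0, epsabs=0.0, epsrel=1e-9, limit=400)[0]
    return np.log(joint) / np.log(marg)

k_neg = kappa_hat(alpha=3.0, xi=1.5, x=1e-6)   # alpha > xi: expect about alpha/xi = 2
k_dep = kappa_hat(alpha=0.5, xi=1.5, x=1e-6)   # alpha < xi: expect about 1
```

Convergence in x is slow (the bias is of order $1/\log x$), so the estimates are close to, but not exactly at, the limits.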

## 4. Tail Order Function of Liouville Copulas

The family of Liouville distributions is an important multivariate distribution family that induces very flexible multivariate dependence structures. The corresponding Liouville copulas have been studied in [1]. We now study the upper tail orders of Liouville copulas.
A random vector $X$ on $R + d$ is said to follow a Liouville distribution if it has the following scale mixture representation [2]:
$\mathbf X \stackrel{d}{=} R\,\mathbf S_{\xi_1,\dots,\xi_d},\qquad \xi_i>0,\ i=1,\dots,d,$
where the random vector $\mathbf S_{\xi_1,\dots,\xi_d} := (S_1,\dots,S_d)$ follows a Dirichlet distribution on the unit simplex $\mathcal S_{d-1}:=\{\mathbf x\ge0: \|\mathbf x\|=1\}$, with the $l_1$ norm $\|\mathbf x\|:=\sum_{i=1}^d x_i$. Note that a Dirichlet distribution admits the following representation: for $i=1,\dots,d$, $S_i = Z_i/\|\mathbf Z\|$, where $\mathbf Z:=(Z_1,\dots,Z_d)$, the $Z_i$’s are independent, and $Z_i$ follows Gamma$(\xi_i,1)$. Let C be the copula induced by $\mathbf X$; then its survival copula $\hat C$ is referred to as a Liouville copula in [1]. For the special case with $\xi_i\equiv1$, $\mathbf S_{\xi_1,\dots,\xi_d}$ becomes uniformly distributed on the unit simplex, and such a survival copula $\hat C$ becomes an Archimedean copula [3].
A study on the upper tail of a Liouville copula is dependent on the left tail of R, or equivalently, the right tail of $1 / R$. Based on Theorem 5.5 and Theorem 6.1 in [2], the density of a Liouville distribution is directly related to the density of R. We first extend the result of Proposition 1 of [5] to Liouville copulas in Proposition 2.
Proposition 2 (Upper tail order of Liouville copulas).
Let a random vector $X$ be defined as (4) with its induced copula C, where the radial random variable R is supported on $( 0 , ∞ )$ with $1 / R ∈ RV - α$ for $0 < α < ∞$. Further, let $ξ i = ξ > 0$, $i = 1 , ⋯ , d$. Then, $κ L ( C ) = max { 1 , α / ξ }$.
Proof.
For the ith univariate marginal $X_i$: since a univariate marginal of a Dirichlet distribution follows a Beta$(\xi, d\xi-\xi)$ distribution (see [2,20]), $1/S_i\in \mathrm{RV}_{-\xi}$. Also, $T:=1/R\in \mathrm{RV}_{-\alpha}$. Moreover, $S^*:=\max\{S_1,\dots,S_d\}\ge 1/d>0$. The claim then follows from Proposition 1.  ☐
Tail order κ can be a quantity to indicate the degree of dependence in the tail, with a larger κ leading to weaker dependence. Tail order functions, in addition, provide higher order approximations about the tails. Tail order functions of Liouville copulas are derived in Proposition 3.
Proposition 3 (Tail order functions of Liouville copulas).
Let $X$ be a random vector defined in Proposition 2, and C be its induced copula. Then, for any $w 1 , ⋯ , w d > 0$:
• If $0 < α < ξ$, then
$C(uw_1,\dots,uw_d) \sim u\times\frac{B(\xi,\,d\xi-\xi)}{B(\xi-\alpha,\,d\xi-\xi)}\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_i\{s_i^{-\alpha}w_i\}\,F_{\mathbf S}(d\mathbf s),\quad u\to0^+.$
• If $0 < ξ < α$, then
$C(uw_1,\dots,uw_d) \sim u^{\alpha/\xi}\,\ell(u)\times\left[\frac{\xi\,B(\xi,\,d\xi-\xi)}{E[T^{\xi}]}\right]^{\alpha/\xi}\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_i\{s_i^{-\alpha}w_i^{\alpha/\xi}\}\,F_{\mathbf S}(d\mathbf s),\quad u\to0^+,$
where $ℓ ( u ) = ℓ T ( 1 / F 1 - 1 ( u ) )$ with $P [ T > y ] = y - α ℓ T ( y )$ as $y → ∞$.
• If $0 < ξ = α$, and moreover, $E [ T ξ ] < ∞$, then expression (6) holds.
Proof.
Let $T=1/R$, and then $P[T\ge\cdot]\in \mathrm{RV}_{-\alpha}$. Write $P[T>y]=y^{-\alpha}\ell_T(y)$, where $\ell_T(y)$ is slowly varying as $y\to\infty$. If $\alpha<\xi$, and thus $\kappa=1$, then
$\lim_{u\to0^+}\frac{C(uw_1,\dots,uw_d)}{u} = \lim_{u\to0^+}\frac{P[X_1\le F_1^{-1}(uw_1),\dots,X_d\le F_d^{-1}(uw_d)]}{P[X_1\le F_1^{-1}(u)]} = \lim_{u\to0^+}\frac{\int_{\mathbf s\ge0,\,\|\mathbf s\|=1} P[T\ge\max_i\{s_i/F_i^{-1}(uw_i)\}]\,F_{\mathbf S}(d\mathbf s)}{\int_0^1 P[T\ge s_1/F_1^{-1}(u)]\,F_{S_1}(ds_1)}.$
Also, $F i - 1 ∈ RV 1 / α ( 0 + )$. Then,
$\eta_1(u) := \int_{\mathbf s\ge0,\,\|\mathbf s\|=1} P\big[T\ge\max_i\{s_i/F_i^{-1}(uw_i)\}\big]\,F_{\mathbf S}(d\mathbf s) = \int_{\mathbf s\ge0,\,\|\mathbf s\|=1} P\Big[T\ge\max_i\Big\{s_i\,\frac{F_1^{-1}(u)}{F_i^{-1}(uw_i)}\Big\}\times\frac{1}{F_1^{-1}(u)}\Big]\,F_{\mathbf S}(d\mathbf s) = P[T>y]\times\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\frac{P\big[T\ge\max_i\big\{s_i\,F_1^{-1}(u)/F_i^{-1}(uw_i)\big\}\,y\big]}{P[T>y]}\,F_{\mathbf S}(d\mathbf s).$
Let $η 2 ( u )$ be the denominator of (7), and then Equation (3) implies that
$\lim_{u\to0^+}\frac{\eta_1(u)}{\eta_2(u)} = \frac{1}{E[(1/S_i)^{\alpha}]}\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_i\{s_i^{-\alpha}w_i\}\,F_{\mathbf S}(d\mathbf s).$
Since $S i ∼ Beta ( ξ , d ξ - ξ )$, we have
$E[S_i^{-\alpha}] = [B(\xi,d\xi-\xi)]^{-1}\int_0^1 s^{-\alpha}\,s^{\xi-1}(1-s)^{d\xi-\xi-1}\,ds = \frac{B(\xi-\alpha,\,d\xi-\xi)}{B(\xi,\,d\xi-\xi)},$
which completes the proof of the first part.
When $\alpha>\xi$: from the proof of Proposition 2, $P[1/X_i>y]\sim E[T^{\xi}]\,P[(1/S_i)>y]$ as $y\to\infty$. Since $S_i$ follows $\mathrm{Beta}(\xi,d\xi-\xi)$, by the dominated convergence theorem,
$P[(1/S_i)>y] = [B(\xi,d\xi-\xi)]^{-1}\int_0^{1/y} s^{\xi-1}(1-s)^{d\xi-\xi-1}\,ds = [B(\xi,d\xi-\xi)]^{-1}\,y^{-\xi}\int_0^1 t^{\xi-1}(1-t/y)^{d\xi-\xi-1}\,dt \sim [\xi\,B(\xi,d\xi-\xi)]^{-1}\,y^{-\xi},\quad y\to\infty.$
Therefore,
$\lim_{u\to0^+}\frac{C(uw_1,\dots,uw_d)}{u^{\alpha/\xi}\,\ell_T(1/F_1^{-1}(u))} = \lim_{u\to0^+}\frac{P[X_1\le F_1^{-1}(uw_1),\dots,X_d\le F_d^{-1}(uw_d)]}{(P[X_1\le F_1^{-1}(u)])^{\alpha/\xi}\,\ell_T(1/F_1^{-1}(u))} = \lim_{y\to\infty}\frac{P[T>y]\times\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_i\{s_i^{-\alpha}w_i^{\alpha/\xi}\}\,F_{\mathbf S}(d\mathbf s)}{\big(E[T^{\xi}]\,[\xi\,B(\xi,d\xi-\xi)]^{-1}\,y^{-\xi}\big)^{\alpha/\xi}\,\ell_T(y)} = \left[\frac{\xi\,B(\xi,\,d\xi-\xi)}{E[T^{\xi}]}\right]^{\alpha/\xi}\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_i\{s_i^{-\alpha}w_i^{\alpha/\xi}\}\,F_{\mathbf S}(d\mathbf s),$
which completes the proof of the second part.
For the case with $ξ = α$, the condition $E [ T ξ ] < ∞$ guarantees that $P [ 1 / X i > y ] ∼ E [ T ξ ] P [ ( 1 / S i ) > y ]$, due to (10) and Lemma 4.2(3) of [19]. Therefore, the third part is proved. Note that, when $ξ = α$, $E [ ( 1 / S i ) α ]$ is not finite, and therefore a similar case to (5) does not hold.  ☐
Next, we give two examples in which R follows Gamma$(\alpha,1)$, where α is the shape parameter. Tail dependence as well as intermediate and negative tail dependence cases are derived.
Example 1 (Gamma-Liouville copula: tail dependence).
Let $R\sim \mathrm{Gamma}(\alpha,1)$, and let a 2-dimensional $\mathbf S$ follow a Dirichlet distribution on $\mathcal S_1$ with $\xi_1=\xi_2=\xi>\alpha$. Since the univariate marginal of such a Dirichlet distribution is Beta$(\xi,\xi)$, $E[(1/S_i)^{\alpha}] = B(\xi-\alpha,\xi)/B(\xi,\xi)$. Letting $\tilde w := (1+(w_2/w_1)^{1/\alpha})^{-1}$, then as $u\to0^+$, $C(uw_1,uw_2)\sim u\,b(w_1,w_2)$ with
$b(w_1,w_2) = \frac{B(\xi,\xi)}{B(\xi-\alpha,\xi)}\int_{0<s_1\le1}\min\{s_1^{-\alpha}w_1,\,(1-s_1)^{-\alpha}w_2\}\,F_{S_1}(ds_1) = \frac{1}{B(\xi-\alpha,\xi)}\left[\int_0^{\tilde w} s_1^{\xi-1}(1-s_1)^{-\alpha+\xi-1}\,w_2\,ds_1 + \int_{\tilde w}^1 s_1^{-\alpha+\xi-1}(1-s_1)^{\xi-1}\,w_1\,ds_1\right] = \frac{1}{B(\xi-\alpha,\xi)}\Big[B(\xi,\xi-\alpha)\,I_{\tilde w}(\xi,\xi-\alpha)\,w_2 + B(\xi-\alpha,\xi)\,[1-I_{\tilde w}(\xi-\alpha,\xi)]\,w_1\Big] = I_{\tilde w}(\xi,\xi-\alpha)\,w_2 + [1-I_{\tilde w}(\xi-\alpha,\xi)]\,w_1,$
where $I x ( α , β )$ is the regularized incomplete beta function.
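The closed form in Example 1 can be checked numerically against direct integration of the min expression (illustrative sketch; assumes SciPy, with α = 1, ξ = 2, w₁ = 1, w₂ = 2 as our test values):

```python
# Illustrative check of Example 1 (assumes SciPy; parameter values are ours):
# compare b(w1,w2) = I_w(xi, xi-alpha) w2 + [1 - I_w(xi-alpha, xi)] w1, with
# w = (1 + (w2/w1)^{1/alpha})^{-1}, against direct integration of
# min{s^{-alpha} w1, (1-s)^{-alpha} w2} under the Beta(xi, xi) law of S1.
import numpy as np
from scipy.special import betainc, beta as beta_fn
from scipy.integrate import quad

alpha, xi = 1.0, 2.0                 # the tail dependence case needs xi > alpha
w1, w2 = 1.0, 2.0
w_t = 1.0 / (1.0 + (w2 / w1) ** (1.0 / alpha))

# Closed form via the regularized incomplete beta function I_x(a, b) = betainc(a, b, x).
b_closed = betainc(xi, xi - alpha, w_t) * w2 + (1.0 - betainc(xi - alpha, xi, w_t)) * w1

# Direct numerical integration (the integrand has a kink at w_t).
f = lambda s: min(s ** -alpha * w1, (1.0 - s) ** -alpha * w2) * s ** (xi - 1) * (1.0 - s) ** (xi - 1)
b_numeric = quad(f, 0.0, 1.0, points=[w_t])[0] / beta_fn(xi - alpha, xi)
```

For these parameter values both routes give b = 2/3.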
Example 2 (Gamma-Liouville copula: intermediate and negative tail dependence).
We consider the same $\mathbf X$ as in Example 1 except that $\alpha>\xi$; then $\kappa_L(C)=\alpha/\xi>1$, which ranges from intermediate tail dependence to tail negative dependence. Also, if $R\sim \mathrm{Gamma}(\alpha,1)$, then $P[1/R>y]=y^{-\alpha}\ell_T(y)$ with $\ell_T(y)\to[\alpha\Gamma(\alpha)]^{-1}$, and $E[T^{\xi}]=\Gamma(\alpha-\xi)/\Gamma(\alpha)$. Therefore, based on Proposition 3, as $u\to0^+$,
$C(uw_1,uw_2) \sim [\alpha\Gamma(\alpha)]^{-1}\left[\frac{\xi\,\Gamma(\alpha)\,B(\xi,\xi)}{\Gamma(\alpha-\xi)}\right]^{\alpha/\xi}\int_{\mathbf s\ge0,\,\|\mathbf s\|=1}\min_{i=1,2}\{s_i^{-\alpha}w_i^{\alpha/\xi}\}\,F_{\mathbf S}(d\mathbf s)\times u^{\alpha/\xi}.$
It can be verified from (11) that we recover the special case $\xi=1$ for Archimedean copulas in (3.15) of [21], by taking $\beta=1$ there and the upper limit of the integration there as $w^{1/\alpha}$.

## 5. Tail Order Density of Liouville Copulas

In Section 2 of [22] and in [21], a uniform convergence condition is assumed for establishing tail order density functions in the asymptotic dependence and asymptotic independence cases, respectively. In [7], on the other hand, a heuristic argument using the monotone density theorem (Theorem 1.7.2 of [18]) is used for derivatives of a tail order function, and ultimate monotonicity of a function needs to be checked. Here, to avoid tedious arguments, we give a proof of the result for tail order functions only in the bivariate case; the proof is based on the typical arguments for the monotone density theorem. The multivariate case can be established similarly: $c(uw_1,\dots,uw_d)\sim u^{\kappa-d}\,\ell(u)\,D_{\mathbf w}\,b(w_1,\dots,w_d)$ as $u\to0^+$.
Proposition 4.
Let $C ( u , v )$ be a copula with $C ( u , u ) ∼ u κ ℓ ( u ) , κ ≥ 1$ , and $C ( u w 1 , u w 2 ) ∼ u κ ℓ ( u ) b ( w 1 , w 2 )$ as $u → 0 +$ for $w 1 , w 2 > 0$. Assume that $C ( u , v )$ is absolutely continuous with density $c ( u , v )$.
1.
If $C_1(x,v):=D_x C(x,v)$ is ultimately monotone as $x\to0^+$ for any $0<v\le1$, and $g(w_1,w_2):=\lim_{u\to0^+}\frac{C_1(uw_1,uw_2)}{u^{\kappa-1}\ell(u)}$ exists and is continuous in $w_1$, then $D_{w_1}b(w_1,w_2)$ exists and $g(w_1,w_2)=D_{w_1}b(w_1,w_2)$.
2.
Further, if $c(u,x):=D_x C_1(u,x)$ is ultimately monotone as $x\to0^+$ for any $0<u\le1$, and $h(w_1,w_2):=\lim_{u\to0^+}\frac{c(uw_1,uw_2)}{u^{\kappa-2}\ell(u)}$ exists and is continuous in $w_2$, then $D_{w_2}D_{w_1}b(w_1,w_2)$ exists and $h(w_1,w_2)=D_{w_2}D_{w_1}b(w_1,w_2)$.
Proof.
For the first part, let $C 1 ( u , v ) : = D u C ( u , v )$, then for any given $0 < v ≤ 1$, without loss of generality, assume that $C 1 ( x , v )$ is ultimately nondecreasing as $x → 0 +$. Then for a small $Δ w > 0$,
$\frac{u\,\Delta w\;C_1(uw_1+u\Delta w,\,uw_2)}{u^{\kappa}\ell(u)} \le \frac{C(uw_1+u\Delta w,\,uw_2)-C(uw_1,uw_2)}{u^{\kappa}\ell(u)} \le \frac{u\,\Delta w\;C_1(uw_1,uw_2)}{u^{\kappa}\ell(u)},$
which implies that
$\frac{C_1(uw_1+u\Delta w,\,uw_2)}{u^{\kappa-1}\ell(u)} \le \frac{C(uw_1+u\Delta w,\,uw_2)-C(uw_1,uw_2)}{\Delta w\;u^{\kappa}\ell(u)} \le \frac{C_1(uw_1,uw_2)}{u^{\kappa-1}\ell(u)}.$
Letting $u → 0 +$ and then $Δ w → 0 +$ in (12) implies that
$\lim_{\Delta w\to0^+}\liminf_{u\to0^+}\frac{C_1(uw_1+u\Delta w,\,uw_2)}{u^{\kappa-1}\ell(u)} \le D_{w_1}b(w_1,w_2) \le \limsup_{u\to0^+}\frac{C_1(uw_1,uw_2)}{u^{\kappa-1}\ell(u)}.$
Since $g ( w 1 , w 2 ) = lim u → 0 + C 1 ( u w 1 , u w 2 ) u κ - 1 ℓ ( u )$ exists and $g ( w 1 , w 2 )$ is continuous in $w 1$, (13) implies that $g ( w 1 , w 2 ) = D w 1 b ( w 1 , w 2 )$.
For the second part, let $c ( u , v ) : = D v C 1 ( u , v )$, then for any given $0 < u ≤ 1$, without loss of generality, assume that $c ( u , x )$ is ultimately nondecreasing as $x → 0 +$. Then for a small $Δ w > 0$,
$\frac{u\,\Delta w\;c(uw_1,\,uw_2+u\Delta w)}{u^{\kappa-1}\ell(u)} \le \frac{C_1(uw_1,\,uw_2+u\Delta w)-C_1(uw_1,uw_2)}{u^{\kappa-1}\ell(u)} \le \frac{u\,\Delta w\;c(uw_1,uw_2)}{u^{\kappa-1}\ell(u)},$
which implies that
$\frac{c(uw_1,\,uw_2+u\Delta w)}{u^{\kappa-2}\ell(u)} \le \frac{C_1(uw_1,\,uw_2+u\Delta w)-C_1(uw_1,uw_2)}{\Delta w\;u^{\kappa-1}\ell(u)} \le \frac{c(uw_1,uw_2)}{u^{\kappa-2}\ell(u)}.$
Letting $u → 0 +$ and then $Δ w → 0 +$ in (14) implies that
$\lim_{\Delta w\to0^+}\liminf_{u\to0^+}\frac{c(uw_1,\,uw_2+u\Delta w)}{u^{\kappa-2}\ell(u)} \le D_{w_2}D_{w_1}b(w_1,w_2) \le \limsup_{u\to0^+}\frac{c(uw_1,uw_2)}{u^{\kappa-2}\ell(u)}.$
Since $h ( w 1 , w 2 ) = lim u → 0 + c ( u w 1 , u w 2 ) u κ - 2 ℓ ( u )$ exists and $h ( w 1 , w 2 )$ is continuous in $w 2$, (15) implies that $h ( w 1 , w 2 ) = D w 2 D w 1 b ( w 1 , w 2 )$.
For examples of the usual tail dependence case, we refer to [22]. An intermediate tail dependence case is given in Example 3.
Example 3 (Lower tail of Gumbel copula).
Consider a bivariate Gumbel copula of the form $C(u,v)=\exp\{-A(-\log u,-\log v)\}$, where $A(x,y)=(x^{\delta}+y^{\delta})^{1/\delta}$, $1\le\delta<\infty$. Based on Example 2 of [7], $C(uw_1,uw_2)\sim u^{2^{1/\delta}}\,w_1^{2^{1/\delta-1}}\,w_2^{2^{1/\delta-1}}$ as $u\to0^+$ for $w_1,w_2>0$. Note that $C_1(x,v)=C(x,v)\big((-\ln x)^{\delta}+(-\ln v)^{\delta}\big)^{1/\delta-1}(-\ln x)^{\delta-1}x^{-1}$, which is decreasing in x. Moreover, as $u\to0^+$,
$C_1(uw_1,uw_2) \sim \big[u^{2^{1/\delta}}\,w_1^{2^{1/\delta-1}}\,w_2^{2^{1/\delta-1}}\big]\times\big[2^{1/\delta-1}\,u^{-1}\,w_1^{-1}\big] = 2^{1/\delta-1}\,u^{2^{1/\delta}-1}\,w_1^{2^{1/\delta-1}-1}\,w_2^{2^{1/\delta-1}}.$
So, the $g(w_1,w_2)$ corresponding to the one in Proposition 4 is $g(w_1,w_2)=2^{1/\delta-1}\,w_1^{2^{1/\delta-1}-1}\,w_2^{2^{1/\delta-1}}$. Similarly, the corresponding $h(w_1,w_2)=2^{2/\delta-2}\,w_1^{2^{1/\delta-1}-1}\,w_2^{2^{1/\delta-1}-1}$, and the ultimate monotonicity condition holds. Therefore, the copula density
$c(uw_1,uw_2) \sim 2^{2/\delta-2}\,u^{2^{1/\delta}-2}\,w_1^{2^{1/\delta-1}-1}\,w_2^{2^{1/\delta-1}-1},\quad u\to0^+.$
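Since the Gumbel copula has a closed form, the lower-tail expansion in Example 3 can be checked directly (illustrative sketch; assumes NumPy, with δ = 2 and our choices of w₁, w₂):

```python
# Illustrative check of Example 3 (assumes NumPy; parameters are ours): the
# exact Gumbel copula C(u,v) = exp(-((-log u)^d + (-log v)^d)^{1/d}) is
# compared with the lower-tail approximation u^{2^{1/d}} (w1*w2)^{2^{1/d - 1}}.
import numpy as np

def gumbel(u, v, delta):
    return np.exp(-((-np.log(u)) ** delta + (-np.log(v)) ** delta) ** (1.0 / delta))

delta, w1, w2, u = 2.0, 2.0, 0.5, 1e-8
approx = u ** (2.0 ** (1.0 / delta)) * (w1 * w2) ** (2.0 ** (1.0 / delta - 1.0))
ratio = gumbel(u * w1, u * w2, delta) / approx           # tends to 1 as u -> 0+
sym = gumbel(u, u, delta) / u ** (2.0 ** (1.0 / delta))  # exact equality at w1 = w2 = 1
```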
When the joint density of the multivariate model used to induce the copula is easier to work with directly, one can derive the tail order density function of the copula from that joint density; see [21] for more details. We now derive the tail order density function of Liouville copulas from the joint density functions of Liouville distributions.
Let $X$ be defined as (4), g be the density function of R in the model. Letting $ξ + : = ∑ i = 1 d ξ i$, then the joint density function of $X$ is (see, [2])
$f(\mathbf x) = \frac{\Gamma(\xi_+)}{\|\mathbf x\|^{\xi_+-1}}\,g(\|\mathbf x\|)\prod_{i=1}^{d}\frac{x_i^{\xi_i-1}}{\Gamma(\xi_i)},\quad \mathbf x\in\mathbb R_+^d.$
Proposition 5.
Let a random vector $X$ be defined as (4) with $ξ i ≡ ξ > 0$, and its induced copula C, where the radial random variable R is supported on $( 0 , ∞ )$ with $1 / R ∈ RV - α$ for $0 < α < ∞$. Then, there exists a slowly varying function $ℓ ∈ RV 0 ( 0 + )$, such that, for $w 1 , ⋯ , w d > 0$,
$c(uw_1,\dots,uw_d) \sim \frac{\alpha^{-d}\,\Gamma(d\xi)}{\|\mathbf w^{1/\alpha}\|^{d\xi-\alpha}}\prod_{i=1}^{d}\frac{w_i^{\xi/\alpha-1}}{\Gamma(\xi)}\;u^{\kappa-d}\,\ell(u),\quad u\to0^+,$
where $κ = max { 1 , α / ξ }$.
Proof.
Based on Proposition 2, the lower tail order $κ : = κ L ( C ) = max { 1 , α / ξ }$. Since $P [ 1 / R ≥ · ] ∈ RV - α$, based on the monotone density theorem, there exists a slowly varying function $ℓ 1 ∈ RV 0$, such that, $g ( 1 / s ) ∼ s 1 - α ℓ 1 ( s )$ as $s → ∞$; i.e., $g ( t ) ∼ t α - 1 ℓ 1 ( 1 / t )$ as $t → 0 +$. Based on the proof of Proposition 1, we first note that,
$Y i : = 1 / X i ∈ RV - ( α ∧ ξ ) ,$
and therefore, there exists a slowly varying function $ℓ 2 ∈ RV 0$, such that, $F ¯ Y i ( t ) : = P [ 1 / X i > t ] = t - ( α ∧ ξ ) ℓ 2 ( t ) ∈ [ 0 , 1 ]$. Based on Propositions 2.1 and 3.8 of [21], write $V ( t ) = t α ∧ ξ [ ℓ 1 ( 1 / t ) ] 1 / κ$, then, clearly, the mapping $t ↦ V ( t - 1 ) ∈ RV - ( α ∧ ξ )$. Thus,
$\lambda(\mathbf x) = \lim_{t\to0^+}\frac{f(t\mathbf x)}{t^{-d}\,V^{\kappa}(t)} = \lim_{t\to0^+}\frac{\Gamma(d\xi)}{\|\mathbf x\|^{d\xi-1}}\,g(t\|\mathbf x\|)\,t^{1-d}\prod_{i=1}^{d}\frac{x_i^{\xi-1}}{\Gamma(\xi)}\Big/\big[t^{\kappa(\alpha\wedge\xi)-d}\,\ell_1(1/t)\big] = \frac{\Gamma(d\xi)}{\|\mathbf x\|^{d\xi-\alpha}}\prod_{i=1}^{d}\frac{x_i^{\xi-1}}{\Gamma(\xi)},\quad \mathbf x\in\mathbb R_+^d.$
Then, based on (3.11) of [21], the lower tail order density of copula C is
$\lambda_L(\mathbf w) = \lambda(\mathbf w^{1/\alpha})\,\big|J(w_1^{1/\alpha},\dots,w_d^{1/\alpha})\big| = \frac{\Gamma(d\xi)}{\|\mathbf w^{1/\alpha}\|^{d\xi-\alpha}}\prod_{i=1}^{d}\frac{w_i^{(\xi-1)/\alpha}}{\Gamma(\xi)}\left(\alpha^{-d}\prod_{i=1}^{d}w_i^{1/\alpha-1}\right) = \frac{\alpha^{-d}\,\Gamma(d\xi)}{\|\mathbf w^{1/\alpha}\|^{d\xi-\alpha}}\prod_{i=1}^{d}\frac{w_i^{\xi/\alpha-1}}{\Gamma(\xi)}.$
Matching the $V ( 1 / t )$ defined here with (2.8) of [21] leads to,
$t^{-(\alpha\wedge\xi)}\,[\ell_1(t)]^{1/\kappa} = \big[t^{-(\alpha\wedge\xi)}\,\ell_2(t)\big]\times\big[\ell\big(t^{-(\alpha\wedge\xi)}\,\ell_2(t)\big)\big]^{1/\kappa}.$
Let $u : = F ¯ Y i ( t ) = t - ( α ∧ ξ ) ℓ 2 ( t )$, then,
$\ell(u) = \ell_1\big(\bar F_{Y_i}^{-1}(u)\big)\,\big[\ell_2\big(\bar F_{Y_i}^{-1}(u)\big)\big]^{-\kappa},$
which completes the proof.  ☐
Remark 3.
Letting $\xi_i\equiv1$ in Proposition 5, $\lambda_L(\mathbf w) = \frac{\alpha^{-d}\,\Gamma(d)}{\|\mathbf w^{1/\alpha}\|^{d-\alpha}}\prod_{i=1}^{d}w_i^{1/\alpha-1}$, which corresponds to the case of the Archimedean copula studied in Example 3.9 of [21], with β there replaced by 1.
Example 4.
The slowly varying function in Proposition 5 depends on the lower tail of R and thus the upper tail of $1 / X i$ as well, through their slowly varying functions $ℓ 1$ and $ℓ 2$. For example, suppose that R follows Gamma $( α , 1 )$. Then its density function $g ( t ) ∼ t α - 1 [ Γ ( α ) ] - 1$ as $t → 0 +$; that is, $ℓ 1 ( t ) ≡ [ Γ ( α ) ] - 1$ as $t → ∞$. For $ℓ 2$, if $α > ξ$, then based on the proof of Proposition 1 and (10),
$P[1/X_i>y] \sim E[T^{\xi}]\,P[(1/S_i)>y] \sim E[T^{\xi}]\,[\xi\,B(\xi,d\xi-\xi)]^{-1}\,y^{-\xi},\quad y\to\infty,$
and therefore, $\ell_2(t)\equiv E[T^{\xi}]\,[\xi\,B(\xi,d\xi-\xi)]^{-1}$ as $t\to\infty$. If $\alpha<\xi$, then based on (3), (9) and [23],
$P[1/X_i>y] \sim E[(1/S_i)^{\alpha}]\,P[T>y] \sim \frac{B(\xi-\alpha,\,d\xi-\xi)}{B(\xi,\,d\xi-\xi)}\,P[T>y] \sim \frac{B(\xi-\alpha,\,d\xi-\xi)}{\alpha\,\Gamma(\alpha)\,B(\xi,\,d\xi-\xi)}\,y^{-\alpha},\quad y\to\infty,$
and therefore, $\ell_2(t)\equiv \frac{B(\xi-\alpha,\,d\xi-\xi)}{\alpha\,\Gamma(\alpha)\,B(\xi,\,d\xi-\xi)}$ as $t\to\infty$. If $\alpha=\xi$, since both $E[T^{\xi}]=\infty$ and $E[S_i^{-\alpha}]=\infty$, Breiman’s theorem does not apply for deriving $\ell_2$. For such a case, results based on [24] can be useful in further deriving the slowly varying function.
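The α < ξ constant above can be verified numerically (illustrative sketch; assumes SciPy, with d = 2 and our choices of α, ξ, and y):

```python
# Illustrative check of Example 4, alpha < xi case (assumes SciPy; parameters
# are ours): verify P[1/X_i > y] ~ const * y^{-alpha} with
# const = B(xi-alpha, d*xi-xi) / (alpha * Gamma(alpha) * B(xi, d*xi-xi)).
import numpy as np
from scipy.stats import gamma as gamma_rv, beta as beta_rv
from scipy.special import beta as B, gamma as G
from scipy.integrate import quad

alpha, xi, d, y = 0.5, 1.5, 2, 1e6
# P[1/X_i > y] = P[R * S_i < 1/y], with R ~ Gamma(alpha, 1), S_i ~ Beta(xi, (d-1)*xi)
tail = quad(lambda s: gamma_rv.cdf(1.0 / (y * s), alpha) * beta_rv.pdf(s, xi, (d - 1) * xi),
            0.0, 1.0)[0]
const = B(xi - alpha, d * xi - xi) / (alpha * G(alpha) * B(xi, d * xi - xi))
ratio = tail / (const * y ** -alpha)   # tends to 1 as y -> infinity
```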

## 6. Concluding Remark

We have derived asymptotic approximations of the upper tails of Liouville copulas through the notion of tail order, tail order functions, and tail order densities. Both asymptotic dependence and asymptotic independence cases are considered in the paper. Upper tail order for a more general scale mixture model that covers Liouville copulas is also obtained. Here we should note that the results are derived under the assumption of tail equivalence of univariate marginals of $X$ in (1), and in particular for Liouville copulas this means an exchangeable structure. When the univariate marginals are not tail equivalent, a transform on marginals is required first, and then similar arguments can be applied.
The results developed in this paper contribute to the understanding of the different upper tail dependence patterns of Liouville copulas, and they can be relevant for conducting inference on the limiting measure of hidden regular variation for a random vector that has a Liouville copula and regularly varying marginals, because the tail order functions of copulas and the limiting measures are closely related; see [25] and the references therein for details. Moreover, based on Proposition 1, a very wide range of upper tail orders, from positive tail dependence to tail negative dependence, can be achieved under a mild condition on the random vector $\mathbf S$. Promising future research therefore includes finding general and convenient structures for $\mathbf S$, based on which a semi-parametric approach for statistical inference on tail dependence patterns can be developed.

## Acknowledgments

We thank the two anonymous reviewers for their constructive suggestions and comments, which are very helpful in improving the presentation of the paper.

## Conflicts of Interest

The author declares no conflict of interest.

## References

1. A.J. McNeil, and J. Nešlehová. “From Archimedean to Liouville copulas.” J. Multivar. Anal. 101 (2010): 1772–1790. [Google Scholar] [CrossRef]
2. K. Fang, S. Kotz, and K. Ng. Symmetric Multivariate and Related Distributions, Monographs on Statistics and Applied Probability. London, UK: Chapman & Hall, 1990, Volume 36. [Google Scholar]
3. A.J. McNeil, and J. Nešlehová. “Multivariate Archimedean copulas, d-monotone functions and l1-norm symmetric distributions.” Ann. Stat. 37 (2009): 3059–3097. [Google Scholar] [CrossRef]
4. M. Larsson, and J. Nešlehová. “Extremal behavior of Archimedean copulas.” Adv. Appl. Probab. 43 (2011): 195–216. [Google Scholar] [CrossRef]
5. L. Hua. “Tail negative dependence and its applications for aggregate loss modeling.” Insur. Math. Econ. 61 (2015): 135–145. [Google Scholar] [CrossRef]
6. L. Raymond-Belzile. “Extremal and Inferential Properties of Liouville Copulas.” Master’s Thesis, McGill University, Montreal, QC, Canada, 2014. [Google Scholar]
7. L. Hua, and H. Joe. “Tail order and intermediate tail dependence of multivariate copulas.” J. Multivar. Anal. 102 (2011): 1454–1471. [Google Scholar] [CrossRef]
8. L. Hua, and H. Joe. “Intermediate tail dependence: A review and some new results.” In Stochastic Orders in Reliability and Risk: In Honor of Professor Moshe Shaked. Edited by H. Li and X. Li. New York, NY, USA: Springer, 2013, Chapter 15; pp. 291–311. [Google Scholar]
9. A.W. Ledford, and J.A. Tawn. “Statistics for near independence in multivariate extreme values.” Biometrika 83 (1996): 169–187. [Google Scholar] [CrossRef]
10. A.W. Ledford, and J.A. Tawn. “Modelling dependence within joint tail regions.” J. R. Stat. Soc. Ser. B Methodol. 59 (1997): 475–499. [Google Scholar] [CrossRef]
11. S. Coles, J. Heffernan, and J. Tawn. “Dependence Measures for Extreme Value Analyses.” Extremes 2 (1999): 339–365. [Google Scholar] [CrossRef]
12. J.E. Heffernan. “A directory of coefficients of tail dependence.” Extremes 3 (2000): 279–290. [Google Scholar] [CrossRef]
13. A. Ramos, and A. Ledford. “A new class of models for bivariate joint tails.” J. R. Stat. Soc. Ser. B (Stat. Methodol.) 71 (2009): 219–241. [Google Scholar] [CrossRef]
14. H. Joe. Multivariate Models and Dependence Concepts. Monographs on Statistics and Applied Probability; London, UK: Chapman & Hall, 1997. [Google Scholar]
15. L. Hua, and M. Xia. “Assessing High-Risk Scenarios by Full-Range Tail Dependence Copulas.” N. Am. Actuar. J. 18 (2014): 363–378. [Google Scholar] [CrossRef]
16. L. Breiman. “On Some Limit Theorems Similar to the Arc-Sin Law.” Theory Probab. Appl. 10 (1965): 323–331. [Google Scholar] [CrossRef]
17. P. Embrechts, and C.M. Goldie. “On closure and factorization properties of subexponential and related distributions.” J. Aust. Math. Soc. (Ser. A) 29 (1980): 243–256. [Google Scholar] [CrossRef]
18. N.H. Bingham, C.M. Goldie, and J.L. Teugels. Regular Variation. Encyclopedia of Mathematics and Its Applications; Cambridge, UK: Cambridge University Press, 1987. [Google Scholar]
19. A.H. Jessen, and T. Mikosch. “Regularly varying functions.” Publ. Inst. Math. (Beograd) (N.S.) 80 (2006): 171–192. [Google Scholar] [CrossRef]
20. K.W. Ng, G.L. Tian, and M.L. Tang. Dirichlet and Related Distributions: Theory, Methods and Applications. New York, NY, USA: John Wiley & Sons, 2011. [Google Scholar]
21. H. Li, and L. Hua. “Higher order tail densities of copulas and hidden regular variation.” J. Multivar. Anal. 138 (2015): 143–155. [Google Scholar] [CrossRef]
22. H. Li, and P. Wu. “Extremal dependence of copulas: A tail density approach.” J. Multivar. Anal. 114 (2013): 99–111. [Google Scholar] [CrossRef]
23. M. Abramowitz, and I.A. Stegun. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. New York, NY, USA: Dover Publications, 1964. [Google Scholar]
24. S. Nadarajah, and S. Kotz. “On the Product and Ratio of Gamma and Beta Random Variables.” Allg. Stat. Arch. 89 (2005): 435–449. [Google Scholar] [CrossRef]
25. L. Hua, H. Joe, and H. Li. “Relations between hidden regular variation and tail order of copulas.” J. Appl. Probab. 51 (2014): 37–57. [Google Scholar] [CrossRef]

## Share and Cite

MDPI and ACS Style

Hua, L. A Note on Upper Tail Behavior of Liouville Copulas. Risks 2016, 4, 40. https://doi.org/10.3390/risks4040040

