Article

On Bivariate Distributions with Singular Part †

by
Carles M. Cuadras
Department of Statistics, University of Barcelona, 08028 Barcelona, Spain
To the memory of Joan Augé and Josep Fortiana.
Axioms 2024, 13(7), 433; https://doi.org/10.3390/axioms13070433
Submission received: 1 May 2024 / Revised: 16 June 2024 / Accepted: 20 June 2024 / Published: 27 June 2024
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)

Abstract

There are many families of bivariate distributions with given marginals. Most families, such as the Farlie–Gumbel–Morgenstern (FGM) and the Ali–Mikhail–Haq (AMH), are absolutely continuous, with an ordinary probability density. In contrast, there are few families with a singular part or a positive mass on a curve. We define a general condition useful to detect the singular part of a distribution. By continuous extension of the bivariate diagonal expansion, we define and study a wide family containing these singular distributions, obtain the probability density, and find the canonical correlations and functions. The set of canonical correlations is described by a continuous function rather than a countable sequence. An application to statistical inference is given.

1. Introduction

In probability theory, a copula is a bivariate cumulative distribution function (cdf) C ( u , v ) with uniform ( 0 , 1 ) marginals, which captures the dependence properties of two r.v.’s U , V defined on the same probability space.
Constructing copulas is important because they are versatile and allow us to generate bivariate distributions. A copula can model the dependence between two random variables without the influence of the marginal distributions. Copulas have applications in finance, credit risk, insurance, hydrology, physics, psychometrics, quality control, statistics, and other fields. Most copulas have absolutely continuous distributions, but there are copulas containing a singular part. These copulas are useful in situations where there are coincidences between the variables.
Section 2 defines a general family of copulas. Section 3 describes the absolutely continuous and singular parts (which could be non-null) of a copula, showing that this family can deal with copulas with a non-null singular part. In Section 3, a general definition of singularity is introduced, which can obtain the probability density with respect to a suitable measure. Section 4 is devoted to the canonical correlation analysis of a copula with singular part. The concept of singularity is extended in Section 5. Section 6 studies the singularity of general bivariate distributions. An application to Bayesian statistics is proposed in Section 7.
We use the following notation:
$$M = \min(u,v), \qquad \Pi = uv, \qquad W = \max(u+v-1,\,0),$$
where M , Π , and W are copulas. The quotient Π / M = max ( u , v ) will have an important role in defining singularities.
$W$ and $M$ are the Fréchet–Hoeffding bounds. Any copula $C$ satisfies
$$W \le C \le M$$
uniformly in u , v . Both W and M are singular copulas. The “shuffle of min” is another example of a singular copula. But most distributions are absolutely continuous and there are few probability models with a singular part. This is studied in Section 3.
For properties and construction of copulas, see [1,2,3,4,5]. For applications in finance (including copulas with a singular component) and marketing, see [6,7]. For general applications, see [3,8].
Based on [9], we present a general method of generating copulas, putting special emphasis on constructing copulas with a singular part.

2. Correlation Functions and Families

We indicate the unit interval $[0,1]$ by $I$ and the unit square $[0,1]\times[0,1]$ by $I^2$. In all cases, we suppose $0 \le \theta \le 1$.
Definition 1. 
A parametric canonical correlation function is an integrable function $f_\theta : I \to I$.
Definition 2. 
A quotient function $Q : I^2 \to \mathbb{R}$ is a two-variable function satisfying
$$Q(u,v) \ge 0, \qquad Q(1,v) = Q(u,1) = 1.$$
The adjective “canonical” for a correlation function is justified in Section 4. It can also be called a “dependence generator”. For the sake of simplicity, and following the terminology used in [9], we say “correlation function”.
Examples of correlation and quotient functions are
$$f_\theta(t) = \theta, \qquad Q(u,v) = \max(u,v).$$
Note that max ( u , v ) = Π / M is the quotient of two copulas. In general, for two arbitrary copulas C 1 , C 2
Q ( u , v ) = C 1 ( u , v ) / C 2 ( u , v )
gives rise to a quotient function.
The definition below is a continuous extension of the diagonal expansion of a bivariate distribution. It is mainly based on [9], but it is presented here as a general family constructed by combining correlation and quotient functions. This family is quite useful for constructing copulas with a singular part.
Definition 3. 
Given a correlation function f θ and a quotient function Q , we define the general family of copulas
$$C_\theta = \Pi + \Pi \int_Q^1 \frac{f_\theta(t)}{t^2}\, dt.$$
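As a quick sanity check of this construction, one can evaluate the integral numerically for a chosen pair $(f_\theta, Q)$. The short Python sketch below is illustrative only (the helper name `copula_from` and the particular choices of $f_\theta$ and $Q$ are assumptions of this example, not part of the text); it verifies that $f_\theta(t) = \theta$ with $Q = \Pi/M$ returns $\theta M + (1-\theta)\Pi$, the Fréchet family discussed among the properties below.

```python
# Illustrative numerical check of C_theta = Pi + Pi * int_Q^1 f(t)/t^2 dt.
from scipy.integrate import quad

def copula_from(f, Q, u, v):
    """Evaluate C_theta(u, v) for a correlation function f and a quotient Q."""
    q = Q(u, v)
    integral, _ = quad(lambda t: f(t) / t**2, q, 1.0)
    return u * v + u * v * integral

theta, u, v = 0.4, 0.3, 0.7
lhs = copula_from(lambda t: theta, max, u, v)     # f_theta(t) = theta, Q = max(u, v)
rhs = theta * min(u, v) + (1 - theta) * u * v     # Frechet copula theta*M + (1-theta)*Pi
print(lhs, rhs)                                   # both approximately 0.246
```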
Properties. Most of them are readily proved.
  • Independence copula. If f θ ( t ) = 0 then C θ = Π .
  • Self-generation. If f θ ( t ) = 1 and Q = Π / C , where C is any copula, then C θ = C .
  • If Q is a quotient function then 1 / Q is also a quotient function.
  • If $Q = C_1/C_2$ is the quotient of two copulas then
    $$W/M \le Q \le M/W.$$
  • If $C$ is a copula and $Q = \Pi/C$ then
    $$\Pi/M \le Q \le \Pi/W.$$
  • Quadrant dependence. Let ρ and τ be Spearman’s rank correlation and Kendall’s correlation coefficients, respectively. We have:
    C θ is positive quadrant dependent (PQD) if Q < 1 . Then ρ and τ are positive.
    C θ is negative quadrant dependent (NQD) if Q > 1 . Then ρ and τ are negative.
    Using a simplified notation, Spearman’s rank correlation coefficient is given by
    $$\rho = 12 \int_{I^2} (C - \Pi)\, d\Pi,$$
    where $d\Pi = du\, dv$. Then $\rho \ge 0$ if $C \ge \Pi$ (PQD) and $\rho \le 0$ if $C \le \Pi$ (NQD).
    Kendall’s tau is
    $$\tau = 4 \int_{I^2} C\, dC - 1.$$
    If $C \ge \Pi$ (PQD) then $\int_{I^2} C\, dC \ge \int_{I^2} \Pi\, dC = \int_{I^2} C\, d\Pi \ge \int_{I^2} \Pi\, d\Pi = 1/4$ shows that $\tau \ge 0$. Analogously, $C \le \Pi$ (NQD) implies $\tau \le 0$.
  • Fréchet family. If $f_\theta(t) = \theta$, $0 \le \theta \le 1$, and $Q = \Pi/M$ then
    $$C_\theta = \theta M + (1-\theta)\,\Pi.$$
    As a useful alternative to Definition 3, we give an equivalent expression for family (1).
Definition 4. 
If f θ and Q are correlation and quotient functions, we define the general family of copulas
$$C_\theta = \Pi\,\big[\,1 + G_\theta(1) - G_\theta(Q)\,\big],$$
where $G_\theta(t)$ is a primitive of $f_\theta(t)/t^2$.
Clearly, from the above property 3,
$$C_\theta^* = \Pi\,\big[\,1 + G_\theta(1) - G_\theta(1/Q)\,\big]$$
is also a copula, being related to C θ by
$$C_\theta^* = C_\theta + \Pi\,\big[\,G_\theta(Q) - G_\theta(1/Q)\,\big].$$
However, (2) could fail to provide a copula for some choices of $G_\theta$ and $Q$. For instance, $G_\theta(t) = \theta t$ and $Q = M/\Pi$ give $C_\theta = (1+\theta)\,\Pi - \theta M$. But this $C_\theta$ is not a copula for $0 < \theta \le 1$.
  • Examples.
  • With $f_\theta(t) = \theta t^2$ and $Q(u,v) = 1 - (1-u)(1-v)$ we obtain the FGM copula
    $$C_\theta(u,v) = uv\,\big[\,1 + \theta(1-u)(1-v)\,\big]$$
    and
    $$C_\theta^*(u,v) = uv\left[\,1 - \frac{\theta(1-u)(1-v)}{1-(1-u)(1-v)}\,\right].$$
  • With $f_\theta(t) = \theta t^2/[\,1-\theta(1-t)\,]^2$ and $Q(u,v) = 1 - (1-u)(1-v)$ we obtain the AMH copula
    $$C_\theta(u,v) = \frac{uv}{1-\theta(1-u)(1-v)}$$
    and
    $$C_\theta^*(u,v) = uv\,\frac{1-(1-u)(1-v)}{1+(\theta-1)(1-u)(1-v)}.$$
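The FGM construction above can be verified symbolically from Definition 4, and the same computation yields Spearman's $\rho$ of Section 2 in closed form. The sketch below is an illustration (the symbol names are arbitrary); it recovers the known value $\rho = \theta/3$ for the FGM copula.

```python
# Symbolic check (illustrative) that Definition 4 reproduces the FGM copula
# and that Spearman's rho = 12 * int_{I^2} (C - Pi) dPi equals theta/3.
import sympy as sp

u, v, t, th = sp.symbols('u v t theta', positive=True)

G = th * t                    # primitive of f_theta(t)/t^2 with f_theta(t) = theta*t^2
Q = 1 - (1 - u) * (1 - v)     # quotient function of the FGM example
C = u * v * (1 + G.subs(t, 1) - G.subs(t, Q))

fgm = u * v * (1 + th * (1 - u) * (1 - v))
print(sp.simplify(C - fgm))   # 0: the general family gives the FGM copula

rho = 12 * sp.integrate(C - u * v, (u, 0, 1), (v, 0, 1))
print(sp.simplify(rho))       # theta/3
```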
Remark 1. 
In some sense, family (1) is a continuous extension of the diagonal expansion
$$C(u,v) = uv + \sum_{n \ge 1} \rho_n \int_0^1 L(u,s)\, da_n(s) \int_0^1 L(t,v)\, db_n(t),$$
where $L(u,s) = \min(u,s) - us$ and $L(t,v) = \min(t,v) - tv$. The set $\{\rho_n\}$ is the countable sequence of canonical correlations, and $\{a_n(U)\}$ and $\{b_n(V)\}$ are the related sequences of canonical variables and functions of $U$ and $V$, respectively. This expansion can be obtained by integrating the Lancaster diagonal expansion of a bivariate density [8,10,11,12,13,14].

3. Singular Copulas

From the above property 5, the quotient function $Q = \Pi/C$ satisfies
$$Q \ge \Pi/M = \max(u,v).$$
Thus, max ( u , v ) is an infimum quotient function that may provide a class of copulas with singular parts.
Let us consider the class of copulas C θ Q constructed from a correlation function f θ and a fixed quotient Q .
Proposition 1. 
If $f_\theta$ is increasing in θ, then the class $C_\theta^Q$ with $Q \le 1$ is ordered in θ.
Proof. 
If $f_{\theta_1}(t) \le f_{\theta_2}(t)$ for $\theta_1 \le \theta_2$, then
$$C_{\theta_1}^Q = \Pi + \Pi \int_Q^1 \frac{f_{\theta_1}(t)}{t^2}\, dt \;\le\; \Pi + \Pi \int_Q^1 \frac{f_{\theta_2}(t)}{t^2}\, dt = C_{\theta_2}^Q.$$
Proposition 2. 
If $f_\theta$ is increasing in θ and we suppose $f_1(t) = 1$, then
$$C_\theta^Q \le C_\theta^{\max(u,v)} \le C_1^{\max(u,v)} = M.$$
Proof. 
It follows from $Q(u,v) \ge \max(u,v)$ and $f_\theta(t) \le f_1(t) = 1$, considering the primitive of $1/t^2$.
As a consequence, $C_\theta^{\max(u,v)}$ is a supremum copula for the sub-family $C_\theta^Q$ generated by $f_\theta$ and $Q \le 1$, where $f_\theta$ is fixed and $Q$ may vary.
If $G_\theta(t)$ is a primitive of $f_\theta(t)/t^2$, an equivalent expression for this supremum family, generated by $f_\theta$ and attained at $Q = \Pi/M$, is given by
$$\Pi + \Pi\,\big[\,G_\theta(1) - G_\theta(\Pi/M)\,\big].$$
This class of copulas was (implicitly) introduced in [15] and studied in [16]. We next study this class for different correlation functions.

3.1. Defining Singularity

Let $C$ be a general copula. Suppose that the partial derivatives $\partial C/\partial u$ and $\partial C/\partial v$ exist. Consider the step function
$$g(u) = \lim_{v \to u^+} \frac{\partial C(u,v)}{\partial u} - \lim_{v \to u^-} \frac{\partial C(u,v)}{\partial u}.$$
This function is the limit of $\partial C/\partial u$ as $v \to u$ with $v > u$, minus the limit of $\partial C/\partial u$ as $v \to u$ with $v < u$. If the bivariate distribution is absolutely continuous, then $g(u) = 0$, $0 \le u \le 1$.
If the joint distribution of $(U,V)$ is $C$, it is worth noting [4] that, for any $v \in I$,
$$\frac{\partial C(u,v)}{\partial u} = P(V \le v \mid U = u),$$
and this partial derivative exists for almost all $u \in I$. Therefore, $g(u) \ne 0$ in (4) means that the conditional distribution function of $V$ given $U = u$ has a discontinuity at $V = u$.
Indeed, any copula C defines a measure μ C in I 2 which has an absolutely continuous part and a singular part, i.e.,
$$C = C_{ac} + C_s.$$
$C$ is absolutely continuous if $C_s = 0$, whereas $C$ is singular if $C_{ac} = 0$. In short, $C$ has a singular part if there exists a non-empty Borel set $B \subset I^2$ with Lebesgue measure $\mu_2(B) = 0$ but $\mu_C(B) > 0$. In plain words, the “area” of $B$ is zero but its probability is positive. For instance, $C$ has a singular part if $B$ is a line. See [17,18,19] for further details.
We introduce a class of copulas with singular part.
Definition 5. 
Suppose that the cdf of $(U,V)$ is the copula $C$. We say that the joint distribution is M-singular if $g$ defined in (4) satisfies
$$g(u) \ne 0, \qquad 0 \le u \le 1.$$
This means that there is positive probability concentrated on the diagonal D of I 2 . Note that D has zero Lebesgue measure, i.e., μ 2 ( D ) = 0 .
Now we consider the class (2) with $Q = \max(u,v)$. Let $\mu_1$ be the Lebesgue measure on the diagonal $D$. Dirac's delta function is here the indicator of the diagonal $D$, i.e., $\delta_{\{u=v\}} = 1$ if $u = v$, and $0$ if $u \ne v$. Similarly, $\delta_{\{u \ne v\}}$ is the indicator of the complement of $D$.
Theorem 1 can be proven using Schwartz's distribution theory [15,20] or by means of the Radon–Nikodym theorem [17,18,21,22]. We present a more affordable proof by proper use of limits and integrals, which can be quite useful in practice. See Appendix A.
Theorem 1. 
Suppose that ( U , V ) has the joint cdf
$$C_\theta = \Pi\,\big[\,1 + G_\theta(1) - G_\theta(\Pi/M)\,\big].$$
Let μ 2 and μ 1 be the Lebesgue measures on I 2 and the diagonal D of I 2 , respectively. The probability density of ( U , V ) with respect to the measure μ = μ 2 + μ 1 is given by
$$h_\theta(u,v) = \big[\,1 + G_\theta(1) - G_\theta(\max(u,v)) - \max(u,v)\,G_\theta'(\max(u,v))\,\big]\,\delta_{\{u \ne v\}} + u^2\, G_\theta'(u)\,\delta_{\{u=v\}}.$$
Proof. 
If $u \ne v$, the second partial derivative of $C_\theta$ is given by
$$\frac{\partial^2 C_\theta(u,v)}{\partial u\,\partial v} = \begin{cases} 1 + G_\theta(1) - G_\theta(v) - v\,G_\theta'(v) & \text{if } u < v, \\ 1 + G_\theta(1) - G_\theta(u) - u\,G_\theta'(u) & \text{if } u > v. \end{cases}$$
Considering g θ defined in (4), we have
$$C_\theta(u,v) = \int_0^u\!\!\int_0^v \frac{\partial^2 C_\theta(s,t)}{\partial s\,\partial t}\, ds\, dt + \int_0^{\min(u,v)} g_\theta(t)\, dt,$$
as these integrals give the mass in $[0,u]\times[0,v]$ plus the mass on the line from $(0,0)$ to $(w,w)$, where $w = \min(u,v)$. The second integral is
$$\int_0^{\min(u,v)} \left[\,\lim_{t \to s^+} \frac{\partial C_\theta(s,t)}{\partial s} - \lim_{t \to s^-} \frac{\partial C_\theta(s,t)}{\partial s}\,\right] ds = \int_0^{\min(u,v)} g_\theta(s)\, ds.$$
This expression may be interpreted by considering
$$\frac{\partial C_\theta(u, u+\varepsilon/2)/\partial u \;-\; \partial C_\theta(u, u-\varepsilon/2)/\partial u}{\varepsilon}\;\varepsilon,$$
with $\varepsilon > 0$ arbitrarily small. Accordingly, the limit as $\varepsilon \to 0$ may be informally understood as a kind of second partial derivative at $(u,u)$, post-multiplied by $du$.
Let us find an explicit expression for $g_\theta$. If $u > v$, the partial derivative $\partial C_\theta/\partial u$ is
$$v + v\,G_\theta(1) - v\,G_\theta(u) - uv\,G_\theta'(u).$$
If $u < v$, the partial derivative $\partial C_\theta/\partial u$ is
$$v + v\,G_\theta(1) - v\,G_\theta(v).$$
The difference is $-v\,G_\theta(v) + v\,G_\theta(u) + uv\,G_\theta'(u)$, and the limit as $v \to u$ is $g_\theta(u) = u^2 G_\theta'(u)$.  □
Proposition 3. 
The function $g_\theta$ is the correlation function. Hence,
$$f_\theta(t) = t^2\, G_\theta'(t).$$
Proof. 
$G_\theta(t)$ is a primitive of $f_\theta(t)/t^2$, so $t^2 G_\theta'(t) = f_\theta(t)$, which equals the step $g_\theta(t)$ found in the proof of Theorem 1.
Proposition 4. 
The probability of coincidence is
$$P[\,U = V\,] = \int_0^1 f_\theta(t)\, dt.$$
Proof. 
$\mu(D) = \mu_2(D) + \mu_1(D)$ and $\mu_2(D) = 0$, so only the mass on $D$ with respect to $\mu_1$ contributes: $P[U=V] = \int_0^1 u^2 G_\theta'(u)\, du = \int_0^1 f_\theta(u)\, du$.
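As an illustration of Theorem 1 and Proposition 4, the following sketch (numerical, with an arbitrarily chosen θ) checks that the two parts of $h_\theta$ add up to total mass 1 and that the diagonal mass equals $\int_0^1 f_\theta(t)\,dt$. It uses the correlation function $f_\theta(t) = \theta t$, i.e., $G_\theta(t) = \theta \ln t$, which reappears as Example 3 in the next subsection; the numerical setup is an assumption of this example.

```python
# Numerical check (illustrative) of Theorem 1 and Proposition 4 for
# f_theta(t) = theta*t, i.e. G_theta(t) = theta*ln(t).
import numpy as np
from scipy.integrate import dblquad, quad

theta = 0.6
G  = lambda t: theta * np.log(t)
dG = lambda t: theta / t                 # derivative G'_theta

def h_ac(v, u):
    """Absolutely continuous part of h_theta (the density off the diagonal)."""
    w = max(u, v)
    return 1 + G(1.0) - G(w) - w * dG(w)

ac_mass, _ = dblquad(h_ac, 0, 1, 0, 1)   # mass spread over I^2
f = lambda t: theta * t                  # f_theta(t) = t^2 * G'_theta(t)
diag_mass, _ = quad(f, 0, 1)             # mass on the diagonal (Proposition 4)

print(ac_mass + diag_mass)               # approximately 1.0
print(diag_mass, theta / 2)              # P[U = V] = theta/2
```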

3.2. Examples of M-Singular Copulas

1. Fréchet copula. This copula is the weighted arithmetic mean of M and Π
$$F_\theta = \theta M + (1-\theta)\,\Pi.$$
We have $G_\theta(t) = -\theta/t$. The second partial derivative gives the constant $(1-\theta)$. We also find $u^2 G_\theta'(u) = \theta$. The probability density is
$$h_\theta(u,v) = (1-\theta)\,\delta_{\{u \ne v\}} + \theta\,\delta_{\{u=v\}},$$
and
P [ U = V ] = θ .
2. The Cuadras–Augé copula is the weighted geometric mean of M and Π
$$CA_\theta = M^{\theta}\,\Pi^{1-\theta}.$$
Obtaining the derivatives we find
$$h_\theta(u,v) = (1-\theta)\max(u,v)^{-\theta}\,\delta_{\{u \ne v\}} + \theta\, u^{1-\theta}\,\delta_{\{u=v\}}.$$
Computing $\int_0^1 \theta\, u^{1-\theta}\, du$, we obtain
$$P[\,U = V\,] = \frac{\theta}{2-\theta}$$
(a simulation check of this coincidence probability is sketched at the end of this subsection).
3. From the correlation function f θ ( t ) = θ t , we obtain
$$C_\theta = \Pi\,\big[\,1 - \theta\ln(\Pi/M)\,\big].$$
The probability density is
$$h_\theta(u,v) = \big[\,1 - \theta\,(1 + \ln\max(u,v))\,\big]\,\delta_{\{u \ne v\}} + \theta\, u\,\delta_{\{u=v\}}.$$
Also,
P [ U = V ] = θ / 2 .
4. From $f_\theta(t) = \theta\, t^2 \exp[\theta(1-t)]$, we obtain
$$C_\theta = \Pi\,\exp\!\big[\theta\,(1 - \Pi/M)\big].$$
The probability density is
$$h_\theta(u,v) = e^{\theta(1-w)}\,(1 - \theta w)\,\delta_{\{u \ne v\}} + \theta\, u^2\, e^{\theta(1-u)}\,\delta_{\{u=v\}},$$
where $w = \max(u,v)$. Then,
$$P[\,U = V\,] = \frac{2e^{\theta} - (\theta+1)^2 - 1}{\theta^2}.$$
See [4,19] for more examples of copulas with singular parts.
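For the Cuadras–Augé copula of Example 2, the coincidence probability $\theta/(2-\theta)$ can be checked by simulation. The sketch below uses the classical common-shock (Marshall–Olkin type) representation of this copula; that representation and all numerical choices are assumptions of the example, not taken from the text.

```python
# Simulation sketch for the Cuadras-Auge copula M^theta * Pi^(1-theta).
# Common-shock representation (assumed here): with independent exponentials
# E1, E2 of rate (1-theta) and E0 of rate theta, the pair
# U = exp(-min(E1, E0)), V = exp(-min(E2, E0)) has uniform margins and this copula.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.5, 200_000

E1 = rng.exponential(1 / (1 - theta), n)
E2 = rng.exponential(1 / (1 - theta), n)
E0 = rng.exponential(1 / theta, n)

U = np.exp(-np.minimum(E1, E0))
V = np.exp(-np.minimum(E2, E0))

# U = V exactly when the common shock E0 arrives first.
print(np.mean(U == V), theta / (2 - theta))          # both approximately 1/3
print(np.mean((U <= 0.3) & (V <= 0.7)),              # empirical copula at (0.3, 0.7)
      min(0.3, 0.7) * max(0.3, 0.7) ** (1 - theta))  # CA_theta(0.3, 0.7) ~ 0.251
```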

4. Canonical Analysis of a Copula

Let $C$ be a copula that is the cdf of the random vector $(U,V)$. Consider the kernels $K = C - \Pi$ and $L = M - \Pi$. If α and β are functions of bounded variation, the covariance between $\alpha(U)$ and $\beta(V)$ is [23]:
$$\operatorname{cov}(\alpha(U), \beta(V)) = \int_{I^2} [\,C(u,v) - uv\,]\, d\alpha(u)\, d\beta(v).$$
The variance of α ( U ) is
$$\operatorname{var}(\alpha(U)) = \int_{I^2} [\,\min(u,v) - uv\,]\, d\alpha(u)\, d\alpha(v),$$
and similarly, var ( β ( V ) ) . In particular, if α = β = ϕ , and C ( u , v ) is symmetric in u , v , the correlation coefficient between ϕ ( U ) and ϕ ( V ) is
$$\operatorname{cor}(\phi(U), \phi(V)) = \frac{\operatorname{cov}(\phi(U), \phi(V))}{\operatorname{var}(\phi(U))}.$$
These expressions justify the operator notation
$$\operatorname{cov}(\phi(U), \phi(V)) = (\phi, K\phi), \qquad \operatorname{var}(\phi(U)) = (\phi, L\phi).$$
Therefore, we can write the correlation as
$$\operatorname{cor}(\phi(U), \phi(V)) = \frac{(\phi, K\phi)}{(\phi, L\phi)}.$$
Our aim is to find the pairs ( ϕ , λ ) of canonical functions and correlations for a copula C . In particular,
$$\rho_1 = \sup_{\phi} \frac{(\phi, K\phi)}{(\phi, L\phi)}$$
is the first canonical correlation. This functional-analysis approach amounts to seeking the eigenpairs of the symmetric kernel $K = C - \Pi$ with respect to $L = M - \Pi$.
Definition 6. 
A generalized eigenpair (eigenfunction, eigenvalue) of $K$ with respect to $L$ is a pair $(\phi, \lambda)$ that satisfies $K\phi = \lambda L\phi$ in the sense that
$$\int_I K(u,v)\, d\phi(v) = \lambda \int_I L(u,v)\, d\phi(v),$$
for all u I .
Clearly, if $(\phi, \lambda)$ with $\lambda \ne 0$ is an eigenpair of $K$ with respect to $L$, then
$$\frac{(\phi, K\phi)}{(\phi, L\phi)} = \lambda.$$
This leads us to consider the canonical pairs as eigenpairs.
Definition 7. 
For $0 \le \gamma \le 1$ and arbitrarily small $\varepsilon > 0$, we define
$$H_{\gamma,\varepsilon}(x) = \begin{cases} 0 & \text{if } x < \gamma, \\ 1 & \text{if } \gamma \le x < \gamma + \varepsilon, \\ \varepsilon/(\gamma+\varepsilon) & \text{if } x \ge \gamma + \varepsilon, \end{cases}$$
the limit of which is the indicator function $\phi_\gamma(x)$, i.e.,
$$\lim_{\varepsilon \to 0} H_{\gamma,\varepsilon}(x) = \phi_\gamma(x) = \begin{cases} 1 & \text{if } x = \gamma, \\ 0 & \text{if } x \ne \gamma. \end{cases}$$
Propositions 5–8 contain preliminary results, which are useful to prove Theorem 2.
Proposition 5. 
We have $\int_I uv\, dH_{\gamma,\varepsilon}(v) = 0$ and
$$\int_I \min(u,v)\, dH_{\gamma,\varepsilon}(v) = \begin{cases} u - u\gamma/(\gamma+\varepsilon) & \text{if } u < \gamma, \\ \gamma - u\gamma/(\gamma+\varepsilon) & \text{if } \gamma < u \le \gamma+\varepsilon, \\ 0 & \text{if } \gamma + \varepsilon < u. \end{cases}$$
Proof. 
$\int_I uv\, dH_{\gamma,\varepsilon}(v) = u\gamma - u(\gamma+\varepsilon)\,\gamma/(\gamma+\varepsilon) = 0$. Similarly, for $\gamma+\varepsilon < u$,
$$\int_I \min(u,v)\, dH_{\gamma,\varepsilon}(v) = \gamma - (\gamma+\varepsilon)\,\gamma/(\gamma+\varepsilon) = 0,$$
and the other two cases follow in the same way.
Proposition 6. 
Suppose that $C = \Pi + \Pi\,[\,G(1) - G(\Pi/M)\,]$ is M-singular. Then,
$$\int_I C(u,v)\, dH_{\gamma,\varepsilon}(v) = -\begin{cases} u\gamma\, G(\gamma) - u(\gamma+\varepsilon)\, G(\gamma+\varepsilon)\,\gamma/(\gamma+\varepsilon) & \text{if } u < \gamma, \\ u\gamma\, G(u) - u(\gamma+\varepsilon)\, G(\gamma+\varepsilon)\,\gamma/(\gamma+\varepsilon) & \text{if } \gamma < u \le \gamma+\varepsilon, \\ u\gamma\, G(u) - u(\gamma+\varepsilon)\, G(u)\,\gamma/(\gamma+\varepsilon) & \text{if } \gamma + \varepsilon < u. \end{cases}$$
Proof. 
As $\int_I uv\, dH_{\gamma,\varepsilon}(v) = 0$, this reduces to computing $-\int_I uv\, G(\max(u,v))\, dH_{\gamma,\varepsilon}(v)$.
Proposition 7. 
Suppose that the correlation function $f_\theta$ generates the copula $C_\theta$. Consider the symmetric kernels $K_{\theta_1} = C_{\theta_1} - \Pi$ and $K_{\theta_2} = C_{\theta_2} - \Pi$. Then,
$$\lim_{\varepsilon \to 0} \frac{(H_{\gamma,\varepsilon},\, K_{\theta_1} H_{\gamma,\varepsilon})}{(H_{\gamma,\varepsilon},\, K_{\theta_2} H_{\gamma,\varepsilon})} = \frac{f_{\theta_1}(\gamma)}{f_{\theta_2}(\gamma)}.$$
Proof. 
Clearly, $u \to \gamma$ as $\varepsilon \to 0$. Then,
$$\lim_{\varepsilon \to 0} \frac{u\gamma\,[\,G_{\theta_1}(u) - G_{\theta_1}(\gamma+\varepsilon)\,]}{u\gamma\,[\,G_{\theta_2}(u) - G_{\theta_2}(\gamma+\varepsilon)\,]} = \lim_{\varepsilon \to 0} \frac{\gamma^2\,[\,G_{\theta_1}(\gamma) - G_{\theta_1}(\gamma+\varepsilon)\,]/\varepsilon}{\gamma^2\,[\,G_{\theta_2}(\gamma) - G_{\theta_2}(\gamma+\varepsilon)\,]/\varepsilon} = \frac{\gamma^2\, G_{\theta_1}'(\gamma)}{\gamma^2\, G_{\theta_2}'(\gamma)},$$
where $\gamma^2 G_\theta'(\gamma) = f_\theta(\gamma)$.
Proposition 8. 
Suppose that $C = \Pi\,[\,1 + G(1) - G(\Pi/M)\,]$ is M-singular. Consider the symmetric kernels $K = C - \Pi$, $L = M - \Pi$. Then,
$$\lim_{\varepsilon \to 0} \frac{(H_{\gamma,\varepsilon},\, K H_{\gamma,\varepsilon})}{(H_{\gamma,\varepsilon},\, L H_{\gamma,\varepsilon})} = \gamma^2\, G'(\gamma).$$
Proof. 
From (7) with $G = G_{\theta_1}$ and taking $G_{\theta_2}(t) = -1/t$ (so that $K_{\theta_2} = M - \Pi = L$), the limit reduces to
$$\frac{\gamma^2\, G'(\gamma)}{\gamma^2\,(1/\gamma^2)}.$$
Since $u \to \gamma$ and $(\gamma - u\gamma/(\gamma+\varepsilon))/\varepsilon \to 1$ as $\varepsilon \to 0$, an equivalent proof follows from
$$\lim_{\varepsilon \to 0} \frac{u\gamma\,[\,G(\gamma+\varepsilon) - G(u)\,]}{\gamma - u\gamma/(\gamma+\varepsilon)} = \lim_{\varepsilon \to 0} \frac{\gamma^2\,[\,G(\gamma+\varepsilon) - G(\gamma)\,]/\varepsilon}{\gamma/\varepsilon - \gamma^2/[(\gamma+\varepsilon)\,\varepsilon]}.$$
This limit gives $\gamma^2 G'(\gamma)$.
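Proposition 8 can also be illustrated numerically: for a step function $H_{\gamma,\varepsilon}$ the bilinear forms $(\phi, K\phi)$ and $(\phi, L\phi)$ are finite double sums over its two jump points. The sketch below is an illustration only (the Cuadras–Augé copula and the values of θ, γ, ε are assumptions); it recovers $f_\theta(\gamma) = \theta\gamma^{1-\theta}$.

```python
# Numerical illustration of Proposition 8 for the Cuadras-Auge copula:
# (H, K H) / (H, L H) -> f_theta(gamma) = theta * gamma**(1 - theta).
theta, gamma, eps = 0.5, 0.4, 1e-4

CA = lambda u, v: min(u, v) * max(u, v) ** (1 - theta)
K  = lambda u, v: CA(u, v) - u * v     # K = C - Pi
L  = lambda u, v: min(u, v) - u * v    # L = M - Pi

# dH_{gamma,eps} has a jump +1 at gamma and a jump -gamma/(gamma+eps) at gamma+eps,
# so (phi, A phi) = sum_{i,j} A(x_i, x_j) * jump_i * jump_j.
pts   = [gamma, gamma + eps]
jumps = [1.0, -gamma / (gamma + eps)]

def form(A):
    return sum(A(x, y) * dx * dy for x, dx in zip(pts, jumps)
                                 for y, dy in zip(pts, jumps))

print(form(K) / form(L), theta * gamma ** (1 - theta))   # both approximately 0.3162
```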
Theorem 2. 
The set of canonical functions and correlations for the M-singular family $C = \Pi\,[\,1 + G(1) - G(\Pi/M)\,]$ is $(\phi_\gamma,\, f(\gamma))$, $0 \le \gamma \le 1$, where $\phi_\gamma = \lim_{\varepsilon\to 0} H_{\gamma,\varepsilon}$ is the indicator of γ and $f(\gamma) = \gamma^2 G'(\gamma)$ is the correlation function.
Proof. 
$$\lim_{\varepsilon \to 0} \frac{(H_{\gamma,\varepsilon},\, K H_{\gamma,\varepsilon})}{(H_{\gamma,\varepsilon},\, L H_{\gamma,\varepsilon})} = \frac{(\phi_\gamma,\, K\phi_\gamma)}{(\phi_\gamma,\, L\phi_\gamma)} = \gamma^2\, G'(\gamma).$$
As $C \le M$, it is clear that $K \le L$, so
$$\int_I K(u,v)\, d\phi(v) \le \int_I L(u,v)\, d\phi(v).$$
Thus, $f(\gamma) = \gamma^2 G'(\gamma) \le 1$.
Remark 2. 
If $f : I \to I$ is a continuous function, it is worth noting that the set of canonical correlations has the power of the continuum.

Examples of Eigenpairs

  • Fréchet copula. We have $f_\theta(t) = \theta$. For a fixed parameter θ, the set of canonical functions and correlations is $(\phi_t,\, \theta)$, $0 \le t \le 1$. Note that θ is an eigenvalue of continuous multiplicity. In fact, any function is an eigenfunction. Also note that θ is the correlation coefficient.
  • Cuadras–Augé copula. The correlation function is $f_\theta(t) = \theta\, t^{1-\theta}$. The set of canonical functions and correlations is $(\phi_t,\, \theta\, t^{1-\theta})$, $0 \le t \le 1$. Each eigenvalue is simple and we have a continuous set of eigenvalues. Note that θ is the maximum canonical correlation.
  • For the family $C_\theta = \Pi\,[\,1 - \theta\ln(\Pi/M)\,]$, the correlation function is $f_\theta(t) = \theta t$. The set of canonical functions and correlations is $(\phi_t,\, \theta t)$, $0 \le t \le 1$.
  • For the family $C_\theta = \Pi\exp[\theta(1-\Pi/M)]$, the correlation function is $f_\theta(t) = \theta\, t^2 \exp[\theta(1-t)]$. The set of canonical functions and correlations is $(\phi_t,\, \theta\, t^2 \exp[\theta(1-t)])$, $0 \le t \le 1$.

5. Extended Singularity

Let $(U,V)$ be a random vector whose cdf is the copula $C$. To define the singularity on the second diagonal of $I^2$, we consider the joint distribution of $(U, 1-V)$,
$$C^\sigma(u,v) = u - C(u, 1-v).$$
Definition 8. 
Suppose that the cdf of ( U , V ) is the copula C . We say that the joint distribution of ( U , 1 V ) is W-singular if the distribution of ( U , V ) is M-singular.
Proposition 9. 
The cdf of a W-singular copula is
$$C^\sigma(u,v) = uv + u(1-v)\,\big[\,G(\max(u, 1-v)) - G(1)\,\big],$$
where $G(t)$ is a primitive of $f(t)/t^2$ and $f$ is a correlation function.
Proof. 
If the cdf of $(U,V)$ is $C$ and is M-singular, the cdf of $(U, 1-V)$ is $C^\sigma(u,v) = u - C(u, 1-v)$, with $C(u, 1-v) = u(1-v) + u(1-v)\,[\,G(1) - G(\max(u, 1-v))\,]$. Then,
$$C^\sigma(u,v) = u - u(1-v) - u(1-v)\,\big[\,G(1) - G(\max(u, 1-v))\,\big].$$
Simplifying this, we obtain (8). □
Theorem 3. 
Let C θ σ be a W-singular copula; see (8). The probability density with respect to μ = μ 2 + μ 1 , where μ 2 is the Lebesgue measure on I 2 and μ 1 is the Lebesgue measure on the diagonal u + v = 1 , is given by
$$h_\theta^\sigma(u,v) = \big[\,1 + G_\theta(1) - G_\theta(w) - w\,G_\theta'(w)\,\big]\,\delta_{\{u+v \ne 1\}} + u^2\, G_\theta'(u)\,\delta_{\{u+v=1\}},$$
where $w = \max(u, 1-v)$.
Proof. 
Taking into account the step of $\partial C^\sigma_\theta/\partial u$ at the diagonal $u+v=1$, the proof is quite similar to the one given in Theorem 1. The cdf (8) can be expressed as
$$\int_0^u\!\!\int_0^v \big[\,1 + G_\theta(1) - G_\theta(z) - z\,G_\theta'(z)\,\big]\, ds\, dt\;\delta_{\{u+v \ne 1\}} + \int_a^b t^2\, G_\theta'(t)\, dt\;\delta_{\{u+v=1\}},$$
where $z = \max(s, 1-t)$, $a = \min(u, 1-u)$, and $b = \max(u, 1-u)$.

Examples of W-Singular Copulas

  • Fréchet. If $F_\theta = (1-\theta)\,\Pi + \theta M$, with $G_\theta(t) = -\theta/t$, we obtain
    $$F_\theta^\sigma = (1-\theta)\,\Pi + \theta W,$$
    the weighted average of the lower bound $W$ and $\Pi$. The density is
    $$h_\theta^\sigma(u,v) = (1-\theta)\,\delta_{\{u+v \ne 1\}} + \theta\,\delta_{\{u+v=1\}}.$$
  • Cuadras–Augé. If $CA_\theta = \Pi^{1-\theta} M^{\theta}$ with $G_\theta(t) = -t^{-\theta}$, then
    $$CA_\theta^\sigma(u,v) = u - CA_\theta(u, 1-v).$$
    The density is
    $$h_\theta^\sigma(u,v) = (1-\theta)\max(u, 1-v)^{-\theta}\,\delta_{\{u+v \ne 1\}} + \theta\, u^{1-\theta}\,\delta_{\{u+v=1\}}.$$
From $u - u(1-v) = uv$ and $u - \min(u, 1-v) = \max(u+v-1, 0)$, this family $CA_\theta^\sigma$ reduces to $\Pi$ if $\theta = 0$ and to $W$ if $\theta = 1$.
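A small simulation check of the W-singular Fréchet copula above: drawing $V = 1-U$ with probability θ and independently otherwise reproduces $(1-\theta)\Pi + \theta W$. The construction and the numbers below are illustrative assumptions.

```python
# Simulation sketch for the W-singular Frechet copula (1-theta)*Pi + theta*W.
import numpy as np

rng = np.random.default_rng(1)
theta, n = 0.3, 200_000

U = rng.uniform(size=n)
V = np.where(rng.uniform(size=n) < theta, 1 - U, rng.uniform(size=n))

# Mass on the second diagonal: V was set to 1 - U with probability theta.
print(np.mean(V == 1 - U), theta)          # both approximately 0.3

u0, v0 = 0.6, 0.7
W = max(u0 + v0 - 1, 0.0)
print(np.mean((U <= u0) & (V <= v0)),      # empirical cdf at (0.6, 0.7)
      (1 - theta) * u0 * v0 + theta * W)   # (1-theta)*Pi + theta*W ~ 0.384
```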

6. Bivariate Singular Distributions

Let ( X , Y ) be a random vector with joint cdf H and univariate marginals F X , F Y . From Sklar’s theorem [1,4], there exists a copula C H such that H can be expressed as
H ( x , y ) = C H ( F X ( x ) , F Y ( y ) ) .
For example, considering the family (2), we have
$$H(x,y) = F_X(x)\, F_Y(y)\,\big[\,1 + G(1) - G(Q(F_X(x), F_Y(y)))\,\big],$$
where Q ( u , v ) is a quotient function.
The diagonal $u = v$ of $I^2$ now becomes the curve with implicit equation $F_X(x) = F_Y(y)$. The singularity is along this curve and the density is
$$h_\theta(x,y) = \big[\,1 + G_\theta(1) - G_\theta(Z(x,y)) - Z(x,y)\, G_\theta'(Z(x,y))\,\big]\,\delta_{\{F_X(x) \ne F_Y(y)\}} + F_X(x)^2\, G_\theta'(F_X(x))\,\delta_{\{F_X(x) = F_Y(y)\}},$$
where $Z(x,y) = \max(F_X(x), F_Y(y))$ and $G_\theta'(F_X(x))$ denotes the derivative of $G_\theta$ evaluated at $F_X(x)$.
Next, we introduce a non-linear singularity on a general curve φ , i.e., along the points with coordinates ( x , φ ( x ) ) .
Definition 9. 
We say that the bivariate cdf H θ is φ-singular if
$$g_\theta(x) = \lim_{y \to \varphi(x)^+} \frac{\partial H_\theta(x,y)}{\partial x} - \lim_{y \to \varphi(x)^-} \frac{\partial H_\theta(x,y)}{\partial x}$$
satisfies $g_\theta(x) \ne 0$.
Thus, if $C_H$ is M-singular, then $H = C_H(F_X, F_Y)$ is φ-singular, where $\varphi = F_Y^{-1}\circ F_X$.
There are more constructions of distributions with singular components.

6.1. Regression Family

An alternative construction of φ-singular distributions is as follows [24]. Suppose that $X$ and $Y$ have the same support $S$. Let $\varphi : S \to S$ be a real function with positive derivative $\varphi'(x) > 0$. Consider the inverse function $\psi = \varphi^{-1}$.
A family of distributions with singular parts is
$$H_\theta(x,y) = \theta\, F_X(\min(x, \psi(y))) + (1-\theta)\, F_X(x)\, J_\theta(y),$$
where, for $0 \le \theta < 1$,
$$J_\theta(y) = \big[\,F_Y(y) - \theta\, F_X(\psi(y))\,\big]/(1-\theta)$$
is a univariate cdf.
Proposition 10. 
The family (10) is φ-singular for $0 \le \theta < 1$ such that $J_\theta(y)$ is a cdf. The density with respect to the measure $\mu = \mu_2 + \mu_\varphi$, where $\mu_2$ is the Lebesgue measure on $S^2$ and $\mu_\varphi$ is the Lebesgue measure on the curve $y = \varphi(x)$, is given by
$$h_\theta(x,y) = f_X(x)\,\big[\,f_Y(y) - \theta\, \psi'(y)\, f_X(\psi(y))\,\big]\,\delta_{\{y \ne \varphi(x)\}} + \theta\, f_X(x)\,\delta_{\{y=\varphi(x)\}}.$$
Proof. 
The difference (9) is $\theta f_X(x)$ and the second-order mixed derivative is
$$f_X(x)\,\big[\,f_Y(y) - \theta\, \psi'(y)\, f_X(\psi(y))\,\big].$$
Note the stochastic independence if θ = 0 .
This family has an interesting property.
Proposition 11. 
Suppose that X and Y have absolutely continuous distributions and the expectations exist. The regression curve Y / X is y = m ( x ) , where
$$m(x) = m_Y + \theta\,\big[\,\varphi(x) - m_\varphi\,\big],$$
with $m_Y = E_Y(Y)$, $m_\varphi = E_X(\varphi(X))$. Hence, $m(x)$ is linear in $\varphi(x)$.
Proof. 
The regression curve is
$$E(Y \mid X = x) = \int_S y\,[\,h_\theta(x,y)/f_X(x)\,]\, dy = \int_S y\,\big[\,f_Y(y) - \theta\, \psi'(y)\, f_X(\psi(y))\,\big]\, dy + \int_S \theta\, y\,\delta_{\{y=\varphi(x)\}}\, dy = m_Y - \theta \int_S \varphi(x)\, f_X(x)\, dx + \theta\, \varphi(x).$$
We use the change of variable $y = \varphi(x)$ and agree [25] that $\int_S y\,\delta_{\{y=\varphi(x)\}}\, dy = \varphi(x)$.
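The linear-in-$\varphi(x)$ regression of Proposition 11 can be checked by simulating from the mixture form of (10): with probability θ take $Y = \varphi(X)$, otherwise draw $Y$ from $J_\theta$ independently of $X$. Every concrete choice below ($F_X = F_Y$ uniform on $[0,1]$, $\varphi(x) = \sqrt{x}$, $\theta = 0.4$) is an assumption made only so that $J_\theta$ is a genuine cdf.

```python
# Simulation sketch of the regression family (10) and of Proposition 11.
import numpy as np

rng = np.random.default_rng(2)
theta, n = 0.4, 400_000
phi = np.sqrt                       # phi(x) = sqrt(x), psi(y) = y**2 on S = [0, 1]

def sample_J(size):
    """Inverse-cdf sampling from J_theta(y) = (y - theta*y**2) / (1 - theta)."""
    u = rng.uniform(size=size)
    return (1 - np.sqrt(1 - 4 * theta * (1 - theta) * u)) / (2 * theta)

X = rng.uniform(size=n)
coincide = rng.uniform(size=n) < theta            # probability theta of Y = phi(X)
Y = np.where(coincide, phi(X), sample_J(n))

# Proposition 11: E[Y | X = x] = m_Y + theta*(phi(x) - m_phi),
# with m_Y = 1/2 and m_phi = E[sqrt(X)] = 2/3 for these choices.
x0, half_width = 0.49, 0.01
sel = np.abs(X - x0) < half_width
print(Y[sel].mean(), 0.5 + theta * (np.sqrt(x0) - 2 / 3))   # both approximately 0.513
```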
This regression family can be generated by the initial model (2), as a consequence of the self-generation property. Namely, consider $G(t) = -t$. Then, $G(1) = -1$ and $H_\theta$ can be expressed as
$$H_\theta(x,y) = F_X(x)\, F_Y(y)\,\big[\,1 - 1 + Q(F_X(x), F_Y(y))\,\big],$$
where
$$Q(F_X(x), F_Y(y)) = \frac{H_\theta(x,y)}{F_X(x)\, F_Y(y)}.$$

6.2. Another Extension

We may generalize the family (3) by replacing $\max(u,v)$ with $\max(\varphi(u), v)$, where $\varphi : I \to I$ is a function such that $\varphi(0) = 0$, $\varphi(1) = 1$. The extended family is
$$C_{\theta,\varphi}(u,v) = uv\,\big[\,1 + G_\theta(1) - G_\theta(\max(\varphi(u), v))\,\big].$$
Proposition 12. 
The family $C_{\theta,\varphi}$ is φ-singular. The probability density with respect to the measure $\mu = \mu_2 + \mu_\varphi$, where $\mu_\varphi$ is the Lebesgue measure on the curve $v = \varphi(u)$, is given by
$$\big[\,1 + G_\theta(1) - G_\theta(\max(\varphi(u), v)) - G_\theta'(\max(\varphi(u), v))\,\Phi(u,v)\,\big]\,\delta_{\{v \ne \varphi(u)\}} + u\,\varphi(u)\,\varphi'(u)\, G_\theta'(\varphi(u))\,\delta_{\{v=\varphi(u)\}},$$
where
$$\Phi(u,v) = \begin{cases} v & \text{if } v > \varphi(u), \\ u\,\varphi'(u) & \text{if } v < \varphi(u). \end{cases}$$
Proof. 
The partial derivative $\partial C_{\theta,\varphi}(u,v)/\partial u$ is
$$\begin{cases} v\,[\,1 + G_\theta(1)\,] - v\,G_\theta(v) & \text{if } v > \varphi(u), \\ v\,[\,1 + G_\theta(1)\,] - v\,G_\theta(\varphi(u)) - uv\,\varphi'(u)\, G_\theta'(\varphi(u)) & \text{if } v < \varphi(u). \end{cases}$$
The difference in the limits as $v \to \varphi(u)$ is $u\,\varphi(u)\,\varphi'(u)\, G_\theta'(\varphi(u))$. We similarly obtain the second-order partial derivative $\partial^2 C_{\theta,\varphi}/\partial u\,\partial v$.

7. An Application to Statistical Inference

Consider two independent binomial distributions $B(P, n)$, $B(P', n')$ and the null hypothesis $H_0 : P = P'$. If $\Omega = I^2$ and $\omega = D$ (the diagonal of $I^2$), then we accept $H_0$ if $(P, P') \in \omega$.
The classical approach interprets $P$, $P'$ as fixed parameters and uses the chi-squared test. The Bayesian approach interprets $P$, $P'$ as random quantities and postulates a prior probability distribution. The probability of $\Omega$ is 1. The null hypothesis $H_0$ is accepted if ω has probability 1. Since $\omega \subset \Omega$ is a set such that $\mu_2(\omega) = 0$, the prior distribution must concentrate mass on ω [26]. Indeed, the prior distribution must be M-singular in order to assign positive probability to ω. We accept the null hypothesis if the probability of ω is
$$\Pr[(P, P') \in \omega] = \int_0^1 f_\theta(t)\, dt = 1 \quad \text{for } \theta = 1.$$
This implies that $f_1(t) = 1$. Four suitable correlation functions $f_\theta(t)$ are
$$\text{a) } \theta, \qquad \text{b) } \theta\, t^{1-\theta}, \qquad \text{c) } \theta \exp[-(1-\theta)\,t], \qquad \text{d) } \theta\,[\,1 - (1-\theta)\,t\,].$$
Therefore, $(P, P')$ has the prior density
$$h_\theta(p,q) = K(p,q)\,\delta_{\{p \ne q\}} + f_\theta(p)\,\delta_{\{p=q\}},$$
with respect to the measure $\mu = \mu_2 + \mu_1$, where $\mu_1$, $\mu_2$ and
$$K(p,q) = 1 + G_\theta(1) - G_\theta(\max(p,q)) - \max(p,q)\, G_\theta'(\max(p,q))$$
are given in Theorem 1.
Once $f_\theta$ has been chosen, we construct $C_\theta$. Then, from statistical data, e.g., the frequencies $k$, $k'$ of the events with probabilities $P$, $P'$, the decision can be made using the Bayes factor [27,28]
$$B = \frac{\displaystyle\int_\omega L(k, k'; p, q)\, dC_\theta}{\displaystyle\int_{\Omega - \omega} L(k, k'; p, q)\, dC_\theta},$$
where $L$ is the likelihood function and $dC_\theta$ reduces to $dp$ in ω (where $p = q$), and to $K(p,q)\, dp\, dq$ in $\Omega - \omega$.
Note the use of averages (the Bayes factor) as opposed to the use of maxima (the likelihood ratio of the frequentist approach).
If the null hypothesis is $P = 1 - P'$, this proposal suggests working with W-singular copulas. Of course, all this can be generalized to other comparison tests, with data drawn from normal, exponential, logistic, etc., distributions.
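As a concrete illustration (not the author's computation), the Bayes factor above can be evaluated numerically for the Fréchet-type prior $f_\theta(t) = \theta$, for which Theorem 1 gives the prior density θ on the diagonal and $K(p,q) = 1-\theta$ off it. The observed frequencies below are assumed values, and $dC_\theta$ is taken here as $f_\theta(p)\,dp$ on ω and $K(p,q)\,dp\,dq$ on $\Omega-\omega$, which embeds the prior odds $\theta/(1-\theta)$ in $B$.

```python
# Numerical sketch of the Bayes factor of Section 7 with a Frechet-type
# M-singular prior (f_theta(t) = theta).  Illustrative values only.
from scipy.integrate import quad, dblquad
from scipy.stats import binom

theta = 0.5
k, n1, kp, n2 = 7, 20, 11, 25        # assumed frequencies k, k' and sample sizes

def lik(p, q):
    """Likelihood L(k, k'; p, q) for two independent binomial samples."""
    return binom.pmf(k, n1, p) * binom.pmf(kp, n2, q)

num, _ = quad(lambda p: theta * lik(p, p), 0, 1)                    # integral over omega (p = q)
den, _ = dblquad(lambda q, p: (1 - theta) * lik(p, q), 0, 1, 0, 1)  # integral over Omega - omega

print(num / den)     # B > 1 supports the null hypothesis P = P'
```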

8. Discussion, Conclusions and Future Work

Starting from a correlation function (dependence generator), we studied several methods for constructing copulas with singular parts. The singularity is defined by a curve with equation $y = \varphi(x)$ carrying positive probability. If φ is linear, we obtain singularities related to the Fréchet–Hoeffding bounds $M$ and $W$. The function φ can be non-linear; we study a case in which φ is increasing, and the decreasing and general cases can be approached by using the extensions proposed in [24]. We obtain the probability density related to the singular part, a function that defines the continuous set of canonical correlations. This set is uncountable, in contrast with the countable sequence arising from Mercer's theorem [25].
A uniparametric procedure follows from the direct application of the above models. For instance, in dimension $d \ge 2$, we may consider
$$M = \min(u_1, \ldots, u_d), \qquad \Pi = u_1 \times \cdots \times u_d.$$
Then, $F_\theta = \theta M + (1-\theta)\,\Pi$ and $CA_\theta = M^{\theta}\,\Pi^{1-\theta}$ are $d$-dimensional copulas with singular parts. See an application in [29].
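A short simulation can illustrate the singular part of the $d$-dimensional $CA_\theta$; the common-shock construction used earlier for the bivariate case extends coordinatewise, and that extension (together with the chosen θ and $d$) is an assumption of this sketch.

```python
# Simulation sketch (illustrative) of the d-dimensional Cuadras-Auge copula
# M^theta * Pi^(1-theta), via a common exponential shock shared by all coordinates.
import numpy as np

rng = np.random.default_rng(3)
theta, d, n = 0.5, 3, 200_000

E  = rng.exponential(1 / (1 - theta), size=(n, d))   # individual shocks
E0 = rng.exponential(1 / theta, size=(n, 1))         # common shock
U  = np.exp(-np.minimum(E, E0))                      # uniform margins

print(np.mean(U[:, 0] == U[:, 1]), theta / (2 - theta))   # pairwise coincidences, ~1/3
print(np.mean(np.all(U == U[:, [0]], axis=1)))            # positive mass where all coordinates tie
```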
More generally, we can naturally define the family
$$C_\theta = \Pi + \Pi \int_Q^1 \frac{f_\theta(t)}{t^2}\, dt,$$
where $f_\theta$ is a correlation function and $Q$ is a $d$-dimensional quotient function. For example, $Q = \Pi/M$. See [15,24] for other multivariate families of distributions with singular parts.
An application to Bayesian inference is discussed, showing that the singularity of the prior distribution is implicit in some tests. This approach to testing the hypothesis $P = P'$ justifies the M-singular copulas. If the null hypothesis is $P = 1 - P'$, we should use W-singular copulas.
The properties obtained via integral operators and eigenanalysis on two kernels are useful for symmetric copulas. It is an open question to find the additional conditions for the correlation and quotient functions to ensure that these models provide a copula, and to perform a generalization to non-symmetric copulas [30]. This challenge may be solved by functional singular value decomposition.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

I am indebted to three anonymous reviewers for useful comments that improved this article.

Conflicts of Interest

The author declares no conflicts of interest.

Appendix A

The following list of references by topics may be helpful.
Definition of copula: [1] p. 10, [2] p. 12, [4] p. 10.
Families of copulas: [2] chap. 5, [4] chap. 4, [9,15,16].
Absolutely continuous and singular parts: [17] p. 59, [18] p. 247.
Canonical analysis: [3] p. 49, [10] p. 108, [13] p. 582.
Diagonal expansion: [8] p. 41, [9,10] p. 248, [12] chap. 6.
Sklar’s theorem: [1] p. 42, [2] p. 13, [4] p. 17.
Distribution theory: [20] chap. 2.
Radon–Nikodym theorem: [17] p. 63, [18] p. 193, [21] p. 196, [22].
Mercer’s theorem: [25] p. 271.
Dirac delta function: [25] p. 303.
Bayesian inference, Bayes factor: [26,27] p. 153, [28] p. 30.

References

  1. Durante, F.; Sempi, C. Principles of Copula Theory; CRC Press: Boca Raton, FL, USA; Chapman and Hall: London, UK; New York, NY, USA, 2016. [Google Scholar]
  2. Joe, H. Multivariate Models and Dependence Concepts; Chapman and Hall: London, UK, 1997. [Google Scholar]
  3. Mardia, K.V. Families of Bivariate Distributions; Charles Griffin: London, UK, 1970. [Google Scholar]
  4. Nelsen, R.B. An Introduction to Copulas, 2nd ed.; Springer: New York, NY, USA, 2006. [Google Scholar]
  5. Drouet Mari, D.; Kotz, S. Correlation and Dependence; Imperial College Press: London, UK, 2004. [Google Scholar]
  6. Cherubini, U.; Gobbi, F.; Mulinacci, S.; Romagnoli, S. Dynamic Copula Methods in Finance; Wiley: New York, NY, USA, 2012. [Google Scholar]
  7. van der Goorbergh, R.W.J.; Genest, C.; Werker, B.J.M. Bivariate option pricing using dynamic copula models. Insur. Math. Econ. 2005, 37, 101–114. [Google Scholar] [CrossRef]
  8. Hutchinson, T.P.; Lai, C.D. The Engineering Statistician’s Guide to Continuous Bivariate Distributions; Rumsby Scientific Pub.: Adelaide, Australia, 1991. [Google Scholar]
  9. Cuadras, C.M. Contributions to the diagonal expansion of a bivariate copula with continuous extensions. J. Multivar. Anal. 2015, 139, 28–44. [Google Scholar] [CrossRef]
  10. Greenacre, M.J. Theory and Applications of Correspondence Analysis; Academic Press: London, UK, 1983. [Google Scholar]
  11. Lancaster, H.O. The structure of bivariate distributions. Ann. Math. Stat. 1958, 29, 719–736. [Google Scholar] [CrossRef]
  12. Lancaster, H.O. The Chi-squared Distribution; Wiley: New York, NY, USA, 1969. [Google Scholar]
  13. Rao, C.R. Linear Statistical Inference and Its Applications; Wiley: New York, NY, USA, 1973. [Google Scholar]
  14. Cuadras, C.M. Correspondence analysis and diagonal expansions in terms of distribution functions. J. Stat. Plan. Inference 2002, 103, 137–150. [Google Scholar] [CrossRef]
  15. Cuadras, C.M.; Augé, J. A continuous general multivariate distribution and its properties. Commun. Stat.-Theory Methods 1981, A10, 339–353. [Google Scholar] [CrossRef]
  16. Durante, F. A new family of symmetric bivariate copulas. C. R. Acad. Sci. Paris Ser. I 2007, 344, 195–198. [Google Scholar] [CrossRef]
  17. Ash, R.B. Real Analysis and Probability; Academic Press: New York, NY, USA, 1972. [Google Scholar]
  18. Chow, Y.S.; Teicher, H. Probability Theory: Independence, Interchangeability, Martingales; Springer: New York, NY, USA, 1978. [Google Scholar]
  19. Durante, F.; Fernández Sánchez, J.; Sempi, C. A note on the notion of singular copula. Fuzzy Sets Syst. 2013, 211, 120–122. [Google Scholar] [CrossRef]
  20. Schwartz, L. Théorie des Distributions; Hermann: Paris, France, 1957. [Google Scholar]
  21. Munroe, M.E. Introduction to Measure and Integration; Addison-Wesley: Reading, MA, USA, 1973. [Google Scholar]
  22. Ruiz-Rivas, C.; Cuadras, C.M. Inference properties of a one-parameter curved exponential family of distributions with given marginals. J. Multivar. Anal. 1988, 27, 447–456. [Google Scholar] [CrossRef]
  23. Cuadras, C.M. On the covariance between functions. J. Multivar. Anal. 2002, 81, 19–27. [Google Scholar] [CrossRef]
  24. Cuadras, C.M. Distributions with Given Marginals and Given Regression Curve; Lecture Notes–Monograph Series; Institute of Mathematical Statistics: Hayward, CA, USA, 1996; Volume 28, pp. 76–83. [Google Scholar]
  25. Thomas, J.B. An Introduction to Applied Probability and Random Processes; Wiley: New York, NY, USA, 1971. [Google Scholar]
  26. Casella, G.; Moreno, E. Assessing robustness of intrinsic tests of independence in two-way contingency tables. J. Am. Stat. Assoc. 2009, 104, 1261–1271. [Google Scholar] [CrossRef]
  27. Garthwaite, P.H.; Jolliffe, I.; Jones, B. Statistical Inference; Prentice Hall: London, UK, 1995. [Google Scholar]
  28. Girón, F.J. Bayesian Testing of Statistical Hypotheses; Royal Academy of Sciences, Ed.; Arguval: Málaga, Spain, 2021. [Google Scholar]
  29. Pérez, A.; Prieto-Alaiz, M.; Chamizo, M.; Liebscher, E.; Úbeda-Flores, M. Nonparametric estimation of the multivariate Spearman’s footrule: A further discussion. Fuzzy Sets Syst. 2023, 467, 108489. [Google Scholar] [CrossRef]
  30. Marshall, A.W. Copulas, Marginals, and Joint Distributions; Lecture Notes–Monograph Series; Institute of Mathematical Statistics: Hayward, CA, USA, 1996; Volume 28, pp. 213–222. [Google Scholar]
