Article

Continuous Exchangeable Markov Chains, Idempotent and 1-Dependent Copulas

Department of Mathematics, College of Liberal Arts, University of Mississippi, University, MS 38677, USA
Mathematics 2025, 13(12), 2034; https://doi.org/10.3390/math13122034
Submission received: 20 May 2025 / Revised: 16 June 2025 / Accepted: 17 June 2025 / Published: 19 June 2025

Abstract
New copula families are constructed based on orthogonality in $L^2(0,1)$. Subclasses of idempotent copulas with square integrable densities are derived. It is shown that these copulas generate exchangeable Markov chains that behave as independent and identically distributed random variables conditionally on the initial variable. We prove that the extracted family of copulas is the only set of symmetric idempotent copulas with square integrable densities. We extend these copula families to asymmetric copulas with square integrable densities having special dependence properties. One of our extensions includes the Farlie–Gumbel–Morgenstern (FGM) copula family. The mixing properties of Markov chains generated by these copulas are established. Spearman's correlation coefficient $\rho_S$ is provided for each of these copula families. Some graphs are also provided to illustrate the properties of the copula densities.
MSC:
62G08; 62M02; 60J35

1. Introduction

This work is concerned mostly with the question of the existence of bivariate copulas with square integrable densities that are idempotent. The requirement is to identify Markov chains that have a stationary joint distribution for any two of their variables. For such Markov chains, any missing observation can be ignored without consequence in the theoretical analysis, as $(X_1,X_2,X_3)$ has the same distribution as $(X_k,X_j,X_m)$ for any $k<j<m$. To answer this question, we wrap it into the larger question involving the study of positive definite kernel operators. This is possible because any copula induces a kernel operator that is a Hilbert–Schmidt operator when the density of the copula is symmetric and square integrable. Several new copula constructions are obtained, which can be identified as perturbations of the independence copula as defined in Longla (2022) [1], Longla (2022b) [2], or Komornik et al. (2017) [3].
The term copula is used in this paper in place of "bivariate copula". When needed, the term "n-copula" is used to indicate a copula with more than two variables. A copula is a function C defined on $[0,1]^2$ such that $C(x,0)=C(0,x)=0$, $C(1,x)=C(x,1)=x$, and for any $0\le x_1\le x_2\le 1$, $0\le y_1\le y_2\le 1$, $C(x_2,y_2)+C(x_1,y_1)-C(x_1,y_2)-C(x_2,y_1)\ge 0$ (see Nelsen (2006) [4] or Durante and Sempi (2016) [5]).
Many authors have worked on building copulas with various properties over the past decades. Some constructions can be found in Nelsen (2006) [4], Longla et al. (2022) [1], Longla et al. (2022b) [2], and the references therein. Some more recent constructions can be found in Chesneau (2021) [6] and (2023) [7], where trigonometric copulas were studied and some extensions of the FGM copula were constructed. These constructions are quite far from the main considerations of this paper, which is about mixing properties, Markov chains, and exchangeability. The copulas constructed here do not in any form represent a copy or an extension of their work.
One of the main points of this paper is that the constructed copula families have a mixing structure that is known; the copula of any two variables along the Markov chain that they generate belongs to the copula family of interest, and it is possible to explicitly find the joint distribution of any two variables of the chain. This makes statistical analysis easier and often avoids computational issues around approximations of the variances of estimators. One extension includes the Farlie–Gumbel–Morgenstern copula family. These families are cases of Type I Longla copulas, introduced in Longla (2024) [8].
A central role in this investigation is played by the fold product of copulas, defined in Darsow et al. (1992) [9] by $C^1(u,v)=C(u,v)$ and, for $n>1$,

$$C^n(u,v)=C^{n-1}*C(u,v)=\int_0^1 C^{n-1}_{,2}(u,t)\,C_{,1}(t,v)\,dt,$$
where $C_{,i}$ denotes the derivative of the copula with respect to its $i$th variable. Recall that the fold product as defined for $n=2$ is the joint distribution of $(X_1,X_3)$ if $(X_1,X_2,X_3)$ is a stationary Markov chain with copula C and the uniform distribution as the marginal distribution (see Darsow et al. (1992) [9]). Thus, for any copula C, the set $\{C^n,\ n\in\mathbb{N},\ n\ge 1\}$ is a class of copulas closed under the fold product (meaning that the fold product of any two elements of this set is an element of the set). For most of the copula families considered in this work, this set has a single accumulation point $\Pi$, defined by $\Pi(u,v)=uv$; that is, the independence copula. When this is the limit of powers of a copula, it implies that observations of the Markov chain in the long run tend to be independent from the first observation. It is important to point out some examples of singular idempotent copulas or, in general, idempotent copulas that are not fully defined by their absolutely continuous parts. We have $M(u,v)$ and $C(u,v)=\frac{1}{2}W(u,v)+\frac{1}{2}M(u,v)$, where $W(u,v)=\max(u+v-1,0)$ and $M(u,v)=\min(u,v)$. The copula M is that of perfect positive correlation, while W is that of perfect negative correlation (see Nelsen (2006) [4]).
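The fold product and the convergence of fold powers toward $\Pi$ can be illustrated numerically. The following sketch is not part of the paper; the grid size and the FGM example are arbitrary choices. It discretizes copula densities on a midpoint grid, where the fold product of densities becomes a scaled matrix product.

```python
import numpy as np

# Illustrative sketch (not from the paper): if C1, C2 have densities
# c1, c2, the density of the fold product C1*C2 is
# int_0^1 c1(u,t) c2(t,v) dt.  On an N-cell midpoint grid with cell
# width h = 1/N, density matrices therefore compose as (A @ B) * h.
N = 400
h = 1.0 / N
t = (np.arange(N) + 0.5) * h              # cell midpoints

def fold(A, B):
    """Density of the fold product of two copula densities."""
    return A @ B * h

# FGM density c(u,v) = 1 + theta*(1-2u)(1-2v): each fold multiplies
# the non-trivial eigenvalue by theta/3, so C^n -> Pi (density 1).
theta = 0.9
c = 1.0 + theta * np.outer(1 - 2 * t, 1 - 2 * t)

power = c.copy()
for _ in range(5):                        # power is now the density of C^6
    power = fold(power, c)
print(np.max(np.abs(power - 1.0)))        # small: fold powers approach Pi
```

The same discretization is reused below for other densities; it is only a sanity check, not a proof device.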
Note that by Sklar's Theorem (see Sklar 1959 [10]), a copula can be obtained from any bivariate distribution H by scaling out the effect of the marginal distributions $F(x)$ and $G(y)$ via the inversion formula $C(u,v)=H(F^{-1}(u),G^{-1}(v))$. This copula is not unique when the variables are not continuous. When the variables are continuous, there is a one-to-one relationship between joint distributions and pairs of copulas and marginal distributions. This means that for continuous random variables, any set of copula and marginal distributions is equivalent to a joint distribution function.
In general, the copula $C^n$ is the joint distribution of $(X_0,X_n)$ when $X_0,\dots,X_n$ is a stationary Markov chain generated by C and the Uniform(0,1) distribution. These notions are important for the study of mixing properties as well as association and asymptotic distributions of averages of functions of Markov chains (see Longla et al. (2022) [1] and Longla (2024) [8]).
Another notion mentioned earlier is that of an idempotent Markov chain. An idempotent Markov chain has a copula C satisfying $C^2(u,v)=C(u,v)$ for all $(u,v)\in[0,1]^2$; such copulas are called idempotent copulas. For the purposes of this work, we impose symmetry and square integrability. As will be seen later, the independence copula is the limit of $C^n(u,v)$ for all copulas of the form $C(u,v)=uv+\lambda\Phi(u)\Phi(v)$ with $|\lambda|<1$. In this paper, we also show that there exists only one class of absolutely continuous idempotent copulas with square integrable density.
The idempotent copulas analyzed here have been a topic of research for many scholars, among whom Darsow and Olsen (2010) [11] provided a characterization of idempotent 2-copulas through the study of invariant sets. They additionally provided general conditions and constructed some examples. Among other things, they showed that the class of idempotent copulas is a lattice under the partial ordering $E\preceq F$ if $E*F=F*E=E$. Darsow and Olsen (2010) [11] showed that for any copula C there exists an idempotent copula $E_C$ which annihilates C (meaning that $E_C*C(u,v)=C*E_C(u,v)=E_C(u,v)$) and such that $E(u,v)\preceq E_C(u,v)$ for any other idempotent annihilator $E(u,v)$ of $C(u,v)$. For the idempotent copulas constructed in this work, it turns out that $E_C(u,v)=C(u,v)$, which provides a way of characterizing idempotent copulas. For any other copula defined by Formula (3) that is not idempotent, $E_C(u,v)=\Pi(u,v)$, where $E_C(u,v)$ is
$$E_C(u,v)=\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n}C^i(u,v).$$
Note that when $\lim_{n\to\infty}C^n(u,v)$ exists and equals $C_0(u,v)$, it holds that $E_C(u,v)=C_0(u,v)$. However, it is sometimes the case that $E_C(u,v)$ exists when $\lim_{n\to\infty}C^n(u,v)$ does not, as for the copula $C_2$ defined later in this paper. Because the fold powers $C_2^n(u,v)$ alternate between $C_1(u,v)$ (even n) and $C_2(u,v)$ (odd n), we obtain $E_{C_2}(u,v)=0.5\,C_1(u,v)+0.5\,C_2(u,v)=\Pi(u,v)$.
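This alternation and the Cesàro limit can be checked numerically. The sketch below is illustrative only (grid size arbitrary); it uses the densities of $C_1$ and $C_2$, which are defined later in the paper, and the same midpoint discretization of the fold product as above.

```python
import numpy as np

# Illustrative sketch (not from the paper): the fold powers of C_2
# alternate between C_2 and C_1, so lim C_2^n does not exist, yet the
# Cesaro average of the powers is the independence density 1.
N = 400
h = 1.0 / N
t = (np.arange(N) + 0.5) * h
s = np.sign(2 * t - 1)

c1 = 1.0 + np.outer(s, s)                 # density of C_1
c2 = 1.0 - np.outer(s, s)                 # density of C_2

def fold(A, B):
    # density of the fold product on the midpoint grid
    return A @ B * h

print(np.max(np.abs(fold(c2, c2) - c1)))            # ~0: C_2 * C_2 = C_1
print(np.max(np.abs(fold(fold(c2, c2), c2) - c2)))  # ~0: C_2^3 = C_2
print(np.max(np.abs((c1 + c2) / 2 - 1.0)))          # 0: Cesaro average is Pi
```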
Zabell (1995) [12] provided a characterization of probability mass functions for finite or countable state-space Markov chains. We have not found a similar characterization for continuous state-space Markov chains. Among other things, the present work fills this gap. For the special case of square integrable densities of copulas, we use the existence of eigenvalues to provide a different view of idempotent copulas while studying the mixing properties of such symmetric copulas and the reversible Markov chains that they generate.
This work can be combined with the results of Blum et al. (1958) [13] to establish asymptotic distributions of estimators in the case of exchangeable Markov chains. Blum et al. (1958) [13] showed that for any mean-zero exchangeable sequence with Gaussian joint distributions for finite subsets, correlation $\rho>0$, and variance 1, the central limit theorem holds in the form $\bar{X}\to N(0,\rho)$ as $n\to\infty$ (and $\sqrt{n}\,\bar{X}\to N(0,1)$ when $\rho=0$). This was stated as a corollary of another result stating that for a mean-zero exchangeable sequence with variance 1, the central limit theorem holds if and only if $E(X_1X_2)=E[(X_1^2-1)(X_2^2-1)]=0$. Taylor et al. (1985) [14] provided more results on central limit theorems for mean-zero exchangeable random sequences, including $\sqrt{n}\,\bar{X}\to N(0,1)$ (only for exchangeable sequences for which the correlation is $\rho=0$ and $E(X_1^2X_2^2)=1=E(X_1^2)$).
The rest of this paper is organized as follows. In Section 2, we introduce symmetric copulas with square integrable densities. In Section 3, we approach idempotent copulas with square integrable densities. A characterization is provided and used to build new sets of copula families that extend existing ones from Longla (2024) [8]. We show that idempotent copulas with square integrable densities generate exchangeable Markov chains; conditionally on the initial variable, these chains behave like sequences of independent observations. In Section 4, we extend copula families to n-dimensional copulas that include the Farlie–Gumbel–Morgenstern copula family. Here, we also provide Spearman's $\rho_S$ for each of the copula families along with some useful remarks and graphs. Section 5 concerns the central limit theorem for exchangeable Markov chains and its consequences. Section 6 presents some simulation studies and related large sample theory. Finally, Section 7 provides conclusions and comments.

2. Symmetric Square Integrable Densities

This work is concerned mainly with absolutely continuous symmetric bivariate copulas C for which the density function c exists and satisfies

$$C(u,v)=\int_0^u\int_0^v c(s,t)\,dt\,ds.\tag{1}$$
Formula (1) follows from the fact that the singular part of the copula vanishes for absolutely continuous copulas (see Darsow et al. (1992) [9]). From the definition of a copula, when the density exists, it is a positive bivariate function on I (a subset of $[0,1]^2$ of Lebesgue measure 1). For more on copulas, see Darsow et al. (1992) [9]. We say that a copula C is symmetric if and only if $C(u,v)=C(v,u)$ for all $(u,v)\in[0,1]^2$.
Let $\{1,\varphi_k(x),\ k\in\mathbb{N}\}$ be an orthonormal basis made of eigenfunctions of the operator K. Here, $K\varphi_k(x)=\lambda_k\varphi_k(x)$, where $\lambda_k$ is called the eigenvalue of K associated with the eigenfunction $\varphi_k$. Based on this, for a symmetric copula with square integrable density (see also Beare (2010) [15], Longla (2024) [8]) and for a converging sequence $\lambda_k$, the copula density can be written as

$$c(u,v)=1+\sum_{k=1}^{\infty}\lambda_k\,\varphi_k(u)\varphi_k(v).\tag{2}$$
When the density of the copula is a positive semi-definite kernel, all eigenvalues are non-negative. This is the case for idempotent copulas, as eigenvalues of the Hilbert–Schmidt operator associated with $C^2$ are squares of the eigenvalues of the operator associated with C. Therefore, for an idempotent copula, $\lambda_k^2=\lambda_k$, and the eigenvalues can take only two possible values, 0 and 1.
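The relation between the eigenvalues of a copula and those of its fold square can be observed on a discretized kernel. The sketch below is an illustration only; the grid size and the FGM density with $\theta=0.8$ are arbitrary choices, and the eigenvalue $\theta/3$ of the FGM operator is a known fact used here as a reference value.

```python
import numpy as np

# Illustrative sketch (not from the paper): discretize the kernel
# operator Kf(u) = int c(u,t) f(t) dt as the matrix T = c * h.  For the
# FGM density, the nonzero eigenvalues are 1 (constant eigenfunction)
# and theta/3 (eigenfunction 1-2x); the operator of C*C has their squares.
N = 500
h = 1.0 / N
t = (np.arange(N) + 0.5) * h
theta = 0.8
c = 1.0 + theta * np.outer(1 - 2 * t, 1 - 2 * t)

T = c * h
eig = np.sort(np.linalg.eigvals(T).real)[::-1]
print(eig[:2])                 # approximately [1, theta/3]

T2 = (c @ c * h) * h           # operator of the fold product C*C
eig2 = np.sort(np.linalg.eigvals(T2).real)[::-1]
print(eig2[:2])                # approximately [1, (theta/3)**2]
```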

Perturbations of the Copula Π and Mixing

The above representation (2) is a form of perturbation of copulas. We define this notion in order to provide clarification. The notion of perturbation has recently been used in many papers to address modifications of the copula $\Pi$ in order to improve estimation results by introducing some level of dependence. Various kinds of perturbations have been considered in Durante et al. (2013) [16], Komornik et al. (2017) [3], Longla et al. (2022) [1], Longla et al. (2022c) [17], and the references therein. This work relates to perturbations of the kind $C_D(u,v)=C(u,v)+D(u,v)$, where C is the copula that is being modified by a function D. Several copula families were built using this method in Longla (2024) [8]. Here, we obtain results for the case when $C(u,v)=\Pi(u,v)$. In Longla et al. (2022) [1] and Longla et al. (2022c) [17], we considered the mixing structure of Markov chains generated by perturbations with $D(u,v)=\theta(\Pi(u,v)-C(u,v))$ and $D(u,v)=\theta(M(u,v)-C(u,v))$. Mixing simplifies the study of parameter estimators and testing for problems related to copulas. In this work, $I=[0,1]$. The $\psi$-mixing coefficient for copula-based Markov chains with continuous marginals is defined as
$$\psi_n=\sup_{\mu(A)\mu(B)>0,\ A,B\subset I}\frac{|P^n(A\times B)-\mu(A)\mu(B)|}{\mu(A)\mu(B)},$$

where $P^n$ is the probability measure associated with $C^n$ and $\mu(A)$ is the Lebesgue measure of A. We say that the Markov chain is $\psi$-mixing when $\psi_n\to 0$. Spearman's correlation is $\rho_S(C)=12\int_0^1\int_0^1 C(u,v)\,du\,dv-3$.
In Longla (2024) [8], absolutely continuous copulas with square integrable densities were characterized using the formula

$$C(u,v)=uv+\sum_{k=2}^{\infty}\lambda_k\int_0^u\varphi_k(x)\,dx\int_0^v\varphi_k(y)\,dy,\tag{3}$$
where $\int_0^1\varphi_k(x)\,dx=0=\int_0^1\varphi_k^2(x)\,dx-1=\int_0^1\varphi_k(x)\varphi_j(x)\,dx$ for every $k\neq j$, under the conditions

$$\sum_{k=2}^{\infty}\lambda_k^2<\infty,\qquad 1+\sum_{k=2}^{\infty}\lambda_k\alpha_k\ge 0,\qquad \alpha_k=\begin{cases}\max\varphi_k^2 & \text{if }\lambda_k<0,\\ \min\varphi_k\,\max\varphi_k & \text{if }\lambda_k>0,\end{cases}$$

where $\max\varphi_k^2$ is the essential supremum of $\varphi_k^2$ and $\min\varphi_k$ is the essential infimum of $\varphi_k$. We follow this approach because for some versions of $\varphi_k$ there can be points at which $c(u,v)$ is not positive. We call a version of $\varphi_k$ any function that differs from $\varphi_k$ on a set of Lebesgue measure 0.

3. Symmetric Idempotent Copulas

Recall that an idempotent copula is defined by $C*C(u,v)=C(u,v)$, where $*$ is the fold product of copulas. Based on our representation of absolutely continuous copulas with square integrable densities, if C is an absolutely continuous copula with square integrable density, then for some positive integer s and a set of orthonormal functions $\varphi_k(x)$ in $L^2[0,1]$, the following proposition holds for

$$\Phi_k(u)=\int_0^u\varphi_k(t)\,dt,\qquad C(u,v)=uv+\sum_{k=1}^{s}\Phi_k(u)\Phi_k(v).\tag{5}$$
Proposition 1.
An absolutely continuous copula with square integrable density $c(u,v)$ is idempotent when its eigenvalue 1 has finite multiplicity and it can be represented by (5) for some functions $\varphi_k(x)$.
Examples of idempotent copulas and extensions are built using various mean-zero functions on $[0,1]$. We consider such copula families in this section.

3.1. Examples of Idempotent Copulas

Starting from our earlier comments, we note that clearly not all idempotent copulas have densities that allow for their full identification. Earlier, we identified the following idempotent copulas: $\Pi$, M, and $0.5\,W+0.5\,M$. Now, we formulate the following result.
Theorem 1.
For any copula C, if $\lim_{n\to\infty}C^n(u,v)=C_0(u,v)$, then $C_0$ is idempotent and commutes with C.
Proof. 
Assume that $\lim_{n\to\infty}C^n(u,v)=C_0(u,v)$. We have $C^n*C_0(u,v)=C_0(u,v)$. These equalities are due to the fact that the limit of the fold product does not change when we fix the first n terms. This argument is identical to the proof of Theorem 2.3 in Darsow et al. (1992) [9]. It is also consistent with the formula of the annihilator presented by Darsow and Olsen (2010) [11]. Applying Theorem 2.3 of Darsow et al. (1992) [9], we conclude that $C_0(u,v)=C_0^2(u,v)$; moreover, $C*C_0=C_0=C_0*C$. □
In search of idempotent copulas, we consider Formula (5) with a single piecewise constant function. Note that the copula (5) has a piecewise constant density function if and only if $\varphi$ is piecewise constant. The following copula family is obtained considering $\varphi(x)=-a\,I(0\le x\le c)+b\,I(c<x\le 1)$ with $a,b>0$ (see Longla (2024) [8]). Simple computations show that the only possible values of a and b satisfy the equations $ab=1$ and $c(b^2+1)=b^2$. These conditions come from the fact that the function has its integral equal to 0 and that its square integrates to 1. The copula $C_\alpha$ for $a=\sqrt{\alpha}=1/b$, $c=b^2/(1+b^2)$ is

$$C_\alpha(u,v)=\begin{cases}(1+\alpha)uv, & \text{if } 0\le u,v\le \frac{1}{\alpha+1},\\[2pt] u, & \text{if } 0\le u\le \frac{1}{\alpha+1}<v\le 1,\\[2pt] v, & \text{if } 0\le v\le \frac{1}{\alpha+1}<u\le 1,\\[2pt] uv+\frac{1}{\alpha}(1-u)(1-v), & \text{if } \frac{1}{\alpha+1}<u,v\le 1.\end{cases}\tag{6}$$
The copula in (6) has a maximal coefficient of correlation equal to 1 and $\rho_S(C_\alpha)=\frac{3\alpha}{(1+\alpha)^2}$ for $\alpha>0$. Thus, this is an example of an idempotent copula with less than full support. This copula satisfies $C_0(u,v)=C_\infty(u,v)=\Pi(u,v)$, where $C_0$ and $C_\infty$ are understood as the limits of $C_\alpha$ as $\alpha\to 0$ and $\alpha\to\infty$, respectively.
Remark 1.
For Markov chains generated by $C_\alpha$ with the Uniform(0,1) marginal distribution, the sets $A=(1/(1+\alpha),1]$ and $B=[0,1/(1+\alpha)]$ are absorbing sets. Therefore, the Markov chains are not recurrent but rather aperiodic. This means that when a Markov chain generated by (6) starts in one of these sets, it does not leave it for the other set. The only chance to visit the other set is to reach the point $x=\frac{1}{\alpha+1}$, and this is an event with probability 0. Therefore, a single realization of a Markov chain generated by this copula and the uniform distribution is a set of independent values from one of the sets A and B. Thus, repeated runs of the Markov chain will produce separate random samples from A with probability $\frac{\alpha}{\alpha+1}$ and from B with probability $\frac{1}{\alpha+1}$.
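The behavior described in Remark 1 can be simulated directly. The sketch below is illustrative ($\alpha$, the seed, and the run length are arbitrary choices); it uses the fact that the density of $C_\alpha$ is constant on each diagonal block, so the conditional law of $X_{n+1}$ given $X_n=x$ is uniform on whichever of the two sets contains x.

```python
import numpy as np

# Simulation sketch of Remark 1 (illustrative): under C_alpha, the
# conditional law of X_{n+1} given X_n = x is Uniform(B) when x is in
# B = [0, 1/(1+alpha)] and Uniform(A) when x is in A = (1/(1+alpha), 1],
# so each run stays in its starting set and is i.i.d. uniform there.
rng = np.random.default_rng(0)
alpha = 2.0
cut = 1.0 / (1.0 + alpha)

def run_chain(n):
    x = np.empty(n)
    x[0] = rng.uniform()                  # Uniform(0,1) start
    for i in range(1, n):
        if x[i - 1] <= cut:               # absorbed in B
            x[i] = rng.uniform(0.0, cut)
        else:                             # absorbed in A
            x[i] = rng.uniform(cut, 1.0)
    return x

chain = run_chain(10_000)
in_A = chain[1:] > cut
print(in_A.all() or (~in_A).all())        # True: the chain never switches sets
```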
For $\alpha=1$, we denote the copula $C_1$. This copula is also obtained using $\varphi(x)=\operatorname{sign}(2x-1)$ in our characterization of idempotent copulas with $s=1$. This copula has a density that is uniform on $[0,0.5]^2\cup[0.5,1]^2$ and has $\rho_S(C_1)=3/4$. For a large sample of Markov chains generated by this copula and the Uniform(0,1) marginal distribution, approximately the same number of chains take values in $(0,0.5)$ and $(0.5,1)$. A related copula is $C_2$, defined by the density $c_2(u,v)=1-\operatorname{sign}(2u-1)\cdot\operatorname{sign}(2v-1)$. It is obvious that $C_2^2(u,v)=C_1(u,v)$. $C_2$ is not idempotent but is rather recurrent and periodic, with $\rho_S(C_2)=-3/4$. Its density is uniform on $([0.5,1]\times[0,0.5])\cup([0,0.5]\times[0.5,1])$. Any Markov chain generated by this copula has values that bounce back and forth between the two regions $[0,0.5]$ and $[0.5,1]$. Unlike $C_1$, $C_2$ generates Markov chains that can take values all over the set $[0,1]$:
$$C_2(u,v)=\begin{cases}0, & \text{if } 0\le u,v<0.5,\\ (2v-1)u, & \text{if } 0\le u<0.5\le v\le 1,\\ (2u-1)v, & \text{if } 0\le v<0.5\le u\le 1,\\ u+v-1, & \text{if } 0.5\le u,v\le 1.\end{cases}$$
Theorem 2.
For any $\alpha>0$, let A be a union of sub-intervals of $[0,1]$ of measure $\frac{1}{1+\alpha}$ and let B be its complement in $[0,1]$. Then,

$$c(u,v)=(1+\alpha)\,I(u,v\in A)+\frac{1+\alpha}{\alpha}\,I(u,v\in B)\tag{8}$$

is the density of an idempotent copula.
The proof of this result is based on simple computations using $\varphi(x)=\sqrt{\alpha}\,I(x\in A)-\frac{1}{\sqrt{\alpha}}\,I(x\in B)$. The condition that makes $c(u,v)$ a copula density is built into the construction ($\min\varphi\cdot\max\varphi=-1$). An example of such a copula density uses $A=\left[0,\frac{1}{2(1+\alpha)}\right)\cup\left(\frac{1}{1+\alpha},\frac{\alpha+2}{2(1+\alpha)}\right]$ and $B=\left[\frac{1}{2(1+\alpha)},\frac{1}{1+\alpha}\right]\cup\left(\frac{\alpha+2}{2(1+\alpha)},1\right]$.
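A quick numerical check of Theorem 2 can be run on a midpoint grid. This sketch is illustrative only: the grid size, $\alpha=2$, and the particular union of intervals A are arbitrary choices, and the value $(1+\alpha)/\alpha$ on $B\times B$ is the one forced by the uniform-margins requirement.

```python
import numpy as np

# Illustrative check of Theorem 2 (not from the paper): take A a union
# of sub-intervals of total measure 1/(1+alpha), build the block
# density, and verify idempotency under the fold product.
N = 600
h = 1.0 / N
t = (np.arange(N) + 0.5) * h
alpha = 2.0
# A = [0, 1/6) U [1/2, 2/3) has measure 1/3 = 1/(1+alpha)
in_A = (t < 1 / 6) | ((t >= 1 / 2) & (t < 2 / 3))
in_B = ~in_A

c = np.zeros((N, N))
c[np.ix_(in_A, in_A)] = 1.0 + alpha
c[np.ix_(in_B, in_B)] = (1.0 + alpha) / alpha

c_fold = c @ c * h                 # density of the fold product C*C
print(np.max(np.abs(c_fold - c)))  # ~0: the copula is idempotent
```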
Note that based on Longla (2015) [18], an absolutely continuous copula that is bounded away from zero on a set of Lebesgue measure 1 generates $\psi$-mixing Markov chains. Based on Formula (5), absolutely continuous symmetric idempotent copulas with square integrable densities have maximal coefficient of correlation equal to 1. Therefore, these copulas do not generate $\rho$-mixing. It follows that they do not generate $\psi$-mixing (as $\psi$-mixing implies $\rho$-mixing; see Bradley (2007) [19]) and their densities are not bounded away from zero on any set of Lebesgue measure 1. Thus, the following theorem holds.
Theorem 3.
Any absolutely continuous copula that generates non-ψ-mixing Markov chains with uniform marginals has a density that is either equal to zero on a set with a non-zero Lebesgue measure (when piecewise constant) or is equal to zero on a closed set with a Lebesgue measure of 0 (when not piecewise constant).
Based on Theorem 3, we state the following more general result.
Theorem 4.
The copula in (6) is the only absolutely continuous symmetric idempotent copula with piecewise constant density.
Proof. 
Assuming that the copula is idempotent with piecewise constant density implies that the functions $\varphi_i(x)$ of Formula (5) are piecewise constant. If we assume that $\varphi_i(x)=\sum_{k=1}^{m}a_k\,I(x\in A_k)$ for sets $A_k$ forming a partition of $[0,1]$ with $\mu(A_k)\neq 0$ and $a_k\neq 0$ for all $k=1,\dots,m$, then

$$\int_0^1\varphi_i(x)\,dx=\sum_{k=1}^{m}a_k\mu(A_k)=0,\qquad \int_0^1\varphi_i^2(x)\,dx=\sum_{k=1}^{m}a_k^2\mu(A_k)=1,$$

and $c(u,v)=1+\varphi_i(u)\varphi_i(v)=1+\sum_{j=1}^{m}\sum_{k=1}^{m}a_ka_j\,I(u\in A_k,\ v\in A_j)$.
The key point in this proof is the fact that C is a copula and does not generate $\psi$-mixing stationary Markov chains. This implies that Theorem 3 holds. Thus, there exists a rectangle $A_{i_0}\times A_{i_1}$ on which the copula density is 0. Without loss of generality, we can assume that $a_1\le a_2\le\dots\le a_m$; otherwise, we can renumber the sets to make sure that this condition holds. Assuming this ordering, $c(u,v)\ge 0$ for all $(u,v)\in[0,1]^2$ implies $a_1a_m+1=0$. This can take some algebra to prove, but does not present any difficulties.

At this point, we have $a_1<0$, $a_m>0$, and $a_1=-\frac{1}{a_m}$. Suppose that $m\ge 3$ and that $a_1,a_2,a_m$ are such that $a_1<a_2<a_m$. This means that there are at least three different values of the eigenfunction. Then, an additional relationship allows us to form a system of three equations with three unknowns $\mu_1,\mu_2,\mu_m$ (writing $\mu_k=\mu(A_k)$). This additional equation is due to the partition of $[0,1]$: $\sum_{k=1}^{m}\mu(A_k)=1$. Regrouping the terms, we obtain
$$\mu_1+\mu_2+\mu_m=1-\sum_{k=3}^{m-1}\mu(A_k),\qquad a_1\mu_1+a_2\mu_2+a_m\mu_m=-\sum_{k=3}^{m-1}a_k\mu(A_k),\qquad a_1^2\mu_1+a_2^2\mu_2+a_m^2\mu_m=1-\sum_{k=3}^{m-1}a_k^2\mu(A_k).$$
Therefore, using Cramer’s rule, we obtain
$$\mu_1=\frac{1+a_2a_m-\sum_{k=3}^{m-1}(a_k-a_2)(a_k-a_m)\mu_k}{(a_1-a_m)(a_1-a_2)},\qquad \mu_2=\frac{1+a_1a_m-\sum_{k=3}^{m-1}(a_k-a_1)(a_k-a_m)\mu_k}{(a_2-a_m)(a_2-a_1)},\qquad \mu_m=\frac{1+a_1a_2-\sum_{k=3}^{m-1}(a_k-a_1)(a_k-a_2)\mu_k}{(a_m-a_1)(a_m-a_2)}.$$
Note that this solution exists and is unique only because $a_i\neq a_j$ when $i\neq j$. Moreover, given that $\mu_i>0$ for all i, we can look carefully at $\mu_2$ to realize that $1+a_1a_m=0$, that $(a_2-a_m)(a_2-a_1)<0$, and that $(a_k-a_1)(a_k-a_m)<0$ for all $k=3,\dots,m-1$. This leads to $\mu_2\le 0$, which is a contradiction. Therefore, we cannot have three or more different values of $\varphi_i(x)$, meaning that there can only be two different values of $\varphi_i(x)$. To be exact, these arguments assume that we are working on a subset of full Lebesgue measure, on the complement of which $\varphi_i(x)$ might have other values. The rest of the proof for one function $\varphi_i$ is the content of Theorem 2.
Now, if we consider the case with several functions $\varphi_i$, each has to take the form specified by Theorem 2. Moreover, the $\varphi_i$ are orthogonal and are defined with $\alpha_1,\alpha_2$ as in Theorem 2. We can split the segment $[0,1]$ into three sets $A_1,A_2,A_3$ such that $\mu(A_1)=\frac{1}{1+\alpha_1}$, $\mu(A_3)=\frac{\alpha_2}{1+\alpha_2}$, $\mu(A_2)=\frac{\alpha_2-\alpha_1}{(\alpha_1+1)(\alpha_2+1)}$, and

$$\varphi_1(x)\varphi_2(x)=\sqrt{\alpha_1\alpha_2}\,I(x\in A_1)+\frac{1}{\sqrt{\alpha_1\alpha_2}}\,I(x\in A_3)-\sqrt{\frac{\alpha_1}{\alpha_2}}\,I(x\in A_2).$$
The integral of this function easily leads to

$$\frac{\alpha_1\alpha_2+3\alpha_1+1-\alpha_2}{(\alpha_1+1)(\alpha_2+1)}=0,$$

implying that $\alpha_2=\frac{1+3\alpha_1}{1-\alpha_1}$. On the other hand, $c(u,v)=1+\varphi_1(u)\varphi_1(v)+\varphi_2(u)\varphi_2(v)$ fails to be a copula density because $c(u,v)<0$ for $(u,v)\in A_1\times A_3$. Therefore, we cannot have more than one function in the sum. This concludes the proof. □
Theorem 5.
If $\int_0^1 f(x)\,dx=0$, $\int_0^1 f^2(x)\,dx=1$, and $1+\min f(x)\cdot\max f(x)=0$, then there exist $A_0\subset I$, $B_0\subset I$, and $m>0$ such that $f(x)=f_0(x)$ almost everywhere, where $f_0(x)=m\,I(x\in A_0)-\frac{1}{m}\,I(x\in B_0)$, $\mu(A_0)=\frac{1}{1+m^2}$, and $\mu(B_0)=\frac{m^2}{1+m^2}$.
Proof. 
Suppose that the conditions of Theorem 5 are satisfied. Let $\max f(x)=m$, $\min f(x)=-1/m$, $A=\{x: f(x)=m\}$, and $B=\{x: f(x)=-1/m\}$. The first step is to show that $\mu(A)\le\frac{1}{1+m^2}$ and $\mu(B)\le\frac{m^2}{1+m^2}$.

On $[0,1]\setminus(A\cup B)$, we have $f(x)\ge -1/m$, which implies

$$\int_{[0,1]\setminus(A\cup B)} f(x)\,dx\ \ge\ -\frac{1}{m}\,\mu\big([0,1]\setminus(A\cup B)\big)=-\frac{1}{m}\,(1-\mu(A)-\mu(B)).$$

So,
$$0=\int_{A\cup B} f(x)\,dx+\int_{[0,1]\setminus(A\cup B)} f(x)\,dx\ \ge\ m\mu(A)-\frac{1}{m}\mu(B)-\frac{1}{m}(1-\mu(A)-\mu(B)).$$

The right-hand side is $\mu(A)\frac{m^2+1}{m}-\frac{1}{m}=\frac{m^2+1}{m}\left(\mu(A)-\frac{1}{m^2+1}\right)$. Thus, $\mu(A)\le\frac{1}{m^2+1}$. Similarly, $f(x)\le m$ on $[0,1]\setminus(A\cup B)$ implies

$$\int_{[0,1]\setminus(A\cup B)} f(x)\,dx\ \le\ m\,\mu\big([0,1]\setminus(A\cup B)\big)=m(1-\mu(A)-\mu(B)).$$

Thus,
$$0=\int_{A\cup B} f(x)\,dx+\int_{[0,1]\setminus(A\cup B)} f(x)\,dx\ \le\ m\mu(A)-\frac{1}{m}\mu(B)+m(1-\mu(A)-\mu(B)).$$

The right-hand side is equal to $\frac{m^2+1}{m}\left(\frac{m^2}{m^2+1}-\mu(B)\right)$, which implies $\mu(B)\le\frac{m^2}{m^2+1}$. Therefore, we have $\mu(A)\le\frac{1}{m^2+1}$ and $\mu(B)\le\frac{m^2}{m^2+1}$.
The second step is to show that unless $\mu(A)=\frac{1}{m^2+1}$ and $\mu(B)=\frac{m^2}{m^2+1}$, we have a contradiction. We define $f_0(x)=m\,I(x\in A_0)-\frac{1}{m}\,I(x\in B_0)$ with $A\subset A_0$, $B\subset B_0$, $\mu(A_0)=\frac{1}{1+m^2}$, and $\mu(B_0)=\frac{m^2}{1+m^2}$. Note that $f(x)=f_0(x)$ for $x\in A\cup B$, that $f(x)\neq f_0(x)$ is possible only for $x\notin A\cup B$, and that the integral of each of these functions is equal to 0. Taking the difference of these integrals, it follows that

$$\int_{A_0\setminus A}\big(f(x)-f_0(x)\big)\,dx+\int_{B_0\setminus B}\big(f(x)-f_0(x)\big)\,dx=0.$$

If $f(x)\neq f_0(x)$, then the first integral is strictly negative and the second is strictly positive. Moreover, $\int_0^1 f^2(x)\,dx=\int_0^1 f_0^2(x)\,dx$ implies that

$$\int_{(A_0\setminus A)\cup(B_0\setminus B)} f^2(x)\,dx=\int_{(A_0\setminus A)\cup(B_0\setminus B)} f_0^2(x)\,dx.\tag{9}$$

Given that $f^2(x)<f_0^2(x)$ on $(A_0\setminus A)\cup(B_0\setminus B)$, Equation (9) implies that the set $(A_0\setminus A)\cup(B_0\setminus B)$ has Lebesgue measure 0. This leads to $\mu(A)=\frac{1}{m^2+1}$ and $\mu(B)=\frac{m^2}{m^2+1}$. Thus, $f(x)=f_0(x)$ except perhaps on a set of Lebesgue measure 0. □

3.2. Exchangeable Markov Chains

Note that exchangeability implies that the copula of $(X_1,X_2)$ is idempotent: under the Markov assumption, the copula of $(X_1,X_3)$ is $C^2(u,v)$, and $(X_1,X_3)$ has the same distribution as $(X_1,X_2)$ due to exchangeability. For this reason, we only need to consider copulas of the type $C(u,v)=uv+\sum_{k=1}^{s}\Phi_k(u)\Phi_k(v)$. We have seen that for absolutely continuous idempotent copulas with square integrable densities, the only non-zero eigenvalue is $\lambda=1$, with finite multiplicity. Denoting the eigenfunctions for $k=1,\dots,s$ by $\varphi_k$ and setting $\Phi_k(x)=\int_0^x\varphi_k(t)\,dt$, we obtain the following theorem for the copula-based Markov chains that they generate.
Theorem 6.
All absolutely continuous idempotent copulas with square integrable densities generate exchangeable Markov chains. Their densities are provided by (8) for some $A,B\subset[0,1]$ such that $\mu(A)=\frac{1}{\alpha+1}$ and $\mu(B)=\frac{\alpha}{\alpha+1}$, where $\mu$ is the Lebesgue measure on $[0,1]$ and $A\cap B=\emptyset$. The copula of the Markov chain $(X_1,\dots,X_{n+1})$ is obtained for $n>1$ via the recurrence relationship

$$C^n(x_1,\dots,x_{n+1})=\int_0^{x_n} C^{n-1}_{,n}(x_1,\dots,x_{n-1},t)\times C_{,1}(t,x_{n+1})\,dt.$$
Proof. 
The proof is based on the fact that any such copula must be characterized by $C(u,v)=uv+\sum_{k=1}^{s}\Phi_k(u)\Phi_k(v)$. Considering $s=1$, exchangeability implies that $C(x_1,x_2,x_3)=C(x_1,x_3,x_2)$. This can be written as

$$\int_0^{x_2} C_{,2}(x_1,t)\,C_{,1}(t,x_3)\,dt=\int_0^{x_3} C_{,2}(x_1,t)\,C_{,1}(t,x_2)\,dt.$$

Thus, substituting the necessary derivatives, we obtain

$$\int_0^{x_2}\big(x_1+\Phi_1(x_1)\varphi_1(t)\big)\times\big(x_3+\Phi_1(x_3)\varphi_1(t)\big)\,dt=\int_0^{x_3}\big(x_1+\Phi_1(x_1)\varphi_1(t)\big)\times\big(x_2+\Phi_1(x_2)\varphi_1(t)\big)\,dt.$$
Integrating and simplifying leads to

$$x_3\Phi(x_1)\Phi(x_2)+\Phi(x_1)\Phi(x_3)\int_0^{x_2}\varphi^2(t)\,dt=x_2\Phi(x_1)\Phi(x_3)+\Phi(x_1)\Phi(x_2)\int_0^{x_3}\varphi^2(t)\,dt.$$

After simplifications, we obtain

$$x_3\Phi(x_2)+\Phi(x_3)\int_0^{x_2}\varphi^2(t)\,dt=x_2\Phi(x_3)+\Phi(x_2)\int_0^{x_3}\varphi^2(t)\,dt.$$

Rearranging the terms yields $\Phi(x_3)\left(\int_0^{x_2}\varphi^2(t)\,dt-x_2\right)=\Phi(x_2)\left(\int_0^{x_3}\varphi^2(t)\,dt-x_3\right)$.
Because this equality holds for all $x_2,x_3$, there exists a constant k such that

$$\int_0^{x}\varphi^2(t)\,dt-x=k\,\Phi(x)\quad\text{for almost all }x.$$
Taking derivatives provides $\varphi^2(x)-1=k\,\varphi(x)$ for almost every x. Thus, for any k, $\varphi(x)$ has only two possible values, whose product is equal to $-1$. If we can show that the sets on which those values are taken have Lebesgue measures $\frac{1}{\alpha+1}$ and $\frac{\alpha}{1+\alpha}$ for some positive $\alpha$, then this exactly defines the idempotent copula provided earlier and shown in Theorem 5. Because all implications in the proof are equivalences, the copula provided here defines exchangeable Markov chains.

We now have $\varphi(x)=\sqrt{\alpha}\,I(x\in A)-\frac{1}{\sqrt{\alpha}}\,I(x\in B)$. Theorem 4 completes the proof. □
The distribution of $(X_1,X_2,X_3)$ is

$$C(x_1,x_2,x_3)=x_1x_2x_3+x_1\Phi(x_2)\Phi(x_3)+x_3\Phi(x_1)\Phi(x_2)+x_2\Phi(x_1)\Phi(x_3)+\frac{\alpha-1}{\sqrt{\alpha}}\,\Phi(x_1)\Phi(x_2)\Phi(x_3).$$
This copula is obtained by computing $\int_0^{x_2} C_{,2}(x_1,t)\,C_{,1}(t,x_3)\,dt$. According to Nelsen (2006) [4], this integral provides the joint distribution of the said vector. When $\alpha=1$, we have
$$C^3(x_1,x_2,x_3,x_4)=x_1x_2x_3x_4+x_1x_2\Phi(x_3)\Phi(x_4)+x_1x_4\Phi(x_2)\Phi(x_3)+x_1x_3\Phi(x_2)\Phi(x_4)+x_2x_4\Phi(x_1)\Phi(x_3)+x_2x_3\Phi(x_1)\Phi(x_4)+x_3x_4\Phi(x_1)\Phi(x_2)+\Phi(x_1)\Phi(x_2)\Phi(x_3)\Phi(x_4).$$

4. Multivariate Extensions

We consider n-copulas with n > 2 as extensions of our copula families. This emphasizes the importance of our constructions via some examples of new copula families with specific dependence properties. A set of copulas that generate exchangeable random vectors that are not Markov chains is proposed.
Proposition 2.
For any bounded mean-zero function $\varphi$ with

$$-\big(\max\varphi^2(x)\big)^{-n/2}\le\theta\le\big(\max\varphi^2(x)\big)^{-n/2},\qquad \Phi(x)=\int_0^x\varphi(t)\,dt,$$

the function $C(x_1,\dots,x_n)=x_1x_2\cdots x_n+\theta\,\Phi(x_1)\cdots\Phi(x_n)$ is a copula that generates exchangeable vectors that are not Markov chains.
Proof. 
The condition on $\theta$ guarantees that we have a positive density; the mean of zero ensures that $\theta$ does not affect lower-dimensional distributions; and symmetry ensures that any subset of $n-1$ components consists of independent variables. This implies that any subset of $n-1$ or fewer components is made up of independent variables, while the full vector is dependent when $\theta\neq 0$. A Markov chain with independent bivariate margins would consist of independent variables, so the vector cannot be a Markov chain. □
Another example is the extension of the copula $C_1$ with density
$c(x_1,\ldots,x_n) = 1 + \theta\,\mathrm{sign}(1-2x_1)\cdots\mathrm{sign}(1-2x_n)$, provided by
$$C(x_1,\ldots,x_n) = \prod_{i=1}^n x_i + \theta \prod_{i=1}^n x_i^{\delta(x_i)}(1-x_i)^{1-\delta(x_i)},$$
where $\delta(x) = I(x < 0.5)$ and $|\theta| \le 1$. Using φ(x) = 2x − 1, we obtain a subclass of the Farlie–Gumbel–Morgenstern multivariate copula
$$C_\theta(x_1,\ldots,x_n) = \prod_{i=1}^n x_i + \theta \prod_{i=1}^n (x_i - x_i^2), \qquad |\theta| \le 1.$$
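A minimal sketch (ours) illustrating that θ does not affect the margins of this multivariate FGM member: fixing any coordinate at 1 kills the θ term, so every lower-dimensional margin is the independence copula, consistent with Proposition 2.

```python
# Sketch (ours): the multivariate FGM member C(x1,...,xn) = prod(x_i) + theta*prod(x_i - x_i^2).
from math import prod

def fgm(xs, theta):
    return prod(xs) + theta * prod(x - x * x for x in xs)

theta = 0.7
# Setting x3 = 1 makes the theta term vanish: the (n-1)-margin is independence.
assert abs(fgm([0.3, 0.8, 1.0], theta) - 0.3 * 0.8) < 1e-12
# Uniform margins: C(x, 1, ..., 1) = x.
assert abs(fgm([0.42, 1.0, 1.0], theta) - 0.42) < 1e-12
```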
In the spirit of Proposition 2, the following theorem holds.
Theorem 7.
For any integer n ≥ 3 and any mean-zero function φ such that $|\min \varphi| = \max \varphi = 1$, let $\Phi(x) = \int_0^x \varphi(s)\,ds$; then
the function
$$C(x_1,\ldots,x_n) = \prod_{i=1}^n x_i + \theta_0 \prod_{i=1}^n \Phi(x_i) + \sum_{i=1}^n \theta_i\, x_i \prod_{j\neq i} \Phi(x_j),$$
where $\sum_{i=0}^n |\theta_i| \le 1$, is a copula; it is exchangeable iff the $\theta_i$ are equal for i > 0.
Examples include the following trigonometric copulas for any $k \in \mathbb{N}^*$:
$$\varphi(x) = \cos k\pi x \quad \text{for} \quad \Phi(x) = \frac{1}{k\pi}\sin k\pi x,$$
$$\varphi(x) = \sin 2k\pi x \quad \text{for} \quad \Phi(x) = -\frac{1}{2k\pi}(\cos 2k\pi x - 1),$$
$$C(u,v,w) = uvw + \frac{\theta_0}{(k\pi)^3}\sin k\pi u \sin k\pi v \sin k\pi w + \frac{\theta_1}{(k\pi)^2}\, u \sin k\pi v \sin k\pi w + \frac{\theta_2}{(k\pi)^2}\, v \sin k\pi u \sin k\pi w + \frac{\theta_3}{(k\pi)^2}\, w \sin k\pi u \sin k\pi v.$$
Using a shifted truncated hyperbolic tangent, we obtain
$$\varphi(x) = \frac{1 - e^{-4x+2}}{1 + e^{-4x+2}}, \quad \text{for} \quad \Phi(x) = x + \frac{1}{2}\ln\frac{e^{-4x+2}+1}{e^{2}+1} = \frac{1}{2}\ln\frac{e^{-2x+2} + e^{2x}}{e^{2}+1},$$
$$C(u,v) = uv + \frac{\theta}{4}\,\ln\frac{e^{-2u+2}+e^{2u}}{e^{2}+1}\,\ln\frac{e^{-2v+2}+e^{2v}}{e^{2}+1}.$$
Following Longla (2024) [8], note that the maximal coefficient of correlation for Markov chains generated by this copula is |λ|, where $\lambda = \frac{2\theta}{1+e^2}$ is the only non-zero eigenvalue of the copula operator. This conclusion relies on
$$\int_0^1 \varphi^2(x)\,dx = \frac{2}{1+e^2}.$$
Therefore, we can take $|\theta| \le \frac{(1+e^2)^2}{(1-e^2)^2} = \theta_{max}$ in Formula (14), or $|\lambda| \le \frac{2(1+e^2)}{(1-e^2)^2}$. This range of dependence λ is larger than that of the FGM copula.
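These constants can be verified numerically (our sketch; note that the displayed φ is simply tanh(2x − 1) written out):

```python
# Sketch (ours): for phi(x) = tanh(2x - 1), check the mean-zero property and
# Int_0^1 phi^2(x) dx = 2/(1+e^2), which underlies lambda = 2*theta/(1+e^2).
from math import tanh, exp

def simpson(f, a, b, n=10000):  # composite Simpson rule, n even
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

phi = lambda x: tanh(2 * x - 1)
assert abs(simpson(phi, 0, 1)) < 1e-10                            # mean zero
assert abs(simpson(lambda x: phi(x)**2, 0, 1) - 2/(1+exp(2))) < 1e-8
```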
Consider $\varphi(x) = (\alpha+1)x^{\alpha} - 1$ (for α > 0); via simple computations, we can establish the following proposition.
Proposition 3.
For any $\alpha \in \mathbb{R}_+$ and any $\lambda \in \mathbb{R}$ such that
$$-\frac{\alpha^2}{2\alpha+1} \le \lambda \le \frac{\alpha}{2\alpha+1} \ (\text{for } \alpha \le 1) \quad \text{or} \quad -\frac{1}{2\alpha+1} \le \lambda \le \frac{\alpha}{2\alpha+1} \ (\text{for } \alpha \ge 1),$$
$$C(u,v) = uv + \frac{2\alpha+1}{\alpha^2}\,\lambda\,(u^{\alpha+1}-u)(v^{\alpha+1}-v)$$
is a copula. Moreover, $\rho_S(C) = \frac{3(2\alpha+1)}{(\alpha+2)^2}\lambda$, and C is the FGM copula when α = 1.
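An arithmetic check (ours) of the $\rho_S$ formula in Proposition 3, using $\int_0^1 (u^{\alpha+1}-u)\,du = \frac{1}{\alpha+2} - \frac{1}{2}$ and $\rho_S = 12\iint C(u,v)\,du\,dv - 3$ (the independence part contributes exactly 3):

```python
# Arithmetic sanity check (ours); alpha and lambda are arbitrary admissible values.
alpha, lam = 2.5, 0.3                        # lam <= alpha/(2*alpha+1) here
c = (2*alpha + 1) / alpha**2 * lam           # coefficient of Phi(u)*Phi(v)
I = 1/(alpha + 2) - 0.5                      # Int_0^1 (u^{alpha+1} - u) du
rho_num = 12 * c * I**2                      # Spearman's rho from the definition
rho_formula = 3 * (2*alpha + 1) / (alpha + 2)**2 * lam
assert abs(rho_num - rho_formula) < 1e-12
```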
Using $\varphi(x) = \sqrt{5}\,(3x^2-1)/2$, we obtain $\Phi(x) = \sqrt{5}\,(x^3-x)/2$ and
$$C(u,v) = uv + \frac{5\lambda}{4}(u^3-u)(v^3-v), \qquad -\frac{1}{5} \le \lambda \le \frac{2}{5}.$$
The positive Spearman correlations for this copula are larger than those of the FGM copula. We have $\rho_S(C_{FGM}) = \lambda$, while
$$\rho_S(C) = \frac{15\lambda}{16} \quad \text{for} \quad -\frac{3}{16} \le \rho_S \le \frac{3}{8}.$$
Using $\varphi(x) = \sqrt{2}\,(3\sqrt{x}-2)$, we obtain a copula with $-\frac{1}{8} \le \lambda \le \frac{1}{4}$, provided by
$$C(u,v) = uv + 8\lambda\, uv(\sqrt{u}-1)(\sqrt{v}-1).$$
In this case, $\rho_S(C) = \frac{24}{25}\lambda$; thus, $-\frac{3}{25} \le \rho_S(C) \le \frac{6}{25}$, and the range of $\rho_S(C)$ is narrower.
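A quadrature sanity check (ours) of the stated Spearman coefficient, writing the perturbation as $8\lambda\,\Phi(u)\Phi(v)$ with $\Phi(u) = u(\sqrt{u}-1)$:

```python
# Sketch (ours): verify rho_S = 24*lambda/25 for C(u,v) = uv + 8*lambda*u*v*(sqrt(u)-1)*(sqrt(v)-1).
from math import sqrt

def simpson(f, a, b, n=2000):  # composite Simpson rule, n even
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

lam = 0.2
Phi = lambda u: u * (sqrt(u) - 1)           # Phi(0) = Phi(1) = 0
I = simpson(Phi, 0, 1)                      # analytically -1/10
rho = 12 * 8 * lam * I**2                   # rho_S = 12*Int Int C du dv - 3
assert abs(rho - 24 * lam / 25) < 1e-6
```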
Remark 2.
The particularity of the copulas from Theorem 7 is that any subset of n − 2 variables from the vector with the copula in (13) is made up of independent components. For Proposition 2, any subset of n − 1 variables from the vector with the copula in (11) is made up of independent components.
Theorem 8.
Under the assumptions of Theorem 7, for any k < n − 1, the function
$$C(x_1,\ldots,x_n) = \prod_{i=1}^n x_i + \theta_0 \prod_{i=1}^n \Phi(x_i) + \sum_{J \subset M} \theta_J \prod_{j \in J} x_j \prod_{j \notin J} \Phi(x_j),$$
with $|\theta_0| + \sum_{J \subset M} |\theta_J| \le 1$, where the sum is taken over all subsets J of size k − 1 of M = {1, 2, …, n}, is a copula for which any subset of n − k variables is made up of independent components. This copula is exchangeable iff the $\theta_J$ are equal for $J \neq \emptyset$.
Figure 1 presents an example of a copula built using the truncated hyperbolic tangent function. The graph shows the level curves for the specified θ .
Figure 2 shows the level curves and density of the extreme values for (16).
Figure 3 shows the level curves of the extreme values for (17).
Figure 4 shows the densities of the extreme values for the copulas in (17).

Examples of 1-Dependent Copulas

In this section, we discuss asymmetric copulas. We consider asymmetry in the sense of using different basis functions for u and v. The first example we consider uses cosine functions. For any positive integers $k_1$ and $k_2$ and any real number θ such that |θ| ≤ 1,
$$C(u,v) = uv + \frac{\theta}{k_1 k_2 \pi^2}\sin(k_1\pi u)\sin(k_2\pi v)$$
is a copula. For $k_1 \neq k_2$, this copula is not symmetric; however, per Longla et al. (2022b) [2], it generates ψ-mixing Markov chains. Moreover, it can easily be shown that for a Markov chain generated by this copula, any pair of variables separated by one or more variables is made up of independent components.
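The 1-dependence claim can be illustrated numerically (our sketch): the two-step density is the fold $\int_0^1 c(u,t)\,c(t,v)\,dt$, which collapses to 1 by orthogonality of $\cos(k_1\pi t)$ and $\cos(k_2\pi t)$ in $L^2(0,1)$ when $k_1 \neq k_2$.

```python
# Illustration (ours): the two-step density of the chain generated by (19)
# is the independence density when k1 != k2.
from math import cos, pi

def simpson(f, a, b, n=2000):  # composite Simpson rule, n even
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

def c(u, v, k1, k2, theta):    # density of the copula in (19)
    return 1 + theta * cos(k1 * pi * u) * cos(k2 * pi * v)

k1, k2, theta, u, v = 1, 2, 1.0, 0.3, 0.8
fold = simpson(lambda t: c(u, t, k1, k2, theta) * c(t, v, k1, k2, theta), 0, 1)
assert abs(fold - 1) < 1e-9
```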
Figure 5 shows a graph of an example of the copulas in (19) with $k_1 = 1$, $k_2 = 2$, and θ = 1. The Spearman correlation for this family of copulas is 0 unless $k_1$ and $k_2$ are both odd integers, since $\int_0^1 \sin(k\pi u)\,du = 0$ for even k. The second example that we consider has the same properties as the previous one but is based on orthogonal polynomials:
$$C(u,v) = uv + \theta\,(2u^3 - 3u^2 + u)(v^2 - v), \qquad |\theta| \le 1,$$
is a copula. Its Spearman correlation is 0 for any value of θ.
Figure 6 shows the level curves and density of the copula in (20) with θ = 1. A slight modification of (20) provides the following copula, which no longer has the property of m-dependence for any m:
$$C(u,v) = uv + \theta\, uv(u^2-1)(v-1) = uv + \theta\,(u^3-u)(v^2-v).$$
The Spearman correlation for this copula is θ/2 with |θ| ≤ 1/2. For the last example, we consider piecewise constant functions with |θ| ≤ 1:
$$C(u,v) = \begin{cases} uv - \theta\, u\, v^{\delta(v)}(1-v)^{1-\delta(v)}, & u \le 0.25,\\[2pt] uv + \theta\,(u-0.5)\, v^{\delta(v)}(1-v)^{1-\delta(v)}, & 0.25 < u \le 0.75,\\[2pt] uv + \theta\,(1-u)\, v^{\delta(v)}(1-v)^{1-\delta(v)}, & 0.75 < u. \end{cases}$$
For θ = 1, the copula in (22) and its density are equal to 0 for 0 ≤ u ≤ 0.25 and 0 ≤ v ≤ 0.5, while for θ = −1 the density is equal to 0 for 0.75 < u ≤ 1 and 0.5 < v ≤ 1. Moreover, for all θ, the Spearman correlation is 0 and $C^2(u,v) = \Pi(u,v)$. This implies that any Markov chain generated by this copula and a continuous marginal distribution is ψ-mixing. This means that the conditions of Longla (2022c) [17] are not necessary for ψ-mixing.

5. Central Limit Theorem and Exchangeability

Based on the properties of exchangeable random vectors, the variance of partial sums of functions ($Y_i = f(X_i)$) of $\{X_1, \ldots, X_n\}$ generated by an exchangeable copula is provided by the formula
$$\mathrm{var}(S_n) = n\big(1 + (n-1)\rho\big)\,\mathrm{var}(Y_1),$$
where ρ is the Pearson correlation of any two components $(Y_s, Y_k)$ of the vector. This is justified by the fact that the joint distributions of any two components of the vector are equal. Kipnis and Varadhan (1986) [20] justified this on the basis that if the exchangeable vector is a Markov chain, then it is reversible and the central limit theorem holds if and only if ρ = 0, since in that case $\mathrm{var}(S_n)/n \to \mathrm{var}(Y_1)$. It is clear that even without independence, this variance implies that when ρ = 0 the central limit theorem holds in the form
$$\sqrt{n}\left(\frac{S_n}{n} - \mu\right) \Rightarrow N(0, \sigma^2),$$
where $\mu = E f(X)$, $\sigma^2 = \mathrm{Var}(f(X))$. This result is implied by Kipnis and Varadhan (1986) [20], and the equivalence of this convergence and the condition ρ = 0 is a consequence of Theorem 2.1.2 and Remark 2.1.1 in Taylor et al. (1985) [14]. In this case, we also see that when ρ ≠ 0, necessarily ρ > 0 and $\mathrm{var}(S_n)$ grows at the rate $n^2$. Therefore, $S_n/\sqrt{n}$ cannot satisfy a central limit theorem when ρ ≠ 0.
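The variance formula is simply the sum of all entries of an exchangeable covariance matrix; a direct check (ours):

```python
# Arithmetic check (ours): summing an n x n covariance matrix with var1 on the
# diagonal and rho*var1 off the diagonal gives n*(1 + (n-1)*rho)*var1.
n, rho, var1 = 7, 0.3, 2.0
total = sum(var1 if i == j else rho * var1 for i in range(n) for j in range(n))
assert abs(total - n * (1 + (n - 1) * rho) * var1) < 1e-12
```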
For examples of the exchangeable Markov chains mentioned in this work, we have the following. First, partial sums of Markov chains generated by M(u,v) do not satisfy the central limit theorem, as the variance of partial sums is exactly $n^2\,\mathrm{var}(Y_1)$. Markov chains generated by $\frac{1}{2}[M(u,v) + W(u,v)]$ and the uniform distribution take only the two values $x_0$ and $1-x_0$, with probability 0.5 of moving at any time to either of these two states. They are conditionally independent given the first state; thus, the partial sums conditionally satisfy the central limit theorem. Considering the constructed copula families, we can conclude the following. Let $0 \le a \le \frac{1}{4}$, $b - a = \frac{1}{4(1-2a)}$, and
$$\varphi_a(x) = I\big(x \in [0,a) \cup [b,\, 0.5+b-a)\big) - I\big(x \in [a,b) \cup [0.5+b-a,\, 1]\big).$$
Note that a = 1/4 implies that b = 3/4; the last interval then contains only one point and can be ignored due to the continuity of the random variables.
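A sampling sketch (ours) of such a chain; the value of b follows our reading of the constraint above, and the conditional density of $U_{n+1}$ given $U_n = u$ equals 2 on $\{\varphi_a = \varphi_a(u)\}$ and 0 elsewhere:

```python
# Sketch (ours): simulate the chain generated by c_a(u,v) = 1 + phi_a(u)*phi_a(v).
# After the first draw, the chain is iid uniform on one of the two half-measure
# sets, so entire regions of (0,1) are never visited.
import random

a = 0.1
b = a + 1 / (4 * (1 - 2 * a))        # our reading of the constraint b - a = 1/(4(1-2a))

def phi(x):
    return 1 if (x < a or b <= x < 0.5 + b - a) else -1

def step(u, rng):
    # rejection sampling: uniform on the set {phi = phi(u)} (measure 1/2)
    while True:
        v = rng.random()
        if phi(v) == phi(u):
            return v

rng = random.Random(0)
path = [rng.random()]
for _ in range(500):
    path.append(step(path[-1], rng))
assert all(phi(x) == phi(path[0]) for x in path)   # one half of (0,1) never appears
```

This is exactly the behavior described in Theorem 9 below: conditionally on the initial value, the observations are iid on a restricted support.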
Theorem 9.
The copula C a ( u , v ) defined by density c a ( u , v ) = 1 + φ a ( u ) φ a ( v ) is idempotent and generates exchangeable Markov chains ( X i , i N ) , for which the following hold:
1. 
When the marginal distribution F ( x ) is continuous with density f ( x ) , these Markov chains are not α-mixing, but satisfy the conditional central limit theorem (24) for any given fixed initial value, where μ and σ 2 are the conditional mean and variance of X 1 | X 0 .
2. 
The covariance of any two variables of Markov chains generated by these copulas is always positive.
Proof. 
Note that this copula is constant on eight subsets that form its support, and there are regions that can never be reached after the first value is known. This justifies that the Markov chain is non-mixing. The values seen in every realization of the Markov chain depend on the first observed value. All observations of the Markov chain after the first are independent and uniformly distributed on two of the possible subintervals. Therefore, the conditional central limit theorem holds for partial sums given $X_0$. This is what Peligrad (2024) [21] called the quenched central limit theorem.
The covariance of any two variables of the Markov chain generated by these copulas with a continuous distribution F is
$$\mathrm{Cov}(X,Y) = \left(\int x f(x)\,\varphi(F(x))\,dx\right)^2 = \left(\int_0^1 F^{-1}(u)\,\varphi(u)\,du\right)^2.$$
The only other point in Theorem 9 that requires a proof is the mixing claim, which is covered by the following proposition. □
Proposition 4.
Exchangeable Markov chains generated by copulas with symmetric absolutely continuous and square integrable density and continuous marginal distributions are non-mixing.
For general notions on mixing, see Bradley (2007) [19]. Here, we use α -mixing, which is equivalent to
$$\lim_{n\to\infty} \sup_{A,B} \big|P^n(A,B) - P(A)P(B)\big| = 0.$$
Proof. 
The proof of the proposition relies on the fact that φ(x) = 1 on a set A of Lebesgue measure 1/2 and φ(x) = −1 on a set B of Lebesgue measure 1/2. Then
$$\big|P^n(X_0 \in A,\, X_n \in B) - P(A)P(B)\big| = P(A)P(B) = \frac{1}{4},$$
since the copula of $(X_0, X_n)$ is absolutely continuous and its density is 0 on A × B. Therefore, $\alpha_n = \frac{1}{4}$ for all positive integers n. □

6. Applications

6.1. Random Sample from C α ( u , v )

Assume that a random sample $\{(X_{1i}, X_{2i}),\ i = 1, \ldots, n\}$ is obtained from a vector $(X_1, X_2)$ with the copula in (6) and marginal distributions $F_{\theta_1}(x)$ and $F_{\theta_2}(x)$, where $\theta_1$ and $\theta_2$ are parameters belonging to $\Theta_1 \subset \mathbb{R}^{k_1}$ and $\Theta_2 \subset \mathbb{R}^{k_2}$. It is known that the joint cumulative distribution function of $(X_1, X_2)$ is $H(x,y) = C_\alpha(F_{\theta_1}(x), F_{\theta_2}(y))$ and that the joint density function is $h(x,y) = f_{\theta_1}(x) f_{\theta_2}(y)\, c_\alpha(F_{\theta_1}(x), F_{\theta_2}(y))$, where $f_{\theta_1}(x)$ and $f_{\theta_2}(y)$ are the density functions of $X_1$ and $X_2$, respectively. It follows that the likelihood function is
$$L(\alpha, \theta_1, \theta_2) = (1+\alpha)^n \alpha^{-n_2} \prod_{i=1}^n f_{\theta_1}(X_{1i})\, f_{\theta_2}(X_{2i})\, A_i,$$
where $A_i = I\big(F_{\theta_1}(X_{1i}) \ge \frac{1}{1+\alpha},\, F_{\theta_2}(X_{2i}) \ge \frac{1}{1+\alpha}\big) + I\big(F_{\theta_1}(X_{1i}) < \frac{1}{1+\alpha},\, F_{\theta_2}(X_{2i}) < \frac{1}{1+\alpha}\big)$ and $n_2$ is the number of $X_{1i}$ for which $F_{\theta_1}(X_{1i}) \ge \frac{1}{\alpha+1}$. The MLE of $(\theta_1, \theta_2)$ is $(\hat\theta_1, \hat\theta_2)$, where $\hat\theta_1$ is the MLE of $\theta_1$ for a simple random sample from $f_{\theta_1}(x)$ and $\hat\theta_2$ is the MLE of $\theta_2$ for iid data from $f_{\theta_2}(x)$.
Theorem 10.
Assume that F θ 1 and F θ 2 satisfy the regularity conditions and that the true values of the parameters are interior points of the parameter space Θ = Θ 1 × Θ 2 . Then,
$$\sqrt{n}(\hat\theta_1 - \theta_1) \Rightarrow N\big(0, I^{-1}(\theta_1)\big), \qquad \sqrt{n}(\hat\theta_2 - \theta_2) \Rightarrow N\big(0, I^{-1}(\theta_2)\big).$$
Moreover, $\sqrt{n}(\hat\theta - \theta) \Rightarrow N(0, \Sigma)$, where $B_{12} = E\big(\nabla_{\theta_1} \ln f_{\theta_1}(X_1)\, \nabla_{\theta_2} \ln f_{\theta_2}(X_2)^{\top}\big)$ and
$$\Sigma = \begin{pmatrix} I^{-1}(\theta_1) & I^{-1}(\theta_1)\, B_{12}\, I^{-1}(\theta_2) \\ I^{-1}(\theta_2)\, B_{21}\, I^{-1}(\theta_1) & I^{-1}(\theta_2) \end{pmatrix}.$$
Proof. 
Under the regularity conditions, a Taylor expansion in the neighborhood of the true value of θ combined with the Slutsky theorem implies that $\sqrt{n}(\hat\theta_j - \theta_j)$ is asymptotically equivalent to $I^{-1}(\theta_j)\frac{1}{\sqrt{n}}\nabla_{\theta_j}\ell(\theta_j, X_j)$, where $\ell(\theta_j, X_j) = \sum_{i=1}^n \ln f_{\theta_j}(X_{ji})$ and $\nabla_{\theta_j}$ is the gradient of ℓ with respect to $\theta_j$. The asymptotic normality of each of these two terms follows from the standard central limit theorem for independent and identically distributed random variables, noting that $\mathrm{var}\big(\nabla_{\theta_j}\ln f_{\theta_j}(X_j)\big) = I(\theta_j)$. Multivariate normality follows from the Cramer–Wold device deployed below. Consider appropriate vectors $t_1$ and $t_2$ to construct $Y = t_1^{\top}\nabla_{\theta_1}\ell(\theta_1, X_1) + t_2^{\top}\nabla_{\theta_2}\ell(\theta_2, X_2)$. Rewriting $Y/\sqrt{n}$ as a normalized sum of independent observations, we obtain $\frac{Y}{\sqrt{n}} \Rightarrow N\big(0, \sigma^2(t_1,t_2)\big)$, where $\sigma^2(t_1,t_2) = \mathrm{var}\big(t_1^{\top}\nabla_{\theta_1}\ln f_{\theta_1}(X_1) + t_2^{\top}\nabla_{\theta_2}\ln f_{\theta_2}(X_2)\big)$.
Using the regularity conditions, we have $E\big(\nabla_{\theta_j}\ln f_{\theta_j}(X_j)\big) = 0$ and $\sigma^2(t_1,t_2) = t_1^{\top} I(\theta_1)\, t_1 + t_2^{\top} I(\theta_2)\, t_2 + 2\, t_1^{\top} E\big(\nabla_{\theta_1}\ln f_{\theta_1}(X_1)\,\nabla_{\theta_2}\ln f_{\theta_2}(X_2)^{\top}\big)\, t_2$. By the Cramer–Wold device, $\frac{1}{\sqrt{n}}\big(\nabla_{\theta_1}\ell(\theta_1, X_1),\, \nabla_{\theta_2}\ell(\theta_2, X_2)\big) \Rightarrow N(0, B)$, where $B_{11} = I(\theta_1)$, $B_{22} = I(\theta_2)$ and $B_{12} = B_{21}^{\top} = E\big(\nabla_{\theta_1}\ln f_{\theta_1}(X_1)\,\nabla_{\theta_2}\ln f_{\theta_2}(X_2)^{\top}\big)$.
By the properties of multivariate normal distributions, $\sqrt{n}(\hat\theta_1 - \theta_1,\, \hat\theta_2 - \theta_2) \Rightarrow N(0, \Sigma)$, where
$$\Sigma = \begin{pmatrix} I^{-1}(\theta_1) & 0 \\ 0 & I^{-1}(\theta_2) \end{pmatrix} B \begin{pmatrix} I^{-1}(\theta_1) & 0 \\ 0 & I^{-1}(\theta_2) \end{pmatrix} = \begin{pmatrix} I^{-1}(\theta_1) & I^{-1}(\theta_1)\, B_{12}\, I^{-1}(\theta_2) \\ I^{-1}(\theta_2)\, B_{21}\, I^{-1}(\theta_1) & I^{-1}(\theta_2) \end{pmatrix}.$$
In the case of the copula $C_\alpha(u,v)$, we obtain
$$B_{12} = \left(\int \varphi\big(F_{\theta_1}(x)\big)\,\nabla_{\theta_1} f_{\theta_1}(x)\,dx\right)\left(\int \varphi\big(F_{\theta_2}(y)\big)\,\nabla_{\theta_2} f_{\theta_2}(y)\,dy\right)^{\top}.$$
Example 1.
Take $f_{\theta_i}(x) = \frac{1}{\theta_i} e^{-x/\theta_i} I(x>0)$, that is, the exponential distribution, along with the copula $C_\alpha(u,v)$. We obtain $I^{-1}(\theta_i) = \theta_i^2$ and $B_{12} = B_{21} = \frac{\alpha}{\theta_1\theta_2} \ln^2\left(\frac{1+\alpha}{\alpha}\right)$. Thus,
$$\Sigma = \begin{pmatrix} \theta_1^2 & \theta_1\theta_2\,\alpha \ln^2\left(\frac{1+\alpha}{\alpha}\right) \\ \theta_1\theta_2\,\alpha \ln^2\left(\frac{1+\alpha}{\alpha}\right) & \theta_2^2 \end{pmatrix}.$$
The asymptotic correlation of these variables is $\rho = \alpha \ln^2\left(\frac{1+\alpha}{\alpha}\right)$.
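A simulation sketch for Example 1 (our own illustration; the seed, sample size, and parameter values are arbitrary). Since the copula density is 1+α on A × A and (1+α)/α on B × B, with A = [0, 1/(1+α)) and density 0 across blocks, V | U is uniform on the block containing U, which gives a direct sampler; the MLEs are the sample means:

```python
# Simulation sketch (ours) for Example 1 with exponential marginals.
import random
from math import log

rng = random.Random(1)
alpha, th1, th2, n = 2.0, 3.0, 5.0, 20000
cut = 1 / (1 + alpha)                        # A = [0, cut), B = [cut, 1]

def draw_pair():
    u = rng.random()
    if u < cut:                              # both coordinates stay in block A
        v = rng.random() * cut
    else:                                    # both coordinates stay in block B
        v = cut + rng.random() * (1 - cut)
    return -th1 * log(1 - u), -th2 * log(1 - v)   # exponential quantile transform

xs, ys = zip(*(draw_pair() for _ in range(n)))
th1_hat, th2_hat = sum(xs) / n, sum(ys) / n       # exponential MLEs = sample means
assert abs(th1_hat - th1) < 0.2 and abs(th2_hat - th2) < 0.4

rho = alpha * log((1 + alpha) / alpha) ** 2       # asymptotic correlation of the MLEs
assert 0 < rho < 1
```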
Example 2.
Take $f_{\theta_2}(x) = \frac{1}{\theta_2} e^{-x/\theta_2} I(x>0)$, that is, the exponential distribution, and $f_{\theta_1}(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{1}{2\sigma^2}(x-\theta_1)^2}$, that is, the normal distribution with known variance $\sigma^2$, along with the copula $C_\alpha(u,v)$. We obtain $I^{-1}(\theta_1) = \sigma^2$, $I^{-1}(\theta_2) = \theta_2^2$ and $B_{12} = B_{21} = \frac{1+\alpha}{\theta_2\sigma} \ln\left(\frac{1+\alpha}{\alpha}\right) \phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)$. Thus,
$$\Sigma = \begin{pmatrix} \sigma^2 & \theta_2\sigma(1+\alpha)\ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right) \\ \sigma\theta_2(1+\alpha)\ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right) & \theta_2^2 \end{pmatrix}.$$
The asymptotic correlation of these variables is $\rho = (1+\alpha)\ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)$, where ϕ(x) and Φ(x) are the density and cumulative distribution functions of the standard normal distribution, respectively (Figure 7).
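A numerical check (ours) of the normal-mean factor entering $B_{12}$ in Example 2: the integral $\int \varphi(F(x))\,\partial_{\theta_1} f(x)\,dx$, with φ the generator of $C_\alpha$, reduces to $-\frac{1+\alpha}{\sqrt{\alpha}\,\sigma}\,\phi(q)$ with $q = \Phi^{-1}\big(\frac{1}{1+\alpha}\big)$:

```python
# Quadrature check (ours); alpha, sigma, theta1 are arbitrary illustrative values.
from math import sqrt
from statistics import NormalDist

nd = NormalDist()                       # standard normal: pdf, cdf, inv_cdf
alpha, sigma, theta1 = 2.0, 1.5, 0.0
cut = 1 / (1 + alpha)
q = nd.inv_cdf(cut)

def integrand(x):
    z = (x - theta1) / sigma
    dfdtheta = z * nd.pdf(z) / sigma**2                    # d/dtheta1 of N(theta1, sigma^2) density
    phi_cop = sqrt(alpha) if nd.cdf(z) < cut else -1/sqrt(alpha)
    return phi_cop * dfdtheta

lo, hi, n = theta1 - 10 * sigma, theta1 + 10 * sigma, 40000
h = (hi - lo) / n
num = h * sum(integrand(lo + (i + 0.5) * h) for i in range(n))   # midpoint rule
closed = -(1 + alpha) / (sqrt(alpha) * sigma) * nd.pdf(q)
assert abs(num - closed) < 1e-3
```

Multiplying this factor by the analogous exponential factor $-\frac{\sqrt{\alpha}}{\theta_2}\ln\big(\frac{1+\alpha}{\alpha}\big)$ reproduces the stated $B_{12}$, with the $\sqrt{\alpha}$ terms canceling.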
Example 3.
Take $f_{\theta_2}(x) = \frac{1}{\theta_2} e^{-x/\theta_2} I(x>0)$, that is, the exponential distribution, and $f_{\theta_1}(x) = \frac{1}{\sqrt{2\pi\theta_1}} e^{-\frac{1}{2\theta_1}(x-\mu)^2}$, that is, the normal distribution with known mean μ, along with the copula $C_\alpha(u,v)$. We obtain $I^{-1}(\theta_1) = 2\theta_1^2$, $I^{-1}(\theta_2) = \theta_2^2$ and $B_{12} = B_{21} = \frac{1+\alpha}{2\theta_1\theta_2}\ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)\Phi^{-1}\left(\frac{1}{1+\alpha}\right)$. Thus, $\Sigma = \begin{pmatrix} 2\theta_1^2 & \Sigma_{12} \\ \Sigma_{12} & \theta_2^2 \end{pmatrix}$, where $\Sigma_{12} = (1+\alpha)\,\theta_1\theta_2 \ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)\Phi^{-1}\left(\frac{1}{1+\alpha}\right)$. The asymptotic correlation of these variables is $\rho = \frac{1+\alpha}{\sqrt{2}}\ln\left(\frac{1+\alpha}{\alpha}\right)\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)\Phi^{-1}\left(\frac{1}{1+\alpha}\right)$.
When the two marginal distributions are both normal with known means, the asymptotic correlation of the MLEs of the variances is $\rho = \frac{(1+\alpha)^2}{2\alpha}\left[\phi\left(\Phi^{-1}\left(\frac{1}{\alpha+1}\right)\right)\Phi^{-1}\left(\frac{1}{1+\alpha}\right)\right]^2$.
The estimator of α is obtained after estimating the parameters of the marginal distributions. We order the variables $X_1, \ldots, X_n$ from smallest to largest and denote them $Y_1, \ldots, Y_n$. The variable $n_2$ is random. For observed data, we find the value of α assuming that there is at least one observation in each of the blocks of the support of the bivariate variable; this makes sense because both blocks have positive probability. We obtain $\hat\alpha = \alpha_{k^*}$, where
$$k^* = \arg\max_{k \in \{1,\ldots,n-1\}} (1+\alpha_k)^n \alpha_k^{-k}, \quad \text{where} \quad \alpha_k = \arg\max_{\alpha \in S_k} (1+\alpha)^n \alpha^{-k} \quad \text{and} \quad S_k = \left(\frac{1}{F_{\hat\theta_1}(Y_k)} - 1,\ \frac{1}{F_{\hat\theta_1}(Y_{k+1})} - 1\right).$$
Although the function seems simple, the solution depends on the sample, as the function being maximized is convex and includes randomness through its parameter $n_2$.
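A rough implementation of this search (ours; we scan thresholds at the observed pseudo-values rather than performing the per-interval maximization, which is equivalent because the support constraint forces every pair onto the same side of $t = \frac{1}{1+\alpha}$ and the profile is monotone on each feasible interval of t):

```python
# Sketch (ours): profile search for alpha-hat on pseudo-observations from C_alpha.
import random
from math import log

def alpha_hat(pairs):
    n = len(pairs)
    best, best_ll = None, float("-inf")
    for t in sorted({w for p in pairs for w in p}):
        if not 0 < t < 1:
            continue
        if any((u < t) != (v < t) for u, v in pairs):
            continue                              # a pair straddles t: infeasible
        a = 1 / t - 1
        n2 = sum(1 for u, _ in pairs if u >= t)   # count of upper-block observations
        ll = n * log(1 + a) - n2 * log(a)         # copula part of the log-likelihood
        if ll > best_ll:
            best, best_ll = a, ll
    return best

# pairs drawn from C_alpha with true alpha = 2 (blocks [0, 1/3) and [1/3, 1])
rng = random.Random(3)
cut = 1 / 3
def pair():
    u = rng.random()
    v = rng.random() * cut if u < cut else cut + rng.random() * (1 - cut)
    return u, v

est = alpha_hat([pair() for _ in range(2000)])
assert 1.5 < est < 2.5
```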
Figure 8 is a locally zoomed version of the fourth graph of Figure 7, and shows the behavior of the correlation for smaller values of the parameter α .

6.2. Simulation Studies

The coverage probabilities for the confidence interval and confidence region of each parameter were computed based on 1000 replications of samples of size 100. The copula parameter was set to levels of 0.25 , 0.5 , 1, 5, 10, and 15. Simulations were carried out for each example under various parameter combinations.

6.2.1. Example 1

We conducted a simulation study based on exponential marginals, first with $\theta_1 = 3$ and $\theta_2 = 5$, then with $\theta_1 = 1$ and $\theta_2 = 10$. The first case corresponds to the first row of each table and the second case to the second row (Table 1 and Table 2).

6.2.2. Example 2

We conducted a simulation study based on exponential marginals with $\theta_2$ for Y and Normal($\theta_1$, $\sigma^2$) for X. The first row is for $\theta_2 = 3$, $\theta_1 = 4$, and $\sigma^2 = 4$, while the second is for $\theta_2 = 10$, $\theta_1 = 5$, and $\sigma^2 = 1$ (Table 3 and Table 4).

6.2.3. Example 3

We conducted a simulation study based on exponential marginals with $\theta_2$ for Y and Normal(μ, $\theta_1$) for X. The first row is for $\theta_2 = 3$, $\theta_1 = 4$, and μ = 50, while the second row is for $\theta_2 = 10$, $\theta_1 = 9$, and μ = 0 (Table 5 and Table 6).

7. Conclusions

In this paper, we have provided a characterization of square integrable idempotent reversible copulas. We have shown that these idempotent copulas generate exchangeable Markov chains and provided several other families of copulas, including extensions of existing copula families such as the FGM family. The mixing properties of the constructed copula families have been studied and conditions for the central limit theorem have been discussed. Interesting properties of the constructed copula families present promising avenues for further research on estimation problems based on copulas.
An important insight into the constructed idempotent copulas is that they generate Markov chains whose every realization appears as an independent sample from a distribution other than the original marginal distribution. This means that such samples will pass a test of independence and mislead the researcher if it is not known ahead of time that there are other possible values of the variable, within a range that will not appear in the data regardless of the sample length. This suggests that analysis based on several independent paths of such Markov chains is necessary before drawing any firm conclusions.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Longla, M.; Djongreba Ndikwa, F.; Muia Nthiani, M.; Takam Soh, P. Perturbations of copulas and mixing properties. J. Korean Stat. Soc. 2022, 51, 149–171.
  2. Longla, M.; Muia Nthiani, M.; Djongreba Ndikwa, F. Dependence and mixing for perturbations of copula-based Markov chains. Stat. Probab. Lett. 2022, 180, 109239.
  3. Komornik, J.; Komornikova, M.; Kalicka, J. Dependence measures for perturbations of copulas. Fuzzy Sets Syst. 2017, 324, 100–116.
  4. Nelsen, R.B. An Introduction to Copulas, 2nd ed.; Springer Series in Statistics; Springer: New York, NY, USA, 2006.
  5. Durante, F.; Sempi, C. Principles of Copula Theory; CRC Press: Boca Raton, FL, USA, 2016.
  6. Chesneau, C. On new types of multivariate trigonometric copulas. AppliedMath 2021, 1, 3–17.
  7. Chesneau, C. A collection of new trigonometric- and hyperbolic-FGM-type copulas. AppliedMath 2023, 3, 147–174.
  8. Longla, M. New copula families and mixing properties. Stat. Pap. 2024, 65, 4331–4363.
  9. Darsow, W.F.; Nguyen, B.; Olsen, E.T. Copulas and Markov processes. Ill. J. Math. 1992, 36, 600–642.
  10. Sklar, A. Fonctions de répartition à n dimensions et leurs marges. In Annales de l'ISUP; Publ. Inst. Statist. Univ.: Paris, France, 1959; Volume 8, pp. 229–231.
  11. Darsow, W.F.; Olsen, E.T. Characterization of idempotent 2-copulas. Note Mat. 2010, 30, 147–177.
  12. Zabell, S.L. Characterizing Markov exchangeable sequences. J. Theor. Probab. 1995, 8, 175–178.
  13. Blum, J.R.; Chernoff, H.; Rosenblatt, M.; Teicher, H. Central limit theorem for interchangeable processes. Can. J. Math. 1958, 10, 222–229.
  14. Taylor, R.L.; Daffer, P.Z.; Patterson, R.F. Limit Theorems for Sums of Exchangeable Random Variables; Rowman & Allanheld Publishers: Lanham, MD, USA, 1985.
  15. Beare, B.K. Copulas and temporal dependence. Econometrica 2010, 78, 395–410.
  16. Durante, F.; Sanchez, J.F.; Flores, M.U. Bivariate copulas generated by perturbations. Fuzzy Sets Syst. 2013, 228, 137–144.
  17. Longla, M.; Mous-Abou, H.; Ngongo, I.S. On some mixing properties of copula-based Markov chains. J. Stat. Theory Appl. 2022, 21, 131–154.
  18. Longla, M. On mixtures of copulas and mixing coefficients. J. Multivar. Anal. 2015, 139, 259–265.
  19. Bradley, R.C. Introduction to Strong Mixing Conditions; Kendrick Press: Heber City, UT, USA, 2007; Volumes 1–2.
  20. Kipnis, C.; Varadhan, S.R.S. Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions. Comm. Math. Phys. 1986, 104, 1–19.
  21. Peligrad, M. On the quenched CLT for stationary Markov chains. J. Theor. Probab. 2024, 37, 603–622.
Figure 1. Level curves of density for hyperbolic copulas: (a) θ = 1, (b) θ = $\theta_{max}$.
Figure 2. Level curves of density for polynomial copulas: (a) λ = 2/5, (b) λ = −1/5.
Figure 3. Level curves of density for power copulas: (a) λ = 1/4, (b) λ = −1/8.
Figure 4. Density of power copulas: (a) λ = 1/4, (b) λ = −1/8.
Figure 5. A 1-dependent copula: (a) plot of density of C(u,v), (b) contour plot of density of C(u,v).
Figure 6. A 1-dependent copula (20): (a) contour plot of density, (b) plot of density of C(u,v).
Figure 7. Asymptotic correlations of $\hat\theta_1$ and $\hat\theta_2$: (a) $\theta_1 = \lambda_1$ and $\theta_2 = \lambda_2$, (b) $\theta_1 = \mu$ and $\theta_2 = \lambda$, (c) $\theta_1 = \sigma^2$ and $\theta_2 = \lambda$, (d) $\theta_1 = \sigma_1^2$ and $\theta_2 = \sigma_2^2$.
Figure 8. Locally zoomed correlations for $\theta_1 = \sigma_1^2$ and $\theta_2 = \sigma_2^2$: (a) α = 0–5, (b) α = 10–35.
Table 1. Exponential($\theta_1$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 0.25 | | | 0.5 | | | 10 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 94.4 | 94.6 | 94.4 | 94.9 | 95.3 | 94.9 | 94.5 | 94.9 | 94.6 |
| 2 | 94.7 | 95.0 | 95.1 | 94.8 | 94.1 | 94.0 | 94.4 | 94.2 | 93.8 |
Table 2. Exponential($\theta_1$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 1 | | | 5 | | | 15 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 94.9 | 95.3 | 95.7 | 95.7 | 96.0 | 94.5 | 95.0 | 94.0 | 93.8 |
| 2 | 94.1 | 94.9 | 94.1 | 94.7 | 94.6 | 93.7 | 94.4 | 94.8 | 93.5 |
Table 3. Normal($\theta_1$, $\sigma^2$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 0.25 | | | 0.5 | | | 10 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 94.9 | 95.0 | 95.1 | 94.7 | 94.8 | 95.7 | 94.5 | 95.3 | 94.6 |
| 2 | 94.5 | 94.2 | 94.3 | 94.9 | 94.1 | 94.1 | 95.3 | 94.4 | 94.9 |
Table 4. Normal($\theta_1$, $\sigma^2$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 1 | | | 5 | | | 15 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 95.6 | 94.1 | 94.4 | 95.9 | 94.3 | 94.5 | 96.3 | 94.3 | 95.1 |
| 2 | 94.5 | 93.8 | 93.9 | 95.3 | 94.2 | 94.2 | 93.6 | 95.5 | 94.2 |
Table 5. Normal(μ, $\theta_1$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 0.25 | | | 0.5 | | | 10 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 94.7 | 95.0 | 94.6 | 94.1 | 95.4 | 94.3 | 95.8 | 94.5 | 94.5 |
| 2 | 94.5 | 94.2 | 94.3 | 94.9 | 94.1 | 94.1 | 95.3 | 94.4 | 94.9 |
Table 6. Normal(μ, $\theta_1$)–Exponential($\theta_2$). Coverage probabilities for 95% confidence regions.

| α | 1 | | | 5 | | | 15 | | |
|---|---|---|---|---|---|---|---|---|---|
| PE | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ | θ̂₁ | θ̂₂ | θ̂ |
| 1 | 94.3 | 94.4 | 94.0 | 93.8 | 95.0 | 94.1 | 94.6 | 95.7 | 94.9 |
| 2 | 94.5 | 93.8 | 93.9 | 95.3 | 94.2 | 94.2 | 94.5 | 95.4 | 94.6 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
