Article

On Weak Convergence of the Bootstrap Copula Empirical Process with Random Resample Size

Salim Bouzebda

LMAC (Laboratory of Applied Mathematics of Compiègne), Université de Technologie de Compiègne, 60200 Compiègne, France
Stats 2023, 6(1), 365-380; https://doi.org/10.3390/stats6010023
Submission received: 18 January 2023 / Revised: 13 February 2023 / Accepted: 20 February 2023 / Published: 22 February 2023
(This article belongs to the Special Issue Advances in Probability Theory and Statistics)

Abstract

The purpose of this note is to describe the weak convergence of the bootstrap empirical copula process with random resample size. The principal results are applied to the estimation of sample rank correlation coefficients of Spearman and Kendall type. In addition, we discuss how our findings can be applied to statistical testing.
MSC:
Primary: 60F17; secondary: 62H10; 62G20; 60F15

1. Introduction

Let us consider a random vector [rv] $\mathbf{X} = (X_1, \ldots, X_d)$ on some probability space $(A, \mathcal{B}, \mathbb{P})$. Let $F(\cdot)$ denote the joint cumulative distribution function [cdf] and $F_1(\cdot), \ldots, F_d(\cdot)$ the marginal df.s. The characterisation theorem of [1] implies that there exists a copula function $C(\cdot)$ such that
$$F(\mathbf{x}) = F(x_1, \ldots, x_d) = C\big(F_1(x_1), \ldots, F_d(x_d)\big), \quad \text{for all } x_1, \ldots, x_d \in \mathbb{R}. \tag{1}$$
The copula function $C(\cdot)$ is a $d$-variate cdf on $[0,1]^d$ with uniform margins on the interval $[0,1]$. When the marginal df.s $F_1(\cdot), \ldots, F_d(\cdot)$ are continuous, the function $C(\cdot)$ is uniquely defined and
$$C(\mathbf{u}) = C(u_1, \ldots, u_d) = F\big(F_1^{-1}(u_1), \ldots, F_d^{-1}(u_d)\big),$$
where, for $j = 1, \ldots, d$,
$$F_j^{-1}(u) = \inf\{x : F_j(x) \geq u\}, \quad u \in (0,1],$$
and
$$F_j^{-1}(0) = \lim_{t \downarrow 0} F_j^{-1}(t) = F_j^{-1}(0+),$$
is the quantile function of $F_j(\cdot)$. Vectors are denoted by bold letters throughout the paper, e.g., $\mathbf{x} = (x_1, \ldots, x_d)$ is a $d$-dimensional vector. Inequalities $\mathbf{x} \leq \mathbf{y}$ are understood componentwise, i.e., $x_j \leq y_j$ for all $j = 1, \ldots, d$. Unless otherwise stated, we suppose that, for $j = 1, \ldots, d$, the $F_j(\cdot)$ are continuous functions. In the books by [2,3] the reader can find exhaustive descriptions of copula modeling as well as surveys of the most frequently used copulas.
For a detailed historical account we refer to [4]. We also refer to [5], where the author highlights the proof of (1) and examines some of its consequences as well as part of the research on copulas. Copulas are a powerful tool for analyzing dependence patterns, and their adaptability and versatility have been demonstrated repeatedly. To be more specific, the copula $C(\cdot)$ “couples” the joint distribution function $F(\cdot)$ to its univariate margins and captures the dependence pattern of $\mathbf{X} = (X_1, \ldots, X_d)$. In fact, most common measures of dependence admit an explicit expression in terms of the copula. This characteristic has led to productive applications in actuarial science and survival analysis (see, e.g., [6,7]). In the literature on risk management and, more generally, in mathematical economics and mathematical finance, there are numerous examples (refer to the books [8,9]), in particular in the context of asset pricing and credit risk management.
The empirical copula was suggested by [10], where it took on a slightly different shape in the independent case. Its properties and potential uses were investigated further by [11,12] in the general case, as well as by [13,14]. These latter works introduced the empirical copula process on a discrete grid, but referred to it by its technical name, the multivariate rank order process. Indeed, a sequential variant was introduced and investigated in greater depth for nonstationary and mixing random variables in those studies. Further discussion of the empirical copula process may be found in [15] and the references therein. Ref. [16] demonstrated the weak convergence of the empirical copula process under the assumption that the first-order partial derivatives of the copula exist and are continuous on particular subsets of $[0,1]^d$.
Let us denote by $C_n(\cdot)$ the empirical copula function based on an independent sample of size $n$ (see, e.g., (3) below for a definition). It is generally accepted that the standardized process $\sqrt{n}(C_n - C)(\cdot)$ converges weakly towards a Gaussian field $\gamma(\cdot)$ (cf. (9) below) whose covariance structure is copula-dependent, so the asymptotic limiting distribution is not straightforward [there exist different methods to establish this result; the reader interested in the derivation of the limit of the related empirical copula processes may consult [10,12,16,17,18,19,20] and the references therein]. This prevents explicit computation in practical implementations. Several researchers have proposed employing bootstrap approaches to approximate the limit distribution, a topic of growing practical relevance; refer to [16,21] and the references therein for more details. Recall that the bootstrap, a resampling technique for statistical inference, was introduced in the seminal paper [22]. In order to draw conclusions about the features of the underlying population, the bootstrap relies on the assumption that a given sample is statistically representative of the population as a whole; refer to [23]. For recent references on the subject, refer to [24,25,26,27,28]. The primary motivation for writing this paper is that, to the best of our knowledge, the results presented here answer a previously understudied question.
We begin by discussing the limiting behavior of empirical copula processes in Section 2. Weak convergence of the bootstrapped empirical copula process when the sample size is random is discussed in Section 3. In Section 4, we discuss how our main results can be applied, specifically to functionals of copulas and statistical tests. Some concluding remarks are given in Section 5. The detailed mathematical developments are deferred to Section 6.

2. Some Useful Results on Empirical Copulas

Let $\mathbf{X}_k = (X_{1;k}, \ldots, X_{d;k})$, $k = 1, \ldots, n$, denote i.i.d. rv.s with a $d$-dimensional continuous df $F(\cdot)$, whose $j$th marginal, for $j = 1, \ldots, d$, and associated copula are denoted by $F_j(\cdot)$ and $C(\cdot)$, respectively. The empirical df.s are defined, respectively, by
$$F_n(\mathbf{x}) = \frac{1}{n}\sum_{k=1}^{n} 1\!\mathrm{I}\{X_{1;k} \leq x_1, \ldots, X_{d;k} \leq x_d\} = \frac{1}{n}\sum_{k=1}^{n} \prod_{j=1}^{d} 1\!\mathrm{I}\{X_{j;k} \leq x_j\}, \quad \mathbf{x} \in \mathbb{R}^d,$$
$$F_{nj}(x_j) = \frac{1}{n}\sum_{k=1}^{n} 1\!\mathrm{I}\{X_{j;k} \leq x_j\}, \quad x_j \in \mathbb{R},$$
where $1\!\mathrm{I}\{\cdot\}$ denotes the usual indicator function of the set $\{\cdot\}$. Copulas have been estimated nonparametrically ever since [10] presented an empirical estimator of the copula
$$C_n(\mathbf{u}) = F_n\big(F_{n1}^{-1}(u_1), \ldots, F_{nd}^{-1}(u_d)\big),$$
where, for $j = 1, \ldots, d$,
$$F_{nj}^{-1}(u_j) = \inf\{x : F_{nj}(x) \geq u_j\}.$$
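For readers who wish to experiment numerically, the estimator above can be evaluated directly from the order statistics of the sample. The following Python sketch is ours and not part of the original paper; the function and variable names are illustrative only.

```python
import numpy as np

def empirical_copula(X, U):
    """Evaluate the empirical copula C_n at the rows of U.

    X : (n, d) array of observations.
    U : (m, d) array of evaluation points u = (u_1, ..., u_d) in [0, 1]^d.
    """
    n, d = X.shape
    Xs = np.sort(X, axis=0)                 # order statistics of each margin
    out = np.empty(len(U))
    for i, u in enumerate(U):
        # Marginal empirical quantiles F_{nj}^{-1}(u_j) = inf{x : F_{nj}(x) >= u_j},
        # i.e. the ceil(n u_j)-th order statistic of the j-th margin.
        idx = np.clip(np.ceil(n * u).astype(int) - 1, 0, n - 1)
        q = Xs[idx, np.arange(d)]
        # C_n(u) = F_n(F_{n1}^{-1}(u_1), ..., F_{nd}^{-1}(u_d)).
        out[i] = np.mean(np.all(X <= q, axis=1))
    return out

# Example: 500 bivariate Gaussian observations with correlation 0.5.
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=500)
print(empirical_copula(X, np.array([[0.25, 0.25], [0.5, 0.5], [0.75, 0.75]])))
```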
As may be expected, the law of $C_n(\cdot)$ depends on $F(\cdot)$ solely through the associated copula $C(\cdot)$. To see this, let $\boldsymbol{\xi}_k = (\xi_{1;k}, \ldots, \xi_{d;k})$, $k = 1, \ldots, n$, be i.i.d. rv.s with df $C(\cdot)$ and set
$$G_n(\mathbf{x}) = \frac{1}{n}\sum_{k=1}^{n} 1\!\mathrm{I}\{\xi_{1;k} \leq x_1, \ldots, \xi_{d;k} \leq x_d\}, \qquad G_{nj}(x_j) = \frac{1}{n}\sum_{k=1}^{n} 1\!\mathrm{I}\{\xi_{j;k} \leq x_j\}.$$
Consequently,
$$F_n(\mathbf{x}) \stackrel{\mathcal{L}}{=} G_n\big(F_1(x_1), \ldots, F_d(x_d)\big),$$
where $\stackrel{\mathcal{L}}{=}$ denotes equality in distribution, and
$$\big(F_{n1}(x_1), \ldots, F_{nd}(x_d)\big) \stackrel{\mathcal{L}}{=} \big(G_{n1}(F_1(x_1)), \ldots, G_{nd}(F_d(x_d))\big);$$
consequently, it follows that
$$C_n(\mathbf{u}) \stackrel{\mathcal{L}}{=} G_n\big(G_{n1}^{-1}(u_1), \ldots, G_{nd}^{-1}(u_d)\big).$$
This demonstrates that the law of $C_n(\cdot)$ is the same for all $F(\cdot)$ whose associated copula is $C(\cdot)$. Therefore, it suffices to investigate the empirical copula derived from the data $\boldsymbol{\xi}_k$. We let $C_n(\cdot)$ denote the right-hand side of (3) in the sequel. The empirical process $\alpha_n(\cdot)$ and the empirical copula process $\gamma_n(\cdot)$, both associated with $C(\cdot)$, are given, for $\mathbf{u} \in [0,1]^d$, respectively, by
$$\alpha_n(\mathbf{u}) = \sqrt{n}\,\big(G_n(\mathbf{u}) - C(\mathbf{u})\big), \qquad \gamma_n(\mathbf{u}) = \sqrt{n}\,\big(C_n(\mathbf{u}) - C(\mathbf{u})\big).$$
The empirical process corresponding to the empirical df $G_{nj}(\cdot)$ is defined by
$$\alpha_{nj}(u_j) = \sqrt{n}\,\big(G_{nj}(u_j) - u_j\big),$$
for $\mathbf{u} \in [0,1]^d$ and $u_j \in [0,1]$. Note that $\alpha_{nj}(0) = \alpha_{nj}(1) = 0$ almost surely. We have
$$\alpha_n \rightsquigarrow \alpha \quad (n \to \infty)$$
in $\ell^\infty([0,1]^d)$. Weak convergence is denoted by the arrow ‘$\rightsquigarrow$’ as in Definition 1.3.3 of [17]. We shall use the same notation, definitions, and conditions as [16] in what follows. The limit process $\alpha(\cdot)$ is a $C$-Brownian bridge, i.e., a tight centered Gaussian process with covariance function
$$\operatorname{cov}\big(\alpha(\mathbf{u}), \alpha(\mathbf{v})\big) = C(\mathbf{u} \wedge \mathbf{v}) - C(\mathbf{u})\,C(\mathbf{v}),$$
for $\mathbf{u}, \mathbf{v} \in [0,1]^d$; here $\mathbf{u} \wedge \mathbf{v} = (\min(u_1, v_1), \ldots, \min(u_d, v_d))$. The tightness of the process and the continuity of its mean and covariance functions imply the existence of a version of $\alpha$ with continuous trajectories. Without loss of generality, we suppose from now on that $\alpha$ is such a version. For $j \in \{1, \ldots, d\}$, let $\mathbf{e}_j$ denote the $j$th coordinate vector in $\mathbb{R}^d$. For $\mathbf{u} \in [0,1]^d$ such that $0 < u_j < 1$, let
$$\dot{C}_j(\mathbf{u}) = \lim_{h \to 0} \frac{C(\mathbf{u} + h\mathbf{e}_j) - C(\mathbf{u})}{h}$$
denote, provided it exists, the $j$th first-order partial derivative of $C(\cdot)$.
Condition 1. 
For any $j \in \{1, \ldots, d\}$, on the set $V_{d,j} := \{\mathbf{u} \in [0,1]^d : 0 < u_j < 1\}$, the $j$th first-order partial derivative $\dot{C}_j(\cdot)$ exists and is continuous.
Henceforth, suppose that Condition 1 is satisfied. For notational convenience, the domain of $\dot{C}_j(\cdot)$ is extended to the whole of $[0,1]^d$ by setting
$$\dot{C}_j(\mathbf{u}) = \begin{cases} \displaystyle\limsup_{h \downarrow 0} \frac{C(\mathbf{u} + h\mathbf{e}_j)}{h}, & \text{if } \mathbf{u} \in [0,1]^d,\ u_j = 0; \\[2mm] \displaystyle\limsup_{h \downarrow 0} \frac{C(\mathbf{u}) - C(\mathbf{u} - h\mathbf{e}_j)}{h}, & \text{if } \mathbf{u} \in [0,1]^d,\ u_j = 1. \end{cases}$$
In this manner, $\dot{C}_j(\cdot)$ is defined everywhere on $[0,1]^d$ and takes values in $[0,1]$, because
$$|C(\mathbf{u}) - C(\mathbf{v})| \leq \sum_{j=1}^{d} |u_j - v_j|,$$
and it is continuous on the set $V_{d,j}$ by Condition 1. Also note that $\dot{C}_j(\mathbf{u}) = 0$ as soon as $u_i = 0$ for some $i \neq j$.
Condition 2. 
For all $i, j \in \{1, \ldots, d\}$, on the set $V_{d,i} \cap V_{d,j}$, the second-order partial derivative $\ddot{C}_{ij}(\cdot)$ is defined and continuous, and there exists a constant $K > 0$ such that
$$|\ddot{C}_{ij}(\mathbf{u})| \leq K \min\left(\frac{1}{u_i(1 - u_i)}, \frac{1}{u_j(1 - u_j)}\right), \quad \mathbf{u} \in V_{d,i} \cap V_{d,j}.$$
Proposition 1 
([16]). Assume that Conditions 1 and 2 are satisfied. Then, as $n \to \infty$, we have
$$\sup_{\mathbf{u} \in [0,1]^d} \big|\gamma_n(\mathbf{u}) - \tilde{\gamma}_n(\mathbf{u})\big| = O\big(n^{-1/4} (\log\log n)^{1/4} (\log n)^{1/2}\big) \quad \text{a.s.},$$
where
$$\tilde{\gamma}_n(\mathbf{u}) = \alpha_n(\mathbf{u}) - \sum_{j=1}^{d} \dot{C}_j(\mathbf{u})\,\alpha_{nj}(u_j), \quad \mathbf{u} \in [0,1]^d, \tag{8}$$
and
$$\alpha_{nj}(u_j) = \alpha_n(1, \ldots, 1, u_j, 1, \ldots, 1),$$
where the $j$th entry is the variable $u_j$.
This result is presented with an outline of the proof by [11], whereas a complete proof is provided by [19,29] and complemented recently by [15,16,30,31,32,33,34,35,36]. Assume first that the first-order partial derivatives C ˙ j ( · ) exist and are continuous throughout the closed hypercube [ 0 , 1 ] d . For u [ 0 , 1 ] d , define
$$\gamma(\mathbf{u}) = \alpha(\mathbf{u}) - \sum_{j=1}^{d} \dot{C}_j(\mathbf{u})\,\alpha_j(u_j), \tag{9}$$
where $\alpha_j(u_j) = \alpha(1, \ldots, 1, u_j, 1, \ldots, 1)$. By continuity of $\dot{C}_j(\cdot)$ throughout $[0,1]^d$, the trajectories of $\gamma$ are continuous. From [19] we learn that $\gamma_n \rightsquigarrow \gamma$ as $n \to \infty$ in the space $\ell^\infty([0,1]^d)$.
Proposition 2 
([16]). Suppose that Condition 1 holds. Then, with $\tilde{\gamma}_n$ as defined in (8),
$$\sup_{\mathbf{u} \in [0,1]^d} \big|\gamma_n(\mathbf{u}) - \tilde{\gamma}_n(\mathbf{u})\big| \xrightarrow{\;P\;} 0 \quad (n \to \infty).$$
Consequently, in $\ell^\infty([0,1]^d)$, we have
$$\gamma_n \rightsquigarrow \gamma \quad (n \to \infty).$$
Ref. [16] justified the last weak convergence by the fact that the map from $\ell^\infty([0,1]^d)$ into itself that sends a function $f$ to
$$f - \sum_{j=1}^{d} \dot{C}_j\,\pi_j(f),$$
with $(\pi_j(f))(\mathbf{u}) = f(1, \ldots, 1, u_j, 1, \ldots, 1)$, is linear and bounded.

3. Main Results

Let us introduce the following definitions. For $\omega \in \Omega$, let $P_n(\omega)$ denote the empirical measure, i.e.,
$$P_n(\omega) = P_n^{\omega} = \frac{1}{n}\sum_{i=1}^{n} \delta_{\boldsymbol{\xi}_i(\omega)}.$$
Let $\hat{\boldsymbol{\xi}}_{n,1}^{\omega}, \ldots, \hat{\boldsymbol{\xi}}_{n,m}^{\omega}$ denote i.i.d. vectors sampled from the empirical measure $P_n(\omega)$, and let $\hat{P}_{nm}(\omega)$ denote the empirical measure associated with $\hat{\boldsymbol{\xi}}_{n,1}^{\omega}, \ldots, \hat{\boldsymbol{\xi}}_{n,m}^{\omega}$, i.e.,
$$\hat{P}_{nm}(\omega) = \hat{P}_{nm}^{\omega} = \frac{1}{m}\sum_{i=1}^{m} \delta_{\hat{\boldsymbol{\xi}}_{n,i}(\omega)}.$$
For $\omega \in \Omega$ and $n = 1, 2, \ldots$, the $\hat{\boldsymbol{\xi}}_{n,1}^{\omega}, \ldots, \hat{\boldsymbol{\xi}}_{n,m}^{\omega}$ are rowwise i.i.d. with distribution $P_n(\omega)$ on $(A, \mathcal{B})$. The resulting triangular array is defined on a common probability space $(\Omega, \mathcal{S}, \Pr_\omega) = (A, \mathcal{B}, P_1^{\omega})^{\mathbb{N}} \times \cdots \times (A, \mathcal{B}, P_n^{\omega})^{\mathbb{N}} \times \cdots$.
(C.1) 
Let $\{N_n^{\omega}, n \geq 1\}$ denote a sequence of positive integer-valued random variables such that, for $P$-a.e. $\omega$,
$$\frac{N_n^{\omega}}{n} \xrightarrow{\;\Pr_\omega\;} \nu^{\omega}, \quad \text{as } n \to \infty,$$
where $\nu^{\omega}$ denotes a positive random variable defined on the same initial probability space $(\Omega, \mathcal{S}, \Pr_\omega)$, i.e.,
$$\Pr_\omega(\nu^{\omega} > 0) = 1.$$
Set
$$\hat{G}_{nm}(\mathbf{x}) = \frac{1}{m}\sum_{k=1}^{m} 1\!\mathrm{I}\{\hat{\boldsymbol{\xi}}_{n,k}^{\omega} \leq \mathbf{x}\}, \qquad \hat{G}_{nm,j}(x_j) = \frac{1}{m}\sum_{k=1}^{m} 1\!\mathrm{I}\{\hat{\xi}_{j;k}^{\omega} \leq x_j\};$$
the bootstrapped empirical copula is then defined by
$$\hat{C}_{nm}(\mathbf{u}) = \hat{G}_{nm}\big(\hat{G}_{nm,1}^{-1}(u_1), \ldots, \hat{G}_{nm,d}^{-1}(u_d)\big).$$
The bootstrapped empirical process $\hat{\alpha}_{nN_n}(\cdot)$ and the bootstrapped empirical copula process $\hat{\gamma}_{nN_n}(\cdot)$, both associated with $C(\cdot)$, are defined, for $\mathbf{u} \in [0,1]^d$, respectively, by
$$\hat{\alpha}_{nN_n}(\mathbf{u}) = N_n^{1/2}\big(\hat{G}_{nN_n}(\mathbf{u}) - G_n(\mathbf{u})\big), \qquad \hat{\gamma}_{nN_n}(\mathbf{u}) = N_n^{1/2}\big(\hat{C}_{nN_n}(\mathbf{u}) - C_n(\mathbf{u})\big).$$
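To make the construction concrete, a single replicate of $\hat{\gamma}_{nN_n}$ can be simulated as follows. This is our own illustrative sketch, not part of the original paper; it reuses the `empirical_copula` helper sketched in Section 2 and, purely as an example, draws the random resample size $N_n$ from a Poisson law with mean $n$ (one of the special cases discussed in Remark 1 below).

```python
import numpy as np

def bootstrap_copula_process(X, U, rng):
    """One replicate of gamma_hat_{n,N_n}(u) = N_n^{1/2} (C_hat_{n,N_n}(u) - C_n(u))."""
    n, d = X.shape
    # Random resample size; Poisson with mean n, so that N_n / n -> 1 in probability.
    N = max(1, rng.poisson(n))
    # Resample N rows with replacement from the empirical measure P_n.
    Xb = X[rng.integers(0, n, size=N)]
    return np.sqrt(N) * (empirical_copula(Xb, U) - empirical_copula(X, U))

rng = np.random.default_rng(1)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=500)
print(bootstrap_copula_process(X, np.array([[0.5, 0.5], [0.75, 0.25]]), rng))
```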
Let $E(N_n) = \mu_n$ and $\operatorname{Var}(N_n) = \sigma_n^2$. We define
$$\tilde{\alpha}_{nN_n}(\mathbf{x}) = \mu_n^{1/2}\left(\frac{1}{\mu_n}\sum_{k=1}^{N_n} 1\!\mathrm{I}\{\hat{\boldsymbol{\xi}}_{n,k}^{\omega} \leq \mathbf{x}\} - G_n(\mathbf{x})\right)$$
and
$$\tilde{\gamma}_{nN_n}(\mathbf{u}) = \mu_n^{1/2}\big(\tilde{C}_{nN_n}(\mathbf{u}) - C_n(\mathbf{u})\big),$$
where $\tilde{C}_{nN_n}(\mathbf{u})$ is the bootstrapped copula associated with
$$\tilde{G}_{nN_n}(\mathbf{x}) = \frac{1}{\mu_n}\sum_{k=1}^{N_n} 1\!\mathrm{I}\{\hat{\boldsymbol{\xi}}_{n,k}^{\omega} \leq \mathbf{x}\}.$$
(C.2) 
N n is independent of the ξ n , k ω ’s;
(C.3) 
for all $n = 1, 2, \ldots$, $0 < E N_n = \mu_n$, $\operatorname{Var} N_n = \sigma_n^2 < \infty$, and
$$\lim_{n \to \infty} \mu_n = \infty \quad \text{and} \quad \lim_{n \to \infty} \frac{\sigma_n^2}{\mu_n} = \beta \geq 0;$$
(C.4) 
either $N_n$ is a degenerate random variable for all $n$ or $\sigma_n^2 > 0$, and
$$\frac{N_n}{\mu_n} \xrightarrow{\;\Pr\;} 1 \quad \text{and} \quad \frac{N_n - \mu_n}{\sigma_n} \xrightarrow{\;d\;} Z \sim N(0,1), \quad \text{as } n \to \infty.$$
We denote by $\rightsquigarrow_{P}$ weak convergence conditional on the data in probability, as defined by [37]; that is,
$$\gamma_{nN_n}(\cdot) \rightsquigarrow_{P} \gamma(\cdot),$$
if
$$\sup_{h \in BL_1(\ell^\infty([0,1]^d))} \Big| E\big(h(\gamma_{nN_n}) \mid \{\boldsymbol{\xi}_k\}_{1 \leq k \leq n}\big) - E\,h(\gamma) \Big| \xrightarrow{\;P\;} 0,$$
and, for every $h \in BL_1(\ell^\infty([0,1]^d))$,
$$E\big(h(\gamma_{nN_n})^{*} \mid \{\boldsymbol{\xi}_k\}_{1 \leq k \leq n}\big) - E\big(h(\gamma_{nN_n})_{*} \mid \{\boldsymbol{\xi}_k\}_{1 \leq k \leq n}\big) \xrightarrow{\;P\;} 0,$$
where
$$BL_1(\ell^\infty([0,1]^d)) = \Big\{ f : \ell^\infty([0,1]^d) \to \mathbb{R},\ \|f\|_\infty \leq 1,\ |f(l_1) - f(l_2)| \leq \sup_{\mathbf{u} \in [0,1]^d} |l_1(\mathbf{u}) - l_2(\mathbf{u})|,\ \ l_1, l_2 \in \ell^\infty([0,1]^d) \Big\}$$
is the class of all uniformly bounded functions that are Lipschitz with constant at most 1, and
$$\|f\|_\infty = \sup_{x \in [0,1]^d} |f(x)|.$$
Moreover, $h(\gamma_{nN_n})^{*}$ and $h(\gamma_{nN_n})_{*}$ denote measurable majorants and minorants with respect to the bootstrapped sample.
The following theorems are based on Theorem 4 of [38] in combination with the delta method. We recall the main principles in the proof section.
Theorem 1. 
Suppose that Condition 1 and (C.1) hold. Then, in $\ell^\infty([0,1]^d)$, we have
$$\hat{\gamma}_{nN_n}(\cdot) \rightsquigarrow_{P} \gamma.$$
Let us introduce the limiting process $\alpha^{\beta}(\cdot)$, a centered process with covariance function
$$\operatorname{cov}\big(\alpha^{\beta}(\mathbf{u}), \alpha^{\beta}(\mathbf{v})\big) = C(\mathbf{u} \wedge \mathbf{v}) - (1 - \beta)\,C(\mathbf{u})\,C(\mathbf{v}),$$
and define
$$\gamma^{\beta}(\mathbf{u}) = \alpha^{\beta}(\mathbf{u}) - \sum_{j=1}^{d} \dot{C}_j(\mathbf{u})\,\alpha_j^{\beta}(u_j).$$
Theorem 2. 
Suppose that Condition 1 and (C.1)–(C.4) are satisfied. Then, in $\ell^\infty([0,1]^d)$, we have
$$\tilde{\gamma}_{nN_n}(\cdot) \rightsquigarrow_{P} \gamma^{\beta}.$$
Remark 1. 
We have the following special cases (a small numerical check is sketched below):
(1)
$N_n$ is a Poisson random variable $\mathcal{P}(\mu_n)$; in this case $\beta = 1$;
(2)
$N_n$ is a binomial random variable with parameters $(n, p)$; then
$$\beta = 1 - p;$$
in light of the binomial distribution's wide range of possible uses, this case is of particular importance.
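The ratio in condition (C.3) can be checked directly for the two cases: a Poisson variable has equal mean and variance, so $\sigma_n^2/\mu_n \to 1$, whereas a binomial $(n, p)$ variable has mean $np$ and variance $np(1-p)$, so the ratio is $1 - p$. A small illustrative Python check (our own, with arbitrary parameter values):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 10_000, 0.3

# Poisson resample size: Var(N_n) / E(N_n) -> beta = 1.
N_pois = rng.poisson(lam=n, size=5_000)
print(N_pois.var() / N_pois.mean())        # approximately 1

# Binomial resample size: Var(N_n) / E(N_n) = 1 - p -> beta = 0.7 here.
N_bin = rng.binomial(n=n, p=p, size=5_000)
print(N_bin.var() / N_bin.mean())          # approximately 0.7
```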
Remark 2. 
The empirical copula bootstrap process has received considerable attention in the literature; however, there has been no work to date on the bootstrap of the Kac empirical copula process. This note provides a more general setting in the sense that, when $N_n$ is a Poisson random variable, the bootstrap of the Kac empirical copula process is obtained. We highlight that the Kac empirical copula process was first examined in [32]; see the references therein.

4. Applications

The limiting laws of several statistics, such as the Kendall and Spearman sample rank correlation coefficients, can be derived from Theorem 1, as shown in [31]. Taking a broader view, let us define, for any function J ( · ) on [ 0 , 1 ] 3 , the functional
$$S(C) := \int_0^1 \int_0^1 J\big(u, v, C(u, v)\big)\,du\,dv.$$
The related sample quantity $S(C_n)$ may be called a Spearman-type rank statistic; the interested reader may refer to [11,29] for more details. To be more precise, assume that $z \mapsto J(u, v, z)$ has a continuous derivative $J^{(3)}(u, v, z)$ with
$$\sup_{u, v, z \in [0,1]} |J^{(3)}(u, v, z)| = \sup_{u, v, z \in [0,1]} \left| \frac{\partial J(u, v, z)}{\partial z} \right| < \infty.$$
Then, we can write
$$\sqrt{n}\big(S(C_n) - S(C)\big) = \sqrt{n}\left(\int_0^1\!\!\int_0^1 J\big(u, v, C_n(u, v)\big)\,du\,dv - \int_0^1\!\!\int_0^1 J\big(u, v, C(u, v)\big)\,du\,dv\right) = \int_0^1\!\!\int_0^1 J^{(3)}\big(u, v, \delta_n(u, v)\big)\,\gamma_n(u, v)\,du\,dv,$$
where $\delta_n(u, v)$ is a point between $C_n(u, v)$ and $C(u, v)$, so that $\delta_n(\cdot)$ converges to $C(\cdot)$ uniformly with probability one. Making use of Theorem 1, the limiting law in the following display can be evaluated:
$$\sqrt{n}\big(S(C_n) - S(C)\big) \xrightarrow{\;\mathcal{L}\;} \int_0^1\!\!\int_0^1 J^{(3)}\big(u, v, C(u, v)\big)\,\gamma(u, v)\,du\,dv.$$
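As a numerical illustration (our own sketch, not part of the paper), $S(C_n)$ can be approximated by evaluating the empirical copula on a regular grid of $[0,1]^2$; with $J(u,v,z) = 12z - 3$ one recovers the usual bivariate Spearman's rho. The `empirical_copula` helper is the one sketched in Section 2.

```python
import numpy as np

def spearman_type_statistic(X, J, m=100):
    """Approximate S(C_n) = int int J(u, v, C_n(u, v)) du dv on an m x m grid."""
    u = (np.arange(m) + 0.5) / m
    UU, VV = np.meshgrid(u, u, indexing="ij")
    pts = np.column_stack([UU.ravel(), VV.ravel()])
    Cn = empirical_copula(X, pts)            # helper sketched in Section 2
    return float(np.mean(J(pts[:, 0], pts[:, 1], Cn)))

# Example: J(u, v, z) = 12 z - 3 gives (an approximation of) Spearman's rho.
rng = np.random.default_rng(3)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=500)
print(spearman_type_statistic(X, lambda u, v, z: 12.0 * z - 3.0))
```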
We define, for any function J ( · ) on [ 0 , 1 ] 3 ,
$$T(C) := \int_0^1 \int_0^1 J\big(u, v, C(u, v)\big)\,dC(u, v).$$
We call $T(C_n)$ a Kendall-type rank statistic. Similarly, using Theorem 1 we can evaluate the limiting law of
$$\sqrt{n}\big(T(C_n) - T(C)\big).$$
In the multivariate case, the population version of Spearman's rho is defined (refer to [39]) as
$$\rho(C) = \frac{d + 1}{2^d - (d + 1)} \left( 2^d \int_{[0,1]^d} C(\mathbf{u})\,d\mathbf{u} - 1 \right)$$
and can be estimated by
$$\rho(C_n) = \frac{d + 1}{2^d - (d + 1)} \left( 2^d \int_{[0,1]^d} C_n(\mathbf{u})\,d\mathbf{u} - 1 \right).$$
Based on the continuous mapping theorem we infer
$$\sqrt{n}\big(\rho(C_n) - \rho(C)\big) \xrightarrow{\;\mathcal{L}\;} N\big(0, \sigma_\rho^2\big),$$
where
$$\sigma_\rho^2 = \left(\frac{d + 1}{2^d - (d + 1)}\right)^2 2^{2d} \int_{[0,1]^d} \int_{[0,1]^d} E\big(\gamma(\mathbf{u})\,\gamma(\mathbf{v})\big)\,d\mathbf{u}\,d\mathbf{v}.$$
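A sample version of $\rho(C_n)$ is easy to compute from pseudo-observations: for the rank-based version of the empirical copula, which agrees with $C_n$ up to a uniformly negligible term, one has $\int_{[0,1]^d} C_n(\mathbf{u})\,d\mathbf{u} = n^{-1}\sum_{k}\prod_{j}(1 - \hat U_{j;k})$. The sketch below is our own illustration.

```python
import numpy as np
from scipy.stats import rankdata

def multivariate_spearman_rho(X):
    """Sample version of the multivariate Spearman's rho of [39],
    rho(C_n) = (d+1)/(2^d - (d+1)) * (2^d * int C_n(u) du - 1)."""
    n, d = X.shape
    U = rankdata(X, axis=0) / n                       # pseudo-observations
    # int_{[0,1]^d} C_n(u) du = mean_k prod_j (1 - U_{j;k}) for the rank-based C_n.
    integral = np.mean(np.prod(1.0 - U, axis=1))
    return (d + 1.0) / (2.0 ** d - (d + 1.0)) * (2.0 ** d * integral - 1.0)

rng = np.random.default_rng(4)
X = rng.multivariate_normal([0, 0, 0], 0.5 * np.eye(3) + 0.5, size=1000)
print(multivariate_spearman_rho(X))
```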
Theorem 1 is useful to approximate this limiting law. The multivariate population version of Kendall's tau is defined (refer to [40]) as
$$\tau(C) = \frac{1}{2^{d-1} - 1}\left(-1 + 2^d \int_{[0,1]^d} C(\mathbf{u})\,dC(\mathbf{u})\right),$$
which can be estimated by
$$\tau(C_n) = \frac{1}{2^{d-1} - 1}\left(-1 + 2^d \int_{[0,1]^d} C_n(\mathbf{u})\,dC_n(\mathbf{u})\right).$$
We then have
$$\sqrt{n}\big(\tau(C_n) - \tau(C)\big) \xrightarrow{\;\mathcal{L}\;} N\big(0, \sigma_\tau^2\big),$$
where $\sigma_\tau^2$ is the asymptotic variance. Theorem 1 can be used to evaluate this limiting law. Indeed, we have
$$\frac{2^{d-1} - 1}{2^d}\,\sqrt{n}\big(\tau(C_n) - \tau(C)\big) = \sqrt{n}\left(\int_{[0,1]^d} C_n(\mathbf{u})\,dC_n(\mathbf{u}) - \int_{[0,1]^d} C(\mathbf{u})\,dC(\mathbf{u})\right)$$
$$= 2\sqrt{n}\left(\int_{[0,1]^d} C(\mathbf{u})\,dC_n(\mathbf{u}) - \int_{[0,1]^d} C(\mathbf{u})\,dC(\mathbf{u})\right) + \sqrt{n}\left(\int_{[0,1]^d} \big(C_n(\mathbf{u}) - C(\mathbf{u})\big)\,dC_n(\mathbf{u}) - \int_{[0,1]^d} \big(C_n(\mathbf{u}) - C(\mathbf{u})\big)\,dC(\mathbf{u})\right)$$
$$= 2\sqrt{n}\int_{[0,1]^d} C(\mathbf{u})\,d\big(C_n(\mathbf{u}) - C(\mathbf{u})\big) + \sqrt{n}\int_{[0,1]^d} \big(C_n(\mathbf{u}) - C(\mathbf{u})\big)\,d\big(C_n(\mathbf{u}) - C(\mathbf{u})\big)$$
$$= 2\sqrt{n}\int_{[0,1]^d} C(\mathbf{u})\,d\big(C_n(\mathbf{u}) - C(\mathbf{u})\big) + o_P(1),$$
and the variable on the right-hand side can be shown to be asymptotically normal in an elementary way.
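A corresponding sample version of the multivariate Kendall's tau can be computed by noting that $\int C_n\,dC_n$ is, up to boundary and tie effects, the average of $C_n$ over the pseudo-observations. The following $O(n^2)$ Python sketch is ours and purely illustrative.

```python
import numpy as np
from scipy.stats import rankdata

def multivariate_kendall_tau(X):
    """Sample version of the multivariate Kendall's tau of [40],
    tau(C_n) = (2^d * int C_n dC_n - 1) / (2^(d-1) - 1)."""
    n, d = X.shape
    U = rankdata(X, axis=0) / n
    # dominated[l, k] = 1{U_l <= U_k componentwise}; its mean is (1/n) sum_k C_n(U_k).
    dominated = (U[:, None, :] <= U[None, :, :]).all(axis=2)
    integral = dominated.mean()
    return (2.0 ** d * integral - 1.0) / (2.0 ** (d - 1) - 1.0)

rng = np.random.default_rng(5)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=800)
print(multivariate_kendall_tau(X))
```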
Early in the last century, Corrado Gini proposed a sample measure of association based on absolute differences in ranks. The population version of that measure, for random variables X 1 and X 2 with copula C ( · ) , is given by, see e.g., [4,41],
$$\Lambda := 2\int_{[0,1]^2} \big(|u_1 + u_2 - 1| - |u_1 - u_2|\big)\,dC(u_1, u_2).$$
For every $\mathbf{u} \in [0,1]^d$, we have
$$W(\mathbf{u}) := \max\left(\sum_{i=1}^{d} u_i - d + 1,\ 0\right) \leq C(\mathbf{u}) \leq \min(u_1, u_2, \ldots, u_d) =: M(\mathbf{u}).$$
We denote by $A(\cdot)$ the function $\big(M(\cdot) + W(\cdot)\big)/2$. We define the survival function $\bar{K}(\cdot)$ of a measurable function $K : [0,1]^d \to [0,1]$ by
$$\bar{K}(\mathbf{u}) := 1 + \sum_{k=1}^{d} (-1)^k \sum_{1 \leq i_1 < i_2 < \cdots < i_k \leq d} K_{i_1 i_2 \cdots i_k}\big(u_{i_1}, u_{i_2}, \ldots, u_{i_k}\big),$$
where the functions on the right-hand side are the appropriate lower-dimensional margins of $K(\cdot)$. A multivariate version of Gini's gamma is then defined as
$$\gamma(C) := \frac{1}{b(d) - a(d)} \left( \int_{[0,1]^d} \{A(\mathbf{u}) + \bar{A}(\mathbf{u})\}\,dC(\mathbf{u}) - a(d) \right),$$
with normalization constants $a(d)$ and $b(d)$ of the form
$$a(d) = \int_{[0,1]^d} \{A(\mathbf{u}) + \bar{A}(\mathbf{u})\}\,d\mathbf{u} = \frac{1}{d+1} + \frac{1}{2(d+1)!} + \sum_{i=0}^{d} (-1)^i \binom{d}{i} \frac{1}{2(i+1)!}$$
and
$$b(d) = \int_{[0,1]^d} \{A(\mathbf{u}) + \bar{A}(\mathbf{u})\}\,dM(\mathbf{u}) = 1 - \sum_{i=1}^{d-1} \frac{1}{4i},$$
for details refer to [42,43]. Let Q d ( C , M ) be the probability of concordance between C T and M defined as (see [2])
Q d ( C T , M ) = [ 0 , 1 ] d ( M ( u ) + M ¯ ( u ) ) d C T ( u ) .
For any $d$-copula $C$, the multivariate Spearman's footrule $\varphi_d(C)$ (or simply $\theta$, if there is no confusion), proposed by [44], can be defined as
$$\theta = \frac{Q_d(C, M) - a_d}{b_d - a_d},$$
where
$$a_d = Q_d(\Pi, M) = \int_{[0,1]^d} \big(M(\mathbf{u}) + \bar{M}(\mathbf{u})\big)\,d\prod_{i=1}^{d} u_i = \frac{2}{d+1}$$
and
$$b_d = Q_d(M, M) = \int_0^1 \big(M(t, \ldots, t) + \bar{M}(t, \ldots, t)\big)\,dt = 1;$$
that is,
$$\theta = 1 - \frac{(d+1)\big(1 - Q_d(C, M)\big)}{d - 1}.$$
Notice that $\theta$ can alternatively be written as
$$\theta = 1 - \frac{d+1}{d-1} \int_{[0,1]^d} \Big( \max_{1 \leq j \leq d} u_j - \min_{1 \leq j \leq d} u_j \Big)\,dC(\mathbf{u}).$$
A natural estimator of $\theta$ is given by
$$\theta_n = 1 - \frac{d+1}{d-1} \int_{[0,1]^d} \Big( \max_{1 \leq j \leq d} u_j - \min_{1 \leq j \leq d} u_j \Big)\,dC_n(\mathbf{u}).$$
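Replacing $C_n$ by its rank-based version turns the integral into a sample average of $\max_j \hat U_{j;k} - \min_j \hat U_{j;k}$, so $\theta_n$ is straightforward to compute; the Python sketch below is our own illustration (valid for $d \geq 2$).

```python
import numpy as np
from scipy.stats import rankdata

def multivariate_footrule(X):
    """Sample version of the multivariate Spearman's footrule of [44],
    theta_n = 1 - (d+1)/(d-1) * int (max_j u_j - min_j u_j) dC_n(u)."""
    n, d = X.shape
    U = rankdata(X, axis=0) / n                       # pseudo-observations
    spread = U.max(axis=1) - U.min(axis=1)
    return 1.0 - (d + 1.0) / (d - 1.0) * spread.mean()

rng = np.random.default_rng(6)
X = rng.multivariate_normal([0, 0, 0], 0.3 * np.eye(3) + 0.7, size=1000)
print(multivariate_footrule(X))
```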
Following [45], let $|A|$ denote the cardinality of any set $A \subseteq D = \{1, \ldots, d\}$, and denote by $\mathbf{t}_A$ the vector $(t_1, \ldots, t_d)$ such that $t_\ell = t\,1\!\mathrm{I}(\ell \in A) + 1\!\mathrm{I}(\ell \notin A)$ for all $\ell \in \{1, \ldots, d\}$, so that, for example, $\mathbf{t}_D = (t, \ldots, t)$. Then
$$\sqrt{n}\big(\theta_n - \theta\big) \rightsquigarrow N\big(0, \sigma_C^2\big),$$
where the variance is given by
$$\sigma_C^2 = \left(\frac{d+1}{d-1}\right)^2 \left( \Gamma(D, D) + 2\sum_{A \subsetneq D} (-1)^{|A|}\,\Gamma(A, D) + \bar{\Gamma}(D, D) \right),$$
where, for arbitrary $A, B \subseteq D$, one has
$$\Gamma(A, B) = \int_0^1 \int_0^1 \operatorname{cov}\big(\gamma(\mathbf{s}_A), \gamma(\mathbf{t}_B)\big)\,ds\,dt,$$
and
$$\bar{\Gamma}(D, D) = \sum_{A \subseteq D} \sum_{B \subseteq D} (-1)^{|A| + |B|}\,\Gamma(A, B) = \int_0^1 \int_0^1 \operatorname{cov}\big(\bar{\gamma}(\mathbf{s}_D), \bar{\gamma}(\mathbf{t}_D)\big)\,ds\,dt;$$
refer to [45] for the details. When $C$ is radially symmetric, that is, $C = \bar{C}$, one gets $\bar{\Gamma}(D, D) = \Gamma(D, D)$; refer also to [46]. Let us recall the multivariate version of Blomqvist's beta,
$$\beta_{d,C} = \frac{2^{d-1}\big(C(\mathbf{1/2}) + \bar{C}(\mathbf{1/2})\big) - 1}{2^{d-1} - 1};$$
refer to [44]. A natural estimator is given by
$$\beta_{d,C,n} = \frac{2^{d-1}\big(C_n(\mathbf{1/2}) + \bar{C}_n(\mathbf{1/2})\big) - 1}{2^{d-1} - 1}.$$
From Proposition 1 of [47], we have
$$\sqrt{n}\big(\beta_{d,C,n} - \beta_{d,C}\big) \rightsquigarrow N(0, \sigma^2),$$
where
$$\sigma^2 = \frac{2^{2d-2}\Big( C(\mathbf{1/2}) + \bar{C}(\mathbf{1/2}) - \big\{C(\mathbf{1/2}) + \bar{C}(\mathbf{1/2})\big\}^2 \Big)}{\big(2^{d-1} - 1\big)^2}.$$
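Since $C_n(\mathbf{1/2})$ and $\bar C_n(\mathbf{1/2})$ are simply the proportions of pseudo-observations lying entirely below or entirely above the componentwise median, the estimator is elementary to compute. The sketch below is our own illustration.

```python
import numpy as np
from scipy.stats import rankdata

def multivariate_blomqvist_beta(X):
    """Sample version of the multivariate Blomqvist's beta of [44],
    beta_n = (2^(d-1) * [C_n(1/2) + C_n_bar(1/2)] - 1) / (2^(d-1) - 1)."""
    n, d = X.shape
    U = rankdata(X, axis=0) / n
    C_half = np.mean((U <= 0.5).all(axis=1))        # C_n(1/2, ..., 1/2)
    C_bar_half = np.mean((U > 0.5).all(axis=1))     # survival version at (1/2, ..., 1/2)
    return (2.0 ** (d - 1) * (C_half + C_bar_half) - 1.0) / (2.0 ** (d - 1) - 1.0)

rng = np.random.default_rng(7)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.4], [0.4, 1.0]], size=1000)
print(multivariate_blomqvist_beta(X))
```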
Suppose now that we wish to test the hypothesis
$$H_0 : C(u_1, u_2) = C(u_2, u_1), \quad \text{for } (u_1, u_2) \in [0,1]^2.$$
Let us introduce the process
$$S_n(u_1, u_2) = \sqrt{n}\big(C_n(u_1, u_2) - C_n(u_2, u_1)\big) \stackrel{H_0}{=} \gamma_n(u_1, u_2) - \gamma_n(u_2, u_1).$$
It is straightforward to prove that
$$\gamma_n(u_1, u_2) - \gamma_n(u_2, u_1) \rightsquigarrow \gamma(u_1, u_2) - \gamma(u_2, u_1) \quad (n \to \infty).$$
An application of the bootstrap continuous mapping theorem, see [37], gives
$$\tilde{S}_n(u_1, u_2) = \hat{\gamma}_{nN_n}(u_1, u_2) - \hat{\gamma}_{nN_n}(u_2, u_1) \rightsquigarrow_{P} \gamma(u_1, u_2) - \gamma(u_2, u_1).$$
Let us define the statistic
$$K_n = \sup_{u_1, u_2 \in [0,1]} |S_n(u_1, u_2)|,$$
and reject $H_0$ whenever $K_n$ is larger than the $(1-\alpha)$ quantile of the law of $K_n$, which depends in a complex manner on the unknown copula. Using a bootstrap sample, it is possible to estimate these quantiles through
$$\tilde{K}_n = \sup_{u_1, u_2 \in [0,1]} |\tilde{S}_n(u_1, u_2)|.$$
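A schematic Monte Carlo implementation of this exchangeability test is given below. It is our own sketch, not part of the paper: the grid size, number of bootstrap replicates, and the Poisson choice for the random resample size are arbitrary, and the `empirical_copula` helper is the one sketched in Section 2.

```python
import numpy as np

def exchangeability_test(X, n_boot=200, m=40, alpha=0.05, rng=None):
    """Bootstrap test of H0: C(u1, u2) = C(u2, u1), based on
    K_n = sup |S_n(u1, u2)| approximated over an m x m grid."""
    rng = rng or np.random.default_rng()
    n = len(X)
    u = (np.arange(m) + 0.5) / m
    UU, VV = np.meshgrid(u, u, indexing="ij")
    grid = np.column_stack([UU.ravel(), VV.ravel()])
    grid_swapped = grid[:, ::-1]

    Cn = empirical_copula(X, grid)                   # helper sketched in Section 2
    Cn_swapped = empirical_copula(X, grid_swapped)
    K_n = np.sqrt(n) * np.max(np.abs(Cn - Cn_swapped))

    # Bootstrap replicates of sup |gamma_hat(u1, u2) - gamma_hat(u2, u1)|,
    # using the same resample for both orientations.
    K_boot = np.empty(n_boot)
    for b in range(n_boot):
        N = max(1, rng.poisson(n))                   # random resample size
        Xb = X[rng.integers(0, n, size=N)]
        g1 = np.sqrt(N) * (empirical_copula(Xb, grid) - Cn)
        g2 = np.sqrt(N) * (empirical_copula(Xb, grid_swapped) - Cn_swapped)
        K_boot[b] = np.max(np.abs(g1 - g2))
    crit = np.quantile(K_boot, 1.0 - alpha)
    return K_n, crit, K_n > crit
```

The third returned value indicates whether $H_0$ is rejected at level $\alpha$.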
We now wish to test
$$H_0 : C(u_1, u_2) = u_1 + u_2 - 1 + C(1 - u_1, 1 - u_2)$$
against
$$H_1 : C(u_1, u_2) \neq u_1 + u_2 - 1 + C(1 - u_1, 1 - u_2).$$
Let us consider the process
$$A_n(u_1, u_2) = \sqrt{n}\Big( C_n(u_1, u_2) - \big(u_1 + u_2 - 1 + C_n(1 - u_1, 1 - u_2)\big) \Big) \stackrel{H_0}{=} \gamma_n(u_1, u_2) - \gamma_n(1 - u_1, 1 - u_2).$$
In [46] it is shown that
$$\gamma_n(u_1, u_2) - \gamma_n(1 - u_1, 1 - u_2) \rightsquigarrow \gamma(u_1, u_2) - \gamma(1 - u_1, 1 - u_2) \quad (n \to \infty).$$
By the bootstrap continuous mapping theorem, see [37], we obtain
$$\tilde{A}_n(u_1, u_2) = \hat{\gamma}_{nN_n}(u_1, u_2) - \hat{\gamma}_{nN_n}(1 - u_1, 1 - u_2) \rightsquigarrow_{P} \gamma(u_1, u_2) - \gamma(1 - u_1, 1 - u_2).$$
In a similar way, we can perform statistical tests for radial symmetry.
Remark 3. 
Theorem 1 may easily be used with standard bootstrap sampling, as we now describe. Let $N$ be a large integer. For any $k = 1, \ldots, N$, let
$$\hat{\gamma}_{nN_n}^{(k)}(\mathbf{u}) := N_n^{1/2}\big(\hat{C}_{nN_n}^{(k)}(\mathbf{u}) - C_n(\mathbf{u})\big), \quad \text{for } \mathbf{u} \in [0,1]^d.$$
Now, according to Theorem 1, we readily obtain that
$$\big(\gamma_n(\cdot), \hat{\gamma}_{nN_n}^{(1)}(\cdot), \ldots, \hat{\gamma}_{nN_n}^{(N)}(\cdot)\big) \rightsquigarrow \big(\gamma(\cdot), \gamma^{(1)}(\cdot), \ldots, \gamma^{(N)}(\cdot)\big) \quad \text{in } \big(\ell^\infty([0,1]^d)\big)^{N+1},$$
where $\gamma^{(1)}(\cdot), \ldots, \gamma^{(N)}(\cdot)$ are independent copies of $\gamma(\cdot)$. In order to approximate the limiting distribution of $\{\gamma_n(\mathbf{u}) : \mathbf{u} \in [0,1]^d,\ n > 0\}$, one can use the empirical distribution of $\hat{\gamma}_{nN_n}^{(1)}(\cdot), \ldots, \hat{\gamma}_{nN_n}^{(N)}(\cdot)$, for $N$ large enough. In the preceding examples, the statistics can be written as functionals of the empirical copula, and their asymptotic behavior can be deduced from the weak convergence properties of the bootstrapped empirical copula process. To be more explicit, if we are interested in carrying out a statistical test based on a “smooth” functional
$$S_n := \varphi(\gamma_n),$$
with the convention that large values of $S_n$ lead to the rejection of the null hypothesis $H_0$, then, under certain regularity conditions, a valid approximation to the P-value is given by
$$\frac{1}{N}\sum_{k=1}^{N} 1\!\mathrm{I}\big\{S_n^{(k)} \geq S_n\big\},$$
where
$$S_n^{(k)} := \varphi\big(\hat{\gamma}_{nN_n}^{(k)}\big);$$
we may refer to [31,48,49,50,51,52,53].
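The Monte Carlo scheme just described can be written generically as follows. This is our own sketch: `phi` stands for the functional $\varphi$, the Poisson resample size is only one admissible choice, and the `empirical_copula` helper is the one sketched in Section 2.

```python
import numpy as np

def bootstrap_p_value(X, U, phi, S_n, n_rep=500, rng=None):
    """Monte Carlo approximation of the p-value (1/N) sum_k 1{ S_n^(k) >= S_n }.

    phi : functional applied to the bootstrapped copula process on the grid U,
          e.g. phi = lambda g: np.max(np.abs(g)) for a sup-norm statistic.
    S_n : observed value of the test statistic.
    """
    rng = rng or np.random.default_rng()
    n = len(X)
    Cn = empirical_copula(X, U)                      # helper sketched in Section 2
    S_rep = np.empty(n_rep)
    for k in range(n_rep):
        N = max(1, rng.poisson(n))                   # random resample size, cf. Remark 1
        Xb = X[rng.integers(0, n, size=N)]
        S_rep[k] = phi(np.sqrt(N) * (empirical_copula(Xb, U) - Cn))
    return float(np.mean(S_rep >= S_n))
```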

5. Concluding Remarks

This note's primary objective is to describe the weak convergence of the bootstrap empirical copula process with random resample size. The principal results are applied to the estimation of sample rank correlation coefficients of Spearman and Kendall type. In addition, we address the application of our findings to statistical testing. A natural extension of this work would be to the conditional copula, using kernel or k-nearest-neighbour estimators; at present, it is beyond reasonable hope to achieve this program without new technical arguments. Another direction of research is to consider projection pursuit regression and the projection pursuit conditional copula, which require an extension and generalization of the methods used in the present work. If the conditional copula function is smooth enough, that is, p + 1 times differentiable at a fixed x_0, it would be preferable to use local polynomial regression techniques, refer to [54], to obtain a more appropriate estimate at x_0 than that given by the Nadaraya-Watson estimator. We do not treat the weak convergence of such estimators in the present paper and leave it for future investigation. It would also be of interest to extend the present work to the incomplete data setting, which requires nontrivial mathematics that go well beyond the scope of the present paper. A further research direction would be to study the problem investigated in this work in the setting of serially dependent observations.

6. Proof

This section is devoted to the proof of our result. The notation introduced above remains in force.
Proof of Theorem 1. 
We shall use the functional delta method to prove Theorem 1; here we follow the arguments of [17,55]. Let
$$\Phi(F) = F\big(F_1^{-1}, \ldots, F_d^{-1}\big).$$
We decompose the map $\Phi(\cdot)$ into three simpler maps as follows:
$$F \;\xrightarrow{\ \varphi_1\ }\; \big(F_1, \ldots, F_d, F\big) \;\xrightarrow{\ \varphi_2\ }\; \big(F_1^{-1}, \ldots, F_d^{-1}, F\big) \;\xrightarrow{\ \varphi_3\ }\; F\big(F_1^{-1}, \ldots, F_d^{-1}\big).$$
Hence we obtain
$$\Phi(\cdot) = \varphi_3 \circ \varphi_2 \circ \varphi_1(\cdot),$$
and an application of the chain rule (see [17], Theorem 3.9.3) yields
$$\Phi'(\theta) = \varphi_3'\big(\varphi_2 \circ \varphi_1(\theta)\big) \circ \varphi_2'\big(\varphi_1(\theta)\big) \circ \varphi_1'(\theta).$$
We note that the map $\varphi_1(\cdot)$ is linear and continuous, which implies that it is Hadamard differentiable, with derivative
$$\varphi_1'(F)(\alpha) = (\alpha_1, \ldots, \alpha_d, \alpha).$$
By Lemma 3.9.23 in [17], the map $\varphi_2(\cdot)$ is Hadamard differentiable tangentially to $C([0,1]^d)$, and the derivative is
$$\varphi_2'\big(F_1, \ldots, F_d, F\big)\big(\gamma_1, \ldots, \gamma_d, \xi\big) = \left( -\frac{\gamma_1}{f_1} \circ F_1^{-1}, \ldots, -\frac{\gamma_d}{f_d} \circ F_d^{-1}, \xi \right).$$
Finally, the Hadamard differentiability of the map $\varphi_3(\cdot)$ is a consequence of Lemma 3.9.27 in [17], and the derivative is given by
$$\varphi_3'\big(F_1^{-1}, \ldots, F_d^{-1}, F\big)\big(\mu_1, \ldots, \mu_d, \nu\big) = \nu\big(F_1^{-1}, \ldots, F_d^{-1}\big) + \left( \frac{\partial F(F_1^{-1}, \ldots, F_d^{-1})}{\partial F_1^{-1}}, \ldots, \frac{\partial F(F_1^{-1}, \ldots, F_d^{-1})}{\partial F_d^{-1}} \right) \begin{pmatrix} \mu_1 \\ \vdots \\ \mu_d \end{pmatrix} = \nu\big(F_1^{-1}, \ldots, F_d^{-1}\big) + \sum_{i=1}^{d} \frac{\partial F(F_1^{-1}, \ldots, F_d^{-1})}{\partial F_i^{-1}}\,\mu_i.$$
Combining the last three results, we establish that the map $\Phi(\cdot)$ is Hadamard differentiable, as a composition of Hadamard differentiable maps. The derivative is given by
$$\Phi'(F)(\alpha)(\mathbf{u}) = \varphi_3'\big(\varphi_2 \circ \varphi_1(F)\big) \circ \varphi_2'\big(\varphi_1(F)\big) \circ \varphi_1'(F)(\alpha)(\mathbf{u}) = \alpha\big(F_1^{-1}(u_1), \ldots, F_d^{-1}(u_d)\big) - \sum_{i=1}^{d} \frac{\partial F\big(F_1^{-1}(u_1), \ldots, F_d^{-1}(u_d)\big)}{\partial F_i^{-1}(u_i)}\, \frac{\alpha_i\big(F_i^{-1}(u_i)\big)}{f_i\big(F_i^{-1}(u_i)\big)} = \alpha\big(F_1^{-1}(u_1), \ldots, F_d^{-1}(u_d)\big) - \sum_{i=1}^{d} \frac{\partial C(u_1, \ldots, u_d)}{\partial u_i}\,\alpha_i\big(F_i^{-1}(u_i)\big).$$
This, combined with Theorem 3.9.4 of [17], implies that
$$\hat{\gamma}_{nN_n} \rightsquigarrow_{P} \Phi'(F)(\alpha).$$
The proof is completed by observing that
$$\Phi'(F)(\alpha)(\mathbf{u}) = \alpha(\mathbf{u}) - \sum_{j=1}^{d} \dot{C}_j(\mathbf{u})\,\alpha_j(u_j), \quad \mathbf{u} \in [0,1]^d.$$
This completes the proof. □

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The author is indebted to the Editor-in-Chief, Associate Editor and the referees for their very generous comments and suggestions on the first version of our article which helped us to improve content, presentation, and layout of the manuscript.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Sklar, A. Fonctions de répartition à n dimensions et leurs marges. Publ. Inst. Stat. Univ. Paris 1959, 8, 229–231.
  2. Nelsen, R.B. An Introduction to Copulas, 2nd ed.; Springer Series in Statistics; Springer: New York, NY, USA, 2006; pp. xiv+269.
  3. Joe, H. Multivariate Models and Dependence Concepts; Monographs on Statistics and Applied Probability; Chapman & Hall: London, UK, 1997; Volume 73, pp. xviii+399.
  4. Schweizer, B. Thirty years of copulas. In Advances in Probability Distributions with Given Marginals; Mathematics and Its Applications; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1991; Volume 67, pp. 13–50.
  5. Sklar, A. Random variables, joint distribution functions, and copulas. Kybernetika 1973, 9, 449–460.
  6. Frees, E.W.; Valdez, E.A. Understanding relationships using copulas. N. Am. Actuar. J. 1998, 2, 1–25.
  7. Cui, S.; Sun, Y. Checking for the gamma frailty distribution under the marginal proportional hazards frailty model. Stat. Sin. 2004, 14, 249–267.
  8. Cherubini, U.; Luciano, E.; Vecchiato, W. Copula Methods in Finance; Wiley Finance Series; John Wiley & Sons, Ltd.: Chichester, UK, 2004; pp. xvi+293.
  9. McNeil, A.J.; Frey, R.; Embrechts, P. Quantitative Risk Management: Concepts, Techniques and Tools; Princeton Series in Finance; Princeton University Press: Princeton, NJ, USA, 2005; pp. xvi+538.
  10. Deheuvels, P. La fonction de dépendance empirique et ses propriétés. Un test non paramétrique d’indépendance. Bull. Acad. R. Belg. 1979, 65, 274–292.
  11. Stute, W. The oscillation behavior of empirical processes: The multivariate case. Ann. Probab. 1984, 12, 361–379.
  12. Gaenssler, P.; Stute, W. Seminar on Empirical Processes; DMV Seminar; Birkhäuser Verlag: Basel, Switzerland, 1987; Volume 9, pp. vi+110.
  13. Rüschendorf, L. On the empirical process of multivariate, dependent random variables. J. Multivar. Anal. 1974, 4, 469–478.
  14. Rüschendorf, L. Asymptotic distributions of multivariate rank order statistics. Ann. Stat. 1976, 4, 912–923.
  15. Deheuvels, P. A multivariate Bahadur-Kiefer representation for the empirical copula process. Zap. Nauchn. Sem. S.-Peterburg. Otdel. Mat. Inst. Steklov. (POMI) 2009, 364, 120–147, 237.
  16. Segers, J. Asymptotics of empirical copula processes under non-restrictive smoothness assumptions. Bernoulli 2012, 18, 764–782.
  17. van der Vaart, A.W.; Wellner, J.A. Weak Convergence and Empirical Processes: With Applications to Statistics; Springer Series in Statistics; Springer: New York, NY, USA, 1996; pp. xvi+508.
  18. Ghoudi, K.; Rémillard, B. Empirical processes based on pseudo-observations II: The multivariate case. In Asymptotic Methods in Stochastics; Fields Institute Communications; American Mathematical Society: Providence, RI, USA, 2004; Volume 44, pp. 381–406.
  19. Tsukahara, H. Semiparametric estimation in copula models. Canad. J. Stat. 2005, 33, 357–375.
  20. van der Vaart, A.W.; Wellner, J.A. Empirical processes indexed by estimated functions. In Asymptotics: Particles, Processes and Inverse Problems; IMS Lecture Notes-Monograph Series; Institute of Mathematical Statistics: Beachwood, OH, USA, 2007; Volume 55, pp. 234–252.
  21. Scaillet, O. A Kolmogorov-Smirnov type test for positive quadrant dependence. Canad. J. Stat. 2005, 33, 415–427.
  22. Efron, B. Bootstrap methods: Another look at the jackknife. Ann. Stat. 1979, 7, 1–26.
  23. Efron, B.; Tibshirani, R.J. An Introduction to the Bootstrap; Monographs on Statistics and Applied Probability; Chapman and Hall: New York, NY, USA, 1993; Volume 57, pp. xvi+436.
  24. Bouzebda, S. Bootstrap de l’estimateur de Hill: Théorèmes limites. Ann. I.S.U.P. 2010, 54, 61–72.
  25. Alvarez-Andrade, S.; Bouzebda, S. Some selected topics for the bootstrap of the empirical and quantile processes. Theory Stoch. Process. 2019, 24, 19–48.
  26. Soukarieh, I.; Bouzebda, S. Renewal type bootstrap for increasing degree U-process of a Markov chain. J. Multivar. Anal. 2023, 195, 105143.
  27. Bouzebda, S.; Soukarieh, I. Renewal type bootstrap for U-process Markov chains. Markov Process. Relat. Fields 2022, 28, 673–735.
  28. Soukarieh, I.; Bouzebda, S. Exchangeably Weighted Bootstraps of General Markov U-Process. Mathematics 2022, 10, 3745.
  29. Tsukahara, H. Empirical Copulas and Some Applications; Research Report 27; The Institute for Economic Studies, Seijo University: Tokyo, Japan, 2000.
  30. Bouzebda, S.; Zari, T. Strong approximation of empirical copula processes by Gaussian processes. Statistics 2013, 47, 1047–1063.
  31. Bouzebda, S. On the strong approximation of bootstrapped empirical copula processes with applications. Math. Methods Stat. 2012, 21, 153–188.
  32. Bouzebda, S. Kac’s representation for empirical copula process from an asymptotic viewpoint. Stat. Probab. Lett. 2017, 123, 107–113.
  33. Bouzebda, S. Some applications of the strong approximation of the integrated empirical copula processes. Math. Methods Stat. 2016, 25, 281–303.
  34. Bouzebda, S. Some new multivariate tests of independence. Math. Methods Stat. 2011, 20, 192–205.
  35. Bouzebda, S.; Zari, T. Asymptotic behavior of weighted multivariate Cramér-von Mises-type statistics under contiguous alternatives. Math. Methods Stat. 2013, 22, 226–252.
  36. Bouzebda, S. Asymptotic properties of pseudo maximum likelihood estimators and test in semi-parametric copula models with multiple change points. Math. Methods Stat. 2014, 23, 38–65.
  37. Kosorok, M.R. Introduction to Empirical Processes and Semiparametric Inference; Springer Series in Statistics; Springer: New York, NY, USA, 2008; pp. xiv+483.
  38. Toan, N.V. On weak convergence of the bootstrap general empirical process with random resample size. Vietnam J. Math. 2014, 42, 233–245.
  39. Schmid, F.; Schmidt, R. Multivariate extensions of Spearman’s rho and related statistics. Stat. Probab. Lett. 2007, 77, 407–416.
  40. Joe, H. Multivariate concordance. J. Multivar. Anal. 1990, 35, 12–30.
  41. Dall’Aglio, G. Fréchet classes: The beginnings. In Advances in Probability Distributions with Given Marginals; Mathematics and Its Applications; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1991; Volume 67, pp. 1–12.
  42. Schmid, F.; Schmidt, R.; Blumentritt, T.; Gaißer, S.; Ruppert, M. Copula-based measures of multivariate association. In Copula Theory and Its Applications; Lecture Notes in Statistics; Springer: Berlin/Heidelberg, Germany, 2010; Volume 198, pp. 209–236.
  43. Behboodian, J.; Dolati, A.; Úbeda Flores, M. A multivariate version of Gini’s rank association coefficient. Stat. Pap. 2007, 48, 295–304.
  44. Úbeda Flores, M. Multivariate versions of Blomqvist’s beta and Spearman’s footrule. Ann. Inst. Stat. Math. 2005, 57, 781–788.
  45. Genest, C.; Nešlehová, J.; Ben Ghorbal, N. Spearman’s footrule and Gini’s gamma: A review with complements. J. Nonparametr. Stat. 2010, 22, 937–954.
  46. Bouzebda, S.; Cherfi, M. Test of symmetry based on copula function. J. Stat. Plann. Inference 2012, 142, 1262–1271.
  47. Schmid, F.; Schmidt, R. Nonparametric inference on multivariate versions of Blomqvist’s beta and related measures of tail dependence. Metrika 2007, 66, 323–354.
  48. Bouzebda, S.; Zari, T. Strong approximation of multidimensional P-P plots processes by Gaussian processes with applications to statistical tests. Math. Methods Stat. 2014, 23, 210–238.
  49. Radulović, D.; Wegkamp, M.; Zhao, Y. Weak convergence of empirical copula processes indexed by functions. Bernoulli 2017, 23, 3346–3384.
  50. Rémillard, B.; Scaillet, O. Testing for equality between two copulas. J. Multivar. Anal. 2009, 100, 377–386.
  51. Bouzebda, S. General Tests of Conditional Independence Based on Empirical Processes Indexed by Functions. Jpn. J. Stat. Data Sci. 2023, 21, 1–59.
  52. Bouzebda, S. General tests of independence based on empirical processes indexed by functions. Stat. Methodol. 2014, 21, 59–87.
  53. Bücher, A.; Dette, H. A note on bootstrap approximations for the empirical copula process. Stat. Probab. Lett. 2010, 80, 1925–1932.
  54. Fan, J.; Gijbels, I. Local Polynomial Modelling and Its Applications; Monographs on Statistics and Applied Probability; Chapman & Hall: London, UK, 1996; Volume 66, pp. xvi+341.
  55. Jordan, A.; Ivanoff, B.G. Multidimensional pp plots and precedence tests for point processes on R^d. J. Multivar. Anal. 2013, 115, 122–137.
