Equivalent Conditions of Complete p-th Moment Convergence for Weighted Sum of ND Random Variables under Sublinear Expectation Space

We investigate complete convergence and complete p-th moment convergence for weighted sums of sequences of negatively dependent (ND) random variables under sublinear expectation space. Using a moment inequality and truncation methods, we prove equivalent conditions of complete convergence and of complete p-th moment convergence for weighted sums of sequences of ND random variables under sublinear expectation space.


Introduction
The theories of nonadditive probabilities and nonadditive expectations are useful tools for studying measures of risk, uncertainty in statistics, nonlinear stochastic calculus and superhedging in finance; cf. Peng [1,2], Denis [3], Gilboa [4], Marinacci [5]. This paper considers the general sublinear expectations introduced by Peng [6][7][8] in a general space, obtained by relaxing the linearity of the classical expectation to subadditivity and positive homogeneity (cf. Definition 1 below). The sublinear expectation concept provides a very flexible framework for modeling problems that are not additive. Inspired by the work of Peng, researchers have studied many limit theorems under sublinear expectation space that extend the corresponding results in probability and statistics. Zhang [9][10][11] studied exponential inequalities, Rosenthal's inequalities, Hölder's inequalities and Donsker's invariance principle under sublinear expectation space. Chen [12][13][14] studied the strong law of large numbers, the weak law of large numbers, and large deviations for ND random variables under sublinear expectations, respectively. Wu [15] obtained precise asymptotics for complete integral convergence under sublinear expectation space. For more research on limit theorems under sublinear expectation space, the reader may refer to the articles of Hu and Peng [15], Li and Li [16], Liu [17], Ding [18], Wu [19], Guo and Zhang [20,21], and Dong and Tan [22].
Recently, Guo and Shan [23] studied equivalent conditions of complete q-th moment convergence for sums of sequences of negatively orthant dependent (NOD) random variables in the classical probability space. Xu and Cheng [24,25] obtained equivalent conditions of complete convergence and of p-th moment convergence for sums of sequences of independent identically distributed (i.i.d.) random variables under sublinear expectation space. ND sequences have wide applications in percolation theory, multivariate statistics, etc. Therefore, it is necessary to generalize the properties of independent sequences to ND sequences, and it is meaningful to extend the results of [24,25] to ND random variables under sublinear expectation space.

We first recall the basic setting. Let C_l,Lip(R^n) denote the space of local Lipschitz functions ϕ on R^n, i.e., functions satisfying |ϕ(x) − ϕ(y)| ≤ C(1 + |x|^m + |y|^m)|x − y|, ∀x, y ∈ R^n, for some C > 0 and m ∈ N depending on ϕ.

Definition 1. A sublinear expectation E on H is a functional E : H → R satisfying the following properties: for all X, Y ∈ H, we have (a) monotonicity: X ≥ Y implies E[X] ≥ E[Y]; (b) constant preserving: E[c] = c for all c ∈ R; (c) sub-additivity: E[X + Y] ≤ E[X] + E[Y]; (d) positive homogeneity: E[λX] = λE[X] for all λ ≥ 0. The triple (Ω, H, E) is called a sublinear expectation space.

In this paper, given a sublinear expectation space (Ω, H, E), we set the capacity V(A) := E[I_A] for A ∈ F. We set the Choquet expectation C_V by C_V(X) := ∫_0^∞ V(X ≥ t) dt + ∫_{−∞}^0 (V(X ≥ t) − 1) dt.

Definition 2. Let X_1 be an n-dimensional random vector defined on a sublinear expectation space (Ω_1, H_1, E_1) and X_2 an n-dimensional random vector defined on a sublinear expectation space (Ω_2, H_2, E_2). They are called identically distributed, denoted by X_1 =d X_2, if E_1[ϕ(X_1)] = E_2[ϕ(X_2)] for all ϕ ∈ C_l,Lip(R^n).

Definition 3. In a sublinear expectation space (Ω, H, E), a random vector Y = (Y_1, . . . , Y_n), Y_i ∈ H, is said to be independent to another random vector X = (X_1, . . . , X_m), X_i ∈ H, if E[ϕ(X, Y)] = E[E[ϕ(x, Y)]|_{x=X}] for each ϕ ∈ C_l,Lip(R^m × R^n). Random variables {X_n, n ≥ 1} are said to be independent if X_{i+1} is independent to (X_1, . . . , X_i) for each i ≥ 1.
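Concretely, a sublinear expectation can be represented as a supremum of ordinary linear expectations over a family of probability measures (Peng's representation theorem). The following minimal numerical sketch checks the axioms of Definition 1 on a hypothetical two-measure family over a three-point sample space; the family and the random variables X, Y are illustrative toy data, not objects from this paper.

```python
# Toy sublinear expectation: E[X] = max over a finite family of linear
# expectations (the form guaranteed by Peng's representation theorem).
outcomes = [0, 1, 2]              # finite sample space (hypothetical)
family = [                        # two candidate probability vectors
    [0.2, 0.5, 0.3],
    [0.6, 0.1, 0.3],
]

def E(X):
    """Sublinear expectation: supremum of linear expectations over the family."""
    return max(sum(p_w * X(w) for p_w, w in zip(p, outcomes)) for p in family)

X = lambda w: w                   # X(omega) = omega
Y = lambda w: (w - 1) ** 2

# (b) Constant preserving: E[c] = c.
assert abs(E(lambda w: 5.0) - 5.0) < 1e-12
# (c) Sub-additivity: E[X + Y] <= E[X] + E[Y].
assert E(lambda w: X(w) + Y(w)) <= E(X) + E(Y) + 1e-12
# (d) Positive homogeneity: E[lambda X] = lambda E[X] for lambda >= 0.
assert abs(E(lambda w: 3.0 * X(w)) - 3.0 * E(X)) < 1e-12
print("sublinear-expectation axioms verified on the toy family")
```

Note that the maximum over two measures is genuinely nonadditive here: E[X + Y] = 1.6 is strictly less than E[X] + E[Y] = 1.1 + 0.9.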
From the definition of independence, it is easily seen that if Y is independent to X and X, Y ∈ L, where L := {X ∈ H : E[|X|] < ∞}, then E[|XY|] = E[|X|]E[|Y|]. Further, if Y is independent to X, X, Y ∈ L and X ≥ 0, Y ≥ 0, then E[XY] = E[X]E[Y]. Definition 4. A sequence of random variables {X_n, n ≥ 1} is said to be i.i.d. if X_i =d X_1 and X_{i+1} is independent to (X_1, . . . , X_i) for each i ≥ 1.
Definition 5. (i) In a sublinear expectation space (Ω, H, E), a random vector Y = (Y_1, . . . , Y_n), Y_i ∈ H, is said to be negatively dependent (ND) to another random vector X = (X_1, . . . , X_m), X_i ∈ H, if E[ϕ_1(X)ϕ_2(Y)] ≤ E[ϕ_1(X)]E[ϕ_2(Y)] whenever ϕ_1 ∈ C_l,Lip(R^m) and ϕ_2 ∈ C_l,Lip(R^n) satisfy ϕ_1(X), ϕ_2(Y) ∈ H, E[|ϕ_1(X)ϕ_2(Y)|] < ∞, and either ϕ_1 and ϕ_2 are both coordinate-wise non-decreasing or ϕ_1 and ϕ_2 are both coordinate-wise non-increasing. (ii) Let {X_n, n ≥ 1} be a sequence of random variables in the sublinear expectation space. X_1, X_2, . . . are said to be ND if X_{i+1} is ND to (X_1, . . . , X_i) for each i ≥ 1.
From the definitions of independence and ND, if Y is independent to X, then Y is ND to X. Furthermore, let {X_n, n ≥ 1} be a sequence of independent random variables and f_1(x), f_2(x), . . . ∈ C_l,Lip(R); then {f_n(X_n), n ≥ 1} is also a sequence of independent random variables. Let {X_n, n ≥ 1} be a sequence of ND random variables and let f_1(x), f_2(x), . . . ∈ C_l,Lip(R) all be non-decreasing (or all non-increasing) functions; then {f_n(X_n), n ≥ 1} is also a sequence of ND random variables.
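To make the closure property concrete: classical probability is itself a (linear, hence sublinear) expectation, so a classically negatively dependent pair furnishes a toy ND example. The sketch below uses a hypothetical antithetic pair (X, Y) uniform on {(0, 1), (1, 0)} and checks the ND product inequality for the identity pair and again after further non-decreasing transforms; all names are ours, not the paper's.

```python
# Toy ND pair under a classical (linear) expectation: (X, Y) is uniform
# on {(0, 1), (1, 0)}, so X and Y move in opposite directions.
pairs = [(0, 1), (1, 0)]          # each outcome with probability 1/2

def E(f):
    """Linear expectation over the two equally likely outcomes."""
    return sum(f(x, y) for x, y in pairs) / len(pairs)

def is_nd(f1, f2):
    """Check the ND inequality E[f1(X) f2(Y)] <= E[f1(X)] E[f2(Y)]."""
    lhs = E(lambda x, y: f1(x) * f2(y))
    rhs = E(lambda x, y: f1(x)) * E(lambda x, y: f2(y))
    return lhs <= rhs + 1e-12

ident = lambda t: t
# ND holds for the non-decreasing pair (identity, identity) ...
assert is_nd(ident, ident)
# ... and is preserved under further non-decreasing transforms f_n:
assert is_nd(lambda t: t ** 3 + t, lambda t: 2 * t - 1)
print("ND inequality holds for the toy pair and its monotone transforms")
```

Here E[XY] = 0 while E[X]E[Y] = 0.25, so the ND inequality is strict for the identity pair.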
In the sequel we suppose that E is sub-additive. Let C denote a positive constant which may differ from place to place. a_n ≲ b_n denotes that there exists a constant C > 0 such that a_n ≤ C b_n for n large enough; a_n ≈ b_n means that a_n ≲ b_n and b_n ≲ a_n; log x means ln(max{e, x}); I(A) or I_A represents the indicator function of A.
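The convention log x = ln(max{e, x}) keeps the logarithm bounded below by 1, which avoids negative or unbounded values for small arguments in moment conditions. A one-line sketch of this convention (the function name log_star is ours):

```python
import math

def log_star(x):
    """The paper's convention: log x = ln(max{e, x}), so log_star >= 1
    everywhere and agrees with the natural logarithm for x >= e."""
    return math.log(max(math.e, x))

# Below e the value is clamped to 1; above e it is the usual logarithm.
assert abs(log_star(0.5) - 1.0) < 1e-12
assert abs(log_star(100.0) - math.log(100.0)) < 1e-12
```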
We present several necessary lemmas to prove our main results.

Lemma 4 ([24]). Let Y be a random variable under sublinear expectation space (Ω, H, E). Then, for any α > 0, γ > 0, and β > −1,

Lemma 5. Let {X_n, n ≥ 1} be a sequence of ND random variables under sublinear expectation space (Ω, H, E). Then condition (1), assumed for all x > 0, implies that there exists a constant C such that (2) holds for all x > 0 and all n large enough.

Proof. Without loss of generality, we may assume the normalization below. In the same way, we could obtain the companion estimate, and it follows that the two combine. Similar to the proof of Lemma 2.5 in Xu [24], by the positive homogeneity of the sublinear expectation, Lemma 1 and the subadditivity of the expectation, we conclude a bound which, combined with (1), yields (2) immediately. Therefore the proof is finished.
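The driving computation behind series-versus-Choquet-moment lemmas such as Lemma 4 is the classical counting estimate below; this is the heuristic prototype only, not the lemma's exact statement (see [24] for that).

```latex
% For \beta > -1, \sum_{n \le t} n^{\beta} \asymp t^{\beta+1} as t \to \infty.
% Hence, for a fixed value y and any \alpha, \gamma > 0,
\sum_{n=1}^{\infty} n^{\beta}\, I\{|y| > \gamma n^{\alpha}\}
   = \sum_{1 \le n < (|y|/\gamma)^{1/\alpha}} n^{\beta}
   \asymp \Bigl(\frac{|y|}{\gamma}\Bigr)^{(\beta+1)/\alpha}.
```

Integrating this pointwise estimate against the capacity (a Choquet–Fubini step, which uses the sub-additivity of E) is what converts a series condition on the tails V(|Y| > γ n^α) into a Choquet moment condition on |Y|^{(β+1)/α}.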
Lemma 6. Assume that Y is a random variable under sublinear expectation space (Ω, H, E). Then, for p > 0, q > 0, r > 0, the following are equivalent:

Lemma 7. Assume that Y is a random variable under sublinear expectation space (Ω, H, E). Then, for p > 0, q > 0, r > 0, the following are equivalent:

Main Results
Our main results are as follows.
Let {a_ni, 1 ≤ i ≤ n, n ≥ 1} be a triangular array of real numbers. Then the following are equivalent: Furthermore, let {a_ni ≈ (i/n)^β (1/n)^q, 1 ≤ i ≤ n, n ≥ 1} be a triangular array of real numbers. Then the following are equivalent: C_V(|X|^{r/q}) < ∞ for p < r/q, and C_V(|X|^{r/q} log |X|) < ∞ for p = r/q.
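A heuristic for why a logarithmic factor enters exactly at the boundary p = r/q: the tail integral arising after substituting the weights behaves like the elementary integral below, which stays bounded for p < r/q but grows like a logarithm when p = r/q. Here T stands for a power of |X|; the precise reduction is carried out in the proofs.

```latex
\int_{1}^{T} x^{\,p - r/q - 1}\,dx \;=\;
\begin{cases}
\dfrac{1 - T^{\,p - r/q}}{r/q - p} \;\le\; \dfrac{1}{r/q - p}, & p < r/q,\\[10pt]
\log T, & p = r/q.
\end{cases}
```

With T a fixed power of |X|, the bounded case leads to the moment condition C_V(|X|^{r/q}) < ∞, while the log T case strengthens it to C_V(|X|^{r/q} log |X|) < ∞.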

Proof of Theorem 1
We first prove (3) ⇒ (4). Choose δ > 0 small enough and a sufficiently large integer K. For all 1 ≤ i ≤ n, n ≥ 1, we write the truncation X_ni of a_ni X_i. Thus, in order to establish (4), it suffices to prove the corresponding bounds. In order to estimate I_1, we verify that the centering terms vanish as n → ∞.
By Lemmas 2 and 3, we obtain E|X|^{1/q} < ∞, E|X|^{r/q} < ∞. When q > 1, notice that |X When 1/2 < q ≤ 1, note that E(X) = −E(−X) = 0; by choosing τ small enough such that −τ(1 − r/q) + 1 − r < 0, we obtain Hence, to prove I_1 < ∞, it suffices to prove that From the property of ND random variables under sublinear expectation space, {X_ni} is also a sequence of ND random variables under sublinear expectation space. By Markov's inequality and the C_r inequality under sublinear expectation, together with Lemma 3, it can be shown that, for a suitably large M,
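The Markov step can be checked numerically in the classical (linear-expectation) special case, where V is an ordinary probability. The sample below is hypothetical toy data, and M plays the role of the "suitably large M" in the proof.

```python
# Markov's inequality V(|S| >= x) <= E|S|^M / x^M, checked in the classical
# special case on a hypothetical finite sample of equally likely outcomes.
sample = [-3.0, -1.0, 0.5, 2.0, 4.0, 4.5]

def E_abs_pow(M):
    """E|S|^M for the empirical (uniform) distribution of the sample."""
    return sum(abs(s) ** M for s in sample) / len(sample)

def tail(x):
    """V(|S| >= x): empirical tail probability."""
    return sum(1 for s in sample if abs(s) >= x) / len(sample)

for M in (1, 2, 4):               # larger M gives a sharper bound for large x
    for x in (0.5, 1.0, 2.0, 3.0):
        assert tail(x) <= E_abs_pow(M) / x ** M + 1e-12
print("Markov bound holds for all tested (M, x)")
```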
By the definition of X_ni, Since {a_ni ≈ (i/n)^β (1/n)^q}, by Lemma 4, we see that Then, by (3), we conclude I_4 < ∞. Now we prove (4) ⇒ (3). Since By Lemma 5, it follows that, for all ε > 0, Now, combining (12) with (4) gives By the argument used in the proof of I_4 < ∞, we see that (13) is equivalent to (3). The proof of Theorem 1 is finished.

Proof of Theorem 2
We first prove (5) ⇒ (6). Notice that the integral of V(|∑_{i=1}^n a_ni X_i| ≥ x^{1/p}) dx over (0, ∞), split at x = 1, gives two terms := I + II. From Theorem 1, we see that I < ∞. We next establish II < ∞. Choose 0 < α < 1/p and δ > 0 sufficiently small, and a large enough integer K. For every 1 ≤ i ≤ n, n ≥ 1, we note that n sufficiently large guarantees x^α n^{−τ} < x^{1/p}/(4K). Without loss of generality, we write Thus, in order to establish (6), we only need to prove that In order to estimate J_1, we verify that Lemmas 1 and 2 together with (5) imply that E|X|^{1/q} < ∞, E|X|^{r/q} < ∞.
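The split into I + II is the standard Choquet-integral decomposition of the p-th moment at x = 1; a sketch of the identity being used, assuming the Choquet representation of C_V from the preliminaries (for a nonnegative integrand only the first Choquet integral contributes):

```latex
C_{\mathbb{V}}\Bigl(\Bigl|\sum_{i=1}^{n} a_{ni}X_i\Bigr|^{p}\Bigr)
  = \int_{0}^{\infty} \mathbb{V}\Bigl(\Bigl|\sum_{i=1}^{n} a_{ni}X_i\Bigr| \ge x^{1/p}\Bigr)\,dx
  = \int_{0}^{1}(\cdots)\,dx \;+\; \int_{1}^{\infty}(\cdots)\,dx \;=:\; I + II.
```

The first term is controlled by the complete-convergence result, and the second by the truncation argument.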
Now we prove (6) ⇒ (5). By Lemma 5, it follows that, for all ε > 0, Now, combining (16) with (6) gives By the argument used in the proof of I_4 < ∞, we see that (17) is equivalent to (5). The proof of Theorem 2 is finished.

Proof of Theorem 3
From the assumptions of Theorem 3, for β = −q/r, one can obtain and By the same argument as in the proof of Theorem 2, with Lemma 7 in place of Lemma 6, together with (18) and (19), we can prove Theorem 3. Therefore, the proof is omitted.
Proof of Theorem 4
By the same argument as in the proof of Theorem 2, with (21) in place of (15), we can prove Theorem 4. Therefore, the proof is omitted.

Conclusions
In this paper, using a moment inequality for sequences of ND random variables under sublinear expectation space and the truncation method, we establish equivalent conditions of complete convergence and of complete p-th moment convergence for weighted sums of sequences of ND random variables. The results extend the corresponding results from the classical probability space to the sublinear expectation space, as well as from i.i.d. random variables to ND random variables. In the future, we will try to establish the corresponding results for other dependent sequences under sublinear expectation space.