Abstract
We investigate complete convergence for weighted sums of sequences of negatively dependent (ND) random variables and p-th moment convergence for weighted sums of sequences of ND random variables under a sublinear expectation space. Using moment inequalities and truncation methods, we prove equivalent conditions of complete convergence and of p-th moment convergence for weighted sums of sequences of ND random variables under the sublinear expectation space.
MSC:
60F10; 60F05
1. Introduction
The theories of nonadditive probabilities and nonadditive expectations are useful tools for studying measures of risk, uncertainty in statistics, non-linear stochastic calculus, and superhedging in finance; cf. Peng [1,2], Denis and Martini [3], Gilboa [4], and Marinacci [5]. This paper considers the general sublinear expectations introduced by Peng [6,7,8] on a general space by relaxing the linearity of the classical expectation to sub-additivity and positive homogeneity (cf. Definition 1 below). The notion of sublinear expectation provides a very flexible framework for modelling problems that are not additive. Inspired by the work of Peng, researchers have studied many limit theorems under sublinear expectation spaces in order to extend the corresponding results in probability and statistics. Zhang [9,10,11] studied exponential inequalities, Rosenthal’s inequalities, Hölder’s inequalities, and Donsker’s invariance principle under sublinear expectation spaces. Chen [12,13,14] studied strong laws of large numbers, weak laws of large numbers, and large deviations for ND random variables under sublinear expectations, respectively. Wu [15] obtained precise asymptotics for complete integral convergence under sublinear expectations. For more research on limit theorems under sublinear expectation spaces, the reader may refer to Li and Shi [16], Liu and Zhang [17], Ding [18], Wu and Wang [19], Guo and Zhang [20,21], and Dong et al. [22].
Recently, Guo and Shan [23] studied equivalent conditions of complete q-th moment convergence for sums of sequences of negatively orthant dependent (NOD) random variables in the classical probability setting. Xu and Cheng [24,25] obtained equivalent conditions of complete convergence for sums of sequences of independent and identically distributed (i.i.d.) random variables and of p-th moment convergence for sums of sequences of i.i.d. random variables under sublinear expectation spaces. ND sequences have wide applications in percolation theory, multivariate statistics, etc., so it is natural to generalize properties of independent sequences to ND sequences, and it is meaningful to extend the results of Xu and Cheng [24,25] to ND random variables under sublinear expectation spaces. In this paper, we prove equivalent conditions of complete convergence for weighted sums of sequences of ND random variables and of p-th moment convergence for weighted sums of sequences of ND random variables under a sublinear expectation space.
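For orientation, recall the classical notion of complete convergence: a sequence of random variables $\{U_n; n \ge 1\}$ converges completely to a constant $\theta$ if $\sum_{n=1}^{\infty} \mathbb{P}(|U_n - \theta| > \varepsilon) < \infty$ for every $\varepsilon > 0$. In the sublinear setting, the probability $\mathbb{P}$ is replaced by a capacity $\mathbb{V}$ generated by the sublinear expectation (see Section 2), so the complete convergence statements for weighted sums studied below take the schematic form
$$\sum_{n=1}^{\infty} c_n\, \mathbb{V}\!\left(\Big|\sum_{i=1}^{n} a_{ni} X_i\Big| > \varepsilon b_n\right) < \infty \quad \text{for all } \varepsilon > 0,$$
where $\{a_{ni}\}$ denotes a triangular array of weights and $b_n$, $c_n$ are normalizing constants; this notation is only schematic, and the precise normalizations are those specified in Theorems 1–4 below.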
2. Preliminaries
We use the framework of Peng [8]. Suppose that $(\Omega, \mathcal{F})$ is a given measurable space and that $\mathcal{H}$ is a linear space of real functions defined on $(\Omega, \mathcal{F})$ such that $I_A \in \mathcal{H}$ for each $A \in \mathcal{F}$, where $I_A$ denotes the indicator function of $A$, and such that if $X_1, X_2, \ldots, X_n \in \mathcal{H}$, then $\varphi(X_1, X_2, \ldots, X_n) \in \mathcal{H}$ for each $\varphi \in C_{l,Lip}(\mathbb{R}^n)$, where $C_{l,Lip}(\mathbb{R}^n)$ is the linear space of local Lipschitz continuous functions $\varphi$ satisfying
$$|\varphi(x) - \varphi(y)| \le C\big(1 + |x|^m + |y|^m\big)|x - y|, \quad x, y \in \mathbb{R}^n,$$
for some $C > 0$ and $m \in \mathbb{N}$ depending on $\varphi$. We also denote by $C_{b,Lip}(\mathbb{R}^n)$ the linear space of bounded Lipschitz continuous functions $\varphi$ satisfying, for some $C > 0$ depending on $\varphi$,
$$|\varphi(x) - \varphi(y)| \le C|x - y|, \quad x, y \in \mathbb{R}^n.$$
Definition 1.
A sublinear expectation $\hat{\mathbb{E}}$ on $\mathcal{H}$ is a function $\hat{\mathbb{E}}: \mathcal{H} \to \bar{\mathbb{R}}$ satisfying the following properties: for all $X, Y \in \mathcal{H}$, we have
- (1) Monotonicity: if $X \ge Y$, then $\hat{\mathbb{E}}[X] \ge \hat{\mathbb{E}}[Y]$;
- (2) Constant preserving: $\hat{\mathbb{E}}[c] = c$ for all $c \in \mathbb{R}$;
- (3) Sub-additivity: $\hat{\mathbb{E}}[X + Y] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]$ whenever $\hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]$ is not of the form $+\infty - \infty$ or $-\infty + \infty$;
- (4) Positive homogeneity: $\hat{\mathbb{E}}[\lambda X] = \lambda \hat{\mathbb{E}}[X]$ for all $\lambda \ge 0$.
Here, $\bar{\mathbb{R}} = [-\infty, +\infty]$. The triple $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$ is called a sublinear expectation space. Given a sublinear expectation $\hat{\mathbb{E}}$, let us denote the conjugate expectation $\hat{\varepsilon}$ of $\hat{\mathbb{E}}$ by
$$\hat{\varepsilon}[X] := -\hat{\mathbb{E}}[-X], \quad \forall X \in \mathcal{H}.$$
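As a simple consequence of constant preserving and sub-additivity, the conjugate expectation never exceeds the sublinear expectation: whenever the right-hand side below is well defined, $0 = \hat{\mathbb{E}}[X + (-X)] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[-X]$, and hence
$$\hat{\varepsilon}[X] = -\hat{\mathbb{E}}[-X] \le \hat{\mathbb{E}}[X], \quad X \in \mathcal{H}.$$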
A set function $V: \mathcal{F} \to [0, 1]$ is called a capacity if
- (1) $V(\emptyset) = 0$ and $V(\Omega) = 1$;
- (2) $V(A) \le V(B)$ whenever $A \subset B$, $A, B \in \mathcal{F}$.
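Lemma 2 below also requires countable sub-additivity. In the usual sense, a capacity $V$ is called countably sub-additive if
$$V\Big(\bigcup_{n=1}^{\infty} A_n\Big) \le \sum_{n=1}^{\infty} V(A_n) \quad \text{for all } A_n \in \mathcal{F}.$$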
In this paper, given a sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, we set the capacities $\mathbb{V}(A) := \hat{\mathbb{E}}[I_A]$ and $\mathcal{V}(A) := \hat{\varepsilon}[I_A]$ for $A \in \mathcal{F}$. We define the Choquet expectations $C_{\mathbb{V}}$ and $C_{\mathcal{V}}$ in the standard way, as recalled below.
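In the form standard in this literature (cf. Zhang [9]), the Choquet expectation associated with a capacity $V \in \{\mathbb{V}, \mathcal{V}\}$ is given by
$$C_V(X) := \int_0^{\infty} V(X \ge t)\,dt + \int_{-\infty}^{0} \big(V(X \ge t) - 1\big)\,dt.$$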
Definition 2.
Let $X_1$ be an $n$-dimensional random vector defined in a sublinear expectation space $(\Omega_1, \mathcal{H}_1, \hat{\mathbb{E}}_1)$ and $X_2$ be an $n$-dimensional random vector defined in a sublinear expectation space $(\Omega_2, \mathcal{H}_2, \hat{\mathbb{E}}_2)$. They are called identically distributed, denoted by $X_1 \overset{d}{=} X_2$, if
$$\hat{\mathbb{E}}_1[\varphi(X_1)] = \hat{\mathbb{E}}_2[\varphi(X_2)], \quad \forall \varphi \in C_{l,Lip}(\mathbb{R}^n),$$
whenever the sub-expectations are finite.
Definition 3.
In a sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, a random vector $Y = (Y_1, \ldots, Y_n)$, $Y_i \in \mathcal{H}$, is said to be independent to another random vector $X = (X_1, \ldots, X_m)$, $X_i \in \mathcal{H}$, under $\hat{\mathbb{E}}$ if
$$\hat{\mathbb{E}}\big[\varphi(X, Y)\big] = \hat{\mathbb{E}}\Big[\hat{\mathbb{E}}\big[\varphi(x, Y)\big]\big|_{x = X}\Big], \quad \forall \varphi \in C_{l,Lip}(\mathbb{R}^m \times \mathbb{R}^n),$$
whenever the sub-expectations involved are finite.
Random variables $X_1, X_2, \ldots$ are said to be independent if $X_{i+1}$ is independent to $(X_1, \ldots, X_i)$ for each $i \ge 1$.
From the definition of independence, it is easily seen that if $Y$ is independent to $X$, $X \ge 0$, and $\hat{\mathbb{E}}[Y] \ge 0$, then $\hat{\mathbb{E}}[XY] = \hat{\mathbb{E}}[X]\,\hat{\mathbb{E}}[Y]$. Further factorization identities of this type hold under additional sign and integrability conditions (cf. Zhang [9,10]).
Definition 4.
A sequence of random variables $\{X_n; n \ge 1\}$ is said to be i.i.d. if $X_i \overset{d}{=} X_1$ and $X_{i+1}$ is independent to $(X_1, \ldots, X_i)$ for each $i \ge 1$.
Definition 5.
(i) In a sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, a random vector $Y = (Y_1, \ldots, Y_n)$ is said to be ND to another random vector $X = (X_1, \ldots, X_m)$ under $\hat{\mathbb{E}}$ if for each pair of test functions $\varphi_1 \in C_{l,Lip}(\mathbb{R}^m)$ and $\varphi_2 \in C_{l,Lip}(\mathbb{R}^n)$, we have $\hat{\mathbb{E}}[\varphi_1(X)\varphi_2(Y)] \le \hat{\mathbb{E}}[\varphi_1(X)]\,\hat{\mathbb{E}}[\varphi_2(Y)]$ whenever $\varphi_1(X) \ge 0$, $\hat{\mathbb{E}}[\varphi_2(Y)] \ge 0$, the sub-expectations involved are finite, and either $\varphi_1$ and $\varphi_2$ are both coordinate-wise non-decreasing or $\varphi_1$ and $\varphi_2$ are both coordinate-wise non-increasing.
(ii) Let $\{X_n; n \ge 1\}$ be a sequence of random variables in the sublinear expectation space. $\{X_n; n \ge 1\}$ is said to be ND if $X_{i+1}$ is ND to $(X_1, \ldots, X_i)$ for each $i \ge 1$.
From the definitions of independence and ND, it follows that if $Y$ is independent to $X$, then $Y$ is ND to $X$. Furthermore, if $\{X_n; n \ge 1\}$ is a sequence of independent random variables and $f_1, f_2, \ldots \in C_{l,Lip}(\mathbb{R})$, then $\{f_n(X_n); n \ge 1\}$ is also a sequence of independent random variables; if $\{X_n; n \ge 1\}$ is a sequence of ND random variables and $f_1, f_2, \ldots \in C_{l,Lip}(\mathbb{R})$ are all non-decreasing (or all non-increasing) functions, then $\{f_n(X_n); n \ge 1\}$ is also a sequence of ND random variables.
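As an illustration of the last property, which is the mechanism behind the truncation steps in Section 4 (whatever the specific truncation levels chosen there): for any constant $b > 0$, the truncation map $g_b(x) := (-b) \vee (x \wedge b)$ is non-decreasing and Lipschitz, hence belongs to $C_{l,Lip}(\mathbb{R})$, so if $\{X_n; n \ge 1\}$ is a sequence of ND random variables, then the truncated sequence
$$\big\{(-b) \vee (X_n \wedge b);\ n \ge 1\big\}$$
is again ND.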
In the sequel, we suppose that $\hat{\mathbb{E}}$ is countably sub-additive. Let $C$ denote a positive constant which may differ from place to place. For two positive sequences, $a_n \lesssim b_n$ denotes that there exists a constant $C > 0$ such that $a_n \le C b_n$ for $n$ large enough, and $a_n \approx b_n$ means that both $a_n \lesssim b_n$ and $b_n \lesssim a_n$ hold. $I(A)$ or $I_A$ represents the indicator function of $A$.
We present several necessary lemmas to prove our main results.
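In addition to the lemmas below, the proofs repeatedly invoke Markov's inequality and the $C_r$ inequality under sublinear expectations; in the notation above, these elementary facts follow directly from monotonicity, sub-additivity, and positive homogeneity:
$$\mathbb{V}(|X| \ge x) \le \frac{\hat{\mathbb{E}}[|X|^{q}]}{x^{q}} \quad \text{for all } x > 0,\ q > 0,$$
$$\hat{\mathbb{E}}[|X + Y|^{r}] \le c_r\big(\hat{\mathbb{E}}[|X|^{r}] + \hat{\mathbb{E}}[|Y|^{r}]\big), \quad c_r = \max\{1, 2^{r-1}\},\ r > 0.$$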
Lemma 1
([9]). Let $p, q > 1$ be two real numbers satisfying $\frac{1}{p} + \frac{1}{q} = 1$. Then, for two random variables $X, Y$ in $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, we have $\hat{\mathbb{E}}[|XY|] \le \big(\hat{\mathbb{E}}[|X|^{p}]\big)^{1/p}\big(\hat{\mathbb{E}}[|Y|^{q}]\big)^{1/q}$.
Lemma 2
([9]). If $\hat{\mathbb{E}}$ is countably sub-additive and $C_{\mathbb{V}}(|X|) < \infty$, then
$$\hat{\mathbb{E}}[|X|] \le C_{\mathbb{V}}(|X|).$$
Lemma 3
([9]). Suppose that $X_{k+1}$ is ND to $(X_1, \ldots, X_k)$ for each $k$, or $X_k$ is ND to $(X_{k+1}, \ldots, X_n)$ for each $k$. Then, for ,
where $C_p$ is a positive constant depending only on $p$.
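Lemma 3 is a Rosenthal-type moment inequality. For orientation only, an indicative form of such inequalities for ND random variables under sublinear expectations (cf. Zhang [10]) is
$$\hat{\mathbb{E}}\!\left[\Big|\sum_{i=1}^{n} X_i\Big|^{p}\right] \le C_p\!\left\{\sum_{i=1}^{n} \hat{\mathbb{E}}\big[|X_i|^{p}\big] + \Big(\sum_{i=1}^{n} \hat{\mathbb{E}}\big[X_i^{2}\big]\Big)^{p/2} + \Big(\sum_{i=1}^{n} \big(|\hat{\mathbb{E}}[X_i]| + |\hat{\varepsilon}[X_i]|\big)\Big)^{p}\right\}, \quad p \ge 2;$$
the precise statement used in this paper, including the exact centering terms and constants, is the one cited in Lemma 3.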
Lemma 4
([24]). Let Y be a random variable under sublinear expectation space . Then, for any , and
Lemma 5.
Let be a sequence of ND random variables under sublinear expectation space . Then, the condition that for all ,
implies that there exist constants C such that for all , and n large enough,
Proof.
Write . Without loss of generality, we may assume that . Since and are sequences of ND random variables under the sublinear expectation space, denote , ; combining inequality and Lemma 3 results in
In the same way, we could obtain
It follows that
Lemma 6
([25]). Assume that Y is a random variable under sublinear expectation space .
Then, for , the following statements are equivalent:
(i)
(ii)
Lemma 7
([25]). Assume that Y is a random variable under sublinear expectation space .
Then, for , the following statements are equivalent:
(i)
(ii)
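Lemmas 6 and 7 are equivalences for a single random variable $Y$; their precise statements are those of Xu and Cheng [25]. For orientation, a typical elementary equivalence of this kind, which follows directly from the definition of the Choquet expectation and the monotonicity of $\mathbb{V}$, is: for any $r > 0$,
$$C_{\mathbb{V}}\big(|Y|^{r}\big) < \infty \iff \sum_{n=1}^{\infty} \mathbb{V}\big(|Y| \ge n^{1/r}\big) < \infty,$$
since $C_{\mathbb{V}}(|Y|^{r}) = \int_0^{\infty} \mathbb{V}(|Y|^{r} \ge t)\,dt$ and the integrand is non-increasing and bounded by 1.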
3. Main Results
Our main results are as follows.
Theorem 1.
Assume that $\{X_n; n \ge 1\}$ is a sequence of ND random variables under the sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, identically distributed as $X$. Suppose that , , ; moreover, for ,
Furthermore, let be a triangular array of real numbers. Then, the following statements are equivalent:
(i)
(ii)
Theorem 2.
Assume that $\{X_n; n \ge 1\}$ is a sequence of ND random variables under the sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, identically distributed as $X$. Suppose that , , ; moreover, for ,
Furthermore, let be a triangular array of real numbers. Then, the following statements are equivalent:
(i)
(ii)
Theorem 3.
Assume that $\{X_n; n \ge 1\}$ is a sequence of ND random variables under the sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, identically distributed as $X$. Suppose that , ; moreover, for ,
Furthermore, let be a triangular array of real numbers. Then, (6) is equivalent to
Theorem 4.
Assume that $\{X_n; n \ge 1\}$ is a sequence of ND random variables under the sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, identically distributed as $X$. Suppose that , ; moreover, for ,
Furthermore, let be a triangular array of real numbers. Then, (6) is equivalent to
4. Proof of the Main Results
4.1. Proof of Theorem 1
We first prove (3) ⇒ (4). Choose , small enough, and a sufficiently large integer K. For all , we write
Obviously, . Notice that
Thus, in order to establish (4), it suffices to prove that
In order to estimate , we verify that
By Lemma 2 and (3), we obtain , . When , noticing that and , it follows that
When , note that , by choosing small enough such that , we obtain
Hence, to prove , it suffices to prove that
From the properties of ND random variables under the sublinear expectation space, it follows that is also a sequence of ND random variables under the sublinear expectation space. By Markov's inequality and the $C_r$ inequality under sublinear expectations, together with Lemma 3, it can be shown that, for a suitably large M,
Taking M sufficiently large such that , we have
When , we could choose a sufficiently large M such that , , then
From , and , choosing a sufficiently large M such that , , we obtain
By the definition , we have . It follows that
Hence, by Markov’s inequality under sublinear expectation, it follows that
Noticing that , we can choose , small enough, and a sufficiently large integer K such that and . Hence, by Lemma 2, we obtain . Similarly, we can obtain .
By the definition of , we have
Since , by Lemma 4, we see that
By Lemma 5, it follows that, for all
4.2. Proof of Theorem 2
From Theorem 1, we see that . We next establish . Choose , , sufficiently small, and a large enough integer K. For every , we note that n is sufficiently large to guarantee . Without loss of generality, we write
It is obvious that . Notice that
Thus, in order to establish (6), we only need to prove that
In order to estimate , we verify that
Lemmas 1 and 2, and (5) imply that
When , since and , by Lemma 2, it follows that
Since , , we know that . Then, by (14), for , we obtain
When , noticing that , taking a sufficiently small such that , we obtain
Observing that , we have
Then, to prove , we only need to show
It is obvious that is a sequence of negatively dependent random variables under the sublinear expectation space. It follows from Markov's inequality and the $C_r$ inequality under sublinear expectations, together with Lemma 3, that, for a sufficiently large M,
Taking a suitably large M such that , , we have
Consequently, we obtain . Similar to the proof of (9), we could obtain
From , and , we obtain
By Markov's inequality under sublinear expectations, we conclude that
Since , we could take a sufficiently small and sufficiently large K such that and . It follows that . Similarly, we can obtain . It is obvious that implies . Then,
It follows that
Hence, from Lemma 6 and (5), we obtain . Now we prove (6) ⇒ (5). By Markov’s inequality under sublinear expectations, (6), and Lemma 2, we have
Similarly to the proof of (3.17) in Guo and Shan [23], we have
By Lemma 5, it follows that, for all
4.3. Proof of Theorem 3
From the assumptions of Theorem 3, for , one can obtain
and
4.4. Proof of Theorem 4
From the assumptions of Theorem 4, for , one can obtain
and
5. Conclusions
In this paper, using moment inequalities for sequences of ND random variables under the sublinear expectation space and the truncation method, we establish equivalent conditions of complete convergence for weighted sums of sequences of ND random variables and of p-th moment convergence for weighted sums of sequences of ND random variables. The results extend the corresponding results from the classical probability space to the sublinear expectation space, as well as from i.i.d. random variables to ND random variables. In the future, we will try to establish corresponding results for other dependent sequences under the sublinear expectation space.
Author Contributions
P.S., D.W. and X.T. contributed equally to the development of this paper. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Department of Science and Technology of Jilin Province (Grant No. YDZJ202101ZYTS156) and the Natural Science Foundation of Jilin Province (Grant No. YDZJ202301ZYTS373).
Data Availability Statement
No new data were created or analyzed in this study.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Peng, S.G. Backward SDE and related G-Expectation. Pitman Res. Notes Math. Ser. 1997, 364, 141–159.
- Peng, S.G. Monotonic limit theorem of BSDE and nonlinear decomposition theorem of Doob-Meyer type. Probab. Theory Relat. Fields 1999, 113, 473–499.
- Denis, L.; Martini, C. A theoretical framework for the pricing of contingent claims in the presence of model uncertainty. Ann. Appl. Probab. 2006, 16, 827–852.
- Gilboa, I. Expected utility with purely subjective non-additive probabilities. J. Math. Econ. 1987, 16, 65–88.
- Marinacci, M. Limit laws for non-additive probabilities and their frequentist interpretation. J. Econ. Theory 1999, 84, 145–195.
- Peng, S.G. G-expectation, G-Brownian motion and related stochastic calculus of Itô’s type. Stoch. Anal. Appl. 2006, 2, 541–567.
- Peng, S.G. Multi-dimensional G-Brownian motion and related stochastic calculus under G-expectation. Stoch. Proc. Appl. 2008, 118, 2223–2253.
- Peng, S.G. Nonlinear Expectations and Stochastic Calculus under Uncertainty, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2010.
- Zhang, L.X. Exponential inequalities under the sub-linear expectations with applications to laws of the iterated logarithm. Sci. China Math. 2016, 59, 2503–2526.
- Zhang, L.X. Rosenthal’s inequalities for independent and negatively dependent random variables under sub-linear expectations with applications. Sci. China Math. 2016, 59, 751–768.
- Zhang, L.X. Donsker’s invariance principle under the sub-linear expectation with an application to Chung’s law of the iterated logarithm. Commun. Math. Stat. 2015, 3, 187–214.
- Zhang, M.; Chen, Z.J. Strong laws of large numbers for sub-linear expectations. Sci. China Math. 2016, 59, 945–954.
- Chen, Z.J.; Liu, Q.; Zong, G. Weak laws of large numbers for sublinear expectation. Math. Control Relat. Fields 2018, 8, 637–651.
- Chen, Z.J.; Feng, X.F. Large deviation for negatively dependent random variables under sublinear expectation. Comm. Stat. Theory Methods 2016, 45, 400–412.
- Wu, Q.Y. Precise asymptotics for complete integral convergence under sublinear expectations. Math. Probl. Eng. 2020, 13, 3145935.
- Li, M.; Shi, Y.F. A general central limit theorem under sublinear expectations. Sci. China Math. 2010, 53, 1989–1994.
- Liu, W.; Zhang, Y. Large deviation principle for linear processes generated by real stationary sequences under the sub-linear expectation. Comm. Stat. Theory Methods 2023, 52, 5727–5741.
- Ding, X. A general form for precise asymptotics for complete convergence under sublinear expectation. AIMS Math. 2022, 7, 1664–1677.
- Wu, Y.; Wang, X.J. General results on precise asymptotics under sub-linear expectations. J. Math. Anal. Appl. 2022, 511, 126090.
- Guo, S.; Zhang, Y. Central limit theorem for linear processes generated by m-dependent random variables under the sub-linear expectation. Comm. Stat. Theory Methods 2023, 52, 6407–6419.
- Guo, S.; Zhang, Y. Moderate deviation principle for m-dependent random variables under the sublinear expectation. AIMS Math. 2022, 7, 5943–5956.
- Dong, H.; Tan, X.L.; Yong, Z. Complete convergence and complete integration convergence for weighted sums of arrays of rowwise m-END under sub-linear expectations space. AIMS Math. 2023, 8, 6705–6724.
- Guo, M.L.; Shan, S. Equivalent conditions of complete qth moment convergence for weighted sums of sequences of negatively orthant dependent random variables. Chin. J. Appl. Probab. Stat. 2020, 36, 381–392.
- Xu, M.Z.; Cheng, K. Convergence of sums of i.i.d. random variables under sublinear expectations. J. Inequal. Appl. 2021, 2021, 157.
- Xu, M.Z.; Cheng, K. Equivalent conditions of complete p-th moment convergence for weighted sums of i.i.d. random variables under sublinear expectations. arXiv 2021, arXiv:2109.08464.