Abstract
We consider a class of strongly mixing sequences with an infinite second moment. This class contains important GARCH processes used in econometrics. We establish relative stability for such processes and construct a counterexample showing that it may fail in general. Applying these results, we obtain a new CLT that does not require exponential decay of the mixing coefficients, together with a counterexample to this result as well.
MSC:
60F05; 60G10; 60G42
1. Introduction
Let $X=(X_k)_{k\ge 1}$ be a strictly stationary sequence defined on a probability space $(\Omega,\mathcal{F},P)$. In the case of independent random variables, A. Ya. Khinchine in 1926 stated the problem of the relation between the Central Limit Theorem (CLT) for centered summands and the Law of Large Numbers (LLN) (see p. 421 in [1]). Let $H(x)=\mathbf{E}\,X_1^2\,I(|X_1|\le x)$, $x>0$, where $I(A)$ is the indicator of the set $A$, and let $S_n=X_1+\dots+X_n$, $n\ge 1$.
It turns out that if $X$ is centered, independent and identically distributed (i.i.d.), then the CLT $S_n/B_n\Rightarrow N(0,1)$, with some normalizing constants $B_n>0$, holds if and only if (see §28 in [2])
\[
\frac{1}{B_n^{2}}\sum_{k=1}^{n}X_k^{2}\ \xrightarrow{P}\ 1,
\]
where $\xrightarrow{P}$ denotes convergence in probability. The convergence in probability of normalized sums to 1 was named relative stability in 1936 by A. Ya. Khinchine. One of the equivalent conditions for relative stability is the slow variation (in the sense of Karamata) of the truncated second moment $H$ (see Th. 8.8.1 in [3]). The latter remains true for uniformly strong mixing sequences (see p. 436 in Vol. I, [4]). Here, we concentrate on strongly mixing sequences such that $\mathbf{E}X_1^2=\infty$. Recall the following measure of dependence:
\[
\alpha(n)=\sup_{j\ge 1}\,\sup\bigl\{\,|P(A\cap B)-P(A)P(B)|:\ A\in\sigma(X_1,\dots,X_j),\ B\in\sigma(X_k,\ k\ge j+n)\,\bigr\}
\]
(in notation, definitions, and other conventions we generally follow [4]). We say that $X$ is strongly (or $\alpha$–) mixing if $\alpha(n)\to 0$ as $n\to\infty$. One of the aims of this paper is to give criteria for relative stability applicable to the CLT. The second aim is to give an example of a strongly mixing sequence that fails to be relatively stable with the normalization defined in (1). Thus, the situation is different from that of uniformly strong mixing sequences.
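Recall also that a positive measurable function $L$, defined on a neighbourhood of infinity, is slowly varying in the sense of Karamata if
\[
\lim_{x\to\infty}\frac{L(\lambda x)}{L(x)}=1\qquad\text{for every }\lambda>0;
\]
typical examples are $\log x$ and $\log\log x$ (see Ch. 1 in [3]).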
The paper is organized as follows. In Section 2 the main results are presented: new relative stability results for α–mixing sequences with an infinite second moment, stated in terms of the quantile function (see Ch. 10 in [4]), together with a counterexample to this result. To the authors’ knowledge, there are no such counterexamples in the existing literature. In Section 3, we apply the results of the previous section and obtain a new CLT for martingale differences with infinite variance, as well as a CLT with heavy-tailed marginals, without assuming exponential decay of α(n). Such results are important in modelling volatility phenomena in econometric time series (see [5]).
2. Main Results
Set
The following statement gives a sufficient condition for relative stability in the case of α–mixing.
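Recall (see Ch. 10 in Vol. I, [4]; cf. also [6]) that the quantile function of $|X_1|$, through which conditions of this type are usually expressed, may be taken as
\[
Q(u)=\inf\{t\ge 0:\ P(|X_1|>t)\le u\},\qquad u\in(0,1),
\]
that is, the generalized inverse of the tail function of $|X_1|$.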
Theorem 1.
Suppose is slowly varying such that . If
then is relatively stable.
Note that, in the case of a finite second moment, the “borderline” conditions for the CLT are formulated in terms of the quantile function (see Ch. 10 in [4]).
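As a model example, suppose that the marginal has a Pareto-type tail, $P(|X_1|>x)\sim c\,x^{-2}$ as $x\to\infty$ for some $c>0$; such marginals arise, for instance, for stochastic recurrence equations of GARCH type (see [11]). Then $\mathbf{E}X_1^{2}=\infty$, while the truncated second moment satisfies
\[
\mathbf{E}\,X_1^{2}\,I(|X_1|\le x)=\int_0^{x}2t\,P(|X_1|>t)\,dt-x^{2}P(|X_1|>x)\sim 2c\log x,\qquad x\to\infty,
\]
which is slowly varying.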
Proof of Theorem 1.
Therefore is uniformly integrable, so by using the Markov–Bernstein blocking technique (see Sec. 1.17 in Vol. I, [4]), Theorem 2 on p. 140 in [2], and the truncation argument, we obtain Theorem 1. □
Next, consider a strictly stationary Markov chain with a countable number of states and a positive functional f defined on it. Suppose the chain is aperiodic and irreducible, and that the initial distribution is the stationary one. By Theorem 7.7 on p. 212 in Vol. I, [4], the chain is α–mixing. We follow the approach of [7]. Set
Let
and put It is well-known that is an i.i.d. sequence, where denotes almost sure convergence. We have
Finally, define
We have the following dissection formula
where is an i.i.d. random sequence and is bounded in probability. Thus, by the Khinchine relative stability theorem (cf. Theorem 8.8.1 on p. 373 in [3]), we get Theorem 2.
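In concrete terms (in notation used only in this paragraph), write $(Y_n)$ for the chain, fix a reference state $s_0$, let $\tau_0<\tau_1<\cdots$ denote the successive times at which the chain visits $s_0$, and let $\nu_N$ be the number of complete excursions up to time $N$. Then the dissection takes the form
\[
\sum_{n=1}^{N}f(Y_n)=\sum_{k=1}^{\nu_N}\xi_k+R_N,\qquad \xi_k=\sum_{n=\tau_{k-1}+1}^{\tau_k}f(Y_n),
\]
where the excursion sums $\xi_1,\xi_2,\dots$ are i.i.d. by the strong Markov property and $R_N$ collects the two boundary pieces, before $\tau_0$ and after $\tau_{\nu_N}$.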
Theorem 2.
Suppose is a strictly stationary Markov chain taking countably many states and f is a positive functional defined on it. Assume the chain is aperiodic and irreducible. Then, is relatively stable if and only if is a slowly varying function in the sense of Karamata.
In Theorem 2, we cannot replace with the marginal.
Theorem 3.
There exists an α–mixing homogeneous Markov chain with slowly varying which is not relatively stable with .
Proof of Theorem 3.
Consider the Markov chain with state space and the transition matrix
Set . The stationary initial vector equals
and by Theorem 7.7 on p. 212 in Vol. I, [4], the chain is α–mixing. Further,
and therefore for we have
(cf. Example 7.11 on pp. 217–218 in Vol. I, [4]).
Take . Set . Write when . We have
and therefore . Thus the normalizing sequence in Theorem 1 is .
On the other hand,
and . Thus
and therefore . Further, the normalizing sequence in Theorem 2 is . Thus by Theorem 2 we have
So
Whence for from Theorem 1
Therefore is not relatively stable with . □
The following result is a consequence of Theorem 3.
Corollary 1.
There exists a strongly mixing (martingale difference) sequence whose marginal has a slowly varying truncated second moment, but for which the CLT fails to hold under the corresponding normalization.
Proof of Corollary 1.
Suppose is a sequence of the Rademacher functions independent of the sequence constructed in Theorem 3. Now one can, via Theorem 3, obtain an example of a martingale difference sequence that is strongly mixing and fails to satisfy the CLT, while the truncated second moment of its marginal varies slowly. □
3. CLT via Principle of Conditioning
The Principle of Conditioning (PC) says that if we transfer the conditions of a limit theorem for row-wise independent random variables in such a way that
- (i)
- the expectations are replaced by conditional expectations with respect to the past;
- (ii)
- the convergence of numbers is replaced by convergence in probability of the random variables appearing in the conditions;
then the conclusion will still hold for adapted sequences (see the proof of Th. 2.2.4 in [8]). For example, the Lévy theorem states that if , , then the CLT holds for . Thus if is an adapted sequence and and
then the CLT holds, too (because ). From the latter, the Billingsley–Ibragimov theorem follows. However, by this method the transferred conditions are only sufficient for adapted sequences; their necessity is, in general, lost. Compared with the Markov–Bernstein method, the advantage of the PC is that we avoid the calculation of the dependence coefficient (which might not be easy).
We can apply the PC to dependent random variables, e.g., martingale differences (cf. [9]).
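For martingale differences the transferred conditions take the following familiar form: if $(X_k)$ is adapted to a filtration $(\mathcal{F}_k)$ with $\mathbf{E}[X_k\mid\mathcal{F}_{k-1}]=0$ and, for some constants $B_n\to\infty$,
\[
\frac{1}{B_n^{2}}\sum_{k=1}^{n}\mathbf{E}\bigl[X_k^{2}\mid\mathcal{F}_{k-1}\bigr]\ \xrightarrow{P}\ 1
\qquad\text{and}\qquad
\frac{1}{B_n^{2}}\sum_{k=1}^{n}\mathbf{E}\bigl[X_k^{2}I(|X_k|>\varepsilon B_n)\mid\mathcal{F}_{k-1}\bigr]\ \xrightarrow{P}\ 0\quad\text{for every }\varepsilon>0,
\]
then $S_n/B_n$ converges in distribution to the standard normal law (cf. [8,9]).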
Theorem 4.
Suppose that is an identically distributed, adapted sequence with slowly varying (s.v.) truncated second moment and such that . If is relatively stable with , then the CLT holds.
Proof of Theorem 4.
Set . Since , we have to prove that
or equivalently,
Now, observe that by Karamata’s theorem (see Theorem 8.1.2 in [3])
hence
since Again, by the Karamata theorem (see Theorem 8.1.3 on p. 332 in [3])
so we have
Thus (4) holds since is r.s. with . To prove the conditional Lindeberg condition
we use the inequality
with . Thus
and
So (5) is fulfilled and by the PC (see Th. 4.7 in [10]) we get in distribution. □
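The Karamata-type relations invoked above are of the following standard form: if $H(x)=\mathbf{E}\,X_1^{2}\,I(|X_1|\le x)$ is slowly varying and unbounded, then, as $x\to\infty$,
\[
\frac{x^{2}P(|X_1|>x)}{H(x)}\longrightarrow 0
\qquad\text{and}\qquad
\frac{x\,\mathbf{E}\,|X_1|\,I(|X_1|>x)}{H(x)}\longrightarrow 0
\]
(cf. Section 8.1 in [3]).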
Suppose as . It is not difficult to see that . Thus by Theorem 1 and Theorem 4 we have
Theorem 5.
Suppose that is a strictly stationary, adapted sequence such that and as . If then the CLT holds.
This is because (2) holds for . This result is applicable to GARCH processes (see, e.g., [11]). However, GARCH processes are strongly mixing at an exponential rate; Corollary 1 shows that for a slower-than-exponential rate the CLT may fail.
Proof of Theorem 5.
Set . By the Hölder inequality and 10.13(III) on p. 319 in Vol. I, [4] we get
Using the formula
and de l’Hôpital’s rule we have . Whence
so that (2) holds. Thus is relatively stable and we deduce Theorem 5 from Theorem 4. □
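As a numerical illustration of Theorem 5 (not part of the proof), one may simulate an integrated GARCH(1,1) recursion, whose marginal has tail index 2, so that the variance is infinite while the truncated second moment is slowly varying (cf. [11]), and inspect the self-normalized sums $S_n/(\sum_{k\le n}X_k^{2})^{1/2}$; under relative stability of the squares these are asymptotically equivalent to $S_n/B_n$ and are therefore expected to be approximately standard normal. The parameter values and function names in the sketch below are illustrative.

```python
# Illustration only (not the authors' construction): an IGARCH(1,1) recursion
#   X_t = sigma_t * Z_t,  sigma_t^2 = omega + alpha * X_{t-1}^2 + beta * sigma_{t-1}^2,
# with alpha + beta = 1, has infinite variance while its truncated second moment
# is slowly varying.  We inspect the self-normalized sums S_n / sqrt(sum_k X_k^2).
# All parameter values (omega, alpha, beta, n, reps) are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def igarch_path(n, omega=0.05, alpha=0.3, beta=0.7, burn=1000):
    z = rng.standard_normal(n + burn)
    x = np.empty(n + burn)
    sigma2 = 1.0  # arbitrary positive start; the burn-in reduces its influence
    for t in range(n + burn):
        x[t] = np.sqrt(sigma2) * z[t]
        sigma2 = omega + alpha * x[t] ** 2 + beta * sigma2  # volatility for t + 1
    return x[burn:]

n, reps = 10_000, 300
snorm = np.empty(reps)
for r in range(reps):
    x = igarch_path(n)
    snorm[r] = x.sum() / np.sqrt(np.sum(x ** 2))  # self-normalized partial sum

# If the CLT of Theorem 5 applies, these values should look roughly N(0, 1).
print("sample mean of self-normalized sums:", round(float(snorm.mean()), 3))
print("sample std  of self-normalized sums:", round(float(snorm.std()), 3))
```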
Author Contributions
Conceptualization, A.J. and Z.S.S.; methodology, A.J. and Z.S.S.; software, A.J. and Z.S.S.; validation, A.J. and Z.S.S.; formal analysis, A.J. and Z.S.S.; investigation, A.J. and Z.S.S.; resources, A.J. and Z.S.S.; data curation, A.J. and Z.S.S.; writing—original draft preparation, A.J. and Z.S.S.; writing—review and editing, A.J. and Z.S.S.; visualization, A.J. and Z.S.S.; supervision, A.J. and Z.S.S.; project administration, A.J. and Z.S.S.; funding acquisition, A.J. and Z.S.S. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Data Availability Statement
The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.
Acknowledgments
The authors thank the anonymous referees for their valuable comments, which improved the present paper.
Conflicts of Interest
The authors declare no conflicts of interest.
Correction Statement
This article has been republished with a minor correction to the Data Availability Statement. This change does not affect the scientific content of the article.
References
- Gnedenko, B.V. A Course in Probability Theory; URSS: Moscow, Russia, 2005. (In Russian)
- Gnedenko, B.V.; Kolmogorov, A.N. Limit Distributions for Sums of Independent Random Variables; Addison-Wesley: Reading, MA, USA, 1968.
- Bingham, N.H.; Goldie, C.M.; Teugels, J.L. Regular Variation. In Encyclopedia of Mathematics and Its Applications; Cambridge University Press: Cambridge, UK, 1987; Volume 27.
- Bradley, R.C. Introduction to Strong Mixing Conditions; Kendrick Press: Heber City, UT, USA, 2007; Volumes I–III.
- Matsui, M.; Mikosch, T. The Gaussian central limit theorem for a stationary time series with infinite variance. arXiv 2025, arXiv:2503.15894v2.
- Rio, E. Asymptotic Theory of Weakly Dependent Random Processes; Springer: Berlin/Heidelberg, Germany, 2017.
- Chung, K.L. Markov Chains with Stationary Transition Probabilities; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 1967.
- Merlevède, F.; Peligrad, M.; Utev, S. Functional Gaussian Approximation for Dependent Structures; Oxford University Press: Oxford, UK, 2019.
- Hall, P.; Heyde, C.C. Martingale Limit Theory and Its Application; Academic Press: New York, NY, USA, 1980.
- Petrov, V.V. Limit Theorems of Probability Theory: Sequences of Independent Random Variables; Oxford University Press: Oxford, UK, 1995.
- Buraczewski, D.; Damek, E.; Mikosch, T. Stochastic Models with Power-Law Tails. The Equation X = AX + B; Springer: Berlin/Heidelberg, Germany, 2016.