Article
On Relative Stability for Strongly Mixing Sequences

by Adam Jakubowski and Zbigniew Stanisław Szewczak *
Faculty of Mathematics and Computer Science, Nicolaus Copernicus University, ul. Chopina 12/18, 87-100 Toruń, Poland
* Author to whom correspondence should be addressed.
Foundations 2025, 5(4), 33; https://doi.org/10.3390/foundations5040033
Submission received: 30 July 2025 / Revised: 6 September 2025 / Accepted: 23 September 2025 / Published: 25 September 2025
(This article belongs to the Section Mathematical Sciences)

Abstract

We consider a class of strongly mixing sequences with infinite second moment. This class contains important GARCH processes applied in econometrics. We establish relative stability for such processes and construct a counterexample showing that it may fail. Applying these results, we obtain a new CLT that does not require exponential decay of the mixing coefficients, and we provide a counterexample for the CLT as well.
MSC:
60F05; 60G10; 60G42

1. Introduction

Let $\{X_k:\ k\in\mathbb{Z}\}$ be a strictly stationary sequence defined on a probability space $(\Omega,\mathcal{F},P)$. In the case of independent random variables, A. Ya. Khinchine in 1926 stated the problem of the relation between the Central Limit Theorem (CLT) for centered summands and the Law of Large Numbers (LLN) (see p. 421 in [1]). Let $U_2(x)=E\big(|X_1|^2 I(|X_1|\le x)\big)$, where $I(A)$ is the indicator of the set $A$, and
$$b_n^2=\sup\{x>0:\ nU_2(x)\ge x^2\}.\qquad(1)$$
It turns out that if $\{X_k\}$ is centered, independent and identically distributed (i.i.d.), then the CLT holds if and only if (see §28 in [2])
$$\frac{X_1^2+\cdots+X_n^2}{b_n^2}\ \xrightarrow{\ P\ }\ 1,$$
where $\xrightarrow{\ P\ }$ denotes convergence in probability. The convergence in probability of normalized sums to 1 was named relative stability in 1936 by A. Ya. Khinchine. One of the equivalent conditions for relative stability is the slow variation (in the sense of Karamata) of $U_2(x)$ (see Th. 8.8.1 in [3]). The latter remains true for uniformly strong mixing sequences (see p. 436 in Vol. I, [4]). Here, we concentrate on strongly mixing sequences such that $E(X_1^2)=\infty$. Recall the following measure of dependence
$$\alpha(n)=\alpha_n=\alpha(\mathcal{F}_{-\infty}^{0},\mathcal{F}_{n}^{\infty})=\sup\{|P(B\cap A)-P(A)P(B)|:\ A\in\mathcal{F}_{-\infty}^{0},\ B\in\mathcal{F}_{n}^{\infty}\},$$
where $\mathcal{F}_k^m=\sigma(\{X_i:\ k\le i\le m\})$ (in notations, definitions, and other explanations we, in principle, follow [4]). We say that $\{X_k\}$ is strongly (or $\alpha$-) mixing if $\lim_{n\to\infty}\alpha(n)=0$. One of the aims of this paper is to give criteria for relative stability applicable to the CLT. The second aim is to give an example of a strongly mixing sequence that fails to be relatively stable with $\{b_n\}$ defined in (1). Thus, the situation is different from that of uniformly strong mixing sequences.
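To make the normalization (1) concrete, the following numerical sketch (ours, not part of the paper) computes $b_n^2$ for the Pareto-type law $P(|X_1|>x)=x^{-2}$, $x\ge 1$, for which $U_2(x)=2\ln x$ is slowly varying and $E(X_1^2)=\infty$; the helper names and the bisection routine are illustrative assumptions only.

```python
import math

# Illustration (not from the paper): P(|X_1| > x) = x**-2 for x >= 1 gives
# U_2(x) = E(X_1^2 I(|X_1| <= x)) = 2*ln(x), a slowly varying function.

def U2(x):
    return 2.0 * math.log(x) if x >= 1.0 else 0.0

def b_n_squared(n):
    """b_n^2 = sup{x > 0 : n*U_2(x) >= x**2}, located by bisection.

    For moderately large n, f(x) = n*U_2(x) - x**2 is positive at x = e
    and eventually negative, so the sup is the upper crossing point.
    """
    f = lambda x: n * U2(x) - x * x
    lo, hi = math.e, 2.0 * math.e
    while f(hi) > 0.0:          # x**2 dominates n*ln(x) eventually
        hi *= 2.0
    for _ in range(200):        # bisection keeps f(lo) >= 0 > f(hi)
        mid = 0.5 * (lo + hi)
        if f(mid) >= 0.0:
            lo = mid
        else:
            hi = mid
    return lo * lo

n = 10**6
b2 = b_n_squared(n)
# At the crossing n*U_2(b_n) = b_n^2, i.e. b_n^2 = 2n*ln(b_n), which is
# of order n*ln(n), the classical normalization for tails ~ C/x^2.
print(b2, n * math.log(n))
```

For this law the ratio $b_n^2/(n\ln n)$ tends to $1$ as $n$ grows, reflecting $b_n^2\sim n\ln n$.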
The paper is organized as follows. In Section 2, the main results are presented: new relative stability results for $\alpha$-mixing sequences with an infinite second moment, stated in terms of the quantile function (see Ch. 10 in [4]), together with a counterexample. To the authors' knowledge, there are no such counterexamples in the existing literature. In Section 3, we apply the results of the previous section and obtain a new CLT for martingale differences with infinite variance, as well as a CLT with heavy-tailed marginals that does not assume exponential decay of $\alpha_n$. Such results are important in modelling volatility phenomena in econometric time series (see [5]).

2. Main Results

Set
$$Q_Z(u)=\inf\{t\in\mathbb{R}:\ P(Z>t)\le u\}\qquad\text{and}\qquad Y_{nk}=X_k^2\,I(|X_k|\le b_n).$$
The following statement gives a sufficient condition for relative stability in the case of $\alpha$-mixing.
Theorem 1. 
Suppose $U_2(x)$ is slowly varying and $E(X_1^2)=\infty$. If
$$n\sum_{k=1}^{n}\int_0^{\alpha_k}Q^2_{Y_{n1}}(u)\,du=O(b_n^4),\qquad(2)$$
then $\{X_k^2\}$ is relatively stable.
Note that in the case of a finite second moment, the “borderline” conditions for the CLT are likewise formulated in terms of the quantile function (see Ch. 10 in [4]).
Proof of Theorem 1.
Write
$$\Big(\sum_{k=1}^{n}x_k\Big)^2=\sum_{k=1}^{n}x_k^2+2\sum_{\nu=2}^{n}\sum_{k=1}^{\nu-1}x_kx_\nu.$$
By this and Theorem 1.1 in [6], Theorem 8.1.3 on p. 332 in [3], and (2), we get
$$\mathrm{Var}\Big(\frac{\sum_{k=1}^{n}Y_{nk}}{b_n^2}\Big)\le \frac{n\,\mathrm{Var}(Y_{n1})}{b_n^4}+\frac{4}{b_n^4}\sum_{\nu=2}^{n}\sum_{k=1}^{\nu-1}\int_0^{\alpha_{\nu-k}}Q^2_{Y_{n1}}(u)\,du\ \le\ o(1)+\frac{4n}{b_n^4}\sum_{k=1}^{n}\int_0^{\alpha_k}Q^2_{Y_{n1}}(u)\,du<\infty.$$
Therefore $\big\{\sum_{k=1}^{n}Y_{nk}/b_n^2\big\}$ is uniformly integrable, so by using the Markov–Bernstein blocking technique (see Sec. 1.17 in Vol. I, [4]), Theorem 2 on p. 140 in [2], and the truncation argument, we obtain Theorem 1. □
Next, consider a strictly stationary Markov chain $\{\xi_k\}$ with a countable number of states and a positive functional $f$ defined on it. Suppose $\{\xi_k\}$ is aperiodic and irreducible, and let $(\pi_1,\pi_2,\ldots)$ be the initial (stationary) distribution. By Theorem 7.7 on p. 212 in Vol. I, [4], $\{f(\xi_k)\}$ is $\alpha$-mixing. We will follow §16 in [7]. Fix a state $i$ and set
$$\tau_1(\omega)=\min\{n\ge 1:\ \xi_n(\omega)=i\},$$
$$\tau_{k+1}(\omega)=\min\{n>\tau_k(\omega):\ \xi_n(\omega)=i,\ \xi_\nu(\omega)\ne i\ \text{for}\ \tau_k(\omega)<\nu<n\}.$$
Let
$$l_n(\omega)=\min\{k\ge 0:\ \tau_k(\omega)\le n<\tau_{k+1}(\omega)\}$$
and put $\rho_k=\tau_{k+1}-\tau_k$. It is well known that $\{\rho_k\}$ is an i.i.d. sequence, $E(\rho_k)=\frac{1}{\pi_i}$, $\frac{l_n}{n}\xrightarrow{\ a.s.\ }\pi_i$, and $\frac{\tau_{l_n}}{n}\xrightarrow{\ a.s.\ }1$, where $\xrightarrow{\ a.s.\ }$ denotes almost sure convergence. We have
$$P(\rho_k=n)=f_{ii}(n)=P\big(\xi_\nu\ne i,\ 0<\nu<n,\ \xi_n=i\ \big|\ \xi_0=i\big).$$
Finally, define
$$Y_k=\sum_{\nu=\tau_k}^{\tau_{k+1}-1}f(\xi_\nu),\qquad Y'_n=\sum_{\nu=0}^{\min\{n,\tau_1-1\}}f(\xi_\nu),\qquad Y''_n=I[l_n\ge 1]\sum_{\nu=\tau_{l_n}}^{n}f(\xi_\nu).$$
We have the following dissection formula
$$S_n=\sum_{k=1}^{n}f(\xi_k)=Y'_n+\sum_{k=1}^{l_n-1}Y_k+Y''_n,$$
where $\{Y_k\}$ is an i.i.d. random sequence, $Y'_n<\infty$ (a.s.), and $Y''_n$ is bounded in probability. Thus, by the Khinchine relative stability theorem (cf. Theorem 8.8.1 on p. 373 in [3]), we get Theorem 2.
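The dissection formula is an exact pathwise identity and can be checked numerically. The toy sketch below (ours, not part of the paper) simulates a three-state chain with regeneration state $i$ and verifies $S_n=Y'_n+\sum_{k=1}^{l_n-1}Y_k+Y''_n$, with the convention that the sum defining $S_n$ runs over $\nu=0,\dots,n$ so that the three pieces partition the time range exactly; the transition matrix and the functional $f$ are arbitrary illustrative choices.

```python
import random

# Toy check (not from the paper) of the dissection formula on a 3-state chain.
random.seed(7)

P = {0: [(0, 0.2), (1, 0.5), (2, 0.3)],
     1: [(0, 0.4), (1, 0.1), (2, 0.5)],
     2: [(0, 0.6), (1, 0.3), (2, 0.1)]}

def step(s):
    u, acc = random.random(), 0.0
    for t, p in P[s]:
        acc += p
        if u < acc:
            return t
    return P[s][-1][0]

f = lambda s: s + 1        # a positive (integer-valued) functional
i = 0                      # the regeneration state
n = 500
path = [1]                 # xi_0, ..., xi_n
for _ in range(n):
    path.append(step(path[-1]))

taus = [t for t in range(1, n + 1) if path[t] == i]   # tau_1 < tau_2 < ...
l_n = len(taus)                                        # returns by time n

S = sum(f(path[v]) for v in range(0, n + 1))

Y_prime = sum(f(path[v]) for v in range(0, min(n, taus[0] - 1) + 1))
blocks = sum(f(path[v]) for k in range(l_n - 1)
             for v in range(taus[k], taus[k + 1]))     # Y_1, ..., Y_{l_n - 1}
Y_dprime = sum(f(path[v]) for v in range(taus[l_n - 1], n + 1)) if l_n >= 1 else 0

print(S == Y_prime + blocks + Y_dprime)
```

Because the time ranges $[0,\tau_1-1]$, $[\tau_1,\tau_{l_n}-1]$, and $[\tau_{l_n},n]$ are disjoint and cover $[0,n]$ whenever $l_n\ge 1$, the identity holds exactly on every sample path.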
Theorem 2. 
Suppose $\{\xi_k\}$ is a strictly stationary Markov chain taking a countable number of states and $f$ is a positive functional defined on it. Assume $\{\xi_k\}$ is aperiodic and irreducible. Then, $\{f(\xi_k)\}$ is relatively stable if and only if $E\big(Y_1 I(Y_1\le x)\big)$ is a slowly varying function in the sense of Karamata.
In Theorem 2, we cannot replace $Y_1$ with the marginal.
Theorem 3. 
There exists an $\alpha$-mixing homogeneous Markov chain $\{X_k^2\}$ with $E\big(X_1^2 I(X_1^2\le x)\big)$ slowly varying which is not relatively stable with $b_n^2$.
Proof of Theorem 3.
Consider the Markov chain $\{\xi_k\}$ with state space $\{1,2,\ldots\}$ and the transition matrix
$$\begin{pmatrix}
1-\dfrac{1^2\ln 3}{2^2\ln 2} & \dfrac{1^2\ln 3}{2^2\ln 2} & 0 & 0 & \cdots\\[1ex]
1-\dfrac{2^2\ln 4}{3^2\ln 3} & 0 & \dfrac{2^2\ln 4}{3^2\ln 3} & 0 & \cdots\\[1ex]
1-\dfrac{3^2\ln 5}{4^2\ln 4} & 0 & 0 & \dfrac{3^2\ln 5}{4^2\ln 4} & \cdots\\
\vdots & & & & \ddots
\end{pmatrix}.$$
Set $\mu=1+\frac{1}{\ln 2}\sum_{k\ge 1}\frac{\ln(k+2)}{(k+1)^2}$. The stationary initial vector $\pi$ equals
$$\Big(\frac{1}{\mu},\ \frac{\ln 3}{\mu\,2^2\ln 2},\ \frac{\ln 4}{\mu\,3^2\ln 2},\ \frac{\ln 5}{\mu\,4^2\ln 2},\ \ldots\Big)$$
and by Theorem 7.7 on p. 212 in Vol. I, [4] the chain $\{\xi_k\}$ is $\alpha$-mixing. Further,
$$P(\{\xi_m=n+m\}\cap\{\xi_0=n\})=\frac{\ln(n+m+1)}{\mu\,(n+m)^2\ln 2}$$
and therefore for $m=n$ we have
$$\liminf_{n\to\infty}\ n^2\alpha_n\ \ge\ \frac{1}{4\mu\ln 2}\ \liminf_{n\to\infty}\ \ln(2n+1)=\infty$$
(cf. Example 7.11 on pp. 217–218 in Vol. I, [4]).
Take $f(x)=\sqrt{x/\ln x}$ and set $X_k=f(\xi_k)$. Write $g(x)\sim h(x)$ when $\lim_{x\to\infty}g(x)/h(x)=1$. We have
$$x^2P(X_1>x)=x^2\sum_{k:\ k/\ln k>x^2}\frac{\ln(k+1)}{\mu k^2\ln 2}\ \sim\ x^2\sum_{k>2x^2\ln x}\frac{\ln(k+1)}{\mu k^2\ln 2}\ \sim\ \frac{x^2\ln(2x^2\ln x)}{2\mu\ln 2\cdot x^2\ln x}\ \sim\ \frac{\ln x}{\mu\ln 2\cdot\ln x}=\frac{1}{\mu\ln 2}$$
and therefore $E\big(X_1^2I(X_1^2\le x)\big)\sim\frac{\ln x}{\mu\ln 2}$. Thus the normalizing sequence in Theorem 1 is $b_n^2=c_n=nE\big(X_1^2I(X_1^2\le c_n)\big)\sim\frac{n\ln n}{\mu\ln 2}$.
On the other hand,
$$f_{11}(n)=P\big(\xi_\nu\ne 1,\ 0<\nu<n,\ \xi_n=1\ \big|\ \xi_0=1\big)=\frac{1^2}{2^2}\frac{\ln 3}{\ln 2}\cdot\frac{2^2}{3^2}\frac{\ln 4}{\ln 3}\cdots\frac{(n-1)^2}{n^2}\frac{\ln(n+1)}{\ln n}\cdot\Big(1-\frac{n^2}{(n+1)^2}\frac{\ln(n+2)}{\ln(n+1)}\Big)=\frac{\ln(n+1)}{n^2\ln 2}-\frac{\ln(n+2)}{(n+1)^2\ln 2}\ \sim\ \frac{2\ln(n+1)}{n^3\ln 2}$$
and $P\big(Y_1=\sum_{\nu=1}^{n-1}\frac{\nu}{\ln\nu}\big)=f_{11}(n)$, where $\{Y_k\}$ are the block sums of Theorem 2 built from the functional $f^2$. Thus
$$P(Y_1>x)=\sum_{n:\ \sum_{\nu=1}^{n-1}\nu/\ln\nu>x}\frac{2\ln(n+1)}{n^3\ln 2}\ \sim\ \sum_{n^2/\ln n>2x}\frac{2\ln(n+1)}{n^3\ln 2}\ \sim\ \sum_{n>\sqrt{2x\ln x}}\frac{2\ln(n+1)}{n^3\ln 2}\ \sim\ \frac{2\ln\big(\sqrt{2x\ln x}\big)}{8x\sqrt{\ln x}\,\ln 2}\ \sim\ \frac{\sqrt{\ln x}}{8x\ln 2}$$
and therefore $E\big(Y_1I(Y_1\le x)\big)\sim\int_1^x\frac{\sqrt{\ln y}}{8y\ln 2}\,dy\sim\frac{\ln^{3/2}x}{12\ln 2}$. Further, the normalizing sequence in Theorem 2 is $c_n\sim\frac{n}{12\ln 2}\ln^{3/2}n$. Thus by Theorem 2 we have
$$\frac{12\ln 2}{n\ln^{3/2}n}\sum_{k=1}^{n}Y_k\ \xrightarrow{\ P\ }\ 1\qquad\text{and}\qquad \frac{12\mu\ln 2}{n\ln^{3/2}n}\sum_{k=1}^{l_n}Y_k\ \xrightarrow{\ P\ }\ 1.$$
So
$$\frac{12\mu\ln 2}{n\ln^{3/2}n}\sum_{k=1}^{n}X_k^2=\frac{12\mu\ln 2}{n\ln^{3/2}n}\sum_{k=1}^{n}f^2(\xi_k)\ \xrightarrow{\ P\ }\ 1.$$
Whence for $c_n$ from Theorem 1
$$\frac{\mu\ln 2}{n\ln n}\sum_{k=1}^{n}X_k^2=\frac{\sqrt{\ln n}}{12}\cdot\frac{12\mu\ln 2}{n\ln^{3/2}n}\sum_{k=1}^{n}f^2(\xi_k)\ \xrightarrow{\ P\ }\ \infty.$$
Therefore $\{X_k^2\}$ is not relatively stable with $b_n^2=c_n$. □
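The algebra behind this counterexample can be verified numerically. The sketch below (ours, not part of the paper) checks, for the chain of Theorem 3, that each row of the transition matrix is a probability vector, that $\pi_k=\ln(k+1)/(\mu k^2\ln 2)$ is stationary (which reduces to $\pi_k q_k=\pi_{k+1}$ for $q_k=k^2\ln(k+2)/((k+1)^2\ln(k+1))$, so the normalizer $\mu$ cancels), and that the telescoping product for $f_{11}(n)$ matches the closed form; all function names are ours.

```python
import math

# Checks (not from the paper) for the chain with p(k,1) = 1 - q_k and
# p(k, k+1) = q_k, where q_k = k^2 ln(k+2) / ((k+1)^2 ln(k+1)), k = 1, 2, ...

def q(k):
    return k**2 * math.log(k + 2) / ((k + 1)**2 * math.log(k + 1))

def pi_unnorm(k):                     # mu * pi_k = ln(k+1) / (k^2 ln 2)
    return math.log(k + 1) / (k**2 * math.log(2))

# Each row (1 - q_k, 0, ..., q_k, 0, ...) is a probability vector.
assert all(0.0 < q(k) < 1.0 for k in range(1, 1000))

# Stationarity: (pi P)_{k+1} = pi_k q_k and (pi P)_1 telescopes, so it
# suffices that pi_k * q_k = pi_{k+1} for every k.
for k in range(1, 1000):
    assert math.isclose(pi_unnorm(k) * q(k), pi_unnorm(k + 1), rel_tol=1e-12)

# First-return probabilities to state 1: telescoping product vs closed form
# f_11(n) = ln(n+1)/(n^2 ln 2) - ln(n+2)/((n+1)^2 ln 2).
def f11_product(n):
    prod = 1.0
    for j in range(1, n):
        prod *= q(j)
    return prod * (1.0 - q(n))

def f11_closed(n):
    log2 = math.log(2)
    return math.log(n + 1) / (n**2 * log2) - math.log(n + 2) / ((n + 1)**2 * log2)

for m in range(1, 60):
    assert math.isclose(f11_product(m), f11_closed(m), rel_tol=1e-10)
print("all checks passed")
```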
The following result is a consequence of Theorem 3.
Corollary 1. 
There exists a strongly mixing (martingale difference) sequence $\{X_k\}$ with $E\big(X_1^2I(|X_1|\le x)\big)$ slowly varying and $E(X_1^2)=\infty$, for which the CLT fails to hold under the normalization $b_n$.
Proof of Corollary 1.
Suppose $\{r_k\}$ is a sequence of Rademacher functions independent of $\{X_k\}$. Via Theorem 3, one obtains an example of a martingale difference sequence $\{r_kX_k\}$ that is strongly mixing and fails to satisfy the CLT, while the truncated second moment of its marginal varies slowly. □

3. CLT via Principle of Conditioning

The Principle of Conditioning (PC) says that if we transfer the conditions of a limit theorem for row-wise independent random variables in such a way that
(i)
the expectations are replaced by conditional expectations with respect to the past;
(ii)
the convergence of numbers is replaced by convergence in probability of the random variables appearing in these conditions;
then the conclusion still holds for adapted sequences (see the proof of Th. 2.2.4 in [8]). For example, the Lévy theorem states that if $\{X_k\}$ is i.i.d. with $E(X_1)=0$ and $E(X_1^2)=1$, then the CLT holds for $\{X_k\}$. Thus if $(X_k,\mathcal{F}_k)$ is an adapted sequence with $E(X_k\,|\,\mathcal{F}_{k-1})=0$ and
$$\frac{E(X_1^2\,|\,\mathcal{F}_0)+\cdots+E(X_n^2\,|\,\mathcal{F}_{n-1})}{n}\ \xrightarrow{\ P\ }\ 1,$$
then the CLT holds, too (because $n=E(X_1^2)+\cdots+E(X_n^2)$). From the latter, the Billingsley–Ibragimov theorem follows. However, by this method only the sufficiency of such conditions transfers to adapted sequences. Compared with the Markov–Bernstein method, the advantage of the PC is that we avoid the calculation of the dependence coefficient (which might not be easy).
We can apply the PC to dependent random variables (e.g., martingale differences; cf. [9]).
Theorem 4. 
Suppose that $(X_k,\mathcal{F}_k)$ is an identically distributed, adapted sequence with $E\big(X_1^2I(|X_1|\le x)\big)$ a slowly varying (s.v.) function and such that $E(X_k\,|\,\mathcal{F}_{k-1})=0$. If $\{X_k^2\}$ is relatively stable with $b_n^2$, then the CLT holds.
Proof of Theorem 4.
Set $X_{nk}=X_kI(|X_k|\le b_n)$ and note $b_n^2\sim nE\big(X_1^2I(|X_1|\le b_n)\big)$. Since $nP(|X_1|>b_n)\to 0$, we have to prove that
$$b_n^{-2}\sum_{k=1}^{n}E\Big(\big(X_{nk}-E(X_{nk}\,|\,\mathcal{F}_{k-1})\big)^2\ \Big|\ \mathcal{F}_{k-1}\Big)\ \xrightarrow{\ P\ }\ 1\qquad(4)$$
or equivalently,
$$b_n^{-2}\sum_{k=1}^{n}\Big(E(X_{nk}^2\,|\,\mathcal{F}_{k-1})-E^2(X_{nk}\,|\,\mathcal{F}_{k-1})\Big)\ \xrightarrow{\ P\ }\ 1.$$
Now, observe that by Karamata's theorem (see Theorem 8.1.2 in [3])
$$\frac{x\,E\big(|X_1|I(|X_1|>x)\big)}{E\big(X_1^2I(|X_1|\le x)\big)}\to 0\quad\text{as}\quad x\to\infty,$$
hence
$$b_n^{-2}\sum_{k=1}^{n}E^2(X_{nk}\,|\,\mathcal{F}_{k-1})\ \le\ b_n^{-2}\sum_{k=1}^{n}\big|E(X_{nk}\,|\,\mathcal{F}_{k-1})\big|\,E|X_k-X_{nk}|\ \xrightarrow{\ L^1\ }\ 0,$$
since $|E(X_{nk}\,|\,\mathcal{F}_{k-1})|\le E\big(|X_k|I(|X_k|>b_n)\,\big|\,\mathcal{F}_{k-1}\big)$. Again, by the Karamata theorem (see Theorem 8.1.3 on p. 332 in [3]),
$$\frac{E\big(X_1^4I(|X_1|\le x)\big)}{x^2\,E\big(X_1^2I(|X_1|\le x)\big)}\to 0\quad\text{as}\quad x\to\infty,$$
so we have
$$b_n^{-4}\,E\Big(\sum_{k=1}^{n}\big(X_{nk}^2-E(X_{nk}^2\,|\,\mathcal{F}_{k-1})\big)\Big)^2=b_n^{-4}\sum_{k=1}^{n}E\big(X_{nk}^2-E(X_{nk}^2\,|\,\mathcal{F}_{k-1})\big)^2\le \frac{2n}{b_n^4}\,E\big(X_1^4I(|X_1|\le b_n)\big)\le C\,\frac{E\big(X_1^4I(|X_1|\le b_n)\big)}{b_n^2\,E\big(X_1^2I(|X_1|\le b_n)\big)}\to 0.$$
Thus (4) holds, since $\{X_k^2\}$ is relatively stable with $b_n^2$. To prove the conditional Lindeberg condition
$$b_n^{-2}\sum_{k=1}^{n}E\Big(\big(X_{nk}-E(X_{nk}\,|\,\mathcal{F}_{k-1})\big)^2\, I\big(|X_{nk}-E(X_{nk}\,|\,\mathcal{F}_{k-1})|>\epsilon b_n\big)\ \Big|\ \mathcal{F}_{k-1}\Big)\ \xrightarrow{\ P\ }\ 0,\qquad(5)$$
we use the inequality
$$2^{-p}\,E\big(|X-Y|^pI(|X-Y|>2z)\big)\ \le\ E\big(|X|^pI(|X|>z)\big)+E\big(|Y|^pI(|Y|>z)\big)$$
with $p=2$. Thus
$$b_n^{-2}\,E\Big|\sum_{k=1}^{n}E\Big(X_{nk}^2\,I\big(|X_{nk}|>\tfrac{\epsilon}{4}b_n\big)\ \Big|\ \mathcal{F}_{k-1}\Big)\Big|\ \le\ \frac{n}{b_n^{2}}\,E\Big(X_1^2\,I\big(|X_1|\in(\tfrac{\epsilon}{4}b_n,\,b_n]\big)\Big)\ \xrightarrow[n\to\infty]{}\ 0$$
and
$$b_n^{-2}\,E\Big|\sum_{k=1}^{n}E\Big(E^2(X_{nk}\,|\,\mathcal{F}_{k-1})\,I\big(|E(X_{nk}\,|\,\mathcal{F}_{k-1})|>\tfrac{\epsilon}{4}b_n\big)\ \Big|\ \mathcal{F}_{k-1}\Big)\Big|\ \le\ b_n^{-2}\sum_{k=1}^{n}E\big(E^2(X_{nk}\,|\,\mathcal{F}_{k-1})\big)\ \le\ \frac{n}{b_n}\,E\big(|X_1|I(|X_1|>b_n)\big)\ \xrightarrow[n\to\infty]{}\ 0.$$
So (5) is fulfilled, and by the PC (see Th. 4.7 in [10]) we get $b_n^{-1}S_n\to N(0,1)$ in distribution. □
Suppose $x^2P(|X_1|>x)\to C$, $C\in(0,\infty)$, as $x\to\infty$. It is not difficult to see that $b_n^2\sim Cn\ln n$. Thus, by Theorem 1 and Theorem 4, we have
Theorem 5. 
Suppose that $(X_k,\mathcal{F}_k)$ is a strictly stationary, adapted sequence such that $E(X_k\,|\,\mathcal{F}_{k-1})=0$ and $x^2P(|X_1|>x)\to C$ as $x\to\infty$. If $n\alpha_n=O(\ln^{-3}n)$, then the CLT holds.
This is because (2) holds for $\{X_k^2\}$. The result is applicable to GARCH processes (see, e.g., [11]). However, GARCH processes are strongly mixing at an exponential rate, and Corollary 1 says that for a slower-than-exponential rate the CLT may fail.
Proof of Theorem 5.
Set $U_4(x)=E\big(X_1^4I(|X_1|\le x)\big)$. By the Hölder inequality and 10.13(III) on p. 319 in Vol. I, [4], we get
$$\int_0^{\alpha_k}Q^2_{Y_{n1}}(u)\,du=\int_0^1 I_{[0,\alpha_k)}(u)\,Q^2_{Y_{n1}}(u)\,du\ \le\ \alpha_k\,U_4(b_n).$$
Using the formula
$$U_4(x)=-x^4P(|X_1|>x)+4\int_0^x y^3P(|X_1|>y)\,dy$$
and de l'Hôpital's rule, we have $U_4(x)\sim Cx^2$. Whence
$$\frac{n}{b_n^4}\sum_{k=1}^{n}\int_0^{\alpha_k}Q^2_{Y_{n1}}(u)\,du\ \le\ \mathrm{const.}\ \frac{1}{\ln n}\Big(1+\sum_{k=2}^{n}\frac{1}{k\ln^{3}k}\Big)=O(1),$$
so that (2) holds. Thus $\{X_k^2\}$ is relatively stable, and we deduce Theorem 5 from Theorem 4. □
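As a sanity check on the truncated fourth-moment identity used in the proof (the minus sign in front of $x^4P(|X_1|>x)$ comes from integration by parts), one may verify it numerically for a concrete law. The sketch below (ours, not part of the paper) uses the Pareto-type tail $P(|X_1|>y)=\min(1,y^{-2})$, for which $C=1$ and direct integration gives $U_4(x)=x^2-1$ for $x\ge 1$; the function names are illustrative.

```python
import math

# Check (not from the paper):
# U_4(x) = -x^4 P(|X_1| > x) + 4 * Int_0^x y^3 P(|X_1| > y) dy
# for the Pareto-type tail P(|X_1| > y) = min(1, y**-2), so x^2 P(|X_1| > x) -> C = 1.

def tail(y):
    return min(1.0, y**-2) if y > 0 else 1.0

def u4_direct(x):
    # density 2*y**-3 on [1, inf): E(X_1^4 I(|X_1| <= x)) = x**2 - 1
    return x * x - 1.0

def u4_formula(x, steps=200000):
    # midpoint-rule quadrature of 4 * Int_0^x y^3 * tail(y) dy
    h = x / steps
    s = 0.0
    for k in range(steps):
        y = (k + 0.5) * h
        s += y**3 * tail(y)
    return -x**4 * tail(x) + 4.0 * h * s

print(u4_direct(10.0), u4_formula(10.0))   # both close to 99
# And U_4(x) / x**2 -> C = 1, matching U_4(x) ~ C x^2 in the proof.
print(u4_direct(1e6) / 1e12)
```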

Author Contributions

Conceptualization, A.J. and Z.S.S.; methodology, A.J. and Z.S.S.; software, A.J. and Z.S.S.; validation, A.J. and Z.S.S.; formal analysis, A.J. and Z.S.S.; investigation, A.J. and Z.S.S.; resources, A.J. and Z.S.S.; data curation, A.J. and Z.S.S.; writing—original draft preparation, A.J. and Z.S.S.; writing—review and editing, A.J. and Z.S.S.; visualization, A.J. and Z.S.S.; supervision, A.J. and Z.S.S.; project administration, A.J. and Z.S.S.; funding acquisition, A.J. and Z.S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors thank the anonymous referees for their valuable comments, which improved the present paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gnedenko, B.V. A Course in Probability Theory; URSS: Moscow, Russia, 2005. (In Russian) [Google Scholar]
  2. Gnedenko, B.V.; Kolmogorov, A.N. Limit Distributions for Sums of Independent Random Variables; Addison-Wesley: Reading, MA, USA, 1968. [Google Scholar]
  3. Bingham, N.H.; Goldie, C.M.; Teugels, J.L. Regular variation. In Encyclopedia of Mathematics and Its Applications; Cambridge University Press: Cambridge, UK, 1987; Volume 27. [Google Scholar]
  4. Bradley, R.C. Introduction to Strong Mixing Conditions; Kendrick Press: Heber City, UT, USA, 2007; Volume I–III. [Google Scholar]
  5. Matsui, M.; Mikosch, T. The Gaussian central limit theorem for a stationary time series with infinite variance. arXiv 2025, arXiv:2503.15894v2. [Google Scholar] [CrossRef]
  6. Rio, E. Asymptotic Theory of Weakly Dependent Random Processes; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar]
  7. Chung, K.L. Markov Chains with Stationary Transition Probabilities; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 1967. [Google Scholar]
  8. Merlevède, F.; Peligrad, M.; Utev, S. Functional Gaussian Approximation for Dependent Structures; OUP: Oxford, UK, 2019. [Google Scholar]
  9. Hall, P.; Heyde, C.C. Martingale Limit Theory and Its Applications; Academic Press: New York, NY, USA, 1980. [Google Scholar]
  10. Petrov, V.V. Limit Theorems of Probability Theory. Sequences of Independent Random Variables; Oxford University Press: Oxford, UK, 1995. [Google Scholar]
  11. Buraczewski, D.; Damek, E.; Mikosch, T. Stochastic Models with Power-Law Tails. The Equation X = AX + B; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]