On the Condition of Independence of Linear Forms with a Random Number of Summands

Abstract: The property of independence of two linear forms with a non-degenerate random number of summands contradicts the Gaussianity of the summands.


Introduction
Interest in the characterization of probability distributions appeared during the 1920s. The starting point was an article by G. Pólya [1], which established an interesting characterization of the Gaussian distribution based on the identical distribution of a random variable and a special linear form. The first result on the characterization of a distribution by the independence of two linear forms of two random variables was obtained by S.N. Bernstein [2]; again, it was a characterization of the Gaussian distribution. The result of S.N. Bernstein was essentially generalized by Skitovich [3] and Darmois [4]. In their works, the independence of two linear forms of an arbitrary number of random variables was considered. However, the result remained the same: independence took place for normally distributed variables only. In 1972, the first monograph on characterization problems in statistics was published, written by A.M. Kagan, Yu.V. Linnik and C.R. Rao [5]. The monograph contained results on the characterization of probability distributions from many fields of statistics, probability and their applications. Among these were, of course, results related to independence properties, generalizing those noted above. Nevertheless, all results on the independence of linear forms led to characterizations of the normal distribution. Note that previously unsolved problems were formulated in the monograph [5], whose solutions could contribute to a better understanding of the role and significance of the characterizations of probability distributions. In the period since the publication of [5], some of these problems have been solved, which led to further development of the corresponding theory. One of the problems that remained unsolved is the following (see [5], pp. 460-461): let {X_i} be a sequence of independent (and, to start with, identically distributed) r.v.'s, let {a_i} and {b_i} be two sequences of real numbers, and let τ be a Markovian stopping time (cf. Chapter 12 also); then construct the 'linear forms' with a random number of summands:
$$ L_1 = \sum_{i=1}^{\tau} a_i X_i, \qquad L_2 = \sum_{i=1}^{\tau} b_i X_i. $$
Investigate the conditions for the independence of L_1 and L_2. Under what conditions on τ and the sequences {a_i} and {b_i} would the independence of L_1 and L_2 imply the normality of X_i?
Here, we consider the case when τ_1 and τ_2 are identically distributed random variables, independent of the sequence {X_i} and of each other. The variant τ_1 = τ_2 = τ almost surely may be considered in the same way and leads to the same result.

Main Result
Our main result is given by the following Theorem.
Theorem 1. Suppose that {X_i} is a sequence of independent non-degenerate random variables, and τ_1 and τ_2 are identically distributed positive integer-valued random variables, independent of the sequence {X_i} and of each other. The linear forms
$$ L_1 = \sum_{j=1}^{\tau_1} a_j X_j, \qquad L_2 = \sum_{j=1}^{\tau_2} b_j X_j $$
are independent for normally distributed {X_i} if and only if τ_1 = τ_2 = n with probability 1 (n is a positive integer constant) and $\sum_{j=1}^{n} a_j b_j \sigma_j^2 = 0$, where σ_j² is the variance of X_j.
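A small Monte Carlo sketch can illustrate both halves of the theorem. All concrete choices below are assumptions for illustration only, not taken from the paper: a = (1, 1), b = (1, -1), standard normal X_j (so Σ a_j b_j σ_j² = 1 - 1 = 0), and τ uniform on {1, 2} in the random case versus τ_1 = τ_2 = 2 in the degenerate case.

```python
import math
import random

# Monte Carlo sketch of Theorem 1 (illustrative, assumed parameters):
# a = (1, 1), b = (1, -1), X_j ~ N(0, 1), tau uniform on {1, 2}.
# Since sum_j a_j b_j sigma_j^2 = 0, the degenerate case tau1 = tau2 = 2
# should give independent forms, while a random tau should not.

def sample_forms(random_tau, rng):
    """Draw one realization of the pair (L1, L2)."""
    a, b = (1.0, 1.0), (1.0, -1.0)
    x = (rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0))
    t1 = rng.choice((1, 2)) if random_tau else 2
    t2 = rng.choice((1, 2)) if random_tau else 2
    l1 = sum(a[j] * x[j] for j in range(t1))
    l2 = sum(b[j] * x[j] for j in range(t2))
    return l1, l2

def corr(pairs):
    """Sample correlation coefficient of a list of (u, v) pairs."""
    n = len(pairs)
    mu = sum(u for u, _ in pairs) / n
    mv = sum(v for _, v in pairs) / n
    cov = sum((u - mu) * (v - mv) for u, v in pairs) / n
    vu = sum((u - mu) ** 2 for u, _ in pairs) / n
    vv = sum((v - mv) ** 2 for _, v in pairs) / n
    return cov / math.sqrt(vu * vv)

rng = random.Random(1)
r_rand = corr([sample_forms(True, rng) for _ in range(50000)])
r_fix = corr([sample_forms(False, rng) for _ in range(50000)])
print(f"random tau: corr ~ {r_rand:.3f}")  # clearly nonzero: forms are dependent
print(f"tau = 2:    corr ~ {r_fix:.3f}")   # ~0; pair is jointly Gaussian
```

A nonzero sample correlation already certifies dependence in the random-τ case; in the degenerate case the pair (L_1, L_2) is jointly Gaussian, so zero covariance (the condition Σ a_j b_j σ_j² = 0 of Theorem 1) is equivalent to independence.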
Proof. Suppose the opposite. This means that there are normally distributed X_j with the parameters m_j ∈ R¹, σ_j > 0 (j = 1, 2, . . .) and a positive integer-valued non-degenerate random variable τ (τ_1 and τ_2 being its independent copies), independent of the sequence {X_j} and such that L_1 and L_2 are independent. The independence of L_1 and L_2 may be written in terms of characteristic functions, which has the form
$$ \mathbb{E}\exp\{i t L_1 + i s L_2\} = \mathbb{E}\exp\{i t L_1\}\cdot\mathbb{E}\exp\{i s L_2\}, \qquad t, s \in \mathbb{R}^1, \tag{1} $$
or, in a more detailed form,
$$ \sum_{k=1}^{\infty}\sum_{l=1}^{\infty} p_k p_l \prod_{j=1}^{\min(k,l)} f_j(a_j t + b_j s) \prod_{j=\min(k,l)+1}^{k} f_j(a_j t) \prod_{j=\min(k,l)+1}^{l} f_j(b_j s) = \sum_{k=1}^{\infty} p_k \prod_{j=1}^{k} f_j(a_j t)\cdot \sum_{l=1}^{\infty} p_l \prod_{j=1}^{l} f_j(b_j s), $$
where p_k = P{τ = k} and f_j(u) = exp{i m_j u − σ_j² u²/2} is the characteristic function of the normal distribution with the parameters m_j and σ_j (j = 1, 2, . . .). The relation (1) may be written in more detail as
$$ \sum_{k,l=1}^{\infty} p_k p_l \exp\Big\{i(t\alpha_k + s\beta_l) - \frac{1}{2}\big(t^2 A_k + 2 t s C_{\min(k,l)} + s^2 B_l\big)\Big\} = \sum_{k=1}^{\infty} p_k e^{i t \alpha_k - t^2 A_k/2}\cdot\sum_{l=1}^{\infty} p_l e^{i s \beta_l - s^2 B_l/2}, \tag{2} $$
where
$$ \alpha_k = \sum_{j=1}^{k} a_j m_j,\quad \beta_k = \sum_{j=1}^{k} b_j m_j,\quad A_k = \sum_{j=1}^{k} a_j^2\sigma_j^2,\quad B_k = \sum_{j=1}^{k} b_j^2\sigma_j^2,\quad C_k = \sum_{j=1}^{k} a_j b_j\sigma_j^2. \tag{3} $$
Let us note that A_k and B_k are non-decreasing functions of k and strictly increasing on the set of all k for which p_k ≠ 0.
Denote by k_o the minimal value of k for which p_k ≠ 0. Let us fix an arbitrary t in (2) and let s → ∞. It is clear that the right-hand side of (2) tends toward zero, and
$$ \sum_{l=1}^{\infty} p_l e^{i s \beta_l - s^2 B_l/2} = p_{k_o}\, e^{i s \beta_{k_o} - s^2 B_{k_o}/2}\,(1 + o(1)) \tag{4} $$
as s → ∞. However, for the left-hand side of (2) we have
$$ \sum_{k,l=1}^{\infty} p_k p_l \exp\Big\{i(t\alpha_k + s\beta_l) - \frac{1}{2}\big(t^2 A_k + 2 t s C_{\min(k,l)} + s^2 B_l\big)\Big\} = p_{k_o}\, e^{i s \beta_{k_o} - s^2 B_{k_o}/2 - t s C_{k_o}} \sum_{k=k_o}^{\infty} p_k\, e^{i t \alpha_k - t^2 A_k/2}\,(1 + o(1)) \tag{5} $$
as s → ∞, since min(k, l) = k_o for l = k_o and every k in the support of τ. From (4) and (5), it follows that
$$ \sum_{k=k_o}^{\infty} p_k\, e^{i t \alpha_k - t^2 A_k/2} = e^{-t s C_{k_o}} \sum_{k=k_o}^{\infty} p_k\, e^{i t \alpha_k - t^2 A_k/2} + o(1) \tag{6} $$
as s → ∞ for arbitrary fixed t ∈ R¹. The left-hand side of (6) does not depend on s, while the right-hand side does unless C_{k_o} = 0; hence C_{k_o} = Σ_{j=1}^{k_o} a_j b_j σ_j² = 0. The same comparison with the roles of the two forms interchanged then yields
$$ \sum_{n=k_o}^{\infty} p_n\, e^{i t \beta_n - t^2 B_n/2} = e^{i t \beta_{k_o} - t^2 B_{k_o}/2}\,(1 + o(1)) \tag{7} $$
as t → ∞.
However, B_n > B_{k_o} for any n > k_o, and t is arbitrary. Therefore, $\exp\{i t \sum_{j=k_o+1}^{n} b_j m_j - t^2 B_n/2\}\big/\exp\{-t^2 B_{k_o}/2\} = o(1)$ as t → ∞. Taking this into account and passing to absolute values on both sides of (7), we can see that
$$ p_{k_o} = 1 + o(1) \tag{8} $$
as t → ∞. The relation (8) implies p_{k_o} = 1, which contradicts the non-degeneracy of τ. Consequently, the independence of L_1 and L_2 forces τ_1 = τ_2 = n with probability 1; for such degenerate τ the pair (L_1, L_2) is jointly Gaussian, and its independence is equivalent to Cov(L_1, L_2) = Σ_{j=1}^{n} a_j b_j σ_j² = 0. □
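The factorization of characteristic functions (1) can also be probed empirically. The sketch below uses assumed parameters only (none come from the paper): a = (1, 1), b = (1, -1), standard normal X_j, and τ_1, τ_2 i.i.d. uniform on {1, 2} in the random case versus τ_1 = τ_2 = 2 in the degenerate case, where Σ a_j b_j σ_j² = 0 and (1) should hold.

```python
import cmath
import random

# Empirical check of the factorization (1): compare E exp{i(t L1 + s L2)}
# with E exp{i t L1} * E exp{i s L2} at one point (t, s). Hypothetical
# parameters: a = (1, 1), b = (1, -1), X_j ~ N(0, 1); tau1, tau2 i.i.d.
# uniform on {1, 2} (random case) or tau1 = tau2 = 2 (degenerate case).

def factorization_gap(random_tau, trials=50000, t=1.0, s=1.0, seed=2):
    """|empirical joint ch.f. - product of empirical marginal ch.f.'s| at (t, s)."""
    rng = random.Random(seed)
    joint = chf1 = chf2 = 0j
    for _ in range(trials):
        x1, x2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        t1 = rng.choice((1, 2)) if random_tau else 2
        t2 = rng.choice((1, 2)) if random_tau else 2
        l1 = x1 + (x2 if t1 == 2 else 0.0)   # a = (1, 1)
        l2 = x1 - (x2 if t2 == 2 else 0.0)   # b = (1, -1)
        joint += cmath.exp(1j * (t * l1 + s * l2))
        chf1 += cmath.exp(1j * t * l1)
        chf2 += cmath.exp(1j * s * l2)
    return abs(joint / trials - (chf1 / trials) * (chf2 / trials))

print(f"random tau: gap ~ {factorization_gap(True):.3f}")   # clearly positive
print(f"tau = 2:    gap ~ {factorization_gap(False):.3f}")  # ~0 up to MC noise
```

A visibly positive gap at a single point (t, s) is already enough to refute (1) and hence independence, which is exactly the situation for the non-degenerate τ in this toy configuration.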

Conclusions
As mentioned in the Introduction, the independence of two linear forms usually leads to the normality of the corresponding random variables. Our Theorem 1 shows that this is not the case in the situation under consideration. The reason for the dependence of the forms for non-degenerate τ_1 and τ_2 lies in the randomness of the number of summands involved. Similar facts may be found in the publications [6,7] for some different problems. One can see an essential difference between problems with a fixed and with a random number of variables involved.