1.1. Notation and Preliminaries
Assume that all the random variables and random vectors are defined on one and the same probability space $(\Omega, \mathfrak{F}, \mathsf{P})$. Let $d \in \mathbb{N}$. The distribution of a random variable $Y$ or a $d$-variate random vector $\mathbf{Y}$ with respect to the measure $\mathsf{P}$ will be denoted $\mathcal{L}(Y)$ and $\mathcal{L}(\mathbf{Y})$, respectively. The weak convergence, the coincidence of distributions and the convergence in probability with respect to a specified probability measure will be denoted by the symbols $\Longrightarrow$, $\stackrel{d}{=}$ and $\stackrel{\mathsf{P}}{\longrightarrow}$, respectively. The product of independent random elements will be denoted by the symbol $\circ$. The symbol $\odot$ denotes the operation of coordinate-wise multiplication of independent random vectors. The vector with all zero coordinates will be denoted $\mathbf{0}$: $\mathbf{0} = (0, \ldots, 0)^{\top}$. The vector all of whose coordinates are equal to 1 will be denoted $\mathbf{1}$: $\mathbf{1} = (1, \ldots, 1)^{\top}$.
A univariate random variable with the standard normal distribution function $\Phi(x)$ will be denoted $X$,
        \Phi(x) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{x}e^{-z^{2}/2}\,dz, \quad x \in \mathbb{R}.
Let $\Sigma$ be a positive definite $(d \times d)$-matrix. The normal distribution in $\mathbb{R}^{d}$ with zero vector of expectations and covariance matrix $\Sigma$ will be denoted $\mathcal{N}(\mathbf{0}, \Sigma)$. This distribution is defined by its density
        \varphi(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}|\Sigma|^{1/2}}\exp\Bigl\{-\frac{1}{2}\mathbf{x}^{\top}\Sigma^{-1}\mathbf{x}\Bigr\}, \quad \mathbf{x} \in \mathbb{R}^{d}.
The characteristic function $\mathfrak{n}(\mathbf{t})$ of a random vector $\mathbf{Y}$ such that $\mathcal{L}(\mathbf{Y}) = \mathcal{N}(\mathbf{0}, \Sigma)$ has the form
        \mathfrak{n}(\mathbf{t}) \equiv \mathsf{E}\exp\{i\langle \mathbf{t}, \mathbf{Y}\rangle\} = \exp\Bigl\{-\frac{1}{2}\langle \mathbf{t}, \Sigma\mathbf{t}\rangle\Bigr\}, \quad \mathbf{t} \in \mathbb{R}^{d}.
Let $E$ be a random variable with the standard exponential distribution: $\mathsf{P}(E < x) = \bigl(1 - e^{-x}\bigr)\mathbf{1}(x \geq 0)$. The characteristic function of the r.v. $E$ has the form
        \mathfrak{e}(t) \equiv \mathsf{E}e^{itE} = \frac{1}{1 - it}, \quad t \in \mathbb{R}. \qquad (1)
Let $\gamma > 0$. The distribution of the random variable $W_{\gamma}$:
        \mathsf{P}(W_{\gamma} < x) = \bigl(1 - e^{-x^{\gamma}}\bigr)\mathbf{1}(x \geq 0),
is called the Weibull distribution with shape parameter $\gamma$. It is obvious that $W_{1} \stackrel{d}{=} E$. It is easy to see that $E^{1/\gamma} \stackrel{d}{=} W_{\gamma}$.
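Indeed, the last relation can be checked in one line from the definitions given above: for $x \geq 0$,
        \mathsf{P}\bigl(E^{1/\gamma} < x\bigr) = \mathsf{P}\bigl(E < x^{\gamma}\bigr) = 1 - e^{-x^{\gamma}} = \mathsf{P}(W_{\gamma} < x).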
Recall that the distribution of a $d$-variate random vector $\mathbf{S}$ is called stable, if for any $a, b > 0$ there exist $c > 0$ and $\mathbf{d} \in \mathbb{R}^{d}$ such that $a\mathbf{S}_1 + b\mathbf{S}_2 \stackrel{d}{=} c\mathbf{S} + \mathbf{d}$, where $\mathbf{S}_1$ and $\mathbf{S}_2$ are independent and $\mathbf{S}_1 \stackrel{d}{=} \mathbf{S}_2 \stackrel{d}{=} \mathbf{S}$. In what follows, we will concentrate our attention on a special sub-class of stable distributions called strictly stable. This sub-class is characterized by the property that in the definition given above $\mathbf{d} = \mathbf{0}$.
In the univariate case, the characteristic function $g(t)$ of a strictly stable random variable can be represented in several equivalent forms (see, e.g., [1,2]). For our further constructions the most convenient form is
        g(t) = \exp\Bigl\{-|t|^{\alpha}\exp\Bigl\{-\frac{i\pi\theta\alpha}{2}\,\mathrm{sign}\,t\Bigr\}\Bigr\}, \quad t \in \mathbb{R}, \qquad (2)
        where
        0 < \alpha \leq 2, \quad |\theta| \leq \theta_{\alpha} = \min\Bigl\{1, \frac{2}{\alpha} - 1\Bigr\}. \qquad (3)
Here, $\alpha$ is the characteristic exponent, $\theta$ is the skewness parameter. Representation (2) leads to a more general representation by additionally introducing a scale parameter. Any random variable with characteristic function (2) will be denoted $X(\alpha, \theta)$ and the characteristic function (2) itself will be written as $g_{\alpha,\theta}(t)$. The distribution function corresponding to the characteristic function $g_{\alpha,\theta}(t)$ will be denoted $G_{\alpha,\theta}(x)$. For definiteness, $X(1, 1) = 1$.
From (2) it follows that the characteristic function of a symmetric ($\theta = 0$) strictly stable distribution has the form
        g_{\alpha,0}(t) = e^{-|t|^{\alpha}}, \quad t \in \mathbb{R}. \qquad (4)
From (4) it is easy to see that $X(2, 0) \stackrel{d}{=} \sqrt{2}X$.
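Indeed, comparing (4) at $\alpha = 2$ with the characteristic function of $\sqrt{2}X$, where $X$ is a standard normal random variable:
        \mathsf{E}\exp\bigl\{it\sqrt{2}X\bigr\} = e^{-\frac{1}{2}(\sqrt{2}t)^{2}} = e^{-t^{2}} = g_{2,0}(t), \quad t \in \mathbb{R}.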
Univariate stable distributions are popular examples of heavy-tailed distributions. Their moments of orders $\delta \geq \alpha$ do not exist (the only exception is the normal law corresponding to $\alpha = 2$). Stable laws, and only they, can be limit distributions for sums of a non-random number of independent identically distributed random variables with infinite variance under linear normalization.
Let $0 < \alpha \leq 1$. By $X(\alpha, 1)$ we will denote a positive random variable with the one-sided stable distribution corresponding to the characteristic function $g_{\alpha,1}(t)$, $t \in \mathbb{R}$. The Laplace–Stieltjes transform $\psi_{\alpha}(s) = \mathsf{E}e^{-sX(\alpha,1)}$ of the random variable $X(\alpha, 1)$ has the form
        \psi_{\alpha}(s) = e^{-s^{\alpha}}, \quad s \geq 0.
The moments of orders $\delta \geq \alpha$ of the random variable $X(\alpha, 1)$ are infinite. For more details see [2,3].
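A direct consequence of this formula is the scaling relation (here $\lambda > 0$ is an arbitrary constant), which is convenient when dealing with the mixture representations considered below:
        \mathsf{E}\exp\bigl\{-s\lambda^{1/\alpha}X(\alpha, 1)\bigr\} = \psi_{\alpha}\bigl(\lambda^{1/\alpha}s\bigr) = e^{-\lambda s^{\alpha}}, \quad s \geq 0.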
Now turn to the multivariate case. By $S_d$ we denote the unit sphere in $\mathbb{R}^{d}$: $S_d = \{\mathbf{u} \in \mathbb{R}^{d}: |\mathbf{u}| = 1\}$. Let $\Gamma$ be a finite (‘spectral’) measure on $S_d$. It is known that the characteristic function of a strictly stable random vector $\mathbf{S}$ has the form
        
with $\alpha$ defined in (3), see [4,5,6,7]. A $d$-variate random vector with the characteristic function (5) will be denoted $\mathbf{S}(\alpha, \Gamma)$.
As is known, a random vector $\mathbf{S}$ has a strictly stable distribution with some characteristic exponent $\alpha$ if and only if for any $\mathbf{u} \in \mathbb{R}^{d}$ the random variable $\langle \mathbf{u}, \mathbf{S}\rangle$ (the projection of $\mathbf{S}$) has the univariate strictly stable distribution with the same characteristic exponent $\alpha$ and some skewness parameter $\theta(\mathbf{u})$ up to a scale coefficient $\gamma(\mathbf{u})$:
        \langle \mathbf{u}, \mathbf{S}\rangle \stackrel{d}{=} \gamma(\mathbf{u})\,X\bigl(\alpha, \theta(\mathbf{u})\bigr), \quad \mathbf{u} \in \mathbb{R}^{d},
see [8]. Moreover, the projection parameter functions $\gamma(\mathbf{u})$ and $\theta(\mathbf{u})$ are related with the spectral measure $\Gamma$ as
        
see [6,7,8]. Conversely, the spectral measure $\Gamma$ is uniquely determined by the projection parameter functions $\gamma(\mathbf{u})$ and $\theta(\mathbf{u})$. However, there is no simple formula for this [7].
A $d$-variate analog of a one-sided univariate strictly stable random variable $X(\alpha, 1)$ is the random vector $\mathbf{S}(\alpha, \Gamma_{+})$, where $0 < \alpha \leq 1$ and $\Gamma_{+}$ is a finite measure concentrated on the set $S_d^{+} = \{\mathbf{u} = (u_1, \ldots, u_d) \in S_d: u_1 \geq 0, \ldots, u_d \geq 0\}$.
Let $\Sigma$ be a symmetric positive definite $(d \times d)$-matrix, $0 < \alpha \leq 2$. If the characteristic function of a strictly stable random vector $\mathbf{S}$ has the form
        \mathsf{E}\exp\{i\langle \mathbf{t}, \mathbf{S}\rangle\} = \exp\bigl\{-(\langle \mathbf{t}, \Sigma\mathbf{t}\rangle)^{\alpha/2}\bigr\}, \quad \mathbf{t} \in \mathbb{R}^{d},
then the random vector $\mathbf{S}$ is said to have the (centered) elliptically contoured stable distribution with characteristic exponent $\alpha$. In this case, for better vividness, we will use the special notation $\mathbf{S}(\alpha, \Sigma)$.
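As an illustration of how the formulas of this section combine, the following standard ‘sub-Gaussian’ mixture representation can be checked directly; here $X(\alpha/2, 1)$ is the one-sided strictly stable r.v. with Laplace–Stieltjes transform $e^{-s^{\alpha/2}}$, assumed independent of $\mathbf{Y}$ with $\mathcal{L}(\mathbf{Y}) = \mathcal{N}(\mathbf{0}, \Sigma)$:
        \mathsf{E}\exp\bigl\{i\bigl\langle \mathbf{t}, \sqrt{2X(\alpha/2, 1)}\,\mathbf{Y}\bigr\rangle\bigr\} = \mathsf{E}\exp\bigl\{-X(\alpha/2, 1)\langle \mathbf{t}, \Sigma\mathbf{t}\rangle\bigr\} = \exp\bigl\{-(\langle \mathbf{t}, \Sigma\mathbf{t}\rangle)^{\alpha/2}\bigr\},
so that $\mathbf{S}(\alpha, \Sigma) \stackrel{d}{=} \sqrt{2X(\alpha/2, 1)}\circ\mathbf{Y}$.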
The paper is organized as follows. In Section 1.2, a detailed description of the univariate Zolotarev problem is given as well as of some related results. Examples of distributions related with the univariate Zolotarev problem are presented. In Section 2, a multivariate analog of the Zolotarev problem is considered. In Section 2.1, the notion of a general multivariate geometric sum is introduced. For this purpose we first give the definitions of a multivariate Bernoulli scheme and the related multivariate geometric distribution. It should be noted that the multivariate geometric distribution can be defined in several different ways; however, the asymptotic behavior of the corresponding distributions in limit theorems is the same. The properties of the multivariate geometric distribution are discussed as well as its relation with the Marshall–Olkin distribution within a special model. In Section 2.2, a multivariate version of the Zolotarev problem and the implied problems for general multivariate geometric sums are considered. Contrary to expectations, the limit distributions appearing within the model under consideration are not necessarily multivariate geometric stable. In particular, it is shown here that the Marshall–Olkin distribution is limiting in the general scheme of multivariate geometric summation, but, in general, is not multivariate geometric stable. In Section 2.3, the notion of an anisotropic multivariate geometric stable distribution is introduced. These distributions can be regarded as limit analogs of multivariate geometric stable distributions. It is shown that a rather wide class of limit distributions for multivariate geometric sums possesses the property of anisotropic geometric stability. The structure of anisotropic multivariate geometric stable distributions is described. In Section 2.4, some examples of these limit distributions are considered. In particular, anisotropic multivariate Linnik and Mittag–Leffler distributions are introduced and some of their properties are discussed.
  1.2. Univariate Zolotarev Problem and Related Distributions
In the 1960s–1980s, topics related to the so-called characterization problems became very popular in probability theory and mathematical statistics. Many excellent results were obtained yielding, in particular, new statistical techniques. The importance of these problems was acknowledged by the publication of the book [9].
In the beginning of the 1980s, V. M. Zolotarev put forward the problem of description of all the r.v.s $Y$ such that for any $p \in (0, 1)$ there exists a r.v. $Y_p$ providing the validity of the equality
        Y \stackrel{d}{=} Y_p + B_p Y,
with the r.v.’s $Y$, $Y_p$, $B_p$ being independent and the r.v. $B_p$ having the Bernoulli distribution with parameter $p$: $\mathsf{P}(B_p = 0) = 1 - \mathsf{P}(B_p = 1) = p$.
Initially it seemed that this was just one more special characterization problem. This problem was solved in 1984 in the paper [10]. It turned out that it is closely tied with generalizations of classical limit theorems to the case of geometric summation. In particular, in the paper [10] it was demonstrated that the Zolotarev problem is equivalent to the problem of description of all r.v.s $Y$ such that for any $p \in (0, 1)$ the representation
        Y \stackrel{d}{=} X_1^{(p)} + \ldots + X_{N_p}^{(p)}
holds with the r.v.s $N_p$, $X_1^{(p)}$, $X_2^{(p)}, \ldots$ being independent, $X_1^{(p)}, X_2^{(p)}, \ldots$ being identically distributed, and the r.v. $N_p$ having the geometric distribution with parameter $p$. These r.v.s $Y$ were called geometrically infinitely divisible. Thus, the Zolotarev problem was reduced to the description of the class of geometrically infinitely divisible distributions.
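Informally, the connection between the two formulations can be seen from the standard recursion for geometric sums: under the parametrization $\mathsf{P}(N_p = k) = p(1-p)^{k-1}$, $k = 1, 2, \ldots$, splitting off the first summand gives
        Y \stackrel{d}{=} X_1^{(p)} + \mathbf{1}(N_p > 1)\bigl(X_2^{(p)} + \ldots + X_{N_p}^{(p)}\bigr) \stackrel{d}{=} Y_p + B_p Y
with $Y_p \stackrel{d}{=} X_1^{(p)}$ and $\mathsf{P}(B_p = 0) = \mathsf{P}(N_p = 1) = p$.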
Problems of this type are interesting in themselves. Moreover, they find numerous applications, for example, in financial and insurance mathematics, reliability and queueing theory, etc. (see, e.g., [11]). Below we will discuss one of these problems considered by Kovalenko [12].
The solution of the Zolotarev problem is given by the following theorem proved in [10].
Theorem 1. A function $f(t)$ is the characteristic function of a geometrically infinitely divisible distribution if and only if it can be represented as
        f(t) = \frac{1}{1 - \ln h(t)}, \quad t \in \mathbb{R},
where $h(t)$ is an infinitely divisible characteristic function.

By analogy with problems of “conventional” summation, in [10] the following important notion was introduced as well. Later this notion was successfully used in many problems.
Definition 1. A r.v. $Y$ is said to have a geometrically strictly stable distribution, if for any $p \in (0, 1)$ there exists a constant $c(p) > 0$ such that
        Y \stackrel{d}{=} c(p)\bigl(Y_1 + \ldots + Y_{N_p}\bigr),
where the r.v.s $Y_1, Y_2, \ldots$ are independent and distributed identically with $Y$, and the r.v. $N_p$ is independent of $Y_1, Y_2, \ldots$ and has the geometric distribution with parameter $p$.

The following theorem was proved in [10].
Theorem 2. A function $f(t)$ is the characteristic function of a geometrically strictly stable distribution if and only if it can be represented as
        f(t) = \frac{1}{1 - \ln g(t)}, \quad t \in \mathbb{R}, \qquad (14)
where $g(t)$ is a strictly stable characteristic function with some characteristic exponent $\alpha \in (0, 2]$.

By the Fubini theorem (or the formula of total expectation) and (1) it is easy to see that the characteristic function (14) corresponds to the r.v.
        Y \stackrel{d}{=} E^{1/\alpha}\circ X(\alpha, \theta), \qquad (15)
that is, any geometrically strictly stable distribution is a scale mixture of a strictly stable law, the mixing distribution being Weibull.
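In more detail, the calculation behind (15) can be sketched as follows, using the scaling property $g_{\alpha,\theta}(ct) = [g_{\alpha,\theta}(t)]^{c^{\alpha}}$, $c > 0$, of strictly stable characteristic functions and the identity $\mathsf{E}e^{zE} = (1 - z)^{-1}$, $\mathrm{Re}\,z \leq 0$, implied by (1) (the r.v.s $E$ and $X(\alpha, \theta)$ being independent):
        \mathsf{E}\exp\bigl\{itE^{1/\alpha}X(\alpha, \theta)\bigr\} = \mathsf{E}\,g_{\alpha,\theta}\bigl(E^{1/\alpha}t\bigr) = \mathsf{E}\bigl[g_{\alpha,\theta}(t)\bigr]^{E} = \frac{1}{1 - \ln g_{\alpha,\theta}(t)}.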
To trace the relation of geometrically strictly stable distributions with random summation, we will use the following auxiliary result proved in [13,14]. Consider a sequence of r.v.s $S_1, S_2, \ldots$ Let $N_1, N_2, \ldots$ be natural-valued r.v.s such that for every $n \in \mathbb{N}$ the r.v. $N_n$ is independent of the sequence $S_1, S_2, \ldots$ In the following statement the convergence is meant as $n \to \infty$.
Lemma 1. Assume that there exist an infinitely increasing (convergent to zero) sequence of positive numbers $\{b_n\}_{n\geq1}$ and a r.v. $S$ such that
        \frac{S_n}{b_n} \Longrightarrow S \quad (n \to \infty). \qquad (16)
If there exist an infinitely increasing (convergent to zero) sequence of positive numbers $\{d_n\}_{n\geq1}$ and a r.v. $N$ such that
        \frac{b_{N_n}}{d_n} \Longrightarrow N \quad (n \to \infty), \qquad (17)
then
        \frac{S_{N_n}}{d_n} \Longrightarrow S \circ N \quad (n \to \infty), \qquad (18)
where the r.v.s on the right-hand side of (18) are independent. If, in addition, $N_n \to \infty$ in probability and the family of scale mixtures of the distribution function of the r.v. $S$ is identifiable, then condition (17) is not only sufficient for (18), but is necessary as well.

This lemma is actually a generalization and sharpening of the famous Gnedenko–Fahim transfer theorem proved in [15] for random sums and of the Dobrushin lemma proved in [16] for power-type normalizing functions, to the case of arbitrary random sequences with independent random indices.
Univariate geometric distributions possess the following well-known property.
Lemma 2. Let $\lambda > 0$, $p_n \in (0, 1)$, $n \in \mathbb{N}$, so that $np_n \to \lambda$ as $n \to \infty$. If the r.v. $N_{p_n}$ has the geometric distribution with parameter $p_n$, then $n^{-1}N_{p_n} \Longrightarrow E_{\lambda}$ as $n \to \infty$, where the r.v. $E_{\lambda}$ has the exponential distribution with parameter $\lambda$.
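For the parametrization $\mathsf{P}(N_p = k) = p(1-p)^{k-1}$, $k = 1, 2, \ldots$, this is verified directly: for every $x > 0$,
        \mathsf{P}\Bigl(\frac{N_{p_n}}{n} > x\Bigr) = (1 - p_n)^{\lfloor nx\rfloor} = \exp\bigl\{\lfloor nx\rfloor\ln(1 - p_n)\bigr\} \longrightarrow e^{-\lambda x} = \mathsf{P}(E_{\lambda} > x), \quad n \to \infty.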
One of the most important results concerning geometrically strictly stable distributions is the following theorem.
Theorem 3. A univariate probability distribution is geometrically strictly stable if and only if it is limiting for a geometric random sum of independent identically distributed r.v.s as the parameter p of the random index tends to zero.
We supply this result with a sketch of the proof. The ‘if’ part directly follows from Lemmas 1 and 2, Theorem 2 and (15). To prove the ‘only if’ part, consider a geometrically strictly stable distribution corresponding to the characteristic function (14) with some $\alpha \in (0, 2]$ and $\theta$. Choose a distribution function $F$ from the domain of attraction of the strictly stable distribution $G_{\alpha,\theta}$ and consider independent identically distributed r.v.s $X_1, X_2, \ldots$ with the common distribution function $F$. For $n \in \mathbb{N}$ denote $S_n = X_1 + \ldots + X_n$. Since $F$ belongs to the domain of attraction of the strictly stable distribution $G_{\alpha,\theta}$, there exists a sequence $\{b_n\}_{n\geq1}$ of positive numbers such that (16) holds with $S = X(\alpha, \theta)$. Moreover, in [17] it was shown that $b_n$ can be chosen as $b_n = n^{1/\alpha}\ell(n)$, $n \in \mathbb{N}$, where $\ell(x)$ is a slowly varying function: for any $y > 0$
        \lim_{x\to\infty}\frac{\ell(yx)}{\ell(x)} = 1 \qquad (19)
(also see [18]). Let $p \in (0, 1)$ and $N_p$ be a r.v. having the geometric distribution with parameter $p$. For simplicity, without loss of generality, let $\lambda = 1$ and $p = \frac{1}{n}$, $n \in \mathbb{N}$. Assume that for each $n \in \mathbb{N}$ the r.v.s $N_{1/n}, X_1, X_2, \ldots$ are independent. Consider the limit behavior of the r.v.s $b_{N_{1/n}}/b_n$. We have
        \frac{b_{N_{1/n}}}{b_n} = \Bigl(\frac{N_{1/n}}{n}\Bigr)^{1/\alpha}\cdot\frac{\ell(N_{1/n})}{\ell(n)}. \qquad (20)
Consider the second term on the right-hand side of (20). Let $\varepsilon$ be an arbitrary small positive number and let $0 < y_1 < y_2 < \infty$ be chosen so that $\mathsf{P}(E \notin [y_1, y_2]) < \frac{\varepsilon}{2}$. By virtue of Lemma 2 there exists an $n_0 = n_0(\varepsilon, y_1, y_2)$ such that
        \mathsf{P}\Bigl(\frac{N_{1/n}}{n} \notin [y_1, y_2]\Bigr) < \varepsilon \qquad (21)
for all $n \geq n_0$. Let $\delta$ be an arbitrary positive number. From (21) it follows that for $n \geq n_0$ we have
        \mathsf{P}\Bigl(\Bigl|\frac{\ell(N_{1/n})}{\ell(n)} - 1\Bigr| > \delta\Bigr) \leq \mathsf{P}\Bigl(\sup_{y_1 \leq y \leq y_2}\Bigl|\frac{\ell(ny)}{\ell(n)} - 1\Bigr| > \delta\Bigr) + \varepsilon. \qquad (22)
According to Theorem 1.1 in [19], convergence (19) is uniform in every closed segment of values of $y$. Therefore, an $n_1 \geq n_0$ can be found such that for all $n \geq n_1$ we have
        \sup_{y_1 \leq y \leq y_2}\Bigl|\frac{\ell(ny)}{\ell(n)} - 1\Bigr| \leq \delta,
so that for these $n$ the first term on the right-hand side of (22) equals zero. Thus,
        \frac{\ell(N_{1/n})}{\ell(n)} \stackrel{\mathsf{P}}{\longrightarrow} 1
as $n \to \infty$. By virtue of (22) and the Slutsky lemma (see [20]) this means that the asymptotic behavior of $b_{N_{1/n}}/b_n$ as $n \to \infty$ coincides with that of $(N_{1/n}/n)^{1/\alpha}$, that is, $b_{N_{1/n}}/b_n \Longrightarrow E^{1/\alpha}$. Now the reference to Lemma 1 with $S = X(\alpha, \theta)$, $N = E^{1/\alpha}$ and (15) completes the proof. An alternative proof of this result can be found in [21].
Based on the ‘if and only if’ character of the result presented in Theorem 3, it became conventional to define geometrically strictly stable distributions as weak limits of the distributions of geometric sums of independent identically distributed r.v.s.
Well-known examples of geometrically strictly stable distributions are the exponential distribution with parameter $\lambda > 0$ corresponding to the case $\alpha = 1$, $\theta = 1$, whose Laplace–Stieltjes transform has the form
        \frac{\lambda}{\lambda + s}, \quad s \geq 0;
the Linnik distribution with parameters $\alpha \in (0, 2]$ and $\lambda > 0$ defined by the characteristic function
        \frac{1}{1 + \lambda|t|^{\alpha}}, \quad t \in \mathbb{R},
with the Laplace distribution defined by the Lebesgue density
        \frac{1}{2}e^{-|x|}, \quad x \in \mathbb{R},
being a particular case corresponding to $\alpha = 2$ (see, e.g., [22]), and the Mittag–Leffler distribution defined by its Laplace–Stieltjes transform
        \frac{1}{1 + \lambda s^{\delta}}, \quad s \geq 0. \qquad (24)
The numbers $\delta \in (0, 1]$ and $\lambda > 0$ are the parameters of this distribution. If $\delta = 1$, we arrive at the exponential distribution. The r.v.s with Laplace–Stieltjes transform (24) will be denoted $M(\delta, \lambda)$. As long ago as 1965, it was shown that the distributions with the Laplace–Stieltjes transform (24), and only they, can be limiting for the distributions of geometric sums of independent identically distributed nonnegative r.v.s (see [12]). This being so, from Theorem 3 it follows that these distributions are geometrically strictly stable. Moreover, from (15) it follows that
        M(\delta, \lambda) \stackrel{d}{=} \lambda^{1/\delta}\,E^{1/\delta}\circ X(\delta, 1).
For more details and the history of the Mittag–Leffler distribution see [22]. In what follows, r.v.s with the Linnik distribution and the Laplace distribution will be denoted $\Lambda(\alpha, \lambda)$ and $L$, respectively.
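For example, with the Laplace density written in the form above, a direct computation recovers the Linnik characteristic function with $\alpha = 2$ and $\lambda = 1$:
        \int_{-\infty}^{\infty}e^{itx}\,\frac{1}{2}e^{-|x|}\,dx = \int_{0}^{\infty}\cos(tx)\,e^{-x}\,dx = \frac{1}{1 + t^{2}}, \quad t \in \mathbb{R}.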
As has already been mentioned, geometrically strictly stable distributions appear in limit theorems for random sums of independent identically distributed r.v.s in which the number of summands has the geometric distribution and is independent of the summands. We recall some theorems of this type.
Consider a sequence $X_1, X_2, \ldots$ of identically distributed r.v.s and the integer-valued r.v. $N_p$ having the geometric distribution with parameter $p \in (0, 1)$. Assume that all these r.v.s are jointly independent.
Theorem 4. Assume that the r.v.s $X_1, X_2, \ldots$ have finite expectation $\mathsf{E}X_1 = a$. Then
        p\sum_{j=1}^{N_p}X_j \Longrightarrow aE
as $p \to 0$.

This theorem is a ‘geometric’ analog of the law of large numbers and is often called the Rényi theorem, see [23].
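A short calculation with characteristic functions indicates why the limit in the Rényi theorem is exponential (a sketch under the parametrization $\mathsf{P}(N_p = k) = p(1-p)^{k-1}$, $k \geq 1$; here $f$ denotes the common characteristic function of the summands, so that $f(s) = 1 + ias + o(s)$ as $s \to 0$):
        \mathsf{E}\exp\Bigl\{itp\sum_{j=1}^{N_p}X_j\Bigr\} = \sum_{k=1}^{\infty}p(1-p)^{k-1}\bigl[f(pt)\bigr]^{k} = \frac{p\,f(pt)}{1 - (1-p)f(pt)} \longrightarrow \frac{1}{1 - iat}, \quad p \to 0,
and $(1 - iat)^{-1}$ is the characteristic function of $aE$.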
The following result (a ‘light’ version of the result of [12]) can be regarded as a generalization of the Rényi theorem.
Theorem 5. Let the common distribution of the nonnegative r.v.s $X_1, X_2, \ldots$ belong to the domain of normal attraction of a one-sided strictly stable distribution with characteristic exponent $\alpha \in (0, 1)$. Then
        p^{1/\alpha}\sum_{j=1}^{N_p}X_j \Longrightarrow M(\alpha, 1)
as $p \to 0$.

Theorem 6. Let the common distribution of the r.v.s $X_1, X_2, \ldots$ belong to the domain of normal attraction of a symmetric strictly stable distribution with characteristic exponent $\alpha \in (0, 2]$. Then
        p^{1/\alpha}\sum_{j=1}^{N_p}X_j \Longrightarrow \Lambda(\alpha, 1)
as $p \to 0$.

As a corollary of this result we obtain the following ‘geometric’ version of the central limit theorem.
Theorem 7. Assume that $\mathsf{E}X_1 = 0$ and $0 < \mathsf{D}X_1 < \infty$, $\sigma^2 = \mathsf{D}X_1$. Then
        \frac{\sqrt{2p}}{\sigma}\sum_{j=1}^{N_p}X_j \Longrightarrow L
as $p \to 0$.

In the present paper we consider a multivariate version of the Zolotarev problem generalizing some results of [10]. An ‘isotropic’ multivariate generalization of these results to the case of geometric random sums of random vectors was considered in [21,24]. In that case, all the coordinates of the vectors are summed up to one and the same geometrically distributed random index, which results in a random scalar scaling of the multivariate stable distribution in the limit geometrically stable law. Here, we extend these results to a more general case of “anisotropic” random summation, where sums of independent random vectors with a multivariate random index having a special multivariate geometric distribution are considered, so that in each coordinate of the random vectors the summation is performed up to its own random index. Anisotropic geometric stable distributions are introduced. It is demonstrated that these distributions are coordinate-wise scale mixtures of elliptically contoured stable distributions with the Marshall–Olkin mixing distributions. The corresponding “anisotropic” analogs of the multivariate Laplace, Linnik and Mittag–Leffler distributions are introduced. Some relations between these distributions are presented.