Randomly Stopped Sums, Minima and Maxima for Heavy-Tailed and Light-Tailed Distributions

Abstract: This paper investigates randomly stopped sums, minima and maxima of heavy- and light-tailed random variables. Conditions on the primary random variables, which are independent but generally not identically distributed, and on the counting random variable are given under which the randomly stopped sum, minimum and maximum are heavy- or light-tailed. The results generalize some existing results in the literature, and examples illustrating them are provided.


Introduction
This paper is devoted to the randomly stopped sums, minima and maxima of heavy- and light-tailed random variables (r.v.s). Such objects appear when the number of random variables under consideration is unknown and is described by some random integer. In particular, randomly stopped sums appear in fields such as insurance and financial mathematics, survival analysis, risk theory, and computer and communication networks. The area of randomly stopped sums of heavy-tailed r.v.s has been well developed for more than 50 years and covers mainly the case of independent identically distributed (i.i.d.) r.v.s. In this paper, we consider the case where the underlying r.v.s are independent but not necessarily identically distributed.
The main task considered in this paper is to give conditions guaranteeing that the distribution F_{S_ν} of the randomly stopped sum S_ν = X_1 + . . . + X_ν (with S_0 = 0) is heavy- or light-tailed, provided that some of the d.f.s F_{X_k} or F_ν are heavy- or light-tailed.
Other objects of the paper are the randomly stopped minima and maxima. By the randomly stopped minimum of sums, we mean the minimum of the partial sums, S_{(ν)} = min_{1≤k≤ν} S_k, and by the randomly stopped maximum of sums, we mean the maximum of the partial sums, S^{(ν)} = max_{0≤k≤ν} S_k, where S_k = X_1 + . . . + X_k and S_0 = 0. Also, we provide some results for the randomly stopped minimum, X_{(ν)} = min{X_1, . . ., X_ν}, and the randomly stopped maximum, X^{(ν)} = max{0, X_1, . . ., X_ν}.
Similarly, we are interested in when F_{X^{(ν)}}, F_{X_{(ν)}}, F_{S^{(ν)}} and F_{S_{(ν)}} are heavy-tailed or light-tailed. Most of our attention is paid to the closure of the heavy-tailed and light-tailed classes of distributions with respect to the random transformations under consideration. For example, Proposition 1 (see parts (iii), (iv)) below implies that a randomly stopped sum remains heavy-tailed if at least one of the primary r.v.s {X_1, X_2, . . .} reached by the counting r.v. ν is heavy-tailed. Proposition 2 (see parts (i), (ii)) shows that the randomly stopped maximum has an analogous property. Meanwhile, Proposition 3 (i) shows that the randomly stopped minimum remains heavy-tailed if the first primary r.v. X_1 is heavy-tailed and the tails of the other primary r.v.s are asymptotically comparable with the distribution tail of the first primary r.v. Proposition 5 (iii) implies that the randomly stopped maximum of sums remains heavy-tailed for any counting r.v. if the first primary r.v. X_1 is heavy-tailed. Meanwhile, according to Proposition 4 (i), in order for the randomly stopped minimum of sums to remain heavy-tailed, it is necessary that the other primary r.v.s {X_2, X_3, . . .} take some nonnegative values. Similar facts about the closure of the class of light-tailed distributions with respect to the considered transformations can also be obtained from Propositions 1-5 below. Similar closure questions for various distribution classes and transformations have been studied in the literature: regularly varying distributions were considered in [23], consistently varying distributions in [2,15], long-tailed distributions in [18,19,21] and dominatedly varying distributions in [6,18,19]. Maxima and sums of nonstationary random-length sequences of random variables with regularly varying tails were studied in [31]. We also mention paper [32], where two independent heavy-tailed r.v.s whose minimum is not heavy-tailed were constructed.
One of the incentives to study randomly stopped structures is related to models describing the insurance business. According to the well-known Sparre Andersen model [33], the insurer's wealth W_u(t) is described by the renewal risk model W_u(t) = u + p t − Σ_{i=1}^{N_θ(t)} Z_i, where u ≥ 0 is the initial capital, p > 0 is a constant premium rate, N_θ(t) is a counting process generated by a sequence of nonnegative r.v.s {θ_1, θ_2, . . .}, and {Z_1, Z_2, . . .} is a sequence of independent random claims. In such a model, the behavior of the insurer's wealth W_u(t) is driven by randomly stopped sums, together with the model ruin probability ψ(u) = P(W_u(t) < 0 for some t ≥ 0). It is well known that the behavior of S_θ(t), the selection of the premium rate p and the estimation of the ruin probability depend on whether the generating elements {θ_1, θ_2, . . .}, {Z_1, Z_2, . . .} and S_θ(t) have light tails or heavy tails, even in the case where the distributions generating the model are identically distributed. For details, see [34-38].
We also note the well-known duality between the homogeneous renewal risk model and the G/G/1 model from queueing theory, where the arrivals follow the counting process generated by the distribution F_θ and the service times have distribution F_Z. In this setting, the ruin probability ψ(u) coincides with the probability that the stationary waiting time exceeds u. For details, see [34].
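This duality can be observed numerically. Below is a minimal Python simulation sketch of the waiting times via the Lindley recursion; the exponential interarrival times, the Pareto claim law and all parameter values are our own illustrative assumptions, not taken from the paper.

```python
import random

def waiting_times(n, premium_rate, interarrival, claim, seed=0):
    """Lindley recursion W_{k+1} = max(0, W_k + Z_k - p * theta_k) for the
    waiting times of a G/G/1 queue; by the duality above, P(W > u) in the
    stationary regime matches the ruin probability psi(u)."""
    rng = random.Random(seed)
    w, out = 0.0, []
    for _ in range(n):
        w = max(0.0, w + claim(rng) - premium_rate * interarrival(rng))
        out.append(w)
    return out

# Illustrative (hypothetical) choices: exponential interarrival times and
# Pareto claims with tail index 2, i.e. heavy-tailed claims with mean 2;
# premium_rate = 3 keeps the system stable (net profit condition).
ws = waiting_times(
    n=100_000,
    premium_rate=3.0,
    interarrival=lambda rng: rng.expovariate(1.0),
    claim=lambda rng: (1.0 - rng.random()) ** -0.5,  # Pareto: P(Z > z) = z^-2, z >= 1
)
u = 10.0
print(sum(w > u for w in ws) / len(ws))  # crude estimate of P(W > u) ~ psi(u)
```

The recursion only tracks the workload seen by successive customers, which is enough for tail-probability estimates of the stationary waiting time.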
The structure of the paper is as follows. In Section 2, we introduce heavy- and light-tailed distributions and formulate two auxiliary lemmas. The main results are formulated in Section 3. Some examples of nonstandard heavy-tailed and light-tailed distributions are presented in Section 4; the heaviness of the distribution tails in these examples is determined on the basis of the statements formulated in Section 3. The proofs of the main results are presented in Section 5. The last section, Section 6, is devoted to a discussion of the obtained results in a broader context, together with a highlighting of future research directions.
We formulate two lemmas that will be used in the proofs of several main propositions. Although the results of the lemmas are well known and can be found, e.g., in [41,43,44], we provide the proofs for the sake of convenience. The first lemma gives equivalent conditions for a distribution F to be heavy- or light-tailed. Throughout, F̄(x) = 1 − F(x) denotes the tail of F.
Lemma 1. Suppose that F is a d.f. of a real-valued r.v. The following statements are equivalent:
(i) F ∈ H, that is, ∫ e^{λx} dF(x) = ∞ for every λ > 0;
(ii) lim sup_{x→∞} e^{λx} F̄(x) = ∞ for every λ > 0;
(iii) lim sup_{x→∞} (log F̄(x))/x = 0.
Similarly, the following statements are equivalent:
(i') F ∈ H^c, that is, ∫ e^{λx} dF(x) < ∞ for some λ > 0;
(ii') lim sup_{x→∞} e^{λx} F̄(x) < ∞ for some λ > 0;
(iii') lim sup_{x→∞} (log F̄(x))/x < 0.
Proof. We prove only the first part of the lemma.
(i) ⇒ (iii). Suppose that ∫ e^{λx} dF(x) = ∞ for any λ > 0. Let, on the contrary, lim sup_{x→∞} (log F̄(x))/x < 0. Then, there exist constants c > 0 and x_0 such that
F̄(x) ≤ e^{−cx}, x ≥ x_0. (2)
For any δ ∈ (0, c), using (2) and the alternative expectation formula (see [45], for instance), we obtain
∫ e^{δx} dF(x) ≤ e^{δx_0} + δ ∫_{x_0}^{∞} e^{δx} F̄(x) dx ≤ e^{δx_0} + δ ∫_{x_0}^{∞} e^{−(c−δ)x} dx < ∞,
which contradicts (i).
(iii) ⇒ (ii). From (iii), we obtain that there exists an infinitely increasing sequence {x_n} such that F̄(x_n) ≥ e^{−x_n/n} for all n ≥ 1. For any given λ > 0, this implies that there exists n_λ ≥ 1 such that λ − 1/n ≥ λ/2 for all n ≥ n_λ. Hence, e^{λx_n} F̄(x_n) ≥ e^{(λ−1/n)x_n} ≥ e^{λx_n/2} tends to infinity as n → ∞, and thus, lim sup_{x→∞} e^{λx} F̄(x) = ∞. Since this holds for any λ > 0, we have (ii).
(ii) ⇒ (i). For every λ > 0 and x ∈ R, we have ∫ e^{λy} dF(y) ≥ e^{λx} F̄(x); hence, (ii) implies that ∫ e^{λx} dF(x) = ∞ for every λ > 0, which is (i).
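The criterion in Lemma 1 (ii) is easy to examine numerically for concrete tails. The Python sketch below (the Pareto and exponential tails are our illustrative choices, not the paper's) evaluates e^{λx} F̄(x) on a grid: the quantity is unbounded for the polynomial tail and vanishes for the exponential tail whenever λ < 1.

```python
import math

def weighted_tail(tail, lam, xs):
    """exp(lam * x) * F_bar(x) along a grid -- the quantity from Lemma 1 (ii):
    F is heavy-tailed iff this is unbounded for EVERY lam > 0."""
    return [math.exp(lam * x) * tail(x) for x in xs]

pareto_tail = lambda x: x ** -2.0     # heavy-tailed: F_bar(x) = x^(-2), x >= 1
expo_tail = lambda x: math.exp(-x)    # light-tailed: F_bar(x) = e^(-x)

xs = (10.0, 50.0, 100.0)
lam = 0.5
heavy_vals = weighted_tail(pareto_tail, lam, xs)  # grows without bound
light_vals = weighted_tail(expo_tail, lam, xs)    # decays, since lam < 1
print(heavy_vals)
print(light_vals)
```

For the exponential tail, taking λ ≥ 1 instead would make the product non-vanishing, illustrating why (ii') only requires finiteness for *some* λ > 0.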
The next lemma implies that H and H c are closed with respect to weak tail equivalence.
Lemma 2. Let F and G be two distributions of real-valued r.v.s.
(i) If F ∈ H, and Ḡ(x) ≥ c F̄(x) for some c > 0 and large x (x > x_c), then G ∈ H.
(ii) If F ∈ H^c, and Ḡ(x) ≤ c F̄(x) for some c > 0 and large x (x > x_c), then G ∈ H^c.

Proof. Consider part (i). By condition (3), we obtain that
e^{λx} Ḡ(x) ≥ ĉ e^{λx} F̄(x)
for some ĉ > 0 and sufficiently large x (x > x_ĉ). Therefore, lim sup_{x→∞} e^{λx} Ḡ(x) = ∞ for every λ > 0, and Lemma 1 (ii) implies that G ∈ H. The proof of part (ii) can be constructed in a similar way by using Lemma 1 (ii'), showing that lim sup_{x→∞} e^{λx} Ḡ(x) < ∞ for some λ > 0. Lemma 2 is proved.

Main Results
In this section, we formulate the main results of the paper. We start with the randomly stopped sums. We notice that the d.f. F_{S_ν} can become heavy-tailed either because of the heavy tail of some element of {F_{X_1}, F_{X_2}, . . .} or because of the heavy tail of the counting r.v. ν.
Proposition 1. Let X_1, X_2, . . . be independent real-valued r.v.s and let ν be a counting r.v. independent of the sequence {X_1, X_2, . . .}. The distribution F_{S_ν} is heavy-tailed if at least one of the following conditions is satisfied: The distribution F_{S_ν} is light-tailed if at least one of the following conditions is satisfied: (vi) sup_{k≥1} E e^{λX_k} < ∞ for some λ > 0, and F_ν ∈ H^c.
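As a numerical companion to Proposition 1, the following Python sketch draws samples of S_ν for independent summands that may depend on their index and an independent counting variable. The concrete choices, exponential summands and a counting law with P(ν ≥ n) = n^(-2), are our own illustrative assumptions in the spirit of the conditions above, not the paper's.

```python
import random

def stopped_sum(rng, sample_nu, sample_x):
    """One draw of S_nu = X_1 + ... + X_nu (with S_0 = 0), where nu is
    independent of the X's and the X's are independent but may depend on
    their index k (so they need not be identically distributed)."""
    n = sample_nu(rng)
    return sum(sample_x(rng, k) for k in range(1, n + 1))

def heavy_nu(rng):
    """A heavy-tailed counting law (our illustrative choice):
    nu = floor((1 - U)^(-1/2)) satisfies P(nu >= n) = n^(-2) for n >= 1.
    The polynomial tail gives E e^{lam * nu} = infinity for every lam > 0."""
    return int((1.0 - rng.random()) ** -0.5)

rng = random.Random(1)
# Light-tailed summands plus a heavy-tailed nu -- the situation where the
# heaviness of F_{S_nu} comes entirely from the counting variable.
samples = [stopped_sum(rng, heavy_nu, lambda r, k: r.expovariate(1.0))
           for _ in range(20_000)]
print(max(samples))
```

Replacing `heavy_nu` with a bounded counting variable would place the experiment under the light-tail conditions of part (vi) instead.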
Our next statement is about the randomly stopped maximum of r.v.s. We observe that some conditions under which the distribution of the randomly stopped maximum F_{X^{(ν)}} becomes heavy-tailed are the same as in Proposition 1. On the other hand, we did not find a way to obtain a heavy-tailed distribution F_{X^{(ν)}} from light-tailed primary r.v.s {X_1, X_2, . . .}.
Proposition 2. Let X_1, X_2, . . . be independent real-valued r.v.s and let ν be a counting r.v. independent of the sequence {X_1, X_2, . . .}.
The statement below concerns the distribution of the randomly stopped minimum of r.v.s. From the formulation below, we observe that the tail of the d.f. F_{X_{(ν)}} has much less chance of becoming heavy compared with the d.f.s F_{S_ν} and F_{X^{(ν)}}.
Proposition 3. Let X_1, X_2, . . . be independent real-valued r.v.s and let ν be a counting r.v. independent of the sequence {X_1, X_2, . . .}.
The next two statements concern the heaviness of the randomly stopped minimum of sums and the randomly stopped maximum of sums. It can be seen from the presented formulations that some of the conditions were already present in the previous statements. However, for the sake of clarity, we present the full statements on the heaviness of F_{S_{(ν)}} and F_{S^{(ν)}}.
Proposition 4. Let X_1, X_2, . . . be independent real-valued r.v.s and let ν be a counting r.v. independent of the sequence {X_1, X_2, . . .}.
(ii) If F_{X_1} ∈ H^c, then F_{S_{(ν)}} ∈ H^c for any counting r.v. ν.
Proposition 5. Let {X_1, X_2, . . .} and ν be r.v.s as in Propositions 1-4. Then F_{S^{(ν)}} ∈ H if at least one of the following conditions is satisfied: in the case of finite supp(ν).
In the i.i.d. case, Proposition 1 immediately implies the following corollaries. Note that the first two corollaries can be found in monograph [41] as Problems 2.12 and 2.13.
Analogous corollaries can be formulated for randomly stopped minima and maxima.

Examples
In this section, we present four examples showing how one can concretely construct heavy-tailed distributions by using the above randomly stopped structures.
Example 1. Let {X_1, X_2, . . .} be a sequence of independent r.v.s such that the first member X_1 has a Pareto distribution and the other elements of the sequence are identically exponentially distributed. According to Proposition 1 (parts (iii) and (iv)) and Proposition 5 (iii), the distributions F_{S_ν} and F_{S^{(ν)}} are heavy-tailed for any counting r.v. independent of the sequence {X_1, X_2, . . .}. This is due to the fact that the first primary distribution has a significantly heavier tail than the other elements of the infinite primary sequence. For instance, in the case of a discrete uniform counting r.v. with parameter N ≥ 2, the corresponding distribution tails belong to the class H. Proposition 2 (ii) implies that the distribution F_{X^{(ν)}} belongs to the class H for any counting r.v. ν independent of {X_1, X_2, . . .}. Meanwhile, Proposition 3 (i) and Proposition 4 (i) imply that F_{X_{(ν)}} and F_{S_{(ν)}} are heavy-tailed for any counting r.v. satisfying 1 ∈ supp(ν). In the case of the discrete uniform counting r.v. ν with parameter N = 3, we have that F_{S_{(ν)}} = F_{X_1}, and the distributions with the corresponding tails are heavy-tailed.

Example 2. Let {X_1, X_2, . . .} be a sequence of independent r.v.s uniformly distributed on the interval [0, 1], i.e., F_{X_k}(x) = x for x ∈ [0, 1] and each k ∈ N. Obviously, E e^{λX_k} = (e^λ − 1)/λ < ∞ for any λ > 0 and all k ∈ N. Therefore, by Proposition 1 (i) and Proposition 5 (i), we obtain that the distributions F_{S_ν} and F_{S^{(ν)}} are heavy-tailed for an arbitrary heavy-tailed counting r.v. ν independent of {X_1, X_2, . . .}.
Suppose that the counting r.v. ν is distributed according to the zeta distribution with parameter 2, i.e., P(ν = n) = 1/(ζ(2) n^2), n ∈ N, where ζ denotes the Riemann zeta function. Such ν is heavy-tailed. Propositions 1 (i) and 5 (i) imply that the corresponding distribution belongs to the class H, where the distribution of S_n for uniform summands is the well-known Irwin-Hall distribution with parameter n; see [46,47] or Section 26.9 in [48]. Meanwhile, Propositions 3 (ii) and 4 (ii) imply that the distributions F_{X_{(ν)}} and F_{S_{(ν)}} are light-tailed despite the fact that the counting r.v. ν distributed according to the zeta distribution is heavy-tailed.
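A zeta(2) counting variable is easy to sample by inverse transform, which makes the heavy-tailed behavior of S_ν in Example 2 observable in simulation. The following Python sketch (the truncation level and sample sizes are our own choices) draws ν from P(ν = n) = 1/(ζ(2) n²) and forms S_ν from Uniform[0, 1] summands.

```python
import math
import random

ZETA2 = math.pi ** 2 / 6  # zeta(2)

def sample_zeta2(rng, nmax=10_000):
    """Inverse-transform sample from P(nu = n) = 1 / (ZETA2 * n^2), n >= 1,
    truncated at nmax for this sketch (the truncated tail mass is negligible)."""
    u = (1.0 - rng.random()) * ZETA2   # u lies in (0, ZETA2]
    acc, n = 0.0, 0
    while acc < u and n < nmax:
        n += 1
        acc += 1.0 / n ** 2
    return n

rng = random.Random(7)
# S_nu with Uniform[0, 1] summands, as in Example 2.
samples = [sum(rng.random() for _ in range(sample_zeta2(rng)))
           for _ in range(5_000)]
print(max(samples))
```

Since ν has an infinite mean here, occasional very large values of ν (and hence of S_ν) dominate the sample maximum, which is the visible signature of the heavy tail.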
Example 3. Let {X_1, X_2, . . .} be a sequence of independent r.v.s distributed according to the Burr type XII law, and let the counting r.v. ν be independent of {X_1, X_2, . . .} and distributed according to the shifted Poisson law. Since the Burr type XII distribution is heavy-tailed, we obtain from Proposition 3 (i) that F_{X_{(ν)}} ∈ H, together with the corresponding tail asymptotics. A graphical representation of the asymptotic relation (8) is shown in Figure 1. We note that Proposition 3 (i) can also be applied to other Burr type XII distributions, whose distribution functions are determined by three positive parameters α, β, γ; see [49], for instance.
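Proposition 3 (i) applies to Burr type XII tails because they decay polynomially. The following Python sketch uses one common parameterization of the Burr XII tail (an assumption on our part; the paper's exact parameter values are not reproduced here) and checks the Lemma 1 heaviness criterion numerically.

```python
import math

def burr_xii_tail(x, alpha=1.0, gamma=2.0, beta=1.0):
    """Tail of a Burr type XII law in one common three-parameter form
    (our assumption): F_bar(x) = (1 + x**gamma / beta) ** (-alpha), x >= 0."""
    return (1.0 + x ** gamma / beta) ** (-alpha)

# Polynomial decay, so exp(lam * x) * F_bar(x) is unbounded for every lam > 0:
# the Lemma 1 criterion for membership in the class H.
lam = 0.1
vals = [math.exp(lam * x) * burr_xii_tail(x) for x in (10.0, 100.0, 500.0)]
print(vals)
```

Varying `alpha`, `beta` and `gamma` changes only the power of the polynomial decay, so the heaviness conclusion is insensitive to the particular positive parameter values, in line with the closing remark of the example.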
Example 4. Let {X_1, X_2, . . .} be a sequence of independent r.v.s such that X_1 is distributed according to the Weibull law with scale parameter 1 and shape parameter 1/2, i.e., F̄_{X_1}(x) = e^{−√x} for x ≥ 0. Since F_{X_1} ∈ H, due to Proposition 4 (i), we obtain that the d.f. of the randomly stopped minimum of sums F_{S_{(ν)}} is heavy-tailed. If, in addition, ν is distributed according to the shifted Poisson law (7), then F_{S_{(ν)}} ∈ H and the corresponding asymptotic relation holds. A graphical representation of the last relation is shown in Figure 2.
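The Weibull law with shape parameter 1/2 in Example 4 is heavy-tailed even though all of its moments are finite. A small Python check (the grid points and sample size are our own choices) evaluates e^{λx} F̄(x) for the tail F̄(x) = exp(−√x) and also samples the law by inversion.

```python
import math
import random

def weibull_half_tail(x):
    """Tail of the Weibull law with scale 1 and shape 1/2 from Example 4:
    F_bar(x) = exp(-sqrt(x)), x >= 0."""
    return math.exp(-math.sqrt(x))

# Heavy-tailed: exp(lam * x - sqrt(x)) -> infinity for every lam > 0.
lam = 0.01
vals = [math.exp(lam * x) * weibull_half_tail(x)
        for x in (100.0, 10_000.0, 40_000.0)]
print(vals)

# Sampling by inversion: X = (-log(1 - U))**2 has the tail above;
# the theoretical mean is Gamma(1 + 1/(1/2)) = Gamma(3) = 2.
rng = random.Random(3)
xs = [(-math.log(1.0 - rng.random())) ** 2 for _ in range(10_000)]
print(sum(xs) / len(xs))
```

This distribution sits on the boundary of the heavy-tailed class: its tail decays faster than any polynomial, yet e^{λx} still dominates e^{√x} eventually for every λ > 0.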

Proofs of the Main Results
In this section, we present the proofs of all the main propositions. We assign a separate subsection to the proof of each proposition.

Proof of Proposition 1
Proof of part (i). For any λ > 0 and an arbitrary K ≥ 1, we have representation (9). From the condition inf_{k≥1} E e^{λX_k} > 1, we derive the estimate min_{1≤k≤K} E e^{λX_k} ≥ c_λ := inf_{k≥1} E e^{λX_k} > 1. Therefore, for all n ∈ {1, . . ., K}, we obtain E e^{λS_n} = Π_{k=1}^{n} E e^{λX_k} ≥ c_λ^n. This, together with (9), implies that E e^{λS_ν} ≥ Σ_{n=1}^{K} P(ν = n) c_λ^n for every K ≥ 1. Since F_ν ∈ H, we have Σ_{n≥1} P(ν = n) c_λ^n = E c_λ^ν = ∞. Hence, E e^{λS_ν} = ∞, implying F_{S_ν} ∈ H by definition. Part (i) of the proposition is proved.

Proof of part (ii).
Let us fix an arbitrary λ > 0. Due to the conditions of part (ii), for such λ, we have inf_{k≥1} E e^{λX_k} > 1. Hence, the assertion of part (ii) follows from part (i) of the proposition.

Proof of part (iii).
The requirement F̄_ν(x) > 0 for all x ∈ R implies that the counting r.v. ν has unbounded support. Thus, we can find K ≥ κ such that P(ν = K) > 0. Let λ be any positive number and M ≥ 1. Then, truncating at level M, using independence and letting M → ∞, we obtain E e^{λS_K} = Π_{k=1}^{K} E e^{λX_k} = ∞, because F_{X_κ} ∈ H and E e^{λX_k} > 0 for each k ∈ {1, . . ., K}. Therefore, F_{S_K} ∈ H. By representation (9), we obtain that F̄_{S_ν}(x) ≥ P(ν = K) F̄_{S_K}(x), implying F_{S_ν} ∈ H. This completes the proof of part (iii) of the proposition.

Proof of part (iv).
Let K be such that P(ν = K) > 0 and κ ≤ K. Clearly, the conditions of part (iv) imply the existence of such a K. To finish the proof of this part, it suffices to repeat the arguments of part (iii).

Proof of part (v)
Suppose that 0 < δ ≤ λ, where λ > 0 is as in the conditions of part (v). By the standard representation (9), we have the corresponding expansion, where S_0^+ = 0 and S_n^+ = X_1^+ + . . . + X_n^+. Condition (4) implies a uniform tail estimate with some c_1 > 0, valid for all k ≥ 1 and all x ∈ R. Therefore, by the alternative expectation formula (see, for instance, [45]), we derive from (12) that sup_{k≥1} E e^{δX_k^+} < ∞. Since X_1^+, X_2^+, . . . are independent r.v.s, we obtain E e^{δS_n^+} = Π_{k=1}^{n} E e^{δX_k^+}. Hence, by inequality (11) and the condition F_ν ∈ H^c, we derive that E e^{δS_ν^+} < ∞, provided that δ is chosen sufficiently small. This implies that F_{S_ν} ∈ H^c.

Proof of part (vi).
The statement of this part can be proved analogously to that of part (v). Namely, the conditions of part (vi) imply that sup_{k≥1} E e^{λX_k} ≤ c_λ for some constants λ > 0 and c_λ ≥ 1. Therefore, using the alternative expectation formula, we derive E e^{δX_k^+} ≤ c_δ for all δ ∈ (0, λ) and k ≥ 1, with some constant c_δ ≥ 1. The last estimate and inequality (11) imply that E e^{δS_ν^+} ≤ E c_δ^ν.
If δ ∈ (0, λ] is sufficiently small, then the last expectation is finite because F_ν ∈ H^c. Hence, F_{S_ν} ∈ H^c as well. Part (vi) of the proposition is proved.

Proof of Proposition 2
Proof of part (i). By the standard representation, we have, for x > 0 and any K such that P(ν = K) > 0 and K ≥ κ, the bound (13). Due to the conditions of part (i), there exists at least one K with the above property. Obviously, (14) holds. Consequently, for an arbitrary λ > 0, we obtain from (13) and (14) that lim sup_{x→∞} e^{λx} F̄_{X^{(ν)}}(x) = ∞. The assertion of part (i) now follows from Lemma 1.

Proof of part (ii).
The proof of this part is similar to that of part (i), because the conditions of part (ii) imply that there exists at least one K such that K ≥ κ and P(ν = K) > 0.

Proof of part (iii).
The standard representation implies that, for positive x, estimate (15) holds. Due to Lemma 1, there is λ > 0 such that the corresponding lim sup is finite. It follows from estimate (15) that an analogous bound holds for all n ≥ 1, with some c_4 > 0 and sufficiently large x (x ≥ x_1). Therefore, by (17) and (18), we obtain that lim sup_{x→∞} e^{λx} F̄_{X^{(ν)}}(x) < ∞. The assertion of part (iii) now follows from Lemma 1.

Proof of Proposition 3
Proof of part (i). By the standard representation, we have the expressions for F̄_{X_{(ν)}}(x) for each positive x. In addition, the conditions of part (i) give that F̄_{X_{(κ)}}(x) > 0 for all positive x. Therefore, the tails F̄_{X_{(ν)}}(x) and F̄_{X_{(κ)}}(x) are weakly equivalent, and we obtain, by using Lemma 2, that F_{X_{(ν)}} is heavy-tailed together with F_{X_{(κ)}}. Due to the condition F_{X_1} ∈ H and Lemma 1, we have lim sup_{x→∞} e^{λx} F̄_{X_1}(x) = ∞ for an arbitrary λ > 0. The requirement that the tails of the other primary r.v.s be comparable with that of X_1 gives F̄_{X_k}(x) ≥ c_5 F̄_{X_1}(x) for some positive c_5, sufficiently large x (x ≥ x_2) and all 1 ≤ k ≤ κ. Therefore, for any positive λ and large x (x ≥ x_2), we obtain the corresponding lower bound. By relation (20), the claim follows in the case κ = 1.

Let us now suppose that κ > 1. Due to the conditions of part (i), F̄_{X_k}(x) ≤ c_6 F̄_{X_1}(x) for some c_6 > 0 and all 1 ≤ k ≤ κ. Hence, by the standard decomposition, we obtain, for positive x, estimate (22). On the other hand, similarly as in the case κ = 1, we have estimate (23). Estimates (22) and (23) imply that the asymptotic relation (6) holds for any possible κ. In addition, we observe that, by Lemma 2, the distribution F_{S_{(ν)}} belongs to H together with F_{X_1}. Part (i) of the proposition is proved.

Proof of part (ii).
The statement of this part follows immediately from estimate (23) and Lemma 1, because lim sup_{x→∞} e^{λx} F̄_{S_{(ν)}}(x) ≤ P(ν ≥ 1) lim sup_{x→∞} e^{λx} F̄_{X_1}(x) for any λ > 0.

Proof of Proposition 5
Proof of part (i). The proof of this part is similar to the proof of part (i) of Proposition 1. Namely, for λ > 0 and K ≥ 2, by using (10), we obtain the corresponding lower bound for E e^{λS^{(ν)}}.

Proof of part (iii). For positive x, we have representation (24). The assertion of part (iii) now follows from Lemma 1 because, by (24), lim sup_{x→∞} e^{λx} F̄_{S^{(ν)}}(x) ≥ lim sup_{x→∞} e^{λx} F̄_{S_ν}(x).
Hence, F_{S^{(ν)}} ∈ H, according to Lemma 2. Part (iv) of the proposition is proved.