Some Properties of Information Measures for the GOS Concomitants from the FGM Family

In this paper, we recall, extend, and compute some information measures for the concomitants of the generalized order statistics (GOS) from the Farlie–Gumbel–Morgenstern (FGM) family. We focus on two types of information measures: some related to Shannon entropy, and some related to Tsallis entropy. Among the information measures considered are the residual and past entropies, which are important in a reliability context.


Introduction
The notion of concomitants, or induced order statistics, arose in the early 1970s in the works of David [1] and Bhattacharya [2]. Briefly, when a sample from a bivariate distribution is ordered by the first variate, the second variate paired with the r-th first variate is called the concomitant of the r-th-order statistic. Concomitants are important in situations involving two characteristics in which measuring one of them provides information about the other. Therefore, they have applications in many fields such as selection procedures, inference problems, double sampling plans, and systems reliability. For example, in [3,4], complex systems whose components have two subcomponents performing different tasks are studied from a reliability point of view, and in [5], the distribution theory of the lifetimes of two-component systems is discussed. In studies regarding concomitants, two elements have to be specified: the kind of dependence between the first and second variates, and the kind of ordering of the first variate. The majority of studies are based on the hypothesis of simple order statistics, but there are also studies that assume different kinds of orderings, such as record values or generalized order statistics.
Generalized order statistics (GOS) were introduced by Kamps [6] as a unifying concept for various types of ordered random variables, such as simple order statistics, record values, and sequential order statistics.
In this paper, we focus on the concomitants of GOS, with the dependence structure between the first and the second variate given by the Farlie-Gumbel-Morgenstern (FGM) family. This family is a flexible family of bivariate distributions used as a modeling tool for bivariate data in many fields [7], one such field being reliability; see [3][4][5]. The FGM family has a simple analytical form, but it can describe only relatively weak dependence, because the correlation coefficient between the two components cannot exceed 1/3. To overcome this limitation, extensions of the FGM family have been proposed, for example, iterated FGM distributions or Huang-Kotz FGM distributions [8][9][10][11][12]. The results obtained in our paper will be generalized to these extensions of the FGM family in future work.
For the concomitants mentioned above, we recall and establish properties of several information measures. The information measures that we deal with fall into two categories: information measures related to Shannon entropy and information measures related to Tsallis entropy.
Since it was introduced in physics and adapted to information theory by Shannon in 1948, the concept of entropy has become increasingly important in fields such as information theory, coding theory, probability and statistics, and reliability.
In probability and statistics, entropy measures the uncertainty associated with a random variable. Taking Shannon entropy as a starting point, a series of entropies have been defined as generalizations of it. For the concomitants of GOS from the FGM family, we will look at Shannon-derived and Tsallis-derived entropies, and our main aim is to determine Awad-type extensions for all the considered entropies, because Awad entropies avoid several drawbacks of, for example, Shannon entropy: different systems can have the same entropy, it can take negative values for continuous distributions, it gives different results in the discrete and continuous cases under a linear transformation of the random variable, etc.
Furthermore, for the concomitants of GOS from the FGM family, we will determine not only entropies, but also other information measures, such as the Tsallis divergence and the shift-invariant Fisher-Tsallis information number.
In the following sections, we recall some definitions and properties of GOS and their concomitants, in particular, when the bivariate distribution is in the FGM family. Then, we will discuss Shannon-type entropies, Tsallis-type entropies, Fisher information and divergences for concomitants of GOS from the FGM family. For these concomitants, in the last section, we will introduce new extensions and results on information measures.

Concomitants
The term concomitant was introduced by David (1973) [1] and has the following definition:
Definition 2. Let (X_1, Y_1), (X_2, Y_2), . . . , (X_n, Y_n) be iid bivariate random variables with cumulative distribution function F(x, y). Then, the Y variate associated with the r-th-order statistic of the X's, X_{(r:n)}, denoted by Y_{[r:n]}, is called the concomitant of X_{(r:n)}.
A natural use of concomitants is in selection procedures, when k individuals are chosen on the basis of their X-values. Then, the corresponding Y-values represent performance on an associated characteristic. In reliability theory, the role of the concomitants is emphasized in [3][4][5].

Concomitants of FGM Family
The FGM bivariate distribution family has a flexible form and was studied by Farlie [16], Gumbel [17], Morgenstern [18], and Johnson and Kotz [19].
Definition 3. Let X and Y be two random variables with distribution functions F_X and F_Y, respectively. Additionally, let α be a real number. Then, the FGM family has the distribution function:

F_{X,Y}(x, y) = F_X(x) F_Y(y)[1 + α(1 − F_X(x))(1 − F_Y(y))]. (5)

The corresponding probability density function (pdf) of (5) is:

f_{X,Y}(x, y) = f_X(x) f_Y(y)[1 + α(1 − 2 F_X(x))(1 − 2 F_Y(y))], (6)

where f_X and f_Y are the marginal densities of f_{X,Y}.
The parameter α ∈ [−1, 1] is known as the association parameter, and the two random variables X and Y are independent when α = 0. For α ≠ 0, there is a dependence between the two variables, characterized by the FGM copula, whose properties were studied in [20].
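Since the dependence is fully described by the FGM copula, the weak-dependence limitation can be seen directly by simulation. The sketch below (function names and the conditional-inversion sampler are ours, not from the paper) draws pairs from the FGM copula by inverting the conditional cdf F(v | u) = v[1 + α(1 − 2u)(1 − v)] and checks that the empirical correlation is close to α/3, hence at most 1/3:

```python
import numpy as np

def sample_fgm(n, alpha, rng):
    """Draw n pairs (U, V) from the FGM copula with association parameter alpha."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    a = alpha * (1.0 - 2.0 * u)
    # Invert the conditional cdf F(v|u) = v*(1 + a*(1 - v)); the stable root of
    # a*v^2 - (1 + a)*v + w = 0 is 2w / ((1 + a) + sqrt((1 + a)^2 - 4aw)),
    # which also handles a = 0 (independence) without a special case.
    v = 2.0 * w / ((1.0 + a) + np.sqrt((1.0 + a) ** 2 - 4.0 * a * w))
    return u, v

rng = np.random.default_rng(0)
u, v = sample_fgm(200_000, alpha=1.0, rng=rng)
rho = np.corrcoef(u, v)[0, 1]
print(rho)  # close to alpha/3 = 1/3, the maximal FGM correlation
```

Even at the extreme value α = 1, the correlation stays near 1/3, illustrating why extensions of the FGM family are needed for stronger dependence.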
Concomitants of the FGM family, related to GOS, started to come into notice with the work of Beg and Ahsanullah in 2008 [21], where the density g_{[r,n,m,k]} of the concomitant of the r-th GOS is derived:

g_{[r,n,m,k]}(y) = f_Y(y)[1 + α C*(r, n, m, k)(1 − 2 F_Y(y))], (7)

where C*(r, n, m, k) = 1 − 2 ∏_{j=1}^{r} γ_j/(γ_j + 1), with γ_j = k + (n − j)(m + 1), is a constant.

Remark 2.
If m = 0, k = 1, then C*(r, n, 0, 1) = −(n − 2r + 1)/(n + 1) and

g_{[r:n]}(y) = f_Y(y)[1 − α ((n − 2r + 1)/(n + 1))(1 − 2 F_Y(y))]

is the density of the concomitant of the r-th-order statistic from the FGM family. If m = −1, k = 1, then C*(r, n, −1, 1) = 1 − 2^{1−r} and

g_{[r]}(y) = f_Y(y)[1 + α (1 − 2^{1−r})(1 − 2 F_Y(y))]

is the density of the concomitant of the r-th record value from the FGM family. If we are in the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the density of the concomitant of the r-th-order statistic from the FGM family is (7) with m = R, the removal number.
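The constant C*(r, n, m, k) can be computed exactly. A minimal sketch (helper name ours), assuming the standard GOS parameters γ_j = k + (n − j)(m + 1) and C* = 1 − 2 ∏_{j=1}^{r} γ_j/(γ_j + 1), which reproduces both special cases stated in the remark:

```python
from fractions import Fraction
from functools import reduce

def c_star(r, n, m, k):
    """C*(r,n,m,k) = 1 - 2 * prod_{j=1}^{r} gamma_j/(gamma_j + 1),
    with gamma_j = k + (n - j)(m + 1), computed in exact rational arithmetic."""
    gammas = [Fraction(k + (n - j) * (m + 1)) for j in range(1, r + 1)]
    prod = reduce(lambda p, g: p * g / (g + 1), gammas, Fraction(1))
    return 1 - 2 * prod

# Order statistics (m = 0, k = 1): C* = -(n - 2r + 1)/(n + 1)
print(c_star(2, 5, 0, 1))    # -1/3
# Record values (m = -1, k = 1): C* = 1 - 2^(1-r)
print(c_star(3, 5, -1, 1))   # 3/4
```

Exact rationals make it easy to verify the closed-form special cases for every r rather than up to floating-point error.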
The cumulative distribution function and the survival function of the concomitant of the r-th GOS can also be computed:

G_{[r,n,m,k]}(y) = F_Y(y)[1 + α C*(r, n, m, k)(1 − F_Y(y))],
Ḡ_{[r,n,m,k]}(y) = (1 − F_Y(y))[1 − α C*(r, n, m, k) F_Y(y)].

In the following, in order to make the computations easier to read, we introduce the following notation:

Information Measures for the Concomitants from the FGM Family, Existing Results
In this section, we will recall some definitions and results for the information measures of the concomitants of GOS from the FGM family.

Shannon and Shannon-Related Entropies
Shannon entropy was introduced by Shannon in 1948 [22] and has multiple applications. For a continuous random variable X with density f, it can be defined as:

H(X) = −∫ f(x) log f(x) dx. (12)

Information measures for concomitants derived from the FGM family have been studied by Tahmasebi and Behboodian: in [23] for concomitants of order statistics, and in [24] for concomitants of GOS. Using (7), they proved that the Shannon entropy of Y_{[r]}, the concomitant of the r-th generalized order statistic, is given by (13), where the constant W(r, α, n, m, k) and the function φ_f are defined in (14) and (15), respectively.
Remark 3. In [23], the properties of (13) are analyzed in the particular case when m = 0 and k = 1, i.e., when the GOS reduce to the simple order statistics; in this case, the entropy in (13) is the Shannon entropy of the concomitant of the r-th-order statistic. In [24], the Shannon entropy for record values is also given. If we are in the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the Shannon entropy of the concomitant of the r-th-order statistic from the FGM family is (13) with m = R, the removal number.
Awad, in 1987 [25], noticed that, in the continuous case, Shannon entropy is not preserved under linear transformations and proposed the following entropy, also known in the literature as the Sup-entropy:

A(X) = −∫ f(x) log(f(x)/δ) dx, (18)

where δ = sup{f(x) | x ∈ R}. We will call this entropy the Shannon-Awad entropy. Residual and past Shannon entropies were defined in the context of reliability, being important in measuring the amount of information contained in the residual life or the past life of a unit. In the following, the random variable X, with pdf f, cdf F, and survival function F̄, is considered positive and is interpreted as the lifetime of a unit.
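The defining property of the Shannon-Awad entropy is easy to check numerically. In the sketch below (the helper name and the exponential example are ours, not from the paper), the Awad entropy of an Exp(λ) lifetime equals 1 for every rate λ, since H = 1 − log λ while δ = sup f = λ, so A = H + log δ = 1:

```python
import numpy as np
from scipy.integrate import quad

def shannon_awad(f, support, delta):
    """Shannon-Awad (Sup-)entropy: -integral of f(x) * log(f(x)/delta)."""
    integrand = lambda x: -f(x) * np.log(f(x) / delta) if f(x) > 0 else 0.0
    return quad(integrand, *support)[0]

# For Exp(lam), sup of the density is lam, and H = 1 - log(lam).
vals = [shannon_awad(lambda x, l=lam: l * np.exp(-l * x), (0, np.inf), lam)
        for lam in (0.5, 1.0, 4.0)]
print(vals)  # all approximately 1, independent of the rate
```

Unlike the plain Shannon entropy, which here would range over 1 − log λ and can be negative, the Awad version is invariant under the rescaling of the lifetime.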
Residual entropy was introduced, and its properties analyzed, in the works of Ebrahimi [26] and Ebrahimi and Pellerey [27]. Residual entropy is based on the idea of measuring the expected uncertainty contained in the conditional density of X − t given X > t [27]:

H(X; t) = −∫_t^∞ (f(x)/F̄(t)) log(f(x)/F̄(t)) dx.

In terms of the failure rate, the residual entropy can be written as:

H(X; t) = 1 − (1/F̄(t)) ∫_t^∞ f(x) log λ_F(x) dx,

where λ_F(·) = f(·)/F̄(·) is the failure rate function. Similar to the definition of the residual entropy, Di Crescenzo and Longobardi [28] introduced past entropy as a dual of the residual entropy. Past entropy measures the uncertainty about the past life of a failed unit:

H̄(X; t) = −∫_0^t (f(x)/F(t)) log(f(x)/F(t)) dx.

In terms of the reversed failure rate, past entropy can be written as:

H̄(X; t) = 1 − (1/F(t)) ∫_0^t f(x) log τ_F(x) dx,

where τ_F(·) = f(·)/F(·) is the reversed failure rate function.
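As a small illustration of residual entropy (the example is ours, not from the paper): for a memoryless Exp(λ) lifetime, the conditional density of X − t given X > t is again Exp(λ), so the residual entropy does not depend on t:

```python
import numpy as np
from scipy.integrate import quad

def residual_entropy(f, sf, t, upper=np.inf):
    """H(X; t) = -integral over (t, inf) of (f(x)/sf(t)) * log(f(x)/sf(t))."""
    st = sf(t)
    integrand = lambda x: -(f(x) / st) * np.log(f(x) / st) if f(x) > 0 else 0.0
    return quad(integrand, t, upper)[0]

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)   # Exp(2) density
sf = lambda x: np.exp(-lam * x)        # its survival function
hs = [residual_entropy(f, sf, t) for t in (0.0, 1.0, 3.0)]
print(hs)  # all equal 1 - log(lam): the exponential is memoryless
```

For distributions with increasing or decreasing failure rate, the same function would show H(X; t) varying with t.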
Residual and past entropies for concomitants of GOS from the FGM family were determined by Mohie El-Din et al. in [29]. They also considered concomitants of other types of GOS, but the form of the entropies is similar. The residual entropy of the concomitant of the r-th GOS from the FGM family is given by (21) [29]. We notice that for t = 0, the residual entropy (21) becomes the entropy (13).

Remark 4.
For m = 0 and k = 1, we obtain the residual Shannon entropy for the concomitant of the r-th-order statistic. For m = −1 and k = 1, we obtain the residual Shannon entropy for the concomitant of the r-th record value. In the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the residual Shannon entropy of the concomitant of the r-th-order statistic from the FGM family is (21) with m = R, the removal number.
In a similar way, the past entropy for the concomitant of the r-th GOS from the FGM family is defined in (26) [29]. We notice that for t → ∞, the past entropy (26) becomes the entropy (13).

Tsallis and Tsallis-Related Entropies
Tsallis entropy was first introduced and used in the context of cybernetics theory by Havrda and Charvát [30], but it became well known after its definition as a generalization of Boltzmann-Gibbs statistics, in the context of thermodynamics, by Tsallis in 1988 [31]. Being the starting point of the field of non-extensive statistics, Tsallis entropy is a non-additive generalization of Shannon entropy and, for a continuous random variable X with density function f, it can be defined as:

T_q(X) = (1/(q − 1)) (1 − ∫ f(x)^q dx), q > 0, q ≠ 1. (29)

When q → 1, Tsallis entropy converges to Shannon entropy. Tsallis entropy has, in turn, various generalizations; see, for example, [32]. Another important element in non-extensive statistics is the log_q function:

log_q(x) = (x^{1−q} − 1)/(1 − q), (30)

and Tsallis entropy can be obtained using this function in two ways:

T_q(X) = E[log_q(1/f(X))] = −∫ f(x)^q log_q(f(x)) dx.

Tsallis entropy has applications in many fields, from statistical mechanics and thermodynamics to image processing and reliability, sometimes being more suited to measuring uncertainty than the classical Shannon entropy [33,34].
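A small numerical sketch of the definition and its q → 1 limit (the distribution choice is ours): for Exp(1) we have ∫ f^q = 1/q, hence T_q = 1/q, which tends to the Shannon entropy H = 1 as q → 1:

```python
import numpy as np
from scipy.integrate import quad

def tsallis(f, support, q):
    """T_q(X) = (1 - integral of f(x)^q) / (q - 1)."""
    iq = quad(lambda x: f(x) ** q, *support)[0]
    return (1.0 - iq) / (q - 1.0)

f = lambda x: np.exp(-x)  # Exp(1) density; integral of f^q is 1/q, so T_q = 1/q
for q in (0.5, 0.9, 0.999, 2.0):
    print(q, tsallis(f, (0, np.inf), q))
```

The printed values decrease toward 1/q = 1 as q approaches 1, matching the Shannon entropy of Exp(1).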
In [35], Tsallis entropy was computed and its properties obtained for record values and their concomitants when the bivariate distribution is in the FGM family.
Similar to the Shannon case, we can consider residual and past variants of the Tsallis entropy in the context of reliability. In [36], Nanda and Paul introduced the residual Tsallis entropy as the 'first kind residual entropy of order β'. In our notation, β is q:

T_q(X; t) = (1/(q − 1)) (1 − ∫_t^∞ (f(x)/F̄(t))^q dx). (31)

In addition to entropy-type information measures, there are two other types of information measures that can be associated with probability distributions: Fisher measures and divergence measures [37].
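Assuming the first-kind residual Tsallis entropy of order q recalled above, the memorylessness of the exponential again makes the residual Tsallis entropy constant in t; a sketch (helper names and the example are ours):

```python
import numpy as np
from scipy.integrate import quad

def residual_tsallis(f, sf, t, q, upper=np.inf):
    """T_q(X; t) = (1 - integral over (t, inf) of (f(x)/sf(t))^q) / (q - 1)."""
    st = sf(t)
    iq = quad(lambda x: (f(x) / st) ** q, t, upper)[0]
    return (1.0 - iq) / (q - 1.0)

lam, q = 2.0, 1.5
f = lambda x: lam * np.exp(-lam * x)
sf = lambda x: np.exp(-lam * x)
vals = [residual_tsallis(f, sf, t, q) for t in (0.0, 0.5, 2.0)]
print(vals)  # constant in t for the memoryless exponential
```

The closed form here is (1 − λ^{q−1}/q)/(q − 1) for every t, which the numerical values reproduce.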

Fisher Information Number
Fisher information measures the amount of information that a sample carries about an unknown parameter; therefore, it measures the uncertainty about an unknown characteristic of a population. If the parameter is a location parameter, then the Fisher information is shift-invariant and has the form:

I(X) = ∫ (f′(x))²/f(x) dx.

Shift-invariant Fisher information, also called the Fisher information number (FIN), was studied in [38]. It has applications in statistical physics, where it is also known by the name extreme physical information [39], and it is used in analyzing the evolution of dynamical systems.
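The FIN can be checked against the classical value 1/σ² for a normal density (the example and helper names are ours, not from the paper):

```python
import numpy as np
from scipy.integrate import quad

def fisher_information_number(f, df, support):
    """I(X) = integral of (f'(x))^2 / f(x), the shift-invariant Fisher information."""
    integrand = lambda x: df(x) ** 2 / f(x) if f(x) > 0 else 0.0
    return quad(integrand, *support)[0]

sigma = 2.0
f = lambda x: np.exp(-x ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))
df = lambda x: -x / sigma ** 2 * f(x)  # derivative of the normal density
fin = fisher_information_number(f, df, (-np.inf, np.inf))
print(fin)  # 1/sigma^2 = 0.25
```

Passing the derivative explicitly avoids numerical differentiation noise; for other densities, a finite-difference `df` would work as well.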
For the concomitants of GOS from the FGM family, the Fisher information number was determined in [40].

Divergence Measures
Divergences are useful tools when a measure of the difference between two probability distributions is needed; therefore, they have applications in various fields, from inference for Markov chains [41][42][43] to machine learning [44,45]. One of the best-known divergences is the Kullback-Leibler divergence [46,47], which, for two continuous random variables Z_1, with probability density f_1, and Z_2, with probability density f_2, is:

D(Z_1‖Z_2) = ∫ f_1(x) log(f_1(x)/f_2(x)) dx. (33)

The Kullback-Leibler divergence for the concomitants of GOS from the FGM family was computed in [24], and the result is distribution-free.
One of the generalizations of the Kullback-Leibler divergence is the Tsallis divergence, which extends the Kullback-Leibler divergence in the same way in which Tsallis entropy extends Shannon entropy. There is a very rich literature on the Tsallis divergence, or Tsallis relative entropy, in the case of discrete distributions; see, for example, [48][49][50]. The Tsallis divergence for continuous distributions does not appear as frequently in the literature, being studied mainly in the machine learning context [44,45]. The Tsallis divergence for the concomitants will be determined in the next section.
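Both divergences can be approximated by numerical integration; a sketch with two exponential densities (the example is ours), assuming the usual continuous Tsallis divergence D_q(f_1‖f_2) = (∫ f_1^q f_2^{1−q} dx − 1)/(q − 1), which tends to the Kullback-Leibler divergence as q → 1:

```python
import numpy as np
from scipy.integrate import quad

def kl(f1, f2, support):
    """Kullback-Leibler divergence: integral of f1 * log(f1/f2)."""
    integrand = lambda x: f1(x) * np.log(f1(x) / f2(x)) if f1(x) > 0 else 0.0
    return quad(integrand, *support)[0]

def tsallis_div(f1, f2, support, q):
    """Tsallis divergence: (integral of f1^q * f2^(1-q) - 1) / (q - 1)."""
    iq = quad(lambda x: f1(x) ** q * f2(x) ** (1.0 - q), *support)[0]
    return (iq - 1.0) / (q - 1.0)

l1, l2 = 3.0, 1.0
f1 = lambda x: l1 * np.exp(-l1 * x)
f2 = lambda x: l2 * np.exp(-l2 * x)
d_kl = kl(f1, f2, (0, np.inf))          # log(l1/l2) + l2/l1 - 1
d_q = tsallis_div(f1, f2, (0, np.inf), q=0.999)
print(d_kl, d_q)                         # nearly equal for q close to 1
```

For exponentials, the closed form D(Exp(λ₁)‖Exp(λ₂)) = log(λ₁/λ₂) + λ₂/λ₁ − 1 provides an exact check on the quadrature.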

Information Measures for the Concomitants from FGM Family, New Results
In this section, we will provide some generalizations of the existing results on the information measures for the concomitants of GOS from the FGM family, results that are mentioned in the previous section. We are interested in Awad-type extensions of the entropies, in residual and past Tsallis entropies, in a Tsallis-type extension of the FIN, and in the Tsallis divergence.

Shannon and Shannon-Related Entropies
One can easily notice that the relationship between the Shannon-Awad entropy (18) and the Shannon entropy (12) is:

A(X) = H(X) + log δ. (34)

In the following, we provide natural extensions of the results obtained in [24], considering the Shannon-Awad entropy instead of the Shannon entropy. Thus, the Shannon-Awad entropy of the concomitant of the r-th GOS from the FGM family is given by (35), where W(r, α, n, m, k) and φ_f are given by (14) and (15), and δ is the supremum of the concomitant density.

Remark 5.
For the simple OS, that is, for the concomitant of the r-th-order statistic from the FGM family, the Shannon-Awad entropy is: with δ_{[r]} = sup{g_{[r]}(x) | x > 0} and g_{[r]} being here the pdf of the concomitant of the r-th-order statistic.
For the record values, that is, for the concomitant of the r-th record value from the FGM family, the Shannon-Awad entropy is: with δ_{[r]} = sup{g_{[r]}(x) | x > 0} and g_{[r]} being here the pdf of the concomitant of the r-th record value.
In the case of progressive type II censoring order statistics with equi-balanced censoring scheme, the Shannon-Awad entropy of the concomitant of r-th-order statistic from the FGM family is (35) with m = R, the removal number.
An extension of the above entropies is given by the residual and past Shannon-Awad entropies. We define the residual Shannon-Awad entropy as: In terms of the failure rate, the residual Shannon-Awad entropy can be written as: We notice that the relationship between the residual Shannon entropy and the residual Shannon-Awad entropy is similar to (34), namely: In a similar way, we can extend the past Shannon entropy to the past Shannon-Awad entropy: As a function of the reversed failure rate, the past Shannon-Awad entropy can be written as: We can also write the relationship between the past Shannon-Awad entropy and the past Shannon entropy: Taking into account the above relationships, (21), and (26), we can obtain the Awad-type extension of the Shannon entropy for the concomitant of the r-th GOS from the FGM family, when the concomitant represents the residual life or the past life of a unit.

Theorem 1. Residual Shannon-Awad entropy for the concomitant of r-th GOS from the FGM family is:
where K_1(r, t, α, n, m, k) and φ_f(t) are given by (22) and (23), respectively, and δ_{(t,∞)} is the supremum of the conditional density on (t, ∞). The past Shannon-Awad entropy for the concomitant of the r-th GOS from the FGM family is: where K_2(r, t, α, n, m, k) and φ_f(y) are given by (22) and (23), respectively, and δ_{(0,t)} is the supremum of the conditional density on (0, t).

Corollary 1. The residual Shannon-Awad entropy for the concomitant of r-th-order statistic is:
The residual Shannon-Awad entropy for the concomitant of the r-th record value is: In the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the residual Shannon-Awad entropy of the concomitant of the r-th-order statistic from the FGM family is (46) with m = R, the removal number. Similar results can also be obtained for the past Shannon-Awad entropy.

Tsallis and Tsallis-Related Entropies
Information measures related to Tsallis entropy for the concomitants are scarce in the literature. In [35], the Tsallis entropy and the residual Tsallis entropy for the concomitants of the record values from the FGM family are obtained. In this subsection, we obtain more general results, computing Tsallis entropies for the concomitants of generalized order statistics and, furthermore, considering Awad-type extensions of the Tsallis entropies.

Theorem 2.
Tsallis entropy for the concomitant of the r-th GOS from the FGM family is: where U is a U(0, 1) random variable and E_U denotes the expectation of f_Y(F_Y^{-1}(U))^{q−1} U^{k−s}.

Proof.
Taking into account the definition of Tsallis entropy (29) and the density of the concomitants (7), we obtain: We have that: Additionally, if we consider the transformation u = F_Y(y), the result (50) follows.
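The substitution in the proof can be illustrated numerically: the integral of g^q over y must agree with the corresponding expectation over U = F_Y(Y) ~ U(0, 1). A sketch with Y ~ Exp(1) and an illustrative stand-in value for C* (both choices are ours, not from the paper):

```python
import numpy as np
from scipy.integrate import quad

alpha, c_star, q = 0.5, -0.25, 1.8  # c_star: illustrative stand-in for C*(r, n, m, k)

f = lambda y: np.exp(-y)                 # Y ~ Exp(1)
F = lambda y: 1.0 - np.exp(-y)
g = lambda y: f(y) * (1.0 + alpha * c_star * (1.0 - 2.0 * F(y)))  # concomitant-type density

norm = quad(g, 0, np.inf)[0]             # g integrates to 1
direct = quad(lambda y: g(y) ** q, 0, np.inf)[0]

# With u = F(y), the integral of g^q becomes
# E_U[f(F^{-1}(U))^{q-1} * (1 + alpha*c_star*(1 - 2U))^q]; for Exp(1), f(F^{-1}(u)) = 1 - u.
rng = np.random.default_rng(1)
uu = rng.uniform(size=500_000)
mc = np.mean((1.0 - uu) ** (q - 1.0) * (1.0 + alpha * c_star * (1.0 - 2.0 * uu)) ** q)

print(norm, direct, mc)  # direct quadrature and Monte Carlo estimates agree
```

The same substitution also confirms that the concomitant-type density integrates to 1 for any |α C*| ≤ 1.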

Corollary 2.
The Tsallis entropy for the concomitant of r-th-order statistic is: The Tsallis entropy for the concomitant of r-th record value is: In the case of progressive type II censoring order statistics with equi-balanced censoring scheme, the Tsallis entropy of the concomitant of r-th-order statistic from the FGM family is (50) with m = R, the removal number.
We now discuss some Tsallis-related entropies. First, we give an Awad-type extension of Tsallis entropy, and then we focus on the residual Tsallis and past Tsallis entropies and their Awad-type extensions.
Several Awad-type extensions have been proposed in the literature [51,52]. Now, we introduce this type of extension for Tsallis entropy, and we define the Tsallis-Awad entropy for a continuous random variable X which takes values in R: where δ = sup{f(x) | x ∈ R}. We notice that the relationship between the Tsallis-Awad entropy and the Tsallis entropy is: Using (50) and (54), we can obtain the expression of the Tsallis-Awad entropy for the concomitant of the r-th GOS from the FGM family: Theorem 3. The Tsallis-Awad entropy for the concomitant of the r-th GOS from the FGM family is: where U is a U(0, 1) random variable and, in this case, δ = sup{g_{[r]}(x) | x > 0}.
Corollary 3. The Tsallis-Awad entropy for the concomitant of the r-th-order statistic is: The Tsallis-Awad entropy for the concomitant of the r-th record value is: In the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the Tsallis-Awad entropy of the concomitant of the r-th-order statistic from the FGM family is (55) with m = R, the removal number.
In a similar way to the definition of the residual Tsallis entropy (31), we can consider the past Tsallis entropy: Taking into account Theorem 2, the following theorem is naturally deduced: Theorem 4. The residual Tsallis entropy for the concomitant of the r-th GOS from the FGM family is: where U is a U(0, 1) random variable and E_U is the conditional expectation of f_Y(F_Y^{-1}(U))^{q−1} U^{k−s}, given U > F_Y(t). The past Tsallis entropy for the concomitant of the r-th GOS from the FGM family is: where U is a U(0, 1) random variable and E_U is the conditional expectation of f_Y(F_Y^{-1}(U))^{q−1} U^{k−s}, given U < F_Y(t).

Corollary 4.
The residual Tsallis entropy for the concomitant of r-th-order statistic is: The residual Tsallis entropy for the concomitant of r-th record value is: In the case of progressive type II censoring order statistics with equi-balanced censoring scheme, the residual Tsallis entropy of the concomitant of r-th-order statistic from the FGM family is (59) with m = R, the removal number. Similar results can be obtained for past Tsallis entropy.
The residual Tsallis-Awad entropy for the concomitant of the r-th record value is: In the case of progressive type II censoring order statistics with an equi-balanced censoring scheme, the residual Tsallis-Awad entropy of the concomitant of the r-th-order statistic from the FGM family is obtained analogously, with m = R, the removal number. Similar results can be obtained for the past Tsallis-Awad entropy.

Fisher-Tsallis Information Number
Various generalizations of the FIN have been proposed; see, for example, [53][54][55]. In [53], the FIN is generalized by replacing the expectation and the logarithm with their q-variants, and in [54], a (β, q)-Fisher information is defined. We consider here the following extension of the FIN, which we call the Fisher-Tsallis information number:

I_q(X) = ∫ f(x) ((d/dx) log_q f(x))² dx,

where log_q is given by (30). This extension is of the type considered in [54], with β = 2. For the concomitants of the GOS from the FGM family, we have the following theorem, which can be seen as an extension of the results obtained in [40].

Tsallis Divergence
The Tsallis divergence of Z_2 from Z_1, with densities f_1 and f_2, can be defined as:

D_q(Z_1‖Z_2) = (1/(q − 1)) (∫ f_1(x)^q f_2(x)^{1−q} dx − 1).

When q → 1, Tsallis entropy becomes Shannon entropy, and also Tsallis divergence becomes the Kullback-Leibler divergence (33). The next theorem generalizes the results from [23], computing the Tsallis divergence for two concomitants of GOS from the FGM family.
Theorem 7. Let Y_{[r]} and Y_{[s]} be the r-th and the s-th concomitants of the GOS from the FGM family, with densities g_{[r]} and g_{[s]}. Then, the Tsallis divergence of g_{[s]} from g_{[r]} has the following form: where the constant D_1 involves the factor 1/(q(q + 1)) and the Gauss hypergeometric function 2F1.

Conclusions
This paper is focused on information measures related to Shannon entropy, Tsallis entropy, Fisher information, and divergences for the concomitants of GOS from the FGM family. We review the literature on the mentioned information measures and generalize existing results. The study of concomitants, the values of the second variate paired with the order statistics of the first variate in a sample from a bivariate distribution, could have applications in reliability, for example, in the analysis of the lifetime uncertainty of complex systems. For this reason, we also discuss residual and past versions of the entropies. Considering generalized order statistics (GOS) increases the complexity of the computations, but it yields a general form of the computed measures that can be applied to the concomitants of various types of order statistics.
Author Contributions: All authors contributed equally to the paper. All authors have read and agreed to the published version of the manuscript.