Article

Further Properties of Tsallis Entropy and Its Application

1 Department of Mathematical Sciences, College of Science, Princess Nourah Bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia
2 Department of Statistics and Operations Research, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
* Author to whom correspondence should be addressed.
Entropy 2023, 25(2), 199; https://doi.org/10.3390/e25020199
Submission received: 24 November 2022 / Revised: 8 January 2023 / Accepted: 15 January 2023 / Published: 19 January 2023
(This article belongs to the Special Issue Entropy in Fluids)

Abstract

The Tsallis entropy is an alternative to the Shannon entropy as a measure of uncertainty. The present work studies some additional properties of this measure and then establishes its connection with the usual stochastic order. Some properties of the dynamical (residual) version of this measure are also investigated. It is well known that systems with longer lifetimes and less uncertainty are preferred, and that the reliability of a system usually decreases as its uncertainty increases. Since the Tsallis entropy measures uncertainty, these remarks lead us to study the Tsallis entropy of the lifetimes of coherent systems and of mixed systems whose components have independent and identically distributed (iid) lifetimes. Finally, we give some bounds on the Tsallis entropy of such systems and clarify their applicability.

1. Introduction

The Shannon entropy of a probability distribution has found critical applications in numerous areas, as described in Shannon's seminal work [1]. Information theory provides an uncertainty measure associated with unpredictable phenomena. If $X$ is a non-negative random variable (rv) with absolutely continuous cumulative distribution function (cdf) $F(x)$ and probability density function (pdf) $f(x)$, the Shannon entropy is defined as

$H(X) = H(f) = -\int_0^\infty f(x)\log f(x)\,\mathrm{d}x,$  (1)

provided the integral is finite. The Tsallis entropy of order $\alpha$ generalizes the Shannon entropy and is defined by (see [2])

$H_\alpha(X) = H_\alpha(f) = \frac{1}{\bar{\alpha}}\left(\int_0^\infty f^\alpha(x)\,\mathrm{d}x - 1\right) = \frac{1}{\bar{\alpha}}\left[E\left(f^{\alpha-1}(F^{-1}(U))\right) - 1\right],$  (2)

where $\bar{\alpha} = 1-\alpha$ and $\alpha \in D_\alpha = (0,1)\cup(1,\infty)$; here $E(\cdot)$ denotes expectation, $U$ is uniformly distributed on $[0,1]$, and $F^{-1}(u)$, $u \in [0,1]$, is the quantile function. In general, the Tsallis entropy can be negative; however, it is nonnegative when a proper value of $\alpha$ is chosen. It is evident that $H(f) = \lim_{\alpha \to 1} H_\alpha(f)$, so the Shannon entropy is recovered in the limit.
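As a quick numerical illustration, Equation (2) can be evaluated by quadrature and the limit $\alpha \to 1$ checked against the Shannon entropy. The following is a minimal sketch (the helper names and the exponential test case are our choices, not from the paper):

```python
import numpy as np
from scipy import integrate

# Tsallis entropy of order alpha for a pdf supported on [0, inf), Eq. (2).
def tsallis_entropy(pdf, alpha):
    integral, _ = integrate.quad(lambda x: pdf(x) ** alpha, 0, np.inf)
    return (integral - 1.0) / (1.0 - alpha)

# Shannon entropy, Eq. (1), for comparison in the limit alpha -> 1.
def shannon_entropy(pdf):
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    return integrate.quad(integrand, 0, np.inf)[0]

# Standard exponential: the closed form is H_alpha = 1/alpha.
pdf = lambda x: np.exp(-x)
print(tsallis_entropy(pdf, 0.5))    # -> 2.0
print(tsallis_entropy(pdf, 2.0))    # -> 0.5
print(tsallis_entropy(pdf, 1.001))  # -> ~0.999, approaching the Shannon value
print(shannon_entropy(pdf))         # -> 1.0
```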
It is known that the Shannon entropy is additive. The Tsallis entropy is, however, non-additive, since for independent rv's $X$ and $Y$

$H_\alpha(X,Y) = H_\alpha(X) + H_\alpha(Y) + \bar{\alpha}\, H_\alpha(X) H_\alpha(Y).$  (3)
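This pseudo-additivity is easy to verify numerically: for independent rv's the joint pdf factorizes, so the double integral of $(fg)^\alpha$ splits into a product of one-dimensional integrals. A small check, reusing the helpers above (the chosen rates are arbitrary):

```python
alpha, abar = 0.5, 0.5                # abar = 1 - alpha
f = lambda x: np.exp(-x)              # X ~ exp(1)
g = lambda y: 2.0 * np.exp(-2.0 * y)  # Y ~ exp(2), independent of X
hx, hy = tsallis_entropy(f, alpha), tsallis_entropy(g, alpha)
ix = integrate.quad(lambda x: f(x) ** alpha, 0, np.inf)[0]
iy = integrate.quad(lambda y: g(y) ** alpha, 0, np.inf)[0]
h_joint = (ix * iy - 1.0) / (1.0 - alpha)  # Tsallis entropy of the pair (X, Y)
print(np.isclose(h_joint, hx + hy + abar * hx * hy))  # -> True
```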
Due to the flexibility of the Tsallis entropy compared to the Shannon entropy, the non-additive entropy measures find their justification in many areas of information theory, physics, chemistry, and technology.
Several properties and statistical aspects of the Tsallis entropy can be found in [3,4,5,6]. Measures of uncertainty based on the Shannon differential entropy and the Kullback–Leibler discrimination, used for comparing systems in terms of the uncertainty in predicting their lifetimes, were developed in [7,8,9]. A similar line of work on the subject was pursued in [10]. Furthermore, Tsallis entropy properties of order statistics were studied in [11]. We aim here to continue the research within an analogous framework, studying the Tsallis entropy of coherent and mixed systems in the iid case.
The rest of the article is organized as follows. In Section 2, we first study some essential properties of the Tsallis entropy of order $\alpha$ and then establish sufficient conditions under which the usual stochastic order implies the Tsallis entropy order. Section 3 examines various properties of the dynamical (residual) version in detail. Section 4 discusses the Tsallis entropy and its properties for coherent and mixed structures in the iid case. Finally, bounds on the Tsallis entropy of system lifetimes are also given.
The stochastic orders $\leq_{st}$, $\leq_{hr}$, and $\leq_{d}$, known as the usual stochastic order, the hazard rate order, and the dispersive order, respectively, will be used in the rest of the paper (see Shaked and Shanthikumar [12]). The following implications hold:

$\leq_{hr} \;\Rightarrow\; \leq_{st} \quad \text{and} \quad \leq_{d} \;\Rightarrow\; \leq_{st}.$

The dispersive order is recognized as an order of distributional variability.

2. Properties of Tsallis Entropy

Below are further useful properties of the measure. First, the Tsallis entropy can be expressed in terms of the hazard rate and the proportional hazards (PH) model. For this purpose, if $X \in \mathbb{R}^+$ has survival function (sf) $S(x) = 1 - F(x)$ and hazard rate function $\lambda(x) = f(x)/S(x)$, the Tsallis entropy can be written as

$H_\alpha(f) = \frac{1}{1-\alpha}\left[\frac{E\left(\lambda^{\alpha-1}(X_\alpha)\right)}{\alpha} - 1\right],$

for all $\alpha \in D_\alpha$, where $X_\alpha$ denotes the rv with pdf

$p_\alpha(x) = \alpha\, f(x)\, S^{\alpha-1}(x), \quad x > 0.$  (4)
In Table 1, we provide the Tsallis entropy of some well-known distributions; the snippet below spot-checks one of these closed forms numerically.
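As a sanity check on the closed forms in Table 1, each entry can be compared with direct quadrature. A minimal sketch for the gamma entry, reusing the helpers above (the parameter values are arbitrary choices):

```python
from scipy.special import gamma as G

# Gamma(k, lam) pdf and the closed-form Tsallis entropy from Table 1.
alpha, k, lam = 0.7, 2.5, 1.3
gamma_pdf = lambda x: lam ** k / G(k) * x ** (k - 1) * np.exp(-lam * x)
closed_form = (lam ** (alpha - 1) * G(alpha * (k - 1) + 1)
               / (alpha ** (alpha * (k - 1) + 1) * G(k) ** alpha) - 1) / (1 - alpha)
print(np.isclose(tsallis_entropy(gamma_pdf, alpha), closed_form))  # -> True
```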
It is worth noting that the pdf given in (4) is, for $\alpha = 2$, the pdf of the minimum of two rv's in the iid case (see [13]). We note that the Tsallis entropy is invariant under one-to-one transformations of the underlying rv in the discrete case, but not in the continuous case. Specifically, if $\phi(\cdot): \mathbb{R} \to \mathbb{R}$ is a one-to-one function and $X_2 = \phi(X_1)$, then for the Shannon entropy (see, e.g., Ebrahimi et al. [14])
$H(X_2) = H(X_1) - E\left[\log J_\phi(X_2)\right],$

where $J_\phi(X_2) = \left|\frac{\mathrm{d}\phi^{-1}(X_2)}{\mathrm{d}X_2}\right|$ is the Jacobian of the transformation. It is evident that $f_{X_2}(x) = f_{X_1}(\phi^{-1}(x))\left|\frac{1}{\phi'(\phi^{-1}(x))}\right|$. Hence, one readily finds that

$H_\alpha(X_2) = \frac{1}{1-\alpha}\left(\int_0^\infty f_{X_2}^\alpha(x)\,\mathrm{d}x - 1\right) = \frac{1}{1-\alpha}\left(\int_0^\infty f_{X_1}^\alpha(\phi^{-1}(x))\,\frac{1}{|\phi'(\phi^{-1}(x))|^\alpha}\,\mathrm{d}x - 1\right) = \frac{1}{1-\alpha}\left(\int_0^\infty \frac{f_{X_1}^\alpha(u)}{|\phi'(u)|^{\alpha-1}}\,\mathrm{d}u - 1\right),$

for all $\alpha \in D_\alpha$. The Shannon entropy is scale-dependent but location-free; that is, $X$ has the same differential entropy as $X + b$ for any $b \in \mathbb{R}$. The same holds for the Tsallis entropy with respect to location. Indeed, for all $a > 0$ and $b \in \mathbb{R}$, the above relation gives

$H_\alpha(aX + b) = a^{1-\alpha} H_\alpha(X) + \frac{a^{1-\alpha} - 1}{1-\alpha}.$
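A quick numerical check of this scale–location rule, reusing the helpers above (the values $a = 3$, $b = 2$, $\alpha = 0.5$ and the exponential test case are arbitrary choices):

```python
a, b, alpha = 3.0, 2.0, 0.5
pdf_x = lambda x: np.exp(-x)                                   # X ~ exp(1)
pdf_y = lambda y: np.exp(-(y - b) / a) / a if y >= b else 0.0  # Y = a X + b
lhs = (integrate.quad(lambda y: pdf_y(y) ** alpha, b, np.inf)[0] - 1) / (1 - alpha)
rhs = a ** (1 - alpha) * tsallis_entropy(pdf_x, alpha) \
      + (a ** (1 - alpha) - 1) / (1 - alpha)
print(np.isclose(lhs, rhs))  # -> True
```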
Now, we recall the following order based on the Tsallis entropy, which can be found in [4] in greater detail.
Definition 1. 
Suppose $X_1, X_2 \in \mathbb{R}^+$ have cdfs $F_1, F_2$. The rv $X_1$ is said to be smaller than $X_2$ in the Tsallis entropy of order $\alpha$ (written $X_1 \leq_{TE} X_2$) if $H_\alpha(X_1) \leq H_\alpha(X_2)$ for all $\alpha \in D_\alpha$.
It is worth pointing out that $X_1 \leq_{TE} X_2$ indicates that the outcome of $X_1$ is more predictable than that of $X_2$ in terms of the Tsallis entropy. As an immediate consequence of (2), we have the following theorem.
Theorem 1. 
Let $X_1, X_2 \in \mathbb{R}^+$ with cdfs $F_1$ and $F_2$, respectively. If $X_1 \leq_d X_2$, then $X_1 \leq_{TE} X_2$.
Proof. 
If $X_1 \leq_d X_2$, then $f_2(F_2^{-1}(u)) \leq f_1(F_1^{-1}(u))$ for all $u \in (0,1)$. Hence, for all $\alpha > (<)\, 1$,

$(1-\alpha)H_\alpha(X_2) = \int_0^1 f_2^{\alpha-1}(F_2^{-1}(u))\,\mathrm{d}u - 1 \;\leq (\geq)\; \int_0^1 f_1^{\alpha-1}(F_1^{-1}(u))\,\mathrm{d}u - 1 = (1-\alpha)H_\alpha(X_1).$

This yields $H_\alpha(X_1) \leq H_\alpha(X_2)$ for all $\alpha \in D_\alpha$. □
The following theorem describes the effect of a transformation on the Tsallis entropy of an rv. It is analogous to Theorem 1 of [15], and hence we omit its proof.
Theorem 2. 
Let $X_1 \in \mathbb{R}^+$ have pdf $f_1(x)$, and let $X_2 = \phi(X_1)$, where $\phi: (0,\infty) \to (0,\infty)$ is a function with continuous derivative $\phi'(x)$ such that $E(X_2^2) < \infty$. If $|\phi'(x)| \geq 1$ for all $x$ in the support of $X_1$, then $H_\alpha(X_1) \leq H_\alpha(X_2)$ for all $\alpha \in D_\alpha$.
The next theorem shows that the usual stochastic order implies an ordering of Tsallis entropies under some aging constraints on the rv's involved, together with conditions on the order $\alpha$.
Theorem 3. 
If $X_1 \leq_{st} X_2$, and

(i) $X_1$ has the decreasing failure rate (DFR) property, then $H_\alpha(X_1) \leq H_\alpha(X_2)$ for all $0 < \alpha < 1$;

(ii) $X_2$ is DFR, then $H_\alpha(X_1) \leq H_\alpha(X_2)$ for all $\alpha > 1$.
Proof. 
(i) Let $X_1 \leq_{st} X_2$ and $0 < \alpha < 1$. If $X_1$ is DFR, then

$\int_0^\infty f_1^\alpha(x)\,\mathrm{d}x \;\leq\; \int_0^\infty f_2(x)\, f_1^{\alpha-1}(x)\,\mathrm{d}x \;\leq\; \left(\int_0^\infty f_2^\alpha(x)\,\mathrm{d}x\right)^{\frac{1}{\alpha}} \left(\int_0^\infty f_1^\alpha(x)\,\mathrm{d}x\right)^{\frac{\alpha-1}{\alpha}}.$  (5)

The first inequality in (5) is obtained as follows. Since $X_1$ is DFR, $f_1^{\alpha-1}(x)$ is increasing in $x$ for all $0 < \alpha < 1$; the inequality then follows from the fact that $X_1 \leq_{st} X_2$ implies $E[h(X_1)] \leq E[h(X_2)]$ for every increasing function $h$, applied to $h = f_1^{\alpha-1}$. The second inequality is obtained by using Hölder's inequality. Rearranging (5) gives $\int_0^\infty f_1^\alpha(x)\,\mathrm{d}x \leq \int_0^\infty f_2^\alpha(x)\,\mathrm{d}x$, and thus, thanks to relations (2) and (5), we have

$1 + (1-\alpha)H_\alpha(X_1) \leq 1 + (1-\alpha)H_\alpha(X_2),$

which proves the claim since $1 - \alpha > 0$. To prove (ii), we have

$\int_0^\infty f_2^\alpha(x)\,\mathrm{d}x \;\leq\; \int_0^\infty f_1(x)\, f_2^{\alpha-1}(x)\,\mathrm{d}x \;\leq\; \left(\int_0^\infty f_1^\alpha(x)\,\mathrm{d}x\right)^{\frac{1}{\alpha}} \left(\int_0^\infty f_2^\alpha(x)\,\mathrm{d}x\right)^{\frac{\alpha-1}{\alpha}}.$

The first inequality holds because $X_2$ is DFR, so $f_2^{\alpha-1}(x)$ is decreasing in $x$ for all $\alpha > 1$, while the second inequality is given by Hölder's inequality. The result now follows as in part (i), noting that $1 - \alpha < 0$. □
It is worth noting that Theorem 3 can be applied to several statistical models, such as the Weibull, Rayleigh, Pareto, gamma, and half-logistic distributions, among others: these models possess the DFR aging property for suitable choices of their parameters.

3. Properties of Residual Tsallis Entropy

Here we give an overview of the dynamical perspective of the Tsallis entropy of order $\alpha$. We note that in this section the term "decreasing" is used for "non-increasing", and "increasing" for "non-decreasing". Suppose that $X$ is the life length of a new system. In this case, the Tsallis entropy $H_\alpha(X)$ is appropriate for measuring the uncertainty of such a system. However, if the system is still alive at age $t$, then $H_\alpha(X)$ is no longer appropriate for measuring the uncertainty of the remaining, or residual, lifetime. Thus, let $X_t := [X - t \mid X \geq t]$ denote the residual lifetime of an item, with conditional pdf $f(x;t) = f(x)/S(t)$, $x \in D_t$, where $D_t = \{x : x > t\}$. The residual Tsallis entropy is then given by

$H_\alpha(X;t) = \frac{1}{1-\alpha}\left(\int_{D_t} f^\alpha(x;t)\,\mathrm{d}x - 1\right) = \frac{1}{1-\alpha}\left(\int_t^\infty \left(\frac{f(x)}{S(t)}\right)^{\alpha} \mathrm{d}x - 1\right),$  (6)

for all $t \geq 0$ and $\alpha \in D_\alpha$. Another useful representation is as follows:

$H_\alpha(X;t) = \frac{1}{1-\alpha}\left[\frac{E\left(\lambda^{\alpha-1}(X_{\alpha,t})\right)}{\alpha} - 1\right],$  (7)

where $X_{\alpha,t}$ denotes the rv having pdf

$p_\alpha(x;t) = \alpha\, \frac{f(x)}{S(t)} \left(\frac{S(x)}{S(t)}\right)^{\alpha-1}, \quad x \geq t > 0.$
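The residual Tsallis entropy (6) is straightforward to evaluate numerically. A minimal sketch, reusing the imports above (the helper name and the exponential test case are our choices); since the exponential distribution is memoryless, $H_\alpha(X;t)$ should not depend on $t$:

```python
# Residual Tsallis entropy H_alpha(X;t) of Eq. (6) by quadrature.
def residual_tsallis(pdf, sf, alpha, t):
    integral, _ = integrate.quad(lambda x: (pdf(x) / sf(t)) ** alpha, t, np.inf)
    return (integral - 1.0) / (1.0 - alpha)

pdf = lambda x: 2.0 * np.exp(-2.0 * x)  # X ~ exp(2)
sf = lambda x: np.exp(-2.0 * x)
print([round(residual_tsallis(pdf, sf, 0.5, t), 4) for t in (0.0, 1.0, 5.0)])
# -> [0.8284, 0.8284, 0.8284]: constant in t, as memorylessness dictates
```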
Based on the measure $H_\alpha(X;t)$, a few classes of life distributions are now proposed.
Definition 2. 
The rv $X$ has increasing (decreasing) residual Tsallis entropy of order $\alpha$ ($IRTE_\alpha$ ($DRTE_\alpha$)) if $H_\alpha(X;t)$ is increasing (decreasing) in $t$ for all $\alpha \in D_\alpha$.
Roughly speaking, if a unit's lifetime distribution belongs to the $IRTE_\alpha$ ($DRTE_\alpha$) class, then its conditional pdf becomes less (more) informative as the unit ages. In the following lemma, we give the derivative of the residual Tsallis entropy with respect to $t$.
Lemma 1. 
For all $t \geq 0$, we have

$H'_\alpha(X;t) = \frac{\lambda(t)}{1-\alpha}\left[E\left(\lambda^{\alpha-1}(X_{\alpha,t})\right) - \lambda^{\alpha-1}(t)\right],$  (8)

for all $\alpha \in D_\alpha$, where the prime denotes differentiation with respect to $t$.
Remark 1. 
Let us assume that $X$ is both $IRTE_\alpha$ and $DRTE_\alpha$. Then $H'_\alpha(X;t) = 0$, and therefore, by (8), $E\left(\lambda^{\alpha-1}(X_{\alpha,t})\right) = \lambda^{\alpha-1}(t)$ for all $t \geq 0$; that is, $\lambda(t) = \lambda$ for all $t \geq 0$. This reveals that the exponential distribution is the only distribution that is simultaneously $IRTE_\alpha$ and $DRTE_\alpha$.
We now establish a relationship between the new classes and the well-known classes of life distributions with increasing (decreasing) failure rate, IFR (DFR).
Theorem 4. 
For any $X \in \mathbb{R}^+$ with pdf $f$, if $X$ has the IFR (DFR) property, then $X$ is $DRTE_\alpha$ ($IRTE_\alpha$).
Proof. 
We prove the IFR case; the DFR case can be derived similarly. Suppose $X$ is IFR and $0 < \alpha < 1$. Then $\lambda^{\alpha-1}(x)$ is decreasing, and hence

$E\left(\lambda^{\alpha-1}(X_{\alpha,t})\right) = \int_t^\infty \lambda^{\alpha-1}(x)\, p_\alpha(x;t)\,\mathrm{d}x \;\leq\; \lambda^{\alpha-1}(t),$

for $t > 0$, since $X_{\alpha,t} \geq t$. Because $1 - \alpha > 0$, the result follows from (8). When $\alpha > 1$, the above inequality is reversed, and using (8) again we obtain the same conclusion; this implies that $X$ is $DRTE_\alpha$. □
There is a large class of distributions with monotone failure rates to which the above theorem can be applied. Another important class of life distributions consists of those with an increasing failure rate in average (IFRA); in particular, $X$ is IFRA if $-\log S(x)/x$ is increasing in $x$. The following example shows that there is no implication between the proposed classes and the IFRA class of life distributions.
Example 1. 
Consider the sf of a random lifetime given by

$S(x) = 1 - (1 - e^{-0.2x})(1 - e^{-3x}), \quad x > 0,$

and hence

$f(x) = 0.2\,e^{-0.2x}(1 - e^{-3x}) + 3\,(1 - e^{-0.2x})\,e^{-3x}.$

This is the lifetime of a parallel system of two independent exponential components, which is known to be IFRA. However, the plot of the residual Tsallis entropy in Figure 1 shows that $X$ is not $DRTE_\alpha$.
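The curves in Figure 1 can be reproduced with the residual-entropy helper above; a sketch (the grid of evaluation points is an arbitrary choice):

```python
# Survival function and pdf from Example 1.
sf = lambda x: 1.0 - (1.0 - np.exp(-0.2 * x)) * (1.0 - np.exp(-3.0 * x))
pdf = lambda x: (0.2 * np.exp(-0.2 * x) * (1.0 - np.exp(-3.0 * x))
                 + 3.0 * (1.0 - np.exp(-0.2 * x)) * np.exp(-3.0 * x))
for alpha in (0.5, 2.0):
    vals = [residual_tsallis(pdf, sf, alpha, t) for t in np.linspace(0.01, 10, 6)]
    print(alpha, [round(v, 4) for v in vals])
# The values are not monotonically decreasing in t, so X is not DRTE_alpha.
```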
We now see how the residual Tsallis entropy and the hazard rate order are related.
Theorem 5. 
If $X_1 \leq_{hr} X_2$ and either $X_1$ or $X_2$ is DFR, then $H_\alpha(X_1;t) \leq H_\alpha(X_2;t)$ for all $\alpha \in D_\alpha$.
Proof. 
Suppose $X_{1,t}$ and $X_{2,t}$ denote the residual rv's with pdfs $f_{1,t}$ and $f_{2,t}$, respectively. The relation $X_1 \leq_{hr} X_2$ implies $[X_{1,\alpha} - t \mid X_{1,\alpha} > t] \leq_{st} [X_{2,\alpha} - t \mid X_{2,\alpha} > t]$, where $X_{1,\alpha}$ and $X_{2,\alpha}$ have sfs $S_1^\alpha(x)$ and $S_2^\alpha(x)$, respectively. Let us suppose $\alpha > 1$. If we assume that $X_1$ is DFR, then $\lambda_1^{\alpha-1}(x)$ is decreasing in $x$; moreover, $X_1 \leq_{hr} X_2$ implies $\lambda_1(x) \geq \lambda_2(x)$ for all $x$. Hence, we have

$E\left[\lambda_1^{\alpha-1}(X_{1,\alpha,t})\right] \;\geq\; E\left[\lambda_1^{\alpha-1}(X_{2,\alpha,t})\right] \;\geq\; E\left[\lambda_2^{\alpha-1}(X_{2,\alpha,t})\right],$

for all $t \geq 0$, where $X_{i,\alpha,t} = [X_{i,\alpha} - t \mid X_{i,\alpha} > t]$, $i = 1, 2$. From (7), we obtain $H_\alpha(X_1;t) \leq H_\alpha(X_2;t)$ for all $\alpha > 1$. If $0 < \alpha < 1$, the above inequalities are reversed, and from (7) we again obtain $H_\alpha(X_1;t) \leq H_\alpha(X_2;t)$. Hence $H_\alpha(X_1;t) \leq H_\alpha(X_2;t)$ for all $\alpha \in D_\alpha$. The proof when $X_2$ is DFR is similar. □

4. Tsallis Entropy of Coherent Structures and Mixed Structures

This section gives some aspects of the Tsallis entropy of coherent (and mixed) structures. A coherent system is one in which every component is relevant and the structure function is monotone. The k-out-of-n system is a particular case of a coherent structure. Furthermore, a mixture of coherent systems is called a mixed system (see [16]). The reliability function of the mixed system lifetime $T$, in the iid case, is represented by

$S_T(t) = \sum_{i=1}^n p_i\, S_{i:n}(t),$  (10)

where $S_{i:n}(t) = \sum_{j=0}^{i-1}\binom{n}{j}[F(t)]^j[S(t)]^{n-j}$, $i = 1,\dots,n$, are the sfs of the order statistics $X_{1:n},\dots,X_{n:n}$. The pdf of $T$ can be written as

$f_T(t) = \sum_{i=1}^n p_i\, f_{i:n}(t),$  (11)

where

$f_{i:n}(t) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\,[F(t)]^{i-1}[S(t)]^{n-i}f(t), \quad 1 \leq i \leq n.$  (12)

The vector $\mathbf{p} = (p_1,\dots,p_n)$ in $S_T(t)$ is called the system signature, where $p_i = P(T = X_{i:n})$. Note that $p_1,\dots,p_n$ are non-negative, being probabilities, and that $\sum_{i=1}^n p_i = 1$. The probability integral transform $U_{i:n} = F(X_{i:n})$ has pdf

$g_i(u) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\,u^{i-1}(1-u)^{n-i}, \quad 0 < u < 1.$  (13)

Now, the pdf of $V = F(T)$ is given by

$g_V(v) = \sum_{i=1}^n p_i\, g_i(v).$  (14)
Applying the above transformations, we obtain in the following proposition a formula for the Tsallis entropy of $T$.
Proposition 1. 
The Tsallis entropy of $T$ is

$H_\alpha(T) = \frac{1}{1-\alpha}\left(\int_0^1 g_V^\alpha(v)\, f^{\alpha-1}(F^{-1}(v))\,\mathrm{d}v - 1\right),$  (15)

where $g_V(v)$ is given in (14).
Proof. 
Using the change of variable $v = F(x)$, from (2) and (11) we have

$H_\alpha(T) = \frac{1}{1-\alpha}\left(\int_0^\infty \left(\sum_{i=1}^n p_i f_{i:n}(t)\right)^{\alpha}\mathrm{d}t - 1\right) = \frac{1}{1-\alpha}\left(\int_0^1 \left(\sum_{i=1}^n p_i\, \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)}\,v^{i-1}(1-v)^{n-i}\right)^{\alpha} f^{\alpha-1}(F^{-1}(v))\,\mathrm{d}v - 1\right) = \frac{1}{1-\alpha}\left(\int_0^1 g_V^\alpha(v)\, f^{\alpha-1}(F^{-1}(v))\,\mathrm{d}v - 1\right),$

for all $\alpha \in D_\alpha$. The result now follows. □
To apply Proposition 1, consider the next example.
Example 2. 
The vector $\mathbf{s} = (0, 0.2, 0.6, 0.2, 0)$ is the signature of a bridge system with $n = 5$ components in the iid case with baseline reliability function $S(t) = e^{-\lambda t}$. This system remains functional provided that there is a path of operational connections running from left to right. It is obvious that $f(F^{-1}(v)) = \lambda(1 - v)$, $0 \leq v \leq 1$, and we therefore have

$H_\alpha(T) = \frac{1}{1-\alpha}\left(\lambda^{\alpha-1}\int_0^1 g_V^\alpha(v)\,(1-v)^{\alpha-1}\,\mathrm{d}v - 1\right),$

for all $\alpha \in D_\alpha$. The Tsallis entropy is decreasing with respect to $\lambda$, as the uncertainty of the system's lifetime decreases with increasing $\lambda$. Moreover, for $\lambda = 1$ we have $H_{0.2}(T) = 2.6952$, $H_{1.5}(T) = 0.4670$, $H_2(T) = 0.3682$, and $H_{2.5}(T) = 0.3062$; clearly, $H_\alpha(T)$ decreases as $\alpha$ increases.
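Formula (15) is easy to implement directly. A sketch (the function names are ours; it reuses the imports above) that reproduces the bridge-system values quoted in Example 2:

```python
from math import gamma

# Eq. (15): Tsallis entropy of a mixed system with signature p and iid
# components, where fq(v) = f(F^{-1}(v)) is the density-quantile function.
def system_tsallis(p, alpha, fq):
    n = len(p)
    def g_v(v):  # Eq. (14): pdf of V = F(T)
        return sum(pi * gamma(n + 1) / (gamma(i) * gamma(n - i + 1))
                   * v ** (i - 1) * (1.0 - v) ** (n - i)
                   for i, pi in enumerate(p, start=1))
    integral, _ = integrate.quad(
        lambda v: g_v(v) ** alpha * fq(v) ** (alpha - 1), 0, 1)
    return (integral - 1.0) / (1.0 - alpha)

# Bridge system, standard exponential components: f(F^{-1}(v)) = 1 - v.
bridge = (0.0, 0.2, 0.6, 0.2, 0.0)
for a in (0.2, 1.5, 2.0, 2.5):
    print(a, round(system_tsallis(bridge, a, lambda v: 1.0 - v), 4))
# -> 2.6952, 0.4670, 0.3682, 0.3062, matching the values quoted above
```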
The system signatures of all coherent systems with 1–5 components have been calculated in [17], and therefore we can compute the values of $H_\alpha(T)$ numerically for all $\alpha \in D_\alpha$. For several values of $\alpha$, the Tsallis entropy of systems with 1–4 iid exponential components is given in Table 2, together with the corresponding lower bounds $H^L_\alpha(T)$ obtained from Theorem 9 below. The following theorem uses Equation (15) to compare the Tsallis entropies of two mixed systems with the same signature and iid component lifetimes.
Theorem 6. 
Let $T_{X_1}$ and $T_{X_2}$ be the lifetimes of two mixed systems with the same signature, each with $n$ iid component lifetimes.

(i) If $X_1 \leq_d X_2$, then $T_{X_1} \leq_{TE} T_{X_2}$.

(ii) Assume $X_1 \leq_{TE} X_2$ and consider

$B_1 = \left\{0 < v < 1 : \frac{f_2(F_2^{-1}(v))}{f_1(F_1^{-1}(v))} < 1\right\}, \qquad B_2 = \left\{0 < v < 1 : \frac{f_2(F_2^{-1}(v))}{f_1(F_1^{-1}(v))} \geq 1\right\}.$  (16)

If $B_1 = \emptyset$, $B_2 = \emptyset$, or $\inf_{v \in B_1} g_V(v) \geq \sup_{v \in B_2} g_V(v)$, then $T_{X_1} \leq_{TE} T_{X_2}$.
Proof. 
(i) By the assumption $X_1 \leq_d X_2$, we have $f_1^{\alpha-1}(F_1^{-1}(v)) \leq f_2^{\alpha-1}(F_2^{-1}(v))$ for $0 < \alpha < 1$ and the reverse inequality for $\alpha > 1$. Since the two systems have the same signature, expression (15) gives

$(1-\alpha)H_\alpha(T_{X_1}) - (1-\alpha)H_\alpha(T_{X_2}) = \int_0^1 g_V^\alpha(v)\left[f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))\right]\mathrm{d}v \;\leq\; 0,$

for all $0 < \alpha < 1$, and

$(1-\alpha)H_\alpha(T_{X_1}) - (1-\alpha)H_\alpha(T_{X_2}) = \int_0^1 g_V^\alpha(v)\left[f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))\right]\mathrm{d}v \;\geq\; 0,$

for $\alpha > 1$. In both cases $H_\alpha(T_{X_1}) \leq H_\alpha(T_{X_2})$, which completes the proof of (i).

(ii) If $B_1 = \emptyset$ or $B_2 = \emptyset$, the result is immediate. Hence, we assume $B_1 \neq \emptyset$ and $B_2 \neq \emptyset$. The assumption $X_1 \leq_{TE} X_2$ and Equation (2) imply, for $\alpha > 1$,

$\int_0^1 \left[f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))\right]\mathrm{d}v \;\geq\; 0,$  (17)

and, for $0 < \alpha < 1$,

$\int_0^1 \left[f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))\right]\mathrm{d}v \;\leq\; 0.$

Now consider

$(1-\alpha)H_\alpha(T_{X_1}) - (1-\alpha)H_\alpha(T_{X_2}) = \int_0^1 g_V^\alpha(v)\left[f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))\right]\mathrm{d}v.$

Then, for all $\alpha > 1$, writing $\Delta(v) = f_1^{\alpha-1}(F_1^{-1}(v)) - f_2^{\alpha-1}(F_2^{-1}(v))$, which is positive on $B_1$ and non-positive on $B_2$, it holds that

$\int_{B_1} g_V^\alpha(v)\,\Delta(v)\,\mathrm{d}v + \int_{B_2} g_V^\alpha(v)\,\Delta(v)\,\mathrm{d}v \;\geq\; \left(\inf_{v\in B_1} g_V(v)\right)^{\alpha}\int_{B_1}\Delta(v)\,\mathrm{d}v + \left(\sup_{v\in B_2} g_V(v)\right)^{\alpha}\int_{B_2}\Delta(v)\,\mathrm{d}v \;\geq\; \left(\sup_{v\in B_2} g_V(v)\right)^{\alpha}\int_{B_1}\Delta(v)\,\mathrm{d}v + \left(\sup_{v\in B_2} g_V(v)\right)^{\alpha}\int_{B_2}\Delta(v)\,\mathrm{d}v = \left(\sup_{v\in B_2} g_V(v)\right)^{\alpha}\int_0^1 \Delta(v)\,\mathrm{d}v \;\geq\; 0.$  (18)

In (18), the second inequality follows from the assumption $\inf_{v\in B_1} g_V(v) \geq \sup_{v\in B_2} g_V(v)$, while the last inequality follows from (17). The result for $0 < \alpha < 1$ is obtained similarly. □
The following example illustrates the above theorem.
Example 3. 
Consider the lifetime $T_X = \min\{X_1, \max\{X_2, X_3\}\}$ in the iid case with baseline cdf

$F_X(t) = 1 - e^{-2t}, \quad t > 0,$  (19)

and the lifetime $T_Z = \min\{Z_1, \max\{Z_2, Z_3\}\}$ with baseline cdf

$F_Z(t) = 1 - e^{-t}, \quad t > 0.$  (20)

Both systems have signature $\mathbf{s} = (\frac{1}{3}, \frac{2}{3}, 0)$. It can readily be seen, for all $\alpha \in D_\alpha$, that

$H_\alpha(X) = \frac{2^{\alpha-1} - \alpha}{\alpha(1-\alpha)} \quad \text{and} \quad H_\alpha(Z) = \frac{1}{\alpha}.$

So, one can see that $X \leq_{TE} Z$. Moreover, it is readily apparent that $B_1 = [0,1)$ and $B_2 = \{1\}$, and hence $\inf_{v\in B_1} g_V(v) = \sup_{v\in B_2} g_V(v) = 0$. Therefore, Part (ii) of Theorem 6 yields $T_X \leq_{TE} T_Z$.
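The ordering can also be checked directly with the Proposition 1 helper above (a sketch; for exponential components with rate $\lambda$, $f(F^{-1}(v)) = \lambda(1-v)$):

```python
sig = (1/3, 2/3, 0.0)
fq_x = lambda v: 2.0 * (1.0 - v)  # components of T_X: exp(2)
fq_z = lambda v: 1.0 - v          # components of T_Z: exp(1)
for a in (0.5, 1.5, 3.0):
    hx, hz = system_tsallis(sig, a, fq_x), system_tsallis(sig, a, fq_z)
    print(a, round(hx, 4), round(hz, 4), hx <= hz)  # -> True in each case
```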
We now show that, under the DFR property of the component lifetimes, the series system lifetime $X_{1:n}$ has Tsallis entropy less than or equal to that of any mixed system in the iid case.
Theorem 7. 
If $T$ represents a mixed system's lifetime in the iid case and the component lifetime is DFR, then $X_{1:n} \leq_{TE} T$.
Proof. 
It is known that $X_{1:n} \leq_{hr} T$. Moreover, the series system has a DFR lifetime provided that the parent component lifetime is DFR; hence, recalling [18], where it is shown that hazard rate ordering together with the DFR property implies dispersive ordering, we obtain $X_{1:n} \leq_d T$. The result now follows from Theorem 1. □

Bounds for the Tsallis Entropy of Mixed Systems

With the insights from the previous sections, we can now obtain some inequalities and bounds on the Tsallis entropy of mixed systems. We note that it is usually difficult, or sometimes impossible, to determine the Tsallis entropy exactly when the system has a complex structure function or a large number of components. This is why bounds on the measure are valuable. The following result bounds the Tsallis entropy of a mixed system in terms of the Tsallis entropy of its components.
Theorem 8. 
Suppose $H_\alpha(X) < \infty$. Then

$H_\alpha(T) \geq \left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha} H_\alpha(X) + \frac{\left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha} - 1}{1-\alpha},$  (21)

for all $\alpha > 1$, and

$H_\alpha(T) \leq \left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha} H_\alpha(X) + \frac{\left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha} - 1}{1-\alpha},$  (22)

for $0 < \alpha < 1$.
Proof. 
Recalling relation (15), for all $\alpha \in D_\alpha$ we have

$1 + (1-\alpha)H_\alpha(T) = \int_0^1 g_V^\alpha(v)\, f^{\alpha-1}(F^{-1}(v))\,\mathrm{d}v \;\leq\; \left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha}\int_0^1 f^{\alpha-1}(F^{-1}(v))\,\mathrm{d}v = \left(\sup_{v\in(0,1)} g_V(v)\right)^{\alpha}\left[(1-\alpha)H_\alpha(X) + 1\right],$

where the last equality is obtained from (2). Dividing both sides by $1-\alpha$, which is negative for $\alpha > 1$ and positive for $0 < \alpha < 1$, yields (21) and (22), respectively. □
If the number of components of a system is very large, or the system has a very complicated design, the bounds provided in (21) and (22) are very useful in such circumstances. Next, we find a further lower bound on the Tsallis entropy of the system lifetime by applying the properties of the Tsallis entropy.
Theorem 9. 
Under the requirements of Theorem 8, we have

$H_\alpha(T) \geq H_\alpha^L(T),$  (23)

where $H_\alpha^L(T) = \sum_{i=1}^n p_i\, H_\alpha(X_{i:n})$ for all $\alpha \in D_\alpha$, and $H_\alpha(X_{i:n})$ is the Tsallis entropy of the $i$-th order statistic.
Proof. 
Recalling Jensen's inequality for the function $t^\alpha$, which is concave (convex) for $0 < \alpha < 1$ ($\alpha > 1$), it holds that

$\left(\sum_{i=1}^n p_i f_{i:n}(t)\right)^{\alpha} \;\geq (\leq)\; \sum_{i=1}^n p_i f_{i:n}^\alpha(t), \quad t > 0,$

and hence we obtain

$\int_0^\infty f_T^\alpha(t)\,\mathrm{d}t \;\geq (\leq)\; \sum_{i=1}^n p_i \int_0^\infty f_{i:n}^\alpha(t)\,\mathrm{d}t.$  (24)

Since $1-\alpha > 0$ ($1-\alpha < 0$), multiplying both sides of (24) by $1/(1-\alpha)$ gives

$H_\alpha(T) \geq \frac{1}{1-\alpha}\left(\sum_{i=1}^n p_i \int_0^\infty f_{i:n}^\alpha(t)\,\mathrm{d}t - 1\right) = \frac{1}{1-\alpha}\left(\sum_{i=1}^n p_i \int_0^\infty f_{i:n}^\alpha(t)\,\mathrm{d}t - \sum_{i=1}^n p_i\right) = \sum_{i=1}^n p_i\,\frac{1}{1-\alpha}\left(\int_0^\infty f_{i:n}^\alpha(t)\,\mathrm{d}t - 1\right) = \sum_{i=1}^n p_i\, H_\alpha(X_{i:n}),$
and this completes the proof. □
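The bound of Theorem 9 can be computed from the same Proposition 1 helper by feeding it the degenerate signatures $e_i$, since $X_{i:n}$ is the lifetime of the $i$-out-of-$n$ system. A sketch (the names are ours), using the signature of Example 3 with standard exponential components:

```python
# H^L_alpha(T) = sum_i p_i * H_alpha(X_{i:n}), Eq. (23).
def tsallis_lower_bound(p, alpha, fq):
    n = len(p)
    bound = 0.0
    for i, pi in enumerate(p):
        e_i = tuple(1.0 if j == i else 0.0 for j in range(n))
        bound += pi * system_tsallis(e_i, alpha, fq)
    return bound

sig, fq = (1/3, 2/3, 0.0), (lambda v: 1.0 - v)  # standard exponential components
for a in (0.5, 1.5, 3.0):
    print(a, round(system_tsallis(sig, a, fq), 4), ">=",
          round(tsallis_lower_bound(sig, a, fq), 4))  # exact value >= bound
```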
The bound in (23) is driven by the pdf of the system lifetime: it is a convex combination of the Tsallis entropies of the $i$-out-of-$n$ systems. Moreover, when both the lower bound in Theorem 8 and the lower bound in Theorem 9 can be determined, one may adopt the maximum of the two lower bounds.
Example 4. 
Let us consider a system with signature $\mathbf{s} = (\frac{1}{3}, \frac{2}{3}, 0)$ and lifetime $T$, whose $n = 3$ iid component lifetimes have the common cdf given in (20). One can compute $\sup_{v\in(0,1)} g_V(v) = 1.333$, and hence, recalling Theorem 8, the Tsallis entropy of $T$ is bounded as follows:

$H_\alpha(T) \geq (\leq)\; (1.333)^{\alpha} H_\alpha(X) + \frac{(1.333)^{\alpha} - 1}{1-\alpha},$  (25)

for $\alpha > 1$ ($0 < \alpha < 1$). The component lifetimes are thus standard exponential. In this case, we plotted the bound (25), the bound (23), and the true value of $H_\alpha(T)$ obtained from relation (15); these are shown in Figure 2. As can be seen, the lower bound in (25) (dotted line) outperforms (23) for $\alpha > 2$.

5. Conclusions

Extensions of the Shannon entropy have been presented in the literature before. The Tsallis entropy, which can be considered an alternative measure of uncertainty, is one of these extended entropy measures. We have described here some further properties of this measure. We first determined the Tsallis entropy of some known distributions and then showed that it is not invariant under one-to-one transformations of the lifetime in the continuous case. Its connection to the usual stochastic order has been revealed. It is well known that systems with longer lifetimes and lower uncertainty are preferable, and that the reliability of a system usually decreases as its uncertainty increases. These considerations led us to study the Tsallis entropy of coherent structures and mixed systems in the iid case. Finally, we established some bounds on the Tsallis entropy of the systems and illustrated the usefulness of the given bounds.

Author Contributions

Conceptualization, G.A.; Formal analysis, M.K.; Investigation, M.K.; Methodology, G.A.; Project administration, G.A.; Resources, M.K.; Software, M.K.; Supervision, G.A.; Validation, M.K.; Writing—original draft, M.K.; Writing—review & editing, G.A. All authors have read and agreed to the published version of the manuscript.

Funding

Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2023R226), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors are grateful to the three anonymous Reviewers for their comments and suggestions, which led to this improved version of the paper. The authors extend their sincere appreciation to Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2023R226), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  2. Tsallis, C. Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  3. Kumar, V.; Taneja, H. A generalized entropy-based residual lifetime distributions. Int. J. Biomath. 2011, 4, 171–184.
  4. Nanda, A.K.; Paul, P. Some results on generalized residual entropy. Inf. Sci. 2006, 176, 27–47.
  5. Wilk, G.; Wlodarczyk, Z. Example of a possible interpretation of Tsallis entropy. Phys. A Stat. Mech. Its Appl. 2008, 387, 4809–4813.
  6. Zhang, Z. Uniform estimates on the Tsallis entropies. Lett. Math. Phys. 2007, 80, 171–181.
  7. Toomaj, A.; Doostparast, M. A note on signature-based expressions for the entropy of mixed r-out-of-n systems. Nav. Res. Logist. 2014, 61, 202–206.
  8. Toomaj, A. Rényi entropy properties of mixed systems. Commun. Stat.-Theory Methods 2017, 46, 906–916.
  9. Toomaj, A.; Crescenzo, A.D.; Doostparast, M. Some results on information properties of coherent systems. Appl. Stoch. Model. Bus. Ind. 2018, 34, 128–143.
  10. Toomaj, A.; Zarei, R. Some new results on information properties of mixture distributions. Filomat 2017, 31, 4225–4230.
  11. Baratpour, S.; Khammar, A. Tsallis entropy properties of order statistics and some stochastic comparisons. J. Stat. Res. Iran 2016, 13, 25–41.
  12. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer Science and Business Media: Berlin/Heidelberg, Germany, 2007.
  13. Arnold, B.C.; Balakrishnan, N.; Nagaraja, H.N. A First Course in Order Statistics; SIAM: New York, NY, USA, 2008.
  14. Ebrahimi, N.; Soofi, E.S.; Soyer, R. Information measures in perspective. Int. Stat. Rev. 2010, 78, 383–412.
  15. Ebrahimi, N.; Maasoumi, E.; Soofi, E.S. Ordering univariate distributions by entropy and variance. J. Econom. 1999, 90, 317–336.
  16. Samaniego, F.J. System Signatures and Their Applications in Engineering Reliability; Springer Science and Business Media: Berlin/Heidelberg, Germany, 2007; Volume 110.
  17. Navarro, J.; del Aguila, Y.; Asadi, M. Some new results on the cumulative residual entropy. J. Stat. Plan. Inference 2010, 140, 310–322.
  18. Bagai, I.; Kochar, S.C. On tail-ordering and comparison of failure rates. Commun. Stat.-Theory Methods 1986, 15, 1377–1388.
Figure 1. The residual Tsallis entropy of order α = 0.5 (left panel) and α = 2 (right panel) with respect to t, as given in Example 1.
Figure 2. The bound in (25) (dotted line) and the lower bound given in (23) (dashed line), along with the exact value of $H_\alpha(T)$ for the standard exponential distribution, as functions of the order α.
Table 1. Tsallis entropy for some prominent distributions.

Uniform: $f(x) = \frac{1}{\beta}$, $0 < x < \beta$; $H_\alpha(f) = \frac{\beta^{1-\alpha} - 1}{1-\alpha}$, $\beta > 0$.

Gamma: $f(x) = \frac{\lambda^k}{\Gamma(k)}\,x^{k-1}e^{-\lambda x}$, $x > 0$; $H_\alpha(f) = \frac{1}{1-\alpha}\left[\frac{\lambda^{\alpha-1}\,\Gamma(\alpha(k-1)+1)}{\alpha^{\alpha(k-1)+1}\,\Gamma^{\alpha}(k)} - 1\right]$, $k > 0$.

Weibull: $f(x) = \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1}e^{-(x/\lambda)^k}$, $x > 0$; $H_\alpha(f) = \frac{1}{1-\alpha}\left[\left(\frac{k}{\lambda}\right)^{\alpha-1}\frac{\Gamma\!\left(\frac{\alpha(k-1)+1}{k}\right)}{\alpha^{\frac{\alpha(k-1)+1}{k}}} - 1\right]$, $k > 0$.

Beta: $f(x) = \frac{1}{B(a,b)}\,x^{a-1}(1-x)^{b-1}$, $0 < x < 1$; $H_\alpha(f) = \frac{1}{1-\alpha}\left[\frac{B(\alpha(a-1)+1,\,\alpha(b-1)+1)}{B^{\alpha}(a,b)} - 1\right]$, $a, b > 0$.

$B(a,b) = \Gamma(a)\Gamma(b)/\Gamma(a+b)$ denotes the beta function and $\Gamma(\cdot)$ the gamma function.
Table 2. Comparison of the Tsallis entropy $H_\alpha(T)$ and the lower bound $H^L_\alpha(T)$ for some values of α (the 28 coherent systems with 1–4 iid standard exponential components).

N | p | $H_{0.5}(T)$ | $H_{1.5}(T)$ | $H_{3}(T)$ | $H^L_{0.5}(T)$ | $H^L_{1.5}(T)$ | $H^L_{3}(T)$
1 | (1) | 2.0000 | 0.6666 | 0.4000 | 2.0000 | 0.6666 | 0.4000
2 | (1,0) | 0.8284 | 0.1143 | −0.1666 | 0.8284 | 0.1143 | −0.1666
3 | (0,1) | 2.4428 | 0.8892 | 0.4333 | 2.4428 | 0.8892 | 0.4333
4 | (1,0,0) | 0.3094 | −0.3094 | −1.0000 | 0.3094 | −0.3094 | −1.0000
5 | (1/3,2/3,0) | 1.1111 | 0.3948 | 0.2023 | 0.9470 | 0.2348 | −0.1428
6 | (0,1,0) | 1.2659 | 0.5069 | 0.2857 | 1.2659 | 0.5069 | 0.2857
7 | (0,2/3,1/3) | 2.0698 | 0.7412 | 0.3845 | 1.7169 | 0.6527 | 0.3392
8 | (0,0,1) | 2.6188 | 0.9442 | 0.4464 | 2.6188 | 0.9442 | 0.4464
9 | (1,0,0,0) | 0.0000 | −0.6666 | −2.1666 | 0.0000 | −0.6666 | −2.1666
10 | (1/2,1/2,0,0) | 0.5030 | −0.0375 | −0.3242 | 0.3603 | −0.2260 | −1.0515
11 | (1/4,3/4,0,0) | 0.6407 | 0.1431 | −0.0121 | 0.5405 | −0.0057 | −0.4939
12 | (1/4,7/12,1/6,0) | 0.9281 | 0.2788 | 0.1081 | 0.6644 | 0.0608 | −0.4471
13 | (1/4,1/4,1/2,0) | 1.2514 | 0.4917 | 0.2732 | 0.9122 | 0.1941 | −0.3536
14 | (0,1,0,0) | 0.7206 | 0.2145 | 0.0636 | 0.7206 | 0.2145 | 0.0636
15 | (0,5/6,1/6,0) | 0.9768 | 0.3318 | 0.1601 | 0.8445 | 0.2811 | 0.1103
16,17 | (0,2/3,1/3,0) | 1.1415 | 0.4292 | 0.2333 | 0.9684 | 0.3478 | 0.1571
18,19 | (0,1/2,1/2,0) | 1.2659 | 0.5069 | 0.2857 | 1.0924 | 0.4144 | 0.2038
20,21 | (0,1/3,2/3,0) | 1.3612 | 0.5646 | 0.3199 | 1.2163 | 0.4810 | 0.2506
22 | (0,1/6,5/6,0) | 1.4299 | 0.6012 | 0.3385 | 1.3402 | 0.5477 | 0.2974
23 | (0,0,1,0) | 1.4641 | 0.6143 | 0.3441 | 1.4641 | 0.6143 | 0.3441
24 | (0,1/2,1/4,1/4) | 2.0179 | 0.6950 | 0.3593 | 1.4044 | 0.5031 | 0.2307
25 | (0,1/6,7/12,1/4) | 2.1010 | 0.7655 | 0.3958 | 1.6522 | 0.6364 | 0.3242
26 | (0,0,3/4,1/4) | 2.0979 | 0.7653 | 0.3959 | 1.7761 | 0.7030 | 0.3709
27 | (0,0,1/2,1/2) | 2.4161 | 0.8751 | 0.4290 | 2.0882 | 0.7917 | 0.3978
28 | (0,0,0,1) | 2.7123 | 0.9691 | 0.4515 | 2.7123 | 0.9691 | 0.4515

