Article

Rényi Entropy for Past Lifetime Distributions with Application in Inactive Coherent Systems

Department of Statistics and Operations Research, College of Science, King Saud University, P.O. Box 2455, Riyadh 11451, Saudi Arabia
* Author to whom correspondence should be addressed.
Symmetry 2023, 15(7), 1310; https://doi.org/10.3390/sym15071310
Submission received: 1 June 2023 / Revised: 20 June 2023 / Accepted: 24 June 2023 / Published: 26 June 2023

Abstract

In parallel with the concept of Rényi entropy for residual lifetime distributions, the Rényi entropy of the inactivity time of lifetime distributions, which are typically asymmetric, is a useful measure of independent interest. For a system that is found to be inactive at time t, the past entropy serves as an uncertainty measure for the past lifetime distribution. In this study, we consider a coherent system of n components with the property that all components of the system have failed at time t. To assess the predictability of the coherent system’s lifetime, we use the system’s signature to determine the Rényi entropy of its past lifetime. We derive several analytical results, including expressions, bounds, and ordering properties for this measure.

1. Introduction

An accurate quantification of the uncertainty in system lifetime is an important task for engineers engaged in survival analysis. The importance of reducing uncertainty and increasing system lifetime is widely recognized, with longer lifetimes and lower uncertainties being key indicators of higher system reliability (see, e.g., [1]). The problem of extending the life cycle of engineering systems is extremely important and has serious practical applications. To this end, we use the concept of Rényi entropy, which measures the uncertainty associated with a non-negative continuous random variable (rv) $X$ with probability density function (pdf) $f$. The Rényi entropy of order $\alpha$ is defined as
\[
H_\alpha(X) = H_\alpha(f) = \frac{1}{1-\alpha}\log\int_0^{\infty} f^{\alpha}(x)\,dx, \quad \alpha > 0, \tag{1}
\]
where “log” stands for the natural logarithm. In general, the uncertainty in the lifetime of a device or organism can be described by a non-negative rv $X$ with a pdf $f$. This is different from the uncertainty associated with a probability $p$, which is measured by the Shannon entropy $H(p)$. The notion of uncertainty associated with a distribution, which is well defined for a discrete distribution, should be used with caution for a continuous distribution, and even more so for the Rényi entropy. In the continuous case, it is important to note that, unlike the discrete case, the Rényi entropy need not be positive. A probability distribution is typically used to quantify uncertainty and depends on information about the probability that the uncertain quantity takes a single, true value. The frequency of instances of the variable, obtained from observed data, is used to quantify the variation. In our case, the variable is the random lifetime of the device, and the uncertainty is a function of event probabilities. Events, in the context of rvs, are those where a random lifetime takes certain numerical values.
Rényi entropy has many applications in measuring uncertainty in dynamical systems and has also proved to be a useful criterion in optimization problems (see, e.g., Erdogmus and Principe [2], Lake [3], Bashkirov [4], Henríquez et al. [5], Guo et al. [6], Ampilova and Soloviev [7], Koltcov [8], and Wang et al. [9]). Shannon’s differential entropy (Shannon [10]), a fundamental concept in information theory, is recovered in the limit:
\[
H(X) = \lim_{\alpha \to 1} H_\alpha(X) = -\int_0^{\infty} f(x)\log f(x)\,dx. \tag{2}
\]
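To make the definitions concrete, the following Python sketch (our own illustration, assuming NumPy and SciPy are available; the helper name renyi_entropy is ours, not from the paper) evaluates Equation (1) for a standard exponential lifetime, for which $H_\alpha(X) = \log\alpha/(\alpha-1)$ in closed form, and recovers the Shannon value (2) as $\alpha \to 1$:

```python
# A minimal numerical sketch of Eqs. (1) and (2); assumes X ~ Exp(1),
# for which H_alpha(X) = log(alpha)/(alpha - 1) in closed form.
import numpy as np
from scipy.integrate import quad

def renyi_entropy(pdf, alpha):
    """Rényi entropy of order alpha, Eq. (1), by numerical integration."""
    integral, _ = quad(lambda x: pdf(x) ** alpha, 0.0, np.inf)
    return np.log(integral) / (1.0 - alpha)

f = lambda x: np.exp(-x)                  # standard exponential pdf
print(renyi_entropy(f, 0.5))              # log(0.5)/(-0.5) ~ 1.3863
print(renyi_entropy(f, 2.0))              # log(2)          ~ 0.6931
print(renyi_entropy(f, 1.0 + 1e-6))       # -> Shannon entropy of Exp(1) = 1
```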
For a fresh system with lifetime $X$, the Rényi entropy $H_\alpha(X)$ is a useful measure of uncertainty. However, in certain scenarios, practitioners may know the current age of the system. For example, they know that the system has been in operation for a certain period of time $t$ and want to measure the uncertainty induced by the remaining lifetime of the system, denoted by $X_t = [X - t \mid X > t]$. In such cases, the Rényi entropy of $X$ is no longer an appropriate measure of uncertainty. To address this problem, the residual Rényi entropy is used, which is specifically designed to measure the uncertainty associated with the remaining lifetime of a system (see [11]):
\[
H_\alpha(X_t) = \frac{1}{1-\alpha}\log\int_0^{\infty} f_t^{\alpha}(x)\,dx = \frac{1}{1-\alpha}\log\int_t^{\infty} \left(\frac{f(x)}{S(t)}\right)^{\alpha} dx, \tag{3}
\]
for all $\alpha > 0$, where $S(t)$ denotes the survival function of the rv $X$. The concept of $H_\alpha(X_t)$ captures a fascinating aspect of lifetime units in reliability engineering, as the behavior of its fluctuations with respect to $t$ (the current age of an item with original lifetime $X$) may be helpful in building models. This area of research has attracted the attention of researchers in various fields of science and engineering. This entropy measure is a generalization of the classical Shannon entropy, and it has been shown to have numerous valuable properties and applications in different contexts. In this area, Asadi et al. [12], Gupta and Nanda [13], Nanda and Paul [14], Mesfioui et al. [15], and many other researchers have studied the properties and applications of $H_\alpha(X_t)$.
Uncertainty is a pervasive feature of a given parameter in real systems, such as their random lifetime, and its effects are felt not only in the future but also in the past. Even if there are facts in the past that we are not aware of, uncertainty remains. There are many real situations in nature, in society, in history, in geology, in other branches of science, and even in medicine, where there is no information about the exact timing of some past event; for example, the exact time when a disease began in a person’s body. This gives rise to a complementary notion of entropy, which captures the uncertainty of past events and is distinct from residual entropy, which describes the uncertainty about future events. As another example, consider a system that is observed only at certain inspection times. If the system is inspected for the first time at time $t$ and is found in a failed state, the uncertainty relates to the past, more specifically to the time of failure within the interval $[0, t]$. Moreover, when an aircraft is discovered in a non-functional state, it is critical to quantify the degree of uncertainty associated with this situation, which lies in the past: the uncertainty concerns the exact point in the aircraft’s operational history that led to its current condition. Therefore, it is appropriate to introduce a complementary notion of uncertainty that refers to past events rather than future events and is distinct from residual entropy. Let $(Z \mid Z \in A)$ denote an rv with the conditional distribution of $Z$ given that $Z$ lies in $A$, where $A$ is a subset of $\mathbb{R}$ such that $P(Z \in A) > 0$. Suppose that $X$ is the random lifetime of a fresh system with cumulative distribution function (cdf) $F$, and suppose that an inspection at time $t$ finds the system inactive. Then $X_{(t)} = (t - X \mid X < t)$, defined for all $t \geq 0$ with $F(t) > 0$, is known as the inactivity time of the system and measures the time elapsed since the failure of the system occurred (cf. Kayid and Ahmad [16]). The rv $X_t = (X \mid X < t)$ is also called the past lifetime. The uncertainty in the distribution of $X_t$ is equivalent to the uncertainty in the rv $X_{(t)}$. The study of past entropy, which deals with the entropy properties of the distribution of past lifetimes and its statistical applications, has received considerable attention in the literature, as evidenced by works such as Di Crescenzo and Longobardi [17], Nair and Sunoj [18], and Nair and Sunoj [19]. Li and Zhang [20] studied the monotonicity properties of the Rényi entropy of order statistics, record values, and weighted distributions. Gupta et al. [21] made significant contributions to the field by studying the properties and applications of past entropy in the context of order statistics; in particular, they studied the residual and past entropies of order statistics and made stochastic comparisons between them. In general, the informational properties of the residual lifetime distribution do not determine the informational aspects of the past lifetime distribution, at least not for lifetime distributions, which are the subject of this work. For further illustrative descriptions of this issue, we refer the reader to Ahmad et al. [22]. Therefore, the study of uncertainty in the past lifetime distribution is a problem in its own right, distinct from the study of the uncertainty properties of the residual lifetime distribution.
On the other hand, coherent systems are well known in reliability engineering as a large class of systems that are typical in practice (see, e.g., Barlow and Proschan [23] for the formal definition and basic properties of such systems). An example is the $k$-out-of-$n$ system, a structure with $n$ components of which at least $k$ must be active for the whole system to work. This structure is one of the most important special cases of coherent systems and has many applications; for example, an airplane with three engines that continues to fly smoothly as long as at least two of its engines are active. The $(n-1)$-out-of-$n$ structure, referred to in the literature as a fail-safe system, has many applications in the real world. A fail-safe system is a special design feature that, when a failure occurs, reacts in such a way that no damage is done to the system itself. The brake system on a train is an excellent example: the brakes are held in the off position by air pressure, and if a brake line ruptures or a car is cut off, the air pressure is lost and the brakes are applied by a local air reservoir. Consider a coherent system that turns out to be inactive at time $t$, when all components of the system are also inactive; the time $t$ is the first time at which the coherent system is found to be inactive. The predictability of the exact time at which the system failed depends largely on the uncertainty properties of the past lifetime distributions. The goal of this work is to quantify the uncertainty about the exact failure time of a coherent system that is inactive at time $t$, and, furthermore, the uncertainty about the exact failure time of a particular component of this inactive system. To this end, we utilize the Rényi entropy of the past lifetime distribution.
In the context of lifetime distributions, the study of the Rényi entropy of the past lifetime and its associated information properties does not follow from the Rényi entropy properties of the residual lifetime distribution. To illustrate this point, let $X$ be a random variable with a normal distribution with mean $\mu$ and variance $\sigma^2$. We know that $X$ has a symmetric distribution around $\mu$, so $\mu - X =_{st} X - \mu$, where $=_{st}$ denotes equality in distribution. This gives $X_{(t)} =_{st} X_{2\mu - t}$, and therefore any Rényi entropy property of $X_t = (X \mid X \leq t)$ follows directly from the same Rényi entropy property of $(X \mid X > 2\mu - t)$. However, in the context of lifetime distributions (with support $[0, +\infty)$), there is no such symmetry. For example, the gamma distribution and the Weibull distribution, two standard lifetime distributions, are asymmetric.
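This mirror relation is easy to verify numerically. The following Python sketch (our illustration, assuming SciPy; not part of the original derivation) compares the past Rényi entropy of a normal rv at time $t$ with the residual Rényi entropy at the mirrored time $2\mu - t$:

```python
# Numerical check of the symmetry argument for X ~ N(mu, sigma^2):
# the past entropy at t equals the residual entropy at 2*mu - t.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma, t, alpha = 2.0, 1.0, 2.5, 1.7
past = quad(lambda x: (norm.pdf(x, mu, sigma) / norm.cdf(t, mu, sigma)) ** alpha,
            -np.inf, t)[0]
s = 2 * mu - t                             # mirrored inspection time
resid = quad(lambda x: (norm.pdf(x, mu, sigma) / norm.sf(s, mu, sigma)) ** alpha,
             s, np.inf)[0]
print(np.log(past) / (1 - alpha), np.log(resid) / (1 - alpha))  # identical
```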
In this paper, we present a comprehensive study of the Rényi entropy of the distribution of past lifetimes, providing the past-lifetime counterpart of Equation (3). By allowing different averaging of the conditional probabilities through the parameter $\alpha$, the proposed measure allows for a nuanced comparison of the shapes of different past lifetime distributions. The present results demonstrate the potential of this measure to uncover new insights into the mechanisms underlying these distributions. Furthermore, we consider a coherent system of $n$ components with the property that all components of the system have failed at time $t$. We use the system signature method to calculate the Rényi entropy of the past lifetime of a coherent system. Throughout the manuscript, we do not use the words “increasing” and “decreasing” in the strict sense.

2. Results on the Past Rényi Entropy

Let us consider an rv $X$ representing the lifetime of a system with pdf $f$ and cdf $F$. Recall that the pdf of $X_t = [X \mid X \leq t]$ is given by $f_t(x) = f(x)/F(t)$ for $0 < x < t$, and $f_t(x) = 0$ for $x \leq 0$ and $x \geq t$. The rvs $X_t$ and $Y_t$ are called the right-truncated rvs associated with $X$ and $Y$, respectively. In this context, the past Rényi entropy at time $t$ of $X$ is defined as (see, e.g., [13])
\[
\bar{H}_\alpha(X_t) = \frac{1}{1-\alpha}\log\int_0^{+\infty} f_t^{\alpha}(x)\,dx = \frac{1}{1-\alpha}\log\int_0^{t} \left(\frac{f(x)}{F(t)}\right)^{\alpha} dx, \tag{4}
\]
for all $\alpha > 0$. Note that $\bar{H}_\alpha(X_t) \in [-\infty, \infty]$. Suppose that at time $t$ it is determined that a lifetime unit has failed. Then $\bar{H}_\alpha(X_t)$ measures the uncertainty about its past lifetime, i.e., about $X_t$. The role of the past Rényi entropy in comparing random lifetimes is illustrated by the following example, which highlights the importance of the proposed measure in detecting subtle differences in the shapes of different past lifetime distributions and underscores its potential to shed light on the mechanisms underlying these phenomena.
Example 1.
Let us consider two components in a system having random lifetimes X and Y with pdfs
\[
f(x) = 2x, \quad 0 < x < 1, \qquad \text{and} \qquad g(x) = 2(1-x), \quad 0 < x < 1,
\]
respectively. The Rényi entropy of both X and Y are elegantly captured by the expression:
H ¯ α ( X ) = H ¯ α ( Y ) = 1 1 α α log 2 log ( α + 1 ) ,
This result implies that the expected uncertainty regarding the predictability of the outcomes of $X$ and $Y$, in terms of Rényi entropy, is identical for $f$ and $g$. If both components are found to have failed at time $t \in (0,1)$ upon inspection, the past Rényi entropy, in the spirit of Equation (4), can be used to measure the uncertainty about the respective failure times:
\[
\begin{aligned}
\bar{H}_\alpha(X_t) &= \frac{1}{1-\alpha}\left[\alpha\log 2 - \log(\alpha+1) - (\alpha-1)\log t\right],\\
\bar{H}_\alpha(Y_t) &= \frac{1}{1-\alpha}\left[\alpha\log 2 + \log\!\left(1-(1-t)^{\alpha+1}\right) - \alpha\log(2t - t^2) - \log(\alpha+1)\right].
\end{aligned}
\]
It can be shown that $\bar{H}_\alpha(X_t) \leq \bar{H}_\alpha(Y_t)$ for all $t \in (0,1)$; that is, the expected uncertainty about the failure time of the first component with original lifetime $X$, given that $X < t$, is smaller than that of the second component with original lifetime $Y$, given that $Y < t$, even though $H_\alpha(X) = H_\alpha(Y)$.
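The ordering can be confirmed numerically. The following sketch (ours; the helper past_renyi evaluates Equation (4) by numerical integration) checks the inequality on a small grid of $t$ and $\alpha$:

```python
# Numerical check of Example 1 via Eq. (4); f, F and g, G are the pdf/cdf
# pairs of X and Y above. A sketch, not part of the original paper.
import numpy as np
from scipy.integrate import quad

def past_renyi(pdf, cdf, t, alpha):
    integral = quad(lambda x: (pdf(x) / cdf(t)) ** alpha, 0.0, t)[0]
    return np.log(integral) / (1.0 - alpha)

f, F = lambda x: 2 * x, lambda t: t ** 2                  # lifetime X
g, G = lambda x: 2 * (1 - x), lambda t: 2 * t - t ** 2    # lifetime Y
for t in (0.2, 0.5, 0.9):
    for alpha in (0.5, 2.0):
        hx, hy = past_renyi(f, F, t, alpha), past_renyi(g, G, t, alpha)
        assert hx <= hy    # X_t is the more predictable component
        print(t, alpha, round(hx, 4), round(hy, 4))
```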
As mentioned before in Section 1, an interesting observation is that the statement in Equation (4) can be interpreted as the Rényi entropy of the inactivity time $X_{(t)} = [t - X \mid X \leq t]$. This alternative identification sheds new light on the underlying dynamics. From (4), we also obtain the following expression for the past Rényi entropy:
\[
\bar{H}_\alpha(X_t) = \frac{\log\alpha}{\alpha-1} - \frac{1}{\alpha-1}\log E\!\left[\tau^{\alpha-1}(X_{\alpha,t})\right], \tag{6}
\]
where $\tau(x) = f(x)/F(x)$ denotes the reversed hazard rate of $X$ and $X_{\alpha,t}$ has the pdf
\[
f_{\alpha,t}(x) = \alpha f_t(x)\, F_t^{\alpha-1}(x), \tag{7}
\]
for all $\alpha > 0$, where $F_t(x) = F(x)/F(t)$ for $0 < x < t$. The following definition, which can be found in Shaked and Shanthikumar [24], is used throughout the paper.
Definition 1.
Suppose that $X$ and $Y$ are two general rvs with arbitrary supports, with density functions $f$ and $g$ and reversed hazard rate functions $\tau_X$ and $\tau_Y$, respectively. Then,
(i) 
We say $X$ is smaller than $Y$ in the reversed hazard rate order (denoted by $X \leq_{rhr} Y$) whenever $\tau_X(a) \leq \tau_Y(a)$ for all $a$, or equivalently if $G(a)/F(a)$ is increasing in $a$.
(ii) 
$X$ is said to have a decreasing [resp. increasing] reversed hazard rate (denoted DRHR [resp. IRHR]) whenever $\tau_X(a)$ is a decreasing [resp. increasing] function of $a$.
(iii) 
We say $X$ is smaller than $Y$ in the dispersive order (denoted by $X \leq_{disp} Y$) whenever $g(G^{-1}(u)) \leq f(F^{-1}(u))$ for all $u \in (0,1)$.
Our analysis sheds new light on the behavior of the past Rényi entropy in the presence of a decreasing reversed hazard rate (DRHR), contributing to our understanding of this important class of distributions.
Theorem 1.
If $X$ is DRHR, then $\bar{H}_\alpha(X_t)$ is increasing in $t$.
Proof. 
Differentiating (4) with respect to $t$ implies
\[
(1-\alpha)\frac{d}{dt}\bar{H}_\alpha(X_t)
= \frac{f^\alpha(t)}{\int_0^t f^\alpha(x)\,dx} - \alpha\tau(t)
= \frac{\alpha\, f^\alpha(t)/F^\alpha(t)}{\alpha\int_0^t f^\alpha(x)/F^\alpha(t)\,dx} - \alpha\tau(t)
= \frac{\alpha\,\tau^\alpha(t)}{\int_0^t \tau^{\alpha-1}(x)\, f_{\alpha,t}(x)\,dx} - \alpha\tau(t), \tag{8}
\]
where $f_{\alpha,t}(x)$ is given in (7). Since $X$ is DRHR, $\tau(x)$ is decreasing in $x$, which implies for all $\alpha > 1$ ($0 < \alpha < 1$) that $\tau^{\alpha-1}(x) \geq (\leq)\ \tau^{\alpha-1}(t)$ when $x \leq t$. Thus, Equation (8) yields
\[
\frac{\alpha\,\tau^\alpha(t)}{\int_0^t \tau^{\alpha-1}(x)\, f_{\alpha,t}(x)\,dx} - \alpha\tau(t) \leq (\geq)\ 0,
\]
or equivalently
\[
(1-\alpha)\frac{d}{dt}\bar{H}_\alpha(X_t) \leq (\geq)\ 0,
\]
and this gives the result.    □
The result of Theorem 1 shows that the DRHR property of a component lifetime translates into the increasing property of the past Rényi entropy of the component lifetime as a function of time. Thus, an interesting conclusion is that once a component whose random lifetime has the DRHR property is first found to have failed, the uncertainty about its exact failure time (with respect to the past Rényi entropy) increases with the inspection time. The following theorem relates the ordering of rvs according to the Rényi entropy of their right-truncated distributions to their ordering in the reversed hazard rate order.
Theorem 2.
Let $X$ and $Y$ be two continuous rvs having cdfs $F$ and $G$, pdfs $f$ and $g$, and reversed hazard rate functions $\tau_X$ and $\tau_Y$, respectively. If $X \leq_{rhr} Y$ and either $X$ or $Y$ has the IRHR property, then, for all $\alpha > 0$, we have $\bar{H}_\alpha(X_t) \geq \bar{H}_\alpha(Y_t)$.
Proof. 
Notice that $X_t = [X \mid X \leq t]$ and $Y_t = [Y \mid Y \leq t]$, as two right-truncated rvs, have pdfs $f_t$ and $g_t$, respectively. The condition $X \leq_{rhr} Y$ implies that, for all $x \leq t$,
\[
\frac{F(x)}{F(t)} \geq \frac{G(x)}{G(t)}.
\]
For every $\alpha > 0$, the inequality
\[
\frac{F^\alpha(x)}{F^\alpha(t)} \geq \frac{G^\alpha(x)}{G^\alpha(t)}
\]
then holds, which further implies that $X_{\alpha,t} \leq_{st} Y_{\alpha,t}$, where $X_{\alpha,t}$ and $Y_{\alpha,t}$ have the cdfs $F_t^\alpha(x)$ and $G_t^\alpha(x)$, respectively. Now, let us assume that $X$ has the IRHR property. For $\alpha > 1$ we obtain
\[
E\!\left[\tau_X^{\alpha-1}(X_{\alpha,t})\right] \leq E\!\left[\tau_X^{\alpha-1}(Y_{\alpha,t})\right] \leq E\!\left[\tau_Y^{\alpha-1}(Y_{\alpha,t})\right],
\]
where the former inequality follows from the definition of the usual stochastic order and the fact that $E[\phi(X_{\alpha,t})] \leq E[\phi(Y_{\alpha,t})]$ for all increasing functions $\phi$; in particular, $\phi(x) = \tau_X^{\alpha-1}(x)$ is an increasing function of $x$, as $X$ is IRHR and $\alpha > 1$. The latter inequality holds because $X \leq_{rhr} Y$. For $\alpha < 1$, we can easily check that
\[
E\!\left[\tau_X^{\alpha-1}(X_{\alpha,t})\right] \geq E\!\left[\tau_X^{\alpha-1}(Y_{\alpha,t})\right] \geq E\!\left[\tau_Y^{\alpha-1}(Y_{\alpha,t})\right].
\]
For all $\alpha > 0$, we finally obtain
\[
-\frac{1}{\alpha-1}\log E\!\left[\tau_X^{\alpha-1}(X_{\alpha,t})\right] \geq -\frac{1}{\alpha-1}\log E\!\left[\tau_Y^{\alpha-1}(Y_{\alpha,t})\right],
\]
and recalling (6) this gives $\bar{H}_\alpha(X_t) \geq \bar{H}_\alpha(Y_t)$. A similar conclusion can be drawn if we assume instead that the rv $Y$ possesses the IRHR property.    □
According to Theorem 2, of two rvs at least one of which has the IRHR property, the one with the larger reversed hazard rate has the smaller Rényi entropy of its right-truncated distribution. Therefore, the rv that is stochastically smaller is expected to be less predictable.
In the next theorem, we give an upper bound for $\bar{H}_\alpha(X_t)$ in terms of the reversed hazard rate function.

Theorem 3.

Assume that $\tau(x) < \infty$ for all $x$. If $X$ is DRHR, then for all $\alpha > 0$, it holds that
\[
\bar{H}_\alpha(X_t) \leq \frac{\log\alpha}{\alpha-1} - \log\tau(t), \quad t > 0.
\]
Proof. 
If $X$ is DRHR, then $\tau(t)$ is decreasing in $t$, so that $\tau^{\alpha-1}(x) \geq \tau^{\alpha-1}(t)$ for $x \leq t$ when $\alpha - 1 > 0$, and $\tau^{\alpha-1}(x) \leq \tau^{\alpha-1}(t)$ when $\alpha - 1 < 0$. In either case, because the sign of $-1/(\alpha-1)$ compensates the reversal, recalling (6) we have
\[
\bar{H}_\alpha(X_t) = \frac{\log\alpha}{\alpha-1} - \frac{1}{\alpha-1}\log\int_0^t \tau^{\alpha-1}(x)\, f_{\alpha,t}(x)\,dx
\leq \frac{\log\alpha}{\alpha-1} - \frac{1}{\alpha-1}\log\left(\tau^{\alpha-1}(t)\int_0^t f_{\alpha,t}(x)\,dx\right)
= \frac{\log\alpha}{\alpha-1} - \log\tau(t),
\]
and this completes the proof.  □
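As a numerical sanity check of Theorem 3 (our sketch, not from the paper), note that the standard exponential distribution is DRHR, since $\tau(t) = e^{-t}/(1-e^{-t})$ is decreasing; the bound can then be compared against a direct evaluation of Equation (4):

```python
# Numerical check of the upper bound of Theorem 3 for X ~ Exp(1).
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)
F = lambda t: 1 - np.exp(-t)
tau = lambda t: f(t) / F(t)                # reversed hazard rate

def past_renyi(t, alpha):                  # Eq. (4) by numerical integration
    return np.log(quad(lambda x: (f(x) / F(t)) ** alpha, 0, t)[0]) / (1 - alpha)

for t in (0.5, 1.0, 3.0):
    for alpha in (0.5, 2.0):
        bound = np.log(alpha) / (alpha - 1) - np.log(tau(t))
        assert past_renyi(t, alpha) <= bound   # holds for every alpha > 0
        print(t, alpha, round(past_renyi(t, alpha), 4), round(bound, 4))
```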

3. Results on the Past Lifetimes of Coherent Systems

This section applies the system signature approach to define the past-lifetime entropy of a coherent system with an arbitrary structure. It is assumed that all components of the system have failed at a given time $t$. A coherent system is defined as one that satisfies two conditions: first, it contains no irrelevant components, and second, its structure function is monotone. The signature of such a system is the $n$-dimensional vector $\mathbf{p} = (p_1, \ldots, p_n)$ whose $i$th element is given by $p_i = P(T = X_{i:n})$, $i = 1, 2, \ldots, n$ (see [25]). Consider a coherent system with independent and identically distributed (i.i.d.) component lifetimes $X_1, \ldots, X_n$ and a known signature vector $\mathbf{p} = (p_1, \ldots, p_n)$. If $T_t = [t - T \mid X_{n:n} \leq t]$ stands for the past lifetime of the coherent system under the condition that all components of the system have failed at time $t$, then from the results of Khaledi and Shaked [26], the survival function of $T_t$ can be expressed as
\[
P(T_t > x) = \sum_{i=1}^n p_i\, P(t - X_{i:n} > x \mid X_{n:n} \leq t), \tag{9}
\]
where
\[
P(t - X_{i:n} > x \mid X_{n:n} \leq t) = \sum_{k=i}^n \binom{n}{k} \left[\frac{F(t-x)}{F(t)}\right]^{k} \left[1 - \frac{F(t-x)}{F(t)}\right]^{n-k}, \quad 0 < x < t,
\]
denotes the past-lifetime survival function of an $i$-out-of-$n$ system under the condition that all components have failed at time $t$. From (9), it follows that
\[
f_{T_t}(x) = \sum_{i=1}^n p_i\, f_{T_t^i}(x), \tag{10}
\]
where
\[
f_{T_t^i}(x) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)} \left[\frac{F(t-x)}{F(t)}\right]^{i-1} \left[1 - \frac{F(t-x)}{F(t)}\right]^{n-i} \frac{f(t-x)}{F(t)}, \quad 0 < x < t, \tag{11}
\]
such that $\Gamma(\cdot)$ is the complete gamma function and $T_t^i = [t - X_{i:n} \mid X_{n:n} \leq t]$, $i = 1, 2, \ldots, n$, is the time that has passed since the failure of the component with lifetime $X_{i:n}$ in the system, given that the system has failed at or before time $t$. It is worth mentioning, by (9), that $t - T_t^i$ is distributed as the $i$th order statistic of $n$ i.i.d. lifetimes with cdf $F_t(x) = F(x)/F(t)$, $0 < x < t$. Hereafter, we provide an expression for the entropy of $T_t$. To this end, the probability integral transformation $V = F_t(t - T_t)$ plays a crucial role. It is readily seen that $U_{i:n} = F_t(t - T_t^i)$ has the beta distribution with parameters $i$ and $n-i+1$, with pdf
\[
g_i(u) = \frac{\Gamma(n+1)}{\Gamma(i)\,\Gamma(n-i+1)}\, u^{i-1} (1-u)^{n-i}, \quad 0 < u < 1, \tag{12}
\]
for all $i = 1, \ldots, n$.
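This distributional claim can be confirmed by simulation. The sketch below (ours; it assumes standard exponential components and uses the fact that, given $X_{n:n} \leq t$, the component lifetimes are i.i.d. with cdf $F_t$) compares the empirical moments of $U_{i:n}$ with those of the Beta$(i, n-i+1)$ law:

```python
# Monte Carlo sanity check: F_t(t - T_t^i) = F_t(X_{i:n} | X_{n:n} <= t)
# should follow Beta(i, n - i + 1). Components X ~ Exp(1) are assumed.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
n, i, t, m = 5, 2, 1.5, 200_000
c = 1 - np.exp(-t)
u = rng.random((m, n))
x = -np.log(1 - u * c)                 # inverse-cdf draws of [X | X <= t]
x_i = np.sort(x, axis=1)[:, i - 1]     # X_{i:n} given X_{n:n} <= t
v = (1 - np.exp(-x_i)) / c             # the transform U_{i:n}
print(v.mean(), beta.mean(i, n - i + 1))   # both ~ i/(n+1) = 1/3
print(v.var(), beta.var(i, n - i + 1))
```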
In the forthcoming theorem, we provide an expression for the Rényi entropy of $T_t$ by using the transformations introduced above.
Theorem 4.
Let $T_t$ stand for the past lifetime of the coherent system under the condition that all components of the system have failed at time $t$. The Rényi entropy of $T_t$ can be expressed as
\[
\bar{H}_\alpha(T_t) = \frac{1}{1-\alpha}\log\int_0^1 g_V^{\alpha}(u)\, f_t^{\alpha-1}\!\left(F_t^{-1}(u)\right) du, \quad t > 0, \tag{13}
\]
for all $\alpha > 0$, where $V = F_t(t - T_t)$ has the pdf $g_V(v) = \sum_{i=1}^n p_i\, g_i(v)$, and $F_t^{-1}(u) = \inf\{x : F_t(x) \geq u\}$ is the quantile function of $F_t(x) = F(x)/F(t)$, $0 < x \leq t$.
Proof. 
By (1) and (10), and by substituting $z = t - x$ and $\bar\alpha = 1 - \alpha$, we have
\[
\begin{aligned}
\bar{H}_\alpha(T_t) &= \frac{1}{1-\alpha}\log\int_0^t \left(f_{T_t}(x)\right)^{\alpha} dx
= \frac{1}{\bar\alpha}\log\int_0^t \left(\sum_{i=1}^n p_i\, f_{T_t^i}(x)\right)^{\alpha} dx \\
&= \frac{1}{\bar\alpha}\log\int_0^t \left(\sum_{i=1}^n p_i\, \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)} \left[\frac{F(t-x)}{F(t)}\right]^{i-1}\left[1 - \frac{F(t-x)}{F(t)}\right]^{n-i} \frac{f(t-x)}{F(t)}\right)^{\alpha} dx \\
&= \frac{1}{\bar\alpha}\log\int_0^t \left(\sum_{i=1}^n p_i\, \frac{\Gamma(n+1)}{\Gamma(i)\Gamma(n-i+1)} \left[F_t(z)\right]^{i-1}\left[1 - F_t(z)\right]^{n-i} f_t(z)\right)^{\alpha} dz \\
&= \frac{1}{\bar\alpha}\log\int_0^1 g_V^{\alpha}(u)\left[f_t\!\left(F_t^{-1}(u)\right)\right]^{\alpha-1} du.
\end{aligned}
\]
The identity in the last line is obtained by substituting $u = F_t(z)$, which ends the proof.  □
If all components have failed at time $t$, then $\bar{H}_\alpha(T_t)$ measures the expected uncertainty contained in the conditional density of $t - T$ given $X_{n:n} \leq t$, about the predictability of the past lifetime of the system. In the special case of an $i$-out-of-$n$ system, whose signature is $\mathbf{p} = (0, \ldots, 0, 1, 0, \ldots, 0)$ with the $1$ in position $i$, $i = 1, 2, \ldots, n$, Equation (13) reduces to
\[
\bar{H}_\alpha(T_t^i) = \bar{H}_\alpha(U_{i:n}) - \frac{1}{\alpha-1}\log E\!\left[f_t^{\alpha-1}\!\left(F_t^{-1}(Z_i)\right)\right], \tag{14}
\]
where $Z_i$ has the beta distribution with parameters $\alpha(i-1)+1$ and $\alpha(n-i)+1$, and
\[
\bar{H}_\alpha(U_{i:n}) = \frac{\alpha}{\alpha-1}\log B(i, n-i+1) - \frac{1}{\alpha-1}\log B\big(\alpha(i-1)+1,\ \alpha(n-i)+1\big), \tag{15}
\]
where $B(a,b) = \Gamma(a)\Gamma(b)/\Gamma(a+b)$ stands for the beta function. In the special case when $t$ goes to infinity, (14) coincides with the results of Abbasnejad and Arghami [27].
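Since Equation (13) is a one-dimensional integral, it is straightforward to evaluate numerically. The following sketch (the helper names past_renyi_system and g_V are ours; the component pdf, cdf, and quantile function are assumed known) implements (13) and evaluates it for a 2-out-of-3 system, whose signature is the unit vector $(0, 1, 0)$, with standard exponential components:

```python
# A numerical sketch of Eq. (13): past Rényi entropy of a coherent system
# from its signature p; f, F, Finv are the components' pdf, cdf, quantile.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

def g_V(u, p):
    n = len(p)
    return sum(p[i - 1] * beta.pdf(u, i, n - i + 1) for i in range(1, n + 1))

def past_renyi_system(p, f, F, Finv, t, alpha):
    f_t = lambda u: f(Finv(u * F(t))) / F(t)      # f_t(F_t^{-1}(u))
    integrand = lambda u: g_V(u, p) ** alpha * f_t(u) ** (alpha - 1)
    return np.log(quad(integrand, 0, 1)[0]) / (1 - alpha)

exp_f = lambda x: np.exp(-x)
exp_F = lambda t: 1 - np.exp(-t)
exp_Finv = lambda v: -np.log(1 - v)
print(past_renyi_system((0.0, 1.0, 0.0), exp_f, exp_F, exp_Finv, 1.0, 2.0))
```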
The next theorem follows immediately from Theorem 4, in terms of the property that the reversed hazard rate of $X$, i.e., $\tau(x)$, is decreasing.
Theorem 5.
If $X$ is DRHR, then $\bar{H}_\alpha(T_t)$ is increasing in $t$.
Proof. 
By noting that $f_t(F_t^{-1}(u)) = u\,\tau_t(F_t^{-1}(u))$, Equation (13) can be rewritten as
\[
e^{(1-\alpha)\bar{H}_\alpha(T_t)} = \int_0^1 g_V^{\alpha}(u)\, u^{\alpha-1}\, \tau_t^{\alpha-1}\!\left(F_t^{-1}(u)\right) du, \tag{16}
\]
for all $\alpha > 0$. It is easy to verify that $F_t^{-1}(u) = F^{-1}(u F(t))$ for all $0 < u < 1$, and hence we have
\[
\tau_t\!\left(F_t^{-1}(u)\right) = \tau\!\left(F^{-1}(u F(t))\right), \quad 0 < u < 1.
\]
If $t_1 \leq t_2$, then $F^{-1}(u F(t_1)) \leq F^{-1}(u F(t_2))$. Consequently, when $X$ is DRHR, for all $\alpha > 1$ ($0 < \alpha \leq 1$) we have
\[
\begin{aligned}
\int_0^1 g_V^{\alpha}(u)\, u^{\alpha-1}\, \tau_{t_1}^{\alpha-1}\!\left(F_{t_1}^{-1}(u)\right) du
&= \int_0^1 g_V^{\alpha}(u)\, u^{\alpha-1}\, \tau^{\alpha-1}\!\left(F^{-1}(u F(t_1))\right) du \\
&\geq (\leq) \int_0^1 g_V^{\alpha}(u)\, u^{\alpha-1}\, \tau^{\alpha-1}\!\left(F^{-1}(u F(t_2))\right) du \\
&= \int_0^1 g_V^{\alpha}(u)\, u^{\alpha-1}\, \tau_{t_2}^{\alpha-1}\!\left(F_{t_2}^{-1}(u)\right) du,
\end{aligned}
\]
for all $t_1 \leq t_2$. Using (16), we obtain
\[
e^{(1-\alpha)\bar{H}_\alpha(T_{t_1})} \geq (\leq)\ e^{(1-\alpha)\bar{H}_\alpha(T_{t_2})},
\]
for all $\alpha > 1$ ($0 < \alpha \leq 1$). This requires that $\bar{H}_\alpha(T_{t_1}) \leq \bar{H}_\alpha(T_{t_2})$ for all $\alpha > 0$. Hence, the proof is complete.  □
Remark 1.
Another proof of Theorem 5, as pointed out by one of the referees, can be given as follows. Suppose that $X$ has the DRHR property, so that $F$ is log-concave. Then, by Theorem 3.B.19 in Shaked and Shanthikumar [24], $[X \mid X \leq t_1] \leq_{disp} [X \mid X \leq t_2]$ for all $t_1 \leq t_2$. Thus, by Definition 1(iii), $f_{t_1}(F_{t_1}^{-1}(u)) \geq f_{t_2}(F_{t_2}^{-1}(u))$ for all $0 < u < 1$ and $t_1 \leq t_2$. In view of (13), it is then plain that $\bar{H}_\alpha(T_{t_1}) \leq \bar{H}_\alpha(T_{t_2})$ for all $\alpha > 0$.
The result of Theorem 5 in turn shows that when the component lifetimes in a coherent system satisfy the DRHR property, the past Rényi entropy of the fully inactive system increases with the inspection time, thus decreasing predictability and making it more difficult to determine the exact failure time in the past.
The next example illustrates Theorems 4 and 5.
Example 2.
Let us consider a coherent system with the system signature $\mathbf{p} = (0, \tfrac{3}{10}, \tfrac12, \tfrac15, 0)$, depicted in Figure 1. The exact value of $\bar{H}_\alpha(T_t)$ can be computed using relation (13) when the component lifetime distributions are given. To this end, let us suppose the following lifetime distributions.
(i) 
Let $X$ follow the uniform distribution on $[0,1]$. From (13), we immediately obtain
\[
\bar{H}_\alpha(T_t) = \log(t) + \frac{1}{1-\alpha}\log\int_0^1 g_V^{\alpha}(u)\,du, \quad t > 0.
\]
This reveals that the Rényi entropy $\bar{H}_\alpha(T_t)$ of the rv $T_t$ is an increasing function of time $t$. This observation is consistent with previous results on the behavior of Rényi entropy for certain classes of rvs. In particular, the uniform distribution is known to possess the DRHR property, which implies that the Rényi entropy of $T_t$ should be an increasing function of time $t$, in line with Theorem 5.
(ii) 
Consider an rv $X$ with the cdf given by
\[
F(x) = e^{-x^{-k}}, \quad x > 0,\ k > 0.
\]
With some algebraic manipulation, we arrive at the following expression:
\[
\bar{H}_\alpha(T_t) = \frac{1}{1-\alpha}\log\int_0^1 \left(t^{-k} - \log u\right)^{\left(\frac{1}{k}+1\right)(\alpha-1)} u^{\alpha-1}\, g_V^{\alpha}(u)\,du - \log k, \quad t > 0.
\]
The numerical results, which are presented in Figure 2, showcase the increasing nature of the Rényi entropy of $T_t$ as a function of time $t$ for $\alpha = 0.2$ and various values of $k$ (in this case, the past Rényi entropy for $\alpha > 1$ is not defined). This observation is in line with Theorem 5, which predicts the monotonicity of the Rényi entropy for DRHR rvs.
The above example sheds light on the intricate relationship between the Rényi entropy of an rv and the inspection time $t$ at which the failure of the device is detected, and it highlights the importance of the DRHR property when analyzing such systems. Our results suggest that the DRHR property of $X$ plays a crucial role in shaping the temporal behavior of the Rényi entropy of $T_t$, which could have far-reaching implications for various applications, including the analysis of complex systems and the development of efficient data compression techniques.
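The reconstructed expression in part (ii) can be evaluated directly by numerical integration. The sketch below (ours; it assumes $k = 1$ and $\alpha = 0.2$, with the signature of Figure 1) exhibits the increase in $t$:

```python
# Sketch of Example 2(ii): evaluating the expression for F(x) = exp(-x^{-k})
# with the Figure 1 signature; values increase in t, as Theorem 5 predicts.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

p, n, alpha, k = (0.0, 0.3, 0.5, 0.2, 0.0), 5, 0.2, 1.0
gV = lambda u: sum(p[i - 1] * beta.pdf(u, i, n - i + 1) for i in range(1, n + 1))

def H_past(t):
    integrand = lambda u: ((t ** -k - np.log(u)) ** ((1 / k + 1) * (alpha - 1))
                           * u ** (alpha - 1) * gV(u) ** alpha)
    return np.log(quad(integrand, 0, 1)[0]) / (1 - alpha) - np.log(k)

print([round(H_past(t), 4) for t in (0.5, 1.0, 2.0, 4.0)])   # increasing
```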
It is possible to reduce the computational cost of determining the signatures of all coherent systems of a given size by about half, thanks to the widely used notion of a system’s dual in reliability theory and industry. A duality relation between a system’s signature and that of its dual was given by Kochar et al. [28]. If $\mathbf{p} = (p_1, \ldots, p_n)$ denotes the signature of a coherent system with lifetime $T$, then the signature of its dual system with lifetime $T^D$ is $\mathbf{p}^D = (p_n, \ldots, p_1)$. In the following theorem, under the condition that $f_t(F_t^{-1}(u))$ is a symmetric function of $u$ around $u = \tfrac12$, we apply the duality property to simplify the calculation of the past Rényi entropy of coherent systems.
Theorem 6.
Let $T_t$ be the past lifetime of a coherent system with signature $\mathbf{p}$ consisting of $n$ i.i.d. components. If $f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u))$ is satisfied for all $0 < u < 1$ and all $t$, then $\bar{H}_\alpha(T_t) = \bar{H}_\alpha(T_t^D)$ for all $\mathbf{p}$ and all $n$.
Proof. 
Assume that $f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u))$ for all $0 < u < 1$. It is worth noting that $g_i(1-u) = g_{n-i+1}(u)$ for all $i = 1, \ldots, n$ and $0 < u < 1$. Consequently, utilizing (13) and the change of variable $u \mapsto 1-u$, we obtain
\[
\begin{aligned}
\int_0^1 g_{V^D}^{\alpha}(u)\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du
&= \int_0^1 \left(\sum_{i=1}^n p_{n-i+1}\, g_i(u)\right)^{\alpha}\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du \\
&= \int_0^1 \left(\sum_{r=1}^n p_r\, g_{n-r+1}(u)\right)^{\alpha}\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du \\
&= \int_0^1 \left(\sum_{r=1}^n p_r\, g_r(1-u)\right)^{\alpha}\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du \\
&= \int_0^1 \left(\sum_{r=1}^n p_r\, g_r(u)\right)^{\alpha}\left[f_t(F_t^{-1}(1-u))\right]^{\alpha-1} du \\
&= \int_0^1 g_V^{\alpha}(u)\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du,
\end{aligned}
\]
where the last step uses the assumed symmetry; in the spirit of Equation (13), the proof is complete.  □
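Theorem 6 can be checked numerically in a case where the symmetry condition holds. For uniform $[0,1]$ components, $f_t(F_t^{-1}(u)) = 1/t$ is constant in $u$, so the condition is trivially satisfied; the self-contained sketch below (ours, using the uniform reduction of (13) from Example 2(i)) compares the system of Figure 1 with its dual:

```python
# Sketch checking Theorem 6 for uniform [0,1] components, where
# f_t(F_t^{-1}(u)) = 1/t is symmetric about u = 1/2.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

p = (0.0, 0.3, 0.5, 0.2, 0.0)            # Figure 1's signature
pD = p[::-1]                              # dual signature, reversed order
n, t = 5, 0.7

def H_uniform(sig, alpha):
    gV = lambda u: sum(sig[i - 1] * beta.pdf(u, i, n - i + 1)
                       for i in range(1, n + 1))
    # For uniform components, Eq. (13) reduces to log t + (1-a)^{-1} log int gV^a
    return np.log(t) + np.log(quad(lambda u: gV(u) ** alpha, 0, 1)[0]) / (1 - alpha)

for alpha in (0.5, 2.0):
    print(alpha, round(H_uniform(p, alpha), 6), round(H_uniform(pD, alpha), 6))
    # equal values, as Theorem 6 predicts
```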
In the following, a direct conclusion is drawn from Theorem 6 for the case in which the designated coherent system is an $i$-out-of-$n$ system.
Corollary 1.
Let $T_t^i$ be the past lifetime of an $i$-out-of-$n$ system consisting of $n$ i.i.d. components. If $f_t(F_t^{-1}(u)) = f_t(F_t^{-1}(1-u))$ is satisfied for all $0 < u < 1$ and all $t$, then $\bar{H}_\alpha(T_t^i) = \bar{H}_\alpha(T_t^{n-i+1})$ for all $n$, with $i = 1, 2, \ldots, n/2$ if $n$ is even and $i = 1, 2, \ldots, (n-1)/2$ if $n$ is odd.

4. Bounds for the Past Rényi Entropy of Coherent Systems

For complex systems or uncertain component lifetime distributions, computing the past Rényi entropy $\bar{H}_\alpha(T_t)$ of a coherent system exactly can be a daunting task. This scenario is not uncommon in practice, and thus there is a growing need for effective approximations of the system behavior. One promising approach is to use bounds on the past Rényi entropy, which can accurately approximate the uncertainty of the lifetimes of coherent systems under such circumstances.

Toomaj and Doostparast [29,30] pioneered the development of such bounds for new systems, while more recently Toomaj et al. [31] extended this work by deriving bounds on the entropy of a coherent system when all of its components are working; see also Mesfioui et al. [15]. In the following theorem, we introduce new bounds on the past Rényi entropy of the coherent system’s lifetime, expressed in terms of the past Rényi entropy $\bar{H}_\alpha(X_t)$ of the common component lifetime. By incorporating these bounds into our analysis, we can achieve a more accurate and efficient characterization of complex systems, even in the face of limited information about component lifetimes.
Theorem 7.
Assume a coherent system with past lifetime $T_t = [t - T \mid X_{n:n} \leq t]$ consisting of $n$ i.i.d. component lifetimes having the common cdf $F$, with signature $\mathbf{p} = (p_1, \ldots, p_n)$. Then, we have
\[
\bar{H}_\alpha(T_t) \geq (\leq)\ \frac{\alpha}{1-\alpha}\log B_n(\mathbf{p}) + \bar{H}_\alpha(X_t), \tag{17}
\]
for $\alpha > 1$ ($0 < \alpha < 1$), where $B_n(\mathbf{p}) = \sum_{i=1}^n p_i\, g_i(m_i)$ and $m_i = \frac{i-1}{n-1}$ is the mode of the beta distribution with parameters $i$ and $n-i+1$.
Proof. 
The pdf $g_i$ given in (12) is that of the beta distribution with parameters $i$ and $n-i+1$, whose mode can be conveniently expressed as $m_i = \frac{i-1}{n-1}$; hence $g_i(v) \leq g_i(m_i)$ for all $0 < v < 1$. This result allows us to obtain
\[
g_V(v) \leq \sum_{i=1}^n p_i\, g_i(m_i) = B_n(\mathbf{p}), \quad 0 < v < 1. \tag{18}
\]
Thus, for $\alpha > 1$ ($0 < \alpha < 1$), we have
\[
\begin{aligned}
\bar{H}_\alpha(T_t) &= \frac{1}{1-\alpha}\log\int_0^1 g_V^{\alpha}(u)\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du \\
&\geq (\leq)\ \frac{1}{1-\alpha}\log\int_0^1 B_n^{\alpha}(\mathbf{p})\left[f_t(F_t^{-1}(u))\right]^{\alpha-1} du \\
&= \frac{\alpha}{1-\alpha}\log B_n(\mathbf{p}) + \bar{H}_\alpha(X_t).
\end{aligned}
\]
The last equality is obtained from (4), from which the desired result follows. □
The lower and upper bounds given in Equation (17) are a valuable tool for analyzing systems with a large number of components or complex structures. However, in situations where these bounds are not applicable, we can resort to the Rényi information measure to derive a more general lower bound. This approach provides new insights into the behavior of complex systems and is presented in the next theorem.
Theorem 8.
In the setting of Theorem 7,
\[
\bar{H}_\alpha(T_t) \geq \bar{H}_\alpha^L(T_t), \tag{19}
\]
where $\bar{H}_\alpha^L(T_t) = \frac{1}{1-\alpha}\log\left(\sum_{i=1}^n p_i \int_0^t f_{T_t^i}^{\alpha}(x)\,dx\right)$ for all $\alpha > 0$.
Proof. 
Jensen’s inequality for the function $x^\alpha$ (concave for $0 < \alpha < 1$, convex for $\alpha > 1$) yields
\[
\left(\sum_{i=1}^n p_i\, f_{T_t^i}(x)\right)^{\alpha} \geq (\leq) \sum_{i=1}^n p_i\, f_{T_t^i}^{\alpha}(x), \quad x > 0,
\]
and hence, by the linearity of integration, we obtain
\[
\int_0^t f_{T_t}^{\alpha}(x)\,dx \geq (\leq) \sum_{i=1}^n p_i \int_0^t f_{T_t^i}^{\alpha}(x)\,dx. \tag{20}
\]
Since $1-\alpha > 0$ ($1-\alpha < 0$), taking logarithms in (20) and multiplying both sides by $1/(1-\alpha)$ gives the desired result in both cases. □
It is noteworthy that equality holds in (19) for $i$-out-of-$n$ systems, for which $p_j = 0$ for $j \neq i$ and $p_i = 1$; in this case, $\bar{H}_\alpha(T_t)$ equals $\bar{H}_\alpha(T_t^i)$. When the lower bounds for $\alpha > 1$ in Theorems 7 and 8 can both be computed, one may use the maximum of the two lower bounds.
Example 3.
Let $T_t = [t - T \mid X_{3:3} \leq t]$ represent the past lifetime of a coherent system with signature $\mathbf{p} = (\tfrac13, \tfrac23, 0)$ consisting of $n = 3$ i.i.d. component lifetimes from the standard exponential distribution, with cdf $F(t) = 1 - e^{-t}$, $t > 0$. It is easily seen that
\[
\bar{H}_\alpha(X_t) = \frac{1}{1-\alpha}\left[\log\frac{1-e^{-\alpha t}}{(1-e^{-t})^{\alpha}} - \log\alpha\right], \quad t > 0.
\]
Moreover, we can obtain $B_3(\mathbf{p}) = 2$. Thus, by Theorem 7, the Rényi entropy of $T_t$ is bounded for $\alpha > 1$ ($0 < \alpha < 1$) as follows:
\[
\bar{H}_\alpha(T_t) \geq (\leq)\ \frac{1}{1-\alpha}\left[\log\frac{2^{\alpha}}{\alpha} + \log\frac{1-e^{-\alpha t}}{(1-e^{-t})^{\alpha}}\right], \quad t > 0. \tag{21}
\]
It is easily seen that
\[
f_t\!\left(F_t^{-1}(u)\right) = \frac{1 - u\,(1-e^{-t})}{1-e^{-t}}, \quad t > 0,
\]
for all $0 < u < 1$. So, the lower bound given in (19) can be obtained as follows:
\[
\bar{H}_\alpha(T_t) \geq \frac{1}{1-\alpha}\log\left(\sum_{i=1}^{n} p_i \int_0^1 g_i^{\alpha}(u)\left(1 - u\,(1-e^{-t})\right)^{\alpha-1} du\right) + \log\!\left(1-e^{-t}\right), \quad t > 0, \tag{22}
\]
for all $\alpha > 0$. Figure 3 depicts the time evolution of the Rényi entropy $\bar{H}_\alpha(T_t)$ for the standard exponential distribution. The solid line represents the exact value of $\bar{H}_\alpha(T_t)$, while the dashed and dotted lines correspond to the bounds derived from Equations (21) and (22), respectively. The figure provides a clear visualization of the behavior of the past Rényi entropy over time and highlights the agreement between the exact value and the bounds. Notably, for $\alpha > 1$, the lower bound from (22) (dotted line) surpasses the lower bound from (21).
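The comparison behind Figure 3 can be reproduced numerically. The sketch below (ours; evaluated at $\alpha = 2$ and $t = 1$) computes the exact value of (13) together with the lower bounds (21) and (22):

```python
# Sketch reproducing the comparison behind Figure 3 at alpha = 2, t = 1:
# exact Eq. (13) versus the lower bounds (21) and (22), X ~ Exp(1).
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

p, n, alpha, t = (1/3, 2/3, 0.0), 3, 2.0, 1.0
c = 1 - np.exp(-t)
gV = lambda u: sum(p[i - 1] * beta.pdf(u, i, n - i + 1) for i in range(1, n + 1))
ft_q = lambda u: (1 - u * c) / c                 # f_t(F_t^{-1}(u))

exact = np.log(quad(lambda u: gV(u) ** alpha * ft_q(u) ** (alpha - 1),
                    0, 1)[0]) / (1 - alpha)
b21 = (np.log(2 ** alpha / alpha)
       + np.log((1 - np.exp(-alpha * t)) / c ** alpha)) / (1 - alpha)
b22 = np.log(sum(p[i - 1] * quad(lambda u: beta.pdf(u, i, n - i + 1) ** alpha
                                 * (1 - u * c) ** (alpha - 1), 0, 1)[0]
                 for i in range(1, n + 1))) / (1 - alpha) + np.log(c)
print(exact, b21, b22)   # approx. -0.307, -1.465, -0.539
assert exact >= b21 and exact >= b22 and b22 >= b21
```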

5. Concluding Remarks

In recent years, the assessment of predictability has become very important when considering the lifetimes of engineering systems. Quantification of uncertainty is a crucial criterion for measuring the degree of predictability in such systems. The Rényi entropy has proven to be an attractive measure for quantifying the uncertainty associated with system lifetimes. In this work, we have presented an expression for the Rényi entropy of the past lifetime of a system, under the condition that all system components have failed at time t. This situation may arise in practice because the time at which we observe a system and detect its failure is often quite late, so that all components of the coherent system have also failed by that time. Moreover, we have investigated various properties of the proposed measure, including bounds and partial orders between the random times elapsed since the failure of two coherent systems, based on their Rényi entropy uncertainties and using the concept of the system signature. Our approach provides an effective method for assessing the predictability of system lifetimes and is useful for engineering applications. We have demonstrated the effectiveness of the proposed measure with several application examples. The results of this paper can be used to study the predictability of engineering systems and are relevant to current research; they provide compelling evidence for the value of Rényi entropy in engineering reliability analysis and highlight its potential for future research in this area.

Author Contributions

Conceptualization, M.K.; methodology, M.K.; software, M.S.; validation, M.K.; formal analysis, M.K.; investigation, M.S.; resources, M.S.; writing—original draft preparation, M.K.; writing—review and editing, M.S.; visualization, M.S.; supervision, M.S.; project administration, M.S.; funding acquisition, M.S. All authors have read and agreed to the published version of the manuscript.

Funding

Researchers Supporting Project number (RSP2023R464), King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors are grateful to two anonymous reviewers for their constructive comments and suggestions. The authors acknowledge financial support from the Researchers Supporting Project number (RSP2023R464), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ebrahimi, N.; Pellerey, F. New partial ordering of survival functions based on the notion of uncertainty. J. Appl. Probab. 1995, 32, 202–211.
  2. Erdogmus, D.; Principe, J.C. An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems. IEEE Trans. Signal Process. 2002, 50, 1780–1786.
  3. Lake, D.E. Renyi entropy measures of heart rate Gaussianity. IEEE Trans. Biomed. Eng. 2005, 53, 21–27.
  4. Bashkirov, A.G. Renyi entropy as a statistical entropy for complex systems. Theor. Math. Phys. 2006, 149, 1559–1573.
  5. Henríquez, P.; Alonso, J.B.; Ferrer, M.A.; Travieso, C.M.; Godino-Llorente, J.I.; Díaz-de María, F. Characterization of healthy and pathological voice through measures based on nonlinear dynamics. IEEE Trans. Audio Speech Lang. Process. 2009, 17, 1186–1195.
  6. Guo, L.; Yin, L.; Wang, H.; Chai, T. Entropy optimization filtering for fault isolation of nonlinear non-Gaussian stochastic systems. IEEE Trans. Autom. Control 2009, 54, 804–810.
  7. Ampilova, N.; Soloviev, I. Entropies in investigation of dynamical systems and their application to digital image analysis. J. Meas. Eng. 2018, 6, 107–118.
  8. Koltcov, S. Application of Rényi and Tsallis entropies to topic modeling optimization. Phys. A Stat. Mech. Appl. 2018, 512, 1192–1204.
  9. Wang, Z.; Pang, C.K.; Ng, T.S. Data-driven scheduling optimization under uncertainty using Renyi entropy and skewness criterion. Comput. Ind. Eng. 2018, 126, 410–420.
  10. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  11. Abraham, B.; Sankaran, P. Renyi’s entropy for residual lifetime distribution. Stat. Pap. 2006, 47, 17–29.
  12. Asadi, M.; Ebrahimi, N.; Soofi, E.S. Dynamic generalized information measures. Stat. Probab. Lett. 2005, 71, 85–98.
  13. Gupta, R.; Nanda, A. α- and β-entropies and relative entropies of distributions. J. Stat. Theory Appl. 2002, 1, 177–190.
  14. Nanda, A.K.; Paul, P. Some results on generalized residual entropy. Inf. Sci. 2006, 176, 27–47.
  15. Mesfioui, M.; Kayid, M.; Shrahili, M. Renyi entropy of the residual lifetime of a reliability system at the system level. Axioms 2023, 12, 320.
  16. Kayid, M.; Ahmad, I. On the mean inactivity time ordering with reliability applications. Probab. Eng. Inform. Sci. 2004, 18, 395–409.
  17. Di Crescenzo, A.; Longobardi, M. Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Probab. 2002, 39, 434–440.
  18. Nair, N.U.; Sunoj, S. Some aspects of reversed hazard rate and past entropy. Commun. Stat.-Theory Methods 2021, 32, 2106–2116.
  19. Nair, N.U.; Sunoj, S. Some new results on Renyi entropy and its relative measure. Calcutta Stat. Assoc. Bull. 2022, 74, 97–116.
  20. Li, X.; Zhang, S. Some new results on Rényi entropy of residual life and inactivity time. Probab. Eng. Inform. Sci. 2011, 25, 237–250.
  21. Gupta, R.C.; Taneja, H.; Thapliyal, R. Stochastic comparisons of residual entropy of order statistics and some characterization results. J. Stat. Theory Appl. 2014, 13, 27–37.
  22. Ahmad, I.A.; Kayid, M.; Pellerey, F. Further results involving the MIT order and the IMIT class. Probab. Eng. Inform. Sci. 2005, 19, 377–395.
  23. Barlow, R.E.; Proschan, F. Statistical Theory of Reliability and Life Testing: Probability Models; Technical Report; Florida State University: Tallahassee, FL, USA, 1975.
  24. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer Science & Business Media: New York, NY, USA, 2007.
  25. Samaniego, F.J. System Signatures and Their Applications in Engineering Reliability; Springer Science & Business Media: Cham, Switzerland, 2007; Volume 110.
  26. Khaledi, B.E.; Shaked, M. Ordering conditional lifetimes of coherent systems. J. Stat. Plan. Inference 2007, 137, 1173–1184.
  27. Abbasnejad, M.; Arghami, N.R. Renyi entropy properties of order statistics. Commun. Stat. Methods 2010, 40, 40–52.
  28. Toomaj, A.; Doostparast, M. A note on signature-based expressions for the entropy of mixed r-out-of-n systems. Nav. Res. Logist. (NRL) 2014, 61, 202–206.
  29. Toomaj, A. Renyi entropy properties of mixed systems. Commun. Stat.-Theory Methods 2017, 46, 906–916.
  30. Toomaj, A.; Chahkandi, M.; Balakrishnan, N. On the information properties of working used systems using dynamic signature. Appl. Stoch. Model. Bus. Ind. 2021, 37, 318–341.
Figure 1. A coherent system with signature $\mathbf{p} = (0, \tfrac{3}{10}, \tfrac12, \tfrac15, 0)$.
Figure 2. Exact values of $\bar{H}_\alpha(T_t)$ for part (ii) of Example 2 for various values of $k$.
Figure 3. Exact values of $\bar{H}_\alpha(T_t)$ (solid line) and the corresponding lower bounds (21) (dashed line) and (22) (dotted line) for the standard exponential distribution, as functions of time $t$.
