Abstract
In this work, we introduce a generalized measure of cumulative residual entropy and study its properties. We show that several existing measures of entropy, such as cumulative residual entropy, weighted cumulative residual entropy and cumulative residual Tsallis entropy, are all special cases of this generalized cumulative residual entropy. We also propose a measure of generalized cumulative entropy, which includes cumulative entropy, weighted cumulative entropy and cumulative Tsallis entropy as special cases. We discuss a generating function approach through which different entropy measures can be derived. We provide residual and cumulative versions of Sharma–Taneja–Mittal entropy and obtain them as special cases of these generalized measures of entropy. Finally, using the newly introduced entropy measures, we establish some relationships between entropy and extropy measures.
1. Introduction
The uncertainty associated with a random variable can be evaluated using information measures. In many practical situations in lifetime data analysis, experimental physics, econometrics and demography, measuring the uncertainty associated with a random variable is very important. The seminal work on information theory started with the concept of Shannon entropy, or differential entropy, introduced by Shannon (1948) [1]. For an absolutely continuous non-negative random variable X, the differential entropy is given by

$$H(X) = -\int_0^\infty f(x)\log f(x)\,dx,$$

where $f(x)$ is the probability density function of X and “log” stands for the natural logarithm, with $0\log 0$ taken as 0.
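As a quick numerical illustration of this definition, the integral can be evaluated by quadrature and compared with a known closed form: for an exponential density with rate λ, the differential entropy is 1 − log λ. The following minimal sketch (the helper function and variable names are ours, not from the paper) does this in Python:

```python
import numpy as np
from scipy import integrate

def differential_entropy(pdf, upper=np.inf):
    """Evaluate H(X) = -int f(x) log f(x) dx over [0, upper) by quadrature."""
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    value, _ = integrate.quad(integrand, 0, upper)
    return value

lam = 2.0
exp_pdf = lambda x: lam * np.exp(-lam * x)

print(differential_entropy(exp_pdf))   # approx 1 - log(2) = 0.3069
print(1 - np.log(lam))                 # closed form, for comparison
```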
Several measures of entropy have been introduced in the literature since then, each one being suitable for some specific situations. Widely used measures of entropy are the cumulative residual entropy (CRE) [2], the cumulative entropy (CE) [3] and the corresponding weighted measures of Mirali et al. (2016) [4] and Mirali and Baratpour (2017) [5]. A unified formulation of entropy has recently been put forward by Balakrishnan et al. (2022) [6]. For a non-negative random variable X with distribution function $F(x)$, the cumulative residual entropy, which measures the uncertainty in the future lifetime of a system, is defined as

$$\mathcal{E}(X) = -\int_0^\infty \bar F(x)\log \bar F(x)\,dx, \quad (1)$$

where $\bar F(x) = 1 - F(x)$ is the survival function of X. Asadi and Zohrevand (2007) [7] gave a representation of (1) based on the mean residual life function as

$$\mathcal{E}(X) = E\left[m_F(X)\right],$$

where $m_F(t)$ is the mean residual life function of X at time t, given by

$$m_F(t) = E\left[X - t \mid X > t\right] = \frac{1}{\bar F(t)}\int_t^\infty \bar F(x)\,dx.$$
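Under this representation, the CRE is easy to check numerically: compute the integral in (1) directly and compare it with a Monte Carlo estimate of $E[m_F(X)]$. The sketch below (our own check) uses the standard uniform distribution, for which $m_F(t) = (1-t)/2$ and both computations should give 1/4:

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(1)

# Integral form: CRE = -int_0^1 (1 - x) log(1 - x) dx for U(0, 1).
cre, _ = integrate.quad(
    lambda x: -(1 - x) * np.log1p(-x) if x < 1.0 else 0.0, 0, 1)

# Representation E[m_F(X)] with m_F(t) = (1 - t)/2 for U(0, 1).
x = rng.uniform(size=200_000)
print(cre, np.mean((1 - x) / 2))   # both approx 1/4
```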
Di Crescenzo and Longobardi (2009) [3] introduced the cumulative entropy, for estimating the uncertainty in the past lifetime of a system, as

$$\mathcal{CE}(X) = -\int_0^\infty F(x)\log F(x)\,dx. \quad (2)$$

Weighted versions of $\mathcal{E}(X)$ and $\mathcal{CE}(X)$ have been studied in the literature as well. The weighted cumulative residual entropy, introduced by Mirali et al. (2016) [4], is defined as

$$\mathcal{E}^w(X) = -\int_0^\infty x\,\bar F(x)\log \bar F(x)\,dx. \quad (3)$$

Mirali and Baratpour (2017) [5] introduced the weighted cumulative entropy as

$$\mathcal{CE}^w(X) = -\int_0^\infty x\,F(x)\log F(x)\,dx.$$
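For concreteness, all four measures above can be evaluated numerically for the standard uniform distribution; there, the CRE and CE both equal 1/4, while direct integration gives 5/36 and 1/9 for the weighted residual and weighted past versions, respectively (values we computed ourselves as a sanity check). A minimal sketch:

```python
import numpy as np
from scipy import integrate

F = lambda x: x          # distribution function of U(0, 1)
sf = lambda x: 1.0 - x   # survival function of U(0, 1)

def plogp(p):
    """Return p log p with the convention 0 log 0 = 0."""
    return p * np.log(p) if p > 0 else 0.0

cre,  _ = integrate.quad(lambda x: -plogp(sf(x)), 0, 1)       # 1/4
ce,   _ = integrate.quad(lambda x: -plogp(F(x)), 0, 1)        # 1/4
wcre, _ = integrate.quad(lambda x: -x * plogp(sf(x)), 0, 1)   # 5/36
wce,  _ = integrate.quad(lambda x: -x * plogp(F(x)), 0, 1)    # 1/9
print(cre, ce, wcre, wce)
```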
A detailed discussion of weighted entropies can be found in Suhov and Sekeh (2015) [8]. Some additional recent developments in this area are due to [9,10,11,12].
Recently, Kharazmi and Balakrishnan (2020) [13] proposed the Jensen cumulative residual entropy, which is an extension of (1). Kharazmi and Balakrishnan (2020) [14] then studied cumulative residual and relative cumulative residual Fisher information measures and their properties. More general cumulative residual information-generating and relative cumulative residual information-generating measures have been introduced and studied by Kharazmi and Balakrishnan (2021) [15]. The fractional generalized cumulative entropy and its dynamic version have been proposed by Di Crescenzo et al. (2021) [16].
Several extensions of Shannon entropy are available in the literature, obtained by introducing additional parameters so that the resulting measures become sensitive to different characteristics and shapes of probability distributions. One important generalization of Shannon entropy is due to Tsallis (1988) [17], known as the generalized Tsallis entropy of order α. For a continuous random variable X, the generalized Tsallis entropy of order α is defined as [17]

$$H_\alpha(X) = \frac{1}{\alpha - 1}\left(1 - \int_0^\infty f^\alpha(x)\,dx\right), \quad \alpha > 0,\ \alpha \neq 1.$$

Many extensions and modifications of $H_\alpha(X)$ have also been provided. Sati and Gupta (2015) [18] proposed a cumulative Tsallis residual entropy of order α, and Rajesh and Sunoj (2019) [19] modified it and defined the cumulative residual Tsallis entropy of order α as

$$CRT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty \left(\bar F(x) - \bar F^\alpha(x)\right)dx, \quad \alpha > 0,\ \alpha \neq 1. \quad (5)$$
For a non-negative continuous random variable X, Chakraborty and Pradhan (2021) [20] defined the weighted cumulative residual Tsallis entropy (WCRTE) of order α as

$$WCRT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty x\left(\bar F(x) - \bar F^\alpha(x)\right)dx, \quad \alpha > 0,\ \alpha \neq 1. \quad (6)$$
They also introduced the dynamic weighted cumulative residual Tsallis entropy of order α.
Calì et al. (2017) [9] introduced the cumulative Tsallis past entropy as

$$CT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty \left(F(x) - F^\alpha(x)\right)dx, \quad \alpha > 0,\ \alpha \neq 1.$$
Calì et al. (2021) [21] subsequently introduced a family of mean past weighted entropies of order α, using the concept of mean inactivity time. Chakraborty and Pradhan (2021) [20] defined the weighted cumulative Tsallis entropy (WCTE) of order α as

$$WCT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty x\left(F(x) - F^\alpha(x)\right)dx, \quad \alpha > 0,\ \alpha \neq 1.$$
They also studied the dynamic weighted cumulative Tsallis entropy of order α. As is evident from the description above, numerous entropy measures are available in the literature. Recently, Balakrishnan et al. (2022) [22] provided a unified formulation of entropy and demonstrated its applications.
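Before proceeding, a quick numerical sanity check of the cumulative residual Tsallis entropy, under the integral form stated above, may be helpful: for the standard uniform distribution, $CRT_\alpha$ should approach the CRE value 1/4 as α → 1. A minimal sketch:

```python
import numpy as np
from scipy import integrate

sf = lambda x: 1.0 - x   # survival function of U(0, 1)

def crt(alpha):
    """Cumulative residual Tsallis entropy of order alpha for U(0, 1)."""
    integrand = lambda x: (sf(x) - sf(x) ** alpha) / (alpha - 1.0)
    value, _ = integrate.quad(integrand, 0, 1)
    return value

for alpha in [0.5, 0.9, 0.99, 1.01, 2.0]:
    print(alpha, crt(alpha))   # approaches 1/4 (the CRE) as alpha -> 1
```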
In the present work, we define a generalized cumulative residual entropy and study its properties in Section 2. We show that cumulative residual entropy, weighted cumulative residual entropy, cumulative residual Tsallis entropy and weighted cumulative residual Tsallis entropy are all special cases of the proposed measure. We also propose a new generalized cumulative entropy measure and discuss some of its properties. We use the generating function approach to obtain some new entropy measures. In Section 3, we provide cumulative (residual) versions of Sharma–Taneja–Mittal entropy and obtain them as special cases of the generalized measures of entropy introduced in Section 2. In Section 4, we establish some relationships between entropy and extropy measures. Finally, we make some concluding remarks in Section 5.
2. Generalized Cumulative Entropy
In this section, we introduce generalized cumulative (residual) entropy measures. We then show that several entropy measures are special cases of the proposed entropies. Some generalizations of the CRE and CE have been discussed in the literature, and we now review these briefly. Drissi et al. (2008) [23] generalized the definition of the CRE, given by Rao et al. (2004) [2], to the case of distributions with general support. They also showed that this generalized CRE can be used as an alternative to differential entropy. Kayal (2016) [24] introduced a generalization of the CE proposed by Di Crescenzo and Longobardi (2009) [3], along with its dynamic version; this definition is related to lower records and the reversed relevation transform. Psarrakos and Navarro (2013) [25] proposed a generalized cumulative residual entropy (GCRE), related to record values from a sequence of independent and identically distributed random variables and to the relevation transform. Some properties and applications of the GCRE in actuarial risk measures have been discussed by Psarrakos and Toomaj (2017) [26]. Under some assumptions, Navarro and Psarrakos (2017) [27] proved that the GCRE function of a fixed order n uniquely determines the distribution function. Consequently, some characterizations of particular probability models have been obtained from this general result. Di Crescenzo and Toomaj (2017) [28] obtained some further results associated with the generalized cumulative entropy, such as stochastic orders, bounds and characterization results. Some characterizations for the dynamic generalized cumulative entropy have also been derived. Recently, Di Crescenzo et al. (2021) [16] proposed the fractional generalized cumulative entropy and its dynamic version.
We introduce here generalized CRE and CE, which encompass most of the existing variations of these measures, as demonstrated below.
2.1. Generalized Cumulative Residual Entropy
Let X be a non-negative random variable with absolutely continuous distribution function $F(x)$ and survival function $\bar F(x) = 1 - F(x)$. We assume that the mean $\mu = E(X)$ is finite.
Definition 1.
Let X be a non-negative random variable with absolutely continuous distribution function F and density function f. Further, let $\phi(\cdot)$ be a function of X and $w(\cdot)$ be a weight function. Then, the generalized cumulative residual entropy of X is defined as

$$\mathcal{E}_{w,\phi}(X) = E\left[\frac{w(X)\,\phi(X)\,\bar F(X)}{f(X)}\right], \quad (9)$$

where w and ϕ can be chosen arbitrarily, subject to the existence of the above expectation, such that $\mathcal{E}_{w,\phi}(X)$ becomes concave.
Entropy is defined as a measure of the uncertainty associated with a model. Strict concavity implies that entropy will increase under averaging. The general definition of entropy given above contains two arbitrary functions, $\phi$ and w, and though it is difficult to state general conditions under which the concavity of the general measure will hold, one can choose these functions so that $\mathcal{E}_{w,\phi}(X)$ becomes concave.
Moreover, with different choices of the weight function w and the function $\phi$, we can obtain several measures of entropy. First, we show that the new measure reduces to the cumulative residual entropy of Rao et al. (2004) [2] and to the weighted cumulative residual entropy of Mirali et al. (2016) [4] for specific choices of $\phi$ and w.
Using integration by parts, from (1), we obtain (see [29])

$$\mathcal{E}(X) = E\left[\frac{\bar F(X)\,\Lambda(X)}{f(X)}\right],$$

where $\Lambda$ denotes the cumulative hazard function. Let us denote the hazard rate of X by

$$\lambda(t) = \frac{f(t)}{\bar F(t)},$$

where f is the density function of X. Then, the cumulative hazard function can be expressed as

$$\Lambda(t) = \int_0^t \lambda(x)\,dx = -\log \bar F(t).$$

Now, with $\phi(x) = \Lambda(x)$ and $w(x) = 1$, the expression in (9) becomes

$$\mathcal{E}_{w,\phi}(X) = E\left[\frac{\bar F(X)\,\Lambda(X)}{f(X)}\right] = -\int_0^\infty \bar F(x)\log \bar F(x)\,dx.$$
Thus, $\mathcal{E}_{w,\phi}(X)$ reduces in this case to the cumulative residual entropy of Rao et al. (2004) [2].
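Written as an expectation, this representation also suggests a simple Monte Carlo estimate of the CRE: average $\bar F(X_i)\Lambda(X_i)/f(X_i)$ over a sample. The sketch below (our own illustration, assuming the expectation form as reconstructed above) does this for the standard exponential, whose CRE equals 1:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(size=500_000)   # sample from the standard exponential

# For Exp(1): sf(x) = f(x) = exp(-x) and Lambda(x) = x, so the summand
# sf(X) * Lambda(X) / f(X) simplifies to X itself.
estimate = np.mean(np.exp(-x) * x / np.exp(-x))
print(estimate)   # approx 1, the CRE of the standard exponential
```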
The weighted cumulative residual entropy defined in (3) can be written as [29]

$$\mathcal{E}^w(X) = E\left[\frac{X\,\bar F(X)\,\Lambda(X)}{f(X)}\right].$$

If we choose $\phi(x) = \Lambda(x)$ and $w(x) = x$ and proceed as above, we can show that (9) becomes

$$\mathcal{E}_{w,\phi}(X) = -\int_0^\infty x\,\bar F(x)\log \bar F(x)\,dx.$$

Thus, $\mathcal{E}_{w,\phi}(X)$ reduces in this case to the weighted cumulative residual entropy, $\mathcal{E}^w(X)$.
Next, we show that the cumulative residual Tsallis entropy of order α is a special case of $\mathcal{E}_{w,\phi}(X)$. An alternative representation of $CRT_\alpha(X)$ is [19]

$$CRT_\alpha(X) = E\left[\frac{\bar F(X)\,\Lambda_\alpha(X)}{f(X)}\right],$$

where $\Lambda_\alpha(x) = \dfrac{1 - \bar F^{\alpha-1}(x)}{\alpha - 1}$. If we now choose $\phi(x) = \Lambda_\alpha(x)$ and $w(x) = 1$, (9) becomes

$$\mathcal{E}_{w,\phi}(X) = \frac{1}{\alpha - 1}\int_0^\infty \left(\bar F(x) - \bar F^\alpha(x)\right)dx = CRT_\alpha(X).$$
Chakraborty and Pradhan (2021) [20] expressed the weighted cumulative residual Tsallis entropy of order α in (6) as

$$WCRT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty x\,\bar F(x)\left(1 - \bar F^{\alpha-1}(x)\right)dx. \quad (10)$$

Upon noting that the integral

$$\int_0^\infty x\,\bar F(x)\left(1 - \bar F^{\alpha-1}(x)\right)dx = E\left[\frac{X\,\bar F(X)\left(1 - \bar F^{\alpha-1}(X)\right)}{f(X)}\right],$$

(10) becomes

$$WCRT_\alpha(X) = E\left[\frac{X\,\bar F(X)\,\Lambda_\alpha(X)}{f(X)}\right].$$

Now, for the choices $\phi(x) = \Lambda_\alpha(x)$ and $w(x) = x$, from (9), we obtain the above expression. Thus, $WCRT_\alpha(X)$ is a special case of $\mathcal{E}_{w,\phi}(X)$ as well. The special cases of $\mathcal{E}_{w,\phi}(X)$ discussed here are all listed in Table 1.
Table 1.
Special cases of the generalized cumulative residual entropy $\mathcal{E}_{w,\phi}(X)$, with $\Lambda(x) = -\log\bar F(x)$ and $\Lambda_\alpha(x) = \left(1 - \bar F^{\alpha-1}(x)\right)/(\alpha-1)$.

| Measure | w(x) | ϕ(x) |
| Cumulative residual entropy [2] | 1 | $\Lambda(x)$ |
| Weighted cumulative residual entropy [4] | x | $\Lambda(x)$ |
| Cumulative residual Tsallis entropy [19] | 1 | $\Lambda_\alpha(x)$ |
| Weighted cumulative residual Tsallis entropy [20] | x | $\Lambda_\alpha(x)$ |
Next, we derive expressions for $\mathcal{E}_{w,\phi}(X)$ for some specific distributions.
Consider the exponential distribution with mean $1/\lambda$. Then, it is well known that the mean residual life is equal to the mean. Thus, when $\phi(x) = \Lambda(x)$ and $w(x) = 1$, we have

$$\mathcal{E}_{w,\phi}(X) = E\left[m_F(X)\right] = \frac{1}{\lambda}.$$

In general, $\mathcal{E}_{w,\phi}(X)$ is a constant for any weight function. For the standard exponential, taking $\phi(x) = \Lambda(x)$ and $w(x) = x^n$, we have

$$\mathcal{E}_{w,\phi}(X) = E\left[X^n\,\Lambda(X)\,\frac{\bar F(X)}{f(X)}\right] = E\left[X^{n+1}\right] = \Gamma(n+2).$$

Thus, $\mathcal{E}_{w,\phi}(X) = 1$ when $n = 0$ and $\mathcal{E}_{w,\phi}(X) = 2$ when $n = 1$.
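Both values are easy to confirm numerically from the integral forms (1) and (3), noting that $-\bar F(x)\log\bar F(x) = x\,e^{-x}$ for the standard exponential:

```python
import numpy as np
from scipy import integrate

# For the standard exponential, -sf(x) log sf(x) = x exp(-x).
cre,  _ = integrate.quad(lambda x: x * np.exp(-x), 0, np.inf)       # 1
wcre, _ = integrate.quad(lambda x: x * x * np.exp(-x), 0, np.inf)   # 2
print(cre, wcre)
```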
Next, let us consider the standard uniform distribution with pdf $f(x) = 1$, $0 \le x \le 1$. Then,

$$m_F(t) = \frac{1 - t}{2}, \quad 0 \le t \le 1.$$

We thus obtain the residual entropy as $\mathcal{E}_{w,\phi}(X) = E\left[(1 - X)/2\right] = 1/4$. Additionally, when $\phi(x) = \Lambda(x)$ and $w(x) = x$, we get

$$\mathcal{E}_{w,\phi}(X) = -\int_0^1 x\,(1 - x)\log(1 - x)\,dx = \frac{5}{36},$$

as given in Example 5 of Balakrishnan et al. (2022) [29].
2.2. Generalized Cumulative Entropy
In this sub-section, we introduce a generalized cumulative entropy and discuss some of its properties.
Definition 2.
Let X be a non-negative random variable with absolutely continuous distribution function F and density function f. Further, let $\phi(\cdot)$ be a function of X and $w(\cdot)$ be a weight function. Then, the generalized cumulative entropy is defined as

$$\mathcal{CE}_{w,\phi}(X) = E\left[\frac{w(X)\,\phi(X)\,F(X)}{f(X)}\right], \quad (11)$$

where w and ϕ can be chosen arbitrarily, subject to the existence of the above expectation, such that $\mathcal{CE}_{w,\phi}(X)$ becomes concave.
With appropriate choices of $\phi$ and w, (11) reduces to the cumulative entropy in (2) [3] and, similarly, to the weighted cumulative entropy of Mirali and Baratpour (2017) [5], as we now demonstrate.
The reversed hazard rate function of X, denoted by $\tilde\lambda(t)$, is defined as

$$\tilde\lambda(t) = \frac{f(t)}{F(t)},$$

which yields the cumulative reversed hazard rate function as

$$\tilde\Lambda(t) = \int_t^\infty \tilde\lambda(x)\,dx = -\log F(t).$$
The cumulative entropy in (2) can be expressed as (see [29])

$$\mathcal{CE}(X) = E\left[\frac{F(X)\,\tilde\Lambda(X)}{f(X)}\right].$$

Thus, by using the cumulative reversed hazard rate function, we can express the cumulative entropy as the special case of the generalized cumulative entropy in (11) corresponding to the choices $\phi(x) = \tilde\Lambda(x)$ and $w(x) = 1$. Proceeding similarly, we can show that $\mathcal{CE}_{w,\phi}(X)$ reduces to the weighted cumulative entropy [5] for the choices $\phi(x) = \tilde\Lambda(x)$ and $w(x) = x$.
Next, we show that the cumulative Tsallis entropy of order α is a special case of $\mathcal{CE}_{w,\phi}(X)$ in (11). The mean inactivity time function of a random variable X, at time x, is defined as

$$\tilde m(x) = E\left[x - X \mid X \le x\right] = \frac{1}{F(x)}\int_0^x F(t)\,dt.$$

Using $\tilde m(x)$, $CT_\alpha(X)$ can be expressed as [9]

$$CT_\alpha(X) = E\left[F^{\alpha-1}(X)\,\tilde m(X)\right].$$

Now, for the choices $\phi(x) = \tilde\Lambda_\alpha(x) = \dfrac{1 - F^{\alpha-1}(x)}{\alpha - 1}$ and $w(x) = 1$, (11) yields

$$\mathcal{CE}_{w,\phi}(X) = \frac{1}{\alpha - 1}\int_0^\infty \left(F(x) - F^\alpha(x)\right)dx = CT_\alpha(X).$$
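The agreement between the integral form of $CT_\alpha$ and its mean-inactivity-time representation can be verified numerically; the sketch below (our own check, under the expressions as written above) uses the standard uniform distribution, for which $\tilde m(x) = x/2$:

```python
import numpy as np
from scipy import integrate

alpha = 2.0
F = lambda x: x          # distribution function of U(0, 1)
mit = lambda x: x / 2.0  # mean inactivity time of U(0, 1)

# Integral form: (1/(alpha - 1)) int (F - F^alpha) dx.
ct_int, _ = integrate.quad(lambda x: (F(x) - F(x) ** alpha) / (alpha - 1), 0, 1)

# Representation E[F^(alpha-1)(X) mit(X)], with density f(x) = 1 on [0, 1].
ct_rep, _ = integrate.quad(lambda x: F(x) ** (alpha - 1) * mit(x), 0, 1)
print(ct_int, ct_rep)   # both equal 1/6 for alpha = 2
```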
An alternative expression for the weighted cumulative Tsallis entropy of order α is given by [20]

$$WCT_\alpha(X) = \frac{1}{\alpha - 1}\int_0^\infty x\,F(x)\left(1 - F^{\alpha-1}(x)\right)dx.$$

As in Section 2.1, simple algebraic manipulations yield

$$WCT_\alpha(X) = E\left[\frac{X\,F(X)\,\tilde\Lambda_\alpha(X)}{f(X)}\right].$$

Again, for the choices $\phi(x) = \tilde\Lambda_\alpha(x)$ and $w(x) = x$, (11) yields the above expression. Thus, $WCT_\alpha(X)$ is a special case of $\mathcal{CE}_{w,\phi}(X)$. In Table 2, we list the cumulative entropies derived from $\mathcal{CE}_{w,\phi}(X)$.
Table 2.
Special cases of the generalized cumulative entropy $\mathcal{CE}_{w,\phi}(X)$, with $\tilde\Lambda(x) = -\log F(x)$ and $\tilde\Lambda_\alpha(x) = \left(1 - F^{\alpha-1}(x)\right)/(\alpha-1)$.

| Measure | w(x) | ϕ(x) |
| Cumulative entropy [3] | 1 | $\tilde\Lambda(x)$ |
| Weighted cumulative entropy [5] | x | $\tilde\Lambda(x)$ |
| Cumulative Tsallis entropy [9] | 1 | $\tilde\Lambda_\alpha(x)$ |
| Weighted cumulative Tsallis entropy [20] | x | $\tilde\Lambda_\alpha(x)$ |
Next, we derive expressions for $\mathcal{CE}_{w,\phi}(X)$ for some specific distributions. Consider the standard exponential distribution with mean 1. Then, for the choices $\phi(x) = \tilde\Lambda(x)$ and $w(x) = 1$, we have

$$\mathcal{CE}_{w,\phi}(X) = -\int_0^\infty F(x)\log F(x)\,dx = \frac{\pi^2}{6} - 1.$$

Next, let us consider the standard uniform distribution with pdf $f(x) = 1$, $0 \le x \le 1$. Then, for the choices $\phi(x) = \tilde\Lambda(x)$ and $w(x) = x$, we obtain

$$\mathcal{CE}_{w,\phi}(X) = -\int_0^1 x\,F(x)\log F(x)\,dx = \frac{1}{9}.$$
These two examples have been presented earlier by Balakrishnan et al. (2022) [29].
2.3. Generating Function
We now introduce a generating function related to the generalized entropy measures discussed in the preceding sections.
Definition 3.
Let X be a non-negative random variable with absolutely continuous distribution function F and density function f. Further, let $\phi(\cdot)$ be a function of X and $w(\cdot)$ be a weight function. We then define a generating function for the generalized cumulative residual entropy measure as

$$G_{w,\phi}(t) = E\left[\frac{w(X)\,e^{t\,\phi(X)}\,\bar F(X)}{f(X)}\right]. \quad (13)$$
Being a function of t, we can interpret $G_{w,\phi}(t)$ as a generating function for the general entropy measure introduced in Section 2.1. If we differentiate the expression in (13) with respect to t once, we obtain

$$\frac{\partial}{\partial t}G_{w,\phi}(t) = E\left[\frac{w(X)\,\phi(X)\,e^{t\,\phi(X)}\,\bar F(X)}{f(X)}\right].$$

Now, by setting $t = 0$ in the above expression, we obtain the generalized entropy measure in (9). We, therefore, refer to it as the generalized cumulative residual entropy of order 1. Higher-order derivatives with respect to t would similarly give rise to generalized cumulative residual entropies of orders 2, 3 and so on. For example, the generalized cumulative residual entropy of order 2 is given by

$$\left.\frac{\partial^2}{\partial t^2}G_{w,\phi}(t)\right|_{t=0} = E\left[\frac{w(X)\,\phi^2(X)\,\bar F(X)}{f(X)}\right].$$
For an appropriate choice of $\phi$ and w, this order-2 measure reduces to the weighted cumulative residual entropy of Mirali et al. (2016) [4].
In a similar manner, we define the generating function for the generalized cumulative entropy as follows.
Definition 4.
Let X be a non-negative random variable with absolutely continuous distribution function F and density function f. Further, let $\phi(\cdot)$ be a function of X and $w(\cdot)$ be a weight function. We then define the generating function for the generalized cumulative entropy as

$$\tilde G_{w,\phi}(t) = E\left[\frac{w(X)\,e^{t\,\phi(X)}\,F(X)}{f(X)}\right]. \quad (15)$$

Once again, differentiating (15) with respect to t once and setting $t = 0$, we obtain the generalized cumulative entropy (of order 1) measure in (11). From (15), we can similarly obtain the generalized cumulative entropy of order 2 to be

$$\left.\frac{\partial^2}{\partial t^2}\tilde G_{w,\phi}(t)\right|_{t=0} = E\left[\frac{w(X)\,\phi^2(X)\,F(X)}{f(X)}\right]. \quad (16)$$

The weighted cumulative entropy of Mirali and Baratpour (2017) [5] can be obtained from (16) for an appropriate choice of $\phi$ and w. Naturally, higher-order derivatives with respect to t would give rise to generalized cumulative entropies of orders 3, 4 and so on.
3. Sharma–Taneja–Mittal Entropy
In this section, we introduce the cumulative (residual) versions of the Sharma–Taneja–Mittal (STM) entropy. We then show that these are indeed special cases of the generalized residual and cumulative entropy measures introduced in Section 2.
Sharma and Taneja (1975) [30] and Mittal (1975) [31] independently introduced an entropy of the form

$$H_{\alpha,\beta}(X) = \frac{1}{\beta - \alpha}\left(\int_0^\infty f^\alpha(x)\,dx - \int_0^\infty f^\beta(x)\,dx\right), \quad \alpha \neq \beta. \quad (17)$$

For different choices of α and β, (17) yields several entropy measures discussed in the literature. In particular, for $\alpha = 1 + \kappa$ and $\beta = 1 - \kappa$, we obtain the Kaniadakis entropy [32]

$$S_\kappa(X) = -\int_0^\infty \frac{f^{1+\kappa}(x) - f^{1-\kappa}(x)}{2\kappa}\,dx$$

as a special case of the Sharma–Taneja–Mittal entropy. For more details, see [33], along with Table 1 of Ilić et al. (2021) [34].
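Taking (17) at face value, a quick numerical check confirms both the Kaniadakis special case and the Shannon limit as α, β → 1. The sketch below (our own illustration) uses the standard exponential density, whose Shannon entropy is 1:

```python
import numpy as np
from scipy import integrate

f = lambda x: np.exp(-x)   # standard exponential density

def stm(alpha, beta):
    """STM entropy: (int f^alpha dx - int f^beta dx) / (beta - alpha)."""
    ia, _ = integrate.quad(lambda x: f(x) ** alpha, 0, np.inf)
    ib, _ = integrate.quad(lambda x: f(x) ** beta, 0, np.inf)
    return (ia - ib) / (beta - alpha)

kappa = 0.3
print(stm(1 + kappa, 1 - kappa))   # Kaniadakis entropy; 1/(1 - kappa^2) here
print(stm(1.001, 0.999))           # approx the Shannon entropy of Exp(1) = 1
```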
3.1. Sharma–Taneja–Mittal Cumulative Residual Entropy
In this sub-section, we introduce the cumulative residual version of the STM entropy.
Definition 5.
Let X be a non-negative random variable with absolutely continuous survival function $\bar F(x)$. Then, the cumulative residual STM entropy is defined as

$$\mathcal{E}_{\alpha,\beta}(X) = \frac{1}{\beta - \alpha}\int_0^\infty \left(\bar F^\alpha(x) - \bar F^\beta(x)\right)dx, \quad \alpha \neq \beta.$$
We also introduce the weighted cumulative residual STM entropy as follows.
Definition 6.
Let X be a non-negative random variable with absolutely continuous survival function $\bar F(x)$. Then, the cumulative weighted residual STM entropy is defined as

$$\mathcal{E}^w_{\alpha,\beta}(X) = \frac{1}{\beta - \alpha}\int_0^\infty x\left(\bar F^\alpha(x) - \bar F^\beta(x)\right)dx, \quad \alpha \neq \beta.$$
3.2. Sharma–Taneja–Mittal Cumulative Entropy
In this sub-section, we introduce cumulative and weighted cumulative STM entropies, and then show that they are indeed special cases of the generalized entropy.
Definition 7.
Let X be a non-negative random variable with absolutely continuous distribution function F. Then, the cumulative STM entropy is defined as

$$\mathcal{CE}_{\alpha,\beta}(X) = \frac{1}{\beta - \alpha}\int_0^\infty \left(F^\alpha(x) - F^\beta(x)\right)dx, \quad \alpha \neq \beta.$$
In this case, the weighted cumulative STM entropy is defined as follows.
Definition 8.
Let X be a non-negative random variable with absolutely continuous distribution function F. Then, the cumulative weighted STM entropy is defined as

$$\mathcal{CE}^w_{\alpha,\beta}(X) = \frac{1}{\beta - \alpha}\int_0^\infty x\left(F^\alpha(x) - F^\beta(x)\right)dx, \quad \alpha \neq \beta.$$
Of course, in the special cases of $\alpha = 1 + \kappa$ and $\beta = 1 - \kappa$, the above definitions would result in the corresponding generalized versions of the Kaniadakis entropy. Following the same steps as those used to obtain alternative expressions for $\mathcal{E}(X)$ and $\mathcal{CE}(X)$, we can express $\mathcal{E}_{\alpha,\beta}(X)$ and $\mathcal{CE}_{\alpha,\beta}(X)$, respectively, as

$$\mathcal{E}_{\alpha,\beta}(X) = E\left[\frac{\bar F(X)\,\Lambda_{\alpha,\beta}(X)}{f(X)}\right] \quad (26)$$

and

$$\mathcal{CE}_{\alpha,\beta}(X) = E\left[\frac{F(X)\,\tilde\Lambda_{\alpha,\beta}(X)}{f(X)}\right], \quad (27)$$

where $\Lambda_{\alpha,\beta}(x) = \dfrac{\bar F^{\alpha-1}(x) - \bar F^{\beta-1}(x)}{\beta - \alpha}$ and $\tilde\Lambda_{\alpha,\beta}(x) = \dfrac{F^{\alpha-1}(x) - F^{\beta-1}(x)}{\beta - \alpha}$. Now, taking $w(x) = 1$, we obtain $\mathcal{E}_{\alpha,\beta}(X)$ in (26) from (9) with $\phi(x) = \Lambda_{\alpha,\beta}(x)$, and $\mathcal{CE}_{\alpha,\beta}(X)$ in (27) from (11) with $\phi(x) = \tilde\Lambda_{\alpha,\beta}(x)$, respectively.
4. Connection between Entropy and Extropy
Apart from entropy, extropy and its properties have also been studied for quantifying the uncertainty associated with a random variable X. Using the new entropy measures introduced in the preceding sections, we establish some relationships between entropy and extropy measures in this section.
For a non-negative random variable X, extropy is defined as [35]

$$J(X) = -\frac{1}{2}\int_0^\infty f^2(x)\,dx.$$
Now, we briefly discuss some recent developments associated with extropy measures. Jahanshahi et al. [36] defined the cumulative residual extropy as

$$\xi J(X) = -\frac{1}{2}\int_0^\infty \bar F^2(x)\,dx, \quad (28)$$

and the cumulative extropy is defined as [37]

$$\bar\xi J(X) = -\frac{1}{2}\int_0^\infty \left(1 - F^2(x)\right)dx. \quad (29)$$
Sudheesh and Sreedevi (2022) [38] discussed non-parametric estimation of $\xi J(X)$ and $\bar\xi J(X)$ for right-censored data.
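Two identities are useful here: for independent copies $X_1$ and $X_2$ of X, $\int_0^\infty \bar F^2(x)\,dx = E[\min(X_1, X_2)]$ and $\int_0^\infty (1 - F^2(x))\,dx = E[\max(X_1, X_2)]$, so that $\xi J(X) = -\frac{1}{2}E[\min(X_1, X_2)]$ and $\bar\xi J(X) = -\frac{1}{2}E[\max(X_1, X_2)]$. A small simulation check for the standard exponential (our own sketch):

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(3)
sf = lambda x: np.exp(-x)         # survival function of Exp(1)
F = lambda x: 1.0 - np.exp(-x)    # distribution function of Exp(1)

cr_ext, _ = integrate.quad(lambda x: -0.5 * sf(x) ** 2, 0, np.inf)         # -1/4
c_ext,  _ = integrate.quad(lambda x: -0.5 * (1.0 - F(x) ** 2), 0, np.inf)  # -3/4

x1, x2 = rng.exponential(size=(2, 400_000))
print(cr_ext, -0.5 * np.minimum(x1, x2).mean())   # both approx -1/4
print(c_ext,  -0.5 * np.maximum(x1, x2).mean())   # both approx -3/4
```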
Recently, Balakrishnan et al. [29], Bansal and Gupta [39] and Sathar and Nair [40,41,42] introduced different weighted versions of extropy. The weighted version of the survival extropy is given by [41]

$$J^w_s(X) = -\frac{1}{2}\int_0^\infty x\,\bar F^2(x)\,dx.$$

These authors also introduced the weighted version of the cumulative extropy as

$$\bar\xi J^w(X) = -\frac{1}{2}\int_0^\infty x\left(1 - F^2(x)\right)dx,$$
and Sathar and Nair [42] subsequently defined the dynamic survival extropy as

$$J_s(X;t) = -\frac{1}{2\,\bar F^2(t)}\int_t^\infty \bar F^2(x)\,dx. \quad (30)$$
For various properties of $J_s(X;t)$, one may see [36]. Sudheesh and Sreedevi (2022) [38] proposed simple alternative expressions for different extropy measures. Using these expressions, they established relationships between different dynamic and weighted extropy measures and reliability concepts. In particular, they expressed $J_s(X;t)$ as

$$J_s(X;t) = -\frac{1}{2}\,E\left[\min(X_1, X_2) - t \mid \min(X_1, X_2) > t\right],$$

where $X_1$ and $X_2$ are independent copies of X. Thus, $-2\,J_s(X;t)$ is the mean residual life function of a series system having two identical components.
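This identification is easy to confirm by simulation: estimate the mean residual life of $\min(X_1, X_2)$ directly and compare it with $-2 J_s(X;t)$ computed from the survival function. A minimal sketch for the standard exponential, where the series system is Exp(2) and hence has constant mean residual life 1/2:

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(11)
sf = lambda x: np.exp(-x)   # survival function of Exp(1)
t = 0.7

# -2 J_s(X;t) computed from the dynamic survival extropy definition.
tail, _ = integrate.quad(lambda x: sf(x) ** 2, t, np.inf)
print(tail / sf(t) ** 2)   # 1/2 for the exponential, at any t

# Mean residual life of the series system Z = min(X1, X2), by simulation.
z = np.minimum(*rng.exponential(size=(2, 400_000)))
print((z[z > t] - t).mean())   # approx 1/2
```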
Sathar and Nair [41] defined the weighted dynamic survival extropy as

$$J^w_s(X;t) = -\frac{1}{2\,\bar F^2(t)}\int_t^\infty x\,\bar F^2(x)\,dx.$$
Kundu (2021) [43] introduced the dynamic cumulative extropy as

$$\bar J_s(X;t) = -\frac{1}{2\,F^2(t)}\int_0^t F^2(x)\,dx, \quad (31)$$
while Sathar and Nair [41] defined the weighted dynamic cumulative extropy as

$$\bar J^w_s(X;t) = -\frac{1}{2\,F^2(t)}\int_0^t x\,F^2(x)\,dx.$$
Sudheesh and Sreedevi (2022) [38] expressed $\bar J_s(X;t)$ as

$$\bar J_s(X;t) = -\frac{1}{2}\,E\left[t - \max(X_1, X_2) \mid \max(X_1, X_2) \le t\right].$$

Thus, $-2\,\bar J_s(X;t)$ is the mean past life function of a parallel system having two identical components, where the mean past life function of a random variable X is defined as

$$\tilde m(t) = E\left[t - X \mid X \le t\right].$$
Next, we establish some connections between different entropy and extropy measures. For the choices $\phi(x) = \bar F(x)$ and $w(x) = 1$, from (9), we obtain

$$\mathcal{E}_{w,\phi}(X) = E\left[\frac{\bar F^2(X)}{f(X)}\right] = \int_0^\infty \bar F^2(x)\,dx.$$

Thus, using (28), we have the relationship

$$\mathcal{E}_{w,\phi}(X) = -2\,\xi J(X).$$
Again, for the choices $\phi(x) = \bar F(x)$ and $w(x) = x$, from (9), we obtain

$$\mathcal{E}_{w,\phi}(X) = E\left[\frac{X\,\bar F^2(X)}{f(X)}\right] = \int_0^\infty x\,\bar F^2(x)\,dx.$$

For a non-negative random variable X, we have

$$E\left[\min(X_1, X_2)^2\right] = 2\int_0^\infty x\,\bar F^2(x)\,dx.$$

Thus, in this case, we obtain the relationship

$$\mathcal{E}_{w,\phi}(X) = -2\,J^w_s(X) = \frac{1}{2}\,E\left[\min(X_1, X_2)^2\right].$$
For the choices $\phi(x) = \dfrac{1 - F^2(x)}{F(x)}$ and $w(x) = 1$, from (11), we obtain

$$\mathcal{CE}_{w,\phi}(X) = E\left[\frac{1 - F^2(X)}{f(X)}\right] = \int_0^\infty \left(1 - F^2(x)\right)dx. \quad (32)$$

Using the identity $\int_0^\infty \left(1 - F^2(x)\right)dx = E\left[\max(X_1, X_2)\right]$, from (29) and (32), we have the relationship

$$\mathcal{CE}_{w,\phi}(X) = -2\,\bar\xi J(X) = E\left[\max(X_1, X_2)\right].$$
Additionally, for the choices $\phi(x) = \dfrac{1 - F^2(x)}{F(x)}$ and $w(x) = x$, from (11), we obtain

$$\mathcal{CE}_{w,\phi}(X) = E\left[\frac{X\left(1 - F^2(X)\right)}{f(X)}\right] = \int_0^\infty x\left(1 - F^2(x)\right)dx.$$

Thus, in this case, we obtain the relationship

$$\mathcal{CE}_{w,\phi}(X) = -2\,\bar\xi J^w(X).$$
Let $X_1$ and $X_2$ be two independent random variables having the same distribution function F. Let $Z = \min(X_1, X_2)$ be the lifetime of a series system having two identical components. Using (9), we define the generalized residual entropy associated with Z as

$$\mathcal{E}_{w,\phi}(Z) = E\left[\frac{w(Z)\,\phi(Z)\,\bar F_Z(Z)}{f_Z(Z)}\right], \quad (33)$$

where $\phi(\cdot)$ is a function of Z and $w(\cdot)$ is a weight function. Now, for the choice $\phi(z) = \lambda_Z(z)\,m_Z(z)$, with $\lambda_Z$ and $m_Z$ denoting the hazard rate and mean residual life functions of Z, and an arbitrary weight function w, from (33), we obtain

$$\mathcal{E}_{w,\phi}(Z) = E\left[w(Z)\,m_Z(Z)\right] = -2\,E\left[w(Z)\,J_s(X; Z)\right].$$

Thus, the generalized residual entropy associated with Z is a weighted average of the dynamic survival extropy in (30).
Next, let $Z = \max(X_1, X_2)$ be the lifetime of a parallel system having two identical components. Then, the generalized cumulative entropy associated with Z is defined as

$$\mathcal{CE}_{w,\phi}(Z) = E\left[\frac{w(Z)\,\phi(Z)\,F_Z(Z)}{f_Z(Z)}\right]. \quad (34)$$

Again, for the choice $\phi(z) = \tilde\lambda_Z(z)\,\tilde m_Z(z)$, with $\tilde\lambda_Z$ and $\tilde m_Z$ denoting the reversed hazard rate and mean past life functions of Z, and an arbitrary weight function w, from (34), we obtain

$$\mathcal{CE}_{w,\phi}(Z) = E\left[w(Z)\,\tilde m_Z(Z)\right] = -2\,E\left[w(Z)\,\bar J_s(X; Z)\right].$$

Thus, the generalized cumulative entropy associated with Z is a weighted average of the dynamic cumulative extropy in (31).
5. Concluding Remarks
In this work, we have introduced two general measures of entropy, viz., generalized cumulative residual entropy and generalized cumulative entropy. Several entropy measures known in the literature were all shown to be special cases of these generalized measures. Cumulative residual entropy, weighted cumulative residual entropy, cumulative residual Tsallis entropy and weighted cumulative residual Tsallis entropy are all special cases of the generalized cumulative residual entropy. Cumulative entropy, weighted cumulative entropy, cumulative Tsallis entropy and weighted cumulative Tsallis entropy are all special cases of the generalized cumulative entropy.
We have presented a generating function approach to obtain generalized measures of higher order. We have shown that the generalized cumulative residual entropy of order two reduces to the weighted cumulative residual entropy of Mirali et al. (2016) [4]. Moreover, the weighted cumulative entropy of Mirali and Baratpour (2017) [5] is a special case of the generalized cumulative entropy of order two. We have also established some relationships between entropy and extropy measures.
In the information theory literature, conditional entropy is the amount of information required to describe the outcome of one random variable Y, given the value of another random variable X. Conditional entropy, as a measure of information, can be defined through any entropy measure, such as the Shannon entropy measure (denoted by H). The conditional entropy defined through the Shannon entropy measure, for example, is given by

$$H(Y \mid X) = -\int\int f(x, y)\log f(y \mid x)\,dy\,dx,$$

where $f(x, y)$ is the joint density of (X, Y) and $f(y \mid x)$ is the conditional density of Y given X = x.
In this way, we can define conditional entropy measures even in a generalized form. The generalized versions introduced in the present work can thus be extended to conditional entropy notions; we plan to carry out a detailed study of this in our future work. It will also be of interest to develop inferential methods for these measures. We are currently working in these directions and hope to report the findings in a future paper.
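To illustrate with a concrete case: for a bivariate normal pair with correlation ρ, the conditional Shannon entropy has the closed form $H(Y \mid X) = \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2 (1 - \rho^2)\right)$, which a Monte Carlo average of $-\log f(Y \mid X)$ recovers. A small sketch of this standard fact (our own illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
rho, sigma_y = 0.6, 1.5
n = 400_000

# Bivariate normal: X ~ N(0, 1), Y | X = x ~ N(rho sigma_y x, sigma_y^2 (1 - rho^2)).
x = rng.normal(size=n)
cond_var = sigma_y ** 2 * (1 - rho ** 2)
y = rho * sigma_y * x + np.sqrt(cond_var) * rng.normal(size=n)

# Monte Carlo estimate of H(Y | X) = E[-log f(Y | X)].
log_f = -0.5 * np.log(2 * np.pi * cond_var) - (y - rho * sigma_y * x) ** 2 / (2 * cond_var)
print(-log_f.mean())                              # Monte Carlo estimate
print(0.5 * np.log(2 * np.pi * np.e * cond_var))  # closed form
```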
Author Contributions
Methodology, S.K.K., E.P.S. and N.B. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Acknowledgments
Our sincere thanks go to the anonymous reviewers for their valuable suggestions and comments on an earlier version of this manuscript, which resulted in this improved version.
Conflicts of Interest
The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
| CE | Cumulative entropy |
| CRE | Cumulative residual entropy |
| STM | Sharma–Taneja–Mittal |
| WCRTE | Weighted cumulative residual Tsallis entropy |
| WCTE | Weighted cumulative Tsallis entropy |
References
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
- Rao, M.; Chen, Y.; Vemuri, B.; Wang, F. Cumulative residual entropy: A new measure of information. IEEE Trans. Inf. Theory 2004, 50, 1220–1228.
- Di Crescenzo, A.; Longobardi, M. On cumulative entropies. J. Stat. Plan. Inference 2009, 139, 4072–4087.
- Mirali, M.; Baratpour, S.; Fakoor, V. On weighted cumulative residual entropy. Commun. Stat.-Theory Methods 2016, 46, 2857–2869.
- Mirali, M.; Baratpour, S. Some results on weighted cumulative entropy. J. Iran. Stat. Soc. 2017, 16, 21–32.
- Balakrishnan, N.; Buono, F.; Longobardi, M. A unified formulation of entropy and its application. Phys. A Stat. Mech. Its Appl. 2022, 127214.
- Asadi, M.; Zohrevand, Y. On the dynamic cumulative residual entropy. J. Stat. Plan. Inference 2007, 137, 1931–1941.
- Suhov, Y.; Sekeh, S.Y. Weighted cumulative entropies: An extension of CRE and CE. arXiv 2015, arXiv:1507.07051.
- Calì, C.; Longobardi, M.; Ahmadi, J. Some properties of cumulative Tsallis entropy. Phys. A Stat. Mech. Its Appl. 2017, 486, 1012–1021.
- Calì, C.; Longobardi, M.; Navarro, J. Properties for generalized cumulative past measures of information. Probab. Eng. Inform. Sci. 2020, 34, 92–111.
- Tahmasebi, S. Weighted extensions of generalized cumulative residual entropy and their applications. Commun. Stat.-Theory Methods 2020, 49, 5196–5219.
- Toomaj, A.; Di Crescenzo, A. Connections between weighted generalized cumulative residual entropy and variance. Mathematics 2020, 8, 1072.
- Kharazmi, O.; Balakrishnan, N. Jensen-information generating function and its connections to some well-known information measures. Stat. Probab. Lett. 2020, 170, 108995.
- Kharazmi, O.; Balakrishnan, N. Cumulative residual and relative cumulative residual Fisher information and their properties. IEEE Trans. Inf. Theory 2020, 67, 6306–6312.
- Kharazmi, O.; Balakrishnan, N. Cumulative and relative cumulative residual information generating measures and associated properties. Commun. Stat.-Theory Methods 2021, 1–14.
- Di Crescenzo, A.; Kayal, S.; Meoli, A. Fractional generalized cumulative entropy and its dynamic version. Commun. Nonlinear Sci. Numer. Simul. 2021, 102, 105899.
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
- Sati, M.M.; Gupta, N. Some characterization results on dynamic cumulative residual Tsallis entropy. J. Probab. Stat. 2015, 2015, 1155.
- Rajesh, G.; Sunoj, S.M. Some properties of cumulative Tsallis entropy of order α. Stat. Pap. 2019, 60, 933–943.
- Chakraborty, S.; Pradhan, B. On weighted cumulative Tsallis residual and past entropy measures. Commun. Stat.-Simul. Comput. 2021, 1–15.
- Calì, C.; Longobardi, M.; Psarrakos, G. A family of weighted distributions based on the mean inactivity time and cumulative past entropies. Ric. Mat. 2021, 70, 395–409.
- Balakrishnan, N.; Buono, F.; Longobardi, M. On cumulative entropies in terms of moments of order statistics. Methodol. Comput. Appl. Probab. 2022, 24, 345–359.
- Drissi, N.; Chonavel, T.; Boucher, J.M. Generalized cumulative residual entropy for distributions with unrestricted supports. Res. Lett. Signal Process. 2008, 2008, 79060.
- Kayal, S. On generalized cumulative entropies. Probab. Eng. Inform. Sci. 2016, 30, 640–662.
- Psarrakos, G.; Navarro, J. Generalized cumulative residual entropy and record values. Metrika 2013, 76, 623–640.
- Psarrakos, G.; Toomaj, A. On the generalized cumulative residual entropy with applications in actuarial science. J. Comput. Appl. Math. 2017, 309, 186–199.
- Navarro, J.; Psarrakos, G. Characterizations based on generalized cumulative residual entropy functions. Commun. Stat.-Theory Methods 2017, 46, 1247–1260.
- Di Crescenzo, A.; Toomaj, A. Further results on the generalized cumulative entropy. Kybernetika 2017, 53, 959–982.
- Balakrishnan, N.; Buono, F.; Longobardi, M. On weighted extropies. Commun. Stat.-Theory Methods 2020, 1–31.
- Sharma, B.D.; Taneja, I.J. Entropy of type (α,β) and other generalized measures in information theory. Metrika 1975, 22, 205–215.
- Mittal, D.P. On some functional equations concerning entropy, directed divergence and inaccuracy. Metrika 1975, 22, 35–45.
- Kaniadakis, G. Non-linear kinetics underlying generalized statistics. Phys. A Stat. Mech. Its Appl. 2001, 296, 405–425.
- Lopes, A.M.; Machado, J.A.T. A review of fractional order entropies. Entropy 2020, 22, 1374.
- Ilić, V.M.; Korbel, J.; Gupta, S.; Scarfone, A.M. An overview of generalized entropic forms (a). EPL (Europhys. Lett.) 2021, 133, 50005.
- Lad, F.; Sanfilippo, G.; Agro, G. Extropy: Complementary dual of entropy. Stat. Sci. 2015, 30, 40–58.
- Jahanshahi, S.M.A.; Zarei, H.; Khammar, A.H. On cumulative residual extropy. Probab. Eng. Inf. Sci. 2020, 34, 605–625.
- Tahmasebi, S.; Toomaj, A. On negative cumulative extropy with applications. Commun. Stat.-Theory Methods 2020, 1–23.
- Sudheesh, K.K.; Sreedevi, E.P. Non-parametric estimation of cumulative (residual) extropy with censored observations. Stat. Probab. Lett. 2022, 185, 109434.
- Bansal, S.; Gupta, N. Weighted extropies and past extropy of order statistics and k-record values. Commun. Stat.-Theory Methods 2020, 1–24.
- Sathar, E.A.; Nair, R.D. On dynamic weighted extropy. J. Comput. Appl. Math. 2021, 393, 113507.
- Sathar, E.A.; Nair, R.D. A study on weighted dynamic survival and failure extropies. Commun. Stat.-Theory Methods 2021, 1–20.
- Sathar, E.A.; Nair, R.D. On dynamic survival extropy. Commun. Stat.-Theory Methods 2021, 50, 1295–1313.
- Kundu, C. On cumulative residual (past) extropy of extreme order statistics. Commun. Stat.-Theory Methods 2021, 1–18.