Open Access Article

Cumulative Paired φ-Entropy

Department of Statistics and Econometrics, Friedrich-Alexander-Universität Erlangen-Nürnberg, Lange Gasse, Nürnberg 90403, Germany
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Academic Editor: Adom Giffin
Entropy 2016, 18(7), 248; https://doi.org/10.3390/e18070248
Received: 13 April 2016 / Revised: 9 June 2016 / Accepted: 24 June 2016 / Published: 1 July 2016
A new kind of entropy is introduced that generalizes both the differential entropy and the cumulative (residual) entropy. The generalization is twofold. First, the entropy is defined simultaneously for cumulative distribution functions (cdfs) and survivor functions (sfs), instead of being defined separately for densities, cdfs, or sfs. Second, we consider a general “entropy generating function” φ, in the same way Burbea et al. (IEEE Trans. Inf. Theory 1982, 28, 489–495) and Liese et al. (Convex Statistical Distances; Teubner-Verlag, 1987) did in the context of φ-divergences. Combining the ideas of φ-entropy and cumulative entropy leads to the new “cumulative paired φ-entropy” (CPE_φ). This new entropy has already been discussed in at least four scientific disciplines, albeit with certain modifications or simplifications. In fuzzy set theory, for example, cumulative paired φ-entropies were defined for membership functions, whereas in uncertainty and reliability theories some variations of CPE_φ were recently considered as measures of information. With a single exception, the discussions in these disciplines appear to have been conducted independently of each other. We consider CPE_φ for continuous cdfs and show that CPE_φ is a measure of dispersion rather than a measure of information. This is demonstrated, first, by deriving an upper bound determined by the standard deviation and by solving the maximum entropy problem under the restriction of a fixed variance. Next, the paper shows that CPE_φ satisfies the axioms of a dispersion measure. The corresponding dispersion functional can easily be estimated by an L-estimator, which inherits all the known asymptotic properties of this class of estimators. CPE_φ is the basis for several related concepts such as mutual φ-information, φ-correlation, and φ-regression, which generalize Gini correlation and Gini regression. In addition, linear rank tests for scale based on the new entropy are developed. We show that almost all known linear rank tests are special cases, and we introduce certain new tests. Moreover, formulas and entropy calculations for CPE_φ are presented for different distributions whose cdf is available in closed form.
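For orientation, the following sketch records one natural way to combine the φ-entropy and cumulative-entropy ideas; the exact definition, sign conventions, and regularity conditions on φ are those of the full text, so the formulas below should be read as an illustrative reading rather than a verbatim statement of the paper. For a continuous cdf F with survivor function 1 − F and a concave entropy generating function φ satisfying φ(0) = φ(1) = 0, a cumulative paired φ-entropy of this type takes the form

\[ CPE_{\varphi}(F) = \int_{-\infty}^{\infty} \bigl[ \varphi\bigl(F(x)\bigr) + \varphi\bigl(1 - F(x)\bigr) \bigr] \, dx . \]

The Shannon choice φ(u) = −u ln u recovers the sum of the cumulative entropy and the cumulative residual entropy,

\[ -\int_{-\infty}^{\infty} \bigl[ F(x)\ln F(x) + \bigl(1 - F(x)\bigr)\ln\bigl(1 - F(x)\bigr) \bigr] \, dx , \]

and plugging the empirical cdf of an ordered sample x_{(1)} ≤ … ≤ x_{(n)} into the first display yields

\[ \widehat{CPE}_{\varphi} = \sum_{i=1}^{n-1} \Bigl[ \varphi\!\left(\tfrac{i}{n}\right) + \varphi\!\left(1 - \tfrac{i}{n}\right) \Bigr] \bigl( x_{(i+1)} - x_{(i)} \bigr), \]

a linear combination of order-statistic spacings, which illustrates why the corresponding dispersion functional can be estimated by an L-estimator.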
Keywords: φ-entropy; absolute mean deviation; cumulative residual entropy; measure of dispersion; generalized maximum entropy principle; Tukey’s λ distribution; φ-regression; L-estimator; linear rank test
MDPI and ACS Style

Klein, I.; Mangold, B.; Doll, M. Cumulative Paired φ-Entropy. Entropy 2016, 18, 248.

