Article

Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence

1 College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
2 Graphics and Imaging Laboratory, University of Girona, Campus Montilivi, 17003 Girona, Spain
3 Department of Engineering Science, University of Oxford, Oxford OX1 3PJ, UK
* Author to whom correspondence should be addressed.
Entropy 2018, 20(12), 959; https://doi.org/10.3390/e20120959
Received: 8 November 2018 / Revised: 2 December 2018 / Accepted: 8 December 2018 / Published: 12 December 2018
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus cross entropy is applied for optimization in machine learning. K-L divergence also stands independently as a commonly used metric for measuring the difference between two distributions. In this paper, we introduce new inequalities regarding cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of the weighted geometric mean. We first apply the well-known rearrangement inequality, followed by a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that directly applies to inequalities between K-L divergences. To illustrate our results, we show numerical examples of distributions.
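The two identities the abstract relies on can be checked numerically: cross entropy is the negated logarithm of the weighted geometric mean of one distribution with the other as weights, and cross entropy decomposes as entropy plus K-L divergence, so ordering distributions by cross entropy (for a fixed reference) orders them by K-L divergence as well. A minimal sketch, with illustrative distributions `p` and `q` chosen here for demonstration:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative example distributions (not from the paper).
p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

# Cross entropy as the negated log of the weighted geometric mean of q
# with weights p: -log(prod_i q_i**p_i) == -sum_i p_i * log(q_i).
geo_mean = math.prod(qi ** pi for pi, qi in zip(p, q))
assert abs(cross_entropy(p, q) - (-math.log(geo_mean))) < 1e-12

# Decomposition H(p, q) = H(p) + D_KL(p || q): for a fixed p, an
# inequality between cross entropies transfers to K-L divergences.
entropy_p = cross_entropy(p, p)  # H(p, p) is the Shannon entropy H(p)
assert abs(cross_entropy(p, q) - (entropy_p + kl_divergence(p, q))) < 1e-12
```

Because the entropy term H(p) is constant for a fixed p, any order-preserving statement about cross entropies against different second arguments immediately yields the same ordering of the corresponding K-L divergences.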
Keywords: cross entropy; Kullback–Leibler divergence; likelihood; Kolmogorov mean; generalized mean; weighted mean; stochastic dominance; stochastic order
Figure 1
MDPI and ACS Style

Sbert, M.; Chen, M.; Poch, J.; Bardera, A. Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence. Entropy 2018, 20, 959. https://doi.org/10.3390/e20120959

