Open Access Article
Entropy 2018, 20(12), 959; https://doi.org/10.3390/e20120959

Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence

1 College of Intelligence and Computing, Tianjin University, Tianjin 300350, China
2 Graphics and Imaging Laboratory, University of Girona, Campus Montilivi, 17003 Girona, Spain
3 Department of Engineering Science, University of Oxford, Oxford OX1 3PJ, UK
* Author to whom correspondence should be addressed.
Received: 8 November 2018 / Revised: 2 December 2018 / Accepted: 8 December 2018 / Published: 12 December 2018

Abstract

Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory and are widely used in many fields. Since cross entropy equals the negative log-likelihood, minimizing cross entropy is equivalent to maximizing likelihood, which is why cross entropy serves as an optimization objective in machine learning. K-L divergence also stands on its own as a commonly used measure of the difference between two distributions. In this paper, we introduce new inequalities for cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of a weighted geometric mean. We first apply the well-known rearrangement inequality, then a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that applies directly to inequalities between K-L divergences. We illustrate our results with numerical examples of distributions.
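The identities the abstract relies on can be checked numerically. The sketch below uses only the standard textbook definitions of cross entropy and K-L divergence (it is not code from the paper), with two small illustrative distributions chosen here as assumptions. It verifies that cross entropy is the negated logarithm of the p-weighted geometric mean of q, that it decomposes as entropy plus K-L divergence, and that, by the rearrangement inequality, cross entropy over permutations of q is smallest when q is ordered like p.

```python
import itertools
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i), in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative distributions (both sorted in descending order).
p = [0.5, 0.3, 0.2]
q = [0.6, 0.3, 0.1]

# Cross entropy as the negated log of the p-weighted geometric mean of q:
# H(p, q) = -log(prod_i q_i ** p_i).
geo_mean = math.prod(qi ** pi for pi, qi in zip(p, q))
assert abs(cross_entropy(p, q) + math.log(geo_mean)) < 1e-12

# Decomposition: H(p, q) = H(p) + D_KL(p || q), where H(p) = H(p, p).
assert abs(cross_entropy(p, q)
           - (cross_entropy(p, p) + kl_divergence(p, q))) < 1e-12

# Rearrangement inequality: among all permutations of q, cross entropy
# is minimized when q is similarly ordered to p (here, both descending).
best = min(cross_entropy(p, list(perm))
           for perm in itertools.permutations(q))
assert abs(best - cross_entropy(p, q)) < 1e-12
```

The last check is a concrete instance of the order-preserving behavior the paper studies: pairing the largest probabilities of p with the largest of q maximizes the weighted geometric mean and therefore minimizes cross entropy.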
Keywords: cross entropy; Kullback–Leibler divergence; likelihood; Kolmogorov mean; generalized mean; weighted mean; stochastic dominance; stochastic order

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Sbert, M.; Chen, M.; Poch, J.; Bardera, A. Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence. Entropy 2018, 20, 959.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland