Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities
1 Computer Science Department LIX, École Polytechnique, 91128 Palaiseau Cedex, France
2 Sony Computer Science Laboratories Inc., Tokyo 141-0022, Japan
3 King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
* Author to whom correspondence should be addressed.
Academic Editor: Antonio M. Scarfone
Entropy 2016, 18(12), 442; https://doi.org/10.3390/e18120442
Received: 20 October 2016 / Revised: 4 December 2016 / Accepted: 5 December 2016 / Published: 9 December 2016
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback–Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, or approximated or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback–Leibler and the α-divergences of mixtures. We illustrate this versatile method by reporting experiments on approximating the Kullback–Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
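To make the setting concrete, below is a minimal Python sketch (not the authors' implementation; the mixture parameters are hypothetical) of the Monte Carlo baseline that the paper's guaranteed bounds are designed to replace: estimating KL(m1 ‖ m2) between two univariate Gaussian mixtures by sampling from m1. The comment inside the log-density routine records the elementary log-sum-exp sandwich, max_i a_i ≤ log Σ_i e^{a_i} ≤ max_i a_i + log n, whose piecewise refinement underlies the paper's bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a univariate Gaussian mixture, computed stably via log-sum-exp."""
    x = np.asarray(x)[..., None]  # broadcast samples against components
    log_comp = (np.log(weights)
                - 0.5 * np.log(2.0 * np.pi) - np.log(stds)
                - 0.5 * ((x - means) / stds) ** 2)
    # Elementary log-sum-exp sandwich: max_i a_i <= lse(a) <= max_i a_i + log n.
    # The paper sharpens this single-piece inequality into piecewise bounds.
    a_max = log_comp.max(axis=-1, keepdims=True)
    lse = a_max + np.log(np.exp(log_comp - a_max).sum(axis=-1, keepdims=True))
    return lse.squeeze(-1)

def gmm_sample(n, weights, means, stds):
    """Draw n samples: pick a component by weight, then sample that Gaussian."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def kl_monte_carlo(n, w1, mu1, s1, w2, mu2, s2):
    """Stochastic estimate of KL(m1 || m2) = E_{x ~ m1}[log m1(x) - log m2(x)]."""
    x = gmm_sample(n, w1, mu1, s1)
    return np.mean(gmm_logpdf(x, w1, mu1, s1) - gmm_logpdf(x, w2, mu2, s2))

# Two example univariate Gaussian mixtures (hypothetical parameters).
w1, mu1, s1 = np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
w2, mu2, s2 = np.array([0.6, 0.4]), np.array([0.0, 3.0]), np.array([1.0, 0.8])

print(kl_monte_carlo(100_000, w1, mu1, s1, w2, mu2, s2))
```

Such an estimate converges only stochastically and carries no deterministic guarantee; the paper's contribution is precisely to replace it with algorithmically constructed closed-form lower and upper bounds.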
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Nielsen, F.; Sun, K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy 2016, 18, 442.
AMA Style
Nielsen F, Sun K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy. 2016; 18(12):442.
Chicago/Turabian Style
Nielsen, Frank, and Ke Sun. 2016. "Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities" Entropy 18, no. 12: 442.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.