Open Access Article

Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

by Frank Nielsen 1,2,* and Ke Sun 3
1 Computer Science Department LIX, École Polytechnique, 91128 Palaiseau Cedex, France
2 Sony Computer Science Laboratories Inc., Tokyo 141-0022, Japan
3 King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
* Author to whom correspondence should be addressed.
Academic Editor: Antonio M. Scarfone
Entropy 2016, 18(12), 442; https://doi.org/10.3390/e18120442
Received: 20 October 2016 / Revised: 4 December 2016 / Accepted: 5 December 2016 / Published: 9 December 2016
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Abstract: Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback–Leibler divergence between mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback–Leibler divergence and the α-divergences of mixtures. We illustrate the versatile method by reporting our experiments on approximating the Kullback–Leibler divergence and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
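To make the setting concrete, here is a minimal sketch (ours, not the authors' implementation; the mixture parameters m1 and m2 are illustrative placeholders). It contrasts a plain Monte Carlo estimate of the Kullback–Leibler divergence between two univariate Gaussian mixtures with the coarse, single-piece log-sum-exp sandwich max_j a_j ≤ lse(a) ≤ max_j a_j + log k on the mixture log-density, which is the basic inequality the paper refines piecewise.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

# Illustrative univariate Gaussian mixtures (weights, means, standard deviations);
# the parameter values are placeholders, not taken from the paper.
m1 = (np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
m2 = (np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 0.8]))

def component_logpdfs(x, mix):
    """Matrix a[i, j] = log w_j + log p_j(x_i) for each sample x_i and component j."""
    w, mu, sigma = mix
    return np.log(w)[None, :] + norm.logpdf(x[:, None], mu[None, :], sigma[None, :])

def log_mixture(x, mix):
    """Exact mixture log-density: log m(x) = lse_j(log w_j + log p_j(x))."""
    return logsumexp(component_logpdfs(x, mix), axis=1)

def log_mixture_bounds(x, mix):
    """Coarse log-sum-exp sandwich: max_j a_j <= lse(a) <= max_j a_j + log k.
    The paper's contribution is to tighten this gap piecewise over the domain."""
    a = component_logpdfs(x, mix)
    lo = a.max(axis=1)
    return lo, lo + np.log(a.shape[1])

def kl_monte_carlo(m_p, m_q, n=100_000, seed=0):
    """Stochastic estimate of KL(m_p || m_q): sample x ~ m_p, average the log-ratio."""
    rng = np.random.default_rng(seed)
    w, mu, sigma = m_p
    comp = rng.choice(len(w), size=n, p=w)    # draw a component index per sample
    x = rng.normal(mu[comp], sigma[comp])     # then draw from that Gaussian component
    return float(np.mean(log_mixture(x, m_p) - log_mixture(x, m_q)))

print(kl_monte_carlo(m1, m2))
```

Because the sandwich holds pointwise on the log-density, integrating its two sides yields deterministic bounds, unlike the Monte Carlo estimate, whose error is only statistical; the paper sharpens the coarse log k gap by splitting the domain into pieces where a single component dominates.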
Keywords: information geometry; mixture models; α-divergences; log-sum-exp bounds
MDPI and ACS Style

Nielsen, F.; Sun, K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy 2016, 18, 442.

AMA Style

Nielsen F, Sun K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy. 2016; 18(12):442.

Chicago/Turabian Style

Nielsen, Frank, and Ke Sun. 2016. "Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities." Entropy 18, no. 12: 442.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
