Open Access Article
Entropy 2016, 18(12), 442; doi:10.3390/e18120442

Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Frank Nielsen 1,2,* and Ke Sun 3
1 Computer Science Department LIX, École Polytechnique, 91128 Palaiseau Cedex, France
2 Sony Computer Science Laboratories Inc., Tokyo 141-0022, Japan
3 King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
* Author to whom correspondence should be addressed.
Academic Editor: Antonio M. Scarfone
Received: 20 October 2016 / Revised: 4 December 2016 / Accepted: 5 December 2016 / Published: 9 December 2016
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)

Abstract

Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback–Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that builds algorithmically closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback–Leibler and the α-divergences of mixtures. We illustrate the versatile method by reporting our experiments for approximating the Kullback–Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
Keywords: information geometry; mixture models; α-divergences; log-sum-exp bounds
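The abstract mentions two computational ingredients: costly Monte Carlo estimation of the Kullback–Leibler divergence between mixtures, and log-sum-exp bounds on the mixture log-density. The sketch below is purely illustrative and is not the authors' code; it assumes NumPy/SciPy, uses arbitrary toy mixture parameters, and applies only the elementary inequality max_i x_i ≤ log Σ_i exp(x_i) ≤ max_i x_i + log k, whereas the paper derives tighter piecewise versions that yield deterministic, closed-form lower and upper bounds without any sampling.

# Illustrative sketch only (not the authors' implementation).
# Contrasts Monte Carlo estimation of KL(m : m') between two univariate
# Gaussian mixtures with the elementary log-sum-exp inequality
#     max_i x_i <= log(sum_i exp(x_i)) <= max_i x_i + log k,
# which sandwiches a mixture log-density between two closed-form envelopes.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def log_mixture_pdf(x, w, mu, sigma):
    """Return (log-density, lower envelope, upper envelope) of a Gaussian mixture at points x."""
    comp = np.log(w)[:, None] + norm.logpdf(x[None, :], mu[:, None], sigma[:, None])
    m = comp.max(axis=0)                            # max_i of the weighted log-components
    lse = m + np.log(np.exp(comp - m).sum(axis=0))  # numerically stable log-sum-exp
    return lse, m, m + np.log(len(w))               # m <= lse <= m + log k

def sample_mixture(n, w, mu, sigma):
    idx = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[idx], sigma[idx])

# Toy mixtures m(x) and m'(x); parameters chosen arbitrarily for illustration.
w1, mu1, s1 = np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
w2, mu2, s2 = np.array([0.6, 0.4]), np.array([0.0, 3.0]), np.array([1.0, 0.8])

# Monte Carlo estimate of KL(m : m') = E_{X~m}[log m(X) - log m'(X)].
x = sample_mixture(100_000, w1, mu1, s1)
log_p, lo_p, hi_p = log_mixture_pdf(x, w1, mu1, s1)
log_q, lo_q, hi_q = log_mixture_pdf(x, w2, mu2, s2)
print("Monte Carlo KL estimate:", np.mean(log_p - log_q))
# Loose envelope obtained by plugging the elementary bounds into the integrand
# (still averaged over samples here, so merely illustrative of the idea):
print("loose envelope:", np.mean(lo_p - hi_q), "to", np.mean(hi_p - lo_q))

The paper's contribution is to replace this crude, sample-based envelope with piecewise log-sum-exp inequalities that give guaranteed closed-form bounds evaluated without Monte Carlo integration.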

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Nielsen, F.; Sun, K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy 2016, 18, 442.


