Correction published on 3 November 2017, see Entropy 2017, 19(11), 588.

Open Access Article
Entropy 2017, 19(7), 361; https://doi.org/10.3390/e19070361

Estimating Mixture Entropy with Pairwise Distances

Artemy Kolchinsky 1,* and Brendan D. Tracey 1,2
1 Santa Fe Institute, Santa Fe, NM 87501, USA
2 Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Received: 8 June 2017 / Revised: 8 July 2017 / Accepted: 12 July 2017 / Published: 14 July 2017
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)

Abstract

Mixture distributions arise in many parametric and non-parametric settings, for example in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between mixture components, and show that this estimator class has many attractive properties. For many distributions of interest, the proposed estimators are efficient to compute, differentiable in the mixture parameters, and become exact when the mixture components are clustered. We prove that this family includes lower and upper bounds on the mixture entropy. The Chernoff α-divergence gives a lower bound when chosen as the distance function, with the Bhattacharyya distance providing the tightest lower bound for components that are symmetric and members of a location family. The Kullback–Leibler divergence gives an upper bound when used as the distance function. We provide closed-form expressions of these bounds for mixtures of Gaussians, and discuss their applications to the estimation of mutual information. Using numeric simulations, we then demonstrate that our bounds are significantly tighter than well-known existing bounds. This estimator class is very useful in optimization problems involving maximization or minimization of entropy and mutual information, such as MaxEnt and rate-distortion problems.
Keywords: mixture distribution; mixture of Gaussians; entropy estimation; MaxEnt; rate distortion
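
As an illustration of the estimator family described in the abstract, the following is a minimal Python sketch (not code from the paper) that evaluates a pairwise-distance estimate of the form sum_i w_i H(p_i) - sum_i w_i ln sum_j w_j exp(-D(p_i || p_j)) for a mixture of Gaussians, using the standard closed-form Kullback-Leibler and Bhattacharyya distances between Gaussian components. The mixture parameters and function names are hypothetical.

import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a multivariate Gaussian with covariance cov (in nats)."""
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(cov)))

def kl_gaussians(mu_i, cov_i, mu_j, cov_j):
    """KL divergence KL(N_i || N_j) between two multivariate Gaussians."""
    d = len(mu_i)
    inv_j = np.linalg.inv(cov_j)
    diff = mu_j - mu_i
    return 0.5 * (np.trace(inv_j @ cov_i) + diff @ inv_j @ diff - d
                  + np.log(np.linalg.det(cov_j) / np.linalg.det(cov_i)))

def bhattacharyya_gaussians(mu_i, cov_i, mu_j, cov_j):
    """Bhattacharyya distance between two multivariate Gaussians."""
    cov = 0.5 * (cov_i + cov_j)
    diff = mu_i - mu_j
    return (0.125 * diff @ np.linalg.inv(cov) @ diff
            + 0.5 * np.log(np.linalg.det(cov)
                           / np.sqrt(np.linalg.det(cov_i) * np.linalg.det(cov_j))))

def pairwise_entropy_estimate(weights, mus, covs, dist):
    """Pairwise-distance estimate:
    sum_i w_i H(p_i) - sum_i w_i ln sum_j w_j exp(-D(p_i || p_j))."""
    n = len(weights)
    comp_entropy = sum(w * gaussian_entropy(c) for w, c in zip(weights, covs))
    cross = 0.0
    for i in range(n):
        inner = sum(weights[j] * np.exp(-dist(mus[i], covs[i], mus[j], covs[j]))
                    for j in range(n))
        cross += weights[i] * np.log(inner)
    return comp_entropy - cross

# Hypothetical two-component mixture in 2D.
weights = [0.3, 0.7]
mus = [np.array([0.0, 0.0]), np.array([3.0, 0.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]

lower = pairwise_entropy_estimate(weights, mus, covs, bhattacharyya_gaussians)  # lower bound
upper = pairwise_entropy_estimate(weights, mus, covs, kl_gaussians)             # upper bound
print(f"lower bound ~ {lower:.4f} nats, upper bound ~ {upper:.4f} nats")

Swapping the distance function switches between the lower bound (Bhattacharyya, i.e., the Chernoff α-divergence at α = 1/2) and the upper bound (Kullback-Leibler), mirroring the bounds stated in the abstract. Both estimates reduce to the component entropy when all components coincide and to the component entropy plus the entropy of the mixture weights when the components are well separated, consistent with the abstract's claim that the estimators become exact when the components are clustered.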
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Kolchinsky, A.; Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
