Entropy 2013, 15(5), 1738-1755; doi:10.3390/e15051738
Article

Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data

Evan Archer, Il Memming Park and Jonathan W. Pillow *

1 Institute for Computational Engineering and Sciences, The University of Texas at Austin, Austin, TX 78712, USA
2 Center for Perceptual Systems, The University of Texas at Austin, Austin, TX 78712, USA
3 Department of Psychology, The University of Texas at Austin, Austin, TX 78712, USA
4 Section of Neurobiology, The University of Texas at Austin, Austin, TX 78712, USA
5 Division of Statistics and Scientific Computation, The University of Texas at Austin, Austin, TX 78712, USA
* Author to whom correspondence should be addressed.
Received: 16 February 2013 / Revised: 24 April 2013 / Accepted: 2 May 2013 / Published: 10 May 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
Abstract: Mutual information (MI) quantifies the statistical dependency between a pair of random variables and plays a central role in the analysis of engineering and biological systems. Estimating MI is challenging because it depends on the entire joint distribution of the two variables, which is difficult to estimate from limited samples. Here we consider several regularized estimators for MI that employ priors based on the Dirichlet distribution. First, we discuss three “quasi-Bayesian” estimators that result from linear combinations of Bayesian estimates for conditional and marginal entropies. We show that these estimators are not in fact Bayesian: they do not arise from a well-defined posterior distribution over MI, and they can even take negative values. Second, we show that a fully Bayesian MI estimator proposed by Hutter (2002), which relies on a fixed Dirichlet prior, exhibits strong prior dependence and has large bias for small datasets. Third, we formulate a novel Bayesian estimator using a mixture-of-Dirichlets prior, with mixing weights designed to produce an approximately flat prior over MI. We examine the performance of these estimators on a variety of simulated datasets and show that, surprisingly, the quasi-Bayesian estimators generally outperform our Bayesian estimator. We discuss outstanding challenges for MI estimation and suggest promising avenues for future research.
Keywords: mutual information; entropy; Dirichlet distribution; Bayes least squares
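
The quasi-Bayesian construction described in the abstract follows the identity I(X;Y) = H(X) + H(Y) − H(X,Y): each entropy is replaced by its Bayes least squares (posterior mean) estimate under a Dirichlet prior. Below is a minimal Python sketch of one such estimator, assuming a symmetric Dirichlet(a) prior on each distribution and using the standard closed form for the posterior-mean entropy under a Dirichlet posterior; the function names and the default concentration a are illustrative, not taken from the paper.

import numpy as np
from scipy.special import digamma

def dirichlet_entropy_mean(counts, a=1.0):
    # Posterior-mean (Bayes least squares) entropy in nats for multinomial
    # counts under a symmetric Dirichlet(a) prior:
    #   E[H | counts] = psi(A + 1) - sum_i (alpha_i / A) * psi(alpha_i + 1),
    # where alpha_i = counts_i + a and A = sum_i alpha_i.
    alpha = np.asarray(counts, dtype=float).ravel() + a
    A = alpha.sum()
    return digamma(A + 1.0) - np.sum((alpha / A) * digamma(alpha + 1.0))

def quasi_bayes_mi(joint_counts, a=1.0):
    # "Quasi-Bayesian" MI: plug posterior-mean entropies into
    # I(X;Y) = H(X) + H(Y) - H(X,Y). This is not the posterior mean of MI
    # itself, and the estimate can be negative for small samples.
    joint_counts = np.asarray(joint_counts, dtype=float)
    hx = dirichlet_entropy_mean(joint_counts.sum(axis=1), a)  # marginal of X
    hy = dirichlet_entropy_mean(joint_counts.sum(axis=0), a)  # marginal of Y
    hxy = dirichlet_entropy_mean(joint_counts, a)             # joint (X, Y)
    return hx + hy - hxy

# Example: a 2x2 table of joint counts with mild dependence.
n = np.array([[30, 10],
              [12, 28]])
print(quasi_bayes_mi(n, a=0.5))  # MI estimate in nats

Because the three entropies are estimated under separate priors, the plug-in combination does not correspond to any single posterior over MI, which is one reason the estimate can dip below zero.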


Cite This Article

MDPI and ACS Style

Archer, E.; Park, I.M.; Pillow, J.W. Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data. Entropy 2013, 15, 1738-1755.

AMA Style

Archer E, Park IM, Pillow JW. Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data. Entropy. 2013; 15(5):1738-1755.

Chicago/Turabian Style

Archer, Evan, Il M. Park, and Jonathan W. Pillow. 2013. "Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data." Entropy 15, no. 5: 1738-1755.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland