Entropy 2013, 15(5), 1738-1755; doi:10.3390/e15051738
Article

Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data

Evan Archer 1, Il M. Park 2 and Jonathan W. Pillow 3,4,5,*
Received: 16 February 2013; in revised form: 24 April 2013 / Accepted: 2 May 2013 / Published: 10 May 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
Abstract: Mutual information (MI) quantifies the statistical dependency between a pair of random variables and plays a central role in the analysis of engineering and biological systems. Estimating MI is difficult because it depends on the entire joint distribution, which is hard to estimate from samples. Here we discuss several regularized estimators for MI that employ priors based on the Dirichlet distribution. First, we discuss three “quasi-Bayesian” estimators that result from linear combinations of Bayesian estimates for conditional and marginal entropies. We show that these estimators are not in fact Bayesian: they do not arise from a well-defined posterior distribution and may even be negative. Second, we show that a fully Bayesian MI estimator proposed by Hutter (2002), which relies on a fixed Dirichlet prior, exhibits strong prior dependence and has large bias for small datasets. Third, we formulate a novel Bayesian estimator using a mixture-of-Dirichlets prior, with mixing weights designed to produce an approximately flat prior over MI. We examine the performance of these estimators on a variety of simulated datasets and show that, surprisingly, the quasi-Bayesian estimators generally outperform our Bayesian estimator. We discuss outstanding challenges for MI estimation and suggest promising avenues for future research.
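To make the “quasi-Bayesian” construction concrete, the following is a minimal Python sketch (not the paper’s exact estimators) of an MI estimate formed as a linear combination of Bayesian entropy estimates. It uses the identity I(X;Y) = H(X) + H(Y) − H(X,Y), with each entropy replaced by its posterior mean under a symmetric Dirichlet prior via the closed form of Wolpert and Wolf. The function names (bayes_entropy, quasi_bayes_mi) and the choice of a single concentration parameter a shared across marginal and joint distributions are illustrative assumptions; the inconsistency among these implicit priors is exactly why, as the abstract notes, the combination is not itself Bayesian.

import numpy as np
from scipy.special import digamma

def bayes_entropy(counts, a=1.0):
    """Posterior-mean entropy (in nats) under a symmetric Dirichlet(a) prior.

    Uses the closed form E[H | n] = psi(A + 1) - sum_i (alpha_i / A) * psi(alpha_i + 1),
    where alpha_i = n_i + a are the Dirichlet posterior parameters and A = sum_i alpha_i.
    """
    counts = np.asarray(counts, dtype=float).ravel()
    alpha = counts + a                      # posterior Dirichlet parameters
    A = alpha.sum()                         # total posterior concentration
    return digamma(A + 1) - np.sum((alpha / A) * digamma(alpha + 1))

def quasi_bayes_mi(joint_counts, a=1.0):
    """'Quasi-Bayesian' MI sketch: I_hat = H_hat(X) + H_hat(Y) - H_hat(X,Y).

    Each entropy is a Bayesian (posterior-mean) estimate, but the combination
    does not arise from a single posterior over MI and can be negative.
    """
    joint_counts = np.asarray(joint_counts, dtype=float)
    hx = bayes_entropy(joint_counts.sum(axis=1), a)   # marginal entropy of X
    hy = bayes_entropy(joint_counts.sum(axis=0), a)   # marginal entropy of Y
    hxy = bayes_entropy(joint_counts, a)              # joint entropy of (X, Y)
    return hx + hy - hxy

Called on a 2-D contingency table of co-occurrence counts, quasi_bayes_mi returns an MI estimate in nats. Note that applying the same Dirichlet(a) prior to the marginals and to the (much larger) joint table implies mutually inconsistent priors, which is one source of the pathologies the paper analyzes.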
Keywords: mutual information; entropy; Dirichlet distribution; Bayes least squares
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Archer, E.; Park, I.M.; Pillow, J.W. Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data. Entropy 2013, 15, 1738-1755.

AMA Style

Archer E, Park IM, Pillow JW. Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data. Entropy. 2013; 15(5):1738-1755.

Chicago/Turabian Style

Archer, Evan, Il M. Park, and Jonathan W. Pillow. 2013. "Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data." Entropy 15, no. 5: 1738-1755.


Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland