Entropy 2013, 15(11), 4668-4699; doi:10.3390/e15114668

Estimating Functions of Distributions Defined over Spaces of Unknown Size

David H. Wolpert 1,* and Simon DeDeo 1,2
1 Santa Fe Institute, 1399 Hyde Park Rd., Santa Fe, NM 87501, USA
2 School of Informatics and Computing, Indiana University, 901 E 10th St, Bloomington, IN 47408, USA
* Author to whom correspondence should be addressed.
Received: 3 August 2013 / Revised: 11 September 2013 / Accepted: 17 October 2013 / Published: 31 October 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)


We consider Bayesian estimation of information-theoretic quantities from data, using a Dirichlet prior. Acknowledging the uncertainty of the event space size m and the Dirichlet prior's concentration parameter c, we treat both as random variables set by a hyperprior. We show that the associated hyperprior, P(c, m), obeys a simple "Irrelevance of Unseen Variables" (IUV) desideratum iff P(c, m) = P(c)P(m). Thus, requiring IUV greatly reduces the number of degrees of freedom of the hyperprior. Some information-theoretic quantities, e.g., mutual information, can be expressed in multiple ways, in terms of different event spaces. Under all hyperpriors (implicitly) used in earlier work, different choices of this event space lead to different posterior expected values of these information-theoretic quantities. We show that there is no such dependence on the choice of event space for a hyperprior that obeys IUV. We also derive a result that allows us to exploit IUV to greatly simplify calculations of quantities such as the posterior expected mutual information or posterior expected multi-information. We also use computer experiments to favorably compare an IUV-based estimator of entropy to three alternative methods in common use. We end by discussing how seemingly innocuous changes to the formalization of an estimation problem can substantially affect the resultant estimates of posterior expectations.
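To make the setting concrete, the sketch below computes the posterior expected Shannon entropy for a *fixed* event space size m and symmetric Dirichlet prior, the classical building block that the paper's hyperprior approach then averages over P(c)P(m). The function name and parameterization (per-bin parameter a = c/m) are illustrative choices, not the paper's notation; the full IUV machinery over unknown (c, m) is not implemented here.

```python
import numpy as np
from scipy.special import digamma

def posterior_mean_entropy(counts, a):
    """Posterior expected Shannon entropy (in nats) of a categorical
    distribution, given observed counts and a symmetric Dirichlet(a, ..., a)
    prior over a fixed event space of size m = len(counts).

    Uses the closed form E[H | n] = psi(N + 1) - sum_i (n_i / N) psi(n_i + 1),
    where n_i = counts_i + a are the posterior Dirichlet parameters and
    N = sum_i n_i. This is the fixed-(c, m) case only; the paper's estimator
    additionally averages over a hyperprior P(c, m) = P(c) P(m).
    """
    n = np.asarray(counts, dtype=float) + a  # posterior Dirichlet parameters
    N = n.sum()                              # total posterior concentration
    return digamma(N + 1.0) - np.sum((n / N) * digamma(n + 1.0))
```

As a sanity check, with m = 2, a = 1 and no data, the prior expected entropy is psi(3) − psi(2) = 1/2 nat, strictly below the maximum log 2 ≈ 0.693: averaging the entropy over the uniform prior on distributions is not the same as the entropy of the average distribution.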
Keywords: Bayesian analysis; entropy; mutual information; variable number of bins; hidden variables; Dirichlet prior
This is an open access article distributed under the Creative Commons Attribution License (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article


Wolpert, D.H.; DeDeo, S. Estimating Functions of Distributions Defined over Spaces of Unknown Size. Entropy 2013, 15, 4668-4699.


Entropy, EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.