Entropy 2010, 12(4), 818-843; doi:10.3390/e12040818

Parametric Bayesian Estimation of Differential Entropy and Relative Entropy

M. Gupta 1,* and S. Srivastava 2
1 Department of Electrical Engineering, University of Washington, Seattle, WA 98195-2500, USA
2 Computational Biology, Fred Hutchinson Cancer Research Center, Seattle, WA 98109, USA
* Author to whom correspondence should be addressed.
Received: 16 November 2009 / Revised: 28 March 2010 / Accepted: 2 April 2010 / Published: 9 April 2010
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)


Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and for the differential entropy and relative entropy of the Wishart and inverse Wishart distributions. The results, as always with Bayesian estimates, depend on the accuracy of the prior parameters, but example simulations show that the performance can be substantially improved compared to maximum likelihood or state-of-the-art nonparametric estimators.
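To illustrate the problem setting the abstract describes, the following is a minimal sketch of the baseline that the paper's Bayesian estimators improve upon: the maximum-likelihood "plug-in" estimate of Gaussian differential entropy, where the sample covariance is inserted into the closed-form entropy expression. This is an assumed illustration of the setting, not the paper's Bregman-divergence-based estimator; the entropy formula h(X) = (1/2) log((2πe)^d det Σ) is standard.

```python
import numpy as np

def gaussian_entropy(cov):
    # Differential entropy of a d-dimensional Gaussian with covariance `cov`:
    # h = 0.5 * log((2*pi*e)^d * det(cov)), in nats.
    d = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

rng = np.random.default_rng(0)
d, n = 3, 50
true_cov = np.diag([1.0, 2.0, 0.5])
samples = rng.multivariate_normal(np.zeros(d), true_cov, size=n)

# Maximum-likelihood plug-in estimate: fit the covariance by ML
# (bias=True gives the 1/n normalization), then plug it into the
# closed-form entropy. For small n this estimate is biased, which is
# the motivation for Bayesian alternatives.
ml_cov = np.cov(samples, rowvar=False, bias=True)
print("true entropy:", gaussian_entropy(true_cov))
print("ML plug-in estimate:", gaussian_entropy(ml_cov))
```

With few samples relative to the dimension, the plug-in estimate systematically underestimates the entropy because the ML covariance shrinks the determinant; a Bayesian estimate formed under a prior over Σ can correct for this.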
Keywords: Kullback-Leibler; relative entropy; differential entropy; Pareto; Wishart
This is an open access article distributed under the Creative Commons Attribution License (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Gupta, M.; Srivastava, S. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy. Entropy 2010, 12, 818-843.




Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.