Entropy 2013, 15(7), 2861-2873; doi:10.3390/e15072861
Article

Relative Entropy Derivative Bounds

Pablo Zegers *, Alexis Fuentes and Carlos Alarcón
Received: 24 May 2013; in revised form: 12 July 2013 / Accepted: 16 July 2013 / Published: 23 July 2013
(This article belongs to the Special Issue Maximum Entropy and Bayes Theorem)
Abstract: We show that the derivative of the relative entropy with respect to its parameters is bounded from below and above. We characterize the conditions under which this derivative can reach zero. We use these results to explain when the minimum relative entropy and the maximum log likelihood approaches can be valid. We show that these approaches naturally arise in the presence of large data sets and that they are inherent properties of any density estimation process involving a large number of random variables.
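As a reading aid, the following sketch (not taken from the paper itself) shows the standard identity connecting relative entropy minimization to maximum log likelihood that the abstract invokes. It assumes a fixed data density p, a parametric model q_θ, and enough regularity to differentiate under the integral sign; the paper's specific bounds on the derivative are not reproduced here.

```latex
% Relative entropy between a fixed density p and a parametric model q_theta:
D(p \,\|\, q_\theta) = \int p(x) \log \frac{p(x)}{q_\theta(x)} \, dx .

% Since p does not depend on theta, the derivative with respect to the
% parameters involves only the model term:
\frac{\partial}{\partial \theta} D(p \,\|\, q_\theta)
  = - \int p(x) \, \frac{\partial}{\partial \theta} \log q_\theta(x) \, dx
  = - \mathbb{E}_p\!\left[ \frac{\partial}{\partial \theta} \log q_\theta(x) \right].

% Hence a stationary point of the relative entropy is a stationary point of
% the expected log likelihood; with N i.i.d. samples the expectation is
% approximated by the empirical average, recovering maximum log likelihood:
\arg\min_\theta D(p \,\|\, q_\theta)
  = \arg\max_\theta \mathbb{E}_p\!\left[ \log q_\theta(x) \right]
  \approx \arg\max_\theta \frac{1}{N} \sum_{i=1}^{N} \log q_\theta(x_i).
```

This is why, as the abstract states, the maximum log likelihood approach becomes natural for large data sets: the empirical average converges to the expectation the relative entropy criterion actually targets.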
Keywords: relative entropy; Kullback-Leibler divergence; Shannon differential entropy; asymptotic equipartition principle; typical set; Fisher information; maximum log likelihood
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Zegers, P.; Fuentes, A.; Alarcón, C. Relative Entropy Derivative Bounds. Entropy 2013, 15, 2861-2873.

AMA Style

Zegers P, Fuentes A, Alarcón C. Relative Entropy Derivative Bounds. Entropy. 2013; 15(7):2861-2873.

Chicago/Turabian Style

Zegers, Pablo, Alexis Fuentes, and Carlos Alarcón. 2013. "Relative Entropy Derivative Bounds." Entropy 15, no. 7: 2861-2873.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.