Entropy 2013, 15(7), 2861-2873; doi:10.3390/e15072861
Article

Relative Entropy Derivative Bounds

Pablo Zegers, Alexis Fuentes and Carlos Alarcón

Universidad de los Andes, Facultad de Ingeniería y Ciencias Aplicadas, Monseñor Álvaro del Portillo 12455, Las Condes, Santiago, Chile
* Author to whom correspondence should be addressed.
Received: 24 May 2013; in revised form: 12 July 2013 / Accepted: 16 July 2013 / Published: 23 July 2013
(This article belongs to the Special Issue Maximum Entropy and Bayes Theorem)
Abstract: We show that the derivative of the relative entropy with respect to its parameters is bounded from below and above. We characterize the conditions under which this derivative can reach zero. We use these results to explain when the minimum relative entropy and the maximum log-likelihood approaches can be valid. We show that these approaches naturally come into play in the presence of large data sets and that they are inherent properties of any density estimation process involving a large number of random variables.
Keywords: relative entropy; Kullback–Leibler divergence; Shannon differential entropy; asymptotic equipartition principle; typical set; Fisher information; maximum log-likelihood
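
For orientation, the quantities named in the abstract fit together as follows. This is a minimal sketch using standard information-theoretic identities, not the paper's specific bounds; here p denotes the true density, q_θ the parametric model, and h(p) the differential entropy, all assumed for illustration. Under the usual regularity conditions permitting differentiation under the integral,

\[
D(p \,\|\, q_\theta) = \int p(x)\,\log\frac{p(x)}{q_\theta(x)}\,dx,
\qquad
\frac{\partial}{\partial\theta} D(p \,\|\, q_\theta)
= -\,\mathbb{E}_p\!\left[\frac{\partial}{\partial\theta}\log q_\theta(x)\right],
\]

so the derivative of the relative entropy is minus the expected score. When p = q_{\theta^*}, the expected score vanishes at \theta^*, so the derivative reaches zero there, and the local curvature is given by the Fisher information I(\theta^*). Moreover, by the law of large numbers,

\[
\frac{1}{n}\sum_{i=1}^{n}\log q_\theta(x_i)
\;\longrightarrow\;
\mathbb{E}_p[\log q_\theta(x)]
= -\,h(p) - D(p \,\|\, q_\theta)
\qquad (n \to \infty),
\]

so for large data sets, maximizing the log likelihood is equivalent to minimizing the relative entropy, since h(p) does not depend on θ. This is the standard sense in which the two approaches coincide asymptotically.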

Cite This Article

MDPI and ACS Style

Zegers, P.; Fuentes, A.; Alarcón, C. Relative Entropy Derivative Bounds. Entropy 2013, 15, 2861-2873.

AMA Style

Zegers P, Fuentes A, Alarcón C. Relative Entropy Derivative Bounds. Entropy. 2013; 15(7):2861-2873.

Chicago/Turabian Style

Zegers, Pablo; Fuentes, Alexis; Alarcón, Carlos. 2013. "Relative Entropy Derivative Bounds." Entropy 15, no. 7: 2861-2873.
