Entropy 2010, 12(2), 262-274; doi:10.3390/e12020262
Article

Entropy and Divergence Associated with Power Function and the Statistical Application

Shinto Eguchi * and Shogo Kato
Received: 29 December 2009; in revised form: 20 February 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Abstract: In statistical physics, the Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects the Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To address this problem, we propose a new statistical method, closely related to Tsallis entropy, show that it is robust against outliers, and discuss a local learning property associated with the method.
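The connection the abstract describes can be written out explicitly. The formulas below are standard forms from the divergence literature, not reproduced from this article's text, so treat them as a sketch of the idea rather than the authors' exact definitions:

```latex
% The Kullback-Leibler divergence decomposes into the negative entropy of g
% minus the expected log-likelihood of f, so minimizing D_KL over a model f
% is the same as maximizing the expected log-likelihood E_g[log f].
D_{\mathrm{KL}}(g, f)
  = \int g(x) \log \frac{g(x)}{f(x)}\, dx
  = -H(g) - \mathbb{E}_g[\log f(X)],
\qquad
H(g) = -\int g(x) \log g(x)\, dx .

% One standard power-type generalization (the density power divergence,
% written with index \beta); letting \beta \to 0 recovers D_KL above:
D_{\beta}(g, f)
  = \frac{1}{\beta(1+\beta)} \int g^{1+\beta}\, dx
  \;-\; \frac{1}{\beta} \int g\, f^{\beta}\, dx
  \;+\; \frac{1}{1+\beta} \int f^{1+\beta}\, dx .
```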
Keywords: Tsallis entropy; projective power divergence; robustness
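To illustrate the robustness property the abstract claims, the following is a minimal sketch of location estimation under a power-type divergence. The helper `robust_mean` and its fixed-point scheme are hypothetical illustrations of the general idea (each observation is reweighted by a power of its model density, so outliers get exponentially small weight), not the estimator of this article:

```python
import math
import random

def robust_mean(xs, sigma=1.0, beta=0.5, iters=50):
    """Estimate a Gaussian location parameter via power-divergence weighting.

    Fixed-point iteration: each point x is weighted by f(x; mu)^beta, so
    observations far from the bulk contribute almost nothing.
    beta -> 0 recovers the ordinary sample mean, i.e. the (non-robust) MLE.
    """
    mu = sorted(xs)[len(xs) // 2]  # start from the median
    for _ in range(iters):
        w = [math.exp(-beta * (x - mu) ** 2 / (2 * sigma ** 2)) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

random.seed(0)
# 200 clean draws around 0, plus roughly 10% gross outliers at 50
data = [random.gauss(0.0, 1.0) for _ in range(200)] + [50.0] * 20
mle = sum(data) / len(data)   # plain sample mean: dragged toward the outliers
rob = robust_mean(data)       # power-divergence estimate: stays near 0
```

The contrast between `mle` and `rob` is the "easily broken down" behavior of maximum likelihood that the abstract refers to: the sample mean is pulled several units toward the contamination, while the reweighted estimate remains close to the true center.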
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Eguchi, S.; Kato, S. Entropy and Divergence Associated with Power Function and the Statistical Application. Entropy 2010, 12, 262-274.

AMA Style

Eguchi S, Kato S. Entropy and Divergence Associated with Power Function and the Statistical Application. Entropy. 2010; 12(2):262-274.

Chicago/Turabian Style

Eguchi, Shinto, and Shogo Kato. 2010. "Entropy and Divergence Associated with Power Function and the Statistical Application." Entropy 12, no. 2: 262-274.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.