Entropy 2010, 12(2), 262-274; doi:10.3390/e12020262
Article

Entropy and Divergence Associated with Power Function and the Statistical Application

Shinto Eguchi * and Shogo Kato
The Institute of Statistical Mathematics, Tachikawa, Tokyo 190-8562, Japan
* Author to whom correspondence should be addressed.
Received: 29 December 2009; in revised form: 20 February 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Abstract: In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of many phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To address this problem, we propose a new statistical method, closely related to Tsallis entropy, show that it is robust against outliers, and discuss a local learning property associated with the method.
Keywords: Tsallis entropy; projective power divergence; robustness
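To illustrate the robustness idea summarized in the abstract, the sketch below estimates a normal location parameter by minimizing a power-type divergence objective (the density power divergence of Basu et al., a close relative of the projective power divergence used in the paper). The data set, the power value beta, and the grid-search routine are illustrative assumptions, not taken from the article; the point is only that the power-weighted objective downweights an outlier that badly distorts the maximum likelihood estimate (the sample mean).

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def dpd_loss(mu, data, beta, sigma=1.0):
    """Density power divergence objective for a normal location model.

    L_beta(mu) = \int f_mu^{1+beta} dx - (1 + 1/beta) * (1/n) * sum_i f_mu(x_i)^beta,
    where the integral has the closed form below for a normal density.
    """
    integral = (2 * math.pi * sigma ** 2) ** (-beta / 2) / math.sqrt(1 + beta)
    emp = sum(normal_pdf(x, mu, sigma) ** beta for x in data) / len(data)
    return integral - (1 + 1 / beta) * emp

def dpd_estimate(data, beta, grid):
    """Minimize the power divergence objective by a simple grid search."""
    return min(grid, key=lambda mu: dpd_loss(mu, data, beta))

# Five well-behaved observations near zero plus one gross outlier.
data = [0.1, -0.2, 0.3, 0.0, -0.1, 10.0]
grid = [i / 100 for i in range(-300, 1101)]  # candidate mu values, step 0.01

mle = sum(data) / len(data)                  # MLE = sample mean, pulled toward the outlier
robust = dpd_estimate(data, beta=0.5, grid=grid)
print("MLE:", mle, " power-divergence estimate:", robust)
```

Because each observation enters the empirical term through f_mu(x_i)^beta, a point far in the tail contributes almost nothing to the objective, so the minimizer stays near the bulk of the data, while the sample mean is dragged well above 1.5 by the single outlier. As beta tends to 0 the objective reduces to the negative log-likelihood and the ordinary MLE is recovered.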

Cite This Article

MDPI and ACS Style

Eguchi, S.; Kato, S. Entropy and Divergence Associated with Power Function and the Statistical Application. Entropy 2010, 12, 262-274.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.