Entropy 2010, 12(2), 262-274; doi:10.3390/e12020262

Entropy and Divergence Associated with Power Function and the Statistical Application

S. Eguchi * and S. Kato
The Institute of Statistical Mathematics, Tachikawa, Tokyo 190-8562, Japan
* Author to whom correspondence should be addressed.
Received: 29 December 2009 / Revised: 20 February 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)


In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of many phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but that optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and we discuss a local learning property associated with the method.
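The robustness described in the abstract can be illustrated with the density power divergence, a member of the same power-function family of divergences (the paper's projective power divergence is not reproduced here). The sketch below is an assumption-laden demo, not the authors' method: the power index `beta = 0.5`, the fixed scale `sigma = 1.0`, and the 5% contamination setup are all illustrative choices. For a normal location model with known scale, the power-divergence estimating equation reduces to a weighted mean whose weights exponentially down-weight points far from the current estimate.

```python
import numpy as np

def dpd_mean(x, sigma=1.0, beta=0.5, n_iter=100):
    """Location estimate minimizing a density power divergence.

    For N(mu, sigma^2) with sigma known, the estimating equation is
    sum_i (x_i - mu) * f_mu(x_i)^beta = 0, which yields a fixed-point
    iteration: a weighted mean with weights f_mu(x_i)^beta.
    beta -> 0 recovers the ordinary (non-robust) sample mean.
    """
    mu = np.median(x)  # robust starting point
    for _ in range(n_iter):
        w = np.exp(-beta * (x - mu) ** 2 / (2 * sigma ** 2))
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 95)
outliers = rng.normal(10.0, 1.0, 5)   # 5% gross contamination
x = np.concatenate([clean, outliers])

print(np.mean(x))    # MLE of the mean, pulled toward the outliers
print(dpd_mean(x))   # power-divergence estimate stays near the clean center
```

Because the weights decay exponentially in the squared residual, the five outliers near 10 receive essentially zero weight, so the power-divergence estimate tracks the clean bulk of the data while the maximum likelihood estimate is biased toward the contamination. This is the "local learning" flavor the abstract alludes to: the fit is driven by the region where the model density is large.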
Keywords: Tsallis entropy; projective power divergence; robustness
This is an open access article distributed under the Creative Commons Attribution License (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Eguchi, S.; Kato, S. Entropy and Divergence Associated with Power Function and the Statistical Application. Entropy 2010, 12, 262-274.

Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.