# Entropy and Divergence Associated with Power Function and the Statistical Application

## Abstract


## 1. Introduction

## 2. Power Divergence

**Proposition 1**.

## 3. Minimum Power Divergence Method
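As a concrete illustration of this section's topic, the following is a minimal sketch of minimum power divergence estimation, assuming the power (beta-) divergence in the sense of Basu et al.'s density power divergence, specialized to a normal location model with known scale. In that case the divergence term not involving the data is constant in the mean, and the estimating equation reduces to a density-weighted mean, which can be solved by fixed-point iteration. The function names and the iteration scheme here are illustrative, not taken from the paper.

```python
import math

def _beta_weights(x, mu, beta, sigma=1.0):
    # Each observation is weighted by f(x_i; mu, sigma)^beta (up to a constant
    # factor), so points far from the current fit get exponentially small
    # influence -- the source of the method's robustness.
    return [math.exp(-beta * (xi - mu) ** 2 / (2.0 * sigma ** 2)) for xi in x]

def min_power_divergence_mean(x, beta, iters=200):
    """Fixed-point iteration for the minimum power divergence estimate of a
    normal mean with known scale. As beta -> 0 the weights tend to 1 and the
    estimate reduces to the sample mean (the maximum likelihood estimate)."""
    mu = sum(x) / len(x)  # start at the beta = 0 solution
    for _ in range(iters):
        w = _beta_weights(x, mu, beta)
        mu = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    return mu
```

With one gross outlier in the sample, a moderate beta keeps the estimate near the bulk of the data, while beta near zero reproduces the non-robust sample mean.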

### 3.1. Super Robustness

### 3.2. Local Learning

## 4. Concluding Remarks

## Acknowledgements

## Appendix 1

## Appendix 2

**Proof**.

## Appendix 3


© 2010 by the authors; licensee Molecular Diversity Preservation International, Basel, Switzerland. This article is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license http://creativecommons.org/licenses/by/3.0/.

## Share and Cite

**MDPI and ACS Style**

Eguchi, S.; Kato, S.
Entropy and Divergence Associated with Power Function and the Statistical Application. *Entropy* **2010**, *12*, 262-274.
https://doi.org/10.3390/e12020262
