Reprint

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Edited by Leandro Pardo
May 2019
344 pages
  • ISBN 978-3-03897-936-4 (Paperback)
  • ISBN 978-3-03897-937-1 (PDF)

This book is a reprint of the Special Issue "New Developments in Statistical Information Theory Based on Entropy and Divergence Measures" that was published in Entropy.

Summary

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, covering a range of statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that even a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, based on minimum divergence estimators, for testing simple and composite null hypotheses in general parametric models.
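As a minimal illustrative sketch (the particular divergence family, tuning constant, and covariance estimator vary across the contributions), a Wald-type test of a simple null hypothesis H_0: θ = θ_0, built on a minimum density power divergence estimator with tuning parameter β > 0, typically takes the form

\[ W_n = n\,\big(\hat{\theta}_\beta - \theta_0\big)^{\top}\,\Sigma_\beta\big(\hat{\theta}_\beta\big)^{-1}\,\big(\hat{\theta}_\beta - \theta_0\big), \]

where \hat{\theta}_\beta is the minimum density power divergence estimator and \Sigma_\beta(\theta) its asymptotic covariance matrix. Under the null hypothesis, W_n is asymptotically chi-squared with degrees of freedom equal to the dimension of θ; as β → 0 the estimator reduces to the maximum likelihood estimator and W_n to the classical Wald statistic, while β > 0 trades a small loss of efficiency for robustness against outlying observations.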

Format
  • Paperback
License
© 2019 by the authors; CC BY-NC-ND license
Keywords
sparse; robust; divergence; MM algorithm; Bregman divergence; generalized linear model; local-polynomial regression; model check; nonparametric test; quasi-likelihood; semiparametric model; Wald statistic; composite likelihood; maximum composite likelihood estimator; Wald test statistic; composite minimum density power divergence estimator; Wald-type test statistics; Bregman divergence; general linear model; hypothesis testing; influence function; robust; Wald-type test; log-linear models; ordinal classification variables; association models; correlation models; minimum penalized ϕ-divergence estimator; consistency; asymptotic normality; goodness-of-fit; bootstrap distribution estimator; thematic quality assessment; relative entropy; logarithmic super divergence; robustness; minimum divergence inference; generalized Rényi entropy; minimum divergence methods; robustness; single index model; model assessment; statistical distance; non-quadratic distance; total variation; mixture index of fit; Kullback-Leibler distance; divergence measure; γ-divergence; relative error estimation; robust estimation; information geometry; centroid; Bregman information; Hölder divergence; indoor localization; robustness; efficiency; Bayesian nonparametric; Bayesian semi-parametric; asymptotic property; minimum disparity methods; Hellinger distance; Bernstein-von Mises theorem; measurement errors; robust testing; two-sample test; misspecified hypothesis and alternative; 2-alternating capacities; composite hypotheses; corrupted data; least-favorable hypotheses; Neyman-Pearson test; divergence-based testing; Chernoff-Stein lemma; compressed data; Hellinger distance; representation formula; iterated limits; influence function; consistency; asymptotic normality; location-scale family