Special Issue "New Developments in Statistical Information Theory Based on Entropy and Divergence Measures"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (30 September 2018)
Professor Leandro Pardo
Statistics and Operations Research, Faculty of Mathematics, Universidad Complutense de Madrid, 28040 Madrid, Spain
Interests: minimum divergence estimators: robustness and efficiency; robust test procedures based on minimum divergence estimators; robust test procedures in composite likelihood, empirical likelihood, change point, and time series
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. In fact, the so-called Statistical Information Theory has been the subject of much statistical research over the last fifty years. Minimum divergence (or minimum distance) estimators have been used successfully in models for continuous and discrete data due to their robustness properties. Divergence statistics, i.e., those obtained by replacing one or both arguments of a divergence measure by suitable estimators, have become a very good alternative to the classical likelihood ratio test in both continuous and discrete models, as well as to the classical Pearson-type statistics in discrete models.
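As a minimal illustration of such divergence statistics (a sketch for this announcement only; the function name and the data are illustrative, not taken from any cited work), the Cressie–Read family of power divergence statistics contains both the classical Pearson chi-square statistic (λ = 1) and the likelihood ratio statistic G² (λ → 0) as special cases:

```python
import math

def power_divergence_stat(observed, probs, lam):
    """Cressie-Read power divergence statistic for a multinomial model.

    T_lam = 2 / (lam * (lam + 1)) * sum_i O_i * ((O_i / E_i)**lam - 1)

    lam = 1   -> Pearson chi-square statistic
    lam -> 0  -> likelihood ratio statistic G^2 (limiting case)
    lam -> -1 -> modified likelihood ratio statistic (limiting case)
    """
    n = sum(observed)
    expected = [n * p for p in probs]
    if abs(lam) < 1e-12:  # limit lam -> 0: G^2
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    if abs(lam + 1.0) < 1e-12:  # limit lam -> -1: modified G^2
        return 2.0 * sum(e * math.log(e / o)
                         for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

obs = [30, 50, 20]           # observed cell counts (illustrative)
p0  = [0.25, 0.50, 0.25]     # hypothesised cell probabilities
pearson = power_divergence_stat(obs, p0, 1.0)   # Pearson X^2 = 2.0 here
g2      = power_divergence_stat(obs, p0, 0.0)   # G^2, close to X^2
```

Under the null hypothesis, every member of this family shares the same chi-square limiting distribution, which is why the choice of λ can be driven by robustness and small-sample considerations rather than asymptotics.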
Divergence statistics based on maximum likelihood estimators, like Wald's statistics, likelihood ratio statistics, and Rao's score statistics, enjoy several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. The practical importance of robust test procedures is therefore beyond doubt, and such procedures are helpful for solving many real-life problems in which the observed sample contains outliers. For this reason, robust versions of the classical Wald test statistic, for testing simple and composite null hypotheses in general parametric models, have been introduced and studied in the statistical literature in recent years. These test statistics are based on minimum divergence estimators instead of the maximum likelihood estimator and have been applied to many different statistical problems: censoring, equality of means in normal and lognormal models, logistic regression models in particular and GLMs in general, etc.
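A minimal sketch of why minimum divergence estimators yield robust procedures (illustrative code with an assumed one-parameter normal model and a simple grid-search optimizer, not an implementation from any particular paper): the minimum density power divergence estimator of Basu et al. down-weights observations with low model density, so a single gross outlier barely moves it, whereas the maximum likelihood estimator (the sample mean) is dragged toward the outlier.

```python
import math

def dpd_objective(theta, data, alpha):
    """Empirical density power divergence objective for N(theta, 1):
    H_n(theta) = int f_theta^(1+alpha) dx - (1 + 1/alpha) * mean(f_theta(x_i)^alpha).
    For N(theta, 1) the integral equals (2*pi)^(-alpha/2) / sqrt(1 + alpha)."""
    const = (2 * math.pi) ** (-alpha / 2) / math.sqrt(1 + alpha)
    mean_dens = sum(
        ((2 * math.pi) ** -0.5 * math.exp(-(x - theta) ** 2 / 2)) ** alpha
        for x in data
    ) / len(data)
    return const - (1 + 1 / alpha) * mean_dens

def mdpde(data, alpha, n_grid=4000):
    """Crude grid-search minimum density power divergence estimate of theta."""
    lo, hi = min(data), max(data)
    grid = [lo + i * (hi - lo) / n_grid for i in range(n_grid + 1)]
    return min(grid, key=lambda t: dpd_objective(t, data, alpha))

data = [0.1, -0.2, 0.3, 0.0, 10.0]   # four inliers plus one gross outlier
mle = sum(data) / len(data)          # MLE (sample mean): dragged to 2.04
robust = mdpde(data, alpha=0.5)      # stays near the bulk of the data
```

Here α > 0 trades a small efficiency loss for robustness; α → 0 recovers the maximum likelihood estimator, and Wald-type tests simply replace the MLE by such a robust estimator in the usual quadratic-form statistic.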
The scope of this Special Issue is to present new and original research on minimum divergence estimators and divergence statistics, from both a theoretical and an applied point of view, in different statistical problems, with special emphasis on efficiency and robustness. Manuscripts summarizing the most recent state of the art on these topics are also welcome.
Prof. Dr. Leandro Pardo
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- Divergence measures
- Entropy measures
- Minimum divergence estimator (MDE)
- Testing based on divergence measures
- Wald-type tests based on MDE
- Parametric models
- Complex random sampling
- Composite likelihood
- Empirical likelihood
- Change point
- Generalized linear models (GLMs)