Open Access Article
Entropy 2016, 18(8), 277; doi:10.3390/e18080277

A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models

Laboratoire de Statistique Théorique et Appliquée, Université Pierre et Marie Curie, 4 place Jussieu, 75005 Paris, France
This paper is an extended version of our paper published in the 2nd Conference on Geometric Science of Information, Palaiseau, France, 28–30 October 2015.
* Author to whom correspondence should be addressed.
Academic Editors: Frédéric Barbaresco and Frank Nielsen
Received: 11 June 2016 / Revised: 20 July 2016 / Accepted: 21 July 2016 / Published: 27 July 2016
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)

Abstract

Estimators derived from a divergence criterion, such as φ-divergences, are generally more robust than maximum likelihood estimators. We are particularly interested in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. In this paper, we present an iterative proximal point algorithm that permits the calculation of such an estimator. By construction, the algorithm contains the well-known Expectation-Maximization (EM) algorithm. Our work is based on Tseng's paper on the likelihood function, and we provide convergence properties by adapting his ideas. We improve Tseng's results by relaxing the identifiability condition on the proximal term, a condition which does not hold for most mixture models and is hard to verify for non-mixture ones. Convergence of the EM algorithm for a two-component Gaussian mixture is discussed in the spirit of our approach. Several experimental results on mixture models are provided to confirm the validity of the approach.
Keywords: φ-divergences; robust estimation; EM algorithm; proximal-point algorithms; mixture models
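
The abstract states that the proposed proximal point algorithm contains the classical EM algorithm by construction, and that convergence of EM in a two-component Gaussian mixture is studied as a special case. As a point of reference only, below is a minimal sketch of that classical EM iteration for a two-component Gaussian mixture; it is not the paper's divergence-based algorithm, and the function name `em_two_component_gmm` and its default parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def em_two_component_gmm(x, pi=0.5, mu=(0.0, 1.0), sigma=(1.0, 1.0),
                         n_iter=100, tol=1e-8):
    """Classical EM for the two-component Gaussian mixture
    pi * N(mu1, sigma1^2) + (1 - pi) * N(mu2, sigma2^2).

    Illustrative sketch only: the paper's proximal-point algorithm reduces to
    an EM-type iteration of this kind for a particular choice of divergence
    and proximal term.
    """
    mu1, mu2 = mu
    s1, s2 = sigma
    for _ in range(n_iter):
        # E-step: posterior probability that each observation comes from component 1
        d1 = pi * norm.pdf(x, mu1, s1)
        d2 = (1.0 - pi) * norm.pdf(x, mu2, s2)
        tau = d1 / (d1 + d2)

        # M-step: weighted maximum likelihood updates of the parameters
        pi_new = tau.mean()
        mu1_new = np.sum(tau * x) / np.sum(tau)
        mu2_new = np.sum((1 - tau) * x) / np.sum(1 - tau)
        s1_new = np.sqrt(np.sum(tau * (x - mu1_new) ** 2) / np.sum(tau))
        s2_new = np.sqrt(np.sum((1 - tau) * (x - mu2_new) ** 2) / np.sum(1 - tau))

        shift = abs(pi_new - pi) + abs(mu1_new - mu1) + abs(mu2_new - mu2)
        pi, mu1, mu2, s1, s2 = pi_new, mu1_new, mu2_new, s1_new, s2_new
        if shift < tol:
            break
    return pi, (mu1, mu2), (s1, s2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 1, 700)])
    print(em_two_component_gmm(x))
```

In the proximal-point reading of EM that goes back to Tseng, each such iteration can be viewed as maximizing the log-likelihood penalized by a Kullback-Leibler proximal term between successive iterates; as the abstract indicates, the paper develops an analogous iteration for the dual φ-divergence criterion, with the full formulation given in the text.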
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Al Mohamad, D.; Broniatowski, M. A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models. Entropy 2016, 18, 277.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
