Duality of Maximum Entropy and Minimum Divergence
Abstract: We discuss a special class of generalized divergence measures defined by generator functions. Any divergence measure in the class separates into the difference between a cross entropy and a diagonal entropy. The diagonal entropy measure in the class is associated with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization for an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. This yields a natural extension of the classical theory of the maximum likelihood method under the maximum entropy model based on Boltzmann-Gibbs-Shannon entropy. We discuss the duality in detail for Tsallis entropy as a typical example.
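The abstract's central identity, that a divergence in the class separates into cross entropy minus diagonal entropy, can be illustrated numerically with the β-power (density-power) family, which is closely related to Tsallis entropy. The function names and the specific generator convention below are illustrative assumptions, not necessarily the paper's notation; this is a minimal sketch on discrete distributions:

```python
import numpy as np

def beta_cross_entropy(p, q, beta):
    # Cross entropy C_beta(p, q) = -sum_i [ p_i q_i^beta / beta - q_i^(beta+1) / (beta+1) ]
    # (one common convention for the beta-power generator; assumed here for illustration)
    return -np.sum(p * q**beta / beta - q**(beta + 1) / (beta + 1))

def beta_entropy(p, beta):
    # Diagonal entropy H_beta(p) = C_beta(p, p)
    return beta_cross_entropy(p, p, beta)

def beta_divergence(p, q, beta):
    # The separation discussed in the abstract:
    # D_beta(p, q) = C_beta(p, q) - H_beta(p)
    return beta_cross_entropy(p, q, beta) - beta_entropy(p, beta)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
beta = 0.5

# The divergence is nonnegative and vanishes on the diagonal p = q
assert beta_divergence(p, q, beta) > 0
assert abs(beta_divergence(p, p, beta)) < 1e-12

# As beta -> 0 the beta-divergence recovers the Kullback-Leibler divergence,
# matching the Boltzmann-Gibbs-Shannon limit mentioned in the abstract
kl = np.sum(p * np.log(p / q))
assert abs(beta_divergence(p, q, 1e-6) - kl) < 1e-4
```

Minimizing such a divergence over q for fixed data p gives the minimum divergence estimation referred to above, while the distributions maximizing the diagonal entropy under moment constraints form the dual maximum entropy model.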
Eguchi, S.; Komori, O.; Ohara, A. Duality of Maximum Entropy and Minimum Divergence. Entropy 2014, 16, 3552-3572.