Projective Power Entropy and Maximum Tsallis Entropy Distributions
Abstract: We discuss a one-parameter family of generalized cross entropies between two distributions indexed by the power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy when the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterization of the conditions that uniquely determine the projective power entropy up to the power index. A close relation of the entropy with the Lebesgue space Lp and its dual Lq is explored, in which the escort distribution is associated with an interesting property. When we consider maximum Tsallis entropy distributions under constraints on the mean vector and variance matrix, the model becomes a multivariate q-Gaussian model with elliptical contours, including the Gaussian and t-distribution models. We discuss statistical estimation by minimization of the empirical loss associated with the projective power entropy. It is shown that the minimum loss estimators for the mean vector and variance matrix under the maximum entropy model are the sample mean vector and the sample variance matrix. The escort distribution of the maximum entropy distribution plays the key role in the derivation.
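For orientation, the following is a minimal LaTeX sketch of the standard forms behind the abstract; the notation and normalization may differ from the paper's, and the scale constant beta below is introduced here only for illustration.

% Tsallis entropy of a density f with power index q (standard form):
S_q(f) = \frac{1}{q-1}\left(1 - \int f(x)^q \, dx\right),
\qquad
\lim_{q\to 1} S_q(f) = -\int f(x)\log f(x)\, dx .

% Maximizing S_q subject to a fixed mean vector \mu and variance matrix \Sigma
% yields a multivariate q-Gaussian with elliptical contours
% (Gaussian in the limit q -> 1, t-type tails for q > 1):
f_q(x) \;\propto\; \bigl[\, 1 - (1-q)\,\beta\,(x-\mu)^{\top}\Sigma^{-1}(x-\mu) \,\bigr]_{+}^{\tfrac{1}{1-q}},
% where \beta > 0 is a normalizing scale constant tied to \Sigma (assumed here)
% and [u]_+ = \max(u,0).

% The escort density rescales f by a power of q:
\tilde f_q(x) \;=\; \frac{f_q(x)^q}{\int f_q(y)^q \, dy}.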
Eguchi, S.; Komori, O.; Kato, S. Projective Power Entropy and Maximum Tsallis Entropy Distributions. Entropy 2011, 13, 1746-1764.