A Characterization of Entropy in Terms of Information Loss
Abstract
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
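The central quantity of the abstract, the information loss of a measure-preserving map f between finite probability spaces, is simply the entropy of the source distribution minus the entropy of its pushforward along f. A minimal numerical sketch (function names are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def pushforward(p, f, n_out):
    """Push a distribution p on {0, ..., len(p)-1} forward along f
    to a distribution on {0, ..., n_out-1}."""
    q = [0.0] * n_out
    for i, x in enumerate(p):
        q[f(i)] += x
    return q

def information_loss(p, f, n_out):
    """Information loss of f: H(p) - H(f_* p)."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f, n_out))

# Merging the two equally likely outcomes 0 and 1 loses (1/2) log 2 nats:
p = [0.25, 0.25, 0.5]
f = lambda i: 0 if i < 2 else 1   # sends outcomes 0, 1 to 0 and outcome 2 to 1
loss = information_loss(p, f, 2)  # equals 0.5 * log 2
```

This also illustrates the "conditional entropy" reading of the abstract: the loss equals the entropy of the original random variable conditioned on its image under f, here the probability 1/2 of landing in the merged fiber times the log 2 nats of uncertainty remaining inside it.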
Baez, J.C.; Fritz, T.; Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy 2011, 13, 1945-1957.