Open Access: this article is freely available.
A Characterization of Entropy in Terms of Information Loss
Department of Mathematics, University of California, Riverside, CA 92521, USA
Centre for Quantum Technologies, National University of Singapore, 117543, Singapore
Institut de Ciències Fotòniques, Mediterranean Technology Park, 08860 Castelldefels (Barcelona), Spain
School of Mathematics and Statistics, University of Glasgow, Glasgow G12 8QW, UK
* Author to whom correspondence should be addressed.
Received: 11 October 2011; in revised form: 18 November 2011 / Accepted: 21 November 2011 / Published: 24 November 2011
Abstract: There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
Keywords: Shannon entropy; Tsallis entropy; information theory; measure-preserving function
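The "information loss" described in the abstract can be made concrete with a short sketch: a measure-preserving function f between finite probability spaces pushes a distribution p forward to a distribution q, and the loss is H(p) − H(q) ≥ 0. The code below is only an illustration of that definition; the function names are ours, not the paper's.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a finite probability distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def information_loss(p, f):
    """Information loss H(p) - H(f_*p) of a function f applied to a
    distribution p, given as a list indexed by the outcomes 0..len(p)-1.
    f_*p is the pushforward: (f_*p)[y] = sum of p[x] over all x with f(x) = y.
    """
    pushforward = {}
    for x, px in enumerate(p):
        y = f(x)
        pushforward[y] = pushforward.get(y, 0.0) + px
    return shannon_entropy(p) - shannon_entropy(pushforward.values())

# Example: merging outcomes pairwise in a uniform 4-outcome distribution
# loses exactly log 2 nats of information (one fair coin flip's worth).
p = [0.25, 0.25, 0.25, 0.25]
loss = information_loss(p, lambda x: x // 2)  # ≈ 0.6931 = log 2
```

The example matches the paper's setting: since f is deterministic, the loss equals the entropy of the original variable conditioned on its image under f.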
Cite This Article
MDPI and ACS Style
Baez, J.C.; Fritz, T.; Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy 2011, 13, 1945-1957.
AMA Style
Baez JC, Fritz T, Leinster T. A Characterization of Entropy in Terms of Information Loss. Entropy. 2011; 13(11):1945-1957.
Chicago/Turabian Style
Baez, John C.; Fritz, Tobias; Leinster, Tom. 2011. "A Characterization of Entropy in Terms of Information Loss." Entropy 13, no. 11: 1945-1957.