Entropy 2011, 13(11), 1945-1957; doi:10.3390/e13111945
Article

A Characterization of Entropy in Terms of Information Loss

John C. Baez 1,2, Tobias Fritz 3,* and Tom Leinster 4
1 Department of Mathematics, University of California, Riverside, CA 92521, USA
2 Centre for Quantum Technologies, National University of Singapore, 117543, Singapore
3 Institut de Ciències Fotòniques, Mediterranean Technology Park, 08860 Castelldefels (Barcelona), Spain
4 School of Mathematics and Statistics, University of Glasgow, Glasgow G12 8QW, UK
* Author to whom correspondence should be addressed.
Received: 11 October 2011 / Revised: 18 November 2011 / Accepted: 21 November 2011 / Published: 24 November 2011

Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
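To make the abstract's central notion concrete, here is a brief sketch of the definition it describes; the symbols f, F, H and the probability spaces (X, p), (Y, q) are notation assumed here for illustration, not taken from this page. A measure-preserving function between finite probability spaces is a map

\[ f \colon (X, p) \to (Y, q), \qquad q_j = \sum_{i \in f^{-1}(j)} p_i , \]

and its information loss is the change in Shannon entropy,

\[ F(f) = H(p) - H(q), \qquad H(p) = -\sum_{i \in X} p_i \log p_i . \]

Because q is the pushforward of p along f, this quantity equals the conditional entropy H(X \mid f(X)) of a random variable X distributed according to p, which is the sense in which information loss is a special case of conditional entropy. The characterization then says that, up to a nonnegative constant factor, this F is the only assignment of numbers to measure-preserving functions that is functorial, convex-linear and continuous.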
Keywords: Shannon entropy; Tsallis entropy; information theory; measure-preserving function
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Baez, J.C.; Fritz, T.; Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy 2011, 13, 1945-1957.
