Entropy 2008, 10(3), 261-273; doi:10.3390/e10030261
Article

Axiomatic Characterizations of Information Measures

Received: 1 September 2008 / Accepted: 12 September 2008 / Published: 19 September 2008
Abstract: Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, …, N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
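The first two measures named in the abstract, Shannon entropy and Kullback I-divergence, can be computed directly from their standard definitions. The following is a minimal illustrative sketch (natural-logarithm convention; the function names are ours, not from the article):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log; 0 log 0 := 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback I-divergence D(p||q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

# Entropy is maximal (log 4) for the uniform distribution on 4 outcomes.
print(shannon_entropy(uniform))
print(shannon_entropy(skewed))

# I-divergence is nonnegative, and zero iff the distributions coincide.
print(kl_divergence(skewed, uniform))
```

Here the uniform distribution attains the maximum entropy log 4 ≈ 1.386, and the divergence of the skewed distribution from the uniform one is strictly positive, in line with the basic properties the axiomatic characterizations single out.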
Keywords: Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Csiszár, I. Axiomatic Characterizations of Information Measures. Entropy 2008, 10, 261-273.



Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.