Entropy 2008, 10(3), 261-273; doi:10.3390/e10030261
Article

Axiomatic Characterizations of Information Measures

Imre Csiszár

Rényi Institute of Mathematics, Hungarian Academy of Sciences, P.O. Box 127, H-1364 Budapest, Hungary
Received: 1 September 2008; Accepted: 12 September 2008 / Published: 19 September 2008
Abstract: Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, …, N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
Keywords: Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
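The two measures named first in the abstract have standard definitions that can be illustrated with a short sketch (the function names and the choice of base-2 logarithms are mine, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def i_divergence(p, q):
    """Kullback I-divergence D(P||Q) = sum_i p_i * log2(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity);
    terms with p_i = 0 contribute nothing.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin has one bit of entropy; divergence of a distribution
# from itself is zero, and from a different distribution it is positive.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(i_divergence([0.9, 0.1], [0.5, 0.5])) # > 0
```

The axiomatic results surveyed in the paper characterize exactly these functionals (and their generalizations) from postulates such as additivity and recursivity, rather than from the formulas above.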

Cite This Article


MDPI and ACS Style

Csiszár, I. Axiomatic Characterizations of Information Measures. Entropy 2008, 10, 261-273.

AMA Style

Csiszár I. Axiomatic Characterizations of Information Measures. Entropy. 2008; 10(3):261-273.

Chicago/Turabian Style

Csiszár, Imre. 2008. "Axiomatic Characterizations of Information Measures." Entropy 10, no. 3: 261-273.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.