Axiomatic Characterizations of Information Measures
Imre Csiszár
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
Keywords: Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
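For orientation, the two quantities at the center of the survey can be stated in standard notation (these displays are added for reference and are not quoted from the paper itself): for probability distributions P = (p_1, ..., p_N) and Q = (q_1, ..., q_N),

\[
  H(P) = -\sum_{i=1}^{N} p_i \log p_i
  \qquad\text{(Shannon entropy)},
\]
\[
  D(P \,\|\, Q) = \sum_{i=1}^{N} p_i \log \frac{p_i}{q_i}
  \qquad\text{(Kullback I-divergence)}.
\]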