The Kullback–Leibler Information Function for Infinite Measures
Department of Mathematics, IT and Landscape Architecture, John Paul II Catholic University of Lublin, Konstantynów Str. 1H, 20-708 Lublin, Poland
Department of Mechanics and Mathematics, Belarusian State University, Nezavisimosti Ave. 4, 220030 Minsk, Belarus
Author to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Received: 21 July 2016 / Revised: 1 December 2016 / Accepted: 12 December 2016 / Published: 15 December 2016
In this paper, we introduce the Kullback–Leibler information function and prove the local large deviation principle for σ-finite measures μ and finitely additive probability measures ν. In particular, the entropy of a continuous probability distribution ν on the real axis is interpreted as the exponential rate of asymptotics for the Lebesgue measure of the set of those samples that generate empirical measures close to ν in a suitable fine topology.
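As a numerical illustration (not taken from the paper): when the reference measure μ is Lebesgue measure on the real line, the Kullback–Leibler information of a probability density p reduces to ∫ p(x) log p(x) dx, the negative differential entropy of ν. The sketch below, assuming a standard normal density, checks this against the closed form h = ½ log(2πe); the function names are illustrative only.

```python
import math

# Illustrative sketch: for a probability density p on the real line, the
# Kullback-Leibler information relative to Lebesgue measure is
#   D(nu || Leb) = integral of p(x) * log p(x) dx = -h(nu),
# the negative differential entropy. For the standard normal density the
# closed form is h = 0.5 * log(2 * pi * e).

def normal_pdf(x, sigma=1.0):
    """Density of a centered normal distribution with scale sigma."""
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def kl_to_lebesgue(pdf, lo=-10.0, hi=10.0, n=200001):
    """Trapezoid-rule approximation of the integral of p(x) * log p(x)."""
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        p = pdf(x)
        term = p * math.log(p) if p > 0 else 0.0
        weight = 0.5 if i in (0, n - 1) else 1.0
        total += weight * term * h
    return total

numeric = kl_to_lebesgue(normal_pdf)
closed_form = -0.5 * math.log(2 * math.pi * math.e)
print(numeric, closed_form)  # the two values agree to high precision
```

The interval [−10, 10] is wide enough that the Gaussian tails contribute negligibly, so the truncated integral matches the closed form to many digits.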
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Bakhtin, V.; Sokal, E. The Kullback–Leibler Information Function for Infinite Measures. Entropy 2016, 18, 448.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.