Open Access Article

The Kullback–Leibler Information Function for Infinite Measures

by V. Bakhtin and E. Sokal
1 Department of Mathematics, IT and Landscape Architecture, John Paul II Catholic University of Lublin, Konstantynów Str. 1H, 20-708 Lublin, Poland
2 Department of Mechanics and Mathematics, Belarusian State University, Nezavisimosti Ave. 4, 220030 Minsk, Belarus
* Author to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Entropy 2016, 18(12), 448; https://doi.org/10.3390/e18120448
Received: 21 July 2016 / Revised: 1 December 2016 / Accepted: 12 December 2016 / Published: 15 December 2016
(This article belongs to the Section Information Theory, Probability and Statistics)
In this paper, we introduce the Kullback–Leibler information function ρ(ν, μ) and prove the local large deviation principle for σ-finite measures μ and finitely additive probability measures ν. In particular, the entropy of a continuous probability distribution ν on the real axis is interpreted as the exponential rate of asymptotics for the Lebesgue measure of the set of those samples that generate empirical measures close to ν in a suitable fine topology.
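For orientation, the following is a minimal sketch of a standard form of the Kullback–Leibler information function; the paper's precise definition for finitely additive ν and σ-finite μ may differ in detail, so this should be read as an assumption rather than the authors' exact formula:

% assumption: standard relative-entropy form, finite only when \nu \ll \mu
\[
  \rho(\nu,\mu) \;=\;
  \begin{cases}
    \displaystyle \int \log\frac{d\nu}{d\mu}\, d\nu, & \text{if } \nu \ll \mu, \\[4pt]
    +\infty, & \text{otherwise.}
  \end{cases}
\]

With this convention, when μ is Lebesgue measure on the real axis, ρ(ν, μ) equals minus the differential entropy of ν, which is consistent with the abstract's interpretation of the entropy as an exponential rate of asymptotics.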
Keywords: Kullback–Leibler information function; entropy; large deviation principle; empirical measure; fine topology; spectral potential
MDPI and ACS Style

Bakhtin, V.; Sokal, E. The Kullback–Leibler Information Function for Infinite Measures. Entropy 2016, 18, 448.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
