Entropy and Information Inequalities
- Inequalities for discrete domains: The paper by Abbe, Li, and Madiman [1] explores entropy inequalities for sums and differences of random variables on cyclic groups (the quantities in question are recalled in the first display after this list) and gives an application to the design of polar codes over non-binary alphabets. In a completely different interpretation of “discrete”, the paper by Harremoës [2] explores entropy inequalities for random variables whose functional dependencies are expressed as lattices; in particular, it characterizes when Shannon-type inequalities suffice to completely describe the entropy region.
- Inequalities in functional analysis: The Poincaré and log-Sobolev inequalities (recalled in the second display below) are crucial tools in probability and functional analysis. Shigekawa [3] explores the exponential rate of convergence of Markov semigroups under the assumption that a log-Sobolev inequality holds. Schlichting [4] investigates Poincaré and log-Sobolev constants for mixture distributions whose components are themselves assumed to satisfy Poincaré and log-Sobolev inequalities. Liu, Courtade, Cuff, and Verdú [5] show how to combine the Brascamp–Lieb and Barthe inequalities from functional analysis into a single entropy inequality. Sason [6] proves new inequalities for f-divergences (generalizations of the well-known Kullback–Leibler divergence, also recalled below) and shows applications to hypothesis testing.
- Geometry-inspired inequalities: Marsiglietti and Kostina [7] prove a lower bound on the differential entropy of log-concave random vectors and present applications of this bound. Hao and Jog [8] derive dual Loomis–Whitney inequalities via information theory (the classical Loomis–Whitney inequality and its entropic counterpart are recalled in the third display below).
- Miscellaneous: Mossel and Ohannessian [9] show the impossibility of learning rare events without making distributional assumptions. Their paper has an explicit construction and is relevant to estimating the “missing mass” in distribution estimation (defined in the final display below). Gu, Zhang, and Yu [10] contribute to the area of rough random theory by proving basic probabilistic inequalities in this setting.
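For the reader’s convenience, we recall the standard objects behind these contributions; the displays below are classical definitions and statements rather than results from the papers themselves. For the discrete setting of [1], the Shannon entropy of a random variable X taking values in the cyclic group Z/nZ is

$$H(X) = -\sum_{k=0}^{n-1} \mathbb{P}(X = k)\,\log \mathbb{P}(X = k),$$

and a typical question is how the entropies of the sum X + Y and the difference X − Y of independent X and Y, with both operations taken modulo n, can compare.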
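For [3,4], a probability measure μ on R^d satisfies a Poincaré inequality with constant C_P, and a log-Sobolev inequality with constant C_LS (normalization conventions vary), if for all sufficiently smooth test functions g

$$\operatorname{Var}_\mu(g) \le C_P \int |\nabla g|^2 \, d\mu \quad \text{and} \quad \operatorname{Ent}_\mu(g^2) \le C_{LS} \int |\nabla g|^2 \, d\mu,$$

where $\operatorname{Ent}_\mu(h) = \int h \log h \, d\mu - \left(\int h \, d\mu\right) \log\left(\int h \, d\mu\right)$ for h ≥ 0. For [6], the f-divergence generated by a convex function f with f(1) = 0 is

$$D_f(P \,\|\, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ,$$

and the choice f(t) = t log t recovers the Kullback–Leibler divergence.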
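For [8], the classical Loomis–Whitney inequality bounds the volume of a compact set K ⊂ R^n by the volumes of its coordinate projections, and Han’s inequality is its well-known entropic counterpart:

$$|K|^{n-1} \le \prod_{i=1}^{n} |\pi_i(K)| \quad \text{and} \quad (n-1)\,H(X_1,\dots,X_n) \le \sum_{i=1}^{n} H(X_1,\dots,X_{i-1},X_{i+1},\dots,X_n),$$

where π_i denotes orthogonal projection onto the coordinate hyperplane orthogonal to e_i. The dual inequalities of [8], loosely speaking, replace projections by sections.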
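Finally, for [9], given i.i.d. samples X_1, …, X_n from a distribution p on a countable alphabet, the missing mass is the total probability of the symbols not observed in the sample:

$$M_0 = \sum_{x} p(x)\,\mathbf{1}\{x \notin \{X_1, \dots, X_n\}\}.$$

The impossibility result of [9] says that no estimator of M_0 can come with distribution-free guarantees.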
References
- Abbe, E.; Li, J.; Madiman, M. Entropies of Weighted Sums in Cyclic Groups and an Application to Polar Codes. Entropy 2017, 19, 235.
- Harremoës, P. Entropy Inequalities for Lattices. Entropy 2018, 20, 784.
- Shigekawa, I. Logarithmic Sobolev Inequality and Exponential Convergence of a Markovian Semigroup in the Zygmund Space. Entropy 2018, 20, 220.
- Schlichting, A. Poincaré and Log-Sobolev Inequalities for Mixtures. Entropy 2019, 21, 89.
- Liu, J.; Courtade, T.A.; Cuff, P.W.; Verdú, S. A Forward-Reverse Brascamp-Lieb Inequality: Entropic Duality and Gaussian Optimality. Entropy 2018, 20, 418.
- Sason, I. On f-Divergences: Integral Representations, Local Behavior, and Inequalities. Entropy 2018, 20, 383.
- Marsiglietti, A.; Kostina, V. A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications. Entropy 2018, 20, 185.
- Hao, J.; Jog, V. Dual Loomis-Whitney Inequalities via Information Theory. Entropy 2019, 21, 809.
- Mossel, E.; Ohannessian, M.I. On the Impossibility of Learning the Missing Mass. Entropy 2019, 21, 28.
- Gu, Y.; Zhang, Q.; Yu, L. Some Inequalities Combining Rough and Random Information. Entropy 2018, 20, 211.