Editorial

Entropy and Information Inequalities

Varun Jog 1,* and James Melbourne 2
1 Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA
2 Department of Mathematical Sciences, University of Delaware, Newark, DE 19716, USA
* Author to whom correspondence should be addressed.
Entropy 2020, 22(3), 320; https://doi.org/10.3390/e22030320
Submission received: 26 February 2020 / Accepted: 3 March 2020 / Published: 12 March 2020
(This article belongs to the Special Issue Entropy and Information Inequalities)
Entropy and information inequalities are vitally important in many areas of mathematics and engineering. In this special issue, we solicited contributions that showcase the wide applicability of information inequalities and serve as a repository of the mathematical tools and techniques needed to prove them. The papers we received span several areas of mathematics, including information theory, geometry, functional analysis, hypothesis testing, and estimation theory. We are confident that this special issue will lead to an exchange of ideas between these areas and foster new interdisciplinary research at their boundaries. The ten contributions may be broadly divided into four categories: (1) Inequalities for discrete domains; (2) Inequalities in functional analysis; (3) Geometry-inspired inequalities; and (4) Miscellaneous topics. For readers less familiar with some of the objects mentioned below, representative definitions and inequalities are recalled after the list.
  • Inequalities for discrete domains:
    The paper by Abbe, Li, and Madiman [1] explores entropy inequalities for sums and differences of random variables on cyclic groups, and gives an application to the design of polar codes over non-binary alphabets. In a completely different interpretation of “discrete”, the paper by Harremoës [2] explores entropy inequalities for random variables whose functional dependencies are expressed as lattices. In particular, the paper investigates when Shannon-type inequalities suffice to completely characterize the entropy region.
  • Inequalities in functional analysis:
    The Poincaré and log-Sobolev inequalities are crucial tools in probability and functional analysis. The paper by Shigekawa [3] explores the exponential rate of convergence of Markov semigroups under the assumption that a log-Sobolev inequality holds. Schlichting [4] investigates Poincaré and log-Sobolev constants of mixture distributions whose components are assumed to satisfy Poincaré and log-Sobolev inequalities. Liu, Courtade, Cuff, and Verdú [5] show how to combine the Brascamp-Lieb and Barthe inequalities from functional analysis into a single entropy inequality. Sason [6] proves new inequalities for f-divergences (generalizations of the well-known Kullback–Leibler divergence) and shows applications to hypothesis testing.
  • Geometry-inspired inequalities:
    The paper by Marsiglietti and Kostina [7] proves a new lower bound on the differential entropy of log-concave random vectors using ideas from convex geometry. Hao and Jog [8] derive new volume and surface-area inequalities in geometry using information-theoretic inequalities.
  • Miscellaneous:
    Mossel and Ohannessian [9] show the impossibility of learning rare events without making distributional assumptions. Their paper gives an explicit construction and is relevant to estimating the “missing mass” in distribution estimation. Gu, Zhang, and Yu [10] contribute to the area of rough random theory by proving basic probabilistic inequalities in this setting.
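For convenience, we recall representative (and standard) forms of some of the objects mentioned above; the results in the cited papers are sharper and more general, and constants depend on normalization conventions. For independent random variables X and Y taking values in a finite abelian group, conditioning gives
\[
H(X+Y) \;\ge\; H(X+Y \mid Y) \;=\; H(X),
\]
so that \( H(X+Y) \ge \max\{H(X), H(Y)\} \); the results of [1] concern finer comparisons of this kind, between \( H(X+Y) \) and \( H(X-Y) \), on cyclic groups. A probability measure \( \mu \) on \( \mathbb{R}^n \) satisfies a Poincaré inequality with constant \( C_P \) if, for all smooth functions f,
\[
\operatorname{Var}_\mu(f) \;\le\; C_P \int |\nabla f|^2 \, d\mu,
\]
and a logarithmic Sobolev inequality with constant \( C_{LS} \) if
\[
\operatorname{Ent}_\mu\!\left(f^2\right) \;\le\; 2\,C_{LS} \int |\nabla f|^2 \, d\mu,
\qquad
\operatorname{Ent}_\mu(g) = \int g \log g \, d\mu - \int g \, d\mu \, \log\!\int g \, d\mu.
\]
The f-divergences studied in [6] are defined, for a convex function f with \( f(1) = 0 \), by
\[
D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ,
\]
and the choice \( f(t) = t \log t \) recovers the Kullback–Leibler divergence. The classical Loomis-Whitney inequality, whose dual versions are studied in [8], states that every compact set \( K \subset \mathbb{R}^n \) satisfies
\[
|K|^{n-1} \;\le\; \prod_{i=1}^{n} |\pi_i(K)|,
\]
where \( \pi_i \) denotes the projection onto the coordinate hyperplane orthogonal to the i-th standard basis vector; its information-theoretic counterpart is Han's inequality,
\[
(n-1)\,H(X_1,\dots,X_n) \;\le\; \sum_{i=1}^{n} H(X_1,\dots,X_{i-1},X_{i+1},\dots,X_n).
\]
Finally, the missing mass studied in [9] is the total probability of the symbols not observed among n i.i.d. samples \( X_1,\dots,X_n \) drawn from a distribution p:
\[
M_n \;=\; \sum_{x} p(x)\,\mathbf{1}\{x \notin \{X_1,\dots,X_n\}\}.
\]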

Acknowledgments

We thank the authors of the above contributions, as well as the journal Entropy and MDPI for their support throughout this work.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abbe, E.; Li, J.; Madiman, M. Entropies of Weighted Sums in Cyclic Groups and an Application to Polar Codes. Entropy 2017, 19, 235.
  2. Harremoës, P. Entropy Inequalities for Lattices. Entropy 2018, 20, 784.
  3. Shigekawa, I. Logarithmic Sobolev Inequality and Exponential Convergence of a Markovian Semigroup in the Zygmund Space. Entropy 2018, 20, 220.
  4. Schlichting, A. Poincaré and Log-Sobolev Inequalities for Mixtures. Entropy 2019, 21, 89.
  5. Liu, J.; Courtade, T.A.; Cuff, P.W.; Verdú, S. A Forward-Reverse Brascamp-Lieb Inequality: Entropic Duality and Gaussian Optimality. Entropy 2018, 20, 418.
  6. Sason, I. On f-Divergences: Integral Representations, Local Behavior, and Inequalities. Entropy 2018, 20, 383.
  7. Marsiglietti, A.; Kostina, V. A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications. Entropy 2018, 20, 185.
  8. Hao, J.; Jog, V. Dual Loomis-Whitney Inequalities via Information Theory. Entropy 2019, 21, 809.
  9. Mossel, E.; Ohannessian, M.I. On the Impossibility of Learning the Missing Mass. Entropy 2019, 21, 28.
  10. Gu, Y.; Zhang, Q.; Yu, L. Some Inequalities Combining Rough and Random Information. Entropy 2018, 20, 211.
