Special Issue "Entropy and Information Inequalities"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: 30 November 2018

Special Issue Editors

Guest Editor
Dr. James Melbourne

Department of Mathematical Sciences, University of Delaware, Newark, DE 19716, USA
Interests: statistics; probability theory; geometry and topology
Guest Editor
Dr. Varun Jog

Department of Electrical & Computer Engineering, University of Wisconsin, Madison, WI 53706, USA
Interests: information theory; probability; convex geometry

Special Issue Information

Dear Colleagues,

In recent decades, information-theoretic inequalities have provided an interface with both neighboring and seemingly disparate disciplines, and the bridges built from these interactions have in turn produced new and richer understandings of information theory itself. Important connections have been established between information-theoretic inequalities and subjects including convex geometry, optimal transport, concentration of measure, probability, statistics, estimation theory, additive combinatorics, and thermodynamics, by way of inequalities such as the entropy power, Brunn–Minkowski, HWI, log-Sobolev, Sanov, sum-set, and Landauer inequalities, the monotonicity of entropy in the central limit theorem, and many more. Even within information theory, there has been renewed interest in developing inequalities in non-conventional settings, such as convolution inequalities for Rényi or Tsallis entropy, inequalities for f-divergences, and entropy inequalities over discrete spaces.
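To fix ideas, the first of these, the entropy power inequality of Shannon and Stam, states that for independent random vectors X and Y in $\mathbb{R}^n$ with densities,

$$N(X + Y) \;\ge\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2 h(X)/n},$$

where h denotes differential entropy, with equality if and only if X and Y are Gaussian with proportional covariance matrices.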

In this Special Issue, we invite contributions that establish novel information-theoretic inequalities (broadly defined), extend their applications, and deepen our understanding of information theory and related fields. Expository submissions are also welcome. We envisage that these contributions will sharpen insight within information theory while strengthening the growing bonds between the subject and the areas outlined above, with the hope of generating further inter-field and interdisciplinary dialog.

Dr. James Melbourne
Dr. Varun Jog
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • Rényi entropy
  • Tsallis entropy
  • Fisher information
  • entropic distances
  • information-theoretic inequalities
  • entropy power inequalities
  • logarithmic Sobolev inequalities

Published Papers (6 papers)

Research

A Forward-Reverse Brascamp-Lieb Inequality: Entropic Duality and Gaussian Optimality (Open Access Article)
Entropy 2018, 20(6), 418; https://doi.org/10.3390/e20060418
Received: 30 March 2018 / Revised: 25 May 2018 / Accepted: 25 May 2018 / Published: 30 May 2018
Abstract
Inspired by the forward and the reverse channels from the image-size characterization problem in network information theory, we introduce a functional inequality that unifies both the Brascamp-Lieb inequality and Barthe’s inequality, which is a reverse form of the Brascamp-Lieb inequality. For Polish spaces, we prove its equivalent entropic formulation using the Legendre-Fenchel duality theory. Capitalizing on the entropic formulation, we elaborate on a “doubling trick” used by Lieb and Geng-Nair to prove the Gaussian optimality in this inequality for the case of Gaussian reference measures.
(This article belongs to the Special Issue Entropy and Information Inequalities)
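For orientation, the classical (forward) Brascamp-Lieb inequality in Euclidean space states that, for surjective linear maps $B_j : \mathbb{R}^n \to \mathbb{R}^{n_j}$ and exponents $c_j \ge 0$,

$$\int_{\mathbb{R}^n} \prod_{j=1}^m f_j(B_j x)^{c_j}\, dx \;\le\; C \prod_{j=1}^m \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{c_j}$$

holds for all nonnegative integrable functions $f_j$, where, by Lieb's theorem, the best constant C is attained by centered Gaussian functions; Barthe's inequality is the corresponding reverse form.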

On f-Divergences: Integral Representations, Local Behavior, and Inequalities (Open Access Article)
Entropy 2018, 20(5), 383; https://doi.org/10.3390/e20050383
Received: 15 April 2018 / Revised: 7 May 2018 / Accepted: 15 May 2018 / Published: 19 May 2018
Abstract
This paper focuses on f-divergences and consists of three main contributions. The first introduces integral representations of a general f-divergence by means of the relative information spectrum. The second provides a new approach for the derivation of f-divergence inequalities and exemplifies their utility in the setup of Bayesian binary hypothesis testing. The last part further studies the local behavior of f-divergences.
(This article belongs to the Special Issue Entropy and Information Inequalities)
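As background, for a convex function f with $f(1) = 0$, the f-divergence between probability measures P and Q (with P absolutely continuous with respect to Q) is

$$D_f(P \| Q) = \int f\!\left( \frac{dP}{dQ} \right) dQ,$$

a family that includes relative entropy ($f(t) = t \log t$), total variation distance ($f(t) = |t - 1|/2$), and the $\chi^2$-divergence ($f(t) = (t - 1)^2$).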

Logarithmic Sobolev Inequality and Exponential Convergence of a Markovian Semigroup in the Zygmund Space (Open Access Article)
Entropy 2018, 20(4), 220; https://doi.org/10.3390/e20040220
Received: 29 December 2017 / Revised: 19 March 2018 / Accepted: 19 March 2018 / Published: 23 March 2018
Abstract
We investigate the exponential convergence of a Markovian semigroup in the Zygmund space under the assumption of a logarithmic Sobolev inequality. We show that the convergence rate is greater than the logarithmic Sobolev constant. To do this, we use the notion of entropy. We also give an example of a Laguerre operator: we determine its spectrum in the Orlicz space and discuss the relation between the logarithmic Sobolev constant and the spectral gap.
(This article belongs to the Special Issue Entropy and Information Inequalities)
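In one common normalization, a Markovian semigroup with Dirichlet form $\mathcal{E}$ and invariant probability measure $\mu$ satisfies a logarithmic Sobolev inequality with constant $\alpha > 0$ if

$$\mathrm{Ent}_\mu(f^2) \;\le\; \frac{2}{\alpha}\, \mathcal{E}(f, f), \qquad \mathrm{Ent}_\mu(g) := \int g \log g \, d\mu - \int g \, d\mu \, \log \int g \, d\mu,$$

which classically yields exponential decay of entropy along the semigroup at a rate governed by $\alpha$; the article above studies the analogous convergence in the Zygmund space.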

Some Inequalities Combining Rough and Random Information (Open Access Article)
Entropy 2018, 20(3), 211; https://doi.org/10.3390/e20030211
Received: 1 February 2018 / Revised: 18 March 2018 / Accepted: 18 March 2018 / Published: 20 March 2018
Abstract
Rough random theory, generally applied to statistics, decision-making, and related areas, is an extension of rough set theory and probability theory, in which a rough random variable is described as a random variable taking “rough variable” values. To extend and enrich the research area of rough random theory, in this paper the well-known probabilistic inequalities (the Markov, Chebyshev, Hölder, Minkowski, and Jensen inequalities) are proven for rough random variables, giving firm theoretical support to the further development of rough random theory. In addition, since critical values are a vital tool in engineering, science, and other application fields, some significant properties of the critical values of rough random variables, including continuity and monotonicity, are investigated in depth to provide a novel analytical approach for rough random optimization problems.
(This article belongs to the Special Issue Entropy and Information Inequalities)
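The classical prototypes extended in this article read, for a nonnegative random variable X and any $a > 0$,

$$\Pr\{X \ge a\} \le \frac{\mathbb{E}[X]}{a} \quad \text{(Markov)}, \qquad \Pr\{|X - \mathbb{E}X| \ge a\} \le \frac{\mathrm{Var}(X)}{a^2} \quad \text{(Chebyshev)},$$

with the paper establishing analogues in which the underlying variables are rough random.
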
A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications (Open Access Article)
Entropy 2018, 20(3), 185; https://doi.org/10.3390/e20030185
Received: 18 January 2018 / Revised: 6 March 2018 / Accepted: 6 March 2018 / Published: 9 March 2018
Cited by 1
Abstract
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure $d(x, \hat{x}) = |x - \hat{x}|^r$, with $r \ge 1$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log(\sqrt{\pi e}) \approx 1.5$ bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most $\log(\sqrt{\pi e / 2}) \approx 1$ bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log(\sqrt{\pi e / 2}) \approx 1$ bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
(This article belongs to the Special Issue Entropy and Information Inequalities)
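For context, under mean-square error distortion the Shannon lower bound referenced above has the explicit form

$$R(d) \;\ge\; R_{\mathrm{SLB}}(d) = h(X) - \tfrac{1}{2} \log(2\pi e\, d),$$

so the result pins the gap $R(d) - R_{\mathrm{SLB}}(d)$ for log-concave sources to at most $\log(\sqrt{\pi e / 2}) \approx 1$ bit, uniformly in the target distortion d.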

Entropies of Weighted Sums in Cyclic Groups and an Application to Polar Codes (Open Access Article)
Entropy 2017, 19(9), 235; https://doi.org/10.3390/e19090235
Received: 14 February 2017 / Revised: 5 April 2017 / Accepted: 5 April 2017 / Published: 7 September 2017
Abstract
In this note, the following basic question is explored: in a cyclic group, how are the Shannon entropies of the sum and difference of i.i.d. random variables related to each other? For the integer group, we show that they can differ by any real number additively, but not too much multiplicatively; on the other hand, for $\mathbb{Z}/3\mathbb{Z}$, the entropy of the difference is always at least as large as that of the sum. These results are closely related to the study of more-sums-than-differences (i.e., MSTD) sets in additive combinatorics. We also investigate polar codes for q-ary input channels using non-canonical kernels to construct the generator matrix and present applications of our results to constructing polar codes with significantly improved error probability compared to the canonical construction.
(This article belongs to the Special Issue Entropy and Information Inequalities)
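The $\mathbb{Z}/3\mathbb{Z}$ claim is easy to check numerically. The sketch below is an editorial illustration, not code from the paper: it samples random distributions on $\mathbb{Z}/3\mathbb{Z}$ and verifies that the entropy of the difference of two i.i.d. draws is never smaller than the entropy of the sum.

    import numpy as np

    def entropy_bits(p):
        """Shannon entropy (in bits) of a probability vector; zero terms are skipped."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def sum_and_diff_entropies(p, q=3):
        """Entropies of X+Y and X-Y (mod q) for i.i.d. X, Y distributed as p on Z/qZ."""
        p_sum, p_diff = np.zeros(q), np.zeros(q)
        for i in range(q):
            for j in range(q):
                p_sum[(i + j) % q] += p[i] * p[j]
                p_diff[(i - j) % q] += p[i] * p[j]
        return entropy_bits(p_sum), entropy_bits(p_diff)

    rng = np.random.default_rng(0)
    for _ in range(10_000):
        p = rng.dirichlet(np.ones(3))        # random distribution on Z/3Z
        h_sum, h_diff = sum_and_diff_entropies(p)
        assert h_diff >= h_sum - 1e-12       # H(X - Y) >= H(X + Y) on Z/3Z
    print("H(X-Y) >= H(X+Y) held for 10,000 random distributions on Z/3Z")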

Planned Papers

The list below contains only planned manuscripts; some of these have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Tentative title: Lattices with non-Shannon Inequalities
Author: Peter Harremoës
