Open Access Article
Entropy 2018, 20(3), 185; https://doi.org/10.3390/e20030185

A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

Arnaud Marsiglietti 1,* and Victoria Kostina 2
1 Center for the Mathematics of Information, California Institute of Technology, Pasadena, CA 91125, USA
2 Department of Electrical Engineering, California Institute of Technology, Pasadena, CA 91125, USA
* Author to whom correspondence should be addressed.
Received: 18 January 2018 / Revised: 6 March 2018 / Accepted: 6 March 2018 / Published: 9 March 2018
(This article belongs to the Special Issue Entropy and Information Inequalities)

Abstract

We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure d(x, x̂) = |x − x̂|^r, with r ≥ 1, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log √(πe) ≈ 1.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log √(πe/2) ≈ 1 bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most log √(πe/2) ≈ 1 bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
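To make the constants quoted above concrete, the following minimal Python sketch (illustrative only, not from the paper) evaluates the two gaps, log √(πe) and log √(πe/2), in bits, i.e., with logarithms taken to base 2:

```python
import math

# Gap between the rate-distortion function and the Shannon lower bound
# for distortion |x - x_hat|^r, r >= 1: log sqrt(pi * e) bits.
gap_general = 0.5 * math.log2(math.pi * math.e)

# Gap for mean-square error distortion, and also the gap between the
# log-concave-noise channel capacity and the Gaussian capacity with the
# same noise power: log sqrt(pi * e / 2) bits.
gap_mse = 0.5 * math.log2(math.pi * math.e / 2)

print(f"log sqrt(pi e)   = {gap_general:.3f} bits")  # ~1.547
print(f"log sqrt(pi e/2) = {gap_mse:.3f} bits")      # ~1.047
```

Both values round to the figures stated in the abstract: roughly 1.5 bits for general r-th power distortion and roughly 1 bit for mean-square error.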
Keywords: differential entropy; reverse entropy power inequality; rate-distortion function; Shannon lower bound; channel capacity; log-concave distribution; hyperplane conjecture
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Marsiglietti, A.; Kostina, V. A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications. Entropy 2018, 20, 185.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
