Open Access Article
Entropy 2018, 20(3), 182; https://doi.org/10.3390/e20030182

Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities

1. Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai 200050, China
2. School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China
3. University of Chinese Academy of Sciences, Beijing 100049, China
4. Department of Electrical Engineering and Computer Sciences (EECS), University of California, Berkeley, CA 94720-1234, USA
5. Shanghai Institute of Fog Computing Technology, ShanghaiTech University, Shanghai 201210, China
This paper is an extended version of our paper submitted to the 2018 IEEE International Symposium on Information Theory (ISIT), Vail, CO, USA, 17–22 June 2018.
* Author to whom correspondence should be addressed.
Received: 23 January 2018 / Revised: 22 February 2018 / Accepted: 5 March 2018 / Published: 9 March 2018
(This article belongs to the Section Information Theory)

Abstract

Let Z be a standard Gaussian random variable, X be independent of Z, and t be a strictly positive scalar. For the derivatives in t of the differential entropy of X + √t Z, McKean noticed that Gaussian X achieves the extreme for the first and second derivatives, among distributions with a fixed variance, and he conjectured that this holds for general orders of derivatives. This conjecture implies that the signs of the derivatives alternate. Recently, Cheng and Geng proved that this alternation holds for the first four orders. In this work, we employ the technique of linear matrix inequalities to show that: firstly, Cheng and Geng's method may not generalize to higher orders; secondly, when the probability density function of X + √t Z is log-concave, McKean's conjecture holds for orders up to at least five. As a corollary, we also recover Toscani's result on the sign of the third derivative of the entropy power of X + √t Z, using a much simpler argument.
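As an illustrative sanity check (not from the paper), the sign alternation conjectured by McKean can be observed directly in the extremal Gaussian case: when X ~ N(0, σ²), the sum X + √t Z is Gaussian with variance σ² + t, so its differential entropy has the closed form ½ ln(2πe(σ² + t)), and its derivatives in t alternate in sign. The following minimal Python sketch estimates the first five derivatives by nested central finite differences; the helper names and the step size are my own choices.

```python
import math

def h_gaussian(var: float, t: float) -> float:
    # Differential entropy of X + sqrt(t)*Z when X ~ N(0, var):
    # the sum is Gaussian with variance var + t.
    return 0.5 * math.log(2 * math.pi * math.e * (var + t))

def nth_deriv(f, t: float, n: int, eps: float = 1e-2) -> float:
    # Nested central finite-difference estimate of the n-th derivative in t.
    if n == 0:
        return f(t)
    return (nth_deriv(f, t + eps, n - 1, eps)
            - nth_deriv(f, t - eps, n - 1, eps)) / (2 * eps)

f = lambda t: h_gaussian(1.0, t)
signs = [1 if nth_deriv(f, 1.0, n) > 0 else -1 for n in range(1, 6)]
print(signs)  # alternating signs: [1, -1, 1, -1, 1]
```

This only verifies the Gaussian case, which is trivial; the content of the paper is establishing Gaussian optimality (and the alternation under log-concavity) for general X.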
Keywords: differential entropy; entropy power; log-concavity; linear matrix inequality; Gaussian optimality
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Zhang, X.; Anantharam, V.; Geng, Y. Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities. Entropy 2018, 20, 182.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.