Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities†
Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai 200050, China
School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China
University of Chinese Academy of Sciences, Beijing 100049, China
Department of Electrical Engineering and Computer Sciences (EECS), University of California, Berkeley, CA 94720-1234, USA
Shanghai Institute of Fog Computing Technology, ShanghaiTech University, Shanghai 201210, China
This paper is an extended version of our paper submitted to the 2018 IEEE International Symposium on Information Theory (ISIT), Vail, CO, USA, 17–22 June 2018.
Author to whom correspondence should be addressed.
Received: 23 January 2018 / Revised: 22 February 2018 / Accepted: 5 March 2018 / Published: 9 March 2018
Let Z be a standard Gaussian random variable, X be independent of Z, and t be a strictly positive scalar. For the derivatives in t of the differential entropy of X + √t Z, McKean noticed that Gaussian X achieves the extreme for the first and second derivatives, among distributions with a fixed variance, and he conjectured that this holds for general orders of derivatives. This conjecture implies that the signs of the derivatives alternate. Recently, Cheng and Geng proved that this alternation holds for the first four orders. In this work, we employ the technique of linear matrix inequalities to show that: firstly, Cheng and Geng's method may not generalize to higher orders; secondly, when the probability density function of X is log-concave, McKean's conjecture holds for orders up to at least five. As a corollary, we also recover Toscani's result on the sign of the third derivative of the entropy power of X + √t Z, using a much simpler argument.
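For concreteness, the alternating-sign statement in the abstract can be written out as follows. This formulation uses de Bruijn's identity for the first derivative and standard notation (h for differential entropy, J for Fisher information); it is a paraphrase of the well-known setup, not quoted from the paper itself:

```latex
% Let Y_t = X + \sqrt{t}\,Z, with Z standard Gaussian and independent of X.
% De Bruijn's identity gives the first derivative as Fisher information:
\frac{\partial}{\partial t}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right) \;\ge\; 0.
% The conjectured sign alternation for higher-order derivatives:
(-1)^{n+1}\, \frac{\partial^{n}}{\partial t^{n}}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;\ge\; 0, \qquad n = 1, 2, \ldots
```

Cheng and Geng's result establishes this inequality for n ≤ 4; the present paper concerns its extension to higher n.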
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article
MDPI and ACS Style
Zhang, X.; Anantharam, V.; Geng, Y. Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities. Entropy 2018, 20, 182.