Entropy 2012, 14(6), 1103-1126; doi:10.3390/e14061103
Article

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

Carlos A. L. Pires * and Rui A. P. Perdigão
Received: 20 May 2012; in revised form: 8 June 2012 / Accepted: 15 June 2012 / Published: 19 June 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
Abstract: The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaussian marginal distributions, this hierarchy allows for the decomposition of the MI into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between 'Gaussianized variables', and a non-Gaussian MI (Ing), coinciding with joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the 'Gaussian manifold', where moments are those of Gaussian distributions, towards infinity at the set's boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, for a full range of signal-to-noise ratio (snr) variances. We have studied the effect of varying snr on Ig and Ing under several signal/noise scenarios.
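The Gaussian MI term described in the abstract can be estimated from the correlation between 'Gaussianized variables' via the bivariate Gaussian identity Ig = -½ ln(1 - ρg²). The following is a minimal illustrative sketch, not the authors' code: the function names and the rank-based Gaussianization step are assumptions chosen for simplicity.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    # Rank-based inverse normal transform: map the empirical CDF
    # of x through the standard Gaussian quantile function.
    u = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(u)

def gaussian_mi(x, y):
    # Gaussian MI: Ig = -1/2 ln(1 - rho_g^2), where rho_g is the
    # correlation between the Gaussianized variables.
    gx, gy = gaussianize(x), gaussianize(y)
    rho = np.corrcoef(gx, gy)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

rng = np.random.default_rng(0)
x = rng.normal(size=10000)
y = x + 0.5 * rng.normal(size=10000)  # correlated pair, rho^2 ~ 0.8
print(gaussian_mi(x, y))
```

Because the rank transform makes each marginal standard Gaussian, Ig captures only the (monotone) Gaussian part of the dependence; the non-Gaussian remainder Ing would require the joint negentropy, which the paper estimates through the ME method.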
Keywords: mutual information; non-Gaussianity; maximum entropy distributions; non-Gaussian noise
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Pires, C.A.L.; Perdigão, R.A.P. Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties. Entropy 2012, 14, 1103-1126.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.