Article

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

Carlos A. L. Pires * and Rui A. P. Perdigão
Instituto Dom Luiz, Faculdade de Ciências, University of Lisbon, DEGGE, Ed. C8, Campo Grande, 1749-016 Lisbon, Portugal
* Author to whom correspondence should be addressed.
Entropy 2012, 14(6), 1103-1126; https://doi.org/10.3390/e14061103
Received: 20 May 2012 / Revised: 8 June 2012 / Accepted: 15 June 2012 / Published: 19 June 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y that is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard Gaussian marginal distributions, the MI decomposes into two positive terms: the Gaussian MI (Ig), which depends on the Gaussian correlation, i.e., the correlation between ‘Gaussianized’ variables, and the non-Gaussian MI (Ing), which coincides with the joint negentropy and depends on nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities; within it, Ing grows from zero on the ‘Gaussian manifold’, where the moments are those of Gaussian distributions, towards infinity at the set’s boundary, where a deterministic relationship between the variables holds. Sources of joint non-Gaussianity are systematized by estimating Ing between the input and output of a synthetic nonlinear channel contaminated by multiplicative and non-Gaussian additive noises, over the full range of signal-to-noise ratios (snr). The effect of varying snr on Ig and Ing is studied under several signal/noise scenarios.
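
As a concrete illustration of the decomposition described in the abstract: for a bivariate Gaussian with correlation ρg, the MI is Ig = −(1/2) ln(1 − ρg²), so the Gaussian part of the MI of any pair (X,Y) can be estimated from the correlation between the ‘Gaussianized’ variables. The minimal Python sketch below is illustrative only (the helper names and the rank-based Gaussianization are assumptions, not the authors’ estimator; the non-Gaussian term Ing = I − Ig would additionally require a joint entropy estimate) and shows why a nonlinear channel carries most of its MI in Ing:

    import numpy as np
    from scipy.stats import norm

    def gaussianize(x):
        # Map a sample to standard Gaussian scores via its empirical ranks
        # (an illustrative 'Gaussianization'; assumption, not the paper's code).
        ranks = np.argsort(np.argsort(x)) + 1        # ranks in 1..n
        return norm.ppf(ranks / (len(x) + 1.0))      # inverse normal CDF of rank quantiles

    def gaussian_mi(x, y):
        # Gaussian part of the MI: Ig = -0.5 * ln(1 - rho_g**2),
        # with rho_g the correlation between the Gaussianized variables.
        gx, gy = gaussianize(x), gaussianize(y)
        rho_g = np.corrcoef(gx, gy)[0, 1]
        return -0.5 * np.log(1.0 - rho_g**2)

    # A quadratic channel y = x**2 + noise is nearly uncorrelated with x,
    # so Ig is close to zero and the dependence lives in Ing = I - Ig.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000)
    y = x**2 + 0.5 * rng.standard_normal(10_000)
    print("Ig (nats):", gaussian_mi(x, y))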
Keywords: mutual information; non-Gaussianity; maximum entropy distributions; non-Gaussian noise