Open Access Article
Entropy 2012, 14(6), 1103-1126; doi:10.3390/e14061103

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

C. A. L. Pires * and R. A. P. Perdigão
Instituto Dom Luiz, Faculdade de Ciências, University of Lisbon, DEGGE, Ed. C8, Campo-Grande, 1749-016 Lisbon, Portugal
* Author to whom correspondence should be addressed.
Received: 20 May 2012 / Revised: 8 June 2012 / Accepted: 15 June 2012 / Published: 19 June 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

Abstract

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard bivariate Gaussian marginal distributions, the method allows for the decomposition of MI into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between ‘Gaussianized variables’, and a non-Gaussian MI (Ing), coinciding with the joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the ‘Gaussian manifold’, where moments are those of Gaussian distributions, towards infinity at the set’s boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity are systematized by estimating Ing between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, across a full range of signal-to-noise (snr) variance ratios. We study the effect of varying snr on Ig and Ing under several signal/noise scenarios.
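To make the decomposition concrete: the total MI splits as I(X,Y) = Ig + Ing with Ing >= 0, so the Gaussian term is always a lower bound on the full MI, and for bivariate Gaussian marginals it takes the closed form Ig = -(1/2) ln(1 - rho_g^2), where rho_g is the correlation between the Gaussianized variables. Below is a minimal Python sketch of that estimate; the rank-based Gaussianization, the helper names (gaussianize, gaussian_mi), and the toy quadratic channel are illustrative assumptions, not the authors' procedure.

import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    # Map a sample to standard-normal scores through its empirical ranks,
    # one common way to obtain 'Gaussianized variables'.
    ranks = rankdata(x) / (len(x) + 1.0)   # uniform scores in (0, 1)
    return norm.ppf(ranks)                 # standard-normal quantiles

def gaussian_mi(x, y):
    # Gaussian MI term Ig = -0.5 * ln(1 - rho_g**2), in nats, where rho_g
    # is the correlation between the Gaussianized variables.
    rho_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log(1.0 - rho_g ** 2)

# Toy example (hypothetical): a quadratic channel with additive noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = x + 0.5 * x ** 2 + 0.3 * rng.standard_normal(10_000)
print(f"Ig = {gaussian_mi(x, y):.3f} nats")  # lower bound on I(X,Y)

Dependence that the Gaussianized correlation cannot capture (here, the quadratic component of the channel) ends up in the non-Gaussian term Ing of the decomposition, which the paper characterizes analytically.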
Keywords: mutual information; non-Gaussianity; maximum entropy distributions; non-Gaussian noise
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite This Article

MDPI and ACS Style

Pires, C.A.L.; Perdigão, R.A.P. Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties. Entropy 2012, 14, 1103-1126.
