Entropy 2013, 15(3), 721-752; doi:10.3390/e15030721
Article

Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples

Carlos A. L. Pires and Rui A. P. Perdigão
1 Instituto Dom Luiz (IDL), University of Lisbon (UL), Lisbon, P-1749-016, Portugal
2 Institute of Hydraulic Engineering and Water Resources Management, Vienna University of Technology, Vienna, A-1040, Austria
* Author to whom correspondence should be addressed.
Received: 8 November 2012; in revised form: 15 February 2013 / Accepted: 19 February 2013 / Published: 25 February 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
Abstract: The Minimum Mutual Information (MinMI) Principle provides the least-committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets Tcr comprising mcr linear and/or nonlinear joint expectations, computed from samples of N iid outcomes. The marginals (and hence their entropies) are fixed by applying single-variable morphisms to the original random variables. N-asymptotic formulas are given for the distribution of the cross-expectation estimation errors and for the bias, variance and distribution of the MinMI estimator. A growing Tcr leads to an increasing MinMI that eventually converges to the total MI. For N-sized samples, the MinMI increment between two nested constraint sets Tcr1 ⊆ Tcr2 (with numbers of constraints mcr1 < mcr2) is the test difference δH = Hmax,1,N − Hmax,2,N ≥ 0 between the two corresponding estimated MEs. Asymptotically, δH follows a scaled Chi-Squared distribution, (1/(2N)) χ²(mcr2 − mcr1), whose upper quantiles determine whether the constraints in Tcr2 \ Tcr1 explain significant extra MI. As an example, we set the marginals to be normally distributed (Gaussian) and build a sequence of MI bounds associated with successive nonlinear correlations due to joint non-Gaussianity. Since available sample sizes can be rather small in real-world situations, the relationship between MinMI bias, probability-density over-fitting and outliers is demonstrated for under-sampled data.
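To make the significance test described above concrete, the following minimal Python sketch (not from the paper's code; the function name, sample size and constraint counts are illustrative assumptions) computes the upper-quantile threshold implied by the asymptotic result δH ~ (1/(2N)) χ²(mcr2 − mcr1) and compares it against a hypothetical estimated MinMI increment.

```python
# Sketch of the asymptotic significance test for the MinMI increment.
# Under the null, the increment deltaH between two nested constraint sets
# (m_cr1 < m_cr2 constraints), estimated from N iid samples, is distributed
# as (1/(2N)) * chi2(m_cr2 - m_cr1).
from scipy.stats import chi2

def minmi_increment_threshold(n_samples, m_cr1, m_cr2, alpha=0.05):
    """Critical value for deltaH = Hmax_1,N - Hmax_2,N at significance level alpha."""
    dof = m_cr2 - m_cr1                      # extra constraints in Tcr2 \ Tcr1
    return chi2.ppf(1.0 - alpha, dof) / (2.0 * n_samples)

# Hypothetical example: N = 500 samples, 2 vs. 5 cross constraints.
threshold = minmi_increment_threshold(500, 2, 5)
delta_h = 0.012                              # hypothetical estimated MinMI increment (nats)
print(f"threshold = {threshold:.4f} nats; significant extra MI: {delta_h > threshold}")
```

If the estimated increment exceeds the threshold, the additional constraints in Tcr2 \ Tcr1 are deemed to explain statistically significant extra mutual information at the chosen level.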
Keywords: mutual information; non-Gaussianity; maximum entropy distributions; entropy bias; mutual information distribution; morphism


