Open Access Article
Entropy 2016, 18(4), 109; doi:10.3390/e18040109

An Estimator of Mutual Information and its Application to Independence Testing

Joe Suzuki
Department of Mathematics, Graduate School of Science, Osaka University, Toyonaka, Osaka 560-0043, Japan
Academic Editor: Raúl Alcaraz Martínez
Received: 22 February 2016 / Revised: 14 March 2016 / Accepted: 23 March 2016 / Published: 29 March 2016
(This article belongs to the Section Information Theory)

Abstract

This paper proposes a novel estimator of mutual information for discrete and continuous variables. The main feature of this estimator is that, for a large sample size n, it is zero if and only if the two variables are independent. The estimator is computed by constructing several histograms, estimating the mutual information from each, and choosing the maximum value. We prove that the number of histograms constructed is upper-bounded by O(log n) and apply this fact to the search. We compare the performance of the proposed estimator with that of an estimator based on the Hilbert-Schmidt independence criterion (HSIC), although the proposed method rests on the minimum description length (MDL) principle whereas the HSIC provides a statistical test. The proposed method completes the estimation in O(n log n) time, whereas the HSIC kernel computation requires O(n³) time. We also present examples in which the HSIC fails to detect independence but the proposed method successfully detects it.
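The abstract describes the procedure only at a high level: build a small family of histograms, score each by a penalized mutual information estimate, and return the best score. Below is a minimal Python sketch of that idea under stated assumptions: the geometric schedule of bin counts (one way to obtain the O(log n) histograms mentioned above) and the MDL-style penalty term are illustrative guesses, and mi_histogram_sketch is a hypothetical name, not the paper's actual estimator.

```python
import numpy as np

def mi_histogram_sketch(x, y):
    """Histogram-search MI estimate: max over O(log n) penalized grids."""
    n = len(x)
    best = 0.0  # floor at zero: a zero value indicates independence
    k = 2
    while k * k <= n:  # geometric schedule 2, 4, 8, ... -> O(log n) grids
        counts, _, _ = np.histogram2d(x, y, bins=k)
        p_xy = counts / n                      # joint cell probabilities
        p_x = p_xy.sum(axis=1, keepdims=True)  # x marginal, shape (k, 1)
        p_y = p_xy.sum(axis=0, keepdims=True)  # y marginal, shape (1, k)
        nz = p_xy > 0                          # skip empty cells (log 0)
        mi = np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))
        # Assumed MDL-style complexity penalty; the paper's exact form may differ.
        penalty = (k - 1) ** 2 * np.log(n) / (2.0 * n)
        best = max(best, mi - penalty)
        k *= 2
    return best

# Toy usage: near zero for independent samples, clearly positive otherwise.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
print(mi_histogram_sketch(x, rng.normal(size=2000)))            # ~ 0
print(mi_histogram_sketch(x, x + 0.2 * rng.normal(size=2000)))  # > 0
```

Each pass over a grid costs O(n), so the geometric schedule keeps the total work at O(n log n), consistent with the running time quoted in the abstract.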
Keywords: mutual information; kernel; independence testing; Hilbert-Schmidt independence criterion (HSIC); minimum description length (MDL) principle; histogram

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Suzuki, J. An Estimator of Mutual Information and its Application to Independence Testing. Entropy 2016, 18, 109.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
