Open Access Article
Entropy 2013, 15(12), 5154-5177; doi:10.3390/e15125154

Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage

Department of Mathematics, University of Torino, Via Carlo Alberto 10, Torino 10123, Italy
* Authors to whom correspondence should be addressed.
Received: 25 September 2013 / Revised: 30 October 2013 / Accepted: 11 November 2013 / Published: 26 November 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)

Abstract

A new non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First, an equation is deduced that links the mutual information to the entropy of a suitable random vector with uniformly distributed components. When d = 2, this equation reduces to the well-known connection between the mutual information and the entropy of the copula function associated with the original random variables. Hence, the problem of estimating the mutual information of the original random vector reduces to estimating the entropy of a random vector obtained through a multidimensional transformation. The proposed estimator is a two-step method: first, the transformation is estimated and applied to the sample; then, the entropy of the transformed sample is estimated. The properties of the new estimator are discussed through simulation examples, and its performance is compared to that of the best estimators in the literature. The precision of the new estimator converges to values of the same order of magnitude as those of the best estimator tested. However, the new estimator remains unbiased even for larger dimensions and smaller sample sizes, whereas the other tested estimators show a bias in these cases.
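
To make the two-step procedure concrete, here is a minimal sketch in Python for the bivariate copula case. It is not the authors' code, and all function names are illustrative: step one replaces each coordinate with its empirical CDF value (a rank transform), producing a sample with approximately uniform margins; step two estimates the entropy of that sample with a binless Kozachenko-Leonenko k-nearest-neighbour estimator. For d = 2 the mutual information equals minus the copula entropy; the paper's linkage construction generalizes the transformation to d > 2 via conditional distribution functions, which this sketch does not implement.

import numpy as np
from scipy.stats import rankdata
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def copula_transform(x):
    # Step 1: map each coordinate to (0, 1) through its empirical CDF
    # (rank transform), so the margins are approximately uniform.
    n, d = x.shape
    return np.column_stack([rankdata(x[:, j]) / (n + 1.0) for j in range(d)])

def kl_entropy(z, k=3):
    # Step 2: Kozachenko-Leonenko k-nearest-neighbour entropy estimate, in nats.
    n, d = z.shape
    # Distance to the k-th neighbour; the first hit of the query is the point itself.
    eps = cKDTree(z).query(z, k=k + 1)[0][:, k]
    # Log volume of the unit d-ball, consistent with Euclidean distances.
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

def mutual_information_2d(x, k=3):
    # For d = 2: I(X, Y) = -h(copula), so MI is minus the copula entropy.
    return -kl_entropy(copula_transform(x), k=k)

# Toy check on a correlated Gaussian pair, where the true value is
# -0.5 * log(1 - rho**2), approximately 0.511 nats for rho = 0.8.
rng = np.random.default_rng(0)
rho = 0.8
sample = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=2000)
print(mutual_information_2d(sample))
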
Keywords: information measures; mutual information; entropy; copula function; linkage function; kernel method; binless estimator
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite This Article

MDPI and ACS Style

Giraudo, M.T.; Sacerdote, L.; Sirovich, R. Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage. Entropy 2013, 15, 5154-5177.
