Empirical Estimation of Information Measures: A Literature Guide

Sergio Verdú, Independent Researcher, Princeton, NJ 08540, USA
Entropy 2019, 21(8), 720; https://doi.org/10.3390/e21080720
Received: 16 May 2019 / Revised: 10 June 2019 / Accepted: 11 June 2019 / Published: 24 July 2019
(This article belongs to the Special Issue Information Measures with Applications)
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
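As a concrete illustration of the kind of estimator such surveys cover, the simplest approach to empirical entropy estimation is the plug-in (maximum-likelihood) estimator: substitute the empirical frequencies for the unknown probabilities in the entropy formula. A minimal sketch, with an illustrative function name not taken from the article:

```python
from collections import Counter
from math import log2

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in bits:
    the entropy of the empirical distribution of the samples."""
    n = len(samples)
    counts = Counter(samples)
    # H = -sum_x p(x) log2 p(x), with p(x) replaced by counts[x] / n
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A sequence with equal empirical frequencies of two symbols has
# empirical entropy exactly 1 bit.
print(plugin_entropy(["H", "T", "H", "T"]))  # 1.0
```

The plug-in estimator is consistent for finite alphabets but negatively biased at small sample sizes, which is one motivation for the bias-corrected and minimax estimators treated in the literature the article surveys.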
Keywords: information measures; empirical estimators; entropy; relative entropy; mutual information; universal estimation
MDPI and ACS Style

Verdú, S. Empirical Estimation of Information Measures: A Literature Guide. Entropy 2019, 21, 720.
