Open Access Article
Entropy 2017, 19(11), 631; https://doi.org/10.3390/e19110631

On Normalized Mutual Information: Measure Derivations and Properties

1 Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN 55455, USA
2 Department of Industrial & Systems Engineering, University of Minnesota, Minneapolis, MN 55455, USA
Received: 26 October 2017 / Revised: 12 November 2017 / Accepted: 20 November 2017 / Published: 22 November 2017
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)

Abstract

Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
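The paper's specific bound derivations are not reproduced on this page, but the general construction it builds on can be sketched numerically: compute I(X;Y) from a joint probability matrix and normalize by a least upper bound, here taken to be the standard bound min(H(X), H(Y)). The function names and the example joint distribution below are illustrative, not taken from the article.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def nmi_min(pxy):
    """NMI of a joint probability matrix pxy[i, j] = P(X=i, Y=j),
    normalized by the least upper bound min(H(X), H(Y))."""
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx, hy = entropy(px), entropy(py)
    # I(X;Y) = H(X) + H(Y) - H(X,Y), always nonnegative
    i_xy = hx + hy - entropy(pxy.ravel())
    return i_xy / min(hx, hy)

# Example: a symmetric noisy binary channel
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(nmi_min(pxy))  # ≈ 0.2781
```

With this normalizer the measure equals 0 for independent variables and 1 when one variable fully determines the other, which is the range property that NMI measures are designed to guarantee.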
Keywords: mutual information; normalized mutual information; association measures; similarity measures; value validity
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Kvålseth, T.O. On Normalized Mutual Information: Measure Derivations and Properties. Entropy 2017, 19, 631.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.