
On Normalized Mutual Information: Measure Derivations and Properties

Tarald O. Kvålseth 1,2
1 Department of Mechanical Engineering, University of Minnesota, Minneapolis, MN 55455, USA
2 Department of Industrial & Systems Engineering, University of Minnesota, Minneapolis, MN 55455, USA
Entropy 2017, 19(11), 631; https://doi.org/10.3390/e19110631
Received: 26 October 2017 / Revised: 12 November 2017 / Accepted: 20 November 2017 / Published: 22 November 2017
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
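As a minimal illustration of the quantities discussed in the abstract, the sketch below computes the mutual information of two discrete random variables from their joint probability table and normalizes it by one common upper bound, min(H(X), H(Y)). This generic normalization is assumed here purely for illustration; it is not necessarily one of the least upper bounds or weighted measures derived in the paper.

```python
import numpy as np

def mutual_information(pxy):
    """Mutual information I(X;Y) in bits from a joint probability table pxy."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    mask = pxy > 0                        # skip zero cells to avoid log(0)
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

def entropy(p):
    """Shannon entropy in bits of a marginal distribution p."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def normalized_mi(pxy):
    """MI divided by an upper bound; min(H(X), H(Y)) is assumed here."""
    hx = entropy(pxy.sum(axis=1))
    hy = entropy(pxy.sum(axis=0))
    return mutual_information(pxy) / min(hx, hy)

# Example: a 2x2 joint distribution with mild dependence between X and Y
pxy = np.array([[0.40, 0.10],
                [0.10, 0.40]])
print(round(normalized_mi(pxy), 4))
```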
Keywords: mutual information; normalized mutual information; association measures; similarity measures; value validity

Kvålseth, T.O. On Normalized Mutual Information: Measure Derivations and Properties. Entropy 2017, 19, 631.
