Open Access Review
Entropy 2013, 15(1), 80-112; doi:10.3390/e15010080

Machine Learning with Squared-Loss Mutual Information

Masashi Sugiyama
Department of Computer Science, Tokyo Institute of Technology, 2-12-1 O-okayama, Meguro-ku, Tokyo 152-8552, Japan
Received: 29 October 2012 / Revised: 7 December 2012 / Accepted: 21 December 2012 / Published: 27 December 2012
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)

Abstract

Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
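As a point of reference for the two divergences named in the abstract, the standard definitions used in this line of work are (notation mine, not quoted from this page): ordinary MI is the Kullback–Leibler divergence from the joint density $p(x,y)$ to the product of marginals $p(x)p(y)$, while SMI is the corresponding Pearson divergence,

$$\mathrm{MI}(X;Y) = \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,\mathrm{d}x\,\mathrm{d}y,
\qquad
\mathrm{SMI}(X;Y) = \frac{1}{2}\iint p(x)\,p(y)\left(\frac{p(x,y)}{p(x)\,p(y)}-1\right)^{2}\mathrm{d}x\,\mathrm{d}y.$$

Both quantities are non-negative and vanish exactly when $X$ and $Y$ are independent, i.e., when the density ratio $r(x,y) = p(x,y)/(p(x)p(y))$ is identically one.

The computational advantage mentioned in the abstract stems from the fact that fitting $r$ by least squares under SMI admits a closed-form solution. Below is a minimal sketch of such a least-squares SMI estimator with Gaussian kernel bases; the function name, the fixed kernel width, and the regularization constant are illustrative choices (the reviewed method selects such hyperparameters by cross-validation), not the paper's exact setup.

import numpy as np

def lsmi(x, y, n_basis=100, sigma=1.0, lam=1e-3, seed=0):
    """Least-squares plug-in estimate of SMI from paired samples x, y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    b = min(n_basis, n)
    centers = rng.choice(n, size=b, replace=False)
    # Gaussian kernel values of every sample against the chosen kernel centers
    Kx = np.exp(-((x[:, None, :] - x[centers][None, :, :]) ** 2).sum(-1) / (2 * sigma**2))  # (n, b)
    Ky = np.exp(-((y[:, None, :] - y[centers][None, :, :]) ** 2).sum(-1) / (2 * sigma**2))  # (n, b)
    # h_l: empirical mean of the l-th basis over the *paired* samples (x_i, y_i)
    h = (Kx * Ky).mean(axis=0)
    # H_lm: empirical mean over all n^2 cross pairs (x_i, y_j); the expectation
    # under p(x)p(y) factorizes, so it reduces to an elementwise product of Grams
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    # Ridge-regularized least-squares fit of the density ratio r = p(x,y)/(p(x)p(y))
    theta = np.linalg.solve(H + lam * np.eye(b), h)
    # SMI = (1/2) E_{p(x,y)}[r(x,y)] - 1/2, evaluated with the fitted ratio
    return 0.5 * h @ theta - 0.5

For example, on 500 paired Gaussian samples, lsmi(x, x + 0.3 * noise) comes out clearly positive, while lsmi(x, independent_noise) stays near zero; this is the behavior that the independence-testing applications in the review exploit.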
Keywords: squared-loss mutual information; Pearson divergence; density-ratio estimation; independence testing; dimensionality reduction; independent component analysis; object matching; clustering; causal inference; machine learning
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Sugiyama, M. Machine Learning with Squared-Loss Mutual Information. Entropy 2013, 15, 80-112.

