Open Access: This article is freely available.
Machine Learning with Squared-Loss Mutual Information
Masashi Sugiyama, Department of Computer Science, Tokyo Institute of Technology, 2-12-1 O-okayama, Meguro-ku, Tokyo 152-8552, Japan
Received: 29 October 2012 / Revised: 7 December 2012 / Accepted: 21 December 2012 / Published: 27 December 2012
Abstract: Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to a variety of machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Since both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
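For reference, the two quantities contrasted in the abstract can be written down explicitly. Below, p(x, y) is the joint density and p_x, p_y are the marginals; this is the standard notation for these divergences, not text taken from the paper itself:

```latex
\mathrm{MI}(X;Y) = \iint p(x,y)\,
  \log\frac{p(x,y)}{p_x(x)\,p_y(y)}\,\mathrm{d}x\,\mathrm{d}y,
\qquad
\mathrm{SMI}(X;Y) = \frac{1}{2}\iint p_x(x)\,p_y(y)
  \left(\frac{p(x,y)}{p_x(x)\,p_y(y)} - 1\right)^{2}\mathrm{d}x\,\mathrm{d}y.
```

Both quantities are non-negative and vanish exactly when p(x, y) = p_x(x) p_y(y), i.e., when X and Y are statistically independent, which is what makes them usable as independence measures.

The computational advantage mentioned in the abstract comes from estimating the density ratio r(x, y) = p(x, y) / (p_x(x) p_y(y)) directly by least squares, which yields the SMI estimate in closed form (the least-squares mutual information, LSMI, approach). The following is a minimal sketch of that idea, assuming a Gaussian product-kernel model; the function name lsmi and the fixed sigma, lam, and n_basis values are illustrative choices made here, whereas the paper selects such hyper-parameters by cross-validation:

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=1e-3, n_basis=100, seed=0):
    """Least-squares SMI estimate (a sketch of the LSMI idea).

    Models the density ratio r(x, y) = p(x, y) / (p_x(x) p_y(y)) as a
    linear combination of Gaussian product kernels and minimizes the
    squared-error criterion analytically. sigma, lam, and n_basis are
    untuned placeholder values; in practice they are cross-validated.
    """
    n = x.shape[0]
    idx = np.random.default_rng(seed).choice(n, min(n_basis, n), replace=False)

    def kernel(a):
        # Gaussian kernel matrix between all samples and the chosen centers.
        d2 = ((a[:, None, :] - a[idx][None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    kx, ky = kernel(x), kernel(y)                 # each of shape (n, b)

    # h_l: mean of the l-th basis over paired samples, i.e., the
    # empirical expectation under the joint density p(x, y).
    h = (kx * ky).mean(axis=0)

    # H_{l,l'}: empirical expectation under p(x) p(y) using all n^2
    # cross-pairs; it factorizes into an elementwise product of Grams.
    H = (kx.T @ kx) * (ky.T @ ky) / n ** 2

    # Regularized analytic solution of the least-squares problem.
    alpha = np.linalg.solve(H + lam * np.eye(h.size), h)

    # SMI = (1/2) E_{p(x,y)}[r(X, Y)] - 1/2, with the estimated ratio.
    return 0.5 * h @ alpha - 0.5
```

On paired Gaussian samples, for example, the estimate is clearly positive, while on independent samples it is close to zero:

```python
rng = np.random.default_rng(1)
x = rng.normal(size=(500, 1))
print(lsmi(x, x + 0.5 * rng.normal(size=(500, 1))))  # dependent: noticeably > 0
print(lsmi(x, rng.normal(size=(500, 1))))            # independent: approx. 0
```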
Keywords: squared-loss mutual information; Pearson divergence; density-ratio estimation; independence testing; dimensionality reduction; independent component analysis; object matching; clustering; causal inference; machine learning
Cite This Article
MDPI and ACS Style
Sugiyama, M. Machine Learning with Squared-Loss Mutual Information. Entropy 2013, 15, 80-112.
AMA Style
Sugiyama M. Machine Learning with Squared-Loss Mutual Information. Entropy. 2013; 15(1):80-112.
Chicago/Turabian Style
Sugiyama, Masashi. 2013. "Machine Learning with Squared-Loss Mutual Information." Entropy 15, no. 1: 80-112.