Entropy 2011, 13(1), 134-170; doi:10.3390/e13010134
Article

Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization

1 Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako, 351-0198 Saitama, Japan
2 Systems Research Institute, Intelligent Systems Laboratory, PAS, Newelska 6 str., 01-447 Warsaw, Poland
3 Department of Signal Theory and Communications, University of Seville, Camino de los Descubrimientos s/n, 41092 Seville, Spain
4 Laboratory for Mathematical Neuroscience, RIKEN BSI, Wako, 351-0198 Saitama, Japan
* Authors to whom correspondence should be addressed.
Received: 13 December 2010; in revised form: 4 January 2011 / Accepted: 4 January 2011 / Published: 14 January 2011
Abstract: We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including the Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF. Owing to more degrees of freedom in tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
Keywords: nonnegative matrix factorization (NMF); robust multiplicative NMF algorithms; similarity measures; generalized divergences; Alpha-; Beta-; Gamma- divergences; extended Itakura-Saito like divergences; generalized Kullback-Leibler divergence
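As a concrete illustration of the family described in the abstract, the sketch below computes the AB-divergence in its generic form, commonly stated (for alpha, beta and alpha + beta all nonzero) as D_AB(P||Q) = -1/(alpha*beta) * sum( p^alpha q^beta - alpha/(alpha+beta) p^(alpha+beta) - beta/(alpha+beta) q^(alpha+beta) ). This is a minimal sketch, not the authors' reference implementation; the function name and interface are illustrative, and the special limit cases (alpha or beta equal to 0, or alpha + beta equal to 0) are omitted.

```python
def ab_divergence(P, Q, alpha, beta):
    """AB-divergence D_AB^{(alpha,beta)}(P||Q) for the generic case
    alpha != 0, beta != 0, alpha + beta != 0.

    P and Q are flat sequences of positive numbers (e.g. the entries
    of a data matrix and its factor-model reconstruction in NMF).
    """
    s = alpha + beta
    total = 0.0
    for p, q in zip(P, Q):
        total += (p ** alpha) * (q ** beta) \
                 - (alpha / s) * p ** s \
                 - (beta / s) * q ** s
    return -total / (alpha * beta)

# Sanity check: alpha = beta = 1 reduces the AB-divergence to half the
# squared Euclidean distance, one of the standard special cases.
P = [1.0, 2.0, 3.0]
Q = [1.5, 2.0, 2.5]
d = ab_divergence(P, Q, 1.0, 1.0)
# 0.5 * ((1-1.5)**2 + (2-2)**2 + (3-2.5)**2) = 0.25
```

Tuning (alpha, beta) trades off how strongly large ratios p/q are penalized, which is the mechanism behind the robustness to outliers discussed in the abstract.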

Cite This Article


Cichocki, A.; Cruces, S.; Amari, S.-I. Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization. Entropy 2011, 13, 134-170.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.