Open Access Article
Entropy 2011, 13(1), 134-170; https://doi.org/10.3390/e13010134

Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization

1 Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako, 351-0198 Saitama, Japan
2 Systems Research Institute, Intelligent Systems Laboratory, PAS, Newelska 6 str., 01-447 Warsaw, Poland
3 Dpto de Teoría de la Señal y Comunicaciones, University of Seville, Camino de los Descubrimientos s/n, 41092-Seville, Spain
4 Laboratory for Mathematical Neuroscience, RIKEN BSI, Wako, 351-0198 Saitama, Japan
* Authors to whom correspondence should be addressed.
Received: 13 December 2010 / Revised: 4 January 2011 / Accepted: 4 January 2011 / Published: 14 January 2011

Abstract

We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences, referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta-, and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF. Owing to the additional degrees of freedom afforded by the tuning parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
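To make the multiplicative-update idea concrete, here is a minimal sketch of the classical beta-divergence NMF updates that the abstract cites as special cases of the AB family: beta = 2 recovers the Lee-Seung Euclidean/ISRA update and beta = 1 the KL (EMML) update. This is an illustrative implementation of the well-known Beta-NMF recursion only, not of the paper's two-parameter AB-NMF algorithm; all function and variable names are ours.

```python
import numpy as np

def nmf_beta_multiplicative(V, rank, beta=2.0, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative NMF updates for the beta-divergence, V ~= W @ H.

    beta=2: Lee-Seung Euclidean / ISRA update
    beta=1: generalized Kullback-Leibler / EMML update
    beta=0: Itakura-Saito update
    (Sketch of the standard Beta-NMF recursion; the paper's AB-NMF
    generalizes this with a second tuning parameter, alpha.)
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random strictly positive initialization keeps the factors nonnegative,
    # since multiplicative updates preserve the sign of the entries.
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H

# Usage: rank-2 factorization of a small nonnegative matrix with the KL update
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_beta_multiplicative(V, rank=2, beta=1.0)
```

Because each update multiplies the current factor by a nonnegative ratio, nonnegativity of W and H is maintained automatically, without explicit projection.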
Keywords: nonnegative matrix factorization (NMF); robust multiplicative NMF algorithms; similarity measures; generalized divergences; Alpha-, Beta-, Gamma-divergences; extended Itakura-Saito-like divergences; generalized Kullback-Leibler divergence
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).
MDPI and ACS Style

Cichocki, A.; Cruces, S.; Amari, S.-I. Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization. Entropy 2011, 13, 134-170.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland