Open Access Review

Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences

1 Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako, 351-0198 Saitama, Japan
2 Systems Research Institute, Intelligent Systems Laboratory, Newelska 6, 01-447 Warsaw, Poland
3 Dpto. de Teoría de la Señal y Comunicaciones, University of Seville, Camino de los Descubrimientos s/n, 41092 Seville, Spain
4 Laboratory for Mathematical Neuroscience, RIKEN BSI, Wako, 351-0198 Saitama, Japan
* Authors to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Entropy 2015, 17(5), 2988-3034; https://doi.org/10.3390/e17052988
Received: 19 December 2014 / Revised: 18 March 2015 / Accepted: 5 May 2015 / Published: 8 May 2015
(This article belongs to the Section Information Theory, Probability and Statistics)
Abstract: This work reviews and extends a family of log-determinant (log-det) divergences for symmetric positive definite (SPD) matrices and discusses their fundamental properties. We show how to use parameterized Alpha-Beta (AB) and Gamma log-det divergences to generate many well-known divergences; in particular, we consider Stein's loss, the S-divergence (also called the Jensen-Bregman LogDet (JBLD) divergence), the LogDet Zero (Bhattacharyya) divergence, the Affine Invariant Riemannian Metric (AIRM), and other divergences. Moreover, we establish links and correspondences between log-det divergences and visualize them on an alpha-beta plane for various sets of parameters. We use this unifying framework to interpret and extend existing similarity measures for semidefinite covariance matrices in finite-dimensional Reproducing Kernel Hilbert Spaces (RKHS). This paper also shows how the Alpha-Beta family of log-det divergences relates to the divergences of multivariate and multilinear normal distributions. Closed-form formulas are derived for Gamma divergences of two multivariate Gaussian densities; the special cases of the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwarz divergences are discussed. Symmetrized versions of log-det divergences are also considered and briefly reviewed. Finally, a class of divergences is extended to multiway divergences for separable covariance (or precision) matrices.
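For readers who want to experiment with the measures named in the abstract, the following is a minimal NumPy/SciPy sketch (illustrative only; it is not code from the paper, and the function names and random test matrices are our own choices). It computes Stein's loss, the S-divergence (JBLD), the AIRM geodesic distance, and one common eigenvalue-based form of the AB log-det divergence, D_AB^(alpha,beta)(P||Q) = (1/(alpha*beta)) * sum_i log[(alpha*lambda_i^beta + beta*lambda_i^(-alpha)) / (alpha + beta)], where the lambda_i are the eigenvalues of P Q^{-1} and alpha, beta, alpha+beta are all nonzero.

import numpy as np
from scipy.linalg import eigh

def steins_loss(P, Q):
    # Stein's (Burg matrix) loss: tr(P Q^{-1}) - log det(P Q^{-1}) - n.
    n = P.shape[0]
    M = np.linalg.solve(Q, P)  # Q^{-1} P; same trace and determinant as P Q^{-1}
    _, logdet = np.linalg.slogdet(M)
    return np.trace(M) - logdet - n

def s_divergence(P, Q):
    # S-divergence / Jensen-Bregman LogDet: log det((P+Q)/2) - (1/2) log det(P Q).
    _, ld_mid = np.linalg.slogdet((P + Q) / 2)
    _, ld_p = np.linalg.slogdet(P)
    _, ld_q = np.linalg.slogdet(Q)
    return ld_mid - 0.5 * (ld_p + ld_q)

def airm(P, Q):
    # Affine Invariant Riemannian Metric: ||log(Q^{-1/2} P Q^{-1/2})||_F,
    # computed from the generalized eigenvalues of the pencil (P, Q).
    lam = eigh(P, Q, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

def ab_logdet(P, Q, alpha, beta):
    # Eigenvalue form of the AB log-det divergence (alpha, beta, alpha+beta nonzero).
    lam = eigh(P, Q, eigvals_only=True)
    return np.sum(np.log((alpha * lam**beta + beta * lam**(-alpha))
                         / (alpha + beta))) / (alpha * beta)

# Quick check on two random SPD matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); P = A @ A.T + 4 * np.eye(4)
B = rng.standard_normal((4, 4)); Q = B @ B.T + 4 * np.eye(4)
print(steins_loss(P, Q), s_divergence(P, Q), airm(P, Q), ab_logdet(P, Q, 0.5, 0.5))

All four quantities are zero when P = Q, and working through the generalized eigendecomposition of the pencil (P, Q) avoids forming explicit matrix square roots or inverses.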
Keywords: Similarity measures; generalized divergences for symmetric positive definite (covariance) matrices; Stein's loss; Burg's matrix divergence; Affine Invariant Riemannian Metric (AIRM); Riemannian metric; geodesic distance; Jensen-Bregman LogDet (JBLD); S-divergence; LogDet Zero divergence; Jeffrey's KL divergence; symmetrized KL Divergence Metric (KLDM); Alpha-Beta Log-Det divergences; Gamma divergences; Hilbert projective metric and their extensions
MDPI and ACS Style

Cichocki, A.; Cruces, S.; Amari, S.-I. Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences. Entropy 2015, 17, 2988-3034.
