Open Access Article
Entropy 2017, 19(3), 122; doi:10.3390/e19030122

On Hölder Projective Divergences

Frank Nielsen 1,2,*, Ke Sun 3 and Stéphane Marchand-Maillet 4
1 Computer Science Department LIX, École Polytechnique, 91128 Palaiseau Cedex, France
2 Sony Computer Science Laboratories Inc., Tokyo 141-0022, Japan
3 Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
4 Computer Vision and Multimedia Laboratory (Viper), University of Geneva, CH-1211 Geneva, Switzerland
* Author to whom correspondence should be addressed.
Received: 20 January 2017 / Revised: 8 March 2017 / Accepted: 10 March 2017 / Published: 16 March 2017
(This article belongs to the Special Issue Information Geometry II)

Abstract

We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the ordinary and reverse Hölder inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the Cauchy–Schwarz divergence as a special case. We report closed-form formulas for those statistical dissimilarities when the distributions belong to the same exponential family, provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions, which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy–Schwarz divergence.
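The abstract's core construction, measuring the tightness of an inequality, can be illustrated numerically. The sketch below (not the authors' implementation; all function names are illustrative) uses Hölder's inequality ∫pq ≤ ||p||_α ||q||_β for conjugate exponents 1/α + 1/β = 1 to define D(p:q) = -log(∫pq / (||p||_α ||q||_β)), which is nonnegative because the log argument is at most 1; taking α = β = 2 recovers the Cauchy–Schwarz divergence, and rescaling either density cancels out, illustrating the projective (scale-invariant) property.

```python
# Hedged numerical sketch of a Hoelder-type divergence built from the
# tightness of Hoelder's inequality:  integral(p*q) <= ||p||_a * ||q||_b,
# with conjugate exponents 1/a + 1/b = 1.  a = b = 2 gives Cauchy-Schwarz.
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density N(mu, sigma^2) on the grid x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def holder_divergence(p, q, dx, alpha=2.0):
    """-log of the Hoelder-inequality ratio; >= 0, zero iff the bound is tight."""
    beta = alpha / (alpha - 1.0)                      # conjugate exponent
    inner = np.sum(p * q) * dx                        # integral of p*q
    norm_p = (np.sum(p ** alpha) * dx) ** (1.0 / alpha)
    norm_q = (np.sum(q ** beta) * dx) ** (1.0 / beta)
    return -np.log(inner / (norm_p * norm_q))

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
p = gaussian_pdf(x, 0.0, 1.0)
q = gaussian_pdf(x, 1.0, 2.0)

d_cs = holder_divergence(p, q, dx, alpha=2.0)         # Cauchy-Schwarz case
d_h = holder_divergence(p, q, dx, alpha=3.0)          # a Hoelder generalization
# Projectivity: rescaling an (unnormalized) density leaves the value unchanged.
d_scaled = holder_divergence(5.0 * p, q, dx, alpha=3.0)
```

For two univariate Gaussians the Cauchy–Schwarz case admits a closed form (here −log ∫pq + ½log ∫p² + ½log ∫q² = 0.1 + ½log 1.25 for these parameters), which the numerical estimate matches, consistent with the paper's closed-form results for exponential families with conic natural parameter spaces.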
Keywords: Hölder inequalities; Hölder divergences; projective divergences; Cauchy–Schwarz divergence; Hölder escort divergences; skew Bhattacharyya divergences; exponential families; conic exponential families; escort distribution; clustering
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Nielsen, F.; Sun, K.; Marchand-Maillet, S. On Hölder Projective Divergences. Entropy 2017, 19, 122.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland