Open Access Article

On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means

Frank Nielsen
Sony Computer Science Laboratories, Takanawa Muse Bldg., 3-14-13, Higashigotanda, Shinagawa-ku, Tokyo 141-0022, Japan
Entropy 2019, 21(5), 485; https://doi.org/10.3390/e21050485
Received: 10 April 2019 / Revised: 8 May 2019 / Accepted: 9 May 2019 / Published: 11 May 2019
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract

The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence: it measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means, which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report closed-form formulas for (i) the geometric Jensen–Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback–Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen–Shannon divergences are touched upon.
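For concreteness, the classical Jensen–Shannon divergence is JS(p, q) = (1/2) KL(p : (p+q)/2) + (1/2) KL(q : (p+q)/2), i.e., the arithmetic mean builds the mixture. Substituting the geometric mean keeps the mixture inside a given exponential family, which is what yields a closed form for Gaussians. Below is a minimal sketch of this idea for univariate Gaussians, assuming the definition of the geometric JS divergence as the weighted sum of Kullback–Leibler divergences to the normalized geometric mixture; the function names are ours, and this is an illustration rather than the paper's reference implementation.

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) between univariate Gaussians."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def geometric_mixture(m1, s1, m2, s2, alpha=0.5):
    """Normalized geometric mean p1^(1-alpha) * p2^alpha of two Gaussians.

    The result stays Gaussian: the natural parameters interpolate linearly,
    so the precision is the weighted arithmetic mean of the precisions.
    """
    precision = (1 - alpha) / s1 ** 2 + alpha / s2 ** 2
    var = 1.0 / precision
    mean = var * ((1 - alpha) * m1 / s1 ** 2 + alpha * m2 / s2 ** 2)
    return mean, math.sqrt(var)

def geometric_js(m1, s1, m2, s2, alpha=0.5):
    """Skewed geometric Jensen-Shannon divergence between two Gaussians:
    (1 - alpha) * KL(p1 : g) + alpha * KL(p2 : g), with g the geometric mixture."""
    mg, sg = geometric_mixture(m1, s1, m2, s2, alpha)
    return ((1 - alpha) * kl_gauss(m1, s1, mg, sg)
            + alpha * kl_gauss(m2, s2, mg, sg))

# Two unit-variance Gaussians two standard deviations apart:
print(geometric_js(0.0, 1.0, 2.0, 1.0))
```

Unlike the ordinary Jensen–Shannon divergence between Gaussians, which would require numerical integration or Monte Carlo estimation, every step above is in closed form.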
Keywords: Jensen–Shannon divergence; Jeffreys divergence; resistor average distance; Bhattacharyya distance; f-divergence; Jensen/Burbea–Rao divergence; Bregman divergence; abstract weighted mean; quasi-arithmetic mean; mixture family; statistical M-mixture; exponential family; Gaussian family; Cauchy scale family; clustering
This is an open access article distributed under the Creative Commons Attribution (CC BY 4.0) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Nielsen, F. On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means. Entropy 2019, 21, 485.


