
11,447 Results Found

  • Article
  • Open Access
2,014 Views
22 Pages

Tight Bounds Between the Jensen–Shannon Divergence and the Minmax Divergence

  • Arseniy Akopyan,
  • Herbert Edelsbrunner,
  • Žiga Virk and
  • Hubert Wagner

11 August 2025

Motivated by questions arising at the intersection of information theory and geometry, we compare two dissimilarity measures between finite categorical distributions. One is the well-known Jensen–Shannon divergence, which is easy to compute and...
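For finite categorical distributions, the Jensen–Shannon divergence discussed in this abstract has a simple closed form; a minimal Python sketch (the distributions `p` and `q` are illustrative, not from the paper):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for finite distributions (natural log)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their midpoint m."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]  # illustrative categorical distributions
q = [0.1, 0.4, 0.5]
d = js(p, q)
assert abs(js(p, q) - js(q, p)) < 1e-12  # symmetric
assert 0.0 <= d <= math.log(2)           # bounded by ln 2 in nats
```

Unlike KL, the midpoint mixture keeps both arguments absolutely continuous with respect to it, so mismatched supports (as in `p` and `q` above) cause no divergence to infinity.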

  • Article
  • Open Access
5 Citations
3,976 Views
17 Pages

α-Geodesical Skew Divergence

  • Masanari Kimura and
  • Hideitsu Hino

25 April 2021

The asymmetric skew divergence smooths one of the distributions by mixing it, to a degree determined by the parameter λ, with the other distribution. Such a divergence is an approximation of the KL divergence that does not require the target distributi...
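One common parametrization of the skew divergence (illustrative here; the paper's α-geodesical generalization differs in detail) takes the KL divergence from one distribution to its λ-mixture with the other:

```python
import math

def kl(p, q):
    """KL(p || q) for finite distributions (natural log)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def skew(p, q, lam):
    """Skew divergence: KL from p to the mixture lam*q + (1-lam)*p.
    As lam -> 1 it approaches KL(p || q); q need not cover p's support."""
    mix = [lam * qi + (1 - lam) * pi for pi, qi in zip(p, q)]
    return kl(p, mix)

p = [0.6, 0.4, 0.0]   # illustrative distributions
q = [0.0, 0.5, 0.5]   # KL(p || q) would be infinite here
d = skew(p, q, 0.99)  # finite: the mixture retains p's support
assert math.isfinite(d)
assert abs(skew(p, q, 0.0)) < 1e-12  # lam = 0 compares p with itself
```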

  • Article
  • Open Access
5 Citations
6,421 Views
21 Pages

The divergence or relative entropy between probability densities is examined. Solutions that minimise the divergence between two distributions are usually “trivial” or unique. By using a fractional-order formulation for the divergence with respect to...

  • Article
  • Open Access
15 Citations
5,314 Views
16 Pages

Geometry Induced by a Generalization of Rényi Divergence

  • David C. De Souza,
  • Rui F. Vigelis and
  • Charles C. Cavalcante

17 November 2016

In this paper, we propose a generalization of Rényi divergence, and then we investigate its induced geometry. This generalization is given in terms of a φ-function, the same function that is used in the definition of non-parametric φ-families. The pr...
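The classical Rényi divergence that the paper generalizes can be sketched as follows (the distributions and orders are illustrative):

```python
import math

def renyi(p, q, alpha):
    """Classical Renyi divergence of order alpha (alpha > 0, alpha != 1).
    Recovers KL(p || q) in the limit alpha -> 1."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

p = [0.7, 0.3]  # illustrative distributions
q = [0.5, 0.5]
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
# an order close to 1 approximates the KL divergence
assert abs(renyi(p, q, 1.0001) - kl) < 1e-3
# the Renyi divergence is nondecreasing in the order alpha
assert renyi(p, q, 0.5) <= renyi(p, q, 2.0) + 1e-12
```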

  • Article
  • Open Access
121 Citations
18,037 Views
24 Pages

16 February 2020

The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the sc...

  • Review
  • Open Access
5 Citations
6,349 Views
13 Pages

11 June 2010

In recent years, minimum phi-divergence estimators (MϕE) and phi-divergence test statistics (ϕTS) have been introduced as a very good alternative to the classical likelihood ratio test and maximum likelihood estimator for different statistical problems....

  • Article
  • Open Access
143 Views
36 Pages

On Minimum Bregman Divergence Inference

  • Soumik Purkayastha and
  • Ayanendranath Basu

13 February 2026

The density power divergence (DPD) is a well-studied member of the Bregman divergence family and forms the basis of widely used minimum divergence estimators that balance efficiency and robustness. In this paper, we introduce and study a new sub-clas...
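The density power divergence has a standard closed form; a sketch on finite supports (illustrative of the classical DPD only, not of the paper's new sub-class):

```python
import math

def dpd(p, q, alpha):
    """Density power divergence d_alpha(p, q) on finite supports (alpha > 0).
    Recovers KL(p || q) in the limit alpha -> 0."""
    return sum(qi ** (1 + alpha)
               - (1 + 1 / alpha) * pi * qi ** alpha
               + (1 / alpha) * pi ** (1 + alpha)
               for pi, qi in zip(p, q))

p = [0.3, 0.7]  # illustrative distributions
q = [0.5, 0.5]
assert dpd(p, q, 0.5) >= 0           # nonnegative
assert abs(dpd(p, p, 0.5)) < 1e-12   # zero iff the arguments coincide
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
assert abs(dpd(p, q, 1e-4) - kl) < 1e-3  # small alpha approximates KL
```

The tuning parameter α trades efficiency (α near 0, close to maximum likelihood) against robustness to outliers (larger α), which is the balance the abstract refers to.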

  • Feature Paper
  • Article
  • Open Access
1,317 Views
32 Pages

30 June 2024

This paper establishes a general framework for measuring statistical divergence. Namely, with regard to a pair of random variables that share a common range of values: quantifying the distance of the statistical distribution of one random variable fr...

  • Article
  • Open Access
61 Citations
9,449 Views
25 Pages

6 June 2018

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kull...

  • Article
  • Open Access
6 Citations
3,885 Views
22 Pages

6 September 2024

The Kullback–Leibler (KL) divergence is a widely used measure for comparing probability distributions, but it faces limitations such as its unbounded nature and the lack of comparability between distributions with different quantum values (the...
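The unboundedness mentioned in this abstract is easy to exhibit numerically; a small sketch with illustrative distributions:

```python
import math

def kl(p, q):
    """KL(p || q) for finite distributions (natural log)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
# as q concentrates away from p, KL(p || q) grows without bound,
# even though p and q remain valid probability distributions
vals = [kl(p, [1 - eps, eps]) for eps in (1e-2, 1e-4, 1e-6)]
assert vals[0] < vals[1] < vals[2]
```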

  • Article
  • Open Access
20 Citations
6,347 Views
36 Pages

18 May 2020

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, the...
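One classical inequality linking the two divergences can be checked numerically (the distributions are illustrative; the paper's integral relations are more general):

```python
import math

def kl(p, q):
    """Relative entropy KL(p || q) for finite distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2(p, q):
    """Chi-squared divergence: sum over i of (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

p = [0.2, 0.3, 0.5]  # illustrative distributions
q = [0.4, 0.4, 0.2]
# by Jensen's inequality, KL(p||q) <= log(1 + chi2(p||q))
assert kl(p, q) <= math.log(1 + chi2(p, q)) + 1e-12
```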

  • Article
  • Open Access
2 Citations
5,888 Views
27 Pages

9 September 2017

In image and signal processing, the beta-divergence is well known as a similarity measure between two positive objects. However, it is unclear whether or not the distance-like structure of beta-divergence is preserved, if we extend the domain of the...

  • Article
  • Open Access
20 Citations
5,848 Views
39 Pages

Ensemble Estimation of Information Divergence

  • Kevin R. Moon,
  • Kumar Sricharan,
  • Kristjan Greenewald and
  • Alfred O. Hero

27 July 2018

Recent work has focused on the problem of nonparametric estimation of information divergence functionals between two continuous random variables. Many existing approaches require either restrictive assumptions about the density support set or difficu...

  • Article
  • Open Access
4 Citations
6,772 Views
16 Pages

4 February 2017

This paper investigates the potential of conceptual divergences within and between languages for providing intellectual resources for theorizing. Specifically, it explores the role of multilingual researchers in using the possibilities of the plurali...

  • Article
  • Open Access
16 Citations
7,975 Views
21 Pages

Duality of Maximum Entropy and Minimum Divergence

  • Shinto Eguchi,
  • Osamu Komori and
  • Atsumi Ohara

26 June 2014

We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class associ...

  • Article
  • Open Access
7 Citations
3,958 Views
26 Pages

Discriminant Analysis under f-Divergence Measures

  • Anmol Dwivedi,
  • Sihui Wang and
  • Ali Tajer

27 January 2022

In statistical inference, the information-theoretic performance limits can often be expressed in terms of a statistical divergence between the underlying statistical models (e.g., in binary hypothesis testing, the error probability is related to the...
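A generic f-divergence takes a convex generator f with f(1) = 0; a minimal sketch recovering two familiar special cases (distributions and generators are illustrative):

```python
import math

def f_divergence(p, q, f):
    """f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i),
    for convex f with f(1) = 0 (assumes strictly positive q)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

p = [0.3, 0.7]  # illustrative distributions
q = [0.5, 0.5]
kl = f_divergence(p, q, lambda t: t * math.log(t))   # f(t) = t log t  -> KL
tv = f_divergence(p, q, lambda t: 0.5 * abs(t - 1))  # f(t) = |t-1|/2 -> total variation
assert abs(tv - 0.2) < 1e-9
assert kl >= 0
```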

  • Feature Paper
  • Article
  • Open Access
3 Citations
4,152 Views
9 Pages

16 November 2021

A non-uniform (skewed) mixture of probability density functions occurs in various disciplines. One needs a measure of similarity to the respective constituents, together with its bounds. We introduce a skewed Jensen–Fisher divergence based on relative Fisher in...

  • Article
  • Open Access
32 Citations
7,248 Views
21 Pages

Robust and Sparse Regression via γ-Divergence

  • Takayuki Kawashima and
  • Hironori Fujisawa

13 November 2017

In high-dimensional data, many sparse regression methods have been proposed. However, they may not be robust against outliers. Recently, the use of density power weight has been studied for robust parameter estimation, and the corresponding divergenc...

  • Article
  • Open Access
2 Citations
958 Views
14 Pages

On the Horizontal Divergence Asymmetry in the Gulf of Mexico

  • Tianshu Zhou,
  • Jin-Han Xie and
  • Dhruv Balwada

17 January 2025

Due to the geostrophic balance, large-scale oceanic flows are often assumed to be horizontally divergence-free. However, the geostrophic balance is only a leading-order approximation. We investigate the statistical features of weak horizontal compres...

  • Article
  • Open Access
28 Citations
11,089 Views
38 Pages

8 July 2011

Generalisation error estimation is an important issue in machine learning. Cross-validation, traditionally used for this purpose, requires building multiple models and repeating the whole procedure many times in order to produce reliable error estimate...

  • Article
  • Open Access
17 Citations
5,724 Views
14 Pages

21 January 2016

This paper studies the contrastive divergence (CD) learning algorithm and proposes a new algorithm for training restricted Boltzmann machines (RBMs). We show that CD is a biased estimator of the log-likelihood gradient and provide an analysis of th...

  • Article
  • Open Access
3 Citations
2,482 Views
25 Pages

Multivariate Shortfall and Divergence Risk Statistics

  • Haiyan Song,
  • Xianfu Zeng,
  • Yanhong Chen and
  • Yijun Hu

24 October 2019

The aim of this paper is to construct two new classes of multivariate risk statistics and to study their properties. We first introduce the multivariate shortfall risk statistics and multivariate divergence risk statistics. Then, their basic prope...

  • Article
  • Open Access
3 Citations
3,811 Views
18 Pages

27 June 2019

This paper introduces a new family of convex divergence-based risk measures by specifying the (h, ϕ)-divergence, corresponding to the dual representation. First, the sensitivity characteristics of the modified divergence risk measure with...

  • Article
  • Open Access
1 Citation
1,783 Views
12 Pages

28 November 2022

This study considers a new decomposition of an extended divergence on a foliation by deformed probability simplexes from the information geometry perspective. In particular, we treat the case where each deformed probability simplex corresponds to a s...

  • Article
  • Open Access
3 Citations
3,233 Views
19 Pages

8 August 2018

This article deals with new concepts in a product MV-algebra, namely, with the concepts of Rényi entropy and Rényi divergence. We define the Rényi entropy of order q of a partition in a product MV-algebra and its conditional vers...

  • Article
  • Open Access
4 Citations
3,285 Views
17 Pages

A Non-Linear Filtering Algorithm Based on Alpha-Divergence Minimization

  • Yarong Luo,
  • Chi Guo,
  • Jiansheng Zheng and
  • Shengyong You

24 September 2018

A non-linear filtering algorithm based on the alpha-divergence is proposed, which uses the exponential family distribution to approximate the actual state distribution and the alpha-divergence to measure the approximation degree between the two distr...

  • Article
  • Open Access
1,937 Views
10 Pages

26 April 2022

Information geometry concerns the study of a dual structure (g, ∇, ∇*) upon a smooth manifold M. Such a geometry is totally encoded within a potential function usually referred to as a divergence or contrast function of (g, ∇, ∇*)...

  • Article
  • Open Access
1 Citation
3,277 Views
14 Pages

8 June 2021

Within exponential families, which may consist of multi-parameter and multivariate distributions, a variety of divergence measures, such as the Kullback–Leibler divergence, the Cressie–Read divergence, the Rényi divergence, and the Hellinger metric,...

  • Article
  • Open Access
74 Citations
15,004 Views
21 Pages

Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions

  • Javier E. Contreras-Reyes and
  • Reinaldo B. Arellano-Valle

4 September 2012

The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multiva...

  • Article
  • Open Access
1,566 Views
16 Pages

Asymptotic Properties of a Statistical Estimator of the Jeffreys Divergence: The Case of Discrete Distributions

  • Vladimir Glinskiy,
  • Artem Logachov,
  • Olga Logachova,
  • Helder Rojas,
  • Lyudmila Serga and
  • Anatoly Yambartsev

23 October 2024

We investigate the asymptotic properties of the plug-in estimator for the Jeffreys divergence, the symmetric variant of the Kullback–Leibler (KL) divergence. This study focuses specifically on the divergence between discrete distributions. Trad...
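The plug-in estimator studied here substitutes empirical frequencies for the true pmfs in the Jeffreys divergence; a sketch for discrete distributions (sample sizes and weights are illustrative, and all empirical cells are assumed nonzero):

```python
import math
import random
from collections import Counter

def jeffreys(p, q):
    """Jeffreys divergence: symmetrized KL, J(p, q) = KL(p||q) + KL(q||p).
    Assumes strictly positive entries."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def plug_in(xs, ys, support):
    """Plug-in estimate: empirical frequencies substituted for the true pmfs."""
    cx, cy = Counter(xs), Counter(ys)
    p_hat = [cx[s] / len(xs) for s in support]
    q_hat = [cy[s] / len(ys) for s in support]
    return jeffreys(p_hat, q_hat)

random.seed(0)  # deterministic illustration
support = [0, 1]
xs = random.choices(support, weights=[0.6, 0.4], k=20000)
ys = random.choices(support, weights=[0.4, 0.6], k=20000)
est = plug_in(xs, ys, support)
true = jeffreys([0.6, 0.4], [0.4, 0.6])
assert abs(est - true) < 0.05  # consistent as the sample sizes grow
```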

  • Article
  • Open Access
1 Citation
793 Views
26 Pages

5 July 2025

Given finite-dimensional random vectors Y, X, and Z that form a Markov chain in that order (Y → X → Z), we derive upper bounds on the excess minimum risk using generalized information divergence measures. Here, Y is a target vector to be es...

  • Article
  • Open Access
4 Citations
4,090 Views
17 Pages

Point Divergence Gain and Multidimensional Data Sequences Analysis

  • Renata Rychtáriková,
  • Jan Korbel,
  • Petr Macháček and
  • Dalibor Štys

3 February 2018

We introduce novel information-entropic variables—a Point Divergence Gain (Ω_α^(l→m)), a Point Divergence Gain Entropy (I_α), and a Point Divergence Gain Entropy Density (P_α)—which are derived from the Rényi entropy and...

  • Feature Paper
  • Article
  • Open Access
1,707 Views
25 Pages

Numerical Algorithms for Divergence-Free Velocity Applications

  • Giacomo Barbi,
  • Antonio Cervone and
  • Sandro Manservisi

14 August 2024

This work focuses on the well-known issue of mass conservation in the context of the finite element technique for computational fluid dynamic simulations. Specifically, non-conventional finite element families for solving Navier–Stokes equation...

  • Article
  • Open Access
2,701 Views
27 Pages

24 January 2024

We examine the co-variability between the surface wind divergence and vorticity and how it varies with latitude in the Pacific Ocean using surface vector winds from reanalysis and satellite scatterometer observations. We show a strong correlation bet...

  • Article
  • Open Access
2 Citations
2,234 Views
17 Pages

In this paper, we present a third-order, divergence-free finite volume scheme to simulate the steady-state solar wind ambient. The divergence-free condition of the magnetic field is preserved by the constrained transport (CT) method. The CT method ca...

  • Article
  • Open Access
2,010 Views
27 Pages

29 October 2025

Owing to the lack of a unified ESG (environmental, social and governance) rating standard, notable inconsistencies have appeared in ESG ratings assigned to the same firm by various rating agencies. Based on data encompassing Chinese A-share listed co...

  • Article
  • Open Access
17 Citations
14,792 Views
26 Pages

21 May 2025

In recent years, corporate ESG performance has been widely incorporated into investment decisions and capital allocation considerations, becoming a focal point and hot topic for research by governments and organizations worldwide. However, due to var...

  • Article
  • Open Access
2 Citations
5,344 Views
22 Pages

11 April 2013

We use the biorthogonal multiwavelets related by differentiation constructed in previous work to construct compactly supported biorthogonal multiwavelet bases for the space of vector fields on the upper half-plane R^2_+ such that the reconstruction wa...

  • Article
  • Open Access
6 Citations
6,486 Views
15 Pages

Robust Aggregation for Federated Learning by Minimum γ-Divergence Estimation

  • Cen-Jhih Li,
  • Pin-Han Huang,
  • Yi-Ting Ma,
  • Hung Hung and
  • Su-Yun Huang

13 May 2022

Federated learning is a framework for multiple devices or institutions, called local clients, to collaboratively train a global model without sharing their data. For federated learning with a central server, an aggregation algorithm integrates model...

  • Article
  • Open Access
21 Citations
4,248 Views
25 Pages

Correlating Extremes in Wind Divergence with Extremes in Rain over the Tropical Atlantic

  • Gregory P. King,
  • Marcos Portabella,
  • Wenming Lin and
  • Ad Stoffelen

25 February 2022

Air–sea fluxes are greatly enhanced by the winds and vertical exchanges generated by mesoscale convective systems (MCSs). In contrast to global numerical weather prediction models, space-borne scatterometers are able to resolve the small-scale...

  • Review
  • Open Access
10 Citations
5,594 Views
69 Pages

30 March 2024

Notothenioid fishes, a perciform group, radiated in the cold shelf waters around the Antarctic continent and the 110 species dominate fish diversity, abundance, and biomass at levels of ≈77%, 92%, and 91%, respectively. This occurred in a local...

  • Article
  • Open Access
9 Citations
5,777 Views
13 Pages

3 March 2017

The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures. In particular, chain rules for mutu...

  • Article
  • Open Access
14 Citations
4,258 Views
13 Pages

Computation of Kullback–Leibler Divergence in Bayesian Networks

  • Serafín Moral,
  • Andrés Cano and
  • Manuel Gómez-Olmedo

28 August 2021

Kullback–Leibler divergence KL(p,q) is the standard measure of error when we have a true probability distribution p which is approximated by a probability distribution q. Its efficient computation is essential in many tasks, as in approximate computat...

  • Article
  • Open Access
1,352 Views
14 Pages

This study investigates a laser ranging technology scheme featuring a large divergence angle for both the emitted and received laser beams, focusing on applications where both the measured target and the ranging carrier are high-mobility platforms. A...

  • Article
  • Open Access
13 Citations
4,062 Views
13 Pages

Entropic Divergence and Entropy Related to Nonlinear Master Equations

  • Tamás Sándor Biró,
  • Zoltán Néda and
  • András Telcs

11 October 2019

We reverse engineer entropy formulas from entropic divergence, optimized to given classes of probability distribution function (PDF) evolution dynamics. For linear dynamics of the distribution function, the traditional Kullback–Leible...
