
390 Results Found

  • Article
  • Open Access
3 Citations
3,195 Views
19 Pages

8 August 2018

This article deals with new concepts in a product MV-algebra, namely, with the concepts of Rényi entropy and Rényi divergence. We define the Rényi entropy of order q of a partition in a product MV-algebra and its conditional vers...
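For orientation only (the paper adapts these notions to partitions in a product MV-algebra, not plain probability vectors): the classical Rényi entropy of order q of a discrete distribution P = (p_1, ..., p_n) is

$$ H_q(P) = \frac{1}{1-q}\log\sum_{i} p_i^{\,q}, \qquad q > 0,\; q \neq 1, $$

which recovers the Shannon entropy in the limit q → 1.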

  • Article
  • Open Access
20 Citations
6,246 Views
36 Pages

18 May 2020

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, the...
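For reference, the standard discrete-case definitions of the two divergences under study are

$$ D(P\|Q) = \sum_i p_i \log\frac{p_i}{q_i}, \qquad \chi^2(P\|Q) = \sum_i \frac{(p_i - q_i)^2}{q_i}, $$

with the natural continuous analogues obtained by replacing sums with integrals.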

  • Article
  • Open Access
7 Citations
4,283 Views
18 Pages

R-Norm Entropy and R-Norm Divergence in Fuzzy Probability Spaces

  • Dagmar Markechová,
  • Batool Mosapour and
  • Abolfazl Ebrahimzadeh

11 April 2018

In the presented article, we define the R-norm entropy and the conditional R-norm entropy of partitions of a given fuzzy probability space and study the properties of the suggested entropy measures. In addition, we introduce the concept of R-norm div...

  • Article
  • Open Access
5 Citations
4,147 Views
10 Pages

12 December 2018

Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to...
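The equivalence invoked here follows from a standard identity: for a data distribution P and a model distribution Q,

$$ H(P, Q) \;=\; -\sum_i p_i \log q_i \;=\; H(P) + D_{\mathrm{KL}}(P\|Q), $$

so minimizing the cross entropy H(P, Q) in Q (with P fixed) is the same as minimizing the K-L divergence, i.e., maximum-likelihood estimation.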

  • Article
  • Open Access
33 Citations
9,543 Views
13 Pages

25 February 2010

In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects...

  • Article
  • Open Access
13 Citations
4,018 Views
13 Pages

Entropic Divergence and Entropy Related to Nonlinear Master Equations

  • Tamás Sándor Biró,
  • Zoltán Néda and
  • András Telcs

11 October 2019

We reverse engineer entropy formulas from entropic divergence, optimized to given classes of probability distribution function (PDF) evolution dynamical equation. For linear dynamics of the distribution function, the traditional Kullback–Leible...

  • Article
  • Open Access
10 Citations
4,069 Views
14 Pages

Logical Divergence, Logical Entropy, and Logical Mutual Information in Product MV-Algebras

  • Dagmar Markechová,
  • Batool Mosapour and
  • Abolfazl Ebrahimzadeh

16 February 2018

In the paper we propose, using the logical entropy function, a new kind of entropy in product MV-algebras, namely the logical entropy and its conditional version. Fundamental characteristics of these quantities have been shown and subsequently, the r...

  • Article
  • Open Access
6 Citations
3,232 Views
32 Pages

6 January 2024

Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure can handle high-dimensional problems, dividing them into a sparse collection of smaller ones; it underlies Judea Pearl’s causality,...

  • Article
  • Open Access
56 Citations
8,376 Views
31 Pages

23 March 2015

The fuzzy oil drop model, a tool which can be used to study the structure of the hydrophobic core in proteins, has been applied in the analysis of proteins belonging to the jumonji group—JARID2, JARID1A, JARID1B and JARID1D—proteins that share the pr...

  • Article
  • Open Access
44 Citations
5,282 Views
22 Pages

20 June 2019

Dempster–Shafer (DS) evidence theory is widely applied in multi-source data fusion technology. However, the classical DS combination rule fails to deal with situations in which evidence is highly conflicting. To address this problem, a novel multi-s...

  • Article
  • Open Access
16 Citations
7,917 Views
21 Pages

Duality of Maximum Entropy and Minimum Divergence

  • Shinto Eguchi,
  • Osamu Komori and
  • Atsumi Ohara

26 June 2014

We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class associ...

  • Article
  • Open Access
24 Citations
9,030 Views
22 Pages

1 March 2010

Csiszár’s f-divergence of two probability distributions was extended to the quantum case by the author in 1985. In the quantum setting, positive semidefinite matrices are in the place of probability distributions and the quantum generalization is cal...

  • Article
  • Open Access
2 Citations
1,134 Views
34 Pages

6 September 2024

Two typical fixed-length random number generation problems in information theory are considered for general sources. One is the source resolvability problem and the other is the intrinsic randomness problem. In each of these problems, the optimum ach...

  • Feature Paper
  • Article
  • Open Access
12 Citations
4,982 Views
32 Pages

Local Intrinsic Dimensionality, Entropy and Statistical Divergences

  • James Bailey,
  • Michael E. Houle and
  • Xingjun Ma

30 August 2022

Properties of data distributions can be assessed at both global and local scales. At a highly localized scale, a fundamental measure is the local intrinsic dimensionality (LID), which assesses growth rates of the cumulative distribution function with...

  • Article
  • Open Access
7 Citations
2,293 Views
19 Pages

Entropy Minimization for Generalized Newtonian Fluid Flow between Converging and Diverging Channels

  • Sohail Rehman,
  • Hashim,
  • Abdelaziz Nasr,
  • Sayed M. Eldin and
  • Muhammad Y. Malik

17 October 2022

The foremost focus of this article was to investigate the entropy generation in hydromagnetic flow of generalized Newtonian Carreau nanofluid through a converging and diverging channel. In addition, a heat transport analysis was performed for Carreau...

  • Article
  • Open Access
9 Citations
5,259 Views
32 Pages

6 May 2018

Entropy and relative entropy measures play a crucial role in mathematical information theory. The relative entropies are also widely used in statistics under the name of divergence measures which link these two fields of science through the minimum d...

  • Article
  • Open Access
6 Citations
4,703 Views
12 Pages

24 May 2019

Complex networks of coupled maps of matrices (NCMM) are investigated in this paper. It is shown that an NCMM can evolve into two different steady states—the quiet state or the state of divergence. It appears that chimera states of spatiotemporal diver...

  • Feature Paper
  • Article
  • Open Access
41 Citations
5,520 Views
16 Pages

Correlations of Cross-Entropy Loss in Machine Learning

  • Richard Connor,
  • Alan Dearle,
  • Ben Claydon and
  • Lucia Vadicamo

3 June 2024

Cross-entropy loss is crucial in training many deep neural networks. In this context, we show a number of novel and strong correlations among various related divergence functions. In particular, we demonstrate that, in some circumstances, (a) cross-e...
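As a reminder of the loss in question (the textbook form, not the paper's specific correlation results): for a one-hot label y and predicted class probabilities ŷ over classes c, the per-sample cross-entropy loss is

$$ \mathcal{L}(y, \hat{y}) = -\sum_{c} y_c \log \hat{y}_c, $$

whose average over a dataset is the empirical cross entropy between the label distribution and the model's predictive distribution.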

  • Article
  • Open Access
5 Citations
2,498 Views
22 Pages

Divergence Entropy-Based Evaluation of Hydrophobic Core in Aggressive and Resistant Forms of Transthyretin

  • Mateusz Banach,
  • Katarzyna Stapor,
  • Piotr Fabian,
  • Leszek Konieczny and
  • Irena Roterman

13 April 2021

The two forms of transthyretin differing slightly in the tertiary structure, despite the presence of five mutations, show radically different properties in terms of susceptibility to the amyloid transformation process. These two forms of transthyreti...

  • Article
  • Open Access
353 Citations
22,984 Views
37 Pages

14 June 2010

In this paper, we extend and overview wide families of Alpha-, Beta- and Gamma-divergences and discuss their fundamental properties. In literature usually only one single asymmetric (Alpha, Beta or Gamma) divergence is considered. We show in this pap...

  • Feature Paper
  • Article
  • Open Access
1,272 Views
32 Pages

30 June 2024

This paper establishes a general framework for measuring statistical divergence. Namely, with regard to a pair of random variables that share a common range of values: quantifying the distance of the statistical distribution of one random variable fr...

  • Article
  • Open Access
1 Citation
2,161 Views
24 Pages

The purpose of this paper is twofold. On a technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman dive...

  • Article
  • Open Access
4 Citations
4,071 Views
17 Pages

Point Divergence Gain and Multidimensional Data Sequences Analysis

  • Renata Rychtáriková,
  • Jan Korbel,
  • Petr Macháček and
  • Dalibor Štys

3 February 2018

We introduce novel information-entropic variables—a Point Divergence Gain (Ω_α^(l→m)), a Point Divergence Gain Entropy (I_α), and a Point Divergence Gain Entropy Density (P_α)—which are derived from the Rényi entropy and...

  • Feature Paper
  • Article
  • Open Access
1 Citation
1,839 Views
13 Pages

18 September 2022

This paper presents certain theoretical studies and additional computational analysis of symmetry, and its absence, in Kullback-Leibler and Jeffreys probabilistic divergences related to some engineering applications. As is known, the Kullback-Leibler di...

  • Feature Paper
  • Article
  • Open Access
40 Citations
6,352 Views
21 Pages

17 August 2019

This paper presents the entropic damage indicators for metallic material fatigue processes obtained from three associated energy dissipation sources. Since its inception, reliability engineering has employed statistical and probabilistic models to as...

  • Article
  • Open Access
4 Citations
3,284 Views
10 Pages

18 March 2021

In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results. Next, we consider Fisher information and Bayes–Fisher information measures for mixing p...

  • Article
  • Open Access
20 Citations
6,441 Views
11 Pages

9 August 2018

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery t...

  • Article
  • Open Access
15 Citations
10,174 Views
18 Pages

8 March 2011

A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence e...
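For comparison, and as an assumption about the baseline being adapted: the classical Kozachenko–Leonenko knn entropy estimator for N samples in d-dimensional Euclidean space is

$$ \hat{H} = \psi(N) - \psi(k) + \log c_d + \frac{d}{N}\sum_{i=1}^{N} \log \varepsilon_i, $$

where ψ is the digamma function, c_d the volume of the unit d-ball, and ε_i the distance from sample i to its k-th nearest neighbor; the hyperspherical setting requires replacing the Euclidean volume element accordingly.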

  • Feature Paper
  • Article
  • Open Access
5 Citations
2,517 Views
23 Pages

Understanding Higher-Order Interactions in Information Space

  • Herbert Edelsbrunner,
  • Katharina Ölsböck and
  • Hubert Wagner

27 July 2024

Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured wit...

  • Article
  • Open Access
1,736 Views
26 Pages

Fisher-like Metrics Associated with ϕ-Deformed (Naudts) Entropies

  • Cristina-Liliana Pripoae,
  • Iulia-Elena Hirica,
  • Gabriel-Teodor Pripoae and
  • Vasile Preda

17 November 2022

The paper defines and studies new semi-Riemannian generalized Fisher metrics and Fisher-like metrics, associated with entropies and divergences. Examples of seven such families are provided, based on exponential PDFs. The particular case when the bas...

  • Article
  • Open Access
3 Citations
3,908 Views
121 Pages

8 August 2020

We compute exact values, respectively bounds, of dissimilarity/distinguishability measures, in the sense of the Kullback-Leibler information distance (relative entropy) and some transforms of more general power divergences and Rényi divergences...

  • Article
  • Open Access
2 Citations
2,451 Views
24 Pages

31 August 2021

The doubly stochastic mechanism generating the realizations of spatial log-Gaussian Cox processes is empirically assessed in terms of generalized entropy, divergence and complexity measures. The aim is to characterize the contribution to stochasticit...

  • Article
  • Open Access
1,865 Views
14 Pages

Some Properties of Fractal Tsallis Entropy

  • Vasile Preda and
  • Răzvan-Cornel Sfetcu

We introduce fractal Tsallis entropy and show that it satisfies Shannon–Khinchin axioms. Analogously to Tsallis divergence (or Tsallis relative entropy, according to some authors), fractal Tsallis divergence is defined and some properties of it are s...
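For context, the classical Tsallis entropy that the fractal variant generalizes is, for a discrete distribution P and index q ≠ 1,

$$ S_q(P) = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{\,q}\Bigr), $$

which recovers the Shannon entropy as q → 1.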

  • Article
  • Open Access
2 Citations
3,098 Views
22 Pages

30 June 2020

Recent literature shows that many testing procedures used to evaluate asset pricing models result in spurious rejection probabilities. Model misspecification, the strong factor structure of test assets, or skewed test statistics largely explain this....

  • Article
  • Open Access
17 Citations
4,528 Views
25 Pages

4 October 2019

Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative e...
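The consensus definition of the Rényi divergence of order α alluded to here is

$$ D_\alpha(P\|Q) = \frac{1}{\alpha - 1}\log\sum_i p_i^{\,\alpha} q_i^{\,1-\alpha}, \qquad \alpha \in (0,1)\cup(1,\infty), $$

whereas several inequivalent Rényi-type mutual informations (e.g., the Sibson and Arimoto proposals) coexist in the literature.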

  • Article
  • Open Access
9 Citations
5,020 Views
18 Pages

10 June 2017

The purpose of the paper is to introduce, using the known results concerning the entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for the case of product MV algebras and examine algebraic properties o...

  • Article
  • Open Access
45 Citations
10,530 Views
39 Pages

12 June 2015

The main content of this review article is first to review the main inference tools using Bayes rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, Fisher information and its corre...

  • Concept Paper
  • Open Access
1 Citation
7,084 Views
14 Pages

In this paper, we borrow some of the key concepts of nonequilibrium statistical systems, to develop a framework for analyzing a self-organizing-optimizing system of independent interacting agents, with nonlinear dynamics at the macro level that is ba...

  • Article
  • Open Access
24 Citations
11,337 Views
28 Pages

23 January 2018

The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be...

  • Article
  • Open Access
18 Citations
6,221 Views
36 Pages

4 March 2021

Asymptotic unbiasedness and L^2-consistency are established, under mild conditions, for the estimates of the Kullback–Leibler divergence between two probability measures in R^d, absolutely continuous with respect to (w.r.t.) the Lebesgue measure....

  • Article
  • Open Access
5 Citations
6,850 Views
15 Pages

A Novel Nonparametric Distance Estimator for Densities with Error Bounds

  • Alexandre R.F. Carvalho,
  • João Manuel R. S. Tavares and
  • Jose C. Principe

6 May 2013

The use of a metric to assess distance between probability densities is an important practical problem. In this work, a particular metric induced by an α-divergence is studied. The Hellinger metric can be interpreted as a particular case within the f...
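For reference, the Hellinger metric mentioned in the abstract has the standard discrete form

$$ H(P,Q) = \Bigl(\frac{1}{2}\sum_i \bigl(\sqrt{p_i} - \sqrt{q_i}\bigr)^2\Bigr)^{1/2}, $$

and it arises from the α-divergence family at a particular α (the exact value depends on the parametrization convention in use).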

  • Article
  • Open Access
4,392 Views
15 Pages

14 December 2017

In the Bayesian framework, the usual choice of prior in the prediction of homogeneous Poisson processes with random effects is the gamma one. Here, we propose the use of higher order maximum entropy priors. Their advantage is illustrated in a simulat...

  • Feature Paper
  • Article
  • Open Access
640 Views
11 Pages

Local Invariance of Divergence-Based Quantum Information Measures

  • Christopher Popp,
  • Tobias C. Sutter and
  • Beatrix C. Hiesmayr

10 October 2025

Quantum information quantities, such as mutual information and entropies, are essential for characterizing quantum systems and protocols in quantum information science. In this contribution, we identify types of information measures based on generali...
