Search Results (19)

Search Parameters:
Keywords = Generalized Kullback-Leibler Relative Entropy

22 pages, 4988 KiB  
Article
Unsupervised Domain Adaptation Method Based on Relative Entropy Regularization and Measure Propagation
by Lianghao Tan, Zhuo Peng, Yongjia Song, Xiaoyi Liu, Huangqi Jiang, Shubing Liu, Weixi Wu and Zhiyuan Xiang
Entropy 2025, 27(4), 426; https://doi.org/10.3390/e27040426 - 14 Apr 2025
Viewed by 517
Abstract
This paper presents a novel unsupervised domain adaptation (UDA) framework that integrates information-theoretic principles to mitigate distributional discrepancies between source and target domains. The proposed method incorporates two key components: (1) relative entropy regularization, which leverages Kullback–Leibler (KL) divergence to align the predicted label distribution of the target domain with a reference distribution derived from the source domain, thereby reducing prediction uncertainty; and (2) measure propagation, a technique that transfers probability mass from the source domain to generate pseudo-measures—estimated probabilistic representations—for the unlabeled target domain. This dual mechanism enhances both global feature alignment and semantic consistency across domains. Extensive experiments on benchmark datasets (OfficeHome and DomainNet) demonstrate that the proposed approach consistently outperforms state-of-the-art methods, particularly in scenarios with significant domain shifts. These results confirm the robustness, scalability, and theoretical grounding of our framework, offering a new perspective on the fusion of information theory and domain adaptation.
(This article belongs to the Section Multidisciplinary Applications)
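
The relative entropy regularizer at the heart of this framework is compact enough to sketch. Below is a minimal illustration (our own naming and structure, not the authors' code): it penalizes the KL divergence between the batch-averaged predicted label distribution on the target domain and a reference label distribution derived from the source domain. The function name `kl_regularizer`, the uniform reference, and the batch shapes are all assumptions made for the sketch.

```python
import torch
import torch.nn.functional as F

def kl_regularizer(target_logits: torch.Tensor,
                   reference: torch.Tensor) -> torch.Tensor:
    """KL(mean target prediction || source-derived reference distribution).

    Penalizing this term pulls the batch-averaged predicted label
    distribution on the target domain toward the reference distribution.
    """
    p = F.softmax(target_logits, dim=1).mean(dim=0)   # avg. class distribution
    eps = 1e-12                                       # numerical guard
    return torch.sum(p * (torch.log(p + eps) - torch.log(reference + eps)))

# Hypothetical usage: 32 target samples, 10 classes, uniform reference.
logits = torch.randn(32, 10)
reference = torch.full((10,), 0.1)
loss = kl_regularizer(logits, reference)
print(loss.item())
```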

13 pages, 862 KiB  
Article
An Entropy-Based Approach to Model Selection with Application to Single-Cell Time-Stamped Snapshot Data
by William C. L. Stewart, Ciriyam Jayaprakash and Jayajit Das
Entropy 2025, 27(3), 274; https://doi.org/10.3390/e27030274 - 6 Mar 2025
Viewed by 781
Abstract
Recent single-cell experiments that measure copy numbers of over 40 proteins in thousands of individual cells at different time points [time-stamped snapshot (TSS) data] exhibit cell-to-cell variability. Because the same cells cannot be tracked over time, TSS data provide key information about the statistical time-evolution of protein abundances in single cells, information that could yield insights into the mechanisms influencing the biochemical signaling kinetics of a cell. However, when multiple candidate models (i.e., mechanistic models applied to initial protein abundances) can potentially explain the same TSS data, selecting the best model (i.e., model selection) is often challenging. For example, popular approaches like Kullback–Leibler divergence and Akaike’s Information Criterion are often difficult to implement largely because mathematical expressions for the likelihoods of candidate models are typically not available. To perform model selection, we introduce an entropy-based approach that uses split-sample techniques to exploit the availability of large data sets and uses (1) existing generalized method of moments (GMM) software to estimate model parameters, and (2) standard kernel density estimators and a Gaussian copula to estimate candidate models. Using simulated data, we show that our approach can select the “ground truth” from a set of competing mechanistic models. Then, to assess the relative support for a candidate model, we compute model selection probabilities using a bootstrap procedure.
(This article belongs to the Section Entropy and Biology)
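
The split-sample idea behind the approach can be illustrated with a drastically simplified sketch (ours, not the paper's code): fit a kernel density estimate to simulations from each candidate model and score each candidate by the out-of-sample cross-entropy of the observed data, which differs from the KL divergence only by the fixed entropy of the data. The paper's full procedure additionally uses GMM parameter estimation and a Gaussian copula; the data and models below are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def cross_entropy(model_samples, observed):
    """-E_observed[log f_model], with f_model a kernel density estimate
    built from simulations of a candidate model (lower is better)."""
    kde = gaussian_kde(model_samples.T)       # expects shape (dims, n)
    return -np.mean(np.log(kde(observed.T) + 1e-300))

rng = np.random.default_rng(0)
# "Observed" snapshot data (hypothetical stand-in for TSS measurements).
observed = rng.normal(1.0, 1.0, size=(2000, 2))
# Simulations from two candidate mechanistic models.
model_a = rng.normal(1.0, 1.0, size=(5000, 2))   # matches the data
model_b = rng.normal(0.0, 2.0, size=(5000, 2))   # misspecified

print(cross_entropy(model_a, observed))  # smaller
print(cross_entropy(model_b, observed))  # larger
```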

23 pages, 7837 KiB  
Article
Understanding Higher-Order Interactions in Information Space
by Herbert Edelsbrunner, Katharina Ölsböck and Hubert Wagner
Entropy 2024, 26(8), 637; https://doi.org/10.3390/e26080637 - 27 Jul 2024
Cited by 4 | Viewed by 2044
Abstract
Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured with an information theoretical distance. One such setting is a finite collection of discrete probability distributions embedded in the probability simplex measured with the relative entropy (Kullback–Leibler divergence). More generally, one can work with a Bregman divergence parameterized by a different notion of entropy. While theoretical algorithms exist for this setup, there is a paucity of implementations for exploring and comparing geometric-topological properties of various information spaces. The interest of this work is therefore twofold. First, we propose the first robust algorithms and software for geometric and topological data analysis in information space. Perhaps surprisingly, despite working with Bregman divergences, our design reuses robust libraries for the Euclidean case. Second, using the new software, we take the first steps towards understanding the geometric-topological structure of these spaces. In particular, we compare them with the more familiar spaces equipped with the Euclidean and Fisher metrics.
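
The relationship the abstract invokes between Bregman divergences and relative entropy can be checked numerically: with the negative Shannon entropy as generator, the Bregman divergence on the probability simplex coincides with the Kullback–Leibler divergence. A small self-contained verification (our example, not the paper's software):

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Generator: negative Shannon entropy on the probability simplex.
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

def kl(p, q):
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
# On the simplex the <grad, p - q> term cancels the sum constraint,
# so the Bregman divergence of the negative entropy is exactly the KL.
assert np.isclose(bregman(neg_entropy, grad_neg_entropy, p, q), kl(p, q))
```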

32 pages, 414 KiB  
Article
Statistical Divergence and Paths Thereof to Socioeconomic Inequality and to Renewal Processes
by Iddo Eliazar
Entropy 2024, 26(7), 565; https://doi.org/10.3390/e26070565 - 30 Jun 2024
Viewed by 964
Abstract
This paper establishes a general framework for measuring statistical divergence. Namely, with regard to a pair of random variables that share a common range of values: quantifying the distance of the statistical distribution of one random variable from that of the other. The general framework is then applied to the topics of socioeconomic inequality and renewal processes. The general framework and its applications are shown to yield and to relate to the following: f-divergence, Hellinger divergence, Rényi divergence, and Kullback–Leibler divergence (also known as relative entropy); the Lorenz curve and socioeconomic inequality indices; the Gini index and its generalizations; the divergence of renewal processes from the Poisson process; and the divergence of anomalous relaxation from regular relaxation. Presenting a ‘fresh’ perspective on statistical divergence, this paper offers its readers a simple and transparent construction of statistical-divergence gauges, as well as novel paths that lead from statistical divergence to the aforementioned topics.
(This article belongs to the Section Information Theory, Probability and Statistics)
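
As a concrete instance of such a unifying construction (our illustration, not the paper's code), the f-divergence sum_i q_i f(p_i / q_i) recovers the Kullback–Leibler divergence with f(t) = t log t and the squared Hellinger distance with f(t) = (sqrt(t) - 1)^2:

```python
import numpy as np

def f_divergence(p, q, f):
    """Generic f-divergence: sum_i q_i * f(p_i / q_i)."""
    return np.sum(q * f(p / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# f(t) = t log t  ->  Kullback-Leibler divergence (relative entropy).
kl = f_divergence(p, q, lambda t: t * np.log(t))
# f(t) = (sqrt(t) - 1)^2  ->  squared Hellinger distance.
hel2 = f_divergence(p, q, lambda t: (np.sqrt(t) - 1.0) ** 2)

assert np.isclose(kl, np.sum(p * np.log(p / q)))
print(kl, hel2)
```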
13 pages, 2920 KiB  
Article
On the Symmetry Importance in a Relative Entropy Analysis for Some Engineering Problems
by Marcin Kamiński
Symmetry 2022, 14(9), 1945; https://doi.org/10.3390/sym14091945 - 18 Sep 2022
Cited by 1 | Viewed by 1561
Abstract
This paper presents theoretical studies and additional computational analysis of symmetry, and its lack, in the Kullback–Leibler and Jeffreys probabilistic divergences in relation to some engineering applications. As is known, the Kullback–Leibler distance between two different uncertainty sources lacks symmetry, while the Jeffreys model represents its symmetrization. The basic probabilistic computational implementation has been delivered in the computer algebra system MAPLE 2019®, whereas the engineering illustrations have been prepared with the Finite Element Method systems Autodesk ROBOT® and ABAQUS®. The first two probabilistic moments, fundamental to the calculation of both relative entropies, have been determined (i) semi-analytically (based upon a series of FEM experiments) and (ii) with the iterative generalized stochastic perturbation technique, while reference solutions have been delivered by (iii) Monte Carlo simulation. The numerical analysis proves the fundamental role of computer algebra systems in probabilistic entropy determination and shows remarkable differences between the two aforementioned relative entropy models, which may be neglected only in some specific cases. As demonstrated in this work, a lack of symmetry in a probabilistic divergence may play a decisive role in engineering reliability, where extreme and admissible responses cannot simply be interchanged.
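
The asymmetry at issue is easy to exhibit numerically. For two univariate Gaussians the Kullback–Leibler distance has a well-known closed form, and the Jeffreys divergence is its symmetrization; the sketch below (our illustration, with arbitrary example parameters) shows that swapping the arguments changes the KL value but not the Jeffreys value:

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL divergence between N(mu1, s1^2) and N(mu2, s2^2)."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def jeffreys_gauss(mu1, s1, mu2, s2):
    """Jeffreys divergence: the symmetrization of the KL distance."""
    return kl_gauss(mu1, s1, mu2, s2) + kl_gauss(mu2, s2, mu1, s1)

# Two uncertainty sources with the same mean but different scatter.
d12 = kl_gauss(0.0, 1.0, 0.0, 2.0)   # ~0.3181
d21 = kl_gauss(0.0, 2.0, 0.0, 1.0)   # ~0.8069
print(d12, d21)                       # lack of symmetry
print(jeffreys_gauss(0.0, 1.0, 0.0, 2.0))  # symmetric by construction
```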

16 pages, 2444 KiB  
Article
Non-Equilibrium Entropy and Irreversibility in Generalized Stochastic Loewner Evolution from an Information-Theoretic Perspective
by Yusuke Shibasaki and Minoru Saito
Entropy 2021, 23(9), 1098; https://doi.org/10.3390/e23091098 - 24 Aug 2021
Cited by 5 | Viewed by 4098
Abstract
In this study, we theoretically investigated a generalized stochastic Loewner evolution (SLE) driven by reversible Langevin dynamics in the context of non-equilibrium statistical mechanics. Using the ability of Loewner evolution, which enables encoding of non-equilibrium systems into equilibrium systems, we formulated the encoding mechanism of the SLE by Gibbs entropy-based information-theoretic approaches to discuss its advantages as a means to better describe non-equilibrium systems. After deriving entropy production and flux for the 2D trajectories of the generalized SLE curves, we reformulated the system’s entropic properties in terms of the Kullback–Leibler (KL) divergence. We demonstrate that this operation leads to alternative expressions of the Jarzynski equality and the second law of thermodynamics, which are consistent with the previously suggested theory of information thermodynamics. The irreversibility of the 2D trajectories is similarly discussed by decomposing the entropy into additive and non-additive parts. We numerically verified the non-equilibrium property of our model by simulating the long-time behavior of the entropic measure suggested by our formulation, referred to as the relative Loewner entropy.

12 pages, 1478 KiB  
Article
Applying the Horizontal Visibility Graph Method to Study Irreversibility of Electromagnetic Turbulence in Non-Thermal Plasmas
by Belén Acosta-Tripailao, Denisse Pastén and Pablo S. Moya
Entropy 2021, 23(4), 470; https://doi.org/10.3390/e23040470 - 16 Apr 2021
Cited by 19 | Viewed by 3582
Abstract
One of the fundamental open questions in plasma physics is the role of non-thermal particle distributions in poorly collisional plasma environments, systems commonly found throughout the Universe; e.g., the solar wind and the Earth’s magnetosphere are natural plasma physics laboratories in which turbulent phenomena can be studied. Our perspective builds on the Horizontal Visibility Graph (HVG) method, developed in recent years to analyze time series while avoiding the tedium and high computational cost of other methods. Here, we build a complex network by applying the directed HVG technique to magnetic field fluctuation time series obtained from Particle In Cell (PIC) simulations of a magnetized collisionless plasma, in order to distinguish the degree distributions and calculate the Kullback–Leibler Divergence (KLD) as a measure of relative entropy of data sets produced by processes that are not in equilibrium. First, we analyze the connectivity probability distribution for the undirected version of the HVG, finding that the Kappa distribution for low values of κ tends toward an uncorrelated time series, while the Maxwell–Boltzmann distribution shows the behavior of a correlated stochastic process. Subsequently, we investigate the degree of temporal irreversibility of magnetic fluctuations that are self-generated by the plasma, comparing the case of a thermal plasma (described by a Maxwell–Boltzmann velocity distribution function) with non-thermal Kappa distributions. We show that the KLD associated with the HVG is able to distinguish the level of reversibility associated with thermal equilibrium in the plasma, as the dissipative degree of the system increases when the value of the κ parameter decreases and the distribution function departs from the Maxwell–Boltzmann equilibrium.
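
The irreversibility measure can be sketched compactly: build the directed HVG of a time series and compare the in- and out-degree distributions with the KLD, which vanishes for statistically reversible series. The following toy version (our simplification; quadratic-time, with white noise standing in for the PIC magnetic field data) illustrates the pipeline:

```python
import numpy as np

def directed_hvg_degrees(x):
    """In- and out-degrees of the directed horizontal visibility graph.

    Nodes i < j are linked when every intermediate value lies strictly
    below min(x[i], x[j]); links are oriented by the arrow of time.
    """
    n = len(x)
    k_in, k_out = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[m] < min(x[i], x[j]) for m in range(i + 1, j)):
                k_out[i] += 1
                k_in[j] += 1
            if x[j] >= x[i]:   # x[j] screens i off from every later node
                break
    return k_in, k_out

def kld(p, q):
    """KL divergence restricted to bins where both masses are positive
    (a common toy simplification for empirical degree distributions)."""
    m = (p > 0) & (q > 0)
    return np.sum(p[m] * np.log(p[m] / q[m]))

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)          # stand-in for a B-field time series
k_in, k_out = directed_hvg_degrees(x)
bins = np.arange(0, max(k_in.max(), k_out.max()) + 2)
p_out = np.histogram(k_out, bins=bins, density=True)[0]
p_in = np.histogram(k_in, bins=bins, density=True)[0]
print(kld(p_out, p_in))                # near 0 for reversible white noise
```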

13 pages, 2317 KiB  
Article
Time Series Analysis Applied to EEG Shows Increased Global Connectivity during Motor Activation Detected in PD Patients Compared to Controls
by Ana María Maitín, Ramiro Perezzan, Diego Herráez-Aguilar, José Ignacio Serrano, María Dolores Del Castillo, Aida Arroyo, Jorge Andreo and Juan Pablo Romero
Appl. Sci. 2021, 11(1), 15; https://doi.org/10.3390/app11010015 - 22 Dec 2020
Cited by 3 | Viewed by 3643
Abstract
Background: Brain connectivity has been shown to be a key characteristic in the study of both Parkinson’s Disease (PD) and the response of patients to dopaminergic medication. Time series analysis is used here for the first time to study brain connectivity changes during motor activation in PD. Methods: A 64-channel EEG signal was registered during unilateral motor activation and resting-state in 6 non-demented PD patients, before and after the administration of levodopa, and in 6 matched healthy controls. Spectral entropy correlation, coherence, and interhemispheric divergence differences between PD patients and controls were analyzed under the assumption of stationarity of the time series. Results: During the motor activation test, PD patients showed an increased correlation coefficient (both hands p < 0.001) and a remarkable increase in coherence across the whole frequency range, compared to the generalized reduction observed in controls (both hands p < 0.001). The Kullback–Leibler Divergence (KLD) of the Spectral Entropy between brain hemispheres was observed to increase in controls (right hand p = 0.01; left hand p = 0.015) and to decrease in PD patients (right hand p = 0.02; left hand p = 0.002) with motor activation. Conclusions: Our results suggest that the oscillatory activity of the different cortex areas within healthy brains is relatively independent of the rest. PD brains exhibit stronger connectivity, which grows during motor activation; levodopa mitigates this anomalous behavior.
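
One plausible reading of the hemispheric KLD computation can be sketched under our own assumptions (Welch power spectra, 10-bin histograms of per-channel spectral entropies, and synthetic signals in place of real EEG); this is an illustration of the quantities involved, not the authors' pipeline:

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(signal, fs):
    """Shannon entropy of the normalized Welch power spectrum, in [0, 1]."""
    _, psd = welch(signal, fs=fs, nperseg=256)
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(p))

def kld(p, q):
    m = (p > 0) & (q > 0)
    return np.sum(p[m] * np.log(p[m] / q[m]))

# Hypothetical data: 32 channels per hemisphere, 10 s of "EEG" at 256 Hz.
rng = np.random.default_rng(1)
fs = 256
left = rng.standard_normal((32, 10 * fs))
right = rng.standard_normal((32, 10 * fs))

se_left = np.array([spectral_entropy(ch, fs) for ch in left])
se_right = np.array([spectral_entropy(ch, fs) for ch in right])

# KLD between the two hemispheres' distributions of spectral entropy.
bins = np.linspace(0.0, 1.0, 11)
p = np.histogram(se_left, bins=bins)[0] / len(se_left)
q = np.histogram(se_right, bins=bins)[0] / len(se_right)
print(kld(p, q))
```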

121 pages, 1378 KiB  
Article
Some Dissimilarity Measures of Branching Processes and Optimal Decision Making in the Presence of Potential Pandemics
by Niels B. Kammerer and Wolfgang Stummer
Entropy 2020, 22(8), 874; https://doi.org/10.3390/e22080874 - 8 Aug 2020
Cited by 3 | Viewed by 3544
Abstract
We compute exact values, or respectively bounds, of dissimilarity/distinguishability measures (in the sense of the Kullback–Leibler information distance, i.e., relative entropy, and some transforms of more general power divergences and Rényi divergences) between two competing discrete-time Galton–Watson branching processes with immigration (GWI) for which the offspring as well as the immigration (importation) are arbitrarily Poisson-distributed; in particular, we allow for an arbitrary type of criticality concerning extinction, and thus for non-stationarity. We apply this to optimal decision making in the context of the spread of potentially pandemic infectious diseases (such as the current COVID-19 pandemic), covering, e.g., different levels of dangerousness and different kinds of intervention/mitigation strategies. Asymptotic distinguishability behaviour and diffusion limits are investigated, too.
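
The Poisson assumption makes the elementary building block explicit: between two Poisson laws the relative entropy has the closed form KL(Poi(λ₁) ‖ Poi(λ₂)) = λ₁ log(λ₁/λ₂) − λ₁ + λ₂. A quick numerical confirmation (our illustration; the paper's process-level values and bounds build on far more than this one-step quantity):

```python
import numpy as np
from scipy.stats import poisson

def kl_poisson(lam1, lam2):
    """Closed-form relative entropy KL(Poisson(lam1) || Poisson(lam2))."""
    return lam1 * np.log(lam1 / lam2) - lam1 + lam2

# Numerical check by direct summation over a truncated support.
lam1, lam2 = 1.8, 1.2   # e.g., two competing offspring-mean scenarios
k = np.arange(0, 60)
p, q = poisson.pmf(k, lam1), poisson.pmf(k, lam2)
direct = np.sum(p * np.log(p / q))
assert np.isclose(kl_poisson(lam1, lam2), direct)
```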

22 pages, 2983 KiB  
Article
Relative Entropy and Minimum-Variance Pricing Kernel in Asset Pricing Model Evaluation
by Javier Rojo-Suárez and Ana Belén Alonso-Conde
Entropy 2020, 22(7), 721; https://doi.org/10.3390/e22070721 - 30 Jun 2020
Cited by 2 | Viewed by 2847
Abstract
Recent literature shows that many testing procedures used to evaluate asset pricing models result in spurious rejection probabilities. Model misspecification, the strong factor structure of test assets, or skewed test statistics largely explain this. In this paper we use the relative entropy of pricing kernels to provide an alternative framework for testing asset pricing models. Building on the fact that the law of one price guarantees the existence of a valid pricing kernel, we study the relationship between the mean-variance efficiency of a model’s factor-mimicking portfolio, as measured by the cross-sectional generalized least squares (GLS) R² statistic, and the relative entropy of the pricing kernel, as determined by the Kullback–Leibler divergence. In this regard, we suggest an entropy-based decomposition that accurately captures the divergence between the factor-mimicking portfolio and the minimum-variance pricing kernel resulting from the Hansen–Jagannathan bound. Our results show that, although GLS R² statistics and relative entropy are strongly correlated, the relative entropy approach allows us to explicitly decompose the explanatory power of the model into two components, namely, the relative entropy of the pricing kernel and that corresponding to its correlation with asset returns. This makes the relative entropy a versatile tool for designing robust tests in asset pricing.
(This article belongs to the Section Multidisciplinary Applications)

13 pages, 312 KiB  
Article
Canonical Divergence for Measuring Classical and Quantum Complexity
by Domenico Felice, Stefano Mancini and Nihat Ay
Entropy 2019, 21(4), 435; https://doi.org/10.3390/e21040435 - 24 Apr 2019
Cited by 10 | Viewed by 3626
Abstract
A new canonical divergence is put forward for generalizing an information-geometric measure of complexity for both classical and quantum systems. On the simplex of probability measures, it is proved that the new divergence coincides with the Kullback–Leibler divergence, which is used to quantify how much a probability measure deviates from the non-interacting states that are modeled by exponential families of probabilities. On the space of positive density operators, we prove that the same divergence reduces to the quantum relative entropy, which quantifies many-party correlations of a quantum state from a Gibbs family.
(This article belongs to the Special Issue Quantum Entropies and Complexity)
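
For the quantum side of the statement, the relative entropy of density operators is S(ρ‖σ) = Tr[ρ(log ρ − log σ)]. A small sketch (ours, with arbitrary example states):

```python
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)] for density operators."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

# Example states: a mixed qubit state and the maximally mixed state.
rho = np.array([[0.9, 0.1],
                [0.1, 0.1]])       # trace 1, positive definite
sigma = np.eye(2) / 2
print(quantum_relative_entropy(rho, sigma))
# For commuting (diagonal) states this reduces to the classical
# Kullback-Leibler divergence of the eigenvalue distributions.
```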
10 pages, 995 KiB  
Article
Divergence from, and Convergence to, Uniformity of Probability Density Quantiles
by Robert G. Staudte and Aihua Xia
Entropy 2018, 20(5), 317; https://doi.org/10.3390/e20050317 - 25 Apr 2018
Cited by 4 | Viewed by 4100
Abstract
We demonstrate that questions of convergence and divergence regarding shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The class of pdQs consists of densities of continuous distributions with a common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences of these pdQs from uniformity are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we apply the mapping repeatedly and find that further applications of it are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)
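
The pdQ construction is direct to implement: compose the density with its own quantile function, normalize on the unit interval, and measure the Kullback–Leibler divergence from the uniform density. A numerical sketch for the exponential distribution (our example; the analytic value is log 2 − 1/2):

```python
import numpy as np

# pdQ: compose the density with its quantile function, normalize on [0, 1].
u = np.linspace(1e-6, 1 - 1e-6, 200_000)
du = u[1] - u[0]

# Exponential(1): f(x) = exp(-x), Q(u) = -log(1 - u), so f(Q(u)) = 1 - u;
# after normalization the pdQ is 2 * (1 - u).
fQ = 1.0 - u
pdq = fQ / (fQ.sum() * du)

# Kullback-Leibler divergence from the uniform density on [0, 1].
kl_from_uniform = np.sum(pdq * np.log(pdq)) * du
print(kl_from_uniform)   # analytic value: log(2) - 1/2 ~ 0.1931
```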

18 pages, 291 KiB  
Review
Relative Entropy in Biological Systems
by John C. Baez and Blake S. Pollard
Entropy 2016, 18(2), 46; https://doi.org/10.3390/e18020046 - 2 Feb 2016
Cited by 49 | Viewed by 10963
Abstract
In this paper we review various information-theoretic characterizations of the approach to equilibrium in biological systems. The replicator equation, evolutionary game theory, Markov processes and chemical reaction networks all describe the dynamics of a population or probability distribution. Under suitable assumptions, the distribution will approach an equilibrium with the passage of time. Relative entropy—that is, the Kullback–Leibler divergence, or various generalizations of this—provides a quantitative measure of how far from equilibrium the system is. We explain various theorems that give conditions under which relative entropy is nonincreasing. In biochemical applications these results can be seen as versions of the Second Law of Thermodynamics, stating that free energy can never increase with the passage of time. In ecological applications, they make precise the notion that a population gains information from its environment as it approaches equilibrium. Full article
(This article belongs to the Special Issue Information and Entropy in Biological Systems)
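
The nonincrease of relative entropy can be observed numerically. Under our simplifying assumptions (a payoff matrix A = −I whose interior evolutionarily stable state is uniform, and explicit Euler stepping), the replicator flow makes KL(q ‖ p(t)) monotonically nonincreasing:

```python
import numpy as np

def replicator_step(x, A, dt):
    """One Euler step of the replicator equation dx_i/dt = x_i((Ax)_i - x.Ax)."""
    fitness = A @ x
    x = x + dt * x * (fitness - x @ fitness)
    return x / x.sum()   # re-normalize against numerical drift

def kl(q, p):
    return np.sum(q * np.log(q / p))

A = -np.eye(3)                 # a game whose interior ESS is uniform
q = np.ones(3) / 3             # the evolutionarily stable equilibrium
p = np.array([0.7, 0.2, 0.1])  # initial population distribution

divergences = []
for _ in range(2000):
    divergences.append(kl(q, p))
    p = replicator_step(p, A, dt=0.01)

# Relative entropy from the equilibrium is nonincreasing along the flow.
assert all(a >= b - 1e-12 for a, b in zip(divergences, divergences[1:]))
```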
41 pages, 424 KiB  
Article
Information Geometry Formalism for the Spatially Homogeneous Boltzmann Equation
by Bertrand Lods and Giovanni Pistone
Entropy 2015, 17(6), 4323-4363; https://doi.org/10.3390/e17064323 - 19 Jun 2015
Cited by 20 | Viewed by 5685
Abstract
Information Geometry generalizes to infinite dimension by modeling the tangent space of the relevant manifold of probability densities with exponential Orlicz spaces. We review here several properties of the exponential manifold on a suitable set Ɛ of mutually absolutely continuous densities. We study in particular the fine properties of the Kullback–Leibler divergence in this context. We also show that this setting is well suited for the study of the spatially homogeneous Boltzmann equation if Ɛ is a set of positive densities with finite relative entropy with respect to the Maxwell density. More precisely, we analyze the Boltzmann operator in the geometric setting from the point of view of its Maxwell weak form, as a composition of elementary operations in the exponential manifold, namely tensor product, conditioning, and marginalization, and we prove in a geometric way the basic facts, i.e., the H-theorem. We also illustrate the robustness of our method by discussing, besides the Kullback–Leibler divergence, the Hyvärinen divergence as well. This requires us to generalize our approach to Orlicz–Sobolev spaces to include derivatives.
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)

39 pages, 347 KiB  
Article
Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems
by Ali Mohammad-Djafari
Entropy 2015, 17(6), 3989-4027; https://doi.org/10.3390/e17063989 - 12 Jun 2015
Cited by 45 | Viewed by 9392
Abstract
The main content of this review article is first to review the main inference tools using Bayes rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, Fisher information and its corresponding geometries. For each of these tools, the precise context of their use is described. The second part of the paper is focused on the ways these tools have been used in data, signal and image processing and in the inverse problems which arise in different physical sciences and engineering applications. A few examples of the applications are described: entropy in independent components analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, the Bayesian inference for general inverse problems. Some original materials concerning the approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods are also presented. VBA is used for proposing an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP), as well as the different expectation-maximization (EM) algorithms, as particular cases.
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)