
Information and Divergence Measures

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 September 2022) | Viewed by 31369

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Prof. Dr. Alex Karagrigoriou
Guest Editor
Laboratory of Statistics and Data Analysis, Department of Statistics and Actuarial-Financial Mathematics, University of the Aegean, GR-83200 Karlovasi, Greece
Interests: model selection; applied probability; reliability theory; medical statistics; time series; biostatistics

Dr. Andreas Makrides
Guest Editor
Laboratory of Statistics and Data Analysis, University of the Aegean, GR-83200 Karlovasi, Greece
Interests: stochastic modeling; stochastic processes; applied probability; mathematical statistics; semi-Markov processes; reliability theory; multi-state systems; entropy and divergence; goodness-of-fit tests; control charts–statistical quality control

Special Issue Information

Dear Colleagues,

The general topic of information and divergence measures can be considered a broad scientific area that aims to develop methods and techniques for inference and for the modelling of phenomena across many scientific fields.

The concept of distance is important in establishing the degree of similarity and the degree of closeness among functions, populations, or distributions. As a result, distances are related to inferential statistics, including both estimation and hypothesis testing problems, and to modelling, with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many more. Thus, the importance of entropy and divergence measures across so many scientific fields makes the topic of great interest to scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and others.

This issue intends to cover recent developments in information and divergence measures, presenting new theoretical results not previously available in the literature, as well as solutions to important practical problems and case studies illustrating the application methodology.

The issue is expected to be a collective work by a number of leading scientists, (data) analysts, statisticians, mathematicians, computer scientists, information theory experts, and engineers who have been working at the forefront of information theory and divergence measures.

All manuscripts in the issue are written by leading researchers and practitioners in their respective fields of expertise and present a wealth of innovative methods, approaches, and solutions not previously covered in the literature.

Prof. Dr. Alex Karagrigoriou
Dr. Andreas Makrides
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • information theory
  • divergence measures
  • statistical inference
  • survival analysis
  • actuarial science
  • multivariate analysis

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (14 papers)


Editorial


3 pages, 195 KiB  
Editorial
Information and Divergence Measures
by Alex Karagrigoriou and Andreas Makrides
Entropy 2023, 25(4), 683; https://doi.org/10.3390/e25040683 - 19 Apr 2023
Viewed by 1119
Abstract
The present Special Issue of Entropy, entitled Information and Divergence Measures, covers various aspects and applications in the general area of Information and Divergence Measures [...] Full article
(This article belongs to the Special Issue Information and Divergence Measures)

Research


16 pages, 305 KiB  
Article
Robust Z-Estimators for Semiparametric Moment Condition Models
by Aida Toma
Entropy 2023, 25(7), 1013; https://doi.org/10.3390/e25071013 - 30 Jun 2023
Viewed by 884
Abstract
In the present paper, we introduce a class of robust Z-estimators for moment condition models. These new estimators can be seen as robust alternatives to the minimum empirical divergence estimators. By using the multidimensional Huber function, we first define robust estimators of the element that realizes the supremum in the dual form of the divergence. A linear relationship between the influence function of a minimum empirical divergence estimator and the influence function of the estimator of the element that realizes the supremum in the dual form of the divergence leads to the idea of defining new Z-estimators for the parameter of the model by using robust estimators in the dual form of the divergence. The asymptotic properties of the proposed estimators are proven, including consistency and asymptotic normality. Then, the influence functions of the estimators are derived and their robustness is demonstrated. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
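A quick illustration for readers: the multidimensional Huber function referred to above is commonly taken to be the map that leaves a vector unchanged inside a ball of radius c and radially projects it onto that ball otherwise, which is what bounds the influence of outlying observations. A minimal NumPy sketch of this standard form (the tuning constant and test vectors are illustrative, not taken from the paper):

```python
import numpy as np

def huber_multidim(x, c=1.345):
    """A standard multidimensional Huber function: identity inside the ball
    ||x|| <= c, radial projection onto the ball outside it."""
    norm = np.linalg.norm(x)
    return x if norm <= c else (c / norm) * x

# Small vectors pass through unchanged; large ones are capped in norm,
# which bounds the influence of outliers on the resulting estimators.
print(huber_multidim(np.array([0.5, 0.3])))    # unchanged
print(huber_multidim(np.array([10.0, -4.0])))  # rescaled to norm c
```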
21 pages, 628 KiB  
Article
Transversality Conditions for Geodesics on the Statistical Manifold of Multivariate Gaussian Distributions
by Trevor Herntier and Adrian M. Peter
Entropy 2022, 24(11), 1698; https://doi.org/10.3390/e24111698 - 21 Nov 2022
Cited by 2 | Viewed by 1821
Abstract
We consider the problem of finding the closest multivariate Gaussian distribution on a constraint surface of all Gaussian distributions to a given distribution. Previous research regarding geodesics on the multivariate Gaussian manifold has focused on finding closed-form, shortest-path distances between two fixed distributions on the manifold, often restricting the parameters to obtain the desired solution. We demonstrate how to employ the techniques of the calculus of variations with a variable endpoint to search for the closest distribution from a family of distributions generated via a constraint set on the parameter manifold. Furthermore, we examine the intermediate distributions along the learned geodesics which provide insight into uncertainty evolution along the paths. Empirical results elucidate our formulations, with visual illustrations concretely exhibiting dynamics of 1D and 2D Gaussian distributions. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
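For intuition about geodesic distances on Gaussian manifolds, the univariate case admits a standard closed form coming from the hyperbolic geometry of the (mu, sigma) half-plane; a small numerical sketch (the parameters are illustrative, and this is the fixed-endpoint distance, not the variable-endpoint problem treated in the paper):

```python
import numpy as np

def fisher_rao_gaussian_1d(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao geodesic distance between two univariate Gaussians,
    obtained by mapping (mu, sigma) to the hyperbolic upper half-plane."""
    delta = ((mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2) / (4.0 * sigma1 * sigma2)
    return np.sqrt(2.0) * np.arccosh(1.0 + delta)

# The same shift in the mean is "shorter" when the variance is large,
# because the Fisher metric scales displacements by 1/sigma.
print(fisher_rao_gaussian_1d(0.0, 1.0, 1.0, 1.0))
print(fisher_rao_gaussian_1d(0.0, 3.0, 1.0, 3.0))
```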
11 pages, 275 KiB  
Article
Some Properties of Weighted Tsallis and Kaniadakis Divergences
by Răzvan-Cornel Sfetcu, Sorina-Cezarina Sfetcu and Vasile Preda
Entropy 2022, 24(11), 1616; https://doi.org/10.3390/e24111616 - 5 Nov 2022
Cited by 2 | Viewed by 1533
Abstract
We are concerned with the weighted Tsallis and Kaniadakis divergences between two measures. More precisely, we find inequalities between these divergences and the Tsallis and Kaniadakis logarithms, prove that they satisfy bounds similar to those satisfied by the Kullback–Leibler divergence, and show that they are pseudo-additive. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
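As background, the unweighted Tsallis divergence of order q between discrete distributions recovers the Kullback–Leibler divergence in the limit q → 1; weighted variants insert a weight function into the sum. A minimal sketch of the unweighted case (this is the classical definition, not the paper's weighted one):

```python
import numpy as np

def tsallis_divergence(p, r, q):
    """Unweighted Tsallis divergence of order q between discrete distributions."""
    p, r = np.asarray(p, float), np.asarray(r, float)
    return (np.sum(p ** q * r ** (1.0 - q)) - 1.0) / (q - 1.0)

def kl_divergence(p, r):
    p, r = np.asarray(p, float), np.asarray(r, float)
    return np.sum(p * np.log(p / r))

p, r = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
# As q -> 1 the Tsallis divergence approaches the Kullback-Leibler divergence.
for q in (0.5, 0.9, 0.99, 1.01):
    print(q, tsallis_divergence(p, r, q))
print("KL:", kl_divergence(p, r))
```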
13 pages, 6174 KiB  
Communication
Spatial Information-Theoretic Optimal LPI Radar Waveform Design
by Jun Chen, Jie Wang, Yidong Zhang, Fei Wang and Jianjiang Zhou
Entropy 2022, 24(11), 1515; https://doi.org/10.3390/e24111515 - 24 Oct 2022
Cited by 4 | Viewed by 1487
Abstract
In this paper, the design of low probability of intercept (LPI) radar waveforms considers not only the performance of passive interception systems (PISs), but also radar detection and resolution performance. Waveform design is an important consideration for the LPI ability of radar. Since information theory has a powerful performance-bound description ability from the perspective of information flow, LPI waveforms are designed in this paper within the constraints of the detection performance metrics of radar and PISs, both of which are measured by the Kullback–Leibler divergence, and the resolution performance metric, which is measured by joint entropy. The designed optimization model of LPI waveforms can be solved using the sequential quadratic programming (SQP) method. Simulation results verify that the designed LPI waveforms not only have satisfactory target-detection and resolution performance, but also have superior low-interception performance against PISs. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
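As a pointer to the kind of metric involved, the Kullback–Leibler divergence between two zero-mean Gaussian hypotheses (noise only versus signal plus noise) has a well-known closed form; the sketch below is a generic illustration with placeholder covariance matrices, not the radar signal model or the SQP design procedure of the paper.

```python
import numpy as np

def kl_gaussian(mu1, cov1, mu2, cov2):
    """KL( N(mu1, cov1) || N(mu2, cov2) ) for multivariate Gaussians."""
    d = len(mu1)
    cov2_inv = np.linalg.inv(cov2)
    diff = np.asarray(mu2) - np.asarray(mu1)
    return 0.5 * (np.trace(cov2_inv @ cov1) + diff @ cov2_inv @ diff - d
                  + np.log(np.linalg.det(cov2) / np.linalg.det(cov1)))

# Illustrative hypotheses: H0 noise only, H1 noise plus a correlated signal component.
noise = np.eye(3)
signal_plus_noise = np.eye(3) + 0.5 * np.ones((3, 3))
print(kl_gaussian(np.zeros(3), signal_plus_noise, np.zeros(3), noise))
```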
18 pages, 295 KiB  
Article
Probabilistic Pairwise Model Comparisons Based on Bootstrap Estimators of the Kullback–Leibler Discrepancy
by Andres Dajles and Joseph Cavanaugh
Entropy 2022, 24(10), 1483; https://doi.org/10.3390/e24101483 - 18 Oct 2022
Cited by 2 | Viewed by 1485
Abstract
When choosing between two candidate models, classical hypothesis testing presents two main limitations: first, the models being tested have to be nested, and second, one of the candidate models must subsume the structure of the true data-generating model. Discrepancy measures have been used as an alternative method to select models without the need to rely upon the aforementioned assumptions. In this paper, we utilize a bootstrap approximation of the Kullback–Leibler discrepancy (BD) to estimate the probability that the fitted null model is closer to the underlying generating model than the fitted alternative model. We propose correcting for the bias of the BD estimator either by adding a bootstrap-based correction or by adding the number of parameters in the candidate model. We exemplify the effect of these corrections on the estimator of the discrepancy probability and explore their behavior in different model comparison settings. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
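A rough sketch of the underlying idea (not the paper's exact bootstrap discrepancy estimator or its bias corrections): refit both candidate models on each bootstrap resample, score each fit against the observed sample with an empirical Kullback–Leibler-type discrepancy (the average negative log-density), and report the proportion of resamples in which the null model comes out closer. The candidate models and sample sizes below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_t(df=5, size=200)  # observed data (generated from a t distribution here)

def discrepancy(sample, logpdf):
    """Empirical KL-type discrepancy: average negative log-density of the sample."""
    return -np.mean(logpdf(sample))

B, null_closer = 200, 0
for _ in range(B):
    boot = rng.choice(x, size=x.size, replace=True)
    mu, sd = stats.norm.fit(boot)          # null model: Gaussian
    df, loc, sc = stats.t.fit(boot)        # alternative model: Student t
    d_null = discrepancy(x, lambda s: stats.norm.logpdf(s, mu, sd))
    d_alt = discrepancy(x, lambda s: stats.t.logpdf(s, df, loc, sc))
    null_closer += d_null < d_alt
print("estimated P(null model closer):", null_closer / B)
```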
19 pages, 342 KiB  
Article
Some Information Measures Properties of the GOS-Concomitants from the FGM Family
by Florentina Suter, Ioana Cernat and Mihai Drăgan
Entropy 2022, 24(10), 1361; https://doi.org/10.3390/e24101361 - 26 Sep 2022
Cited by 5 | Viewed by 1384
Abstract
In this paper we recall, extend and compute some information measures for the concomitants of the generalized order statistics (GOS) from the Farlie–Gumbel–Morgenstern (FGM) family. We focus on two types of information measures: some related to Shannon entropy, and some related to Tsallis entropy. Among the information measures considered are residual and past entropies which are important in a reliability context. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
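For readers unfamiliar with the reliability notions mentioned, the residual entropy at time t is the Shannon entropy of the conditional density f(x)/S(t) on (t, ∞); for the exponential distribution it is constant in t, which gives a convenient numerical sanity check. The distribution and time points below are illustrative, not the FGM concomitant setting of the paper.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def residual_entropy(dist, t):
    """Shannon entropy of the residual lifetime density f(x)/S(t) on (t, infinity)."""
    surv = dist.sf(t)
    upper = dist.isf(1e-12)  # practical upper limit to avoid underflow deep in the tail
    integrand = lambda x: -(dist.pdf(x) / surv) * np.log(dist.pdf(x) / surv)
    return quad(integrand, t, upper)[0]

lam = 2.0
expo = stats.expon(scale=1.0 / lam)
# For Exp(lambda) the residual entropy equals 1 - log(lambda) at every t.
for t in (0.0, 0.5, 2.0):
    print(t, residual_entropy(expo, t), 1.0 - np.log(lam))
```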
23 pages, 771 KiB  
Article
Application of Statistical K-Means Algorithm for University Academic Evaluation
by Daohua Yu, Xin Zhou, Yu Pan, Zhendong Niu and Huafei Sun
Entropy 2022, 24(7), 1004; https://doi.org/10.3390/e24071004 - 20 Jul 2022
Cited by 5 | Viewed by 2258
Abstract
With the globalization of higher education, academic evaluation is increasingly valued by scientific and educational circles. Although the number of published papers on academic evaluation methods is increasing, previous research mainly focused on assigning different weights to various indicators, which can be subjective and limited. This paper investigates the evaluation of academic performance by using the statistical K-means (SKM) algorithm to produce clusters. The core idea is mapping the evaluation data from Euclidean space to Riemannian space, in which the geometric structure can be used to obtain accurate clustering results. The method can adapt to different indicators and make full use of big data. By using the K-means algorithm based on statistical manifolds, the academic evaluation results of universities can be obtained. Furthermore, through simulation experiments on the top 20 universities of China with the traditional K-means, GMM and SKM algorithms, respectively, we analyze the advantages and disadvantages of the different methods. We also test the three algorithms on a UCI ML dataset. The simulation results show the advantages of the SKM algorithm. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
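A compressed sketch of the clustering idea only (not the paper's full SKM algorithm): summarize each entity by a fitted univariate Gaussian, measure dissimilarity with the Fisher–Rao distance on the Gaussian manifold, and alternate assignment and centre updates. The centre update below simply averages the (mu, sigma) coordinates, a crude stand-in for a proper Fréchet mean, and the data are synthetic.

```python
import numpy as np

def fisher_rao(a, b):
    """Fisher-Rao distance between univariate Gaussians a = (mu, sigma), b = (mu, sigma)."""
    (m1, s1), (m2, s2) = a, b
    delta = ((m1 - m2) ** 2 + 2.0 * (s1 - s2) ** 2) / (4.0 * s1 * s2)
    return np.sqrt(2.0) * np.arccosh(1.0 + delta)

def manifold_kmeans(points, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        labels = np.array([np.argmin([fisher_rao(p, c) for c in centers]) for p in points])
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Each row is one entity summarized by (mean, spread) of its indicator scores (synthetic).
points = np.array([[0.20, 0.10], [0.30, 0.12], [0.25, 0.09],
                   [0.80, 0.30], [0.90, 0.35], [0.85, 0.28]])
print(manifold_kmeans(points, k=2))
```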
26 pages, 449 KiB  
Article
A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions
by Nizar Bouhlel and David Rousseau
Entropy 2022, 24(6), 838; https://doi.org/10.3390/e24060838 - 17 Jun 2022
Cited by 5 | Viewed by 2727
Abstract
This paper introduces a closed-form expression for the Kullback–Leibler divergence (KLD) between two central multivariate Cauchy distributions (MCDs), which have recently been used in different signal and image processing applications where non-Gaussian models are needed. In this overview, the MCDs are surveyed and some new results and properties are derived and discussed for the KLD. In addition, the KLD for MCDs is shown to be expressible as a function of the Lauricella D-hypergeometric series F_D^(p). Finally, a comparison is made between the Monte Carlo sampling method used to approximate the KLD and the numerical value of its closed-form expression. The approximation of the KLD by the Monte Carlo sampling method is shown to converge to its theoretical value as the number of samples goes to infinity. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
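The Monte Carlo baseline mentioned in the abstract is straightforward to reproduce: draw samples from the first central multivariate Cauchy distribution (a multivariate t with one degree of freedom) and average the log-density ratio. The scatter matrices below are illustrative; by the law of large numbers the estimate approaches the true KLD, and hence the paper's closed form, as the sample size grows.

```python
import numpy as np
from scipy.special import gammaln

def mcd_logpdf(x, scatter_inv, logdet_scatter, d):
    """Log-density of a central multivariate Cauchy (multivariate t with nu = 1)."""
    q = np.einsum('ij,jk,ik->i', x, scatter_inv, x)
    logc = (gammaln((d + 1) / 2) - gammaln(0.5)
            - 0.5 * d * np.log(np.pi) - 0.5 * logdet_scatter)
    return logc - 0.5 * (d + 1) * np.log1p(q)

def mc_kld_cauchy(sig_p, sig_q, n=200_000, seed=0):
    rng = np.random.default_rng(seed)
    d = sig_p.shape[0]
    # Sample from the first MCD: a Gaussian draw divided by the square root of a chi-square(1).
    z = rng.multivariate_normal(np.zeros(d), sig_p, size=n)
    w = rng.chisquare(df=1, size=n)
    x = z / np.sqrt(w)[:, None]
    lp = mcd_logpdf(x, np.linalg.inv(sig_p), np.log(np.linalg.det(sig_p)), d)
    lq = mcd_logpdf(x, np.linalg.inv(sig_q), np.log(np.linalg.det(sig_q)), d)
    return np.mean(lp - lq)

sig_p = np.array([[1.0, 0.3], [0.3, 1.0]])
sig_q = np.array([[2.0, 0.0], [0.0, 0.5]])
print(mc_kld_cauchy(sig_p, sig_q))
```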
28 pages, 514 KiB  
Article
Robust Test Statistics Based on Restricted Minimum Rényi’s Pseudodistance Estimators
by María Jaenada, Pedro Miranda and Leandro Pardo
Entropy 2022, 24(5), 616; https://doi.org/10.3390/e24050616 - 28 Apr 2022
Cited by 6 | Viewed by 1790
Abstract
Rao's score, Wald, and likelihood ratio tests are the most common procedures for testing hypotheses in parametric models. None of the three test statistics is uniformly superior to the other two in relation to the power function; moreover, they are first-order equivalent and asymptotically optimal. Conversely, these three classical tests present serious robustness problems, as they are based on the maximum likelihood estimator, which is highly non-robust. To overcome this drawback, some test statistics have been introduced in the literature based on robust estimators, such as robust generalized Wald-type and Rao-type tests based on minimum divergence estimators. In this paper, restricted minimum Rényi's pseudodistance estimators are defined, and their asymptotic distribution and influence function are derived. Further, robust Rao-type and divergence-based tests based on minimum Rényi's pseudodistance and restricted minimum Rényi's pseudodistance estimators are considered, and the asymptotic properties of the new families of test statistics are obtained. Finally, the robustness of the proposed estimators and test statistics is empirically examined through a simulation study, and illustrative applications to real-life data are analyzed. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
13 pages, 355 KiB  
Article
A Skew Logistic Distribution for Modelling COVID-19 Waves and Its Evaluation Using the Empirical Survival Jensen–Shannon Divergence
by Mark Levene
Entropy 2022, 24(5), 600; https://doi.org/10.3390/e24050600 - 25 Apr 2022
Cited by 2 | Viewed by 2438
Abstract
A novel yet simple extension of the symmetric logistic distribution is proposed by introducing a skewness parameter. It is shown how the three parameters of the ensuing skew logistic distribution may be estimated using maximum likelihood. The skew logistic distribution is then extended to the skew bi-logistic distribution to allow the modelling of multiple waves in epidemic time series data. The proposed skew-logistic model is validated on COVID-19 data from the UK, and is evaluated for goodness-of-fit against the logistic and normal distributions using the recently formulated empirical survival Jensen–Shannon divergence (ESJS) and the Kolmogorov–Smirnov two-sample test statistic (KS2). We employ 95% bootstrap confidence intervals to assess the improvement in goodness-of-fit of the skew logistic distribution over the other distributions. The obtained confidence intervals for the ESJS are narrower than those for the KS2 on using this dataset, implying that the ESJS is more powerful than the KS2. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
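The bootstrap comparison described in the abstract can be mimicked for the Kolmogorov–Smirnov two-sample statistic with standard tools; the sketch below uses synthetic data and compares a logistic and a normal fit, and it does not implement the ESJS or the paper's skew logistic model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.logistic(loc=5.0, scale=1.2, size=300)  # synthetic stand-in for a wave series

def ks2_against_fit(sample, dist):
    """KS two-sample statistic between the data and a draw from the fitted model."""
    params = dist.fit(sample)
    model_draw = dist.rvs(*params, size=sample.size, random_state=rng)
    return stats.ks_2samp(sample, model_draw).statistic

# 95% bootstrap confidence intervals of KS2 for two competing fitted distributions:
# lower, narrower intervals indicate a better and more stable goodness-of-fit.
for dist in (stats.logistic, stats.norm):
    boot = [ks2_against_fit(rng.choice(data, data.size, replace=True), dist)
            for _ in range(300)]
    print(dist.name, np.percentile(boot, [2.5, 97.5]))
```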
31 pages, 469 KiB  
Article
Information Inequalities via Submodularity and a Problem in Extremal Graph Theory
by Igal Sason
Entropy 2022, 24(5), 597; https://doi.org/10.3390/e24050597 - 25 Apr 2022
Cited by 3 | Viewed by 2822
Abstract
The present paper offers, in its first part, a unified approach for the derivation of families of inequalities for set functions which satisfy sub/supermodularity properties. It applies this approach to the derivation of information inequalities with Shannon information measures. Connections of the considered approach to a generalized version of Shearer's lemma and to other related results in the literature are discussed. Some of the derived information inequalities are new, and known results (such as a generalized version of Han's inequality) are also reproduced in a simple and unified way. In its second part, this paper applies the generalized Han's inequality to analyze a problem in extremal graph theory. This problem is motivated and analyzed from the perspective of information theory, and the analysis leads to generalized and refined bounds. The two parts of this paper are meant to be independently accessible to the reader. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
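Han's inequality, the special case applied in the paper's second part, bounds the joint entropy of n variables by the average of the joint entropies of the n leave-one-out subsets scaled by 1/(n−1); a brute-force check on a random joint distribution of three binary variables (the distribution is random and purely illustrative):

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
pxyz = rng.random((2, 2, 2))
pxyz /= pxyz.sum()                       # random joint pmf of (X1, X2, X3)

n = pxyz.ndim
joint = entropy(pxyz.ravel())
leave_one_out = [entropy(pxyz.sum(axis=i).ravel()) for i in range(n)]
# Han's inequality: H(X1,...,Xn) <= (1/(n-1)) * sum_i H(all variables except X_i).
print(joint, "<=", sum(leave_one_out) / (n - 1))
```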
17 pages, 894 KiB  
Article
Contingency Table Analysis and Inference via Double Index Measures
by Christos Meselidis and Alex Karagrigoriou
Entropy 2022, 24(4), 477; https://doi.org/10.3390/e24040477 - 29 Mar 2022
Cited by 3 | Viewed by 2167
Abstract
In this work, we focus on a general family of measures of divergence for estimation and testing with emphasis on conditional independence in cross tabulations. For this purpose, a restricted minimum divergence estimator is used for the estimation of parameters under constraints and a new double index (dual) divergence test statistic is introduced and thoroughly examined. The associated asymptotic theory is provided and the advantages and practical implications are explored via simulation studies. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
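For orientation, the classical single-index Cressie–Read power-divergence family, which the paper's double-index statistic generalizes, is available in SciPy; a minimal independence-test sketch on an illustrative 2×3 table (the counts are made up):

```python
import numpy as np
from scipy.stats import power_divergence

observed = np.array([[30, 15, 25],
                     [20, 35, 10]], dtype=float)

# Expected counts under independence: outer product of the marginals over the grand total.
row, col, total = observed.sum(axis=1), observed.sum(axis=0), observed.sum()
expected = np.outer(row, col) / total

# ddof=3 turns the default 6-cell degrees of freedom into (2-1)*(3-1)=2 for independence.
# lambda_=1 is Pearson's chi-square, 0 the likelihood-ratio statistic, 2/3 Cressie-Read.
for lam in (1.0, 0.0, 2.0 / 3.0):
    stat, pval = power_divergence(observed.ravel(), expected.ravel(), ddof=3, lambda_=lam)
    print(lam, stat, pval)
```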
21 pages, 1068 KiB  
Article
Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences
by Frank Nielsen
Entropy 2022, 24(3), 421; https://doi.org/10.3390/e24030421 - 17 Mar 2022
Cited by 11 | Viewed by 5388
Abstract
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define the duo Fenchel–Young divergence and report a majorization condition on its pair of strictly convex generators, which guarantees that this divergence is always non-negative. The duo Fenchel–Young divergence is also equivalent to a duo Bregman divergence. We show how to use these duo divergences by calculating the Kullback–Leibler divergence between densities of truncated exponential families with nested supports, and report a formula for the Kullback–Leibler divergence between truncated normal distributions. Finally, we prove that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences. Full article
(This article belongs to the Special Issue Information and Divergence Measures)
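As a numerical companion to such closed-form results, the Kullback–Leibler divergence between two truncated normals with nested supports (the first supported on a subinterval of the second's support) can be checked by direct quadrature; the parameters below are illustrative, and the code does not use the paper's duo Bregman formula.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def make_truncnorm(mu, sigma, lo, hi):
    """Normal(mu, sigma) truncated to [lo, hi] (truncnorm takes standardized bounds)."""
    return stats.truncnorm((lo - mu) / sigma, (hi - mu) / sigma, loc=mu, scale=sigma)

def kl_by_quadrature(p, q, lower, upper):
    """KL(p || q); p is supported on [lower, upper], which is nested in q's support."""
    integrand = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    return quad(integrand, lower, upper)[0]

p = make_truncnorm(0.0, 1.0, 0.0, 1.0)    # standard normal truncated to [0, 1]
q = make_truncnorm(0.5, 2.0, -1.0, 2.0)   # normal (mean 0.5, sd 2) truncated to [-1, 2]
print(kl_by_quadrature(p, q, 0.0, 1.0))
```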