Editorial

Information and Divergence Measures

by Alex Karagrigoriou 1,* and Andreas Makrides 1,2

1 Laboratory of Statistics and Data Analysis, Department of Statistics and Actuarial-Financial Mathematics, University of the Aegean, GR-83200 Karlovasi, Greece
2 Department of Computer Science, University of Nicosia, CY-1700 Nicosia, Cyprus
* Author to whom correspondence should be addressed.
Entropy 2023, 25(4), 683; https://doi.org/10.3390/e25040683
Submission received: 31 March 2023 / Accepted: 13 April 2023 / Published: 19 April 2023
(This article belongs to the Special Issue Information and Divergence Measures)
The present Special Issue of Entropy, entitled Information and Divergence Measures, covers various aspects and applications of information and divergence measures.
Measures of information appear everywhere in probability and statistics and play a fundamental role in communication theory. They have a long history, dating back to the work of Fisher, Shannon, and Kullback. Many such measures exist, each claiming to capture the concept of information or simply quantifying the divergence or distance between two probability distributions, and numerous generalizations of these measures have also been proposed.
The concept of distance is important in establishing the degree of similarity and/or closeness between functions, populations, and distributions. The intense engagement of many authors with entropy and divergence measures demonstrates the significant role these measures play in the sciences. Indeed, distances and entropies are related to inferential statistics, including both estimation and hypothesis testing problems [1,2,3,4,5,6], model selection criteria [7,8,9], and probabilistic and statistical modelling, with applications in multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, change-point problems, etc. [10,11,12,13,14,15]. The significance of entropy and divergence measures in these and many other scientific fields thus makes them a topic of great interest to scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, etc.
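To make the preceding discussion concrete, the following minimal Python sketch computes, for discrete distributions, a few of the measures referred to above: the Kullback–Leibler divergence, its symmetrised and bounded Jensen–Shannon variant, and the Tsallis entropy (which recovers the Shannon entropy as the index q tends to 1). This is an illustrative sketch only; the function names and the toy distributions p and q are our own choices and are not taken from any of the cited papers.

import numpy as np

def kl_divergence(p, q):
    # Kullback–Leibler divergence KL(p || q) for discrete distributions.
    # Terms with p[i] == 0 contribute zero by convention; if q[i] == 0 while
    # p[i] > 0, the divergence is infinite.
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # Jensen–Shannon divergence: a symmetrised variant of KL, bounded by log 2.
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def tsallis_entropy(p, q_index=2.0):
    # Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1); the limit q -> 1
    # gives the Shannon entropy -sum_i p_i log p_i.
    p = np.asarray(p, dtype=float); p = p / p.sum()
    if np.isclose(q_index, 1.0):
        mask = p > 0
        return float(-np.sum(p[mask] * np.log(p[mask])))
    return float((1.0 - np.sum(p ** q_index)) / (q_index - 1.0))

if __name__ == "__main__":
    p, q = [0.2, 0.5, 0.3], [0.1, 0.4, 0.5]
    print(kl_divergence(p, q))            # asymmetric: KL(p||q) != KL(q||p) in general
    print(js_divergence(p, q))            # symmetric and bounded by log 2
    print(tsallis_entropy(p, q_index=2.0))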
All the articles included in this Special Issue were reviewed and accepted for publication because they contribute research of the highest quality and, at the same time, highlight the diversity of topics in this scientific area. The issue presents twelve original contributions that span a wide range of topics.

In [16], the authors demonstrate how to employ the techniques of the calculus of variations with a variable endpoint to search for the closest distribution from a family of distributions generated via a constraint set on the parameter manifold. In [17], the authors consider weighted Tsallis and Kaniadakis divergences and establish inequalities between these measures and Tsallis and Kaniadakis logarithms. In [18], low probability of intercept (LPI) waveforms are designed within the constraints of the detection performance metrics of the radar and of PISs, both measured by the Kullback–Leibler divergence, and of the resolution performance metric, measured by joint entropy, with the solution based on the sequential quadratic programming method. In [19], a bootstrap approximation of the Kullback–Leibler discrepancy is utilized to estimate the probability that the fitted null model is closer to the underlying generating model than the fitted alternative model. The authors also propose a bias correction, either by adding a bootstrap-based correction or by adding the number of parameters in the candidate model.

In [20], the authors extend and compute information measures related to Shannon and Tsallis entropies for the concomitants of the generalized order statistics from the Farlie–Gumbel–Morgenstern family. In [21], the evaluation of academic performance using the statistical K-means (SKM) algorithm to produce clusters is investigated; a simulation experiment on the top 20 universities in China shows the advantages of the SKM algorithm over traditional methods. In [22], the authors introduce a closed-form expression for the Kullback–Leibler divergence between two central multivariate Cauchy distributions, which are used in various signal and image processing applications where non-Gaussian models are needed. In [23], restricted minimum Rényi's pseudodistance estimators are defined, and their asymptotic distribution and influence function are derived. Furthermore, robust Rao-type and divergence-type tests based on minimum Rényi's pseudodistance and restricted minimum Rényi's pseudodistance estimators are considered, and their asymptotic properties are obtained.

In [24], a skew logistic distribution is proposed and extended to the skew bi-logistic distribution to allow the modelling of multiple waves in epidemic time series data. The proposed distribution is validated on COVID-19 data from the UK and is evaluated for goodness of fit using the empirical survival Jensen–Shannon divergence and the Kolmogorov–Smirnov two-sample test statistic. In [25], an approach for the derivation of families of inequalities for set functions is suggested and applied to obtain information inequalities involving Shannon information measures that satisfy sub/supermodularity and monotonicity properties. The author also applies the generalized Han's inequality to analyse a problem in extremal graph theory, with an information-theoretic proof and interpretation. In [26], the authors focus on a general family of measures of divergence and propose a restricted minimum divergence estimator under constraints, together with a new double-index (dual) divergence test statistic, which is thoroughly examined.
Finally, in [27], by calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, the author obtains a formula that generalizes the ordinary Fenchel–Young divergence and defines the duo Fenchel–Young divergence, which is equivalent to a duo Bregman divergence. The author also proves that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences.
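For readers unfamiliar with the Bregman construction that [27] generalizes, the short sketch below implements the ordinary Bregman divergence B_F(x, y) = F(x) - F(y) - <grad F(y), x - y> generated by a strictly convex function F. With the negative Shannon entropy as generator it recovers the (generalized) Kullback–Leibler divergence, and with half the squared Euclidean norm it recovers half the squared Euclidean distance. This is only the classical construction, not the duo Fenchel–Young or duo Bregman divergences introduced in [27]; the generators shown and the function names are our own illustrative choices.

import numpy as np

def bregman_divergence(F, grad_F, x, y):
    # Classical Bregman divergence B_F(x, y) = F(x) - F(y) - <grad F(y), x - y>.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(F(x) - F(y) - np.dot(grad_F(y), x - y))

# Negative Shannon entropy as the convex generator: for positive vectors this
# yields the generalized Kullback–Leibler divergence (the usual KL divergence
# when the vectors are normalized probability distributions).
neg_entropy = lambda p: float(np.sum(p * np.log(p)))
grad_neg_entropy = lambda p: np.log(p) + 1.0

# Half the squared Euclidean norm as generator: yields half the squared distance.
half_sq_norm = lambda x: 0.5 * float(np.dot(x, x))
grad_half_sq_norm = lambda x: x

if __name__ == "__main__":
    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.1, 0.4, 0.5])
    print(bregman_divergence(neg_entropy, grad_neg_entropy, p, q))    # equals KL(p || q)
    print(bregman_divergence(half_sq_norm, grad_half_sq_norm, p, q))  # equals 0.5 * ||p - q||^2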

Acknowledgments

We wish to thank the authors for their contributions and their willingness to share innovative ideas and techniques in this issue. In addition, we would like to thank the reviewers and express our appreciation for the considerable time they spent providing accurate and fair manuscript evaluations. Finally, we would like to express our pleasure in working with the staff of the Editorial Office of Entropy and to thank them for the fruitful and excellent cooperation.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Alin, A.; Kurt, S. Ordinary and penalized minimum power-divergence estimators in two-way contingency tables. Comput. Stat. 2008, 23, 455–468.
2. Basu, A.; Harris, I.R.; Hjort, N.L. Robust and efficient estimation by minimising a density power divergence. Biometrika 1998, 85, 549–559.
3. Jiménez-Gamero, M.D.; Batsidis, A. Minimum distance estimators for count data based on the probability generating function with applications. Metrika 2017, 80, 503–545.
4. Patra, S.; Maji, A.; Basu, A.; Pardo, L. The Power Divergence and the Density Power Divergence Families: The Mathematical Connection. Sankhya B 2013, 75, 16–28.
5. Toma, A.; Broniatowski, M. Dual divergence estimators and tests: Robustness results. J. Multivar. Anal. 2011, 102, 20–36.
6. Vonta, F.; Mattheou, K.; Karagrigoriou, A. On properties of the (φ,α)-power divergence family with applications to goodness of fit tests. Methodol. Comput. Appl. Probab. 2012, 14, 335–356.
7. Shang, J.; Cavanaugh, J.E. Bootstrap variants of the Akaike information criterion for mixed model selection. Comput. Stat. Data Anal. 2008, 52, 2004–2021.
8. Neath, A.A.; Cavanaugh, J.E.; Weyhaupt, A.G. Model evaluation, discrepancy function estimation, and social choice theory. Comput. Stat. 2015, 30, 231–249.
9. Mattheou, K.; Lee, S.; Karagrigoriou, A. A model selection criterion based on the BHHJ measure of divergence. J. Stat. Plan. Inference 2009, 139, 228–235.
10. Barbu, V.S.; D’Amico, G.; Makrides, A. A continuous-time semi-Markov system governed by stepwise transitions. Mathematics 2022, 10, 2745.
11. Batsidis, A.; Martin, N.; Pardo Llorente, L.; Zografos, K. φ-Divergence Based Procedure for Parametric Change-Point Problems. Methodol. Comput. Appl. Probab. 2016, 18, 21–35.
12. Nielsen, F. Revisiting Chernoff Information with Likelihood Ratio Exponential Families. Entropy 2022, 24, 1400.
13. Sachlas, A.; Papaioannou, T. Residual and past entropy in actuarial science and survival models. Methodol. Comput. Appl. Probab. 2014, 16, 79–99.
14. Preda, V.; Dedu, S.; Iatan, I.; Cernat, I.D.; Sheraz, M. Tsallis Entropy for Loss Models and Survival Models Involving Truncated and Censored Random Variables. Entropy 2022, 24, 1654.
15. Zografos, K.; Nadarajah, S. Survival exponential entropies. IEEE Trans. Inf. Theory 2005, 51, 1239–1246.
16. Herntier, T.; Peter, A.M. Transversality Conditions for Geodesics on the Statistical Manifold of Multivariate Gaussian Distributions. Entropy 2022, 24, 1698.
17. Sfetcu, R.-C.; Sfetcu, S.-C.; Preda, V. Some Properties of Weighted Tsallis and Kaniadakis Divergences. Entropy 2022, 24, 1616.
18. Chen, J.; Wang, J.; Zhang, Y.; Wang, F.; Zhou, J. Spatial Information-Theoretic Optimal LPI Radar Waveform Design. Entropy 2022, 24, 1515.
19. Dajles, A.; Cavanaugh, J. Probabilistic Pairwise Model Comparisons Based on Bootstrap Estimators of the Kullback–Leibler Discrepancy. Entropy 2022, 24, 1483.
20. Suter, F.; Cernat, I.; Drăgan, M. Some Information Measures Properties of the GOS-Concomitants from the FGM Family. Entropy 2022, 24, 1361.
21. Yu, D.; Zhou, X.; Pan, Y.; Niu, Z.; Sun, H. Application of Statistical K-Means Algorithm for University Academic Evaluation. Entropy 2022, 24, 1004.
22. Bouhlel, N.; Rousseau, D. A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions. Entropy 2022, 24, 838.
23. Jaenada, M.; Miranda, P.; Pardo, L. Robust Test Statistics Based on Restricted Minimum Rényi’s Pseudodistance Estimators. Entropy 2022, 24, 616.
24. Levene, M. A Skew Logistic Distribution for Modelling COVID-19 Waves and Its Evaluation Using the Empirical Survival Jensen–Shannon Divergence. Entropy 2022, 24, 600.
25. Sason, I. Information Inequalities via Submodularity and a Problem in Extremal Graph Theory. Entropy 2022, 24, 597.
26. Meselidis, C.; Karagrigoriou, A. Contingency Table Analysis and Inference via Double Index Measures. Entropy 2022, 24, 477.
27. Nielsen, F. Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences. Entropy 2022, 24, 421.