25 Years of Sample Entropy

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 9 September 2025 | Viewed by 1082

Special Issue Editors


Guest Editor
Center for Advanced Medical Analytics, Department of Medicine, Division of Cardiology, University of Virginia, Charlottesville, VA 22908, USA
Interests: predictive analytics monitoring; early warning scores

Guest Editor
Division of Gastrointestinal Surgery, Department of Surgery, University of Alabama at Birmingham, Birmingham, AL 35249, USA
Interests: surgical outcomes; epidemiology; biostatistics

Special Issue Information

Dear Colleagues,

We are pleased to announce a call for papers for a Special Issue of Entropy celebrating the 25th anniversary of sample entropy, a technique first described by J. S. Richman and J. R. Moorman. Over the past quarter-century, sample entropy has proven to be a robust and versatile tool, finding applications across fields as diverse as physiology, finance, engineering, and neuroscience. Its adaptability underscores its value, yet sample entropy fundamentally remains a tool for discovery.
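For readers new to the method, sample entropy SampEn(m, r) is the negative natural logarithm of the conditional probability that two subsequences that match for m points (within tolerance r, under the Chebyshev distance, excluding self-matches) still match at point m + 1. A minimal illustrative sketch in Python, following the standard definition (O(N²); not an optimized or reference implementation):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) of a 1-D series: -ln(A/B), where B counts pairs of
    length-m templates within tolerance r (Chebyshev distance, no
    self-matches) and A counts the same for length m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)   # a common default: 20% of the series SD
    N = len(x)

    def matches(length):
        # One template starting at each of the first N - m points,
        # so lengths m and m + 1 use the same number of templates.
        t = np.array([x[i:i + length] for i in range(N - m)])
        c = 0
        for i in range(len(t) - 1):
            # Chebyshev distance to all later templates only,
            # which counts each unordered pair once.
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    B = matches(m)       # similar pairs of length m
    A = matches(m + 1)   # those still similar at length m + 1
    if A == 0 or B == 0:
        return float("inf")
    return -np.log(A / B)
```

A perfectly periodic series yields SampEn near zero, while white noise yields a much larger value; that contrast is what makes the measure useful for detecting changes in signal regularity.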

As such, contributions aimed at refining the method, deepening our understanding of its theoretical underpinnings, or applying it in novel contexts are particularly welcome. This is especially relevant in today’s era of AI-driven research, where entropy metrics play a pivotal role in model fitting, feature selection, and data interpretation. We encourage submissions that explore how sample entropy intersects with machine learning, as well as studies that highlight its potential in emerging disciplines.

Join us in reflecting on the legacy of sample entropy while shaping its future. Accepted papers will showcase not only its current impact but also its potential to further enhance discovery in our increasingly data-driven world.

Prof. Dr. J Randall Moorman
Dr. Joshua Richman
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Shannon entropy
  • Kolmogorov–Sinai entropy
  • information theory
  • time series complexity
  • cross-entropy

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

27 pages, 15276 KiB  
Article
The Dynamics of Shannon Entropy in Analyzing Climate Variability for Modeling Temperature and Precipitation Uncertainty in Poland
by Bernard Twaróg
Entropy 2025, 27(4), 398; https://doi.org/10.3390/e27040398 - 8 Apr 2025
Viewed by 366
Abstract
The aim of this study is to quantitatively analyze the long-term climate variability in Poland during the period 1901–2010, using Shannon entropy as a measure of uncertainty and complexity within the atmospheric system. The analysis is based on the premise that variations in temperature and precipitation reflect the dynamic nature of the climate, understood as a nonlinear system sensitive to fluctuations. This study focuses on monthly distributions of temperature and precipitation, modeled using the bivariate Clayton copula function. A normal marginal distribution was adopted for temperature and a gamma distribution for precipitation, both validated using the Anderson–Darling test. To improve estimation accuracy, a bootstrap resampling technique and numerical integration were applied to calculate Shannon entropy at each of the 396 grid points, with a spatial resolution of 0.25° × 0.25°. The results indicate a significant increase in Shannon entropy during the summer months, particularly in July (+0.203 bits) and January (+0.221 bits), compared to the baseline period (1901–1971), suggesting a growing unpredictability of the climate. The most pronounced trend changes were identified in the years 1985–1996 (as indicated by the Pettitt test), while seasonal trends were confirmed using the Mann–Kendall test. A spatial analysis of entropy at the levels of administrative regions and catchments revealed notable regional disparities—entropy peaked in January in the West Pomeranian Voivodeship (4.919 bits) and reached its minimum in April in Greater Poland (3.753 bits). Additionally, this study examined the relationship between Shannon entropy and global climatic indicators, including the Land–Ocean Temperature Index (NASA GISTEMP) and the ENSO index (NINO3.4). Statistically significant positive correlations were observed between entropy and global temperature anomalies during both winter (ρ = 0.826) and summer (ρ = 0.650), indicating potential linkages between local climate variability and global warming trends. To explore the direction of this relationship, a Granger causality test was conducted, which did not reveal statistically significant causality between NINO3.4 and Shannon entropy (p > 0.05 for all lags tested), suggesting that the observed relationships are likely co-varying rather than causal in the Granger sense. Further phase–space analysis (with a delay of τ = 3 months) allowed for the identification of attractors characteristic of chaotic systems. The entropy trajectories revealed transitions from equilibrium states (average entropy: 4.124–4.138 bits) to highly unstable states (up to 4.768 bits), confirming an increase in the complexity of the climate system. Shannon entropy thus proves to be a valuable tool for monitoring local climatic instability and may contribute to improved risk modeling of droughts and floods in the context of climate change in Poland. Full article
(This article belongs to the Special Issue 25 Years of Sample Entropy)
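As context for the entropy values reported in bits above: the paper estimates entropy of continuous fitted distributions by numerical integration, but the underlying discrete Shannon entropy H = −Σ pᵢ log₂ pᵢ conveys the scale. A small illustrative sketch (not the paper's code):

```python
import numpy as np

def shannon_entropy_bits(p):
    """Discrete Shannon entropy H = -sum(p * log2(p)), in bits."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()      # normalize to a probability distribution
    p = p[p > 0]         # convention: 0 * log2(0) contributes 0
    return float(-(p * np.log2(p)).sum())
```

A uniform distribution over 16 outcomes gives 4 bits, and a fair coin gives 1 bit, so monthly entropies near 4–5 bits correspond to distributions spread over roughly 16–32 effectively equally likely states.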

25 pages, 647 KiB  
Article
Multiscale Sample Entropy-Based Feature Extraction with Gaussian Mixture Model for Detection and Classification of Blue Whale Vocalization
by Oluwaseyi Paul Babalola, Olayinka Olaolu Ogundile and Vipin Balyan
Entropy 2025, 27(4), 355; https://doi.org/10.3390/e27040355 - 28 Mar 2025
Viewed by 491
Abstract
A multiscale sample entropy (MSE) algorithm is presented as a time domain feature extraction method to study the vocal behavior of blue whales through continuous acoustic monitoring. Additionally, MSE is applied to the Gaussian mixture model (GMM) for blue whale call detection and classification. The performance of the proposed MSE-GMM algorithm is experimentally assessed and benchmarked against traditional methods, including principal component analysis (PCA), wavelet-based feature (WF) extraction, and dynamic mode decomposition (DMD), all combined with the GMM. This study utilizes recorded data from the Antarctic open source library. To improve the accuracy of classification models, a GMM-based feature selection method is proposed, which evaluates both positively and negatively correlated features while considering inter-feature correlations. The proposed method demonstrates enhanced performance over conventional PCA-GMM, DMD-GMM, and WF-GMM methods, achieving higher accuracy and lower error rates when classifying the non-stationary and complex vocalizations of blue whales. Full article
(This article belongs to the Special Issue 25 Years of Sample Entropy)
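The multiscale variant used above first coarse-grains the series and then computes sample entropy at each scale. A hedged sketch of the coarse-graining step, assuming the standard procedure of averaging non-overlapping windows (an illustrative helper, not the authors' code):

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`,
    the standard coarse-graining step of multiscale entropy."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale                      # number of complete windows
    return x[:n * scale].reshape(n, scale).mean(axis=1)

# The MSE curve is then SampEn(coarse_grain(x, s)) for s = 1, 2, 3, ...
```

Scale 1 reproduces ordinary sample entropy; larger scales progressively smooth out fast dynamics, which is what lets MSE separate complex signals from uncorrelated noise.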
