Review

Review of Recent (2015–2024) Popular Entropy Definitions Applied to Physiological Signals

Department of Computer Science and Engineering, University of Ioannina, 45110 Ioannina, Greece
* Author to whom correspondence should be addressed.
Entropy 2025, 27(9), 983; https://doi.org/10.3390/e27090983
Submission received: 23 July 2025 / Revised: 10 September 2025 / Accepted: 17 September 2025 / Published: 20 September 2025
(This article belongs to the Special Issue Entropy in Biomedical Engineering, 3rd Edition)

Abstract

Entropy estimation is widely used in time series analysis, particularly in the field of Biomedical Engineering. It plays a key role in analyzing a wide range of physiological signals and serves as a measure of signal complexity, which reflects the complexity of the underlying system. The widespread adoption of entropy in research has led to numerous entropy definitions, with Approximate Entropy and Sample Entropy being among the most widely used. Over the past decade, the field has remained highly active, with a significant number of new entropy definitions being proposed, some inspired by Approximate and Sample Entropy, some by Permutation entropy, while others followed their own course of thought. In this paper, we review and compare the most prominent entropy definitions that have appeared in the last decade (2015–2024). We adopt the PRISMA methodology for this purpose, a widely accepted standard for conducting systematic literature reviews; the literature search was performed on 20 December 2024. With the included articles, we present statistical results on the number of citations for each method and the application domains in which it has been used. We also conducted a thorough review of the selected articles, documenting for each paper which definition has been employed and to which physiological signal it has been applied.

1. Introduction

Entropy is a measure of uncertainty, randomness, or disorder. In thermodynamics, it quantifies the portion of a system's energy that is unavailable for useful work. In information theory, it expresses the amount of information needed to describe a system, while in time series analysis, it serves as a measure of complexity.
The word entropy has its origins in ancient Greek. It is a compound word consisting of two ancient Greek words: “ἐν” (en), meaning “in” or “within”, and “τροπή” (tropē), meaning “turn” or “transformation”. The term was introduced into physics by the German scientist Rudolf Clausius in 1865 [1] (Clausius’s own wording: “I propose to call the magnitude S the entropy of the body, from the Greek word τροπή, transformation”).
In 1948, Shannon introduced, in a groundbreaking paper [2], the concept of entropy in the context of information theory. According to Shannon’s definition, a message with certainty carries no new information, and a completely uncertain message carries the maximum amount of information. This was expressed by the famous formula:
$$H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i,$$
where $p_i$ is the probability of a message $x_i$ to appear; with the base-2 logarithm, H(X) is expressed in bits. Another famous paper was published in 1991 [3], proposing Approximate Entropy, an entropy definition which embeds the time series into an m-dimensional space. The entropy is computed based on the probability of the distance between two vectors in this space being smaller than a given threshold. In 2000, Sample Entropy [4,5] was introduced as a variation of Approximate Entropy, claiming higher reliability and lower bias in real-world applications, due to the exclusion of self-matches, among other modifications. In practice today, Sample Entropy is used more frequently than Approximate Entropy and is the most popular entropy definition among those exploiting m-dimensional space embedding. We should also include in this short paragraph Permutation entropy [6], an entropy definition which follows a different approach: the time series is again embedded in an m-dimensional space, but entropy is estimated based on the diversity of sorting patterns. Specifically, each vector in the m-dimensional space is sorted, and the distribution of these sorting patterns determines the entropy. Permutation entropy was introduced in 2002 and gave a different perspective to entropy estimation.
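As a quick illustration of Shannon's formula, the following minimal Python sketch (assuming NumPy; the function name and example values are ours, for illustration only) computes the entropy of a discrete distribution:

```python
import numpy as np

def shannon_entropy(probabilities, base=2):
    """Shannon entropy of a discrete distribution (bits for base 2)."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]                              # terms with p = 0 contribute nothing
    return -np.sum(p * np.log(p) / np.log(base))

# A fair coin carries 1 bit of uncertainty; a biased one carries less.
print(shannon_entropy([0.5, 0.5]))            # 1.0
print(shannon_entropy([0.9, 0.1]))            # ~0.47
```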
Beyond the aforementioned definitions, numerous other entropy measures have been proposed in the literature. Interesting entropy definitions have been recently reviewed and compared in [7]. In this paper, we review and compare the impact on the research community of all recent entropy definitions proposed over the last decade (2015–2024). An initial literature review, prior to the systematic one, showed that the most common field of application was physiological signals; therefore, we chose this field as our focus. We then adopted a systematic methodology for the review, aiming to conduct the research as broadly and impartially as possible. Fourteen definitions will be examined, based on the PRISMA methodology [8]. In this methodology, queries are applied to a literature database to identify candidate papers. After a filtering process, the selected papers are reviewed. We focus on Biomedical Engineering problems; however, to limit the number of papers and maintain a systematic review process, we concentrate on physiological signals, the most popular application field for entropy. We examined the following signals: ECG/HRV (Electrocardiogram/Heart Rate Variability), EEG (Electroencephalogram), PPG (Photoplethysmogram), EHG (Electrohysterogram), CTG (Cardiotocogram), and EMG (Electromyogram).
The rest of this paper is structured as follows: In Section 2, the definitions of the examined methods are described. The PRISMA methodology is outlined in Section 3. The metadata resulting from this process is presented in tables and discussed in Section 4. The articles retained after the systematic selection are grouped according to the entropy definitions and the signals analyzed, and are reviewed in Section 5. Section 6 discusses open issues and limitations, and Section 7 summarizes the conclusions of the review.

2. Recent Entropy Definitions

This section provides a brief yet comprehensive overview of the definitions under examination, presented concisely but with sufficient detail to ensure clarity and completeness. Additionally, we provide details on how the researchers who proposed each definition have applied it, including the parameters used and the lengths of the analyzed signals.
We use the notation $X = x_1, x_2, \ldots, x_N$ for the original time series of size N and $X^m = x_1^m, x_2^m, \ldots, x_{N-m+1}^m$ for the series of vectors in the embedding space with dimension m. For clarity, we have gathered all the symbols we use in Table 1.
Fourteen definitions of entropy will be presented, each proposed within the decade (2015–2024). Table 2 provides us with information about the course of thought each entropy definition follows.

2.1. Distance-Based Entropy Definitions

Several of the examined definitions were greatly inspired by the well-established entropy definitions, Approximate and Sample Entropy, often introducing refinements to overcome their known limitations. In this subsection we gather and present entropy definitions that primarily focus on measuring distances within the embedded time series $X^m$.

2.1.1. Range Entropy

Range entropy (RangEn) was proposed by Omidvarnia et al. in 2018 [9] as a modification of Approximate and Sample Entropy. It replaces the Chebyshev distance between $x_i^m$ and $x_j^m$ with the following metric (symbolized in the original paper as $d_{range}(x_i^m, x_j^m)$):
$$d_{i,j}^m = \frac{\max_k |x_{i+k} - x_{j+k}| - \min_k |x_{i+k} - x_{j+k}|}{\max_k |x_{i+k} - x_{j+k}| + \min_k |x_{i+k} - x_{j+k}|}, \quad 1 \le k \le m.$$
Two definitions of entropy, $RangEn_A$ and $RangEn_B$, are proposed, based on Approximate Entropy and Sample Entropy, respectively. $RangEn_A$ and $RangEn_B$ were compared with Approximate and Sample Entropy, presenting a smaller standard deviation for various signal lengths (N = 50 up to N = 1000), with embedding dimension m = 2 and threshold distance r = 0.2 [9].
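As an indicative illustration only, the following Python sketch (assuming NumPy) outlines a Sample-Entropy-style Range entropy ($RangEn_B$); details such as the exact pair counting and tie handling are simplifications and may differ from the original implementation in [9]:

```python
import numpy as np

def _embed(x, m):
    """All overlapping vectors of length m taken from the series x."""
    return np.array([x[i:i + m] for i in range(len(x) - m + 1)])

def _range_match_rate(vectors, r):
    """Fraction of vector pairs whose range distance is below r."""
    matches, total = 0, 0
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            diff = np.abs(vectors[i] - vectors[j])
            d = (diff.max() - diff.min()) / (diff.max() + diff.min() + 1e-12)
            matches += d < r
            total += 1
    return matches / total

def rangen_b(x, m=2, r=0.2):
    """Sample-Entropy-style Range entropy (RangEn_B), indicative sketch."""
    b = _range_match_rate(_embed(x, m), r)      # matches in dimension m
    a = _range_match_rate(_embed(x, m + 1), r)  # matches in dimension m + 1
    return -np.log(a / b)

rng = np.random.default_rng(0)
print(rangen_b(rng.standard_normal(300), m=2, r=0.2))
```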

2.1.2. Cosine Similarity Entropy

Cosine Similarity entropy (CosEn) was introduced by Chanwimalueang and Mandic in 2017 [10]. It replaces (a) the Chebyshev distance, originally adopted by Sample Entropy, with the Angular Distance:
$$AngDist_{i,j}^m = \frac{1}{\pi} \cos^{-1}\left( \frac{x_i^m \cdot x_j^m}{\|x_i^m\|\,\|x_j^m\|} \right),$$
and (b) the standard conditional probability with Shannon entropy (originally symbolized as $AngDist_{i,j}(m) = a_{i,j}(m)/\pi$, where $a_{i,j}(m) = \cos^{-1}(CosSim_{i,j}(m))$ and $CosSim_{i,j}(m) = x_i^m \cdot x_j^m / (\|x_i^m\|\,\|x_j^m\|)$).
Since CosEn employs Shannon entropy, the authors expected the method to exhibit the same characteristics. Maximum values of Cosine Similarity entropy are presented for r = 0.5. The proposed values for r are between 0.05 and 0.2, or between 0.5 and 1. The authors proposed m = 2–5, r = 0.05–0.2, and a minimum of N = 100 for WGN and 1/f noise, or N = 700 for first- or second-order Auto-Regressive processes [10].

2.1.3. Diversity Entropy

Diversity entropy (DivEn) was introduced by Wang et al. in 2021 [11] to address certain inconsistencies present in Sample Entropy, Fuzzy entropy [12], and Permutation entropy. It computes the similarity between adjacent vectors $x_i^m, x_j^m$ (originally symbolized as $y_i(m), y_j(m)$) using the Cosine Similarity (originally symbolized as $d(y_i(m), y_j(m))$):
$$d_{i,j}^m = \frac{x_i^m \cdot x_j^m}{\|x_i^m\|\,\|x_j^m\|}.$$
The Cosine Similarity ranges from −1 to 1. All these similarity values are mapped onto k bins $B_k$, and the probability $p_k$ of a value falling within $B_k$ is defined. Diversity entropy is computed as the Shannon entropy of the distribution of $p_k$.
Diversity entropy was originally proposed as a measure for fault diagnosis in rotating machinery, where it reports low entropy values for deterministic series and higher values for chaotic ones, using m = 4 and k = 100 [11].
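A minimal Python sketch of the procedure described above is given below (assuming NumPy); the normalization by ln k and the binning over [−1, 1] follow the description above and should be taken as indicative rather than a reference implementation:

```python
import numpy as np

def diversity_entropy(x, m=4, k=100):
    """Diversity entropy sketch: Shannon entropy (normalized by ln k) of the
    distribution of cosine similarities between successive embedded vectors."""
    vectors = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    v1, v2 = vectors[:-1], vectors[1:]
    cos = np.sum(v1 * v2, axis=1) / (
        np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1))
    hist, _ = np.histogram(cos, bins=k, range=(-1.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(k)

rng = np.random.default_rng(1)
print(diversity_entropy(rng.standard_normal(1000), m=4, k=100))
```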

2.1.4. Distribution Entropy

Distribution Entropy (DistEn) was proposed by Li et al. in 2015 [13]. The time series is again embedded in the m-dimensional space. For each vector $x_i^m$ (originally symbolized as x(i)), the Chebyshev (maximum-norm) distances $d_{i,j}^m$ to all other vectors are computed. These distances are binned into M equal-sized bins. The empirical probability function is calculated:
$$p_{t,m,M} = \frac{\#\,\text{elements in bin } t}{\text{total } \#\,\text{elements}},$$
where t = 1, 2, …, M. Distribution Entropy is reported as
$$DistEn(m, M) = -\frac{1}{\log_2 M} \sum_{t=1}^{M} p_{t,m,M} \log_2 p_{t,m,M},$$
where, with the base-2 logarithm, entropy is expressed in bits.
In [13], using five types of chaotic series and stochastic processes, Distribution Entropy exhibited increased stability for N = 50, …, 200 compared to Sample and Fuzzy entropy. It presented a remarkable stability for M = 512, …, 1024. Using m = 1, …, 10, Distribution Entropy was stable both in its average levels and in its standard deviations.
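For illustration, a minimal Python sketch of Distribution Entropy (assuming NumPy) following the steps above; the Chebyshev distance and the automatic bin range are choices of this sketch:

```python
import numpy as np

def distribution_entropy(x, m=2, M=512):
    """Distribution Entropy sketch: normalized Shannon entropy (bits) of the
    histogram of all pairwise Chebyshev distances between embedded vectors."""
    vectors = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
    d = d[np.triu_indices(len(vectors), k=1)]   # keep each pair once
    hist, _ = np.histogram(d, bins=M)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(M)

rng = np.random.default_rng(2)
print(distribution_entropy(rng.standard_normal(200), m=2, M=512))
```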

2.2. Symbolic and Ordinal Pattern-Based Entropy Definitions

Alongside Approximate and Sample Entropy, Permutation entropy has emerged as one of the widely used entropy definitions that also embeds the original time series X into $X^m$. However, it analyzes the distribution of the ordinal patterns of the vectorized time series $X^m$. This different approach to quantifying irregularity in a time series inspired researchers to extend Permutation entropy by initially mapping the original time series and then investigating the ordinal patterns. In this section we present the entropy definitions that follow a similar concept.

2.2.1. Increment Entropy

Inspired by symbolic dynamics and Permutation entropy, Increment entropy (IncrEn) [14] takes into consideration the amplitude of the time series and the difference in successive samples.
Given the original time series X, the increment time series $V = v_1, v_2, \ldots, v_{N-1}$ is constructed, where $v_i = x_{i+1} - x_i$. The time series V is embedded into the m-dimensional space, giving the series of vectors $V^m = v_1^m, v_2^m, \ldots, v_{N-m}^m$.
Each element of each vector is mapped onto a two-letter word. The first letter is the sign s and the second is the magnitude q, based on the quantifying resolution parameter R. The series $U^m = u_1^m, u_2^m, \ldots, u_{N-m}^m$ consists of the words $u_i^m = s_i q_i\, s_{i+1} q_{i+1} \cdots s_{i+m-1} q_{i+m-1}$ (originally symbolized as $\{w_i, 1 \le i \le N-m\}$). By computing the number of instances of each $u_i^m$ and the probability of appearance $p(u_i^m)$, Increment entropy is reported as
$$IncrEn(m) = -\frac{1}{m-1} \sum_{i=1}^{(2R+1)^m} p(u_i^m) \log_2 p(u_i^m),$$
where entropy is given in bits.
The authors in [14] propose m = 2, …, 5 and a quantifying resolution of R = 4. However, further research [15] showed that in real-life applications, IncrEn’s parameters m, R, N become sensitive for short signals (N ≤ 500), reaching a stabilized state after N = 1000. For short signals, the optimal parameters are 2 ≤ m ≤ 6 and 2 ≤ R ≤ 8; more specifically, m and R should take values between 2 and 4.
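The following Python sketch (assuming NumPy) illustrates the idea; the exact magnitude quantization rule, here taken relative to the standard deviation of the increment series, is an assumption of this sketch and may differ from the original definition in [14]:

```python
import numpy as np
from collections import Counter

def increment_entropy(x, m=4, R=4):
    """Increment entropy sketch. Each increment is coded as a (sign, magnitude)
    word; quantizing the magnitude against the standard deviation of the
    increment series is an assumption of this sketch."""
    v = np.diff(x)
    sd = np.std(v)
    words = Counter()
    for i in range(len(v) - m + 1):
        seg = v[i:i + m]
        sign = np.sign(seg).astype(int)
        if sd > 0:
            mag = np.minimum(R, np.floor(np.abs(seg) * R / sd)).astype(int)
        else:
            mag = np.zeros(m, dtype=int)
        words[tuple(zip(sign, mag))] += 1
    total = sum(words.values())
    p = np.array([c / total for c in words.values()])
    return -np.sum(p * np.log2(p)) / (m - 1)

rng = np.random.default_rng(3)
print(increment_entropy(rng.standard_normal(1000), m=4, R=4))
```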

2.2.2. Dispersion Entropy

In 2016, Rostaghi and Azami [16] proposed Dispersion Entropy (DispEn), an entropy definition that combines the order of the samples and their values and also employs the time delay for the embedding process.
Starting with the given time series X, a new mapped series $U = u_1, u_2, \ldots, u_N$ is produced, labeled from 1 to c. The series is embedded in the m-dimensional space with time delay d:
$$U^{m,c} = u_1^{m,c}, u_2^{m,c}, \ldots, u_{N-m+1}^{m,c}, \qquad u_i^{m,c} = (u_i^c, u_{i+d}^c, \ldots, u_{i+(m-1)d}^c).$$
Originally, $u_i^{m,c}$ is symbolized as $z_i^{m,c} = \{z_i^c, z_{i+d}^c, \ldots, z_{i+(m-1)d}^c\}$. The dispersion pattern $\pi_i^m = (u_i, u_{i+1}, \ldots, u_{i+m-1})$ (originally symbolized as $\pi_{v_0 v_1 \cdots v_{m-1}}$, where $z_i^c = v_0, z_{i+d}^c = v_1, \ldots, z_{i+(m-1)d}^c = v_{m-1}$) is calculated and stored. The number of possible dispersion patterns for each $u_i^m$ is $c^m$. For each pattern $\pi_i^m$, the probability of appearance $p(\pi_i^m)$ is computed. The Shannon entropy of these probabilities of appearance gives the Dispersion Entropy:
$$DispEn(X, m, c, d) = -\sum_{i=1}^{c^m} p(\pi_i^m) \ln p(\pi_i^m).$$
The entropy is expressed in nats. Normalized DispEn is calculated as $DispEn_{normalized} = DispEn / \ln(c^m)$, where $\ln(c^m)$ is the largest possible DispEn value.
For labeling the original signal X, both linear and nonlinear methods can be used [16], with normal cumulative distribution function (NCDF) showing superiority over linear mapping techniques and stabilization at the maximum value after N = 1000 .
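A minimal Python sketch of Dispersion Entropy with NCDF mapping is given below (assuming NumPy and SciPy); it is intended as an indicative implementation of the steps above, not a reference one:

```python
import numpy as np
from scipy.stats import norm
from collections import Counter

def dispersion_entropy(x, m=3, c=6, d=1):
    """Dispersion entropy sketch with NCDF mapping; returns the value in nats
    together with its normalized version (division by ln(c**m))."""
    x = np.asarray(x, dtype=float)
    y = norm.cdf(x, loc=x.mean(), scale=x.std())          # map to (0, 1)
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)  # labels 1..c
    patterns = Counter(
        tuple(z[i + k * d] for k in range(m))
        for i in range(len(z) - (m - 1) * d))
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    de = -np.sum(p * np.log(p))
    return de, de / np.log(c ** m)

rng = np.random.default_rng(4)
print(dispersion_entropy(rng.standard_normal(2000), m=3, c=6, d=1))
```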

2.2.3. Fluctuation-Based Dispersion Entropy

Inspired by Dispersion Entropy, Azami and Escudero proposed Fluctuation-based Dispersion Entropy (FDispEn) in 2018 [17]. In Dispersion Entropy, each element of an embedded vector $x_i^m$ takes one of c possible values; thus, each vector has $c^m$ possible patterns. Fluctuation-based Dispersion Entropy considers the differences between adjacent elements of the dispersion patterns, leading to vectors of size m−1 with $(2c-1)^{m-1}$ possible states. A normalized version of FDispEn is given by the formula $FDispEn / \ln\left((2c-1)^{m-1}\right)$.

2.2.4. Slope Entropy

Slope entropy (SlopEn) was introduced by David Cuesta-Frau in 2019 [18]. It was inspired by Permutation entropy, a definition where the amplitude information is ignored. The time series is embedded into an m-dimensional space, giving the series $X^m$. The difference between two consecutive elements of the vector $x_i^m$ is computed, resulting in the series $D^{m-1} = d_1^{m-1}, d_2^{m-1}, \ldots, d_{N-m+1}^{m-1}$.
Each element of each vector $d_i^{m-1}$ is mapped onto one of five possible symbols, $\{+2, +1, 0, -1, -2\}$, as shown in Figure 1. The sizes of the sectors are described by the two parameters γ and δ. The generated series $U^{m-1}$ consists of the vectors $u_i^{m-1}$, where each element is a symbol. The Shannon entropy of the distribution function $p_i$, i.e., the probability of a vector of symbols $u_i^{m-1}$ (originally symbolized as $\psi_i^m$) to appear, defines the Slope entropy. Slope entropy showed strong discriminating capabilities for low N values (250 and 500) and m = 3, …, 8. The recommended parameters are γ between 1 and 2 and δ close to zero [18]. A simplified Slope entropy definition has been proposed in which δ is discarded along with the zero symbol in the mapping procedure [19], or in which an asymmetry is introduced regarding the γ parameter [20].
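The following Python sketch (assuming NumPy) illustrates Slope entropy as described above; the threshold logic for the five symbols follows the description of Figure 1, and the default γ and δ values are only indicative:

```python
import numpy as np
from collections import Counter

def slope_entropy(x, m=4, gamma=1.0, delta=1e-3):
    """Slope entropy sketch: within-vector differences are mapped to the
    symbols {+2, +1, 0, -1, -2} using gamma and delta, and Shannon entropy
    is computed over the resulting symbolic patterns (in nats)."""
    def symbol(diff):
        if diff > gamma:
            return 2
        if diff > delta:
            return 1
        if diff >= -delta:
            return 0
        if diff >= -gamma:
            return -1
        return -2

    patterns = Counter()
    for i in range(len(x) - m + 1):
        diffs = np.diff(x[i:i + m])
        patterns[tuple(symbol(d) for d in diffs)] += 1
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(5)
print(slope_entropy(rng.standard_normal(500), m=4))
```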

2.2.5. Symbolic Dynamic Entropy

Symbolic Dynamic entropy was proposed by Li et al. [21] and is based on the state transition probability of the possible state patterns. It was initially inspired by a similar entropy definition [22] and addressed its shortcomings. The original time series X is transformed using either Uniform Partitioning or Maximum Entropy Partitioning into a new series $U = u_1, u_2, \ldots, u_N$ (originally symbolized as Z) consisting of c (originally symbolized as ε) symbols. Then, the new series U is embedded in the m-dimensional space with delay d (originally symbolized as λ):
$$U^{m,c} = u_1^{m,c}, u_2^{m,c}, \ldots, u_{N-m+1}^{m,c}, \qquad u_i^{m,c} = (u_i^c, u_{i+d}^c, \ldots, u_{i+(m-1)d}^c).$$
Symbolic Dynamic Entropy is given by
$$sde(X, m, d, c) = -\sum_{a=1}^{c^m} p(u_a^{c,m,d}) \ln p(u_a^{c,m,d}) - \sum_{a=1}^{c^m} \sum_{b=1}^{c} p(u_a^{c,m,d})\, p(\sigma_b \mid u_a^{c,m,d}) \ln p(\sigma_b \mid u_a^{c,m,d}),$$
expressed in nats, with $p(u_a^{c,m,d})$ expressing the probability that the pattern $u_a^{c,m,d}$ will appear and $p(\sigma_b \mid u_a^{c,m,d})$ the probability that the symbol $\sigma_b$ will appear given that the pattern $u_a^{c,m,d}$ has appeared. Employing a normalization factor, Symbolic Dynamic Entropy is reported as
$$SDE(X, m, d, c) = \frac{sde(X, m, d, c)}{\ln(c^{m+1})}.$$
In order to find the optimal parameters, researchers in [21] propose an algorithm based on the Average Euclidean Distance (AED).
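As an indicative sketch only, the following Python code (assuming NumPy) outlines Symbolic Dynamic Entropy with uniform partitioning, combining the state entropy and the state-transition entropy as in the formula above; partitioning details and edge handling are simplifications:

```python
import numpy as np
from collections import Counter

def symbolic_dynamic_entropy(x, m=3, d=1, c=4):
    """Symbolic Dynamic Entropy sketch with uniform partitioning: state
    entropy plus state-transition entropy, normalized by ln(c**(m+1))."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), c + 1)           # uniform partitioning
    u = np.clip(np.digitize(x, edges[1:-1]) + 1, 1, c)     # symbols 1..c

    state_counts, trans_counts = Counter(), Counter()
    for i in range(len(u) - m * d):                        # leave room for next symbol
        state = tuple(u[i + k * d] for k in range(m))
        state_counts[state] += 1
        trans_counts[(state, u[i + m * d])] += 1

    total = sum(state_counts.values())
    h_state = -sum((n / total) * np.log(n / total) for n in state_counts.values())
    h_trans = 0.0
    for (state, _), n in trans_counts.items():
        p_state = state_counts[state] / total
        p_cond = n / state_counts[state]
        h_trans -= p_state * p_cond * np.log(p_cond)
    return (h_state + h_trans) / np.log(float(c) ** (m + 1))

rng = np.random.default_rng(6)
print(symbolic_dynamic_entropy(rng.standard_normal(1000), m=3, d=1, c=4))
```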

2.3. Complexity Estimation Based on Sorting Effort

The ordinal patterns of Permutation entropy are obtained by sorting each embedded vector x i m , a procedure that is also employed by Bubble Entropy to extract valuable information about the series regularity.

Bubble Entropy

Bubble Entropy (BubbleEn) was introduced in 2017 [23] as an entropy definition “almost free of parameters”. It was designed to free researchers from the need to estimate parameters. Instead of using the distribution of the sorting patterns to estimate entropy, Bubble Entropy uses the distribution of the effort spent to sort the elements of each $x_i^m$, giving a physical meaning to what is distributed, since effort corresponds to energy, a physical quantity very closely related to entropy. In addition, Bubble Entropy reduces the number of possible states in the distribution, leading to better-balanced distributions. The method reports the difference in entropy between spaces of size m + 1 and m. The description of the method is as follows.
The elements in each vector $x_i^m$ are sorted, and the number of swaps $s_i$ performed by the bubble sort algorithm is what counts as the sorting effort. A new time series, the series of the sorting efforts, is formed:
$$S = s_1, s_2, \ldots, s_{N-m+1}.$$
The probability mass function $p_i$ of having i swaps is used to evaluate the second-order Rényi entropy, expressed in bits:
$$H_{swaps}^{m} = -\log_2 \sum_{i=0}^{\binom{m}{2}} p_i^2.$$
Bubble Entropy is the normalized difference in the entropy of the sorting effort (swaps) required to sort vectors of length m + 1 and m:
$$BubbleEn(m) = K_m \left( H_{swaps}^{m+1} - H_{swaps}^{m} \right),$$
where K m is a normalization factor. For more information on normalization factors for Bubble Entropy, as well as for the study of several theoretical issues, please see [24], where three normalization options are suggested. The first option is not to normalize, the second is normalization based on the maximum number of possible states, and the last one is normalization based on the White Gaussian Noise.
The computational cost of Bubble Entropy is quite low, leading researchers to claim that there is no need to define a restricted range of m values, since it is feasible to compute a broad range of m. In [23] the researchers used m = 2, …, 25, with the range m ≥ 12 exhibiting valuable discriminative information. Using synthetic series, Bubble Entropy presented stability for short-length signals. In [24,25], Bubble Entropy was computed for 2 ≤ m ≤ 50, and again, the range 10 < m < 20 was observed to be the most valuable one.
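A minimal Python sketch of Bubble Entropy is given below (assuming NumPy); the normalization by log((m+1)/(m−1)) is one of the options discussed above and is assumed here:

```python
import numpy as np

def _bubble_swaps(vec):
    """Number of swaps bubble sort performs to order the vector."""
    v = list(vec)
    swaps = 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def _renyi2_of_swaps(x, m):
    """Second-order Renyi entropy (bits) of the swap-count distribution."""
    counts = np.bincount([_bubble_swaps(x[i:i + m]) for i in range(len(x) - m + 1)])
    p = counts / counts.sum()
    return -np.log2(np.sum(p ** 2))

def bubble_entropy(x, m=10):
    """Bubble entropy sketch; the log((m+1)/(m-1)) normalization is assumed here."""
    return (_renyi2_of_swaps(x, m + 1) - _renyi2_of_swaps(x, m)) / np.log2((m + 1) / (m - 1))

rng = np.random.default_rng(7)
print(bubble_entropy(rng.standard_normal(1000), m=10))
```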

2.4. Multiscale and Hierarchical Definitions

The dynamics of a time series can be further explored through a scaling process, a powerful modification widely applied to many entropy definitions. Entropy of Entropy employs this approach to reveal the underlying dynamics.

Entropy of Entropy

In 2017, Hsu et al. proposed the definition of Entropy of Entropy (EoE) [26], a novel complexity measurement inspired by Multiscale entropy [27]. The original series X is divided into non-overlapping windows of size τ:
$$W^{\tau} = w_1^{\tau}, w_2^{\tau}, \ldots, w_{N/\tau}^{\tau}.$$
Within the range between $\min(w_j^{\tau})$ and $\max(w_j^{\tau})$, where $1 \le j \le N/\tau$, each window is divided into s slices of equal size. The probability $p_{j,k}$ of a sample $x_i$ of window $w_j^{\tau}$ falling in slice k, where $1 \le k \le s$, is computed as
$$p_{j,k} = \frac{\#\,x_i \text{ over } w_j^{\tau} \text{ in slice } k}{\tau}.$$
Next, the Shannon entropy of each $w_j^{\tau}$ is computed:
$$y_j^{\tau} = -\sum_{k=1}^{s} p_{j,k} \ln p_{j,k}.$$
The number of states L, where $1 \le L \le all\_states$ (originally $all\_states$ is symbolized as $s_2$), for each $y_j^{\tau}$ is both finite and dependent on τ. The probability $p_L$ that a value $y_j^{\tau}$ of the generated series $Y^{\tau} = y_1^{\tau}, y_2^{\tau}, \ldots, y_{N/\tau}^{\tau}$ occurs in state L is obtained as follows:
$$p_L = \frac{\text{total number of } y_j^{\tau} \text{ over } Y^{\tau} \text{ in level } L}{N/\tau}.$$
Entropy of Entropy is computed over all finite possible states:
$$EoE(\tau, s) = -\sum_{L=1}^{all\_states} p_L \ln p_L.$$
In [26], the authors who proposed the method used Entropy of Entropy to discriminate between two pathological groups. The best results were reported with τ = 5 and 55 slices. When compared with Multiscale entropy, EoE performed adequately on short time series.
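The following Python sketch (assuming NumPy) illustrates the two-level procedure; the per-window slicing range and the grouping of level-1 values into states follow the description above and are simplifications of the original definition:

```python
import numpy as np
from collections import Counter

def entropy_of_entropy(x, tau=5, s=55):
    """Entropy of Entropy sketch. Level 1: Shannon entropy of each
    non-overlapping window of length tau over s amplitude slices (per-window
    range). Level 2: Shannon entropy of the distribution of level-1 values."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) // tau
    level1 = []
    for j in range(n_windows):
        w = x[j * tau:(j + 1) * tau]
        lo, hi = w.min(), w.max()
        if hi == lo:                       # constant window carries zero entropy
            level1.append(0.0)
            continue
        k = np.minimum(((w - lo) / (hi - lo) * s).astype(int), s - 1)
        p = np.bincount(k, minlength=s) / tau
        p = p[p > 0]
        level1.append(-np.sum(p * np.log(p)))
    states = Counter(np.round(level1, 12))   # finite set of attainable values
    p2 = np.array(list(states.values()), dtype=float) / n_windows
    return -np.sum(p2 * np.log(p2))

rng = np.random.default_rng(8)
print(entropy_of_entropy(rng.standard_normal(1500), tau=5, s=55))
```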

2.5. Geometric or Phase-Space Definitions

Given a time series X, a Poincaré-Lorenz plot represents X in a two-dimensional space, with the x-axis representing the current state $x_i$ and the y-axis the next one, $x_{i+1}$. A diagonal line separates points that declare acceleration from points that declare deceleration. Thus, we can visually extract information about the dynamics of the series. Phase and Gridded Distribution Entropy adapt Poincaré-Lorenz plots in their definitions to measure the regularity of a series.

2.5.1. Phase Entropy

Phase entropy (PhEn) was introduced by Rohila and Sharma in 2019 [28], a definition based on the Poincaré-Lorenz plot. A second-order difference Poincaré-Lorenz plot is also informative, where the x-axis represents the difference $x_{i+1} - x_i$ and the y-axis the difference $x_{i+2} - x_{i+1}$. The plot is divided into four quadrants. One of the quadrants reveals two consecutive accelerations. Another one reveals two consecutive decelerations. There is one quadrant revealing one acceleration followed by one deceleration, and one quadrant for a deceleration followed by one acceleration.
For each point, the slope angle is calculated:
$$\theta_i = \tan^{-1}\left( \frac{x_{i+2} - x_{i+1}}{x_{i+1} - x_i} \right).$$
In general, the plot can be divided into k sectors instead of four. The distribution $p_{i,k}$ is formed from the slope angles:
$$p_{i,k} = \frac{\sum_{j \in \text{sector } i} \theta_j}{\sum_{j \in \text{all sectors}} \theta_j}.$$
Finally, the entropy is estimated in bits by
$$PhEn(k) = -\frac{1}{\log_2 k} \sum_{i=1}^{k} p_{i,k} \log_2 p_{i,k}.$$
Researchers in [28] propose k = 16, or a value divisible by four. Phase entropy showed stability on generated White Gaussian Noise of various lengths, starting with N = 100 and reaching up to N = 50,000.
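For illustration, a minimal Python sketch of Phase entropy is given below (assuming NumPy); the use of atan2 folded into [0, 2π) to cover all four quadrants is an assumption of this sketch:

```python
import numpy as np

def phase_entropy(x, k=16):
    """Phase entropy sketch on the second-order difference plot: each point's
    angle (atan2, folded to [0, 2*pi)) is assigned to one of k sectors, and
    each sector is weighted by the sum of its angles."""
    dx = np.diff(x)
    theta = np.arctan2(dx[1:], dx[:-1]) % (2 * np.pi)
    sector = np.minimum((theta / (2 * np.pi) * k).astype(int), k - 1)
    weights = np.array([theta[sector == i].sum() for i in range(k)])
    p = weights / weights.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(k)

rng = np.random.default_rng(9)
print(phase_entropy(rng.standard_normal(3000), k=16))
```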

2.5.2. Gridded Distribution Entropy

Gridded Distribution Entropy (GDEn) was introduced by Yan et al. in 2019 [29]. The time series X is filtered and scaled between 0 and 1. Then, like Phase entropy, a Poincaré plot is generated, which is divided into $n \times n$ blocks, yielding a gridded Poincaré plot. Let us assume that each block contains a finite number of points $\beta_k$, where $k = 1, 2, \ldots, n \times n$. The Gridded Distribution Rate is computed as
$$GDR = \frac{\alpha}{n \times n},$$
where α is the number of blocks with at least one point in them. Gridded Distribution Entropy is computed as the Shannon entropy of $p_k$, where
$$p_k = \frac{\beta_k}{N - 1}.$$
In [29], the method becomes stable when the n parameter is greater than 80.
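A minimal Python sketch of the gridded Poincaré plot computation described above is given below (assuming NumPy); the scaling and binning details are indicative only:

```python
import numpy as np

def gridded_distribution_entropy(x, n=80):
    """Gridded Distribution Entropy sketch: the Poincare plot of the 0-1 scaled
    series is divided into an n-by-n grid; returns the Shannon entropy of the
    per-block point fractions together with the gridded distribution rate."""
    x = np.asarray(x, dtype=float)
    x = (x - x.min()) / (x.max() - x.min())             # scale to [0, 1]
    col = np.minimum((x[:-1] * n).astype(int), n - 1)   # current sample
    row = np.minimum((x[1:] * n).astype(int), n - 1)    # next sample
    counts = np.bincount(row * n + col, minlength=n * n)
    gdr = np.count_nonzero(counts) / (n * n)
    p = counts[counts > 0] / (len(x) - 1)
    return -np.sum(p * np.log(p)), gdr

rng = np.random.default_rng(10)
print(gridded_distribution_entropy(rng.standard_normal(2000), n=80))
```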

2.6. Pattern-Detection Definitions

An innovative course of thought in regularity analysis is introduced by Attention entropy, where the investigation is based upon some key patterns present in the time series.

Attention Entropy

Attention entropy (AttEn) was proposed by Yang et al. [30] in 2023. The time series is not embedded into an m-dimensional space; instead, the analysis is based on key pattern detection. The numbers of samples between key patterns form a series on which Shannon entropy is computed. In [30], the local minima and local maxima are proposed as key patterns. Attention entropy consistently discriminated HRV signals [30], with lengths starting from N = 100 up to N = 10,000.
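The following simplified Python sketch (assuming NumPy) illustrates the idea behind Attention entropy; the original definition aggregates several key-pattern pairings (e.g., max–max, min–min), while this sketch uses a single interval series for brevity:

```python
import numpy as np
from collections import Counter

def attention_entropy(x):
    """Simplified Attention entropy sketch: Shannon entropy (bits) of the
    intervals between consecutive local extrema (the key patterns)."""
    x = np.asarray(x, dtype=float)
    is_max = (x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])
    is_min = (x[1:-1] < x[:-2]) & (x[1:-1] < x[2:])
    key_idx = np.where(is_max | is_min)[0] + 1
    intervals = np.diff(key_idx)
    counts = Counter(intervals)
    p = np.array(list(counts.values()), dtype=float) / len(intervals)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(11)
print(attention_entropy(rng.standard_normal(1000)))
```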

3. Methodology

We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) protocol [8] to review the published literature systematically. Articles, initially collected from the Scopus article database, were screened to exclude publications not relevant to the purpose of our review. Then, we checked the eligibility of each article to further filter those we were going to review. The following questions drove us to the initial selection:
  • Does it propose a new entropy definition?
  • Has it been published during the last decade (2015–2024)?
  • Has the definition been used to analyze physiological signals (ECG/HRV, CTG, EEG, PPG, EHG, EMG)?
We used the Scopus article database to collect any articles that
  • Refer to any entropy definition investigating or including the name of the entropy definition in the title, abstract, or paper keywords.
  • Are related at any point with either EEG, ECG/HRV, CTG, EMG, EHG, or PPG, or the word “Biomedical”.
  • Belong to the “Computer Science and Engineering” field, as the most relevant available superset of Biomedical Engineering.
The queries that were used to initially collect and count the articles for each entropy definition were structured as follows:
    (TITLE-ABS-KEY ("entropy definition")) AND
    (ALL ("EEG" OR "ECG" OR "HRV" OR "PPG" OR "Biomedical" OR "CTG" OR "EHG" OR "EMG"))
    AND PUBYEAR > 2014 AND PUBYEAR < 2025 AND
    (LIMIT-TO (SUBJAREA,"ENGI") OR LIMIT-TO (SUBJAREA,"COMP"))
The search was performed on 20 December 2024. All publications and their citations dated before that point were considered. We first collected all articles returned as relevant by the database query. Since some articles appeared more than once, we removed the duplicates to ensure only unique entries remained. In the next step, we kept only those articles belonging to the Biomedical field, studying physiological signals. Next, from the retained articles, we further selected only those employing at least one of the recent entropy definitions examined in this review. The remaining articles formed the basis for our statistical analysis and the following discussion.

4. Search, Selection, and Descriptive Statistics

With the above methodology and with the applied search criteria, we initially collected 516 articles. By removing duplicates, the number of articles was reduced to 447.
Through the screening process, we limited the articles to those belonging to the Biomedical Engineering field and, specifically, those examining physiological signals. We read the abstracts of all 447 papers and selected 213 articles as relevant. Apart from the Biomedical Engineering field, two other application areas were also very popular: "fault diagnosis" and "marine".
In the final stage, we read the whole text of every paper and kept only those employing the examined entropy definitions. We have to note that we included papers applying the base entropy definitions (i.e., as they were initially proposed) and not subsequent modifications. Out of the 213 articles, 92 of them passed this final criterion.
We used a systematic approach in both the selection of entropy definitions and the selection of the papers reviewed. This limits the possibility of excluding one or more definitions. A review of the literature prior to the systematic one also helped in this direction.
In Figure 2 the filtering procedure is presented through a diagram, as suggested by the PRISMA protocol. At the top of this figure, we can see the identification phase, where all entropy definitions examined in this review are displayed. Each definition is accompanied by the number of articles initially selected as relevant. The identification phase is followed by the screening phase, the eligibility phase, and the inclusion phase.
Table 3 presents the number of articles referring to each definition. The first column indicates the total number of references identified for each definition. The following column indicates the number of articles containing the name of the entropy definition in their title, the third one in the keywords, and the last one in the abstract. We can see that Dispersion Entropy is the most mentioned one (31 references in total), followed by Bubble Entropy (25) and Distribution Entropy (22). The rest of the definitions present a slightly smaller number of citations. It should be noted that the sum of the column “As a Citation” is not 92 (the number of the examined articles), since many papers referenced more than one definition.
Table 4 shows the number of articles per examined method and physiological signal. For the same reason as in Table 3, the summation of the columns or the rows does not give the aforementioned numbers. An interesting observation resulting from this table is that Bubble Entropy is the only entropy definition that has been used for the analysis of all examined physiological signals. It should be noted that Diversity entropy does not present any citations, since its base definition has not been applied to Biomedical signals. We included it in the review, though, for completeness, since it is a recent entropy definition.
Since, from Table 4, we cannot deduce the number of reviewed papers for each physiological signal, we added a figure with this information. In Figure 3 we can see that the most commonly used signals are EEG (43) and ECG/HRV (41). Also included, but with a much smaller number of references, are EHG (5), EMG (5), CTG (2), and PPG (2).
Finally, Figure 4 shows the distribution of papers published in journals and conferences across different publishers. IEEE accounts for the largest share of papers (43), followed by MDPI (16, 12 of which are published in the “Entropy” journal), Springer (12), and Elsevier (9). An additional number of papers (12) can be found in various other publishers.

5. The Literature Review After the Systematic Article Selection

In the following, the retained articles are briefly examined by presenting the entropy definitions they used and the physiological signals on which they were applied.

5.1. Articles on Electroencephalogram

An EEG records electrical activity in the brain. Brain wave variability reflects fluctuations in these brain waves, which can indicate different mental states, cognitive load, or neurological health. Greater variability may suggest better brain adaptability and functioning.
Out of the 92 papers that passed the filters, 43 applied the examined entropy definitions to an EEG. Seizure detection was the most popular target, while many papers detected sleep stages or emotional states.
Dispersion Entropy is applied in [16,31,32,33], to classify healthy and epileptic subjects, and in [34], to distinguish between normal, ictal, and non-ictal states. Dispersion Entropy is utilized in [35], again for the classification of healthy and epileptic subjects. Increment entropy is employed in [15,36] to categorize healthy and epileptic subjects, as well as ictal and interictal phases. Dispersion Entropy is used in [37] to categorize healthy subjects, subjects during interictal epileptic activity, and seizure attacks. In [18] Slope entropy differentiates seizure and seizure-free recordings. Slope entropy is also featured in [33], along with Dispersion, Increment, and Phase entropy. Distribution Entropy was also used to analyze ictal and interictal patients [38], to detect epileptic seizures [39], and, finally, along with FB-Dispersion Entropy, was used for seizure/non-seizure classification [40].
Sleep stage detection and sleep disorder detection are also popular subjects. Bubble Entropy is used in [41] for sleep and wake state classification and in [42] for detecting sleep spindles. Dispersion Entropy detects sleep stages in [43]. Both Bubble and Dispersion Entropy discriminate wakefulness and sleep stage in [44]. Range entropy employs statistical analysis to classify subjects into distinct states of wakefulness, drowsiness, and sleep [45]. Gridded Distribution Entropy detects and classifies sleep disorders, such as insomnia, narcolepsy, periodic leg movement, nocturnal frontal lobe epilepsy, bruxism, REM (Rapid Eye Movement) behavior disorder, and sleep-disordered breathing in [46]. In [47] Dispersion and Bubble Entropy have been computed on an EEG to discriminate four states: wake, light sleep, deep sleep, REM, and non-REM.
A significant number of papers detected emotional states through the complexity of the EEG signal. Dispersion Entropy is used for this purpose in [48,49,50,51], Increment entropy in [52], Bubble Entropy in [53], and Distribution Entropy in [54]. The detection of the stress level, as well as the valence or the arousal, is examined in [55,56] by Dispersion Entropy. Depression is detected by Distribution Entropy in [57] and alertness by Range entropy in [58].
Some fields were less popular. In [59], FDispEn detected eye-blinking artifacts, while DispEn identified different movement tasks [60].
Bubble Entropy with FDispEn discriminated subjects with mild cognitive impairment or vascular dementia and controls [61], while Bubble Entropy, along with Attention and Symbolic Dynamic entropy, is used for the same purpose in [62]. Dispersion Entropy was employed for Attention-Deficit/Hyperactivity Disorder detection, for channel identification, for classification between healthy persons and persons with Alzheimer's disease, and, along with Slope entropy, for cognitive task classification [63,64,65,66]. Bubble and Slope entropy have been utilized in speech recognition tasks [67], while PhEn, GDEn, and CosEn have been applied to the evaluation of brain death and coma patients [68]. Finally, Slope entropy was applied to a single frontal EEG channel to predict the Depth of Anesthesia index [69], and Distribution Entropy was used for identity authentication [70].

5.2. Articles on Heart Signals

An ECG measures the electrical activity of the heart. Heart Rate Variability expresses the variations in time between heartbeats, which can indicate the balance between the sympathetic and parasympathetic nervous systems. Higher HRV often reflects better cardiovascular health and resilience.
This category also proved to be popular: out of the 92 papers that passed the filters, 41 applied the examined entropy definitions to heart signals.
Distribution Entropy is the definition with the largest number of applications on heart signals among the examined methods. It has been used for arrhythmia detection [71,72,73], for congestive heart failure [13,74,75], and for Chagas disease [76]; it has also been applied to data sets with young and elderly subjects [13,73,75,77] and for coronary artery disease before and after intervention [78]. Other applications of Distribution Entropy include sleep stage detection [79], rest–walk state [80], and rest–tilt state [81] recognition. Finally, Distribution Entropy has been applied for recognizing emotions [54] and for cognitive tasks [82].
Dispersion Entropy has been used for biometric simulation [83] and for sleep stage recognition [43]. It has also been used for atrial fibrillation in a four-class detection problem [84] and to distinguish normal recordings from recordings with premature beats [85]. Finally, coronary artery disease is detected using young and elderly subjects [86], while young and elderly subjects are also studied in [16,62].
Increment entropy has been employed for stress evaluation [87,88], epilepsy detection in conjunction with an EEG [15], and sudden cardiac death prediction [89].
Bubble Entropy has been employed in congestive heart failure recognition [23,24,90] and for a tolerance to spikes study [91]. It has also been employed for discrimination between normal sinus rhythm, congestive heart failure, coronary artery disease, and sudden cardiac death [92]. Slope entropy has been used for the classification of recordings of young and elderly subjects [18]. Fluctuation-based Dispersion Entropy, introduced in [17], was originally applied to analyze rat blood pressure signals.
Entropy of Entropy was applied to analyze heart signals before and after renal artery denervation [93] and to discriminate normal sinus rhythm (NSR), congestive heart failure (CHF), and atrial fibrillation subjects [26]. Gridded Distribution Entropy differentiated healthy young people from healthy aged adults, as well as distinguishing healthy subjects from patients with coronary artery disease [29].
There are many papers in which more than one entropy definition is exploited. Attention, Dispersion, Distribution, and Phase entropy have been applied to HRV estimation, Chronic Chagas disease, and Cardiomyopathy detection [94]. Bubble and Phase entropy were used for coronary artery disease, sudden cardiac arrest, and congestive heart failure detection in [95], while Dispersion, Slope, Increment, and Phase entropy were used for myocardial infarction in [33]. Bubble Entropy and Dispersion Entropy have been utilized for HRV estimation on normal sinus rhythm, congestive heart failure, and coronary artery disease recordings [96]. Phase, Bubble, and Gridded Distribution Entropy have been employed in [28] for congestive heart failure and normal sinus rhythm recordings, while Attention and Bubble Entropy for the study of recordings of young and elderly subjects, congestive heart failure, and atrial fibrillation patients [30].

5.3. Articles on Cardiotocogram

The Cardiotocography signal monitors the fetal heart rate and uterine contractions during pregnancy and labor. It helps assess fetal well-being and detects potential complications related to oxygen levels and uterine activity. Bubble Entropy has been used both to evaluate the cord artery pH in labor [97] and to assess the well-being of the fetuses [98].

5.4. Articles on Photoplethysmogram

The Photoplethysmogram signal measures changes in blood volume using light absorption, typically through a sensor on the skin. It is commonly used for monitoring heart rate, oxygen saturation, and vascular health. Changes in its variability can indicate stress and potential underlying health issues. In [99], Attention, Dispersion, and Slope entropy were used to extract complexity features to discriminate between normal and cerebral infarction subjects, and Bubble Entropy was used for blood pressure estimation and stratification [100].

5.5. Articles on Electrohysterography

The examined entropy definitions have also been applied to signals acquired through Electrohysterography. The EHG signal is a measurement of electrical activity in the uterus, often used to assess uterine contractions during labor. It provides insights into the coordination and strength of uterine muscle activity. There are five papers referencing one of the examined definitions. Phase entropy has been used to analyze term and pre-term signals [101] and signals from low-risk pregnant subjects during parturition [102], while Dispersion Entropy and Bubble Entropy were used for pre-term birth prediction [103]. Bubble Entropy was employed to compare uterine myoelectrical activity between single and multiple gestation women [104]. Finally, Dispersion Entropy has been used to investigate term and pre-term births [105].

5.6. Articles on Electromyography

Electromyography measures the electrical activity of muscles. Variability in EMG is important because it reflects motor unit recruitment and neuromuscular control. Healthy variability indicates flexible and adaptive motor output, while reduced or excessive variability can signal neurological or muscular dysfunction, such as in Parkinson’s disease or spasticity. Bubble Entropy was used both to distinguish EMG fatigue and non-fatigue signals [106] and to investigate the neuromuscular system [107]. A classification between fatigue and non-fatigue subjects was also performed in [108], based on the extracted Phase entropy features. Slope entropy was used to classify EMG signals of patients with myopathy, neuropathy patients, and healthy subjects [18].

6. Discussion

Entropy is a measure of the system’s uncertainty used to distinguish between order and randomness. Beyond the examined definitions in this review, numerous other entropy measures have been proposed in the literature, since the field has been active for much more than a decade. Researchers have long been intrigued by the idea of measuring the uncertainty of systems, proposing novel entropy definitions. As a result, valuable metrics have been developed to further investigate the properties of a time series. A variety of research fields have adopted these measures in their studies, including physiological signal analysis.
To date, not only do we not know which entropy definition performs best, but we also know that the performance of each definition depends on the data set. This remains a long-standing and interesting problem for researchers to investigate. What they currently do is employ more than one definition and combine the results. With the widespread use of machine learning, this has become an easier task and a more straightforward choice.
However, we have to note that researchers usually select the most popular or well-established methods for their studies or for producing features for machine learning, and they rarely justify their selections. It is true that they do not know a priori which method will perform best, and only after the computation can they assess its value. We hope that this review will encourage researchers to include more entropy definitions in their experiments.
A main disadvantage of all entropy measures is the dependence on parameters. Both Approximate and Sample Entropy, for example, depend on the appropriate selection of two parameters: the embedding dimension (m) and the threshold distance (r), with the latter being a real number. Values of m are practically limited to the range m = 1–4, but r is a real parameter with an infinite domain set. Optimal estimation of these parameters is difficult or impossible, since they depend on the data set. The typical values m = 2 and r = 0.2 are almost always used, a tacit admission that the problem is non-trivial. Parameter selection is a difficult problem for all entropy definitions, and some researchers propose entropy measures that reduce or avoid the need to select specific parameter values. Parameter estimation is still an open problem and an application-dependent issue.
There are some limitations of this work and potential biases. We used a single database, Scopus. Even though this was the most straightforward selection, it could narrow our search results. Our subject-area filters could also limit the extent of the search. For example, as already noted, we used the subject area "Computer Science and Engineering" and the term "Biomedical" in the search criteria as the umbrella of "Biomedical Engineering", since the latter was not an option in the available search criteria of Scopus. We narrowed our search to articles written in English. Finally, as stated before, we only included the original definitions and not their variants; for example, we considered only Bubble Entropy [23,24] and not multiscale Bubble Entropy [109]. There is always the risk of missing preprints. On the other hand, we did not limit our search to journals only, but included conference papers as well, as returned by the query to Scopus.

7. Conclusions

In this paper, we conducted a systematic and comprehensive review of recent entropy definitions proposed within the last decade (2015–2024). The PRISMA methodology was adopted as the most appropriate framework for conducting a systematic literature review. Using the Scopus database, we initially identified 516 papers related to physiological signals. After removing duplicates, this number was reduced to 447 and further refined to 213 by focusing exclusively on the Biomedical field. Ultimately, 92 papers that employed recent entropy definitions were included in our analysis.
We presented a table summarizing the total number of citations per entropy definition, along with the number of applications for each definition across various physiological signals. Dispersion Entropy had the highest number of citations (31 papers), followed by Bubble Entropy (25 papers) and Distribution Entropy (22 papers).
Additionally, we included a chart illustrating the distribution of the reviewed papers by publisher. IEEE accounted for the largest share (43 papers), followed by MDPI (16 papers).
A bar chart was also presented, showing the number of studies per physiological signal. The most frequently analyzed signals were EEG (43 papers) and ECG/HRV (41 papers). Other signals included, though with fewer references, were EHG (5), EMG (5), CTG (2), and PPG (2).
Finally, in the last section, each selected paper was individually reviewed. The methodology and objectives were documented, including the specific entropy definition applied and the physiological signal it was used on.

Author Contributions

Conceptualization, G.M.; methodology, D.P.; writing—original draft preparation, G.M. and D.P.; supervision, G.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Clausius, R. Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Ann. Der Phys. Und Chem. 1865, 125, 353–400. [Google Scholar] [CrossRef]
  2. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 623–656. [Google Scholar] [CrossRef]
  3. Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA 1991, 88, 2297–2301. [Google Scholar] [CrossRef]
  4. Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol.-Heart Circ. Physiol. 2000, 278, H2039–H2049. [Google Scholar] [CrossRef]
  5. Lake, D.E.; Richman, J.S.; Griffin, M.P.; Moorman, J.R. Sample entropy analysis of neonatal heart rate variability. Am. J. Physiol. Regul. Integr. Comp. Physiol. 2002, 283, R789–R797. [Google Scholar] [CrossRef] [PubMed]
  6. Bandt, C.; Pompe, B. Permutation Entropy: A Natural Complexity Measure for Time Series. Phys. Rev. Lett. 2002, 88, 174102. [Google Scholar] [CrossRef]
  7. Azami, H.; Faes, L.; Escudero, J.; Humeau-Heurtier, A.; Silva, L. Entropy Analysis of Univariate Biomedical Signals: Review and Comparison of Methods. In Frontiers in Entropy Across the Disciplines; Contemporary Mathematics and Its Applications: Monographs, Expositions and Lecture Notes; World Scientific: Singapore, 2022; Volume 4, pp. 233–286. [Google Scholar] [CrossRef]
  8. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef] [PubMed]
  9. Omidvarnia, A.; Mesbah, M.; Pedersen, M.; Jackson, G. Range Entropy: A bridge between signal complexity and self-similarity. Entropy 2018, 20, 962. [Google Scholar] [CrossRef] [PubMed]
  10. Chanwimalueang, T.; Mandic, D.P. Cosine similarity entropy: Self-correlation-based complexity analysis of dynamical systems. Entropy 2017, 19, 652. [Google Scholar] [CrossRef]
  11. Wang, X.; Si, S.; Li, Y. Multiscale Diversity Entropy: A novel dynamical measure for fault diagnosis of rotating machinery. IEEE Trans. Ind. Inform. 2021, 17, 5419–5429. [Google Scholar] [CrossRef]
  12. Chen, W.; Wang, Z.; Xie, H.; Yu, W. Characterization of surface EMG signal based on fuzzy entropy. IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 266–272. [Google Scholar] [CrossRef]
  13. Li, P.; Liu, C.; Li, K.; Zheng, D.; Liu, C.; Hou, Y. Assessing the complexity of short-term heartbeat interval series by distribution entropy. Med. Biol. Eng. Comput. 2015, 53, 77–87. [Google Scholar] [CrossRef]
  14. Liu, X.; Jiang, A.; Xu, N.; Xue, J. Increment entropy as a measure of complexity for time series. Entropy 2016, 18, 22. [Google Scholar] [CrossRef]
  15. Liu, X.; Wang, X.; Zhou, X.; Jiang, A. Appropriate use of the increment entropy for electrophysiological time series. Comput. Biol. Med. 2018, 95, 13–23. [Google Scholar] [CrossRef]
  16. Rostaghi, M.; Azami, H. Dispersion Entropy: A measure for time-series analysis. IEEE Signal Process. Lett. 2016, 23, 610–614. [Google Scholar] [CrossRef]
  17. Azami, H.; Escudero, J. Amplitude-and fluctuation-based dispersion entropy. Entropy 2018, 20, 210. [Google Scholar] [CrossRef] [PubMed]
  18. Cuesta-Frau, D. Slope Entropy: A new time series complexity estimator based on both symbolic patterns and amplitude information. Entropy 2019, 21, 1167. [Google Scholar] [CrossRef]
  19. Kouka, M.; Cuesta-Frau, D. Slope entropy characterisation: The role of the δ parameter. Entropy 2022, 24, 1456. [Google Scholar] [CrossRef] [PubMed]
  20. Kouka, M.; Cuesta-Frau, D.; Moltó-Gallego, V. Slope Entropy characterisation: An asymmetric approach to threshold parameters role analysis. Entropy 2024, 26, 82. [Google Scholar] [CrossRef]
  21. Li, Y.; Yang, Y.; Li, G.; Xu, M.; Huang, W. A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection. Mech. Syst. Signal Process. 2017, 91, 295–312. [Google Scholar] [CrossRef]
  22. Wang, J.; Li, T.; Xie, R.; Wang, X.M.; Cao, Y.Y. Fault feature extraction for multiple electrical faults of aviation electro-mechanical actuator based on symbolic dynamics entropy. In Proceedings of the 2015 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), Ningbo, China, 19 September 2015; IEEE: Shanghai, China, 2015; pp. 1–6. [Google Scholar] [CrossRef]
  23. Manis, G.; Aktaruzzaman, M.; Sassi, R. Bubble entropy: An entropy almost free of parameters. IEEE Trans. Biomed. Eng. 2017, 64, 2711–2718. [Google Scholar] [CrossRef]
  24. Manis, G.; Bodini, M.; Rivolta, M.W.; Sassi, R. A two-steps-ahead estimator for Bubble entropy. Entropy 2021, 23, 761. [Google Scholar] [CrossRef] [PubMed]
  25. Manis, G.; Platakis, D.; Sassi, R. Exploration on Bubble entropy. IEEE J. Biomed. Health Inform. 2025, 1–10. [Google Scholar] [CrossRef] [PubMed]
  26. Hsu, C.F.; Wei, S.Y.; Huang, H.P.; Hsu, L.; Chi, S.; Peng, C.K. Entropy of entropy: Measurement of dynamical complexity for biological systems. Entropy 2017, 19, 550. [Google Scholar] [CrossRef]
  27. Costa, M.; Goldberger, A.L.; Peng, C.-K. Multiscale entropy analysis of complex physiologic time series. Phys. Rev. Lett. 2002, 89, 068102. [Google Scholar] [CrossRef]
  28. Rohila, A.; Sharma, A. Phase entropy: A new complexity measure for heart rate variability. Physiol. Meas. 2019, 40, 105006. [Google Scholar] [CrossRef]
  29. Yan, C.; Li, P.; Liu, C.; Wang, X.; Yin, C.; Yao, L. Novel gridded descriptors of poincaré plot for analyzing heartbeat interval time-series. Comput. Biol. Med. 2019, 109, 280–289. [Google Scholar] [CrossRef]
  30. Yang, J.; Choudhary, G.I.; Rahardja, S.; Fränti, P. Classification of interbeat interval time-series using Attention entropy. IEEE Trans. Affect. Comput. 2023, 14, 321–330. [Google Scholar] [CrossRef]
  31. Arı, A. Analysis of EEG signal for seizure detection based on WPT. Electron. Lett. 2020, 56, 1381–1383. [Google Scholar] [CrossRef]
  32. Sukriti; Chakraborty, M.; Mitra, D. Dispersion entropy for the automated detection of epileptic seizures. In Proceedings of the 2020 IEEE 15th International Conference on Industrial and Information Systems (ICIIS), Rupnagar, India, 26–28 November 2020; IEEE: Dhanbad, India, 2020; pp. 204–207. [Google Scholar] [CrossRef]
  33. Huang, Y.; Zhao, Y.; Capstick, A.; Palermo, F.; Haddadi, H.; Barnaghi, P. Analyzing entropy features in time-series data for pattern recognition in neurological conditions. Artif. Intell. Med. 2024, 150, 102821. [Google Scholar] [CrossRef]
  34. Nabila, Y.; Zakaria, H. Epileptic seizure prediction from EEG signal recording using energy and dispersion entropy with SVM classifier. In Proceedings of the 2024 International Conference on Information Technology Research and Innovation (ICITRI), Jakarta, Indonesia, 5–6 September 2024; IEEE: Bandung, Indonesia, 2024; pp. 1–6. [Google Scholar] [CrossRef]
  35. Zaylaa, A.J.; Harb, A.; Khatib, F.I.; Nahas, Z.; Karameh, F.N. Entropy complexity analysis of electroencephalographic signals during pre-ictal, seizure and post-ictal brain events. In Proceedings of the 2015 International Conference on Advances in Biomedical Engineering (ICABME), Beirut, Lebanon, 16–18 September 2015; IEEE: Beirut, Lebanon, 2015; pp. 134–137. [Google Scholar] [CrossRef]
  36. Liu, X.; Jiang, A.; Xu, N. Automated epileptic seizure detection in EEGs using increment entropy. In Proceedings of the 2017 IEEE 30th Canadian Conference on Electrical and Computer Engineering (CCECE), Windsor, ON, Canada, 30 April–3 May 2017; IEEE: Changzhou, China, 2017; pp. 1–4. [Google Scholar] [CrossRef]
  37. Chen, Z.; Ma, X.; Fu, J.; Li, Y. Ensemble improved permutation entropy: A new approach for time series analysis. Entropy 2023, 25, 1175. [Google Scholar] [CrossRef]
  38. Li, P.; Yan, C.; Karmakar, C.; Liu, C. Distribution entropy analysis of epileptic EEG signals. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Jinan, China, 2015; pp. 4170–4173. [Google Scholar] [CrossRef]
  39. Ali, E.; Udhayakumar, R.K.; Angelova, M.; Karmakar, C. Performance analysis of entropy methods in detecting epileptic seizure from surface Electroencephalograms. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Mexico, 1–5 November 2021; IEEE: Dinajpur, Bangladesh, 2021; pp. 1082–1085. [Google Scholar] [CrossRef]
  40. Parui, S.; Samanta, D.; Chakravorty, N.; Mansoor, W.; Ghosh, U. A study on seizure detection performance in an automated process by extracting entropy features. In Proceedings of the 2022 5th International Conference on Signal Processing and Information Security (ICSPIS), Dubai, United Arab Emirates, 7–8 December 2022; IEEE: Kharagpur, India, 2022; pp. 86–91. [Google Scholar] [CrossRef]
  41. Khan, Y.A.; Tahreem, M.; Farooq, O. Single channel EEG based binary sleep and wake classification using entropy based features. In Proceedings of the 2023 International Conference on Recent Advances in Electrical, Electronics and Digital Healthcare Technologies (REEDCON), New Delhi, India, 1–3 May 2023; IEEE: Aligarh, India, 2023; pp. 100–105. [Google Scholar] [CrossRef]
  42. Manis, G.; Dudysova, D.; Gerla, V.; Lhotska, L. Detecting sleep spindles using entropy. In Proceedings of the European Medical and Biological Engineering Conference, Portorož, Slovenia, 29 November–3 December 2020; Springer: Ioannina, Greece, 2020; pp. 356–362. [Google Scholar] [CrossRef]
  43. Tripathy, R.; Acharya, U.R. Use of features from RR-time series and EEG signals for automated classification of sleep stages in deep neural network framework. Biocybern. Biomed. Eng. 2018, 38, 890–902. [Google Scholar] [CrossRef]
  44. Shahbakhti, M.; Beiramvand, M.; Eigirdas, T.; Solé-Casals, J.; Wierzchon, M.; Broniec-Wojcik, A.; Augustyniak, P.; Marozas, V. Discrimination of wakefulness from sleep stage I using nonlinear features of a single frontal EEG channel. IEEE Sens. J. 2022, 22, 6975–6984. [Google Scholar] [CrossRef]
  45. Hadra, M.; Omidvarnia, A.; Mesbah, M. Temporal complexity of EEG encodes human alertness. Physiol. Meas. 2022, 43, 095002. [Google Scholar] [CrossRef]
  46. Jain, R.; Ganesan, R.A. Effective diagnosis of sleep disorders using EEG and EOG signals. In Proceedings of the 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 15–19 July 2024; IEEE: Zanzibar, Tanzania, 2024; pp. 1–4. [Google Scholar] [CrossRef]
  47. Tripathy, R.K.; Ghosh, S.K.; Gajbhiye, P.; Acharya, U.R. Development of automated sleep stage classification system using multivariate projection-based fixed boundary empirical wavelet transform and entropy features extracted from multichannel EEG signals. Entropy 2020, 22, 1141. [Google Scholar] [CrossRef] [PubMed]
  48. García-Martínez, B.; Fernández-Caballero, A.; Alcaraz, R.; Martínez-Rodrigo, A. Application of dispersion entropy for the detection of emotions with Electroencephalographic signals. IEEE Trans. Cogn. Dev. Syst. 2021, 14, 1179–1187. [Google Scholar] [CrossRef]
  49. Ding, X.W.; Liu, Z.T.; Li, D.Y.; He, Y.; Wu, M. Electroencephalogram emotion recognition based on dispersion entropy feature extraction using random oversampling imbalanced data processing. IEEE Trans. Cogn. Dev. Syst. 2021, 14, 882–891. [Google Scholar] [CrossRef]
  50. Hu, S.J.; Liu, Z.T.; Ding, X.W. Electroencephalogram emotion recognition using variational modal decomposition based dispersion entropy feature extraction. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; IEEE: Wuhan, China, 2021; pp. 3323–3326. [Google Scholar] [CrossRef]
  51. Kumar, H.; Ganapathy, N.; Puthankattil, S.D.; Swaminathan, R. EEG based emotion recognition using entropy features and Bayesian optimized random forest. Curr. Dir. Biomed. Eng. 2021, 7, 767–770. [Google Scholar] [CrossRef]
  52. Cai, H.; Liu, X.; Ni, R.; Song, S.; Cangelosi, A. Emotion recognition through combining EEG and EOG over relevant channels with optimal windowing. IEEE Trans. Hum.-Mach. Syst. 2023, 53, 697–706. [Google Scholar] [CrossRef]
  53. Pusarla, N.; Singh, A.; Tripathi, S.; Vujji, A.; Pachori, R.B. Exploring CEEMDAN and LMD domains entropy features for decoding EEG-based emotion patterns. IEEE Access 2024, 12, 103606–103625. [Google Scholar] [CrossRef]
  54. Gargano, A.; Scilingo, E.P.; Nardelli, M. The dynamics of emotions: A preliminary study on continuously annotated arousal signals. In Proceedings of the 2022 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Giardini Naxos, Italy, 22–24 June 2022; IEEE: Pisa, Italy, 2022; pp. 1–6. [Google Scholar] [CrossRef]
  55. García-Martínez, B.; Fernández-Caballero, A.; Alcaraz, R.; Martínez-Rodrigo, A. Assessment of dispersion patterns for negative stress detection from Electroencephalographic signals. Pattern Recognit. 2021, 119, 108094. [Google Scholar] [CrossRef]
  56. Liu, Z.T.; Xu, X.; She, J.; Yang, Z.; Chen, D. Electroencephalography emotion recognition based on rhythm information entropy extraction. J. Adv. Comput. Intell. Intell. Inform. 2024, 28, 1095–1106. [Google Scholar] [CrossRef]
  57. García-Martínez, B.; Martínez-Rodrigo, A.; Zangroniz Cantabrana, R.; Pastor Garcia, J.M.; Alcaraz, R. Application of entropy-based metrics to identify emotional distress from Electroencephalographic recordings. Entropy 2016, 18, 221. [Google Scholar] [CrossRef]
  58. Hadra, M.G.; Maaly, I.A.; Dweib, I. Range entropy as a discriminant feature for EEG-based alertness states identification. In Proceedings of the 2020 IEEE-EMBS Conference on Biomedical Engineering and Sciences (IECBES), Langkawi, Malaysia, 1–3 March 2021; IEEE: Khartoum, Sudan, 2021; pp. 395–400. [Google Scholar] [CrossRef]
  59. Yadavalli, M.K.; Pamula, V.K. An efficient framework to automatic extract EOG artifacts from single channel EEG recordings. In Proceedings of the 2022 IEEE International Conference on Signal Processing and Communications (SPCOM), Bangalore, India, 11 July 2022; IEEE: Kakinada, India, 2022; pp. 1–5. [Google Scholar] [CrossRef]
  60. Al-Qazzaz, N.K.; Aldoori, A.A.; Ali, S.H.B.M.; Ahmad, S.A.; Mohammed, A.K.; Mohyee, M.I. EEG signal complexity measurements to enhance BCI-based stroke patients’ rehabilitation. Sensors 2023, 23, 3889. [Google Scholar] [CrossRef]
  61. Al-Qazzaz, N.K.; Ali, S.H.B.M.; Ahmad, S.A. Recognition enhancement of dementia patients’ working memory using entropy-based features and local tangent space alignment algorithm. In Advances in Non-Invasive Biomedical Signal Sensing and Processing with Machine Learning; Springer: Berlin/Heidelberg, Germany, 2023; pp. 345–373. [Google Scholar] [CrossRef]
  62. Al-Qazzaz, N.K.; Ali, S.H.B.M.; Ahmad, S.A. Deep learning model for prediction of dementia severity based on EEG signals. Al-Khwarizmi Eng. J. 2024, 20, 1–12. [Google Scholar] [CrossRef]
  63. Rezaeezadeh, M.; Shamekhi, S.; Shamsi, M. Attention deficit hyperactivity disorder diagnosis using non-linear univariate and multivariate EEG measurements: A preliminary study. Phys. Eng. Sci. Med. 2020, 43, 577–592. [Google Scholar] [CrossRef]
  64. Lee, J.H.; Choi, Y.S. A data driven Information theoretic feature extraction in EEG-based motor imagery BCI. In Proceedings of the 2019 International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 16–18 October 2019; IEEE: Seoul, Republic of Korea, 2019; pp. 1373–1376. [Google Scholar] [CrossRef]
  65. Azami, H.; Rostaghi, M.; Fernández, A.; Escudero, J. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer’s disease. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; IEEE: Toronto, ON, Canada, 2016; pp. 6417–6420. [Google Scholar] [CrossRef]
  66. Varshney, A.; Ghosh, S.K.; Padhy, S.; Tripathy, R.K.; Acharya, U.R. Automated classification of mental arithmetic tasks using recurrent neural network and entropy features obtained from multi-channel EEG signals. Electronics 2021, 10, 1079. [Google Scholar] [CrossRef]
  67. Dash, S.; Tripathy, R.K.; Dash, D.K.; Panda, G.; Pachori, R.B. Multiscale domain gradient boosting models for the automated recognition of imagined vowels using multichannel EEG signals. IEEE Sens. Lett. 2022, 6, 1–4. [Google Scholar] [CrossRef]
  68. Xiao, H.; Li, L.; Mandic, D.P. ClassA entropy for the analysis of structural complexity of physiological signals. In Proceedings of the ICASSP 2023–2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes, Greece, 4–10 June 2023; IEEE: London, UK, 2023; pp. 1–5. [Google Scholar] [CrossRef]
  69. Shahbakhti, M.; Beiramvand, M.; Far, S.M.; Solé-Casals, J.; Lipping, T.; Augustyniak, P. Utilizing slope entropy as an effective index for wearable EEG-based depth of anesthesia monitoring. In Proceedings of the 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 15–19 July 2024; IEEE: Kaunas, Lithuania, 2024; pp. 1–4. [Google Scholar] [CrossRef]
  70. Amalina, I.; Saidatul, A.; Fook, C.; Ibrahim, Z. Preliminary study on EEG based typing biometrics for user authentication using nonlinear features. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Bogor, Indonesia, 15–16 December 2018; IOP Publishing: Bristol, UK, 2019; Volume 557, p. 012035. [Google Scholar] [CrossRef]
  71. Karmakar, C.; Udhayakumar, R.K.; Palaniswami, M. Distribution entropy (DistEn): A complexity measure to detect arrhythmia from short length RR interval time series. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Melbourne, Australia, 2015; pp. 5207–5210. [Google Scholar] [CrossRef]
  72. Udhayakumar, R.K.; Karmakar, C.; Li, P.; Palaniswami, M. Effect of embedding dimension on complexity measures in identifying arrhythmia. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; IEEE: Melbourne, Australia, 2016; pp. 6230–6233. [Google Scholar] [CrossRef]
  73. Udhayakumar, R.K.; Karmakar, C.; Li, P.; Palaniswami, M. Influence of embedding dimension on distribution entropy in analyzing heart rate variability. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; IEEE: Melbourne, Australia, 2016; pp. 6222–6225. [Google Scholar] [CrossRef]
  74. Li, Y.; Li, P.; Karmakar, C.; Liu, C. Distribution entropy for short-term QT interval variability analysis: A comparison between the heart failure and normal control groups. In Proceedings of the 2015 Computing in Cardiology Conference (CinC), Nice, France, 6–9 September 2015; IEEE: Jinan, China, 2015; pp. 1153–1156. [Google Scholar] [CrossRef]
  75. Nardelli, M.; Citi, L.; Barbieri, R.; Valenza, G. Characterization of autonomic states by complex sympathetic and parasympathetic dynamics. Physiol. Meas. 2023, 44, 035004. [Google Scholar] [CrossRef]
  76. Silva, L.E.; Moreira, H.T.; Schmidt, A.; Romano, M.M.; Salgado, H.C.; Fazan, R.; Marin-Neto, J.A. The relationship between nonlinear heart rate variability and echocardiographic indices in Chagas disease. In Proceedings of the 2020 11th Conference of the European Study Group on Cardiovascular Oscillations (ESGCO), Pisa, Italy, 15 July 2020; IEEE: Ribeirão Preto, Brazil, 2020; pp. 1–2. [Google Scholar] [CrossRef]
  77. Udhayakumar, R.K.; Karmakar, C.; Li, P.; Palaniswami, M. Effect of data length and bin numbers on distribution entropy (DistEn) measurement in analyzing healthy aging. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Melbourne, Australia, 2015; pp. 7877–7880. [Google Scholar] [CrossRef]
  78. Yan, C.; Liu, C.; Yao, L.; Wang, X.; Wang, J.; Li, P. Short-term effect of percutaneous coronary intervention on heart rate variability in patients with coronary artery disease. Entropy 2021, 23, 540. [Google Scholar] [CrossRef] [PubMed]
  79. Yan, C.; Li, P.; Yang, M.; Li, Y.; Li, J.; Zhang, H.; Liu, C. Entropy analysis of heart rate variability in different sleep stages. Entropy 2022, 24, 379. [Google Scholar] [CrossRef]
  80. Shi, B.; Zhang, Y.; Yuan, C.; Wang, S.; Li, P. Entropy analysis of short-term heartbeat interval time series during regular walking. Entropy 2017, 19, 568. [Google Scholar] [CrossRef]
  81. Nardelli, M.; Citi, L.; Barbieri, R.; Valenza, G. Intrinsic complexity of sympathetic and parasympathetic dynamics from HRV series: A preliminary study on postural changes. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; IEEE: Pisa, Italy, 2020; pp. 2577–2580. [Google Scholar] [CrossRef]
  82. Arsac, L.M. Entropy-based multifractal testing of heart rate variability during cognitive-autonomic interplay. Entropy 2023, 25, 1364. [Google Scholar] [CrossRef] [PubMed]
  83. Aulia, S.; Hadiyoso, S.; Wijayanto, I.; Irawati, I.D. Biometric simulation based on single lead Electrocardiogram signal using dispersion entropy and linear discriminant analysis. Pattern Recognit. 2022, 16, 1359. [Google Scholar] [CrossRef]
  84. Nicolet, J.J.; Restrepo, J.F.; Schlotthauer, G. Classification of intracavitary electrograms in atrial fibrillation using information and complexity measures. Biomed. Signal Process. Control 2020, 57, 101753. [Google Scholar] [CrossRef]
  85. Kafantaris, E.; Piper, I.; Lo, T.Y.M.; Escudero, J. Application of dispersion entropy to healthy and pathological heartbeat ECG segments. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; IEEE: Edinburgh, UK, 2019; pp. 2269–2272. [Google Scholar] [CrossRef]
  86. Singh, R.S.; Gelmecha, D.J.; Sinha, D.K. Expert system based detection and classification of coronary artery disease using ranking methods and nonlinear attributes. Multimed. Tools Appl. 2022, 81, 19723–19750. [Google Scholar] [CrossRef]
  87. Tripathy, R.K.; Dash, D.K.; Ghosh, S.K.; Pachori, R.B. Detection of different stages of anxiety from single-channel wearable ECG sensor signal using Fourier–Bessel domain adaptive wavelet transform. IEEE Sens. Lett. 2023, 7, 1–4. [Google Scholar] [CrossRef]
  88. Deka, D.; Deka, B. Investigation on HRV signal dynamics for meditative intervention. In Soft Computing: Theories and Applications; Springer: Berlin/Heidelberg, Germany, 2020; pp. 993–1005. [Google Scholar] [CrossRef]
  89. Khazaei, M.; Raeisi, K.; Goshvarpour, A.; Ahmadzadeh, M. Early detection of sudden cardiac death using nonlinear analysis of heart rate variability. Biocybern. Biomed. Eng. 2018, 38, 931–940. [Google Scholar] [CrossRef]
  90. Manis, G.; Bodini, M.; Rivolta, M.W.; Sassi, R. Bubble entropy of fractional Gaussian noise and fractional Brownian motion. In Proceedings of the 2021 Computing in Cardiology (CinC), Brno, Czech Republic, 13–15 September 2021; IEEE: Ioannina, Greece, 2021; Volume 48, pp. 1–4. [Google Scholar] [CrossRef]
  91. Manis, G.; Sassi, R. Tolerance to spikes: A comparison of Sample and Bubble entropy. In Proceedings of the 2017 Computing in Cardiology (CinC), Rennes, France, 24–27 September 2017; IEEE: Ioannina, Greece, 2017; pp. 1–4. [Google Scholar] [CrossRef]
  92. Rohila, A.; Sharma, A. Detection of sudden cardiac death by a comparative study of heart rate variability in normal and abnormal heart conditions. Biocybern. Biomed. Eng. 2020, 40, 1140–1154. [Google Scholar] [CrossRef]
  93. Lin, P.L.; Lin, P.Y.; Huang, H.P.; Vaezi, H.; Liu, L.Y.M.; Lee, Y.H.; Huang, C.C.; Yang, T.F.; Hsu, L.; Hsu, C.F. The autonomic balance of heart rhythm complexity after renal artery denervation: Insight from entropy of entropy and average entropy analysis. BioMed. Eng. Online 2022, 21, 32. [Google Scholar] [CrossRef]
  94. Silva, L.E.V.; Moreira, H.T.; de Oliveira, M.M.; Cintra, L.S.S.; Salgado, H.C.; Fazan, R., Jr.; Tinós, R.; Rassi, A., Jr.; Schmidt, A.; Marin-Neto, J.A. Heart rate variability as a biomarker in patients with chronic Chagas cardiomyopathy with or without concomitant digestive involvement and its relationship with the Rassi score. BioMed. Eng. Online 2022, 21, 44. [Google Scholar] [CrossRef] [PubMed]
  95. Rohila, A.; Sharma, A. Correlation between heart rate variability features. In Proceedings of the 2020 7th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 27–28 February 2020; IEEE: Roorkee, India, 2020; pp. 669–674. [Google Scholar] [CrossRef]
  96. Saxena, S.; Gupta, V.K.; Hrisheekesha, P. Coronary heart disease detection using nonlinear features and online sequential extreme learning machine. Biomed. Eng. Appl. Basis Commun. 2019, 31, 1950046. [Google Scholar] [CrossRef]
  97. Manis, G.; Sassi, R. Relation between fetal HRV and value of umbilical cord artery pH in labor, a study with entropy measures. In Proceedings of the 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS), Thessaloniki, Greece, 22–24 June 2017; IEEE: Ioannina, Greece, 2017; pp. 272–277. [Google Scholar] [CrossRef]
  98. Rivolta, M.W. Information theory and fetal heart rate variability analysis. In Innovative Technologies and Signal Processing in Perinatal Medicine: Volume 2; Springer: Berlin/Heidelberg, Germany, 2023; pp. 171–188. [Google Scholar] [CrossRef]
  99. Gupta, S.; Singh, A.; Sharma, A. CIsense: An automated framework for early screening of cerebral infarction using PPG sensor data. Biomed. Eng. Lett. 2023, 14, 199–207. [Google Scholar] [CrossRef]
  100. Gupta, S.; Singh, A.; Sharma, A.; Tripathy, R.K. Higher order derivative-based integrated model for cuff-less blood pressure estimation and stratification using PPG signals. IEEE Sens. J. 2022, 22, 22030–22039. [Google Scholar] [CrossRef]
  101. Zamudio-De Hoyos, J.R.; Vázquez-Flores, D.; Pliego-Carrillo, A.C.; Ledesma-Ramírez, C.I.; Mendieta-Zerón, H.; Reyes-Lagos, J.J. Nonlinearity of Electrohysterographic signals is diminished in active preterm labor. In Proceedings of the Congreso Nacional de Ingeniería Biomédica, Puerto Vallarta, México, 6–8 October 2022; Springer: Toluca, Mexico, 2022; pp. 302–307. [Google Scholar] [CrossRef]
  102. Reyes-Lagos, J.J.; Pliego-Carrillo, A.C.; Ledesma-Ramírez, C.I.; Peña-Castillo, M.Á.; García-González, M.T.; Pacheco-López, G.; Echeverría, J.C. Phase entropy analysis of Electrohysterographic data at the third trimester of human pregnancy and active parturition. Entropy 2020, 22, 798. [Google Scholar] [CrossRef]
  103. Nieto-del Amor, F.; Beskhani, R.; Ye-Lin, Y.; Garcia-Casado, J.; Diaz-Martinez, A.; Monfort-Ortiz, R.; Diago-Almela, V.J.; Hao, D.; Prats-Boluda, G. Assessment of dispersion and bubble entropy measures for enhancing preterm birth prediction based on Electrohysterographic signals. Sensors 2021, 21, 6071. [Google Scholar] [CrossRef]
  104. Diaz-Martinez, A.; Prats-Boluda, G.; Monfort-Ortiz, R.; Garcia-Casado, J.; Roca-Prats, A.; Tormo-Crespo, E.; Nieto-del Amor, F.; Diago-Almela, V.J.; Ye-Lin, Y. Overdistention accelerates electrophysiological changes in uterine muscle towards labour in multiple gestations. IRBM 2024, 45, 100837. [Google Scholar] [CrossRef]
  105. Nieto-del Amor, F.; Ye Lin, Y.; Garcia-Casado, J.; Díaz-Martínez, M.d.A.; González Martínez, M.; Monfort-Ortiz, R.; Prats-Boluda, G. Dispersion entropy: A measure of Electrohysterographic complexity for preterm labor discrimination. In Proceedings of the 14th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2021), Vienna, Austria, 11–13 February 2021; Curran Associates, Inc.: Red Hook, NY, USA, 2021; pp. 260–267. [Google Scholar] [CrossRef]
  106. Sasidharan, D.; Venugopal, G.; Swaminathan, R. Complexity analysis of surface Electromyography signals under fatigue using Hjorth parameters and bubble entropy. J. Mech. Med. Biol. 2023, 23, 2340051. [Google Scholar] [CrossRef]
  107. Sasidharan, D.; Kavyamol, K.; Subramanian, S.; Venugopal, G. Chaotic complexity determination of surface EMG signals. In Proceedings of the Indian Conference on Applied Mechanics, Mandi, India, 14–16 December 2022; Springer: Coimbatore, India, 2022; pp. 323–329. [Google Scholar] [CrossRef]
  108. Sowmya, S.; Banerjee, S.S.; Swaminathan, R. Assessment of muscle fatigue using phase entropy of sEMG signals during dynamic contractions of biceps brachii. In Proceedings of the 2023 9th International Conference on Control, Decision and Information Technologies (CoDIT), Rome, Italy, 3–6 July 2023; IEEE: Tamil Nadu, India, 2023; pp. 2253–2256. [Google Scholar] [CrossRef]
  109. Zhang, J.; Wang, C.; Gui, P.; Wang, M.; Zou, T. State Assessment of Rolling Bearings Based on the Multiscale Bubble Entropy. In Proceedings of the 2021 International Conference on Electronics, Circuits and Information Engineering (ECIE), Zhengzhou, China, 22–24 January 2021; IEEE: Beijing, China, 2021; pp. 179–182. [Google Scholar] [CrossRef]
Figure 1. Schematic illustration of the mapping method. Figure from the “Entropy” journal [18].
Figure 2. The PRISMA diagram.
Figure 3. Number of applications per physiological signal.
Figure 4. Number of articles per publisher.
Table 1. Description of symbols.

Symbol | Description
$x_i$ | Sample from the time series
$N$ | Length of the time series
$X = x_1, x_2, \ldots, x_N$ | Original time series
$m$ | Embedding dimension
$d$ | Time delay
$x_i^m = (x_i, x_{i+d}, \ldots, x_{i+(m-1)d})$ | Embedded vector
$X^m = x_1^m, x_2^m, \ldots, x_{N-m+1}^m$ | Embedded series from $X$
$U = u_1, u_2, \ldots, u_N$ | Series of symbols
$U^m = u_1^m, u_2^m, \ldots, u_{N-m+1}^m$ | Embedded series of symbols
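For readers who wish to experiment with these quantities, the short sketch below shows how the embedded series $X^m$ of Table 1 can be built from a raw time series. It is a minimal illustration only: the function name `embed` and the use of NumPy are our own choices and are not taken from any of the reviewed papers.

```python
import numpy as np

def embed(x, m, d=1):
    """Build the embedded series X^m of Table 1: one row per vector
    x_i^m = (x_i, x_{i+d}, ..., x_{i+(m-1)d}).
    With d = 1 this yields N - m + 1 vectors, as in the table."""
    x = np.asarray(x, dtype=float)
    n_vectors = len(x) - (m - 1) * d
    if n_vectors <= 0:
        raise ValueError("time series too short for the chosen m and d")
    return np.array([x[i:i + (m - 1) * d + 1:d] for i in range(n_vectors)])

# Toy example: a series of length N = 6 embedded with m = 3, d = 1
# gives the 4 vectors (1,2,3), (2,3,4), (3,4,5), (4,5,6).
print(embed([1, 2, 3, 4, 5, 6], m=3))
```

Distance-based definitions then compare these embedded vectors pairwise, whereas the symbolic definitions of Table 2 first map the samples to a symbol series $U$ before embedding.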
Table 2. Entropy definitions categorized by their conceptual families.

Definition Family | Definitions
Embedding and Distance-Based | Range Entropy; Cosine Similarity Entropy; Diversity Entropy; Distribution Entropy
Symbolic and Ordinal Pattern-Based | Increment Entropy; Dispersion Entropy; F-B Dispersion Entropy; Slope Entropy; Symbolic Dynamic Entropy
Complexity Estimation Based on Sorting Effort | Bubble Entropy
Multiscale and Hierarchical Definitions | Entropy of Entropy
Geometric or Phase-Space Definitions | Phase Entropy; Gridded Distribution Entropy
Pattern-Detection Definitions | Attention Entropy
Table 3. Number of articles per method.

Definition | As a Citation | In Title | In Keywords | In Abstract
Dispersion Entropy | 31 | 12 | 16 | 31
Bubble Entropy | 25 | 6 | 9 | 19
Distribution Entropy | 22 | 6 | 5 | 20
Increment Entropy | 8 | 2 | 5 | 8
Phase Entropy | 9 | 3 | 3 | 9
Slope Entropy | 6 | 2 | 1 | 4
F-B Dispersion Entropy | 4 | 1 | 1 | 3
Attention Entropy | 5 | 1 | 1 | 3
Gridded Distribution Entropy | 5 | 1 | 1 | 4
Range Entropy | 2 | 1 | 2 | 2
Entropy of Entropy | 3 | 1 | - | 3
Symbolic Dynamic Entropy | 2 | - | - | 1
Cosine Entropy | 2 | - | - | 1
Diversity Entropy ¹ | - | - | - | -
¹ Even though Diversity Entropy had no citations, it is included since it is a recently proposed definition.
Table 4. Number of articles per method and physiological signal.

Definition | EEG | ECG/HRV | PPG | EHG | EMG | CTG
Dispersion Entropy | 20 | 10 | 1 | 2 | - | -
Bubble Entropy | 8 | 10 | 1 | 2 | 2 | 2
Distribution Entropy | 5 | 15 | - | - | 1 | -
Increment Entropy | 5 | 5 | - | - | - | -
Phase Entropy | 2 | 4 | 2 | - | 1 | -
Slope Entropy | 4 | 2 | 1 | - | 1 | -
F-B Dispersion Entropy | 3 | - | - | - | - | -
Attention Entropy | 1 | 2 | - | - | - | -
Range Entropy | 2 | - | - | - | - | -
Gridded Distribution Entropy | 2 | 2 | - | - | - | -
Entropy of Entropy | - | 2 | - | - | - | -
Symbolic Dynamic Entropy | 1 | - | - | - | - | -
Cosine Entropy | 1 | - | - | - | - | -
Diversity Entropy | - | - | - | - | - | -