Special Issue "Evaluation of Systems’ Irregularity and Complexity: Sample Entropy, Its Derivatives, and Their Applications across Scales and Disciplines"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 April 2018).

Printed Edition Available!
A printed edition of this Special Issue is available.

Special Issue Editor

Guest Editor
Dr. Anne Humeau-Heurtier

Laboratoire Angevin de Recherche en Ingénierie des Systèmes (LARIS), University of Angers, IUT, GEII Department, 4 boulevard Lavoisier, BP 42018, 49016 Angers cedex, France
Interests: entropy; multiscale entropy; nonlinear analysis; empirical mode decomposition; biomedical data

Special Issue Information

Dear Colleagues,

Based on information theory, a number of entropy measures have been proposed since the 1990s to assess systems’ irregularity: approximate entropy, sample entropy, permutation entropy, intrinsic mode entropy, and dispersion entropy, to name only a few. Among them, sample entropy has been used in a very wide variety of disciplines, for both univariate and multivariate data. However, improvements to the sample entropy algorithm are still being proposed, because sample entropy is unstable for short time series, may be sensitive to its parameter values, and can be too time-consuming for long data.
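For readers new to the measure, the following is a minimal sketch of sample entropy in the standard Richman–Moorman formulation; the defaults m = 2 and r = 0.2 × SD are conventional choices, not values prescribed by this Special Issue, and the function name is illustrative.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -log(A/B), where B counts pairs of length-m
    templates within tolerance r (Chebyshev norm) and A counts the
    corresponding length-(m+1) pairs; self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)                  # tolerance as a fraction of the SD

    def count(length):
        n = len(x) - length              # number of templates considered
        t = np.array([x[i:i + length] for i in range(n)])
        total = 0
        for i in range(n - 1):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            total += int(np.sum(d <= tol))
        return total

    a, b = count(m + 1), count(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal (e.g., a slowly sampled sinusoid) yields a markedly lower value than white noise of the same length, which is the intuition behind using the measure as an irregularity index.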

At the same time, it is worth noting that sample entropy does not take into account the multiple temporal scales inherent to complex systems. It is maximized for completely random processes and is used only to quantify the irregularity of signals on a single scale. This is why analyses of irregularity—with sample entropy or its derivatives—at multiple time scales have been proposed to assess systems’ complexity.
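In the classical multiscale entropy formulation of Costa et al., the multiple time scales mentioned above are obtained by coarse-graining: the series is averaged over non-overlapping windows of length equal to the scale factor, and the entropy estimator is re-applied at each scale. A minimal sketch of that coarse-graining step (the function name is illustrative):

```python
import numpy as np

def coarse_grain(x, scale):
    """Classical multiscale-entropy coarse-graining: replace the series
    by the means of consecutive, non-overlapping windows of length `scale`."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale                  # number of complete windows
    return x[:n * scale].reshape(n, scale).mean(axis=1)

# An entropy estimator (e.g., sample entropy) is then evaluated on
# coarse_grain(x, s) for each scale s = 1, 2, ..., producing a profile
# of irregularity versus temporal scale.
```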

This Special Issue invites contributions presenting new and original research based on the use of sample entropy or its derivatives. Studies applying sample entropy or its derivatives on a single scale or on multiple scales, as well as applications to any kind of time series, are welcome. Furthermore, manuscripts summarizing the most recent state of the art on this topic will also be considered.

Prof. Anne Humeau-Heurtier
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complexity
  • entropy measure
  • interdisciplinary applications
  • irregularity
  • sample entropy
  • multiscale entropy

Published Papers (17 papers)


Editorial

Open Access Editorial
Evaluation of Systems’ Irregularity and Complexity: Sample Entropy, Its Derivatives, and Their Applications across Scales and Disciplines
Entropy 2018, 20(10), 794; https://doi.org/10.3390/e20100794
Received: 8 October 2018 / Accepted: 9 October 2018 / Published: 16 October 2018
Cited by 1 | PDF Full-text (171 KB) | HTML Full-text | XML Full-text

Research

Open Access Article
Research on Recognition Method of Driving Fatigue State Based on Sample Entropy and Kernel Principal Component Analysis
Entropy 2018, 20(9), 701; https://doi.org/10.3390/e20090701
Received: 29 July 2018 / Revised: 7 September 2018 / Accepted: 10 September 2018 / Published: 13 September 2018
Cited by 1 | PDF Full-text (866 KB) | HTML Full-text | XML Full-text
Abstract
In view of the nonlinear characteristics of electroencephalography (EEG) signals collected in driving fatigue state recognition research, and given that the recognition accuracy of EEG-based driving fatigue state recognition methods is still unsatisfactory, this paper proposes a driving fatigue recognition method based on sample entropy (SE) and kernel principal component analysis (KPCA), which combines the high recognition accuracy of sample entropy with the advantages of KPCA in nonlinear dimensionality reduction and its strong nonlinear processing capability. Using a support vector machine (SVM) classifier, the proposed method (called SE_KPCA) is tested on the EEG data and compared with methods based on fuzzy entropy (FE), combination entropy (CE), and the three entropies (SE, FE, and CE) merged with KPCA. Experiment results show that the method is effective. Full article

Open Access Article
Sample Entropy of the Heart Rate Reflects Properties of the System Organization of Behaviour
Entropy 2018, 20(6), 449; https://doi.org/10.3390/e20060449
Received: 27 April 2018 / Revised: 4 June 2018 / Accepted: 5 June 2018 / Published: 8 June 2018
Cited by 2 | PDF Full-text (3014 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Cardiac activity is involved in the processes of organization of goal-directed behaviour. Each behavioural act is aimed at achieving an adaptive outcome and it is subserved by the actualization of functional systems consisting of elements distributed across the brain and the rest of the body. This paper proposes a system-evolutionary view on the activity of the heart and its variability. We have compared the irregularity of the heart rate, as measured by sample entropy (SampEn), in behaviours that are subserved by functional systems formed at different stages of individual development, which implement organism-environment interactions with different degrees of differentiation. The results have shown that SampEn of the heart rate was higher during performing tasks that included later acquired knowledge (foreign language vs. native language; mathematical vocabulary vs. general vocabulary) and decreased in the stress and alcohol conditions, as well as at the beginning of learning. These results are in line with the hypothesis that irregularity of the heart rate reflects the properties of a set of functional systems subserving current behaviour, with higher irregularity corresponding to later acquired and more complex behaviour. Full article

Open Access Article
Complexity Analysis of Carbon Market Using the Modified Multi-Scale Entropy
Entropy 2018, 20(6), 434; https://doi.org/10.3390/e20060434
Received: 28 April 2018 / Revised: 31 May 2018 / Accepted: 1 June 2018 / Published: 5 June 2018
Cited by 3 | PDF Full-text (1910 KB) | HTML Full-text | XML Full-text
Abstract
Carbon markets provide a market-based way to reduce climate pollution. Subject to general market regulations, the major existing emission trading markets present complex characteristics. This paper analyzes the complexity of the carbon market by using multi-scale entropy, taking China’s pilot carbon markets as examples. A moving average is adopted to extract the scales because of the short length of the data set. The results show a low level of complexity, suggesting that China’s pilot carbon markets are quite immature and lack market efficiency. However, the complexity varies over different time scales: China’s carbon markets (except for the Chongqing pilot) are more complex in the short period than in the long term. Furthermore, the complexity level in most pilot markets increases as the markets develop, showing an improvement in market efficiency. All these results demonstrate that an effective carbon market is required for emission trading to function fully. Full article

Open Access Article
Hierarchical Cosine Similarity Entropy for Feature Extraction of Ship-Radiated Noise
Entropy 2018, 20(6), 425; https://doi.org/10.3390/e20060425
Received: 30 April 2018 / Revised: 25 May 2018 / Accepted: 31 May 2018 / Published: 1 June 2018
Cited by 5 | PDF Full-text (4251 KB) | HTML Full-text | XML Full-text
Abstract
The classification performance of passive sonar can be improved by extracting the features of ship-radiated noise. Traditional feature extraction methods neglect the nonlinear features in ship-radiated noise, such as entropy. The multiscale sample entropy (MSE) algorithm has been widely used for quantifying the entropy of a signal, but there are still some limitations. To remedy this, the hierarchical cosine similarity entropy (HCSE) is proposed in this paper. Firstly, the hierarchical decomposition is utilized to decompose a time series into some subsequences. Then, the sample entropy (SE) is modified by utilizing Shannon entropy rather than conditional entropy and employing angular distance instead of Chebyshev distance. Finally, the complexity of each subsequence is quantified by the modified SE. Simulation results show that the HCSE method overcomes some limitations in MSE. For example, undefined entropy is not likely to occur in HCSE, and it is more suitable for short time series. Compared with MSE, the experimental results illustrate that the classification accuracy of real ship-radiated noise is significantly improved from 75% to 95.63% by using HCSE. Consequently, the proposed HCSE can be applied in practical applications. Full article
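The hierarchical decomposition step described in this abstract can be illustrated with the standard pairwise operators (low-frequency averaging and high-frequency differencing of adjacent samples). This sketch shows a single decomposition level only and is not the authors' full HCSE implementation:

```python
import numpy as np

def hierarchical_components(x):
    """One level of hierarchical decomposition: the averaging operator
    keeps low-frequency content, the differencing operator keeps
    high-frequency content, each at half the original length."""
    x = np.asarray(x, dtype=float)
    pairs = x[:len(x) // 2 * 2].reshape(-1, 2)   # adjacent sample pairs
    low = (pairs[:, 0] + pairs[:, 1]) / 2.0      # averaging operator
    high = (pairs[:, 0] - pairs[:, 1]) / 2.0     # differencing operator
    return low, high
```

Deeper levels re-apply both operators to each subsequence; HCSE then scores every resulting subsequence with the modified (cosine-similarity) sample entropy.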

Open Access Article
Centered and Averaged Fuzzy Entropy to Improve Fuzzy Entropy Precision
Entropy 2018, 20(4), 287; https://doi.org/10.3390/e20040287
Received: 16 March 2018 / Revised: 13 April 2018 / Accepted: 13 April 2018 / Published: 15 April 2018
Cited by 3 | PDF Full-text (460 KB) | HTML Full-text | XML Full-text
Abstract
Several entropy measures are now widely used to analyze real-world time series. Among them, we can cite approximate entropy, sample entropy and fuzzy entropy (FuzzyEn), the latter one being probably the most efficient among the three. However, FuzzyEn precision depends on the number of samples in the data under study. The longer the signal, the better it is. Nevertheless, long signals are often difficult to obtain in real applications. This is why we herein propose a new FuzzyEn that presents better precision than the standard FuzzyEn. This is performed by increasing the number of samples used in the computation of the entropy measure, without changing the length of the time series. Thus, for the comparisons of the patterns, the mean value is no longer a constraint. Moreover, translated patterns are not the only ones considered: reflected, inversed, and glide-reflected patterns are also taken into account. The new measure (so-called centered and averaged FuzzyEn) is applied to synthetic and biomedical signals. The results show that the centered and averaged FuzzyEn leads to more precise results than the standard FuzzyEn: the relative percentile range is reduced compared to the standard sample entropy and fuzzy entropy measures. The centered and averaged FuzzyEn could now be used in other applications to compare its performances to those of other already-existing entropy measures. Full article

Open Access Article
Coarse-Graining Approaches in Univariate Multiscale Sample and Dispersion Entropy
Entropy 2018, 20(2), 138; https://doi.org/10.3390/e20020138
Received: 1 December 2017 / Revised: 15 February 2018 / Accepted: 16 February 2018 / Published: 22 February 2018
Cited by 8 | PDF Full-text (3483 KB) | HTML Full-text | XML Full-text
Abstract
The evaluation of complexity in univariate signals has attracted considerable attention in recent years. This is often done using the framework of Multiscale Entropy, which entails two basic steps: coarse-graining to consider multiple temporal scales, and evaluation of irregularity for each of those scales with entropy estimators. Recent developments in the field have proposed modifications to this approach to facilitate the analysis of short time series. However, the role of downsampling in the classical coarse-graining process and its relationship with alternative filtering techniques have not yet been systematically explored. Here, we assess the impact of coarse-graining in multiscale entropy estimations based on both Sample Entropy and Dispersion Entropy. We compare the classical moving average approach with low-pass Butterworth filtering, both with and without downsampling, and with empirical mode decomposition in Intrinsic Multiscale Entropy, in selected synthetic data and two real physiological datasets. The results show that when the sampling frequency is low or high, downsampling respectively decreases or increases the entropy values. Our results suggest that, when dealing with long signals and relatively low levels of noise, the refined composite method makes little difference in the quality of the entropy estimation at the expense of considerable additional computational cost. It is also found that downsampling within the coarse-graining procedure may not be required to quantify the complexity of signals, especially for short ones. Overall, we expect these results to contribute to the ongoing discussion about the development of stable, fast and robust-to-noise multiscale entropy techniques suited for either short or long recordings. Full article

Open Access Article
Application of Multiscale Entropy in Assessing Plantar Skin Blood Flow Dynamics in Diabetics with Peripheral Neuropathy
Entropy 2018, 20(2), 127; https://doi.org/10.3390/e20020127
Received: 22 January 2018 / Revised: 10 February 2018 / Accepted: 12 February 2018 / Published: 15 February 2018
Cited by 4 | PDF Full-text (5452 KB) | HTML Full-text | XML Full-text
Abstract
Diabetic foot ulcer (DFU) is a common complication of diabetes mellitus, while tissue ischemia caused by impaired vasodilatory response to plantar pressure is thought to be a major factor of the development of DFUs, which has been assessed using various measures of skin blood flow (SBF) in the time or frequency domain. These measures, however, are incapable of characterizing nonlinear dynamics of SBF, which is an indicator of pathologic alterations of microcirculation in the diabetic foot. This study recruited 18 type 2 diabetics with peripheral neuropathy and eight healthy controls. SBF at the first metatarsal head in response to locally applied pressure and heating was measured using laser Doppler flowmetry. A multiscale entropy algorithm was utilized to quantify the regularity degree of the SBF responses. The results showed that during reactive hyperemia and thermally induced biphasic response, the regularity degree of SBF in diabetics underwent only small changes compared to baseline and significantly differed from that in controls at multiple scales (p < 0.05). On the other hand, the transition of regularity degree of SBF in diabetics distinctively differed from that in controls (p < 0.05). These findings indicated that multiscale entropy could provide a more comprehensive assessment of impaired microvascular reactivity in the diabetic foot compared to other entropy measures based on only a single scale, which strengthens the use of plantar SBF dynamics to assess the risk for DFU. Full article

Open Access Article
A Novel Multivariate Sample Entropy Algorithm for Modeling Time Series Synchronization
Entropy 2018, 20(2), 82; https://doi.org/10.3390/e20020082
Received: 29 November 2017 / Revised: 15 January 2018 / Accepted: 19 January 2018 / Published: 24 January 2018
Cited by 1 | PDF Full-text (3297 KB) | HTML Full-text | XML Full-text
Abstract
Approximate and sample entropy (AE and SE) provide robust measures of the deterministic or stochastic content of a time series (regularity), as well as the degree of structural richness (complexity), through operations at multiple data scales. Despite the success of the univariate algorithms, multivariate sample entropy (mSE) algorithms are still in their infancy and have considerable shortcomings. Not only are existing mSE algorithms unable to analyse within- and cross-channel dynamics, they can counter-intuitively interpret increased correlation between variates as decreased regularity. To this end, we first revisit the embedding of multivariate delay vectors (DVs), critical to ensuring physically meaningful and accurate analysis. We next propose a novel mSE algorithm and demonstrate its improved performance over existing work, for synthetic data and for classifying wake and sleep states from real-world physiological data. It is furthermore revealed that, unlike other tools, such as the correlation of phase synchrony, synchronized regularity dynamics are uniquely identified via mSE analysis. In addition, a model for the operation of this novel algorithm in the presence of white Gaussian noise is presented, which, in contrast to the existing algorithms, reveals for the first time that increasing correlation between different variates reduces entropy. Full article

Open Access Article
Changes in the Complexity of Heart Rate Variability with Exercise Training Measured by Multiscale Entropy-Based Measurements
Entropy 2018, 20(1), 47; https://doi.org/10.3390/e20010047
Received: 12 December 2017 / Revised: 4 January 2018 / Accepted: 8 January 2018 / Published: 17 January 2018
Cited by 4 | PDF Full-text (904 KB) | HTML Full-text | XML Full-text
Abstract
Quantifying complexity from heart rate variability (HRV) series is a challenging task, and multiscale entropy (MSE), along with its variants, has been demonstrated to be one of the most robust approaches to achieve this goal. Although physical training is known to be beneficial, there is little information about the long-term complexity changes induced by physical conditioning. The present study aimed to quantify the changes in physiological complexity elicited by physical training through multiscale entropy-based complexity measurements. Rats were subject to a protocol of medium-intensity training (n = 13) or a sedentary protocol (n = 12). One-hour HRV series were obtained from all conscious rats five days after the experimental protocol. We estimated MSE, multiscale dispersion entropy (MDE) and multiscale SDiff_q from the HRV series. Multiscale SDiff_q is a recent approach that accounts for entropy differences between a given time series and its shuffled dynamics. From SDiff_q, three attributes (q-attributes) were derived, namely SDiff_qmax, q_max and q_zero. MSE, MDE and multiscale q-attributes presented similar profiles, except for SDiff_qmax. q_max showed significant differences between trained and sedentary groups on time scales 6 to 20. Results suggest that physical training increases the system complexity and that multiscale q-attributes provide valuable information about the physiological complexity. Full article

Open Access Article
Low Computational Cost for Sample Entropy
Entropy 2018, 20(1), 61; https://doi.org/10.3390/e20010061
Received: 28 November 2017 / Revised: 24 December 2017 / Accepted: 9 January 2018 / Published: 13 January 2018
Cited by 10 | PDF Full-text (464 KB) | HTML Full-text | XML Full-text
Abstract
Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regularity/complexity of a time series. On the other hand, it is a computationally expensive method which may require a large amount of time when used on long series or with a large number of signals. The computationally intensive part is the similarity check between points in m-dimensional space. In this paper, we propose new algorithms, or extend already proposed ones, aiming to compute Sample Entropy quickly. All algorithms return exactly the same value for Sample Entropy, and no approximation techniques are used. We compare and evaluate them using cardiac inter-beat (RR) time series. We investigate three algorithms. The first one is an extension of the kd-trees algorithm, customized for Sample Entropy. The second one is an extension of an algorithm initially proposed for Approximate Entropy, again customized for Sample Entropy, but also improved to present even faster results. The last one is a completely new algorithm, presenting the fastest execution times for specific values of m, r, time series length, and signal characteristics. These algorithms are compared with the straightforward implementation, directly resulting from the definition of Sample Entropy, in order to give a clear image of the speedups achieved. All algorithms assume the classical approach to the metric, in which the maximum norm is used. The key idea of the last two suggested algorithms is to avoid unnecessary comparisons by detecting them early. We use the term unnecessary to refer to those comparisons for which we know a priori that they will fail at the similarity check. The number of avoided comparisons is proved to be very large, resulting in an analogously large reduction of execution time, making them the fastest algorithms available today for the computation of Sample Entropy. Full article
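A generic illustration of the "detect failing comparisons early" idea (not the specific algorithms of this paper) is early abandoning inside the Chebyshev similarity check: a template pair is rejected as soon as any single coordinate differs by more than r.

```python
def count_matches(x, m, r):
    """Count template pairs of length m within tolerance r under the
    maximum (Chebyshev) norm, abandoning each pair at the first
    coordinate that already exceeds r."""
    n = len(x) - m + 1                    # number of templates
    matches = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            for k in range(m):
                if abs(x[i + k] - x[j + k]) > r:
                    break                 # early abandon: pair cannot match
            else:                         # all m coordinates within r
                matches += 1
    return matches
```

The paper's kd-tree and bucket-based variants prune whole groups of such pairs at once; this per-pair short-circuit only conveys the principle.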

Open Access Article
Entropy-Based Structural Health Monitoring System for Damage Detection in Multi-Bay Three-Dimensional Structures
Entropy 2018, 20(1), 49; https://doi.org/10.3390/e20010049
Received: 28 November 2017 / Revised: 3 January 2018 / Accepted: 5 January 2018 / Published: 11 January 2018
Cited by 3 | PDF Full-text (4056 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, a structural health monitoring (SHM) system based on multi-scale cross-sample entropy (MSCE) is proposed for detecting damage locations in multi-bay three-dimensional structures. The location of damage is evaluated for each bay through MSCE analysis by examining the degree of dissimilarity between the response signals of vertically-adjacent floors. Subsequently, the results are quantified using the damage index (DI). The performance of the proposed SHM system was determined in this study by performing a finite element analysis of a multi-bay seven-story structure. The derived results revealed that the SHM system successfully detected the damaged floors and their respective directions for several cases. The proposed system provides a preliminary assessment of which bay has been more severely affected. Thus, the effectiveness and high potential of the SHM system for locating damage in large and complex structures rapidly and at low cost are demonstrated. Full article

Open Access Article
Automated Multiclass Classification of Spontaneous EEG Activity in Alzheimer’s Disease and Mild Cognitive Impairment
Entropy 2018, 20(1), 35; https://doi.org/10.3390/e20010035
Received: 15 December 2017 / Revised: 4 January 2018 / Accepted: 5 January 2018 / Published: 9 January 2018
Cited by 9 | PDF Full-text (1237 KB) | HTML Full-text | XML Full-text
Abstract
The discrimination of early Alzheimer’s disease (AD) and its prodromal form (i.e., mild cognitive impairment, MCI) from cognitively healthy control (HC) subjects is crucial since the treatment is more effective in the first stages of the dementia. The aim of our study is to evaluate the usefulness of a methodology based on electroencephalography (EEG) to detect AD and MCI. EEG rhythms were recorded from 37 AD patients, 37 MCI subjects and 37 HC subjects. Artifact-free trials were analyzed by means of several spectral and nonlinear features: relative power in the conventional frequency bands, median frequency, individual alpha frequency, spectral entropy, Lempel–Ziv complexity, central tendency measure, sample entropy, fuzzy entropy, and auto-mutual information. Relevance and redundancy analyses were also conducted through the fast correlation-based filter (FCBF) to derive an optimal set of them. The selected features were used to train three different models aimed at classifying the trials: linear discriminant analysis (LDA), quadratic discriminant analysis (QDA) and multi-layer perceptron artificial neural network (MLP). Afterwards, each subject was automatically allocated in a particular group by applying a trial-based majority vote procedure. After feature extraction, the FCBF method selected the optimal set of features: individual alpha frequency, relative power at delta frequency band, and sample entropy. Using the aforementioned set of features, MLP showed the highest diagnostic performance in determining whether a subject is not healthy (sensitivity of 82.35% and positive predictive value of 84.85% for HC vs. all classification task) and whether a subject does not suffer from AD (specificity of 79.41% and negative predictive value of 84.38% for AD vs. all comparison). Our findings suggest that our methodology can help physicians to discriminate AD, MCI and HC. Full article

Open Access Feature Paper Article
Fuzzy Entropy Analysis of the Electroencephalogram in Patients with Alzheimer’s Disease: Is the Method Superior to Sample Entropy?
Entropy 2018, 20(1), 21; https://doi.org/10.3390/e20010021
Received: 29 November 2017 / Revised: 20 December 2017 / Accepted: 28 December 2017 / Published: 3 January 2018
Cited by 7 | PDF Full-text (590 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Alzheimer’s disease (AD) is the most prevalent form of dementia in the world. It is characterised by the loss of neurones and the build-up of plaques in the brain, causing progressive symptoms of memory loss and confusion. Although definite diagnosis is only possible by necropsy, differential diagnosis with other types of dementia is still needed. An electroencephalogram (EEG) is a cheap, portable, non-invasive method to record brain signals. Previous studies with non-linear signal processing methods have shown changes in the EEG due to AD, characterised by reduced complexity and increased regularity. EEGs from 11 AD patients and 11 age-matched control subjects were analysed with Fuzzy Entropy (FuzzyEn), a non-linear method that was introduced as an improvement over the frequently used Approximate Entropy (ApEn) and Sample Entropy (SampEn) algorithms. AD patients had significantly lower FuzzyEn values than control subjects (p < 0.01) at electrodes T6, P3, P4, O1, and O2. Furthermore, when diagnostic accuracy was calculated using Receiver Operating Characteristic (ROC) curves, FuzzyEn outperformed both ApEn and SampEn, reaching a maximum accuracy of 86.36%. These results suggest that FuzzyEn could increase the insight into brain dysfunction in AD, providing potentially useful diagnostic information. However, results depend heavily on the input parameters that are used to compute FuzzyEn. Full article

Open Access Article
Entropy Analysis of Short-Term Heartbeat Interval Time Series during Regular Walking
Entropy 2017, 19(10), 568; https://doi.org/10.3390/e19100568
Received: 18 September 2017 / Revised: 11 October 2017 / Accepted: 21 October 2017 / Published: 24 October 2017
Cited by 11 | PDF Full-text (917 KB) | HTML Full-text | XML Full-text
Abstract
Entropy measures have been extensively used to assess heart rate variability (HRV), a noninvasive marker of cardiovascular autonomic regulation. It is yet to be elucidated whether those entropy measures can sensitively respond to changes of autonomic balance and whether the responses, if there are any, are consistent across different entropy measures. Sixteen healthy subjects were enrolled in this study. Each subject undertook two 5-min ECG measurements, one in a resting seated position and another while walking on a treadmill at a regular speed of 5 km/h. For each subject, the two measurements were conducted in a randomized order and a 30-min rest was required between them. HRV time series were derived and were analyzed by eight entropy measures, i.e., approximate entropy (ApEn), corrected ApEn (cApEn), sample entropy (SampEn), fuzzy entropy without removing local trend (FuzzyEn-g), fuzzy entropy with local trend removal (FuzzyEn-l), permutation entropy (PermEn), conditional entropy (CE), and distribution entropy (DistEn). Compared to the resting seated position, regular walking led to significantly reduced CE and DistEn (both p ≤ 0.006; Cohen’s d = 0.9 for CE, d = 1.7 for DistEn), and increased PermEn (p < 0.0001; d = 1.9), while all these changes disappeared after performing a linear detrend or a wavelet detrend (<~0.03 Hz) on HRV. In addition, cApEn, SampEn, FuzzyEn-g, and FuzzyEn-l showed significant decreases during regular walking after linear detrending (all p < 0.006; 0.8 < d < 1), while a significantly increased ApEn (p < 0.0001; d = 1.9) and a significantly reduced cApEn (p = 0.0006; d = 0.8) were observed after wavelet detrending. To conclude, multiple entropy analyses should be performed to assess HRV in order to obtain objective results, and caution should be exercised when drawing conclusions based on observations from a single measure. Moreover, results from different studies will not be comparable unless it is clearly stated whether the data have been detrended and the detrending methods have been specified. Full article
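The sensitivity to detrending reported in this abstract is straightforward to probe. Below is a minimal sketch (not the authors' code) of sample entropy together with the least-squares linear detrend the study applies; r is the usual tolerance expressed as a fraction of the series' standard deviation:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy (SampEn) of a 1-D series -- a minimal sketch."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between every pair of templates
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        # count template pairs within tolerance, excluding self-matches
        return ((d <= tol).sum() - len(t)) / 2

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)

def linear_detrend(x):
    """Remove the least-squares straight-line fit, as in the study's
    linear-detrending step."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)
```

An HRV series would be passed through `linear_detrend` before `sample_entropy` to reproduce the detrended condition; slow trends that inflate template mismatches are removed, which is why several measures change after this step.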

Open Access Feature Paper Article
Automated Diagnosis of Myocardial Infarction ECG Signals Using Sample Entropy in Flexible Analytic Wavelet Transform Framework
Entropy 2017, 19(9), 488; https://doi.org/10.3390/e19090488
Received: 28 July 2017 / Revised: 8 September 2017 / Accepted: 8 September 2017 / Published: 13 September 2017
Cited by 17
Abstract
Myocardial infarction (MI) is a silent condition that irreversibly damages the heart muscles. It expands rapidly and, if not treated in time, continues to damage the heart muscles. An electrocardiogram (ECG) is generally used by clinicians to diagnose MI patients. Manual identification of the changes introduced by MI is a time-consuming and tedious task, and there is also a possibility of misinterpretation of the changes in the ECG. Therefore, a method for automatic diagnosis of MI from ECG beats using the flexible analytic wavelet transform (FAWT) is proposed in this work. First, the ECG signals are segmented into beats. Then, FAWT is applied to each ECG beat, decomposing it into subband signals. Sample entropy (SEnt) is computed from these subband signals and fed to the random forest (RF), J48 decision tree, back propagation neural network (BPNN), and least-squares support vector machine (LS-SVM) classifiers to choose the highest performing one. We achieved the highest classification accuracy of 99.31% with the LS-SVM classifier. We also incorporated the Wilcoxon and Bhattacharya ranking methods and observed no improvement in performance. The proposed automated method can be installed in the intensive care units (ICUs) of hospitals to aid clinicians in confirming their diagnosis. Full article
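The feature-extraction stage described above (decompose each beat into subbands, then take sample entropy per subband) can be sketched as follows. Since FAWT is not a stock library routine, a plain dyadic Haar decomposition stands in for it here purely for illustration, and the names `samp_en`, `haar_subbands`, and `beat_features` are invented for this sketch:

```python
import numpy as np

def samp_en(x, m=2, r=0.2):
    # Compact sample entropy, applied to each subband below.
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=-1)
        return ((d <= tol).sum() - len(t)) / 2  # exclude self-matches
    return -np.log(count(m + 1) / count(m))

def haar_subbands(x, levels=3):
    # Dyadic Haar decomposition: an illustrative stand-in for FAWT.
    x = np.asarray(x, dtype=float)
    bands = []
    approx = x
    for _ in range(levels):
        pairs = approx[: len(approx) // 2 * 2].reshape(-1, 2)
        bands.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))  # detail band
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)       # next approximation
    bands.append(approx)  # final approximation band
    return bands

def beat_features(beat):
    # One SEnt value per subband -> feature vector for a classifier
    # (RF, J48, BPNN, or LS-SVM in the paper's comparison).
    return np.array([samp_en(b) for b in haar_subbands(beat)])
```

With 3 decomposition levels this yields a 4-dimensional feature vector per beat (three detail bands plus the final approximation), which would then be passed to the classifier stage.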

Review


Open Access Review
Entropy Change of Biological Dynamics in Asthmatic Patients and Its Diagnostic Value in Individualized Treatment: A Systematic Review
Entropy 2018, 20(6), 402; https://doi.org/10.3390/e20060402
Received: 16 March 2018 / Revised: 12 April 2018 / Accepted: 23 April 2018 / Published: 24 May 2018
Cited by 4
Abstract
Asthma is a chronic respiratory disease characterized by unpredictable flare-ups, for which continuous lung function monitoring is key to symptom control. To find new indices to individually classify severity and predict disease prognosis, continuous physiological data collected from monitoring devices are being studied from different perspectives. Entropy, as an analysis method for quantifying the inner irregularity of data, has been widely applied to physiological signals. However, to the best of our knowledge, no study has summarized the complexity differences of various physiological signals in asthmatic patients. Therefore, we organized a systematic review to summarize the complexity differences of important signals in patients with asthma. We searched several medical databases and systematically reviewed existing asthma clinical trials in which entropy changes in physiological signals were studied. In conclusion, we found that, for airflow, heart rate variability, center of pressure, and respiratory impedance, entropy values decrease significantly in asthma patients compared to those of healthy people, while, for respiratory sound and airway resistance, entropy values increase along with the progression of asthma. The entropy of some signals, such as the respiratory inter-breath interval, shows strong potential as a novel index of asthma severity. These results provide valuable guidance for applying entropy to physiological signals. Furthermore, they should promote the development of management and diagnosis of asthma using continuous monitoring data in the future. Full article

Entropy EISSN 1099-4300 Published by MDPI AG, Basel, Switzerland