Article

Entropy Measures as Descriptors to Identify Apneas in Rheoencephalographic Signals

1 Biomedical Engineering Research Centre, Universitat Politècnica de Catalunya, CIBER of Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), 08028 Barcelona, Spain
2 Quantium Medical, Research and Development Department, 08302 Mataró, Spain
3 Systems Pharmacology Effect Control & Modeling (SPEC-M) Research Group, Department of Anesthesia, Hospital CLINIC de Barcelona, 08036 Barcelona, Spain
4 Department of Anesthesia and Perioperative Care, University of California San Francisco (UCSF), San Francisco, CA 94143, USA
* Author to whom correspondence should be addressed.
Entropy 2019, 21(6), 605; https://doi.org/10.3390/e21060605
Submission received: 27 April 2019 / Revised: 9 June 2019 / Accepted: 15 June 2019 / Published: 18 June 2019
(This article belongs to the Special Issue Information Dynamics in Brain and Physiological Networks)

Abstract

Rheoencephalography (REG) is a simple and inexpensive technique that intends to monitor cerebral blood flow (CBF), but its ability to reflect CBF changes has not been extensively proved. Based on the hypothesis that alterations in CBF during apnea should be reflected in REG signals in the form of increased complexity, several entropy metrics were assessed for REG analysis during apnea and resting periods in 16 healthy subjects: approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy (FuzzyEn), corrected conditional entropy (CCE) and Shannon entropy (SE). To compute these entropy metrics, a set of parameters must be defined a priori, such as the embedding dimension m and the tolerance threshold r. A thorough analysis of the effect of parameter selection on the entropy metrics was performed, looking for the values that optimize the differences between apnea and baseline signals. All entropy metrics, except SE, provided higher values for apnea periods (p-values < 0.025). FuzzyEn outperformed all other metrics, providing the lowest p-value (p = 0.0001), allowing us to conclude that REG signals during apnea have higher complexity than in resting periods. These findings suggest that REG signals reflect CBF changes provoked by apneas, even though further studies are needed to confirm this hypothesis.

1. Introduction

The brain represents only up to 2% of the body weight in humans, while it receives up to 20% of the total cardiac output [1]. This suggests that the brain has large metabolic needs and, as it is an organ that has no mechanism to store nutrients, oxygen or water, it needs to receive a large and uninterrupted blood supply. An inadequate or drastic reduction in cerebral blood flow (CBF) would provoke brain ischemia, which is often the cause of death in patients with traumatic head injury. Furthermore, secondary brain insults are frequent in trauma patients and could be anticipated with CBF monitoring [2].
Moreover, cerebral ischemia and neuronal damage are two critical adverse events during anesthesia. Even though encephalic vascular accidents are infrequent during common surgeries, complex procedures present a higher risk [3]. The occurrence of neurologic complications during cardiac surgery has been estimated at 2%–6% [4], often during the intraoperative period [5]. This is a relatively low occurrence but, given the large number of patients undergoing this kind of procedure every year, the population at risk is considerable.
Therefore, CBF monitoring is mandatory for critical patients and would increase safety in clinical procedures that provoke alterations in CBF. Several techniques are available for that purpose; however, they are not always suitable for standard bedside monitoring, either because they are invasive or because they are extremely cumbersome and expensive. Rheoencephalography (REG) is a very simple, non-invasive and inexpensive technique that allows cerebral blood flow monitoring by sending an electric current through the scalp and measuring the impedance generated by the tissues. Since empty vessels present lower electrical conductivity than vessels filled with blood, monitoring the impedance through the scalp reflects the amount of blood flow reaching the brain.
Several studies have evaluated the ability of REG to reflect CBF changes [6], involving assessments of cerebrovascular resistance [7] and cerebral autoregulation in both animals and humans [8,9,10]. Even though REG shows consistency among subjects [11] and correlation between REG and CBF has been reported [12], its lack of absolute values and of specificity in clinical use reduced the popularity of REG [13].
One of the most relevant problems with REG signals is the interference provoked by movements, respiration and contamination with extracranial blood flow [14,15,16]. Consequently, REG recordings suitable for processing are often short and noisy, and statistics adapted to those conditions are needed to extract relevant clinical information from REG. Based on the hypothesis that apnea provokes changes in REG signals in the form of increased complexity, several entropy metrics that are robust in noisy environments are assessed for REG analysis during breath holding.
Pincus et al. [17] presented approximate entropy (ApEn) as a relative entropy metric suitable for short and noisy datasets, applicable to biomedical signals. ApEn approximates the exact regularity statistic Kolmogorov–Sinai entropy and reflects the predictability of a time series by exploring repetitive patterns in the data. It has been extensively used in heart rate variability (HRV) analysis, for example to detect heart failure [18] or to identify differences in HRV in diabetic patients [19]. Moreover, ApEn has also been used to study electroencephalogram (EEG) regularity during sleep [20,21] and under desflurane anesthesia [22]. Even though those clinical applications have shown the ability of ApEn to correlate with physiological conditions, there is considerable controversy about its use. It has been reported to be inconsistent and lower than expected for short records, and thus dependent on the length of the time series [23]. Furthermore, to compute the ApEn value of a time series, three parameters need to be defined: the segment length (N), the embedding dimension (m) and the noise threshold (r). The choice of those three parameters influences the ApEn result, therefore limiting its use to relative measurements.
In order to compensate for the limitations of ApEn, Richman and Moorman [23] proposed a new entropy metric, called sample entropy (SampEn). The main difference between the computation of ApEn and SampEn is that SampEn does not count self-matches. However, it still requires the a priori definition of the same parameters N, m and r.
SampEn has been used in different types of biomedical signals, for example to characterize human gait signals [24], as a detector of driving fatigue in HRV signals [25] or to study EEG brain maturation in newborns [26]. Advantages of SampEn over ApEn have also been reported, indicating that ApEn presents inconsistencies that are avoided by using SampEn instead [27] and that SampEn is a better choice for short datasets [28].
Both ApEn and SampEn rely on the Heaviside function to define the similarity between two patterns. Due to its binary output, pairs of patterns are either included or rejected before the entropy calculation. In contrast, FuzzyEn was defined as a new entropy metric [29] in which the Heaviside function classifying the patterns as similar or not is replaced by a fuzzy function that computes a membership coefficient ranging from 0 to 1, where 1 maximizes the membership likelihood. Consequently, in addition to the selection of N, m and r, FuzzyEn requires a fourth parameter, n, which is the gradient of the boundary of the exponential function used to assess similarity. When compared to ApEn and SampEn, FuzzyEn outperformed the other measures in the characterization of electromyogram (EMG) signals [29], as well as in Alzheimer's disease detection in electroencephalographic (EEG) signals [30].
These three entropy metrics rely on the selection of several parameters, and there is considerable controversy about how they should be selected and the bias they introduce in the final entropy values. Even though some methods have been proposed to determine those values [31,32,33,34], no consensus has been reached so far. For that reason, in this paper, other metrics that do not require the definition of so many parameters will also be used: Shannon entropy (SE) and corrected conditional entropy (CCE).
SE was introduced by Shannon in the information theory domain [35] and reflects the regularity of the information generated by a defined source. For its use in biomedical applications, the parameters to be defined are the signal length to be considered and the number of quantization levels used for signal discretization. Additionally, in some cases, SE is applied to short sequences of symbols rather than at a sample level, and it therefore also requires the dimension of the data segments to be analyzed. SE has provided successful results when applied to EEG signals for person identification [36] and to the monitoring of intrapartum fetal heart rate dynamics [37].
CCE is an entropy measure introduced by Porta [38] that reduces the regularity bias present in conditional entropy. It is based on the definition of SE and has been used mainly on HRV signals, in some cases showing the expected trends but without statistical significance [39,40], and in others providing successful results, such as the ones obtained by Viola et al. [41], who described a reduction in the complexity of HRV signals during Rapid Eye Movement (REM) sleep with aging.
To the best of the authors' knowledge, the entropy measures herein presented have not been previously applied to REG signals, but they have been extensively used for diagnostic purposes in other biomedical signals, such as the previously mentioned examples, mainly EMG and HRV. Nonetheless, entropy measures have been applied to the study of plethysmography signals, which also reflect a pulse wave and are therefore closer to REG signals than EMG and HRV. For instance, Pham et al. [42] proved that SampEn of plethysmography records is a good predictor for mental disorder detection, demonstrating the usefulness of entropy assessment in pulse waves.
The main goal of this work is to study whether entropy metrics applied to REG signals can detect changes in CBF during breath holding (apnea) and to analyze which parameters optimize the results. The underlying hypothesis is that entropy would increase during apneas, since under those circumstances CBF changes take place, altering the regular baseline pattern of REG signals and thus reducing regularity and increasing entropy.

2. Materials and Methods

2.1. Entropy Definitions

This section provides information on the algorithms used for the entropy calculations. Different entropy metrics will be calculated and tested: ApEn, SampEn, FuzzyEn, SE and CCE. The parameters involved in the entropy evaluation are identified a priori: the embedding dimension (m), the signal length (N), the multiplicand of the standard deviation defining the noise level (r), the gradient of the fuzzy membership function (n) and the number of quantization levels (ε).

2.1.1. Shannon Entropy

The Shannon entropy (SE) [35] assesses the amount of information generated by a system. It can be used either locally or globally [43] and, for consistency with the other entropy metrics evaluated in this work, SE will be applied to consecutive patterns of length m. Hence, from a time series x(n) of length N, quantized in ε levels, a phase space reconstruction with dimension m is built, resulting in a set of vectors $x_m^{\varepsilon}(i) = [x^{\varepsilon}(i), x^{\varepsilon}(i-1), \ldots, x^{\varepsilon}(i-m+1)]$. The SE of the time series is then computed as

$SE(m,\varepsilon) = -\sum_{x_m^{\varepsilon}} p(x_m^{\varepsilon}) \log p(x_m^{\varepsilon})$ (1)

where $p(x_m^{\varepsilon})$ corresponds to the joint probability of the pattern $x_m^{\varepsilon}$ and the sum is performed across all the different patterns. This entropy metric requires the definition of the number of quantization levels (ε), the embedding dimension (m) and the length of the input signal (N). Thus, in this work, SE will be computed for a set of quantization levels ε ranging from 10 to 50, in steps of 10, with dimensions m from 2 to 4 and a set of signal lengths N = {1000, 2000, 3000, 4000} samples.
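As an illustration of Equation (1), the following minimal Python sketch computes SE over length-m patterns of a uniformly quantized signal; the function name, the quantization scheme and the use of the natural logarithm are assumptions made for illustration rather than the implementation used in this work.

```python
import numpy as np

def shannon_entropy(x, m=2, eps=20):
    """Shannon entropy of length-m patterns of a signal quantized into eps levels (Equation (1))."""
    x = np.asarray(x, dtype=float)
    # Uniform quantization into eps levels (0 .. eps-1); the scheme is an assumption.
    q = np.floor(eps * (x - x.min()) / (np.ptp(x) + 1e-12)).astype(int)
    q = np.clip(q, 0, eps - 1)
    # Build the m-dimensional patterns x_m^eps(i).
    patterns = np.array([q[i:i + m] for i in range(len(q) - m + 1)])
    # Joint probability of each distinct pattern.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))
```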

2.1.2. Corrected Conditional Entropy

Corrected conditional entropy [38] is based on a correction applied to the conditional entropy (CE) definition. CE is calculated as the variation of SE between two consecutive values of the embedding dimension, m:

$CE(m,\varepsilon) = -\sum_{x_{m-1}^{\varepsilon}} p(x_{m-1}^{\varepsilon}) \sum_{x_m^{\varepsilon}} p(x_m^{\varepsilon}/x_{m-1}^{\varepsilon}) \log p(x_m^{\varepsilon}/x_{m-1}^{\varepsilon})$ (2)
where the first sum runs across all the different patterns $x_{m-1}^{\varepsilon}$, $p(x_{m-1}^{\varepsilon})$ corresponds to the joint probability of the pattern $x_{m-1}^{\varepsilon}$, and the second sum covers the m-th sample of the pattern, with $p(x_m^{\varepsilon}/x_{m-1}^{\varepsilon})$ representing the probability of the pattern $x_m^{\varepsilon}$ conditioned on the preceding pattern $x_{m-1}^{\varepsilon}$. Therefore, CE (3) can be formulated as a function of SE:

$CE(m,\varepsilon) = SE(m,\varepsilon) - SE(m-1,\varepsilon)$ (3)
Porta et al. proposed in [38] a correction to CE in order to compensate for unique patterns that should theoretically increase entropy but reduce it when using the CE definition. The proposed correction consists in adding a corrective term and defining CCE as:

$CCE(m,\varepsilon) = CE(m,\varepsilon) + SE(1,\varepsilon) \cdot perc(m,\varepsilon)$ (4)
where $perc(m,\varepsilon)$ is the percentage of single points in the m-dimensional space. Moreover, the same authors propose the use of the minimum of the CCE, CCEmin, as an approximation to the entropy of the signal, avoiding having to define the value of m in advance for the entropy calculation [44]. Additionally, they introduced the regularity index ρ, computed as

$\rho = 1 - \min_m NCCE(m,\varepsilon)$ (5)

to estimate the overall regularity of a time series. In Equation (5), NCCE refers to the CCE normalized by SE(1,ε), resulting in a regularity index providing values between 0 and 1, representing maximum and minimum complexity, respectively.
CCE and the regularity index ρ were computed for all the signals in the experimental dataset. Analogous to the parameter set chosen for SE, CCE was computed with embedding dimension m from 2 to 4, quantization levels ε of 10, 20, 30, 40 and 50 while the length N of the segments used ranged from 1000 to 4000, in steps of 1000 samples.
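A minimal sketch of how CCE(m, ε) and the regularity index ρ of Equations (2)-(5) could be evaluated is given below; the quantization scheme, the function names and the range of m explored are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def corrected_conditional_entropy(x, eps=20, m_max=10):
    """CCE(m, eps) for m = 2..m_max and the regularity index rho (Equations (2)-(5))."""
    x = np.asarray(x, dtype=float)
    q = np.floor(eps * (x - x.min()) / (np.ptp(x) + 1e-12)).astype(int)
    q = np.clip(q, 0, eps - 1)

    def se_and_perc(m):
        # Shannon entropy of length-m patterns and fraction of patterns occurring only once.
        patterns = np.array([q[i:i + m] for i in range(len(q) - m + 1)])
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p)), np.sum(counts == 1) / counts.sum()

    se1, _ = se_and_perc(1)
    cce = {}
    for m in range(2, m_max + 1):
        se_m, perc_m = se_and_perc(m)
        se_m1, _ = se_and_perc(m - 1)
        cce[m] = (se_m - se_m1) + se1 * perc_m   # CE(m) + SE(1) * perc(m), Equation (4)
    rho = 1.0 - min(cce.values()) / se1          # 1 - min of CCE normalized by SE(1), Equation (5)
    return cce, rho
```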

2.1.3. Approximate Entropy

ApEn [17] allows us to quantify the regularity of a time series without requiring prior knowledge of the dynamics of the system [43], resulting in larger values for increasing complexity in the data. ApEn reflects the likelihood that patterns that are close, within a defined distance r, in an m-dimensional space remain close within the same tolerance when defined in an (m+1)-dimensional space.
Given a digital signal u(n) of length N samples, values for the embedding dimension m and the filtering level r are fixed a priori. A set of vectors x in the $R^m$-dimensional space is then created:

$x(i) = [u(i), \ldots, u(i+m-1)]$ (6)
For each i, 1 ≤ i ≤ N−m+1, an estimation of the correlation integral $C_i^m(r)$ is computed as:

$C_i^m(r) = \dfrac{\text{number of } j \text{ such that } d[x(i),x(j)] \leq r}{N-m+1}$ (7)
where the distance between x(i) and x(j) is defined as:

$d[x_m(i), x_m(j)] = \max_{k=1,2,\ldots,m} |u(i+k-1) - u(j+k-1)|$ (8)
Finally, ApEn is calculated as:

$ApEn(m,r,N) = \Phi^m(r) - \Phi^{m+1}(r)$ (9)

where

$\Phi^m(r) = \dfrac{1}{N-m+1} \sum_{i=1}^{N-m+1} \log C_i^m(r)$ (10)
The performance of ApEn depends on the choice of the input parameters r and m, as well as on the length of the time series. Since noise smaller than r is filtered out, ideally r should be small enough to preserve the information on the dynamics of the system, but very small values would compromise the calculation of the conditional probabilities [45]. Regarding the choice of m, larger values are preferred, but it should be considered that its selection is limited by the length of the time series (N), since N should be between 10^m and 30^m points [46,47].
ApEn values can vary significantly with the r and m values; therefore, ApEn should be used to compare systems rather than as an absolute measure. Typical values for m are m = 2 and m = 3, while the selected values for r depend on the type of signals to which this technique is applied [17]. The most commonly used combination is m = 2 and r = 0.2 (20% of the standard deviation) [43]. Pincus et al. [17] obtained significant results in the comparison of HRV signals of healthy and sick infants using r values ranging from 0.1 to 0.25, while Chen et al. used r = 0.3 to successfully distinguish EMG signals originated by four different movements [29].
Even though several algorithms have been published to overcome the difficulties in the choice of r [33,34], when comparing ApEn values for two or more groups, the optimal r value could be different in each group and lead to inconsistent results [48]. Therefore, experimental analysis is recommended to identify the best r for each application.
It should also be taken into account that ApEn is a biased statistic, strongly dependent on the signal length and lacking consistency [23], providing unexpected ApEn variations for different pairs of m and r values [47]. The bias is due to the concavity of the logarithmic function, as well as to the fact that ApEn counts self-matches when computing the correlation integral [45].
For the analysis of REG signals in apnea and baseline recordings, considering that the available data sequences were 4000 samples long, the N values of the analyzed time series ranged from 1000 to 4000, in steps of 1000. The parameter m was limited to m = 2, m = 3 and m = 4, the last one not satisfying the N ≥ 10^m criterion. Finally, the chosen r values covered the range of 0.05 to 0.3 times the standard deviation of the input signal.
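A minimal Python sketch of Equations (6)-(10) is shown below; the vectorized pairwise-distance computation and the interpretation of r as a fraction of the signal standard deviation follow the text above, while the function and variable names are illustrative assumptions.

```python
import numpy as np

def approximate_entropy(u, m=2, r=0.25):
    """ApEn(m, r, N) following Equations (6)-(10); r is a fraction of the standard deviation of u."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    tol = r * np.std(u)

    def phi(m):
        # Embedded vectors x(i) = [u(i), ..., u(i+m-1)], i = 1 .. N-m+1 (Equation (6))
        x = np.array([u[i:i + m] for i in range(N - m + 1)])
        # Maximum-norm distances between every pair of vectors (Equation (8))
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # C_i^m(r): fraction of vectors within tolerance, self-matches included (Equation (7))
        C = np.sum(d <= tol, axis=1) / (N - m + 1)
        return np.mean(np.log(C))             # Equation (10)

    return phi(m) - phi(m + 1)                # Equation (9)
```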

2.1.4. Sample Entropy

The entropy metric SampEn [23] aims to overcome the limitations of ApEn by excluding self-matches from the entropy calculation and, therefore, reducing computation times. The algorithm follows the same initial steps presented for ApEn, but when computing the correlation integral self-matches are excluded, as shown in Equation (11).
$C_i^m(r) = \dfrac{\text{number of } j \text{ such that } d[x(i),x(j)] \leq r \text{ and } j \neq i}{N-m+1}$ (11)
Lastly, $\Phi^m(r)$ is defined as

$\Phi^m(r) = \dfrac{1}{N-m} \sum_{i=1}^{N-m} C_i^m(r)$ (12)
and SampEn is calculated as the difference between the logarithms of $\Phi^m(r)$ and $\Phi^{m+1}(r)$:

$SampEn(m,r,N) = \log \Phi^m(r) - \log \Phi^{m+1}(r)$ (13)
SampEn requires the a priori definition of the same parameters listed for ApEn (N, m and r), and those are typically coincident with the ones used for ApEn (i.e., m = 2, r = 0.2). However, even though some authors consider that the same criteria can be used for both SampEn and ApEn [49], other publications suggest that they should be explored independently, since the algorithms proposed for the choice of r in ApEn are not applicable to SampEn [50]. Moreover, appropriate values for m and r depend on the type of signal under analysis [49].
For instance, Lake et al. [51] studied the selection of the m and r parameters for neonatal HRV analysis, concluding that the best pair of values was m = 3 and r = 0.2. In contrast, while applying the SampEn algorithm to characterize the effects of exercise on RR and QT intervals, Lewis et al. [52] explored different combinations of r and m values to finally choose m = 2 and r between 0.1 and 0.15. Higher r values have also been considered optimal, as for example in the atrial fibrillation organization analysis presented by Alcaraz et al. [49], in which, after identifying several combinations providing good classification results, the best values were considered to be m = 3 and r between 0.3 and 0.4.
SampEn overcomes the bias problem detected in ApEn as well as its inconsistencies, such that if the SampEn of one signal (x1) is higher than that of another signal (x2) for one pair of m and r values, a different m-r pair will still provide higher SampEn values for x1 [51]. Nonetheless, Castiglioni et al. [50] detected inconsistencies in SampEn calculations when studying mechanomyographic signals for certain m values, and Yentes et al. [28] published similar findings for some r choices, suggesting that under certain conditions SampEn can also be affected by inconsistencies.
The controversy around adequate m and r values and the existence of inconsistencies in SampEn calculations require that, for a new type of signal such as REG, an analysis of the effect of m, r and N be performed. Therefore, in this work, the same values suggested for ApEn will be used to explore the ability of SampEn to detect apnea periods in REG signals: a range of m (from 2 to 4), r (from 0.1 to 0.3) and N (from 1000 to 4000).
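The sketch below follows the same steps but excludes self-matches, in line with Equations (11)-(13); it uses the common convention of building N−m template vectors, so its normalization constants may differ slightly from the equations above, and all names are illustrative assumptions.

```python
import numpy as np

def sample_entropy(u, m=2, r=0.25):
    """SampEn(m, r, N): same construction as ApEn but with self-matches excluded."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    tol = r * np.std(u)

    def phi(m):
        x = np.array([u[i:i + m] for i in range(N - m)])
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # Count matches excluding the diagonal (i == j), cf. Equation (11)
        C = (np.sum(d <= tol, axis=1) - 1) / (N - m - 1)
        return np.mean(C)                        # cf. Equation (12)

    return np.log(phi(m)) - np.log(phi(m + 1))   # Equation (13)
```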

2.1.5. Fuzzy Entropy

ApEn and SampEn share a definition of similarity in which data segments with distances lower than the threshold value r are considered positive matches, while the others are rejected and not considered for the calculation. Even though ApEn includes self-matches and SampEn does not, in both cases a Heaviside function is used to assess similarity. In contrast, the definition of FuzzyEn [29] relies on a degree of similarity between 0 and 1. This similarity is based on the concept of fuzzy membership as defined by Zadeh [53] and results in a weaker influence of the choice of r on the final entropy calculations [43].
Besides the use of fuzzy membership calculations, the FuzzyEn algorithm also differs from ApEn and SampEn in the way it creates the set of m-dimensional vectors. Given a time series u(n) of length N samples, vector sequences are defined as:

$x_i^m = [u(i), u(i+1), \ldots, u(i+m-1)] - u_0(i)$ (14)
where $u_0(i)$ represents the baseline trend and is computed as

$u_0(i) = \dfrac{1}{m} \sum_{j=0}^{m-1} u(i+j)$ (15)
The distance $d_{ij}^m$ between two vectors, $x_i^m$ and $x_j^m$, is defined as the maximum absolute difference among their scalar components. A matrix $D_{i,j}^m$ is built, containing the similarity degrees for all pairs of vectors, given r and n (the width and the gradient of the boundary of the exponential function, respectively):

$D_{i,j}^m = \mu\left(d[x_i^m, x_j^m], n, r\right)$ (16)
where µ is the exponential fuzzy function:

$\mu(x,n,r) = e^{-\left(x/r\right)^n}$ (17)
Finally, the function $\Phi^m$ is calculated as:

$\Phi^m = \dfrac{1}{N-m} \sum_{i=1}^{N-m} \left( \dfrac{1}{N-m-1} \sum_{j=1, j \neq i}^{N-m} D_{i,j}^m \right)$ (18)
and the fuzzy entropy is computed as:

$FuzzyEn(m,r,n,N) = \ln \Phi^m - \ln \Phi^{m+1}$ (19)
FuzzyEn, therefore, needs four parameters to be computed: N, m, r and n. Typical values for N, m and r are coincident with the ones used for SampEn and ApEn, even though the dependence on r is less critical due to the substitution of the Heaviside function by the fuzzy membership calculation. Regarding the values of n, only small values guarantee a good approximation of the entropy [43], with n = 2 being the most frequently used [29,54,55].
For this application to REG signals, the N, m and r ranges tested were the same ones proposed for ApEn and SampEn, while n values ranged from 2 to 10.
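A minimal sketch of Equations (14)-(19) is given below; the exact form of the exponential membership function and the function names are assumptions made for illustration.

```python
import numpy as np

def fuzzy_entropy(u, m=2, r=0.25, n=2):
    """FuzzyEn(m, r, n, N): baseline-removed vectors and an exponential membership function."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    tol = r * np.std(u)

    def phi(m):
        # Vectors with their local mean (baseline) removed, Equations (14)-(15)
        x = np.array([u[i:i + m] for i in range(N - m)])
        x = x - x.mean(axis=1, keepdims=True)
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        D = np.exp(-(d / tol) ** n)              # fuzzy membership degrees, Equations (16)-(17)
        np.fill_diagonal(D, 0.0)                 # exclude self-comparisons
        return np.mean(np.sum(D, axis=1) / (N - m - 1))   # Equation (18)

    return np.log(phi(m)) - np.log(phi(m + 1))   # Equation (19)
```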

2.2. Experimental Protocol

This work is based on a previously published dataset [56]. A group of 16 young healthy volunteers signed an informed consent for REG data recording during apnea and baseline periods. This study was carried out following the principles of the Declaration of Helsinki, and the corresponding protocol was approved by the local Institutional Review Board and Ethics Committee. Participants (8 males and 8 females) were aged 25.4 ± 3.6 years, weighed 59.6 ± 6.8 kg and were 166.9 ± 8.3 cm tall.
The qCO monitor (Quantium Medical, Spain) was used for cerebral impedance monitoring. Two pairs of electrodes were placed on the subject's temples, one pair on each side, each pair containing one electrode emitting the current and a second one sensing the output signal. A 50 kHz, 1 mA current was used for excitation, and the obtained REG signal was recorded at 250 samples/s.
Subjects were asked to relax in supine position until a stable REG signal was obtained. Afterwards, data recording started, repeating twice a sequence consisting of a 3 min resting period followed by 1 min of breath holding. If volunteers were unable to complete the 1 min apnea, they raised their hand to inform the investigators and the 3 min resting period started.

2.3. Data Analysis

Even though entropy measures are known to be robust in the presence of limited amounts of noise, the recorded signals were filtered to reduce the influence of powerline interference on the computed parameters and to filter out slow drifts provoked by respiration as well as other direct current (DC) fluctuations. Two Chebyshev type II filters were used, one of them being a 4th order high-pass filter with a stop-band frequency of 0.1 Hz and the other one being an 8th order low-pass filter with a stop-band frequency of 20 Hz. Moreover, the filtered signals were screened and detected artefacts were rejected to finally select data segments of 4000 samples. An example of a pre-processed REG recording is shown in Figure 1.
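As an illustration of this pre-processing, the sketch below applies the two described Chebyshev type II filters with SciPy; the stop-band attenuation value and the zero-phase (forward-backward) filtering are assumptions not stated in the text.

```python
from scipy.signal import cheby2, filtfilt

FS = 250.0  # sampling rate of the REG recordings (samples/s)

def preprocess_reg(reg, fs=FS, rs_db=40):
    """Band-limit a REG recording with the two Chebyshev type II filters described above."""
    # 4th-order high-pass, stop-band edge at 0.1 Hz (removes respiratory and DC drifts)
    b_hp, a_hp = cheby2(4, rs_db, 0.1, btype='highpass', fs=fs)
    # 8th-order low-pass, stop-band edge at 20 Hz (removes powerline and high-frequency noise)
    b_lp, a_lp = cheby2(8, rs_db, 20.0, btype='lowpass', fs=fs)
    reg = filtfilt(b_hp, a_hp, reg)
    return filtfilt(b_lp, a_lp, reg)
```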
Finally, 53 sequences were selected, 29 belonging to apnea recordings and 24 from baseline periods. The average main frequency of the recorded signals was 1.10 ± 0.47 Hz (mean ± standard deviation), resulting in a cardiac cycle of 245 ± 57 samples. The dynamic range of the recorded REG waves was 0.089 ± 0.028 Ω (95% confidence interval). No differences were observed between groups in terms of amplitudes or heart cycle duration.
All entropy metrics were computed for each input parameter combination indicated in Table 1, and their ability to distinguish baseline and apnea sequences was assessed by hypothesis testing, using either Student t-tests or Mann–Whitney tests, for normal and non-normal distributions, respectively. Normality was determined using a Lilliefors test. The statistical significance threshold was set at p < 0.05, and Bonferroni corrections were applied resulting in a final threshold of p < 0.025. Additionally, the area under the curve (AUC) of the receiver operating characteristic (ROC) and the classification accuracy (acc) were also computed.
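A sketch of the described comparison for a single entropy metric is shown below, combining the Lilliefors normality test, a Student t-test or Mann-Whitney test, the Bonferroni-corrected threshold and the ROC AUC; treating apnea as the positive class and the specific library calls are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu
from statsmodels.stats.diagnostic import lilliefors
from sklearn.metrics import roc_auc_score

ALPHA = 0.05 / 2   # Bonferroni-corrected significance threshold (p < 0.025)

def compare_groups(apnea_vals, baseline_vals):
    """Compare one entropy metric between apnea and baseline segments."""
    apnea_vals, baseline_vals = np.asarray(apnea_vals), np.asarray(baseline_vals)
    # Lilliefors test for normality on each group
    normal = lilliefors(apnea_vals)[1] > 0.05 and lilliefors(baseline_vals)[1] > 0.05
    if normal:
        _, p = ttest_ind(apnea_vals, baseline_vals)                              # Student t-test
    else:
        _, p = mannwhitneyu(apnea_vals, baseline_vals, alternative='two-sided')  # Mann-Whitney
    # ROC AUC with apnea labeled as the positive class
    labels = np.r_[np.ones(len(apnea_vals)), np.zeros(len(baseline_vals))]
    auc = roc_auc_score(labels, np.r_[apnea_vals, baseline_vals])
    return p, auc, p < ALPHA
```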
REG signal processing is typically based on the analysis of the geometry of the pulse waves, by means of detecting local maxima and minima as well as other features extracted from the time series [57,58]. In order to determine whether the entropy metrics herein proposed outperform this classical analysis, the following set of features was extracted from the recordings:
  • Maximum amplitude (Max)
  • Minimum amplitude (Min)
  • Amplitude range (Range)
  • Slope of the increasing edge (α)
  • Area under the curve of each cardiac cycle (Area)
  • Time between two consecutive maxima (Δtmax)
  • Time between two consecutive minima (Δtmin)
  • Time between a minimum and its consecutive maximum (Δtmin-max)
Moreover, the derivatives of the time series were also computed, and the maximum value of the derivative in each cycle (δmax) and the range of the derivative (δrange) were analyzed. The median value of those features in each recording was considered for analysis and subjected to hypothesis testing under the same assumptions used for the entropy metrics, as sketched below.
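The sketch below shows how a subset of these geometric features could be extracted per recording using simple peak detection; the minimum peak distance and the subset of features covered are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def geometric_features(reg, fs=250.0):
    """Median per-cycle geometric features of a REG segment (subset of the list above)."""
    reg = np.asarray(reg, dtype=float)
    min_dist = int(0.4 * fs)                          # assume cardiac cycles longer than 0.4 s
    peaks, _ = find_peaks(reg, distance=min_dist)     # local maxima, roughly one per cycle
    troughs, _ = find_peaks(-reg, distance=min_dist)  # local minima
    deriv = np.gradient(reg) * fs                     # first derivative of the time series
    return {
        "Max": np.median(reg[peaks]),
        "Min": np.median(reg[troughs]),
        "Range": np.median(reg[peaks]) - np.median(reg[troughs]),
        "dt_max": np.median(np.diff(peaks)) / fs,     # time between consecutive maxima (s)
        "dt_min": np.median(np.diff(troughs)) / fs,   # time between consecutive minima (s)
        "delta_max": np.median([deriv[a:b].max() for a, b in zip(peaks[:-1], peaks[1:])]),
    }
```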

3. Results

3.1. Parameters Selection for Each Entropy Metric

The evolution of the entropy metrics ApEn(m,r,N), SampEn(m,r,N), FuzzyEn(m,n,r,N), SE(N,m,ε), CCE(N,m,ε) and ρ(N,ε) as a function of the parameter selection is herein presented, as well as their ability to differentiate between apnea and baseline signals.
The entropy CCE and the regularity index ρ yielded statistically significant results for apnea detection, while none of the parameter combinations tested for SE was able to identify apneas. Results for CCE as a function of ε, m and N are provided in Figure 2, together with the corresponding p-value illustrating the ability of CCE to distinguish between apnea and baseline recordings. As the number of quantization levels ε increases, CCE increases for both apnea and baseline periods (Figure 2a), but the p-value decreases (Figure 2d), showing a minimum for ε = 10 and ε = 20 levels. Regarding the embedding dimension m, CCE decreases as m increases, providing the best statistical significance for m = 2 (Figure 2b,e), while CCE remains almost stable for increasing segment length N (Figure 2c). Segments with lengths of 2000 and 3000 samples provide the lowest p-value.
In addition to the analysis of CCE values, Figure 3 illustrates the results for the regularity index ρ. A monotonic decrease in regularity was observed for an increasing number of quantization intervals (ε), showing higher regularity for baseline recordings (Figure 3a). The effect of increasing the signal length (N) is depicted in Figure 3b, showing an increase in regularity as N increases. Regarding the influence of the number of quantization intervals on the statistical significance of the results, using ε ≤ 50 intervals kept the p-value below the significance threshold (p < 0.025) for signal lengths of N = 2000 samples, as shown in Figure 3c. However, for a fixed number of quantization intervals, ε = 20, the regularity index ρ is statistically significant for signal lengths N ≥ 2000 samples (Figure 3d). Therefore, the optimal parameters for ρ calculation to detect apneas are ε = 20 quantization steps for signals of N = 2000 samples.
The results in Figure 3 were obtained considering embedding dimensions up to m = 10, assuming that the minimum value CCEmin of the CCE would fall within this m range. To verify this assumption, a study of the CCE for varying values of the embedding dimension m is presented in Figure 4. Each CCE curve plotted in Figure 4a belongs to an apnea recording and each one in Figure 4b to a baseline recording. It can be observed that the minimum entropy takes place for m < 10 in both apnea and baseline signals. Furthermore, the location of the minimum CCE is not affected by the type of signal (apnea or baseline), as shown in Figure 4, and the median CCE values of the apnea signals are higher than the median values obtained from the baseline recordings.
Results for the ApEn, SampEn and FuzzyEn entropies are shown in Figure 5. In order to explore the effects of m and N, the parameter r was initially fixed to 0.3, as recommended in [29]. Entropy values were higher for apneas when compared to baseline for all entropy metrics and parameter combinations. ApEn provided the highest entropy values, followed by SampEn and FuzzyEn, respectively. Both ApEn and SampEn provided lower values for recordings of N = 1000 samples and remained approximately stable for recordings of N = 2000 samples or larger.
The ability of the three entropy metrics to distinguish between apnea and baseline segments was assessed by the p-value resulting from the hypothesis testing (Table 2). FuzzyEn provided statistically significant differences between both types of signals for all parameter combinations tested for m and N, while statistical significance for ApEn was limited to m = 2 and m = 3 for any sequence length, and for SampEn it was limited to m = 2 for a signal length of N ≥ 2000 samples. Therefore, the parameter values m = 2 and N = 2000 were selected as the most appropriate across all entropy metrics for apnea detection in REG signals.
Regarding the parameter r, all entropies showed lower values as r increased, and this behavior was common to both apnea and baseline signals (Figure 6a–c). In the case of FuzzyEn, p-values decreased monotonically with r, showing a better differentiation between apnea and baseline as r grows, even though FuzzyEn provided p-values < 0.025 for all r (Figure 6f). Instead, ApEn and SampEn needed at least r = 0.2 and r = 0.25, respectively, to provide significant results, both showing a minimum for r = 0.25 (Figure 6d,e). For that reason, r = 0.25 was considered a suitable value for apnea detection in REG signals.
The entropies ApEn and SampEn are fully characterized by values for N, r and m. However, for FuzzyEn, a fourth parameter (n) needs to be considered. FuzzyEn showed decreasing values for increasing n values, as shown in Figure 7a, and the standard deviation of the computed entropies only tended to 0 for values of n higher than 6 (Figure 7b). In order to select the best n value for apnea detection, the statistical significance was calculated by comparing the FuzzyEn values of the apnea and baseline groups. FuzzyEn had the minimum p-value at n = 2 and hence this was considered the best choice (Figure 7c).
The standard deviation of the entropy metrics provides an assessment of their stability. Moreover, its evolution as a function of r is used to determine their consistency [29]. Therefore, the evolution of the standard deviation of the three entropy metrics (ApEn, SampEn and FuzzyEn) as a function of the parameter r is depicted in Figure 8. All of them decrease with increasing r, showing higher standard deviation values for apneas than for baselines. FuzzyEn showed the lowest standard deviation, followed by SampEn. It is worth noting that both FuzzyEn and SampEn decreased monotonically, while ApEn showed an almost flat behavior for r values around 0.3 in the apnea signals. This phenomenon was less pronounced in baseline recordings, but an inflection point can be observed in the same r range.

3.2. Final Parameter and Entropy Values

Results for all tested entropy metrics are included in Table 3. The values of the parameters that best describe these entropies when comparing apnea and baseline recordings are included. All these entropy metrics show increased values for apnea recordings, indicating an increased signal complexity. It should be noted that the index ρ presents the opposite behavior, since it reflects regularity instead of complexity.
Since Shannon entropy did not provide significant results for any parameter combination (N, m, ε), it has not been included in this table. In addition to the p-value computed for each metric, Table 3 contains the values of the area under the curve (AUC) and the accuracy (acc), in which FuzzyEn outperforms the other entropy metrics. Moreover, Figure 9 depicts the ROC curves for all the entropy metrics summarized in the table.
Additionally, Figure 10 shows the distribution of each entropy metric for the baseline and apnea groups. CCE and ρ present the highest dispersion of values, while ApEn, SampEn and FuzzyEn have less dispersed distributions but with many outliers, especially ApEn and FuzzyEn. Those results suggest that, even though the selected metrics provide statistically significant differences between apnea and baseline recordings, individual differences should be noted.
Finally, the results obtained by applying the classical REG analysis based on geometric feature extraction are provided in Table 4. None of the proposed features showed statistically significant differences between apnea and baseline signals, suggesting that entropy metrics outperform the classical analysis of REG waves for apnea detection.

4. Discussion

All the entropy metrics proposed, except for SE, provided evidence of the increased irregularity of apnea signals when compared to baseline recordings. However, those results were shown to be dependent on the choice of the parameters needed for each entropy metric calculation. For instance, CCE values increased with an increasing number of quantization intervals and decreased with increasing m, while remaining stable with increasing sequence length. The regularity index (ρ) decreased with the number of quantization levels, in accordance with the evolution of CCE, since ρ reflects regularity instead of entropy. However, ρ increased with increasing signal length, indicating that fewer new patterns were detected when the signal length was extended. Those results are consistent with those published by Porta et al. [38], since REG waves show a quasi-periodic pattern. However, it should be noted that, when using REG signals, the optimal number of quantization levels providing a better differentiation between apnea and baseline recordings, ε = 20, is higher than the one proposed by Porta in his work, ε = 6.
Considering the performances of CCE and the regularity index ρ, the latter provided the lowest p-value when tested for differences between apnea and baseline recordings. This allows us to conclude that the normalization step in the definition of ρ enhances comparative results.
Even though SE and CCE are both derived from the original definition of the Shannon entropy, CCE provides significant results while SE does not. This different performance of SE and CCE exists because SE reflects the distribution of the patterns in a given sequence while CCE assesses differences between consecutive patterns. This phenomenon has been analyzed previously in other publications [59], referring to SE as an entropy measure and conditional entropy as an entropy rate.
Regarding the results for ApEn, SampEn and FuzzyEn, they all decrease with increasing r threshold, but their behavior with increasing time series length and embedding dimension differs. Figure 5a shows increasing ApEn values for longer signals in apneas while the effects of signal length in baseline recordings are negligible. The same trend can be observed for SampEn in Figure 5b, while FuzzyEn (Figure 5c) shows stable entropy values for all signal lengths. SampEn was reported to be independent of signal length while ApEn is known to provide lower entropy estimates for short recordings [23]. Considering that the effect of signal length is only present in apneas, results could be interpreted as an increasing complexity in REG signals proportional to apnea duration, rather than just a weakness of the entropy estimators.
One of the main differences between ApEn, SampEn and FuzzyEn is their evolution as a function of the embedding dimension m. SampEn (Figure 5b) provides lower entropies for increasing m, while FuzzyEn (Figure 5c) shows the opposite behavior and ApEn does not show a consistent behavior, since the highest entropy is obtained for m = 3, followed by m = 2 and m = 4 (Figure 5a). This inconsistency in ApEn might be due to the bias inherent in this estimation. Moreover, the use of the Heaviside function might be influencing the results in such a way that softening the similarity boundary with fuzzy membership functions provides the most consistent results in terms of entropy rates as a function of the embedding dimension for a fixed r value.
No other inconsistencies were detected in ApEn, SampEn or FuzzyEn. Some authors have reported a flip-flop effect in entropy estimations [60,61]. They observed that, given two groups of signals to be compared, some r values resulted in higher entropy for the signals in one group, while other r selections provided the opposite result. No flip-flop episodes were detected in this apnea-baseline dataset. Moreover, considering the definition of practical consistency by Aktaruzzaman [62], one can conclude that the three metrics were consistent, since they always identified higher entropies in the apnea group for a broad range of input parameters. However, looking at the evolution of the standard deviation of each entropy (Figure 8), FuzzyEn provides the lowest values, followed by SampEn. For ApEn, the standard deviation of the entropy does not decrease monotonically, since it shows a plateau around r = 0.3. This suggests a higher variability of ApEn calculations when compared to the other estimators.
ApEn provides the highest entropy values and FuzzyEn the lowest, but all of them provide significantly different results for apnea and baseline recordings for one or more sets of N, m, r and n parameters. Optimal values for apnea detection were common to ApEn, SampEn and FuzzyEn (using n = 2 for the fuzzy membership functions), even though FuzzyEn proved to be less sensitive to parameter selection, providing significant results for all the parameter combinations tested.
The recommended values for r, m and N are slightly different from those reported by other authors for different types of signals. The embedding dimension, m = 2, coincides with most published analyses, but differs from the one reported for plethysmograms [42], m = 7. However, due to the limited length of the recordings, using embedding dimensions higher than 3 or 4 would require the use of very large r values, losing information on the patterns in the time series. The values of the r threshold optimizing apnea detection are higher than the ones reported in other applications, usually ranging from 0.15 to 0.2 [51,52]. Regarding the value of n in the FuzzyEn algorithm, recommendations of using the smallest possible value are consistent with the results herein presented, where n = 2 provided the best statistical significance for apnea detection.
FuzzyEn provided the best statistical significance and AUC for apnea detection in REG signals, followed by ApEn, CCE and SampEn, all of them identifying higher complexity in apnea when compared to baseline signals (as can be seen in Table 3). Previous publications have also compared the performance of different entropy metrics. For instance, Chen et al. [29] compared ApEn, SampEn and FuzzyEn in their ability to characterize surface EMG signals, where FuzzyEn outperformed the other metrics, both in terms of classification and by providing a lower standard deviation of the entropy values, as also observed in the present study. Xie et al. [54] also compared the same three entropy definitions with the objective of detecting muscular fatigue in EMG signals. FuzzyEn provided the best results, while ApEn failed to detect muscular fatigue. Furthermore, when analyzing the EEG of patients with Alzheimer's disease compared to healthy subjects, FuzzyEn was also the best predictor when compared to ApEn and SampEn [30], and ApEn was again considered the poorest estimator. Even though SampEn is known to outperform ApEn [23], in this study ApEn provided a better discrimination between apnea and baseline signals. Analogously, Cuesta-Frau et al. [63] reached the same conclusion when studying body temperature records of critical patients as a predictor of survival.
All time series processed in this study were sampled at 250 Hz. It is well known that the sampling frequency affects the selection of the optimal parameters for entropy calculation [49], as well as the signal-to-noise ratio [28]. However, due to the artefacts present in the recordings because of movements, reducing the sampling frequency would have limited the length of the time series and, therefore, the range of dimensions m tested for each entropy definition. Therefore, the sampling frequency was not included as an input variable in the estimation of entropy in the recorded dataset.
The use of entropy metrics applied to biosignals often aims at detecting a disease, for example heart failure [18] or sick newborns [51], by means of the analysis of HRV signals. In those cases, lower entropies are associated with the disease condition. To the best of the authors' knowledge, no previous studies on the regularity of REG signals have been published. However, Pham et al. [42] analyzed plethysmograms, which share many properties with REG signals, and used the information for diagnostic purposes, aiming at detecting mental disorders. In our study, participants were healthy volunteers performing a simple respiratory challenge to provoke CBF changes. Therefore, rather than detecting a disease, entropy metrics were used to detect alterations in CBF reflected in REG waves. The results suggest that, during apneas, compensation mechanisms are activated to preserve the oxygen supply to the brain, modifying the REG pulse waves and adding complexity to the signal. During baseline, the oxygen and blood supply to the brain do not suffer alterations, and REG signals are, therefore, more regular.
Further studies are needed to confirm those findings, but the results suggest that entropy analysis is suitable for the detection of CBF changes in REG signals. Moreover, this analysis outperforms the classical approach used for REG signals, based on the detection of geometric features in the pulse waves, which failed to detect apneas.

5. Conclusions

The findings presented in this study suggest that FuzzyEn is the entropy metric providing the best ability to distinguish between apnea and baseline in REG signals among the set of entropy metrics proposed, followed by ApEn and CCE. Nonetheless, a careful selection of the input parameters needed to compute those entropy metrics should be performed in advance, since values recommended for other applications are not suitable for REG signals.
Moreover, entropy analysis has been shown to be more adequate for apnea detection than classical methods applied to REG signals. Even though a larger dataset and other mechanisms to alter CBF are needed to confirm those findings, REG signals seem to be carrying CBF information that can be assessed by means of complexity analysis.

Author Contributions

C.G., E.J. and P.G. conceived and designed the experiments; C.G. and P.G. performed the experiments; C.G. and M.V. analyzed the data; C.G. wrote the paper; E.J., P.G. and M.V. reviewed the paper.

Funding

This work was supported by MINECO (DPI2017-89827-R) from Spanish Government. CIBER of Bioengineering, Biomaterials and Nanomedicine is an initiative of ISCIII. This work was also developed under the scope of the Industrial PhD program by the Regional Catalan Government (DI-2015, Generalitat de Catalunya, Spain) in collaboration with Quantium Medical S.L.U. Pedro Gambús is supported by a grant from COLCIENCIAS (Project number: 123280764083).

Conflicts of Interest

This study was partly funded within the Industrial PhD program of the Regional Catalan Government (DI-2015, Generalitat de Catalunya, Spain), which involves the collaboration of a private entity, Quantium Medical S.L.U., in which both C.G. and E.J. are employed. Quantium Medical designed the qCO monitor, the medical device available for this study. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Cipolla, M.J. The Cerebral Circulation; Morgan & Claypool Life Sciences: San Rafael, CA, USA, 2009. [Google Scholar]
  2. Goettel, N.; Patet, C.; Rossi, A.; Burkhart, C.S.; Czosnyka, M.; Strebel, S.P.; Steiner, L.A. Monitoring of cerebral blood flow autoregulation in adults undergoing sevoflurane anesthesia: a prospective cohort study of two age groups. J. Clin. Monit. Comput. 2016, 30, 255–264. [Google Scholar] [CrossRef] [PubMed]
  3. Tuman, K.J.; McCarthy, R.J.; Najafi, H.; Ivankovich, A.D. Differential effects of advanced age on neurologic and cardiac risks of coronary artery operations. J. Thorac. Cardiovasc. Surg. 1992, 104, 1510–1517. [Google Scholar] [PubMed]
  4. Reed, G.L.; Singer, D.E.; Picard, E.H.; DeSanctis, R.W. Stroke Following Coronary-Artery Bypass Surgery. N. Engl. J. Med. 1988, 319, 1246–1250. [Google Scholar] [CrossRef] [PubMed]
  5. North American Symptomatic Carotid Endarterectomy Trial Collaborators Beneficial Effect of Carotid Endarterectomy in Symptomatic Patients with High-Grade Carotid Stenosis. N. Engl. J. Med. 1991, 325, 445–453. [CrossRef] [PubMed]
  6. Bodo, M. Studies in Rheoencephalography (REG). J. Electr. Bioimpedance 2010, 1, 18–40. [Google Scholar] [CrossRef] [Green Version]
  7. Bodo, M.; Pearce, F.J.; Armonda, R.A. Cerebrovascular reactivity: rat studies in rheoencephalography. Physiol. Meas. 2004, 25, 1371–1384. [Google Scholar] [CrossRef] [PubMed]
  8. Popovic, D.; Bodo, M.; Pearce, F.; van Albert, S.; Garcia, A.; Settle, T.; Armonda, R. Assessment of cerebral blood flow autoregulation (CBF AR) with rheoencephalography (REG): studies in animals. J. Phys. Conf. Ser. 2013, 434, 12042. [Google Scholar] [CrossRef]
  9. Bodo, M.; Pearce, F.J.; Baranyi, L.; Armonda, R.A. Changes in the intracranial rheoencephalogram at lower limit of cerebral blood flow autoregulation. Physiol. Meas. 2005, 26, S1–S17. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Bodo, M.; Pearce, F.; Garcia, A. In vivo cerebral blood flow autoregulation studies using rheoencephalography. J. Phys. 2010, 224, 8–11. [Google Scholar] [CrossRef]
  11. Jevning, R.; Fernando, G.; Wilson, A.F. Evaluation of consistency among different electrical impedance indices of relative cerebral blood flow in normal resting individuals. J. Biomed. Eng. 1989, 11, 53–56. [Google Scholar] [CrossRef]
  12. Jacquy, J.; Dekoninck, W.J.; Piraux, A.; Calay, R.; Bacq, J.; Levy, D.; Noel, G. Cerebral blood flow and quantitative rheoencephalography. Electroencephalogr. Clin. Neurophysiol. 1974, 37, 507–511. [Google Scholar] [CrossRef]
  13. Moskalenko, Y.E. Rheoencephalography: Past Popularity, Obvilion at Present and Optimistic Future. Int. J. Adv. Life Sci. Technol. 2015, 2, 1–15. [Google Scholar] [CrossRef]
  14. Perez, J.J. To what extent is the bipolar rheoencephalographic signal contaminated by scalp blood flow? A clinical study to quantify its extra and non-extracranial components. Biomed. Eng. Online 2014, 13, 1–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Perez, J.J.; Guijarro, E.; Sancho, J.; Navarre, A. Extraction of the Intracranial Component from the Rheoencephalographic Signal: A New Approach. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 6064–6067. [Google Scholar]
  16. Pérez, J.J.; Guijarro, E.; Barcia, J.A. Quantification of intracranial contribution to rheoencephalography by a numerical model of the head. Clin. Neurophysiol. 2000, 111, 1306–1314. [Google Scholar] [CrossRef]
  17. Pincus, S.M.; Gladstone, I.M.; Ehrenkranz, R.A. A regularity statistic for medical data analysis. J. Clin. Monit. 1991, 7, 335–345. [Google Scholar] [CrossRef] [PubMed]
  18. Beckers, F.; Ramaekers, D.; Aubert, A.E. Approximate entropy of heart rate variability: Validation of methods and application in heart failure. Cardiovasc. Eng. An Int. J. 2001, 1, 177–182. [Google Scholar] [CrossRef]
  19. Li, X.; Yu, S.; Chen, H.; Lu, C.; Zhang, K.; Li, F. Cardiovascular autonomic function analysis using approximate entropy from 24-h heart rate variability and its frequency components in patients with type 2 diabetes. J. Diabetes Investig. 2015, 6, 227–235. [Google Scholar] [CrossRef]
  20. Burioka, N.; Miyata, M.; Cornélissen, G.; Halberg, F.; Takeshima, T.; Kaplan, D.T.; Suyama, H.; Endo, M.; Maegaki, Y.; Nomura, T.; et al. Approximate entropy in the electroencephalogram during wake and sleep. Clin. EEG Neurosci. 2005, 36, 21–24. [Google Scholar] [CrossRef]
  21. Lee, G.M.H.; Fattinger, S.; Mouthon, A.-L.; Noirhomme, Q.; Huber, R. Electroencephalogram approximate entropy influenced by both age and sleep. Front. Neuroinform. 2013, 7, 33. [Google Scholar] [CrossRef] [Green Version]
  22. Bruhn, J.; Röpcke, H.; Hoeft, A. Approximate entropy as an electroencephalographic measure of anesthetic drug effect during desflurane anesthesia. Anesthesiol. J. Am. Soc. Anesthesiol. 2000, 92, 715–726. [Google Scholar] [CrossRef]
  23. Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Circ. Physiol. 2000, 278, H2039–H2049. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Ahmadi, S.; Sepehri, N.; Wu, C.; Szturm, T. Sample Entropy of Human Gait Center of Pressure Displacement: A Systematic Methodological Analysis. Entropy 2018, 20, 579. [Google Scholar] [CrossRef]
  25. Wang, F.; Wang, H.; Fu, R. Real-Time ECG-based detection of fatigue driving using sample entropy. Entropy 2018, 20, 196. [Google Scholar] [CrossRef]
  26. Zhang, D.; Ding, H.; Liu, Y.; Zhou, C.; Ding, H.; Ye, D. Neurodevelopment in newborns: A sample entropy analysis of electroencephalogram. Physiol. Meas. 2009, 30, 491. [Google Scholar] [CrossRef] [PubMed]
  27. Montesinos, L.; Castaldo, R.; Pecchia, L. On the use of approximate entropy and sample entropy with centre of pressure time-series. J. Neuroeng. Rehabil. 2018, 15, 116. [Google Scholar] [CrossRef]
  28. Yentes, J.M.; Hunt, N.; Schmid, K.K.; Kaipust, J.P.; McGrath, D.; Stergiou, N. The appropriate use of approximate entropy and sample entropy with short data sets. Ann. Biomed. Eng. 2013, 41, 349–365. [Google Scholar] [CrossRef] [PubMed]
  29. Chen, W.; Wang, Z.; Xie, H.; Yu, W. Characterization of surface EMG signal based on fuzzy entropy. IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 266–272. [Google Scholar] [CrossRef]
  30. Simons, S.; Espino, P.; Abásolo, D. Fuzzy entropy analysis of the electroencephalogram in patients with Alzheimer’s disease: is the method superior to sample entropy? Entropy 2018, 20, 21. [Google Scholar] [CrossRef]
  31. Lu, S.; Chen, X.; Kanters, J.K.; Solomon, I.C.; Chon, K.H. Automatic Selection of the Threshold Value r for Approximate Entropy. IEEE Trans. Biomed. Eng. 2008, 55, 1966–1972. [Google Scholar]
  32. Liu, C.; Liu, C.; Shao, P.; Li, L.; Sun, X.; Wang, X.; Liu, F. Comparison of different threshold values r for approximate entropy: application to investigate the heart rate variability between heart failure and healthy control groups. Physiol. Meas. 2010, 32, 167. [Google Scholar] [CrossRef]
  33. Chon, K.H.; Scully, C.G.; Lu, S. Approximate entropy for all signals. IEEE Eng. Med. Biol. Mag. 2009, 28. [Google Scholar] [CrossRef] [PubMed]
  34. Restrepo, J.F.; Schlotthauer, G.; Torres, M.E. Maximum approximate entropy and r threshold: A new approach for regularity changes detection. Phys. A Stat. Mech. Appl. 2014, 409, 97–109. [Google Scholar] [CrossRef]
  35. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  36. Phung, D.Q.; Tran, D.; Ma, W.; Nguyen, P.; Pham, T. Using Shannon Entropy as EEG Signal Feature for Fast Person Identification. In Proceedings of the ESANN, Bruges, Belgium, 23–25 April 2014; Volume 4, pp. 413–418. [Google Scholar]
  37. Granero-Belinchon, C.; Roux, S.; Abry, P.; Doret, M.; Garnier, N. Information theory to probe intrapartum fetal heart rate dynamics. Entropy 2017, 19, 640. [Google Scholar] [CrossRef]
  38. Porta, A.; Baselli, G.; Liberati, D.; Montano, N.; Cogliati, C.; Gnecchi-Ruscone, T.; Malliani, A.; Cerutti, S. Measuring regularity by means of a corrected conditional entropy in sympathetic outflow. Biol. Cybern. 1998, 78, 71–78. [Google Scholar] [CrossRef]
  39. Guzzetti, S.; Mezzetti, S.; Magatelli, R.; Porta, A.; De Angelis, G.; Rovelli, G.; Malliani, A. Linear and non-linear 24 h heart rate variability in chronic heart failure. Auton. Neurosci. Basic Clin. 2000, 86, 114–119. [Google Scholar] [CrossRef]
  40. Faes, L.; Nollo, G.; Porta, A. Mechanisms of causal interaction between short-term RR interval and systolic arterial pressure oscillations during orthostatic challenge. J. Appl. Physiol. 2013, 114, 1657–1667. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Viola, A.U.; Tobaldini, E.; Chellappa, S.L.; Casali, K.R.; Porta, A.; Montano, N. Short-term complexity of cardiac autonomic control during sleep: REM as a potential risk factor for cardiovascular system in aging. PLoS ONE 2011, 6, e19002. [Google Scholar] [CrossRef] [PubMed]
  42. Pham, T.D.; Thang, T.C.; Oyama-Higa, M.; Sugiyama, M. Mental-disorder detection using chaos and nonlinear dynamical analysis of photoplethysmographic signals. Chaos Solitons Fractals 2013, 51, 64–74. [Google Scholar]
  43. Borowska, M. Entropy-based algorithms in the analysis of biomedical signals. Stud. Logic Gramm. Rhetor. 2015, 43, 21–32. [Google Scholar] [CrossRef]
  44. Porta, A.; Guzzetti, S.; Montano, N.; Pagani, M.; Somers, V.; Malliani, A.; Baselli, G.; Cerutti, S. Information domain analysis of cardiovascular variability signals: evaluation of regularity, synchronisation and co-ordination. Med. Biol. Eng. Comput. 2000, 38, 180–188. [Google Scholar] [CrossRef] [PubMed]
  45. Pincus, S.M.; Huang, W.-M. Approximate entropy: statistical properties and applications. Commun. Stat. Methods 1992, 21, 3061–3077. [Google Scholar] [CrossRef]
  46. Wolf, A.; Swift, J.B.; Swinney, H.L.; Vastano, J.A. Determining Lyapunov exponents from a time series. Phys. D Nonlinear Phenom. 1985, 16, 285–317. [Google Scholar] [CrossRef] [Green Version]
  47. Pincus, S.M.; Goldberger, A.L. Physiological time-series analysis: What does regularity quantify? Am. J. Physiol. Circ. Physiol. 1994, 266, H1643–H1656. [Google Scholar] [CrossRef] [PubMed]
  48. Castiglioni, P.; Di Rienzo, M. How the threshold “r” influences approximate entropy analysis of heart-rate variability. In Proceedings of the Computers in Cardiology, Bologna, Italy, 14–17 September 2008; pp. 561–564. [Google Scholar]
  49. Alcaraz, R.; Abásolo, D.; Hornero, R.; Rieta, J.J. Optimal parameters study for sample entropy-based atrial fibrillation organization analysis. Comput. Methods Programs Biomed. 2010, 99, 124–132. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  50. Castiglioni, P.; Zurek, S.; Piskorski, J.; Kosmider, M.; Guzik, P.; Ce, E.; Rampichini, S.; Merati, G. Assessing sample entropy of physiological signals by the norm component matrix algorithm: Application on muscular signals during isometric contraction. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 5053–5056. [Google Scholar]
  51. Lake, D.E.; Richman, J.S.; Griffin, M.P.; Moorman, J.R. Sample entropy analysis of neonatal heart rate variability. Am. J. Physiol. Integr. Comp. Physiol. 2002, 283, R789–R797. [Google Scholar] [CrossRef] [Green Version]
  52. Lewis, M.J.; Short, A.L. Sample entropy of electrocardiographic RR and QT time-series data during rest and exercise. Physiol. Meas. 2007, 28, 731. [Google Scholar] [CrossRef]
  53. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef] [Green Version]
  54. Xie, H.-B.; Chen, W.-T.; He, W.-X.; Liu, H. Complexity analysis of the biomedical signal using fuzzy entropy measurement. Appl. Soft Comput. 2011, 11, 2871–2879. [Google Scholar] [CrossRef]
  55. Liu, C.; Zhao, L. Using fuzzy measure entropy to improve the stability of traditional entropy measures. In Proceedings of the Computing in Cardiology, Hangzhou, China, 18–21 September 2011; pp. 681–684. [Google Scholar]
  56. González, C.; Jensen, E.W.; Gambús, P.L.; Vallverdú, M. Poincaré plot analysis of cerebral blood flow signals: Feature extraction and classification methods for apnea detection. PLoS ONE 2018, 13, e0208642. [Google Scholar] [CrossRef]
  57. Montgomery, L.D.; Montgomery, R.W.; Guisado, R. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures. Biol. Psychol. 1995, 40, 143–159. [Google Scholar] [CrossRef]
  58. Lovett, J.W.D.; Barchha, R.; Lee, R.S.; Little, M.H.; Watkinson, J.S. Acute effects of ECT on the cerebral circulation in man. A computerized study by cerebral impedance plethysmography. Eur. Neurol. 1974, 12, 47–62. [Google Scholar] [CrossRef] [PubMed]
  59. Porta, A.; Guzzetti, S.; Montano, N.; Furlan, R.; Pagani, M.; Malliani, A.; Cerutti, S. Entropy, entropy rate, and pattern classification as tools to typify complexity in short heart period variability series. IEEE Trans. Biomed. Eng. 2001, 48, 1282–1291. [Google Scholar] [CrossRef] [PubMed]
  60. Mayer, C.C.; Bachler, M.; Hörtenhuber, M.; Stocker, C.; Holzinger, A.; Wassertheurer, S. Selection of entropy-measure parameters for knowledge discovery in heart rate variability data. BMC Bioinform. 2014, 15, S2. [Google Scholar] [CrossRef] [PubMed]
  61. Bošković, A.; Lončar-Turukalo, T.; Japundžić-Žigon, N.; Bajić, D. The flip-flop effect in entropy estimation. In Proceedings of the 2011 IEEE 9th International Symposium on Intelligent Systems and Informatics, Subotica, Serbia, 8–10 September 2011; pp. 227–230. [Google Scholar]
  62. Aktaruzzaman, M.; Sassi, R. Parametric estimation of sample entropy in heart rate variability analysis. Biomed. Signal Process. Control 2014, 14, 141–147. [Google Scholar] [CrossRef]
  63. Cuesta-Frau, D.; Miro-Martinez, P.; Oltra-Crespo, S.; Varela-Entrecanales, M.; Aboy, M.; Novak, D.; Austin, D. Measuring body temperature time series regularity using approximate entropy and sample entropy. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; pp. 3461–3464. [Google Scholar]
Figure 1. Filtered rheoencephalography (REG) signal collected during breath holding.
Figure 2. Corrected conditional entropy (CCE(N, m, ε)) values of apnea and baseline recordings as a function of (a) the quantification intervals (ε), (b) the embedding dimension (m) and (c) the signal length (N). The corresponding statistical significance (p-value) of the differences between apnea and baseline recordings is presented in (d), (e) and (f), respectively.
Figure 3. The influence of the signal length and the number of quantification intervals on the regularity index (ρ): (a) values of ρ as a function of the number of quantification intervals (ε) and (b) values of ρ as a function of the signal length (N). The results of the statistical analysis (p-value) comparing apnea and baseline signals using this index are shown in: (c) p-values versus the number of quantification intervals and (d) p-values versus the signal length.
Figure 4. Values of the entropy CCE (ε = 20; N = 2000) as a function of the embedding dimension m for all apnea (a) and baseline (b) recordings, including their median values (thick black line).
Figure 5. Values of the entropies (a) ApEn, (b) SampEn and (c) FuzzyEn as a function of the number of samples (N) and the dimension (m) for apnea (solid line) and baseline segments (dashed line).
Figure 6. Entropy values of ApEn(2, r, 2000), SampEn(2, r, 2000) and FuzzyEn(2, 2, r, 2000) as a function of r for apnea and baseline recordings (a–c) and the corresponding p-values (d–f).
Figure 7. (a) Median FuzzyEn values as a function of n, including the interquartile range (25th–75th percentile, colored area); (b) standard deviation of FuzzyEn as a function of n; (c) p-value obtained comparing FuzzyEn values in apnea and baseline groups as a function of n.
Figure 8. Standard deviation of ApEn(2, r, 2000), SampEn(2, r, 2000) and FuzzyEn(2, 2, r, 2000) as a function of r comparing baseline and apnea segments.
Figure 9. Receiver operating characteristic (ROC) curves of all entropy metrics providing statistically significant differences between apnea and baseline recordings.
Figure 10. Boxplot of all selected entropy metrics, showing the median values (horizontal red lines) and outliers (red crosses).
Table 1. Parameter combinations used to calculate each entropy metric.

| Entropy Metric | Signal Length (N) (samples) | Embedding Dimension (m) | Filtering Level (r) | Quantization Intervals (ε) | Fuzzy Function Gradient (n) |
|---|---|---|---|---|---|
| Shannon Entropy | 1000 to 4000 | 2 to 4 | – | 10 to 50 | – |
| Corrected Conditional Entropy | 1000 to 4000 | 2 to 4 * | – | 10 to 50 | – |
| Approximate Entropy | 1000 to 4000 | 2 to 4 | 0.05 to 0.3 | – | – |
| Sample Entropy | 1000 to 4000 | 2 to 4 | 0.05 to 0.3 | – | – |
| Fuzzy Entropy | 1000 to 4000 | 2 to 4 | 0.05 to 0.3 | – | 2 to 10 |

* Only used in CCE calculation, not applicable for ρ.
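For concreteness, the following Python sketch shows how the parameters listed in Table 1 enter the SampEn and FuzzyEn calculations. It follows the standard definitions of these metrics rather than the authors' own code: the tolerance r is expressed as a fraction of the segment's standard deviation, and FuzzyEn uses the exponential membership function exp(−(d/r)^n), which is one common choice. The function names are illustrative, and the code is written for clarity, not speed.

```python
import numpy as np

def _templates(x, dim, count):
    """First `count` overlapping template vectors of length `dim` from signal x."""
    return np.array([x[i:i + dim] for i in range(count)])

def _chebyshev_pairs(templates):
    """Pairwise Chebyshev (maximum) distances between all template vectors."""
    return np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)

def sample_entropy(x, m=2, r=0.25):
    """SampEn(m, r, N); r is a fraction of the standard deviation of the segment."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    count = len(x) - m  # same number of templates for dimensions m and m + 1

    def match_fraction(dim):
        d = _chebyshev_pairs(_templates(x, dim, count))
        k = len(d)
        return (np.sum(d <= tol) - k) / (k * (k - 1))  # exclude self-matches

    return -np.log(match_fraction(m + 1) / match_fraction(m))

def fuzzy_entropy(x, m=2, n=2, r=0.25):
    """FuzzyEn(m, n, r, N) with exponential membership exp(-(d/r)^n)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    count = len(x) - m

    def mean_similarity(dim):
        t = _templates(x, dim, count)
        t = t - t.mean(axis=1, keepdims=True)  # remove the local baseline of each template
        sim = np.exp(-(_chebyshev_pairs(t) / tol) ** n)
        k = len(t)
        return (np.sum(sim) - k) / (k * (k - 1))  # exclude self-similarities

    return -np.log(mean_similarity(m + 1) / mean_similarity(m))

# Illustrative use on a placeholder 2000-sample segment (not real REG data):
# rng = np.random.default_rng(0)
# segment = rng.standard_normal(2000)
# sample_entropy(segment, m=2, r=0.25), fuzzy_entropy(segment, m=2, n=2, r=0.25)
```

With r defined relative to the standard deviation, the parameter grid of Table 1 can be swept simply by cropping each recording to N samples and looping over m, r and n.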
Table 2. Statistical significance values of the entropies ApEn, SampEn and FuzzyEn for apnea detection as a function of the embedding dimension (m) and the signal length (N).

|         | N = 1000 | N = 2000 | N = 3000 | N = 4000 |
|---------|----------|----------|----------|----------|
| ApEn    |          |          |          |          |
| m = 2   | 0.0044   | 0.0006   | 0.0006   | 0.0004   |
| m = 3   | 0.0131   | 0.0014   | 0.0013   | 0.0004   |
| m = 4   | 0.6379   | 0.4915   | 0.5376   | 0.3391   |
| SampEn  |          |          |          |          |
| m = 2   | 0.048    | 0.014    | 0.017    | 0.017    |
| m = 3   | 0.166    | 0.195    | 0.145    | 0.136    |
| m = 4   | 0.387    | 0.280    | 0.183    | 0.172    |
| FuzzyEn |          |          |          |          |
| m = 2   | 0.00076  | 0.00013  | 0.00014  | 0.00012  |
| m = 3   | 0.00086  | 0.00018  | 0.00016  | 0.00014  |
| m = 4   | 0.00329  | 0.00042  | 0.00022  | 0.00021  |
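A grid of p-values like Table 2 can be obtained by recomputing each entropy for every combination of m and N and comparing the apnea and baseline distributions. The sketch below is only illustrative: it assumes an unpaired two-sided Mann–Whitney U test, which may not be the statistical test actually applied in the study, and reuses the sample_entropy function sketched after Table 1.

```python
from itertools import product
import numpy as np
from scipy.stats import mannwhitneyu

def pvalue_grid(apnea_segments, baseline_segments, entropy_fn,
                m_values=(2, 3, 4), lengths=(1000, 2000, 3000, 4000)):
    """p-value of the apnea vs. baseline comparison for every (m, N) combination.

    apnea_segments, baseline_segments: lists of 1-D arrays, one per recording.
    entropy_fn: callable(signal, m) -> float, e.g. the SampEn sketch above.
    """
    grid = {}
    for m, length in product(m_values, lengths):
        apnea_vals = [entropy_fn(seg[:length], m) for seg in apnea_segments]
        base_vals = [entropy_fn(seg[:length], m) for seg in baseline_segments]
        _, p = mannwhitneyu(apnea_vals, base_vals, alternative='two-sided')
        grid[(m, length)] = p
    return grid

# Example call (hypothetical variable names):
# pvalue_grid(apnea_regs, baseline_regs, lambda s, m: sample_entropy(s, m=m, r=0.25))
```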
Table 3. Mean values and standard deviation (std) of all entropy metrics when comparing apnea and baseline recordings. The values of the parameters that best describe each entropy metric are included. Statistics such as the p-value, area under the curve (AUC) and accuracy (acc) are provided to assess the ability of the entropy metrics to distinguish between apnea and baseline.

| Entropy Measure | Parameters | Apnea (Mean ± std) | Baseline (Mean ± std) | p-Value | AUC | acc (%) |
|---|---|---|---|---|---|---|
| ApEn | r = 0.25, m = 2, N = 2000 | 0.155 ± 0.045 | 0.118 ± 0.035 | 0.0003 | 0.789 | 69.8 |
| SampEn | r = 0.25, m = 2, N = 2000 | 0.111 ± 0.031 | 0.092 ± 0.022 | 0.0132 | 0.698 | 60.4 |
| FuzzyEn | r = 0.25, m = 2, N = 2000, n = 2 | 0.021 ± 0.009 | 0.015 ± 0.006 | 0.0001 | 0.809 | 69.8 |
| CCE | ε = 20, m = 2, N = 2000 | 0.581 ± 0.063 | 0.518 ± 0.075 | 0.0016 | 0.744 | 67.9 |
| ρ | ε = 20, N = 2000 | 0.838 ± 0.024 | 0.854 ± 0.017 | 0.0084 | 0.713 | 62.3 |
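The AUC and accuracy columns of Table 3 summarize how well a single entropy value separates apnea from baseline recordings. The following sketch shows one straightforward way to obtain both figures with scikit-learn, treating apnea as the positive class and reporting the accuracy at the ROC operating point that maximizes the number of correct classifications; this thresholding criterion is an assumption made here for illustration, not necessarily the one used in the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def separability(apnea_vals, baseline_vals):
    """AUC and best single-threshold accuracy for one entropy metric."""
    apnea_vals = np.asarray(apnea_vals, dtype=float)
    baseline_vals = np.asarray(baseline_vals, dtype=float)
    y_true = np.concatenate([np.ones(len(apnea_vals)), np.zeros(len(baseline_vals))])
    scores = np.concatenate([apnea_vals, baseline_vals])

    auc = roc_auc_score(y_true, scores)

    # Accuracy at the ROC operating point with the most correct classifications
    fpr, tpr, _ = roc_curve(y_true, scores)
    n_pos, n_neg = len(apnea_vals), len(baseline_vals)
    accuracy = np.max((tpr * n_pos + (1.0 - fpr) * n_neg) / (n_pos + n_neg))

    return auc, accuracy

# Example call with one entropy value per recording (hypothetical arrays):
# auc, acc = separability(fuzzyen_apnea, fuzzyen_baseline)
```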
Table 4. Mean values and standard deviation (std) of all the features extracted from the linear time series and p-value statistics illustrating their ability to distinguish between apnea and baseline signals.

| Parameter | Units | Apnea (Mean ± std) | Baseline (Mean ± std) | p-Value |
|---|---|---|---|---|
| Max | Ω | 0.041 ± 0.014 | 0.045 ± 0.017 | 0.356 |
| Min | Ω | −0.051 ± 0.017 | −0.054 ± 0.018 | 0.523 |
| Range | Ω | 0.092 ± 0.028 | 0.099 ± 0.033 | 0.376 |
| Δtmax | samples | 238.7 ± 22.1 | 254.9 ± 43.2 | 0.084 |
| Δtmin | samples | 242.11 ± 23.2 | 248.6 ± 38.8 | 0.455 |
| Δtmin-max | samples | 52.88 ± 27.36 | 60.56 ± 24.76 | 0.217 |
| α | a.u. | 0.002 ± 0.001 | 0.002 ± 0.001 | 0.406 |
| Area | Ω·s | 12.453 ± 4.766 | 13.471 ± 4.856 | 0.446 |
| δmax | Ω/s | 0.006 ± 0.002 | 0.005 ± 0.002 | 0.272 |
| δrange | Ω/s | 0.007 ± 0.002 | 0.007 ± 0.002 | 0.145 |

