Open Access Article
Multiscale Cross-Approximate Entropy Analysis of Bilateral Fingertips Photoplethysmographic Pulse Amplitudes among Middle-to-Old Aged Individuals with or without Type 2 Diabetes
Entropy 2017, 19(4), 145; doi:10.3390/e19040145 -
Abstract
Multiscale cross-approximate entropy (MC-ApEn) between two different physiological signals could evaluate cardiovascular health in diabetes. Whether MC-ApEn analysis between two similar signals, such as the photoplethysmographic (PPG) pulse amplitudes of bilateral fingertips, can reflect diabetes status is unknown. From a middle-to-old-aged population free of prior cardiovascular disease, we selected the unaffected (no type 2 diabetes, n = 36), the well-controlled diabetes (glycated hemoglobin (HbA1c) < 8%, n = 30), and the poorly-controlled diabetes (HbA1c ≥ 8%, n = 26) groups. MC-ApEn indexes were calculated from 1500 simultaneous, consecutive PPG pulse amplitude signals of the bilateral index fingertips. The averages over scale factors 1–5 (MC-ApEnSS) and over scale factors 6–10 (MC-ApEnLS) were defined as the small- and large-scale MC-ApEn, respectively. The MC-ApEnLS index was highest in the unaffected group, followed by the well-controlled and then the poorly-controlled diabetes groups (0.70, 0.62, and 0.53; all paired p-values were <0.05); in contrast, the MC-ApEnSS index did not differ between groups. Our findings suggest that the bilateral-fingertip large-scale MC-ApEnLS index of PPG pulse amplitudes might be able to evaluate glycemic status and detect subtle vascular disease in type 2 diabetes. Full article
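As a rough illustration of how an MC-ApEn index of this kind can be computed, the sketch below coarse-grains two standardized pulse-amplitude series and evaluates cross-approximate entropy at each scale factor; the embedding dimension m, tolerance r, and the standardization step are illustrative assumptions, since the abstract does not state the parameters used.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping coarse-graining used in multiscale entropy analyses."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def cross_apen(u, v, m=2, r=0.15):
    """Cross-approximate entropy between two equal-length, standardized series."""
    def phi(k):
        xu = np.array([u[i:i + k] for i in range(len(u) - k + 1)])
        xv = np.array([v[i:i + k] for i in range(len(v) - k + 1)])
        # Chebyshev distance between every template of u and every template of v
        d = np.max(np.abs(xu[:, None, :] - xv[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)  # assumes every template finds at least one match
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

def mc_apen(u, v, scales=range(1, 11), m=2, r=0.15):
    """Cross-ApEn of the coarse-grained series at each scale factor (1-10)."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    return {s: cross_apen(coarse_grain(u, s), coarse_grain(v, s), m, r) for s in scales}
```

Averaging these values over scale factors 1–5 and 6–10 would then give the small- and large-scale indices described above.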
Open Access Article
The Many Classical Faces of Quantum Structures
Entropy 2017, 19(4), 144; doi:10.3390/e19040144 -
Abstract
Interpretational problems with quantum mechanics can be phrased precisely by talking only about empirically accessible information. This prompts a mathematical reformulation of quantum mechanics in terms of classical mechanics. We survey this programme in terms of algebraic quantum theory. Full article
Open Access Article
Paradigms of Cognition
Entropy 2017, 19(4), 143; doi:10.3390/e19040143 -
Abstract
An abstract, quantitative theory which connects elements of information—key ingredients in the cognitive process—is developed. Seemingly unrelated results are thereby unified. As an indication of this, consider results in classical probabilistic information theory involving information projections and so-called Pythagorean inequalities. These bear a certain resemblance to classical results in geometry carrying Pythagoras’ name. By appealing to the abstract theory presented here, one has a common point of reference for these results. In fact, the new theory provides a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics, and statistical physics. Several applications are given; among them, an “explanation” of Tsallis entropy is suggested. For this, as well as for the general development of the abstract underlying theory, emphasis is placed on interpretations and associated philosophical considerations. Technically, game theory is the key tool. Full article
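For readers unfamiliar with the classical result alluded to, the Pythagorean inequality for information projections (a standard statement, not reproduced in the abstract) reads: if p* is the I-projection of a distribution q onto a convex set C of distributions, then

```latex
D(p \,\|\, q) \;\ge\; D(p \,\|\, p^{*}) + D(p^{*} \,\|\, q) \qquad \text{for all } p \in C,
```

where D(·‖·) denotes the Kullback–Leibler divergence.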
Open Access Article
A Novel Framework for Shock Filter Using Partial Differential Equations
Entropy 2017, 19(4), 142; doi:10.3390/e19040142 -
Abstract
In dilation or erosion processes, a shock filter is widely used for signal enhancement or image deblurring. Traditionally, the sign function is employed in shock filtering to reweight edge detection in images and decide whether a pixel should dilate to the local maximum or evolve to the local minimum. Some researchers replace the sign function with the tanh or arctan function, trying to change the evolution tracks of the pixels while filtering is in progress. However, the analysis here reveals that function replacement alone usually does not work. This paper first revisits shock filters and their modifications. Then, a fuzzy shock filter is proposed, in which a membership function is adopted in the shock filter model to adjust the evolution rate of image pixels. The proposed filter is a parameter-tuning system that unites several formulations of shock filters into one fuzzy framework. Experimental results show that the new filter is flexible, robust, and converges quickly. Full article
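A minimal one-dimensional sketch of the classical shock filter referred to above, u_t = -sign(u_xx) |u_x|, is given below; the optional tanh smoothing illustrates the kind of function replacement discussed in the abstract. The time step, iteration count, and periodic boundaries are illustrative choices, and a practical implementation would use upwind differences.

```python
import numpy as np

def shock_filter_1d(u, n_iter=50, dt=0.1, k=None):
    """Explicit 1-D shock filter u_t = -sign(u_xx)|u_x|.
    If k is given, sign() is replaced by tanh(k * u_xx), one of the smoothed
    variants mentioned in the abstract."""
    u = np.asarray(u, dtype=float).copy()
    for _ in range(n_iter):
        ux = (np.roll(u, -1) - np.roll(u, 1)) / 2.0      # central first derivative
        uxx = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)   # second derivative (edge detector)
        edge = np.tanh(k * uxx) if k is not None else np.sign(uxx)
        u -= dt * edge * np.abs(ux)  # dilate toward local maxima, erode toward local minima
    return u
```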
Open Access Article
Impact Location and Quantification on an Aluminum Sandwich Panel Using Principal Component Analysis and Linear Approximation with Maximum Entropy
Entropy 2017, 19(4), 137; doi:10.3390/e19040137 -
Abstract
To avoid structural failures, it is of critical importance to detect, locate, and quantify impact damage as soon as it occurs. This can be achieved by impact identification methodologies that continuously monitor the structure, detecting, locating, and quantifying impacts as they occur. This article presents an improved impact identification algorithm that uses principal component analysis (PCA) to extract features from the monitored signals and an algorithm based on linear approximation with maximum entropy to estimate the impacts. The proposed methodology is validated with two experimental applications: an aluminum plate and an aluminum sandwich panel. The results are compared with those of other impact identification algorithms available in the literature, demonstrating that the proposed method outperforms them. Full article
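A minimal sketch of the feature-extraction step is shown below, with a plain inverse-distance interpolation standing in for the paper's linear approximation with maximum entropy, whose weighting is not specified in the abstract; the array shapes and component count are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def extract_features(signals, n_components=10):
    """Project raw sensor responses (n_impacts x n_samples) onto principal components."""
    pca = PCA(n_components=n_components)
    return pca.fit_transform(signals), pca

def estimate_impact(train_features, train_targets, new_feature, eps=1e-6):
    """Interpolate impact location/force from nearby training impacts in feature space.
    Inverse-distance weights are a generic surrogate here, not the paper's
    maximum-entropy weights."""
    d = np.linalg.norm(train_features - new_feature, axis=1)
    w = 1.0 / (d + eps)
    w /= w.sum()
    return w @ train_targets
```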
Open Access Article
Permutation Entropy for the Characterisation of Brain Activity Recorded with Magnetoencephalograms in Healthy Ageing
Entropy 2017, 19(4), 141; doi:10.3390/e19040141 -
Abstract
The characterisation of healthy ageing of the brain could help create a fingerprint of normal ageing that might assist in the early diagnosis of neurodegenerative conditions. This study examined changes in resting state magnetoencephalogram (MEG) permutation entropy due to age and gender in a sample of 220 healthy participants (98 males and 122 females, ages ranging between 7 and 84). Entropy was quantified using normalised permutation entropy and modified permutation entropy, with an embedding dimension of 5 and a lag of 1 as the input parameters for both algorithms. Effects of age were observed over the five regions of the brain, i.e., anterior, central, posterior, and left and right lateral, with the anterior and central regions showing the highest permutation entropy. Statistically significant differences due to age were observed in the different brain regions for both genders, with the evolution described by fitting polynomial regressions. Nevertheless, no significant differences between the genders were observed across all ages. These results suggest that the evolution of entropy in the background brain activity, quantified with permutation entropy algorithms, might be considered an alternative illustration of a ‘nominal’ physiological rhythm. Full article
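A minimal sketch of normalised permutation entropy with the parameters stated above (embedding dimension 5, lag 1) might look as follows; the modified permutation entropy variant, which treats ties differently, is not shown.

```python
import numpy as np
from math import factorial
from itertools import permutations

def permutation_entropy(x, m=5, lag=1, normalise=True):
    """Normalised permutation entropy (Bandt-Pompe) of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * lag
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(n):
        window = x[i:i + m * lag:lag]
        counts[tuple(np.argsort(window).tolist())] += 1  # ordinal pattern of the window
    p = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(m)) if normalise else h
```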
Open Access Article
Ionic Liquids Confined in Silica Ionogels: Structural, Thermal, and Dynamical Behaviors
Entropy 2017, 19(4), 140; doi:10.3390/e19040140 -
Abstract
Ionogels are porous monoliths providing nanometer-scale confinement of an ionic liquid within an oxide network. Various dynamic parameters and the detailed nature of the phase transitions were investigated using a neutron scattering technique, which probes smaller time and space scales than the techniques used in earlier studies. By investigating the nature of the hydrogen mean square displacement (local mobility), qualitative information on diffusion and the different phase transitions was obtained. The results presented herein show similar short-time molecular dynamics between pristine and confined ionic liquids, as reflected in their residence times and diffusion coefficients, thus explaining in depth the good ionic conductivity of ionogels. Full article
Open Access Article
Thermal Ratchet Effect in Confining Geometries
Entropy 2017, 19(4), 119; doi:10.3390/e19040119 -
Abstract
A stochastic model of the Feynman–Smoluchowski ratchet is proposed and solved using a generalization of the Fick–Jacobs theory. The theory fully captures the nonlinear response of the ratchet to the difference in heat bath temperatures. The ratchet performance is discussed using the mean velocity, the average heat flow between the two heat reservoirs, and the figure of merit, which quantifies the energetic cost of attaining a certain mean velocity. Limits of the theory are tested by comparing its predictions to numerics. We also demonstrate the connection between the ratchet effect emerging in the model and rotations of the probability current, and explain the direction of the mean velocity using a simple discrete analogue of the model. Full article
Open Access Article
Leveraging Receiver Message Side Information in Two-Receiver Broadcast Channels: A General Approach †
Entropy 2017, 19(4), 138; doi:10.3390/e19040138 -
Abstract
We consider two-receiver broadcast channels where each receiver may know a priori some of the messages requested by the other receiver as receiver message side information (RMSI). We devise a general approach to leverage RMSI in these channels. To this end, we first propose a pre-coding scheme considering the general message setup where each receiver requests both common and private messages and knows a priori part of the private message requested by the other receiver as RMSI. We then construct the transmission scheme of a two-receiver channel with RMSI by applying the proposed pre-coding scheme to the best transmission scheme for the channel without RMSI. To demonstrate the effectiveness of our approach, we apply our pre-coding scheme to three categories of the two-receiver discrete memoryless broadcast channel: (i) channel without state; (ii) channel with states known causally to the transmitter; and (iii) channel with states known non-causally to the transmitter. We then derive a unified inner bound for all three categories. We show that our inner bound is tight for some new cases in each of the three categories, as well as all cases whose capacity region was known previously. Full article
Open Access Review
The Quantum Harmonic Otto Cycle
Entropy 2017, 19(4), 136; doi:10.3390/e19040136 -
Abstract
The quantum Otto cycle serves as a bridge between the macroscopic world of heat engines and the quantum regime of thermal devices composed of a single element. We compile recent studies of the quantum Otto cycle with a harmonic oscillator as a working medium. This model has the advantage of being analytically tractable. In addition, an experimental realization has been achieved, employing a single ion in a harmonic trap. The review is embedded in the field of quantum thermodynamics and quantum open systems. The basic principles of the theory are explained by a specific example illuminating the basic definitions of work and heat. The relation between quantum observables and the state of the system is emphasized. The dynamical description of the cycle is based on a completely positive map formulated as a propagator for each stroke of the engine. Explicit solutions for these propagators are described on a vector space of quantum thermodynamical observables. These solutions, which employ different assumptions and techniques, are compared. The tradeoff between power and efficiency is the focal point of finite-time thermodynamics. The dynamical model enables the study of finite-time cycles by limiting the times allotted to the adiabats and to thermalization. Explicit finite-time solutions are found which are frictionless (meaning that no coherence is generated), also known as shortcuts to adiabaticity. The transition from frictionless to sudden adiabats is characterized by a non-Hermitian degeneracy in the propagator. In addition, the influence of noise on the control is illustrated. These results are used to close the cycles either as engines or as refrigerators. The properties of the limit cycle are described. Methods to optimize the power by controlling the thermalization time are also introduced. At high temperatures, the Novikov–Curzon–Ahlborn efficiency at maximum power is obtained. The sudden limit of the engine, which allows finite power at zero cycle time, is shown. The refrigerator cycle is described within the frictionless limit, with emphasis on the cooling rate when the cold bath temperature approaches zero. Full article
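For reference, the Novikov–Curzon–Ahlborn efficiency at maximum power mentioned above is the standard expression (not restated in the abstract)

```latex
\eta_{CA} \;=\; 1 - \sqrt{\frac{T_c}{T_h}},
```

where T_c and T_h are the cold- and hot-bath temperatures.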
Open Access Article
Tensor Singular Spectrum Decomposition Algorithm Based on Permutation Entropy for Rolling Bearing Fault Diagnosis
Entropy 2017, 19(4), 139; doi:10.3390/e19040139 -
Abstract
A mechanical vibration signal mapped into a high-dimensional space tends to exhibit special distribution and movement characteristics, which can further reveal the dynamic behavior of the original time series. As the most natural representation of high-dimensional data, a tensor can preserve the intrinsic structure of the data to the maximum extent. Thus, tensor decomposition algorithms have broad application prospects in signal processing. A high-dimensional tensor can be obtained from a one-dimensional vibration signal by using phase space reconstruction, which is called the tensorization of data. As a new signal decomposition method, the tensor-based singular spectrum algorithm (TSSA) fully combines the advantages of phase space reconstruction and tensor decomposition. However, TSSA has some problems, mainly in estimating the rank of the tensor and selecting the optimal reconstruction tensor. In this paper, an improved TSSA algorithm based on convex optimization and permutation entropy (PE) is proposed. Firstly, aiming to accurately estimate the rank of the tensor decomposition, this paper presents a convex optimization algorithm using non-convex penalty functions based on singular value decomposition (SVD). Then, PE is employed to evaluate the desired tensor and improve the denoising performance. In order to verify the effectiveness of the proposed algorithm, both numerical simulations and experimental bearing failure data are analyzed. Full article
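The tensorization step referred to above can be illustrated by a time-delay (phase-space) embedding of the vibration signal; the stacking into a third-order tensor shown here is one plausible construction, since the paper's exact tensorization, rank estimation, and reconstruction steps are not given in the abstract.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Phase-space reconstruction (time-delay embedding) of a 1-D vibration signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + dim * tau:tau] for i in range(n)])

def tensorize(x, dim, tau, n_slices):
    """Stack successive rows of the trajectory matrix into a third-order tensor."""
    traj = delay_embed(x, dim, tau)
    rows = traj.shape[0] // n_slices
    return traj[:rows * n_slices].reshape(n_slices, rows, dim)
```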
Open Access Article
Permutation Entropy: New Ideas and Challenges
Entropy 2017, 19(3), 134; doi:10.3390/e19030134 -
Abstract
Over recent years, some new variants of permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants that use additional metric information or are based on entropies other than the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data. Full article
Open Access Article
Structure and Dynamics of Water at Carbon-Based Interfaces
Entropy 2017, 19(3), 135; doi:10.3390/e19030135 -
Abstract
Water structure and dynamics are affected by the presence of a nearby interface. Here, we first review recent molecular dynamics simulation results on the effect of different carbon-based materials, including armchair carbon nanotubes and a variety of graphene sheets—flat and with corrugation—on water structure and dynamics. We discuss the calculations of binding energies, hydrogen bond distributions, and water diffusion coefficients, and their relation to surface geometry under different thermodynamic conditions. Next, we present new results on the crystallization and dynamics of water in a rigid graphene sieve. In particular, we show that the diffusion of water confined between parallel walls depends on the plate distance in a non-monotonic way and is related to water structuring, crystallization, re-melting, and evaporation with decreasing inter-plate distance. Our results could be relevant in those applications where water is in contact with nanostructured carbon materials at ambient or cryogenic temperatures, as in man-made superhydrophobic materials or filtration membranes, or in techniques that take advantage of hydrated graphene interfaces, as in aqueous electron cryomicroscopy for the analysis of proteins adsorbed on graphene. Full article
Open Access Article
Spectral Entropy Parameters during Rapid Ventricular Pacing for Transcatheter Aortic Valve Implantation
Entropy 2017, 19(3), 133; doi:10.3390/e19030133 -
Abstract
The time-frequency balanced spectral entropy of the EEG is a monitoring technique that measures the level of hypnosis during general anesthesia. Two components of spectral entropy are calculated: state entropy (SE) and response entropy (RE). Transcatheter aortic valve implantation (TAVI) is a less invasive treatment for patients suffering from symptomatic aortic stenosis with contraindications for open heart surgery. The goal of hemodynamic management during the procedure is to achieve hemodynamic stability with exact blood pressure control and the use of rapid ventricular pacing (RVP), which results in severe hypotension. The objective of this study was to examine how spectral entropy values respond to RVP and other critical events during the TAVI procedure. Twenty-one patients undergoing general anesthesia for TAVI were evaluated. RVP was used twice during the procedure at a rate of 185 ± 9/min, with durations of 16 ± 4 s (range 8–22 s) and 24 ± 6 s (range 18–39 s). The systolic blood pressure during RVP was under 50 ± 5 mmHg. SE values declined significantly during RVP, from 28 ± 13 to 23 ± 13 (p < 0.003) and from 29 ± 12 to 24 ± 10 (p < 0.001). The corresponding values for RE were 29 ± 13 vs. 24 ± 13 (p < 0.006) and 30 ± 12 vs. 25 ± 10 (p < 0.001). Both SE and RE values returned to the pre-RVP values after 1 min. Ultra-short hypotension during RVP changed the spectral entropy parameters; however, these indices rapidly reverted to their pre-RVP values. Full article
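A rough sketch of how a normalised spectral entropy value can be obtained from an EEG segment is given below; the 0.8–32 Hz band and the Welch estimator are illustrative assumptions and only approximate the proprietary SE/RE computation used by the monitor.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(eeg, fs, fmin=0.8, fmax=32.0):
    """Normalised Shannon entropy of the EEG power spectrum over a frequency band."""
    f, pxx = welch(eeg, fs=fs, nperseg=min(len(eeg), int(2 * fs)))
    band = (f >= fmin) & (f <= fmax)
    p = pxx[band] / pxx[band].sum()     # normalised power distribution
    h = -np.sum(p * np.log(p + 1e-12))  # Shannon entropy of the spectrum
    return h / np.log(band.sum())       # scaled to the [0, 1] range
```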
Open Access Article
Discrepancies between Conventional Multiscale Entropy and Modified Short-Time Multiscale Entropy of Photoplethysmographic Pulse Signals in Middle- and Old-Aged Individuals with or without Diabetes
Entropy 2017, 19(3), 132; doi:10.3390/e19030132 -
Abstract
Multiscale entropy (MSE) of physiological signals may reflect cardiovascular health in diabetes. The classic MSE (cMSE) algorithm requires more than 750 signals for the calculations. The modified short-time MSE (sMSE) may yield inconsistent outcomes compared with the cMSE at large time scales and in a disease status. Therefore, we compared the cMSE of 1500 (cMSE1500) and 1000 (cMSE1000) consecutive photoplethysmographic (PPG) pulse amplitudes with the sMSE of 500 PPG pulse amplitudes (sMSE500) of bilateral fingertips among middle- to old-aged individuals with or without type 2 diabetes. We discovered that cMSE1500 had the smallest value across scale factors 1–10, followed by cMSE1000 and then sMSE500, in both hands. The cMSE1500, cMSE1000, and sMSE500 did not differ at each scale factor in both hands of persons without diabetes or in the dominant hand of those with diabetes. In contrast, sMSE500 differed at all scales 1–10 in the non-dominant hand of those with diabetes. In conclusion, autonomic dysfunction, prevalent in the non-dominant hand, which had low local physical activity in persons with diabetes, might be imprecisely evaluated by the sMSE; therefore, using a larger number of PPG signals for the cMSE is preferred in such a situation. Full article
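For context, the classic MSE procedure compared above coarse-grains the series and computes sample entropy at each scale; a compact sketch follows, with the embedding dimension m and tolerance r as typical assumptions not stated in the abstract.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """Sample entropy of a standardized series (r in units of standard deviation)."""
    def matches(k):
        t = np.array([x[i:i + k] for i in range(len(x) - k)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)  # exclude self-matches
        return (d <= r).sum()
    return -np.log(matches(m + 1) / matches(m))

def multiscale_entropy(x, scales=range(1, 11), m=2, r=0.15):
    """Classic MSE: sample entropy of the coarse-grained series at each scale factor."""
    x = (np.asarray(x, dtype=float) - np.mean(x)) / np.std(x)
    result = {}
    for s in scales:
        n = len(x) // s
        cg = x[:n * s].reshape(n, s).mean(axis=1)
        result[s] = sample_entropy(cg, m, r)
    return result
```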
Open Access Article
Identity Based Generalized Signcryption Scheme in the Standard Model
Entropy 2017, 19(3), 121; doi:10.3390/e19030121 -
Abstract
Generalized signcryption (GSC) can adaptively work as an encryption scheme, a signature scheme, or a signcryption scheme with only one algorithm, which makes it more suitable for storage-constrained settings. In this paper, motivated by Paterson–Schuldt’s scheme and based on bilinear pairing, we propose an identity-based generalized signcryption (IDGSC) scheme in the standard model. To the best of our knowledge, it is the first such scheme proven secure in the standard model. Full article
Open Access Article
Friction, Free Axes of Rotation and Entropy
Entropy 2017, 19(3), 123; doi:10.3390/e19030123 -
Abstract
Friction forces acting on rotators may promote their alignment and therefore eliminate degrees of freedom in their movement. The alignment of rotators by friction force was shown by experiments performed with different spinners, demonstrating how friction generates negentropy in a system of rotators. A gas of rigid rotators influenced by friction force is considered. The orientational negentropy generated by a friction force was estimated with the Sackur-Tetrode equation. The minimal change in total entropy of a system of rotators, corresponding to their eventual alignment, decreases with temperature. The reported effect may be of primary importance for the phase equilibrium and motion of ubiquitous colloidal and granular systems. Full article
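For reference, the translational Sackur–Tetrode entropy of an ideal gas, whose orientational analogue the abstract invokes, is the standard expression

```latex
\frac{S}{N k_B} \;=\; \ln\!\left[\frac{V}{N}\left(\frac{2\pi m k_B T}{h^{2}}\right)^{3/2}\right] + \frac{5}{2}.
```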
Open Access Article
Pairs Generating as a Consequence of the Fractal Entropy: Theory and Applications
Entropy 2017, 19(3), 128; doi:10.3390/e19030128 -
Abstract
In classical approaches, theoretical models are built assuming that the dynamics of a complex system’s structural units occur on continuous and differentiable motion variables. In reality, the dynamics of natural complex systems are much more complicated. These difficulties can be overcome with a complementary approach using the fractal concept and a corresponding non-differentiable theoretical model, such as the scale relativity theory or the extended scale relativity theory. Using the latter theory, fractal entropy through non-differentiable Lie groups was established and, moreover, the pair-generating mechanisms through fractal entanglement states were explained. Our model has implications for the dynamics of biological structures, in the form of the “chameleon-like” behavior of cholesterol. Full article
Open Access Article
Fractional Jensen–Shannon Analysis of the Scientific Output of Researchers in Fractional Calculus
Entropy 2017, 19(3), 127; doi:10.3390/e19030127 -
Abstract
This paper analyses the citation profiles of researchers in fractional calculus. Different metrics are used to quantify the dissimilarities between the data, namely the Canberra distance, and the classical and the generalized (fractional) Jensen–Shannon divergence. The information is then visualized by means of multidimensional scaling and hierarchical clustering. The mathematical tools and metrics allow for direct comparison and visualization of researchers based on their relative positioning and on patterns displayed in two- or three-dimensional maps. Full article
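A minimal sketch of the two classical dissimilarity measures named above is given below; the generalized (fractional) Jensen–Shannon divergence used in the paper is not reproduced here.

```python
import numpy as np

def canberra(x, y):
    """Canberra distance between two citation-count vectors."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    denom = np.abs(x) + np.abs(y)
    mask = denom > 0  # terms where both counts are zero contribute nothing
    return np.sum(np.abs(x - y)[mask] / denom[mask])

def jensen_shannon(p, q):
    """Classical Jensen-Shannon divergence between two normalised distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a[a > 0] * np.log(a[a > 0] / b[a > 0]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```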
Open Access Article
Distance-Based Lempel–Ziv Complexity for the Analysis of Electroencephalograms in Patients with Alzheimer’s Disease
Entropy 2017, 19(3), 129; doi:10.3390/e19030129 -
Abstract
The analysis of electroencephalograms (EEGs) of patients with Alzheimer’s disease (AD) could contribute to the diagnosis of this dementia. In this study, a new non-linear signal processing metric, distance-based Lempel–Ziv complexity (dLZC), is introduced to characterise changes between pairs of electrodes in EEGs in AD. When the complexity in each signal arises from different sub-sequences, dLZC is greater than when similar sub-sequences are present in each signal. EEGs from 11 AD patients and 11 age-matched control subjects were analysed. The dLZC values for AD patients were lower than for control subjects for most electrode pairs, with statistically significant differences (p < 0.01, Student’s t-test) in 17 electrode pairs in the distant left, local posterior left, and interhemispheric regions. Maximum diagnostic accuracies with leave-one-out cross-validation were 77.27% for subject-based classification and 78.25% for epoch-based classification. These findings suggest not only that EEGs from AD patients are less complex than those from controls, but also that the richness of the information contained in pairs of EEGs from patients is lower than in age-matched controls. The analysis of EEGs in AD with dLZC may increase the insight into brain dysfunction, providing complementary information to that obtained with other complexity and synchrony methods. Full article
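As background, a simple incremental-parsing Lempel–Ziv complexity for a single binarised EEG channel is sketched below; the distance-based dLZC introduced in the paper, which compares sub-sequences across a pair of channels, is not reproduced, and binarising around the median is an illustrative choice.

```python
import numpy as np

def lempel_ziv_complexity(sequence):
    """Count the phrases in an incremental (dictionary-based) parsing of a symbol string."""
    phrases, count, i, n = set(), 0, 0, len(sequence)
    while i < n:
        j = i + 1
        while j < n and sequence[i:j] in phrases:
            j += 1
        phrases.add(sequence[i:j])  # shortest substring starting at i not seen before
        count += 1
        i = j
    return count

# e.g., binarise an EEG epoch around its median before parsing:
# symbols = ''.join('1' if v > np.median(eeg) else '0' for v in eeg)
```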