Table of Contents

Entropy, Volume 19, Issue 12 (December 2017)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms, with PDF as the official version of record. To view a paper in PDF format, click its "PDF Full-text" link and open the file with the free Adobe Reader.
Cover Story: The combined application of the Bayesian Model Selection and Maximum Entropy sharply differentiates [...] Read more.
Displaying articles 1-67
Open Access Article: A Geodesic-Based Riemannian Gradient Approach to Averaging on the Lorentz Group
Entropy 2017, 19(12), 698; https://doi.org/10.3390/e19120698
Received: 19 November 2017 / Revised: 16 December 2017 / Accepted: 17 December 2017 / Published: 20 December 2017
Viewed by 1033 | PDF Full-text (293 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we propose an efficient algorithm to solve the averaging problem on the Lorentz group O(n,k). Firstly, we introduce the geometric structures of O(n,k) endowed with a Riemannian metric for which the geodesic can be written in closed form. Then, the algorithm is presented based on the Riemannian-steepest-descent approach. Finally, we compare the above algorithm with the Euclidean gradient algorithm and the extended Hamiltonian algorithm. Numerical experiments show that the geodesic-based Riemannian-steepest-descent algorithm performs the best in terms of convergence rate. Full article
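The geodesic-based steepest descent described in the abstract can be illustrated with a generic Karcher-mean iteration on a matrix group, where each step moves along a geodesic in the direction of the averaged Riemannian gradient. This is a sketch only, assuming SciPy's matrix exponential/logarithm; the paper's specific O(n,k) metric and step-size choices are not reproduced here.

```python
import numpy as np
from scipy.linalg import expm, logm

def riemannian_mean(matrices, lr=0.5, tol=1e-10, max_iter=200):
    """Geodesic-based steepest descent for the mean of matrix-group elements.

    Generic Karcher-mean iteration: X <- X @ expm(lr * mean_i logm(X^-1 A_i)).
    A hypothetical sketch, not the paper's O(n,k)-specific algorithm.
    """
    X = matrices[0].copy()
    for _ in range(max_iter):
        # Gradient of the sum of squared geodesic distances, pulled back
        # to the identity via the group logarithm.
        G = np.mean([logm(np.linalg.solve(X, A)) for A in matrices], axis=0)
        G = np.real(G)  # discard numerical imaginary residue from logm
        if np.linalg.norm(G) < tol:
            break
        X = X @ expm(lr * G)  # move along the geodesic
    return X
```

On commuting inputs (for example, planar rotations) the iteration reduces to averaging the rotation angles, which gives a quick sanity check.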

Open Access Article: Normalised Mutual Information of High-Density Surface Electromyography during Muscle Fatigue
Entropy 2017, 19(12), 697; https://doi.org/10.3390/e19120697
Received: 15 September 2017 / Revised: 8 December 2017 / Accepted: 18 December 2017 / Published: 20 December 2017
Cited by 1 | Viewed by 1216 | PDF Full-text (3283 KB) | HTML Full-text | XML Full-text
Abstract
This study has developed a technique for identifying the presence of muscle fatigue based on the spatial changes of the normalised mutual information (NMI) between multiple high-density surface electromyography (HD-sEMG) channels. Muscle fatigue in the tibialis anterior (TA) during isometric contractions at 40% and 80% maximum voluntary contraction levels was investigated in ten healthy participants (age range: 21 to 35 years; mean age: 26 years; 4 male, 6 female). HD-sEMG was used to record 64 channels of sEMG using a 16 by 4 electrode array placed over the TA. The NMI of each electrode with every other electrode was calculated to form an NMI distribution for each electrode. The total NMI for each electrode (the summation of the electrode’s NMI distribution) highlighted regions of high dependence in the electrode array and was observed to increase as the muscle fatigued. To summarise this increase, a function, M(k), was defined and was found to be significantly affected by fatigue and not by contraction force. The technique discussed in this study overcomes issues regarding electrode placement and was used to investigate how the dependencies between sEMG signals within the same muscle change spatially during fatigue. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
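The pairwise dependence measure described above can be sketched with a histogram-based NMI estimate between two channels. The bin count and the exact normalisation used here (2I(X;Y)/(H(X)+H(Y))) are assumptions, not the paper's stated choices.

```python
import numpy as np

def normalised_mutual_information(x, y, bins=16):
    """Histogram estimate of NMI = 2*I(X;Y) / (H(X)+H(Y)), in [0, 1].

    A generic sketch of the channel-pair dependence measure; the bin count
    and normalisation are assumptions.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    hx, hy, hxy = H(px), H(py), H(pxy.ravel())
    mi = hx + hy - hxy          # mutual information I(X;Y)
    return 2.0 * mi / (hx + hy)
```

Identical channels give NMI of 1; independent channels give a value near 0 (up to the small positive bias of the histogram estimator).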

Open Access Article: Hypothesis Tests for Bernoulli Experiments: Ordering the Sample Space by Bayes Factors and Using Adaptive Significance Levels for Decisions
Entropy 2017, 19(12), 696; https://doi.org/10.3390/e19120696
Received: 31 August 2017 / Revised: 18 December 2017 / Accepted: 18 December 2017 / Published: 20 December 2017
Viewed by 1418 | PDF Full-text (1112 KB) | HTML Full-text | XML Full-text
Abstract
The main objective of this paper is to find the relation between the adaptive significance level presented here and the sample size. We statisticians know of the inconsistency, or paradox, in the current classical tests of significance that are based on p-value statistics compared to the canonical significance levels (10%, 5%, and 1%): “Raise the sample size to reject the null hypothesis” is the recommendation of some ill-advised scientists! This paper will show that it is possible to eliminate this problem of significance tests. We present here the beginning of a larger research project. The intention is to extend its use to more complex applications such as survival analysis, reliability tests, and other areas. The main tools used here are the Bayes factor and the extended Neyman–Pearson Lemma. Full article
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)
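For a Bernoulli experiment, the Bayes factor at the heart of such a procedure has a closed form when the alternative uses a uniform prior. The sketch below is hypothetical and does not reproduce the paper's sample-space ordering or adaptive significance levels; it only evaluates one point-null Bayes factor.

```python
from math import lgamma, log

def log_bayes_factor_01(x, n, theta0=0.5):
    """log Bayes factor for H0: theta = theta0 vs H1: theta ~ Uniform(0,1),
    given x successes in n Bernoulli trials. Positive values favour H0.
    """
    log_m0 = x * log(theta0) + (n - x) * log(1 - theta0)
    # Marginal likelihood under the uniform prior: the Beta(x+1, n-x+1) integral.
    log_m1 = lgamma(x + 1) + lgamma(n - x + 1) - lgamma(n + 2)
    return log_m0 - log_m1
```

When the observed proportion sits exactly at theta0, the log Bayes factor favouring H0 grows with n; this is the flip side of the "raise the sample size" paradox of fixed-level p-value testing mentioned in the abstract.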

Open Access Letter: Molecular Conformational Manifolds between Gas-Liquid Interface and Multiphasic
Entropy 2017, 19(12), 695; https://doi.org/10.3390/e19120695
Received: 27 October 2017 / Revised: 11 December 2017 / Accepted: 12 December 2017 / Published: 19 December 2017
Viewed by 1017 | PDF Full-text (537 KB) | HTML Full-text | XML Full-text
Abstract
The analysis of conformational changes of hydrocarbon molecules is imperative in the prediction of their transport properties in different phases, such as evaporation/condensation coefficients (β) in the gas-liquid interface and evaporation rates of fuel droplets (k) in multiphases. In this letter, we analyze the effects of entropic contributions (TΔS_ev(T)) to ΔG_ev(T) during the evaporation/condensation of chain conformers at the interface with a modified version of the solvation model SMD/ωB97X-D/cc-pVTZ in which the temperature dependency of surface tension and the interfacial flow density of the conformers is taken into account. The evaporation/condensation coefficient (β) and evaporation rate (k) are respectively calculated using the statistical associating fluid theory (SAFT) and a combined quantum-classical reaction rate theory named quantum transition state theory-classical kinetic gas theory (QTST-CKGT). The detailed analyses show the importance of internal entropic states over the interfacial layer induced by meso-confinement phenomena in the very vicinity of fuel droplet surfaces. Full article
(This article belongs to the Special Issue Nonequilibrium Thermodynamics of Interfaces)

Open Access Article: Combined Forecasting of Rainfall Based on Fuzzy Clustering and Cross Entropy
Entropy 2017, 19(12), 694; https://doi.org/10.3390/e19120694
Received: 31 August 2017 / Revised: 5 December 2017 / Accepted: 14 December 2017 / Published: 19 December 2017
Cited by 2 | Viewed by 1160 | PDF Full-text (3617 KB) | HTML Full-text | XML Full-text
Abstract
Rainfall is an essential index to measure drought, and it depends on various parameters including geographical environment, air temperature, and pressure. The nonlinear nature of climatic variables leads to problems such as poor accuracy and instability in traditional forecasting methods. In this paper, a combined forecasting method based on data mining technology and cross entropy is proposed to forecast rainfall with full consideration of the time-effectiveness of historical data. In view of the flaws of the fuzzy clustering method, which easily falls into locally optimal solutions and operates slowly, the ant colony algorithm is adopted to overcome these shortcomings and thereby refine the model. The method for determining weights is also improved by using the cross entropy. Besides, the forecast is conducted by analyzing the weighted average rainfall based on Thiessen polygons in the Beijing–Tianjin–Hebei region. Based on the calculated predictive errors, the results show that improved ant colony fuzzy clustering can effectively select historical data and enhance the accuracy of prediction, so that the damage caused by extreme weather events like droughts and floods can be greatly lessened or even prevented. Full article
(This article belongs to the Special Issue Entropy Applications in Environmental and Water Engineering)

Open Access Article: Chaos in a Cancer Model via Fractional Derivatives with Exponential Decay and Mittag-Leffler Law
Entropy 2017, 19(12), 681; https://doi.org/10.3390/e19120681
Received: 1 November 2017 / Revised: 3 December 2017 / Accepted: 6 December 2017 / Published: 19 December 2017
Cited by 6 | Viewed by 1230 | PDF Full-text (2157 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, a three-dimensional cancer model was considered using the Caputo-Fabrizio-Caputo fractional derivative and the new fractional derivative with Mittag-Leffler kernel in the Liouville-Caputo sense. Special solutions were obtained using an iterative scheme via the Laplace transform, the Sumudu-Picard integration method, and the Adams-Moulton rule. We studied the uniqueness and existence of the solutions. Novel chaotic attractors with total order less than three were obtained. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory III)

Open Access Article: Stochastic Thermodynamics: A Dynamical Systems Approach
Entropy 2017, 19(12), 693; https://doi.org/10.3390/e19120693
Received: 16 October 2017 / Revised: 13 December 2017 / Accepted: 13 December 2017 / Published: 17 December 2017
Viewed by 1265 | PDF Full-text (627 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop an energy-based, large-scale dynamical system model driven by Markov diffusion processes to present a unified framework for statistical thermodynamics predicated on a stochastic dynamical systems formalism. Specifically, using a stochastic state space formulation, we develop a nonlinear stochastic compartmental dynamical system model characterized by energy conservation laws that is consistent with statistical thermodynamic principles. In particular, we show that the difference between the average supplied system energy and the average stored system energy for our stochastic thermodynamic model is a martingale with respect to the system filtration. In addition, we show that the average stored system energy is equal to the mean energy that can be extracted from the system and the mean energy that can be delivered to the system in order to transfer it from a zero energy level to an arbitrary nonempty subset in the state space over a finite stopping time. Full article
(This article belongs to the Special Issue Entropy and Its Applications across Disciplines)

Open Access Article: Permutation Entropy: Too Complex a Measure for EEG Time Series?
Entropy 2017, 19(12), 692; https://doi.org/10.3390/e19120692
Received: 16 November 2017 / Revised: 11 December 2017 / Accepted: 13 December 2017 / Published: 16 December 2017
Cited by 3 | Viewed by 2375 | PDF Full-text (1364 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Permutation entropy (PeEn) is a complexity measure that originated from dynamical systems theory. Specifically engineered to be robustly applicable to real-world data, the quantity has since been utilised for a multitude of time series analysis tasks. In electroencephalogram (EEG) analysis, value changes of PeEn correlate with clinical observations, among them the onset of epileptic seizures or the loss of consciousness induced by anaesthetic agents. Regarding this field of application, the present work suggests a relation between PeEn-based complexity estimation and spectral methods of EEG analysis: for ordinal patterns of three consecutive samples, the PeEn of an epoch of EEG appears to approximate the centroid of its weighted power spectrum. To substantiate this proposition, a systematic approach based on redundancy reduction is introduced and applied to sleep and epileptic seizure EEG. The interrelation demonstrated may aid the interpretation of PeEn in EEG, and may increase its comparability with other techniques of EEG analysis. Full article
(This article belongs to the Special Issue Permutation Entropy & Its Interdisciplinary Applications)
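For the ordinal patterns of three consecutive samples discussed above, PeEn can be computed directly. A minimal Bandt-Pompe sketch follows; the embedding order and delay are the usual defaults, not values taken from the paper.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalise=True):
    """Permutation entropy (Bandt-Pompe) of a 1-D series.

    order=3 corresponds to the three-sample ordinal patterns; when
    normalised, the value lies in [0, 1].
    """
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    # Ordinal pattern of each embedded vector, encoded via argsort.
    patterns = np.array([tuple(np.argsort(x[i:i + order * delay:delay]))
                         for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    return pe / np.log2(factorial(order)) if normalise else pe
```

A monotonic ramp produces a single ordinal pattern and hence PeEn of 0, while white noise approaches the normalised maximum of 1.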

Open Access Article: Spin Interaction under the Collision of Two Kerr-(Anti-)de Sitter Black Holes
Entropy 2017, 19(12), 691; https://doi.org/10.3390/e19120691
Received: 8 November 2017 / Revised: 8 December 2017 / Accepted: 14 December 2017 / Published: 15 December 2017
Cited by 4 | Viewed by 854 | PDF Full-text (476 KB) | HTML Full-text | XML Full-text
Abstract
We investigate herein the spin interaction during collisions between Kerr-(anti-)de Sitter black holes. The spin interaction potential depends on the relative rotation directions of the black holes, and this potential can be released as gravitational radiation upon collision. The energy of the radiation depends on the cosmological constant and corresponds to the spin interaction potential in the limit that one of the black holes has negligibly small mass and angular momentum. We then determine the approximate overall behaviors of the upper bounds on the radiation using thermodynamics. The results indicate that the spin interaction can consistently contribute to the radiation. In addition, the radiation depends on the stability of the black hole produced by the collision. Full article
(This article belongs to the Section Astrophysics, Cosmology, and Black Holes)

Open Access Article: Non-Equilibrium Thermodynamic Analysis of Double Diffusive, Nanofluid Forced Convection in Catalytic Microreactors with Radiation Effects
Entropy 2017, 19(12), 690; https://doi.org/10.3390/e19120690
Received: 11 October 2017 / Revised: 27 November 2017 / Accepted: 14 December 2017 / Published: 15 December 2017
Cited by 1 | Viewed by 1092 | PDF Full-text (2420 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents a theoretical investigation of the second law performance of double diffusive forced convection in microreactors with the inclusion of nanofluid and radiation effects. The investigated microreactors consist of a single microchannel, fully filled by a porous medium. The transport of heat and mass is analysed by including the thick walls and a first order, catalytic chemical reaction on the internal surfaces of the microchannel. Two sets of thermal boundary conditions are considered on the external surfaces of the microchannel: (1) constant temperature; and (2) constant heat flux on the lower wall and a convective boundary condition on the upper wall. The local thermal non-equilibrium approach is taken to thermally analyse the porous section of the system. The mass dispersion equation is coupled with the transport of heat in the nanofluid flow through consideration of the Soret effect. The problem is analytically solved and illustrations of the temperature fields, Nusselt number, total entropy generation rate and performance evaluation criterion (PEC) are provided. It is shown that the radiation effect tends to modify the thermal behaviour within the porous section of the system. The radiation parameter also reduces the overall temperature of the system. It is further demonstrated that, expectedly, the nanoparticles reduce the temperature of the system and increase the Nusselt number. The total entropy generation rate, and consequently the PEC, shows a strong dependence on the radiation parameter and the volumetric concentration of nanoparticles. Full article
(This article belongs to the Special Issue Non-Equilibrium Thermodynamics of Micro Technologies)

Open Access Article: Entropy Analysis for a Nonlinear Fluid with a Nonlinear Heat Flux Vector
Entropy 2017, 19(12), 689; https://doi.org/10.3390/e19120689
Received: 19 October 2017 / Revised: 5 December 2017 / Accepted: 11 December 2017 / Published: 14 December 2017
Cited by 1 | Viewed by 802 | PDF Full-text (233 KB) | HTML Full-text | XML Full-text
Abstract
Flowing media in both industrial and natural processes are often characterized as assemblages of densely packed granular materials. Typically, the constitutive relations for the stress tensor and heat flux vector are fundamentally nonlinear. Moreover, these equations are coupled through the Clausius–Duhem inequality. However, the consequences of this coupling are rarely studied. Here we address this issue by obtaining constraints imposed by the Clausius–Duhem inequality on the constitutive relations for both the stress tensor and the heat flux vector in which the volume fraction gradient plays an important role. A crucial result of the analysis is the restriction on the dependency of phenomenological coefficients appearing in the constitutive equations on the model objective functions. Full article
(This article belongs to the Section Thermodynamics)
Open Access Article: Entropy and Compression Capture Different Complexity Features: The Case of Fetal Heart Rate
Entropy 2017, 19(12), 688; https://doi.org/10.3390/e19120688
Received: 21 September 2017 / Revised: 28 November 2017 / Accepted: 11 December 2017 / Published: 14 December 2017
Cited by 2 | Viewed by 1033 | PDF Full-text (2674 KB) | HTML Full-text | XML Full-text
Abstract
Entropy and compression have been used to distinguish fetuses at risk of hypoxia from their healthy counterparts through the analysis of Fetal Heart Rate (FHR). The low correlation observed between these two approaches suggests that they capture different complexity features. This study aims at characterizing the complexity of FHR features captured by entropy and compression, using international guidelines as reference. Single and multi-scale approaches were considered in the computation of entropy and compression. The following physiologic-based features were considered: FHR baseline; percentage of abnormal long (%abLTV) and short (%abSTV) term variability; average short term variability; and number of accelerations and decelerations. All of the features were computed on a set of 68 intrapartum FHR tracings, divided into normal, mildly, and moderately-severely acidemic born fetuses. The correlation between entropy/compression features and the physiologic-based features was assessed. There were correlations between compression and accelerations and decelerations, but neither accelerations nor decelerations were significantly correlated with entropies. The %abSTV was significantly correlated with entropies (ranging between −0.54 and −0.62), and to a higher extent with compression (ranging between −0.80 and −0.94). Distinction between groups was clearer in the lower scales using entropy and in the higher scales using compression. Entropy and compression are complementary complexity measures. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
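The compression side of such a comparison can be approximated by quantising a series and measuring its deflate-compressed size. The quantisation depth and the compressor below are assumptions; the abstract does not specify which compressor the paper uses.

```python
import zlib
import numpy as np

def compression_complexity(x, bins=8, level=9):
    """Complexity of a series as the ratio of compressed to raw length.

    The series is quantised to `bins` equal-width symbols (one byte per
    sample) and deflate-compressed. A hedged sketch, not the paper's pipeline.
    """
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), bins + 1)[1:-1]
    symbols = np.digitize(x, edges).astype(np.uint8).tobytes()
    return len(zlib.compress(symbols, level)) / len(symbols)
```

A strongly periodic series compresses to a small fraction of its length, while a noisy series of the same length stays close to its entropy rate, which is what makes the ratio usable as a complexity index.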

Open Access Article: Choosing between Higher Moment Maximum Entropy Models and Its Application to Homogeneous Point Processes with Random Effects
Entropy 2017, 19(12), 687; https://doi.org/10.3390/e19120687
Received: 29 August 2017 / Revised: 7 December 2017 / Accepted: 9 December 2017 / Published: 14 December 2017
Viewed by 807 | PDF Full-text (333 KB) | HTML Full-text | XML Full-text
Abstract
In the Bayesian framework, the usual choice of prior in the prediction of homogeneous Poisson processes with random effects is the gamma one. Here, we propose the use of higher order maximum entropy priors. Their advantage is illustrated in a simulation study and the choice of the best order is established by two goodness-of-fit criteria: Kullback–Leibler divergence and a discrepancy measure. This procedure is illustrated on a warranty data set from the automobile industry. Full article
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)

Open Access Article: Extremal Matching Energy of Random Polyomino Chains
Entropy 2017, 19(12), 684; https://doi.org/10.3390/e19120684
Received: 21 October 2017 / Revised: 9 December 2017 / Accepted: 11 December 2017 / Published: 14 December 2017
Viewed by 809 | PDF Full-text (273 KB) | HTML Full-text | XML Full-text
Abstract
Polyomino graphs are among the objects of research in statistical physics and in modeling problems of surface chemistry. A random polyomino chain is a subgraph of a polyomino graph. The matching energy is defined as the sum of the absolute values of the zeros of the matching polynomial of a graph. In this paper, we characterize the graphs with the extremal matching energy among all random polyomino chains of a polyomino graph by the probability method. Full article
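The matching energy itself is straightforward to compute for small graphs from the edge recursion μ(G) = μ(G − e) − μ(G − u − v). The sketch below is exponential-time (fine for short chains) and is an illustration of the definition, not the paper's probabilistic argument.

```python
import numpy as np

def matching_polynomial(n, edges):
    """Coefficients (highest degree first) of the matching polynomial
    of a graph on n vertices, via mu(G) = mu(G - e) - mu(G - u - v).
    """
    if not edges:
        p = np.zeros(n + 1)
        p[0] = 1.0  # the polynomial x**n
        return p
    (u, v), rest = edges[0], edges[1:]
    without_e = matching_polynomial(n, rest)
    # Delete both endpoints of e: keep only edges avoiding u and v.
    kept = [(a, b) for a, b in rest if a not in (u, v) and b not in (u, v)]
    contracted = matching_polynomial(n - 2, kept)
    return without_e - np.concatenate([np.zeros(2), contracted])

def matching_energy(n, edges):
    """Sum of the absolute values of the zeros of the matching polynomial."""
    return np.sum(np.abs(np.roots(matching_polynomial(n, edges))))
```

For a single edge the polynomial is x² − 1 (energy 2), and for the path on three vertices it is x³ − 2x, giving energy 2√2.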

Open Access Article: Do We Really Need to Catch Them All? A New User-Guided Social Media Crawling Method
Entropy 2017, 19(12), 686; https://doi.org/10.3390/e19120686
Received: 18 October 2017 / Revised: 28 November 2017 / Accepted: 11 December 2017 / Published: 13 December 2017
Viewed by 1628 | PDF Full-text (774 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
With the growing use of popular social media services like Facebook and Twitter, it is challenging to collect all content from the networks without access to the core infrastructure or paying for it. Thus, if all content cannot be collected, one must consider which data are of most importance. In this work we present a novel User-guided Social Media Crawling method (USMC) that is able to collect data from social media, utilizing the wisdom of the crowd to decide the order in which user-generated content should be collected so as to cover as many user interactions as possible. USMC is validated by crawling 160 public Facebook pages, containing content from 368 million users and 1.3 billion interactions, and it is compared with two other crawling methods. The results show that it is possible to cover approximately 75% of the interactions on a Facebook page by sampling just 20% of its posts, while reducing the crawling time by 53%. In addition, the social network constructed from the 20% sample contains more than 75% of the users and edges of the social network created from all posts, and it has a similar degree distribution. Full article
(This article belongs to the Special Issue Entropy and Complexity of Data)
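The core idea, crawling the posts the crowd interacted with most before anything else, can be sketched as a greedy popularity-ordered sample. The `posts` dictionary and its interaction counts below are hypothetical stand-ins for a real crawl, not the paper's dataset or ranking signal.

```python
def usmc_sample(posts, budget=0.2):
    """Greedy popularity-ordered sampling of posts.

    `posts` maps post id -> number of user interactions. Returns the
    sampled ids and the fraction of all interactions they cover.
    A simplified sketch of the USMC ordering idea.
    """
    ranked = sorted(posts, key=posts.get, reverse=True)
    k = max(1, int(len(ranked) * budget))
    sample = ranked[:k]
    coverage = sum(posts[p] for p in sample) / sum(posts.values())
    return sample, coverage
```

On a page with a heavy-tailed (Zipf-like) interaction distribution, a 20% sample already covers most interactions, qualitatively matching the roughly 75% coverage reported above.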

Open Access Article: Testing the Beta-Lognormal Model in Amazonian Rainfall Fields Using the Generalized Space q-Entropy
Entropy 2017, 19(12), 685; https://doi.org/10.3390/e19120685
Received: 9 October 2017 / Revised: 28 November 2017 / Accepted: 8 December 2017 / Published: 13 December 2017
Cited by 1 | Viewed by 1192 | PDF Full-text (2088 KB) | HTML Full-text | XML Full-text
Abstract
We study spatial scaling and complexity properties of Amazonian radar rainfall fields using the Beta-Lognormal Model (BL-Model) with the aim to characterize and model the process at a broad range of spatial scales. The Generalized Space q-Entropy Function (GSEF), an entropic measure defined as a continuous set of power laws covering a broad range of spatial scales, S_q(λ) ~ λ^Ω(q), is used as a tool to check the ability of the BL-Model to represent observed 2-D radar rainfall fields. In addition, we evaluate the effect of the amount of zeros, the variability of rainfall intensity, the number of bins used to estimate the probability mass function, and the record length on the GSEF estimation. Our results show that: (i) the BL-Model adequately represents the scaling properties of the q-entropy, S_q, for Amazonian rainfall fields across a range of spatial scales λ from 2 km to 64 km; (ii) the q-entropy in rainfall fields can be characterized by a non-additivity value, q_sat, at which rainfall reaches a maximum scaling exponent, Ω_sat; (iii) the maximum scaling exponent Ω_sat is directly related to the amount of zeros in rainfall fields and is not sensitive to either the number of bins used to estimate the probability mass function or the variability of rainfall intensity; and (iv) for small samples, the GSEF of rainfall fields may incur considerable bias. Finally, for synthetic 2-D rainfall fields from the BL-Model, we look for a connection between intermittency, using a metric based on generalized Hurst exponents, M(q1, q2), and the non-extensive order (q-order) of a system, Θ_q, which relates to the GSEF. Our results do not exhibit evidence of such a relationship. Full article
(This article belongs to the Special Issue Entropy Applications in Environmental and Water Engineering)

Open Access Article: Entropy Conditions Involved in the Nonlinear Coupled Constitutive Method for Solving Continuum and Rarefied Gas Flows
Entropy 2017, 19(12), 683; https://doi.org/10.3390/e19120683
Received: 5 September 2017 / Revised: 19 November 2017 / Accepted: 8 December 2017 / Published: 12 December 2017
Viewed by 1128 | PDF Full-text (4404 KB) | HTML Full-text | XML Full-text
Abstract
The numerical study of continuum-rarefied gas flows is of considerable interest because it can provide fundamental knowledge regarding flow physics. Recently, the nonlinear coupled constitutive method (NCCM) has been derived from the Boltzmann equation and implemented to investigate continuum-rarefied gas flows. In this study, we first report the important and detailed issues in the use of the H theorem and positive entropy generation in the NCCM. Importantly, the unified nonlinear dissipation model and its relationships to the Rayleigh–Onsager function were demonstrated in the treatment of the collision term of the Boltzmann equation. In addition, we compare the Grad moment method, the Burnett equation, and the NCCM. Next, differences between the NCCM equations and the Navier–Stokes equations are explained in detail. For validation, numerical studies of rarefied and continuum gas flows were conducted. These studies include rarefied and/or continuum gas flows around a two-dimensional (2D) cavity, a 2D airfoil, a 2D cylinder, and a three-dimensional space shuttle. It was observed that the present results of the NCCM are in good agreement with those of the Direct Simulation Monte Carlo (DSMC) method in rarefied cases and are in good agreement with those of the Navier–Stokes equations in continuum cases. Finally, this study can be regarded as a theoretical basis of the NCCM for the development of a unified framework for solving continuum-rarefied gas flows. Full article

Open Access Article: Channel Capacity of Coding System on Tsallis Entropy and q-Statistics
Entropy 2017, 19(12), 682; https://doi.org/10.3390/e19120682
Received: 12 August 2017 / Revised: 4 December 2017 / Accepted: 8 December 2017 / Published: 12 December 2017
Cited by 4 | Viewed by 1350 | PDF Full-text (193 KB) | HTML Full-text | XML Full-text
Abstract
The field of information science has greatly developed, and applications in various fields have emerged. In this paper, we evaluated the coding system in the theory of Tsallis entropy for transmission of messages and aimed to formulate the channel capacity by maximization of the Tsallis entropy within a given condition of code length. As a result, we obtained a simple relational expression between code length and code appearance probability and, additionally, a generalized formula of the channel capacity on the basis of Tsallis entropy statistics. This theoretical framework may contribute to data processing techniques and other applications. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
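The quantity maximized in the abstract above, the Tsallis entropy, has a simple closed form. A minimal sketch (an illustration only, not the authors' derivation; the q → 1 limit recovers the Shannon entropy in nats):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a
    probability distribution p; tends to the Shannon entropy
    (in nats) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Uniform distribution over four code symbols
p = [0.25] * 4
print(tsallis_entropy(p, 2.0))  # (1 - 4 * 0.25^2) / 1 = 0.75
print(tsallis_entropy(p, 1.0))  # ln 4 ~ 1.386
```

Maximizing S_q under an average-code-length constraint, the paper's setting, then proceeds by the standard Lagrange-multiplier route and typically yields q-exponential code-appearance probabilities.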
Open AccessArticle Altered Brain Complexity in Women with Primary Dysmenorrhea: A Resting-State Magneto-Encephalography Study Using Multiscale Entropy Analysis
Entropy 2017, 19(12), 680; https://doi.org/10.3390/e19120680
Received: 30 September 2017 / Revised: 20 November 2017 / Accepted: 6 December 2017 / Published: 11 December 2017
Viewed by 1155 | PDF Full-text (848 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
How chronic pain affects brain functions remains unclear. As a potential indicator, brain complexity estimated by entropy-based methods may be helpful for revealing the underlying neurophysiological mechanism of chronic pain. In this study, complexity features at multiple time scales and spectral features were extracted from resting-state magnetoencephalographic signals of 156 female participants with/without primary dysmenorrhea (PDM) during a pain-free state. As revealed by multiscale sample entropy (MSE), PDM patients (PDMs) exhibited a loss of brain complexity in regions associated with the sensory, affective, and evaluative components of pain, including the sensorimotor, limbic, and salience networks. Significant correlations between MSE values and psychological states (depression and anxiety) were found in PDMs, which may indicate specific nonlinear disturbances in limbic and default mode network circuits after long-term menstrual pain. These findings suggest that MSE is an important measure of brain complexity and is potentially applicable to future diagnosis of chronic pain. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
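The multiscale sample entropy pipeline used above follows a standard two-step recipe: coarse-grain the signal at each time scale, then compute the sample entropy of each coarse-grained series. A stdlib-only sketch under conventional defaults (template length m = 2, tolerance r = 0.2; in practice r is usually set as a fraction of the signal's standard deviation, and the authors' exact settings may differ):

```python
import math

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -ln(A/B), where B counts pairs of matching templates of
    length m and A of length m + 1, under a Chebyshev tolerance r."""
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):  # excludes self-matches
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """MSE profile: sample entropy of the coarse-grained series per scale."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

A complexity loss, as reported for the PDM group, would show up as systematically lower values across the scale axis of this profile.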
Open AccessEditorial Second-Law Analysis: A Powerful Tool for Analyzing Computational Fluid Dynamics (CFD) Results
Entropy 2017, 19(12), 679; https://doi.org/10.3390/e19120679
Received: 1 December 2017 / Revised: 1 December 2017 / Accepted: 8 December 2017 / Published: 11 December 2017
Cited by 2 | Viewed by 881 | PDF Full-text (176 KB) | HTML Full-text | XML Full-text
Abstract
Second-law analysis (SLA) is an important concept in thermodynamics, which basically assesses energy by its value in terms of its convertibility from one form to another. [...] Full article
(This article belongs to the Special Issue Entropy in Computational Fluid Dynamics)
Open AccessArticle Information Landscape and Flux, Mutual Information Rate Decomposition and Connections to Entropy Production
Entropy 2017, 19(12), 678; https://doi.org/10.3390/e19120678
Received: 29 September 2017 / Revised: 27 November 2017 / Accepted: 6 December 2017 / Published: 11 December 2017
Cited by 2 | Viewed by 896 | PDF Full-text (768 KB) | HTML Full-text | XML Full-text
Abstract
We explore the dynamics of two interacting information systems. We show that for Markovian marginal systems, the driving force for information dynamics is determined by both the information landscape and the information flux. While the information landscape can be used to construct the driving force describing the equilibrium time-reversible dynamics of the information system, the information flux can be used to describe its nonequilibrium time-irreversible behaviors. The information flux explicitly breaks detailed balance and is a direct measure of the degree of nonequilibrium or time-irreversibility. We further demonstrate that the mutual information rate between the two subsystems can be decomposed into equilibrium time-reversible and nonequilibrium time-irreversible parts. This decomposition of the Mutual Information Rate (MIR) corresponds explicitly to the information landscape-flux decomposition when the two subsystems behave as Markov chains. Finally, we uncover the intimate relationship between nonequilibrium thermodynamics, in terms of the entropy production rates, and the time-irreversible part of the mutual information rate. We find that this relationship and the MIR decomposition still hold in the more general stationary and ergodic cases. We demonstrate the above features with two examples of bivariate Markov chains. Full article
Open AccessArticle Automated Detection of Paroxysmal Atrial Fibrillation Using an Information-Based Similarity Approach
Entropy 2017, 19(12), 677; https://doi.org/10.3390/e19120677
Received: 10 October 2017 / Revised: 20 November 2017 / Accepted: 8 December 2017 / Published: 10 December 2017
Cited by 4 | Viewed by 1354 | PDF Full-text (4601 KB) | HTML Full-text | XML Full-text
Abstract
Atrial fibrillation (AF) is an abnormal heart rhythm that can increase heart-related complications. Paroxysmal AF episodes occur intermittently with varying duration. Human-based diagnosis of paroxysmal AF from a long-term electrocardiogram recording is time-consuming. Here we present a fully automated ensemble model for AF episode detection based on RR-interval time series, applying a novel information-based similarity analysis and ensemble scheme. By mapping RR-interval time series to binary symbolic sequences and comparing the rank-frequency patterns of m-bit words, the dissimilarity between AF and normal sinus rhythm (NSR) was quantified. To achieve high detection specificity and sensitivity with low variance, a weighted variation of bagging with multiple AF and NSR templates was applied. By performing dissimilarity comparisons between unknown RR-interval time series and multiple templates, paroxysmal AF episodes were detected. Based on our results, the optimal AF detection parameters are symbolic word length m = 9 and observation window n = 150, achieving 97.04% sensitivity, 97.96% specificity, and 97.78% overall accuracy. Sensitivity, specificity, and overall accuracy vary little with changes in the m and n parameters. This study provides quantitative information to enhance the categorization of AF and normal cardiac rhythms. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
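The detection pipeline described above, mapping RR intervals to binary symbols and comparing rank-frequency patterns of m-bit words, can be sketched roughly as follows. The rank-difference measure here is a simplified stand-in for the paper's weighted information-based similarity index, and m = 3 is chosen only to keep the example small (the paper's optimum is m = 9):

```python
from collections import Counter

def symbolize(rr):
    """Map an RR-interval series to binary symbols: 1 if the interval
    increases from one beat to the next, else 0."""
    return [1 if b > a else 0 for a, b in zip(rr, rr[1:])]

def word_ranks(bits, m):
    """Rank the frequencies of all observed m-bit words
    (rank 0 = most frequent)."""
    words = [tuple(bits[i:i + m]) for i in range(len(bits) - m + 1)]
    return {w: r for r, (w, _) in enumerate(Counter(words).most_common())}

def dissimilarity(rr1, rr2, m=3):
    """Mean absolute rank difference over the union of observed words;
    unseen words get the worst rank. A simplified stand-in for the
    paper's weighted information-based similarity index."""
    r1, r2 = word_ranks(symbolize(rr1), m), word_ranks(symbolize(rr2), m)
    vocab = set(r1) | set(r2)
    worst = len(vocab)
    return sum(abs(r1.get(w, worst) - r2.get(w, worst)) for w in vocab) / len(vocab)
```

An unknown recording would be compared against banks of AF and NSR templates and assigned to whichever class it is less dissimilar to; the ensemble (bagging) step weights multiple such comparisons.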
Open AccessArticle Maximum Exergetic Efficiency Operation of a Solar Powered H2O-LiBr Absorption Cooling System
Entropy 2017, 19(12), 676; https://doi.org/10.3390/e19120676
Received: 22 October 2017 / Revised: 3 December 2017 / Accepted: 6 December 2017 / Published: 9 December 2017
Cited by 1 | Viewed by 1230 | PDF Full-text (13286 KB) | HTML Full-text | XML Full-text
Abstract
A solar driven cooling system consisting of a single effect H2O-LiBr absorption cooling module (ACS), a parabolic trough collector (PTC), and a storage tank (ST) module is analyzed during one full day of operation. Pressurized water is used to transfer heat from the PTC to the ST and to feed the ACS desorber. The system is constrained to operate at the maximum ACS exergetic efficiency, under a time-dependent cooling load computed on 15 July for a one-storey house located near Bucharest, Romania. To set up the solar assembly, two commercial PTCs were selected, namely the PT1-IST and the PTC 1800 Solitem, and a single-unit ST was initially considered. The mathematical model, relying on the energy balance equations, was coded in the Engineering Equation Solver (EES) environment. The solar data were obtained from the Meteonorm database. The numerical simulations proved that the system cannot cover the imposed cooling load all day long, due to the large variation of the water temperature inside the ST. By splitting the ST into two units, the results revealed that the PT1-IST collector drives the ACS only between 9 am and 4:30 pm, while the PTC 1800 covers the entire cooling period (9 am–6 pm), for optimum ST capacities of 90 kg/90 kg and 90 kg/140 kg, respectively. Full article
(This article belongs to the Section Thermodynamics)
Open AccessArticle A General Symbolic Approach to Kolmogorov-Sinai Entropy
Entropy 2017, 19(12), 675; https://doi.org/10.3390/e19120675
Received: 31 October 2017 / Revised: 4 December 2017 / Accepted: 5 December 2017 / Published: 9 December 2017
Cited by 2 | Viewed by 1758 | PDF Full-text (390 KB) | HTML Full-text | XML Full-text
Abstract
It is popular to study a time-dependent nonlinear system by encoding outcomes of measurements into sequences of symbols following certain symbolization schemes. Mostly, symbolizations by threshold crossings, or variants of them, are applied, but the relatively new symbolic approach going back to the innovative work of Bandt and Pompe, ordinal symbolic dynamics, also plays an increasing role. In this paper, we discuss both approaches jointly with respect to the theoretical determination of the Kolmogorov-Sinai entropy (KS entropy). For this purpose, we propose and investigate a unifying approach to formalizing symbolizations. By doing so, we can emphasize the main advantage of the ordinal approach when no symbolization scheme can be found that characterizes KS entropy directly: the ordinal approach, as well as its generalizations, provides, under very natural conditions, a direct route to KS entropy by default. Full article
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)
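The ordinal symbolization going back to Bandt and Pompe replaces each length-d window of the signal by the permutation that sorts it. A minimal sketch of the symbolization, together with the single-block permutation entropy estimate whose growth rate, under conditions of the kind the paper studies, recovers KS entropy:

```python
import math
from collections import Counter

def ordinal_pattern(window):
    """Ordinal (Bandt-Pompe) symbol: the index permutation that sorts
    the window in ascending order."""
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def permutation_entropy(x, order=3):
    """Shannon entropy (nats) of the empirical ordinal-pattern
    distribution over sliding windows of length `order`."""
    patterns = [ordinal_pattern(x[i:i + order]) for i in range(len(x) - order + 1)]
    n = len(patterns)
    return -sum(c / n * math.log(c / n) for c in Counter(patterns).values())

print(ordinal_pattern([4, 7, 9]))  # (0, 1, 2): strictly increasing window
print(ordinal_pattern([9, 4, 7]))  # (1, 2, 0): smallest value at index 1
```

A monotone signal produces a single pattern and hence zero permutation entropy; richer dynamics spread probability over more of the order! possible patterns.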
Open AccessArticle On the Statistical Mechanics of Alien Species Distribution
Entropy 2017, 19(12), 674; https://doi.org/10.3390/e19120674
Received: 20 November 2017 / Revised: 6 December 2017 / Accepted: 8 December 2017 / Published: 9 December 2017
Viewed by 1111 | PDF Full-text (854 KB) | HTML Full-text | XML Full-text
Abstract
Many species of plants are found in regions to which they are alien. Their global distributions are characterised by a family of exponential functions of the kind that arise in elementary statistical mechanics (an example in ecology is MacArthur’s broken stick). We show here that all these functions are quantitatively reproduced by a model containing a single parameter—some global resource partitioned at random on the two axes of species number and site number. A dynamical model generating this equilibrium is a two-fold stochastic process and suggests a curious and interesting biological interpretation in terms of niche structures fluctuating with time and productivity, with sites and species highly idiosyncratic. Idiosyncrasy implies that attempts to identify a priori those species likely to become naturalised are unlikely to be successful. Although this paper is primarily concerned with a particular problem in population biology, the two-fold stochastic process may be of more general interest. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences II)
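The random partitioning invoked above is MacArthur's broken stick: a resource of fixed total is cut at uniformly random points. A minimal sketch (function and parameter names are illustrative, not the authors'):

```python
import random

def broken_stick(total, n_pieces, seed=None):
    """Partition `total` into `n_pieces` segments via n_pieces - 1
    uniform random cuts; piece lengths model randomly allocated
    shares of a resource."""
    rng = random.Random(seed)
    cuts = sorted(rng.uniform(0, total) for _ in range(n_pieces - 1))
    edges = [0.0] + cuts + [total]
    return [b - a for a, b in zip(edges, edges[1:])]

shares = broken_stick(1.0, 5, seed=42)
print(sorted(shares, reverse=True))  # ranked shares, summing to 1
```

The paper's two-fold stochastic process applies such a partition on the two axes of species number and site number; the classic expectation for the k-th ranked of n pieces is (1/n) * sum over i from k to n of 1/i.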
Open AccessArticle Characterisation of the Effects of Sleep Deprivation on the Electroencephalogram Using Permutation Lempel–Ziv Complexity, a Non-Linear Analysis Tool
Entropy 2017, 19(12), 673; https://doi.org/10.3390/e19120673
Received: 30 October 2017 / Revised: 2 December 2017 / Accepted: 5 December 2017 / Published: 8 December 2017
Cited by 1 | Viewed by 1259 | PDF Full-text (6479 KB) | HTML Full-text | XML Full-text
Abstract
Specific patterns of brain activity during sleep and waking are recorded in the electroencephalogram (EEG). Time-frequency analysis methods have been widely used to analyse the EEG and have identified characteristic oscillations for each vigilance state (VS), i.e., wakefulness, rapid-eye-movement (REM) and non-rapid-eye-movement (NREM) sleep. However, other aspects, such as changes of patterns associated with brain dynamics, may not be captured unless a non-linear analysis method is used. In this pilot study, permutation Lempel–Ziv complexity (PLZC), a novel symbolic dynamics analysis method, was used to characterise the changes in the EEG in sleep and wakefulness during baseline and recovery from sleep deprivation (SD). The results obtained with PLZC were contrasted with a related non-linear method, Lempel–Ziv complexity (LZC). Both measure the emergence of new patterns. However, LZC is dependent on the absolute amplitude of the EEG, while PLZC is only dependent on the relative amplitude due to the symbolisation procedure and is thus more resistant to noise. We showed that PLZC discriminates activated brain states associated with wakefulness and REM sleep, which both displayed higher complexity, compared to NREM sleep. Additionally, significantly lower PLZC values were measured in NREM sleep during the recovery period following SD compared to baseline, suggesting a reduced emergence of new activity patterns in the EEG. These findings were validated using PLZC on surrogate data. By contrast, LZC merely reflected changes in the spectral composition of the EEG. Overall, this study implies that PLZC is a robust non-linear complexity measure that is not dependent on amplitude variations in the signal and that may be useful for further assessing EEG alterations induced by environmental or pharmacological manipulations. Full article
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)
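The LZC/PLZC contrast drawn above lies entirely in the symbolization step: classic LZC thresholds the EEG amplitude (commonly at the median), whereas PLZC feeds ordinal patterns into the same Lempel–Ziv phrase counting. A sketch of the amplitude-dependent variant, using the Kaspar–Schuster formulation of LZ76 complexity (median thresholding is a common convention and may differ from the paper's exact choice):

```python
def binarize(x):
    """Amplitude symbolization: '1' where the sample exceeds the median.
    This step is what makes classic LZC amplitude-dependent."""
    med = sorted(x)[len(x) // 2]
    return ''.join('1' if v > med else '0' for v in x)

def lz_complexity(s):
    """LZ76 complexity (Kaspar-Schuster scheme): the number of distinct
    phrases found by sequential exhaustive parsing of the string."""
    n = len(s)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:          # reached the end inside a phrase
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:             # no earlier match: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

print(lz_complexity('010101'))  # 3 phrases: 0 | 1 | 0101
```

Swapping `binarize` for an ordinal-pattern symbolizer (one symbol per permutation) turns this count into the permutation variant, which no longer depends on absolute signal amplitude.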
Open AccessArticle Association between Multiscale Entropy Characteristics of Heart Rate Variability and Ischemic Stroke Risk in Patients with Permanent Atrial Fibrillation
Entropy 2017, 19(12), 672; https://doi.org/10.3390/e19120672
Received: 6 November 2017 / Revised: 1 December 2017 / Accepted: 6 December 2017 / Published: 7 December 2017
Viewed by 1450 | PDF Full-text (3539 KB) | HTML Full-text | XML Full-text
Abstract
Multiscale entropy (MSE) profiles of heart rate variability (HRV) in patients with atrial fibrillation (AFib) provide clinically useful information for ischemic stroke risk assessment, suggesting that the complex properties characterized by MSE profiles are associated with ischemic stroke risk. However, the meaning of HRV complexity in patients with AFib has not been clearly interpreted, and the physical and mathematical understanding of the relation between HRV dynamics and ischemic stroke risk is not well established. To gain a deeper insight into HRV dynamics in patients with AFib, and to improve ischemic stroke risk assessment using HRV analysis, we study the HRV characteristics related to MSE profiles, such as the long-range correlation and the probability density function. In this study, we analyze the HRV time series of 173 patients with permanent AFib. Our results show that, although HRV time series in patients with AFib exhibit long-range correlation (1/f fluctuations), as observed in healthy subjects, over a range longer than 90 s, these autocorrelation properties have no significant predictive power for ischemic stroke occurrence. Further, the probability density function structure of the coarse-grained time series at scales greater than 2 s is dominantly associated with ischemic stroke risk. This observation could provide valuable information for improving ischemic stroke risk assessment using HRV analysis. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
Open AccessArticle Inspecting Non-Perturbative Contributions to the Entanglement Entropy via Wavefunctions
Entropy 2017, 19(12), 671; https://doi.org/10.3390/e19120671
Received: 14 November 2017 / Revised: 4 December 2017 / Accepted: 5 December 2017 / Published: 7 December 2017
Cited by 2 | Viewed by 833 | PDF Full-text (407 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we systematically explore the implications of non-perturbative effects on entanglement in a many-body system. Instead of pursuing the usual path-integral method in a singular space, we study the wavefunctions in detail. We begin with a toy model of multiple particles whose interaction potential admits multiple minima. We study the entanglement of the true ground state after taking the tunneling effects into account and find some simple patterns. Notably, in the case of multiple particle interactions, the entanglement entropy generically decreases with an increasing number of minima; the knowledge of the subsystem actually increases with the number of minima. The reduced density matrix can also be seen to have close connections with graph spectra. In a more careful study of the two-well tunneling system, we also extract the exponentially suppressed tail contribution, the analogue of instantons. To understand the effects of multiple minima in a field theory, we are inspired to inspect wavefunctions in a toy model of a bosonic field describing quasi-particles of two different condensates related by Bogoliubov transformations. We find that the area law is naturally preserved. This is probably a useful set of perspectives that promises wider applications. Full article
Open AccessArticle Oscillations in Multiparticle Production Processes
Entropy 2017, 19(12), 670; https://doi.org/10.3390/e19120670
Received: 22 November 2017 / Revised: 30 November 2017 / Accepted: 4 December 2017 / Published: 6 December 2017
Cited by 2 | Viewed by 1120 | PDF Full-text (415 KB) | HTML Full-text | XML Full-text
Abstract
We discuss two examples of oscillations apparently hidden in some experimental results for high-energy multiparticle production processes: (i) the log-periodic oscillatory pattern decorating the power-like Tsallis distributions of transverse momenta; (ii) the oscillations of the modified combinants obtained from the measured multiplicity distributions. Our calculations are confronted with pp data from the Large Hadron Collider (LHC). We show that in both cases these phenomena can provide new insight into the dynamics of these processes. Full article
(This article belongs to the Special Issue New Trends in Statistical Physics of Complex Systems)
Open AccessArticle Formation of Photo-Responsive Liquid Crystalline Emulsion by Using Microfluidics Device
Entropy 2017, 19(12), 669; https://doi.org/10.3390/e19120669
Received: 8 November 2017 / Revised: 30 November 2017 / Accepted: 4 December 2017 / Published: 6 December 2017
Cited by 1 | Viewed by 1592 | PDF Full-text (1734 KB) | HTML Full-text | XML Full-text
Abstract
Photo-responsive double emulsions made of liquid crystal (LC) were prepared with a microfluidic device, and the light-induced processes were studied. A phase transition was induced from the center of the topological defect for an emulsion made of N-(4-methoxybenzylidene)-4-butylaniline (MBBA), and a strange texture change was observed for an emulsion made of 4-cyano-4′-pentylbiphenyl (5CB) doped with azobenzene. The results suggest that there are defect-involved processes in the phase change of LC double emulsions. Full article
(This article belongs to the Special Issue Nonequilibrium Thermodynamics of Interfaces)
