Entropy
http://www.mdpi.com/journal/entropy
Latest open access articles published in Entropy at http://www.mdpi.com/journal/entropy

Entropy, Vol. 19, Pages 242: A Framework for Designing the Architectures of Deep Convolutional Neural Networks
http://www.mdpi.com/1099-4300/19/6/242
Recent advances in Convolutional Neural Networks (CNNs) have obtained promising results in difficult deep learning tasks. However, the success of a CNN depends on finding an architecture that fits a given problem. Hand-crafting an architecture is a challenging, time-consuming process that requires expert knowledge and effort, due to the large number of architectural design choices. In this article, we present an efficient framework that automatically designs a high-performing CNN architecture for a given problem. In this framework, we introduce a new optimization objective function that combines the error rate and the information learnt by a set of feature maps using deconvolutional networks (deconvnet). The new objective function allows the hyperparameters of the CNN architecture to be optimized in a way that enhances the performance by guiding the CNN through better visualization of learnt features via deconvnet. The actual optimization of the objective function is carried out via the Nelder–Mead Method (NMM). Further, our new objective function results in much faster convergence towards a better architecture. The proposed framework has the ability to explore a CNN architecture’s numerous design choices in an efficient way and also allows effective, distributed execution and synchronization via web services. Empirically, we demonstrate that the CNN architecture designed with our approach outperforms several existing approaches in terms of its error rate. Our results are also competitive with state-of-the-art results on the MNIST dataset and perform reasonably against the state-of-the-art results on the CIFAR-10 and CIFAR-100 datasets.
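The Nelder–Mead step at the core of this framework can be illustrated with a minimal sketch. This is not the authors' implementation: `surrogate_objective` below is a hypothetical stand-in for their combined error-rate/deconvnet objective, and the two hyperparameters are treated as continuous variables purely for illustration.

```python
# Minimal Nelder-Mead sketch (hedged): the paper optimizes CNN hyperparameters
# against an objective combining error rate and deconvnet feature quality; here
# a toy surrogate objective with a known minimum stands in for that objective.

def surrogate_objective(x):
    # Hypothetical stand-in: x = (depth-like, stride-like) continuous
    # relaxations; minimum at (5.0, 1.0).
    d, s = x
    return (d - 5.0) ** 2 + 4.0 * (s - 1.0) ** 2

def nelder_mead(f, x0, step=1.0, tol=1e-8, max_iter=500):
    n = len(x0)
    # Build the initial simplex: x0 plus one perturbed point per dimension.
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all vertices except the worst.
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        reflect = [centroid[i] + (centroid[i] - worst[i]) for i in range(n)]
        if f(reflect) < f(best):
            # Reflection is the new best point: try expanding further.
            expand = [centroid[i] + 2.0 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = expand if f(expand) < f(reflect) else reflect
        elif f(reflect) < f(simplex[-2]):
            simplex[-1] = reflect
        else:
            # Contract toward the centroid (simplified inside contraction).
            contract = [centroid[i] + 0.5 * (worst[i] - centroid[i]) for i in range(n)]
            if f(contract) < f(worst):
                simplex[-1] = contract
            else:
                # Shrink the whole simplex toward the best vertex.
                simplex = [best] + [[(p[i] + best[i]) / 2.0 for i in range(n)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

x_opt = nelder_mead(surrogate_objective, [0.0, 0.0])
```

In the actual framework, each objective evaluation would train and score a candidate CNN; the simplex bookkeeping itself is unchanged.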
Our approach plays a significant role in increasing the depth, reducing the size of strides, and constraining some convolutional layers not to be followed by pooling layers, in order to find a CNN architecture that produces high recognition performance.

Entropy 2017, 19(6), 242; Article; ISSN 1099-4300; doi: 10.3390/e19060242; published 2017-05-24. Authors: Saleh Albelwi, Ausif Mahmood.

Entropy, Vol. 19, Pages 241: Axiomatic Characterization of the Quantum Relative Entropy and Free Energy
http://www.mdpi.com/1099-4300/19/6/241
Building upon work by Matsumoto, we show that the quantum relative entropy with full-rank second argument is determined by four simple axioms: (i) continuity in the first argument; (ii) the validity of the data-processing inequality; (iii) additivity under tensor products; and (iv) super-additivity. This observation has immediate implications for quantum thermodynamics, which we discuss. Specifically, we demonstrate that, under reasonable restrictions, the free energy is singled out as a measure of athermality. In particular, we consider an extended class of Gibbs-preserving maps as free operations in a resource-theoretic framework, in which a catalyst is allowed to build up correlations with the system at hand. The free energy is the only extensive and continuous function that is monotonic under such free operations.

Entropy 2017, 19(6), 241; Article; doi: 10.3390/e19060241; published 2017-05-23. Authors: Henrik Wilming, Rodrigo Gallego, Jens Eisert.

Entropy, Vol. 19, Pages 240: Maxwell’s Demon—A Historical Review
http://www.mdpi.com/1099-4300/19/6/240
For more than 140 years Maxwell’s demon has intrigued, enlightened, mystified, frustrated, and challenged physicists in unique and interesting ways. Maxwell’s original conception was brilliant and insightful, but over the years numerous different versions of Maxwell’s demon have been presented. Most versions have been answered with reasonable physical arguments, with each of these answers (apparently) keeping the second law of thermodynamics intact. Though the laws of physics did not change in this process of questioning and answering, we have learned a lot along the way about statistical mechanics and thermodynamics. This paper will review a selected history and discuss some of the interesting historical characters who have participated.

Entropy 2017, 19(6), 240; Review; doi: 10.3390/e19060240; published 2017-05-23. Author: Andrew Rex.

Entropy, Vol. 19, Pages 233: On Linear Coding over Finite Rings and Applications to Computing
http://www.mdpi.com/1099-4300/19/5/233
This paper presents a coding theorem for linear coding over finite rings, in the setting of the Slepian–Wolf source coding problem. This theorem covers the corresponding achievability theorems of Elias (IRE Conv. Rec. 1955, 3, 37–46) and Csiszár (IEEE Trans. Inf. Theory 1982, 28, 585–592) for linear coding over finite fields as special cases. In addition, it is shown that, for any set of finite correlated discrete memoryless sources, there always exists a sequence of linear encoders over some finite non-field rings which achieves the data compression limit, the Slepian–Wolf region. Hence, the optimality problem regarding linear coding over finite non-field rings for data compression is settled in the affirmative with respect to existence. As an application, we address the problem of source coding for computing, where the decoder is interested in recovering a discrete function of the data generated and independently encoded by several correlated i.i.d. random sources. We propose linear coding over finite rings as an alternative solution to this problem. Results of Körner–Marton (IEEE Trans. Inf. Theory 1979, 25, 219–221) and Ahlswede–Han (IEEE Trans. Inf. Theory 1983, 29, 396–411, Theorem 10) are generalized to cases of encoding (pseudo) nomographic functions (over rings). Since a discrete function with a finite domain always admits a nomographic presentation, we conclude that both generalizations apply universally to encoding all discrete functions of finite domains. Based on these results, we demonstrate that linear coding over finite rings strictly outperforms its field counterpart in terms of achieving better coding rates and reducing the required alphabet sizes of the encoders for encoding infinitely many discrete functions.

Entropy 2017, 19(5), 233; Article; doi: 10.3390/e19050233; published 2017-05-20. Authors: Sheng Huang, Mikael Skoglund.

Entropy, Vol. 19, Pages 238: Lyapunov Spectra of Coulombic and Gravitational Periodic Systems
http://www.mdpi.com/1099-4300/19/5/238
An open question in nonlinear dynamics is the relation between the Kolmogorov entropy and the largest Lyapunov exponent of a given orbit. Both have been shown to have diagnostic capability for phase transitions in thermodynamic systems. For systems with long-range interactions, the choice of boundary plays a critical role and appropriate boundary conditions must be invoked. In this work, we compute Lyapunov spectra for Coulombic and gravitational versions of the one-dimensional systems of parallel sheets with periodic boundary conditions. Exact expressions for the time evolution of the tangent-space vectors are derived and utilized to compute the Lyapunov characteristic exponents using an event-driven algorithm. The results indicate that the energy dependence of the largest Lyapunov exponent emulates that of the Kolmogorov entropy for each system at a given system size. Our approach forms an effective and approximation-free instrument for studying the dynamical properties exhibited by the Coulombic and gravitational systems, and finds applications in investigating indications of thermodynamic transitions in small as well as large versions of the spatially periodic systems. When a phase transition exists, we find that the largest Lyapunov exponent serves as a precursor of the transition that becomes more pronounced as the system size increases.

Entropy 2017, 19(5), 238; Article; doi: 10.3390/e19050238; published 2017-05-20. Authors: Pankaj Kumar, Bruce Miller.

Entropy, Vol. 19, Pages 237: Can a Robot Have Free Will?
http://www.mdpi.com/1099-4300/19/5/237
Using insights from cybernetics and an information-based understanding of biological systems, a precise, scientifically inspired definition of free-will is offered and the essential requirements for an agent to possess it in principle are set out. These are: (a) there must be a self to self-determine; (b) there must be a non-zero probability of more than one option being enacted; (c) there must be an internal means of choosing among options (which is not merely random, since randomness is not a choice). For (a) to be fulfilled, the agent of self-determination must be organisationally closed (a “Kantian whole”). For (c) to be fulfilled: (d) options must be generated from an internal model of the self which can calculate future states contingent on possible responses; (e) choosing among these options requires their evaluation using an internally generated goal defined on an objective function representing the overall “master function” of the agent; and (f) for “deep free-will”, at least two nested levels of choice and goal (d–e) must be enacted by the agent. The agent must also be able to enact its choice in physical reality. The only systems known to meet all these criteria are living organisms, not just humans, but a wide range of organisms. The main impediment to free-will in present-day artificial robots is their lack of being a Kantian whole. Consciousness does not seem to be a requirement, and the minimum complexity for a free-will system may be quite low, including relatively simple life-forms that are at least able to learn.

Entropy 2017, 19(5), 237; Article; doi: 10.3390/e19050237; published 2017-05-20. Author: Keith Farnsworth.

Entropy, Vol. 19, Pages 236: Entropy in Investigation of Vasovagal Syndrome in Passive Head Up Tilt Test
http://www.mdpi.com/1099-4300/19/5/236
This paper presents an application of Approximate Entropy (ApEn) and Sample Entropy (SampEn) in the analysis of heart rhythm, blood pressure and stroke volume for the diagnosis of vasovagal syndrome. The analyzed biosignals were recorded during positive passive tilt tests, HUTT(+). Signal changes and their entropy were compared in three main phases of the test: supine position, tilt, and pre-syncope, with special focus on the latter, which was analyzed in a sliding window of each signal. In some cases, ApEn and SampEn were equally useful for the assessment of signal complexity (p < 0.05 in corresponding calculations). The complexity of the signals was found to decrease in the pre-syncope phase (SampEn (RRI): 1.20–0.34, SampEn (sBP): 1.29–0.57, SampEn (dBP): 1.19–0.48, SampEn (SV): 1.62–0.91). The pattern of the SampEn (SV) decrease differs from the patterns of the SampEn (sBP), SampEn (dBP) and SampEn (RRI) decreases. For all signals, the lowest entropy values in the pre-syncope phase were observed at the moment when loss of consciousness occurred.

Entropy 2017, 19(5), 236; Article; doi: 10.3390/e19050236; published 2017-05-20. Authors: Katarzyna Buszko, Agnieszka Piątkowska, Edward Koźluk, Grzegorz Opolski.

Entropy, Vol. 19, Pages 234: The Particle as a Statistical Ensemble of Events in Stueckelberg–Horwitz–Piron Electrodynamics
http://www.mdpi.com/1099-4300/19/5/234
In classical Maxwell electrodynamics, charged particles following deterministic trajectories are described by currents that induce fields, mediating interactions with other particles. Statistical methods are used when needed to treat complex particle and/or field configurations. In Stueckelberg–Horwitz–Piron (SHP) electrodynamics, the classical trajectories are traced out dynamically, through the evolution of a 4D spacetime event x^μ(τ) as τ grows monotonically. Stueckelberg proposed to formalize the distinction between coordinate time x^0 = ct (measured by laboratory clocks) and chronology τ (the temporal ordering of event occurrence) in order to describe antiparticles and resolve problems of irreversibility such as grandfather paradoxes. Consequently, in SHP theory, the elementary object is not a particle (a 4D curve in spacetime) but rather an event (a single point along the dynamically evolving curve). Following standard deterministic methods in classical relativistic field theory, one is led to Maxwell-like field equations that are τ-dependent and sourced by a current that represents a statistical ensemble of instantaneous events distributed along the trajectory. The width λ of this distribution defines a correlation time for the interactions and a mass spectrum for the photons emitted by particles. As λ becomes very large, the photon mass goes to zero and the field equations become τ-independent Maxwell’s equations. Maxwell theory thus emerges as an equilibrium limit of SHP, in which λ is larger than any other relevant time scale. Thus, statistical mechanics is a fundamental ingredient in SHP electrodynamics, and its insights are required to give meaning to the concept of a particle.

Entropy 2017, 19(5), 234; Article; doi: 10.3390/e19050234; published 2017-05-19. Author: Martin Land.

Entropy, Vol. 19, Pages 232: A Kullback–Leibler View of Maximum Entropy and Maximum Log-Probability Methods
http://www.mdpi.com/1099-4300/19/5/232
Entropy methods enable a convenient, general approach to selecting a probability distribution given only partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions, and generalizes other methods that have been proposed independently. There remains, however, some confusion in the literature about the breadth of entropy methods. In particular, the asymmetry of the Kullback–Leibler divergence provides two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.

Entropy 2017, 19(5), 232; Article; doi: 10.3390/e19050232; published 2017-05-19. Authors: Ali Abbas, Andrea H. Cadenbach, Ehsan Salimi.

Entropy, Vol. 19, Pages 231: A Novel Faults Diagnosis Method for Rolling Element Bearings Based on EWT and Ambiguity Correlation Classifiers
http://www.mdpi.com/1099-4300/19/5/231
Given the non-stationary characteristics of the acoustic emission signals of rolling element bearings, a novel fault diagnosis method based on the empirical wavelet transform (EWT) and ambiguity correlation classification (ACC) is proposed. In the proposed method, the acoustic emission signal acquired from a one-channel sensor is first decomposed using the EWT method; then the mutual information between the decomposed components and the original signal is computed and used to extract the noiseless components, yielding the reconstructed signal. Afterwards, the ambiguity correlation classifier, which combines the advantages of ambiguity functions in processing non-stationary signals with those of correlation coefficients, is applied. Finally, multiple datasets of reconstructed signals for different operative conditions are fed to the ambiguity correlation classifier for training and testing. The proposed method was verified by experiments, and the experimental results show that it can effectively diagnose three different operative conditions of rolling element bearings with higher detection rates than support vector machine and back-propagation (BP) neural network algorithms.

Entropy 2017, 19(5), 231; Article; doi: 10.3390/e19050231; published 2017-05-18. Authors: Xingmeng Jiang, Li Wu, Mingtao Ge.

Entropy, Vol. 19, Pages 230: Specific and Complete Local Integration of Patterns in Bayesian Networks
http://www.mdpi.com/1099-4300/19/5/230
We present a first formal analysis of specific and complete local integration. Complete local integration was previously proposed as a criterion for detecting entities or wholes in distributed dynamical systems. Such entities in turn were conceived to form the basis of a theory of emergence of agents within dynamical systems. Here, we give a more thorough account of the underlying formal measures. The main contribution is the disintegration theorem, which reveals a special role of completely locally integrated patterns (what we call ι-entities) within the trajectories they occur in. Apart from proving this theorem, we introduce the disintegration hierarchy and its refinement-free version as a way to structure the patterns in a trajectory. Furthermore, we construct the least upper bound and provide a candidate for the greatest lower bound of specific local integration. Finally, we calculate the ι-entities in small example systems as a first sanity check and find that ι-entities largely fulfil simple expectations.

Entropy 2017, 19(5), 230; Article; doi: 10.3390/e19050230; published 2017-05-18. Authors: Martin Biehl, Takashi Ikegami, Daniel Polani.

Entropy, Vol. 19, Pages 227: Ion Hopping and Constrained Li Diffusion Pathways in the Superionic State of Antifluorite Li2O
http://www.mdpi.com/1099-4300/19/5/227
Li2O belongs to the family of antifluorites that show superionic behavior at high temperatures. While some of the superionic characteristics of Li2O are well-known, the mechanistic details of the ionic conduction processes are somewhat nebulous. In this work, we first establish an onset of superionic conduction that is emblematic of a gradual disordering process among the Li ions at a characteristic temperature Tα (~1000 K), using reported neutron diffraction data and atomistic simulations. In the superionic state, the Li ions are observed to portray dynamic disorder by hopping between the tetrahedral lattice sites. We then show that string-like ionic diffusion pathways are established among the Li ions in the superionic state. The diffusivity of these dynamical string-like structures, which have a finite lifetime, shows a remarkable correlation to the bulk diffusivity of the system.

Entropy 2017, 19(5), 227; Article; doi: 10.3390/e19050227; published 2017-05-18. Authors: Ajay Annamareddy, Jacob Eapen.

Entropy, Vol. 19, Pages 228: Face Verification with Multi-Task and Multi-Scale Feature Fusion
http://www.mdpi.com/1099-4300/19/5/228
Face verification for unrestricted faces in the wild is a challenging task. This paper proposes a method based on two deep convolutional neural networks (CNNs) for face verification. In this work, we explore using identification signals to supervise one CNN and a combination of semi-verification and identification signals to train the other. In order to estimate the semi-verification loss at a low computational cost, a circle composed of all faces is used for selecting face pairs from pairwise samples. In the process of face normalization, we propose using different landmarks of faces to solve the problems caused by poses. In addition, the final face representation is formed by concatenating the features of each deep CNN after principal component analysis (PCA) reduction. Furthermore, each feature is a combination of multi-scale representations obtained by making use of auxiliary classifiers. For the final verification, we adopt only the face representation of one region and one resolution of a face, combined with a Joint Bayesian classifier. Experiments show that our method can extract effective face representations with a small training dataset, and our algorithm achieves 99.71% verification accuracy on the Labeled Faces in the Wild (LFW) dataset.

Entropy 2017, 19(5), 228; Article; doi: 10.3390/e19050228; published 2017-05-17. Authors: Xiaojun Lu, Yue Yang, Weilin Zhang, Qi Wang, Yang Wang.

Entropy, Vol. 19, Pages 229: Investigation of the Intra- and Inter-Limb Muscle Coordination of Hands-and-Knees Crawling in Human Adults by Means of Muscle Synergy Analysis
http://www.mdpi.com/1099-4300/19/5/229
To investigate the intra- and inter-limb muscle coordination mechanism of human hands-and-knees crawling by means of muscle synergy analysis, surface electromyographic (sEMG) signals of 20 human adults were collected bilaterally from 32 limb-related muscles during crawling on hands and knees at different speeds. The nonnegative matrix factorization (NMF) algorithm was applied to each limb to extract muscle synergies. The results showed that intra-limb coordination was relatively stable during human hands-and-knees crawling. Two synergies, one relating to the stance phase and the other relating to the swing phase, could be extracted from each limb during a crawling cycle. Synergy structures at different speeds remained consistent, but the recruitment levels, durations, and phases of muscle synergies were adjusted to adapt to the change in crawling speed. Furthermore, the ipsilateral phase lag (IPL) value, which was used to depict inter-limb coordination, changed with crawling speed for most subjects, and subjects using the no-limb-pairing mode at low speed tended to adopt the trot-like mode or pace-like mode at high speed. The research results can be well explained by the two-level central pattern generator (CPG) model consisting of a half-center rhythm generator (RG) and a pattern formation (PF) circuit. This study sheds light on the underlying control mechanism of human crawling.

Entropy 2017, 19(5), 229; Article; doi: 10.3390/e19050229; published 2017-05-17. Authors: Xiang Chen, Xiaocong Niu, De Wu, Yi Yu, Xu Zhang.

Entropy, Vol. 19, Pages 226: Information Entropy and Measures of Market Risk
http://www.mdpi.com/1099-4300/19/5/226
In this paper we investigate the relationship between the information entropy of the distribution of intraday returns and intraday and daily measures of market risk. Using data on the EUR/JPY exchange rate, we find a negative relationship between entropy and intraday Value-at-Risk, and also between entropy and intraday Expected Shortfall. This relationship is then used to forecast daily Value-at-Risk, using the entropy of the distribution of intraday returns as a predictor.

Entropy 2017, 19(5), 226; Article; doi: 10.3390/e19050226; published 2017-05-16. Authors: Daniel Pele, Emese Lazar, Alfonso Dufour.

Entropy, Vol. 19, Pages 225: Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep
http://www.mdpi.com/1099-4300/19/5/225
Sleep is a central activity in human adults and occupies most of a newborn infant’s life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. Mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. Signal processing approaches have focused on cardiorespiratory analysis to elucidate this co-regulation. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 of them at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to ventricular depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power characterizes quiet sleep more strongly. Entropy analysis provides higher indices for SampEn and Quadratic Sample Entropy (QSE) in quiet sleep. Transfer Entropy values were higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature.
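The Sample Entropy (SampEn) measure compared across sleep states here can be sketched in a few lines. This is a minimal illustrative implementation, not the study's code; the defaults m = 2 and r = 0.2 × standard deviation are common choices in the HRV literature and are assumptions in this sketch (which also counts length-m templates over all start positions, a slight simplification of the canonical definition).

```python
# Minimal Sample Entropy (SampEn) sketch of the kind used to compare
# RR-interval regularity across sleep states. Lower SampEn = more regular.
import math

def sampen(series, m=2, r=None):
    # Default tolerance: 0.2 * standard deviation of the series (assumption).
    if r is None:
        mean = sum(series) / len(series)
        sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
        r = 0.2 * sd
    n = len(series)

    def count_matches(length):
        # Count template pairs (i < j) whose Chebyshev distance is <= r.
        templates = [series[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    if a == 0 or b == 0:
        return float("inf")   # SampEn is undefined when no matches occur
    return -math.log(a / b)
```

Applied to an RR-interval series, a lower SampEn indicates a more regular, less complex rhythm; this is the sense in which the quiet-sleep and active-sleep indices above are compared.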
Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities.

Entropy 2017, 19(5), 225; Article; doi: 10.3390/e19050225; published 2017-05-15. Authors: Maristella Lucchini, Nicolò Pini, William Fifer, Nina Burtchen, Maria Signorini.

Entropy, Vol. 19, Pages 224: Classification of Fractal Signals Using Two-Parameter Non-Extensive Wavelet Entropy
http://www.mdpi.com/1099-4300/19/5/224
This article proposes a methodology for the classification of fractal signals as stationary or nonstationary. The methodology is based on the theoretical behavior of the two-parameter wavelet entropy of fractal signals. The wavelet (q, q′)-entropy is a wavelet-based extension of the (q, q′)-entropy of Borges and is based on the entropy planes for various q and q′; it is theoretically shown that it constitutes an efficient and effective technique for fractal signal classification. Moreover, the second parameter q′ provides further analysis flexibility and robustness in the sense that different (q, q′) pairs can analyze the same phenomena and increase the range of dispersion of entropies. A comparison study against the standard signal summation conversion technique shows that the proposed methodology is not only comparable in accuracy but also more computationally efficient. The application of the proposed methodology to physiological and financial time series is also presented, along with the classification of these as stationary or nonstationary.

Entropy 2017, 19(5), 224; Article; doi: 10.3390/e19050224; published 2017-05-15. Authors: Julio Ramírez-Pacheco, Joel Trejo-Sánchez, Joaquin Cortez-González, Ramón Palacio.

Entropy, Vol. 19, Pages 223: Prediction and Evaluation of Zero Order Entropy Changes in Grammar-Based Codes
http://www.mdpi.com/1099-4300/19/5/223
The change of zero order entropy is studied over different strategies of grammar production rule selection. Two major rule types are distinguished: transformations that leave the message size intact and substitution functions that change the message size. Relations for zero order entropy changes are derived for both cases, and the conditions under which the entropy decreases are described. In this article, several different greedy strategies reducing zero order entropy as well as message size are summarized, and a new strategy, MinEnt, is proposed. The resulting evolution of the zero order entropy is compared with the strategy of selecting the most frequent digram, as used in the Re-Pair algorithm.

Entropy 2017, 19(5), 223; Article; doi: 10.3390/e19050223; published 2017-05-13. Authors: Michal Vasinek, Jan Platos.

Entropy, Vol. 19, Pages 218: Minimum Entropy Active Fault Tolerant Control of the Non-Gaussian Stochastic Distribution System Subjected to Mean Constraint
http://www.mdpi.com/1099-4300/19/5/218
Stochastic distribution control (SDC) systems are a class of systems in which the output considered is the measured probability density function (PDF) of the system output, whilst the system is subjected to a normal crisp input. The purpose of active fault tolerant control of such systems is to use fault estimation and other measured information to make the output PDF still track the given distribution when the objective PDF is known. However, if the target PDF is unavailable, PDF tracking is impossible, and minimum entropy control of the system output can be considered as an alternative strategy. Since the mean represents the center location of a stochastic variable, it is reasonable to design the minimum entropy fault tolerant controller subject to a mean constraint. In this paper, using the rational square-root B-spline model for the shape control of the system output PDF, a nonlinear adaptive observer based fault diagnosis algorithm is proposed to diagnose the fault. Through controller reconfiguration, the system entropy subject to the mean constraint can still be minimized when a fault occurs. An illustrative example demonstrates the use of the minimum entropy fault tolerant control algorithms.

Entropy 2017, 19(5), 218; Article; doi: 10.3390/e19050218; published 2017-05-11. Authors: Haokun Jin, Yacun Guan, Lina Yao.

Entropy, Vol. 19, Pages 221: Muscle Fatigue Analysis of the Deltoid during Three Head-Related Static Isometric Contraction Tasks
http://www.mdpi.com/1099-4300/19/5/221
This study aimed to investigate the fatiguing characteristics of muscle-tendon units (MTUs) within skeletal muscles during static isometric contraction tasks. The deltoid was selected as the target muscle, and three head-related static isometric contraction tasks were designed to activate the three heads of the deltoid in different modes. Nine male subjects participated in this study. Surface electromyography (SEMG) signals were collected synchronously from the three heads of the deltoid. The performance of five SEMG parameters in quantifying fatigue, including root mean square (RMS), mean power frequency (MPF), the first coefficient of the autoregressive model (ARC1), sample entropy (SE) and Higuchi’s fractal dimension (HFD), was first evaluated in terms of the sensitivity-to-variability ratio (SVR) and consistency. The HFD parameter was then selected as the fatigue index for further muscle fatigue analysis. The experimental results demonstrated that the three deltoid heads presented different activation modes during the three head-related fatiguing contractions. The fatiguing characteristics of the three heads were found to be task-dependent, and the heads kept at a relatively high activation level were more prone to fatigue. In addition, the differences in fatiguing rate between heads increased with load. The findings of this study can be helpful in better understanding the underlying neuromuscular control strategies of the central nervous system (CNS). Based on the results of this study, the CNS was thought to control the contraction of the deltoid by taking the three heads as functional units, but a certain synergy among heads might also exist to accomplish a contraction task.

Entropy 2017, 19(5), 221; Article; doi: 10.3390/e19050221; published 2017-05-11. Authors: Wenxiang Cui, Xiang Chen, Shuai Cao, Xu Zhang.

Entropy, Vol. 19, Pages 220: A Functorial Construction of Quantum Subtheories
http://www.mdpi.com/1099-4300/19/5/220
We apply the geometric quantization procedure via symplectic groupoids to the setting of epistemically-restricted toy theories formalized by Spekkens (Spekkens, 2016). In the continuous degrees of freedom, this produces the algebraic structure of quadrature quantum subtheories. In the odd-prime finite degrees of freedom, we obtain a functor from the Frobenius algebra of the toy theories to the Frobenius algebra of stabilizer quantum mechanics.

Entropy 2017, 19(5), 220; Article; doi: 10.3390/e19050220; published 2017-05-11. Authors: Ivan Contreras, Ali Duman.

Entropy, Vol. 19, Pages 219: Calculating Iso-Committor Surfaces as Optimal Reaction Coordinates with Milestoning
http://www.mdpi.com/1099-4300/19/5/219
Reaction coordinates are vital tools for qualitative and quantitative analysis of molecular processes. They provide a simple picture of reaction progress and essential input for calculations of free energies and rates. Iso-committor surfaces are considered the optimal reaction coordinate, and we present an algorithm to compute a sequence of such surfaces efficiently. The algorithm analyzes Milestoning results to determine the committor function; it requires only the transition probabilities between the milestones, and not the transition times. We discuss the following numerical examples: (i) a transition in the Mueller potential; (ii) a conformational change of a solvated peptide; and (iii) cholesterol aggregation in membranes.

Entropy 2017, 19(5), 219; Article; doi: 10.3390/e19050219; published 2017-05-11. Authors: Ron Elber, Juan Bello-Rivas, Piao Ma, Alfredo Cardenas, Arman Fathizadeh.

Entropy, Vol. 19, Pages 217: On the Convergence and Law of Large Numbers for the Non-Euclidean Lp-Means
http://www.mdpi.com/1099-4300/19/5/217
This paper describes and proves two important theorems that compose the Law of Large Numbers for the non-Euclidean Lp-means, known to be true for the Euclidean L2-means. Let the Lp-mean estimator be the specific functional that estimates the Lp-mean of N independent and identically distributed random variables; then, (i) the expectation value of the Lp-mean estimator equals the mean of the distributions of the random variables; and (ii) the limit N → ∞ of the Lp-mean estimator also equals the mean of the distributions.Entropy2017-05-11195Article10.3390/e190502172171099-43002017-05-11doi: 10.3390/e19050217George Livadiotis<![CDATA[Entropy, Vol. 19, Pages 215: Cauchy Principal Value Contour Integral with Applications]]>
http://www.mdpi.com/1099-4300/19/5/215
The Cauchy principal value is a standard method by which an improper, and possibly divergent, integral is assigned a value in a balanced way around singularities or at infinity. On the other hand, entropy prediction of system behavior from a thermodynamic perspective commonly involves contour integrals. With the aim of facilitating the calculus of such integrals in this entropic scenario, we revisit the generalization of the Cauchy principal value to complex contour integrals, formalize its definition and, by using residue theory techniques, provide a useful way to evaluate them.Entropy2017-05-10195Article10.3390/e190502152151099-43002017-05-10doi: 10.3390/e19050215Matilde LeguaLuis Sánchez-Ruiz<![CDATA[Entropy, Vol. 19, Pages 211: Meromorphic Non-Integrability of Several 3D Dynamical Systems]]>
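The "balanced" limit in the real-line case can be made concrete without residue techniques: for PV ∫_{-a}^{a} f(x)/x dx, symmetrising the integrand to (f(x) − f(−x))/x removes the singularity. A numerical sketch (an illustration of the definition, not the paper's complex-contour method):

```python
import math

def pv_symmetric(f, a, n=200000):
    """PV of integral_{-a}^{a} f(x)/x dx via the symmetrised integrand
    (f(x) - f(-x))/x, which is regular at x = 0 (midpoint rule on (0, a))."""
    h = a / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += (f(x) - f(-x)) / x * h
    return total

# PV of e^x / x over [-1, 1] equals 2 * Shi(1) ~ 2.114502
val = pv_symmetric(math.exp, 1.0)
# For an even numerator the principal value vanishes identically
zero = pv_symmetric(lambda x: 1.0, 1.0)
```

The divergent pieces on either side of x = 0 cancel exactly in the symmetric limit, which is precisely what the principal-value prescription formalises.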
http://www.mdpi.com/1099-4300/19/5/211
In this paper, we apply the differential Galoisian approach to investigate the meromorphic non-integrability of a class of 3D equations in mathematical physics, including the Nosé–Hoover equations, the Lü system, the Rikitake-like system and the Rucklidge equations, which are well known in the fields of molecular dynamics, chaos theory and fluid mechanics. Our main results show that all of these systems are, in fact, non-integrable for nearly all parameter values.Entropy2017-05-10195Article10.3390/e190502112111099-43002017-05-10doi: 10.3390/e19050211Kaiyin HuangShaoyun ShiWenlei Li<![CDATA[Entropy, Vol. 19, Pages 216: Designing Labeled Graph Classifiers by Exploiting the Rényi Entropy of the Dissimilarity Representation]]>
http://www.mdpi.com/1099-4300/19/5/216
Representing patterns as labeled graphs is becoming increasingly common in the broad field of computational intelligence. Accordingly, a wide repertoire of pattern recognition tools, such as classifiers and knowledge discovery procedures, are nowadays available and tested for various datasets of labeled graphs. However, the design of effective learning procedures operating in the space of labeled graphs is still a challenging problem, especially from the computational complexity viewpoint. In this paper, we present a major improvement of a general-purpose classifier for graphs, which is conceived on an interplay between dissimilarity representation, clustering, information-theoretic techniques, and evolutionary optimization algorithms. The improvement focuses on a specific key subroutine devised to compress the input data. We prove different theorems which are fundamental to the setting of the parameters controlling such a compression operation. We demonstrate the effectiveness of the resulting classifier by benchmarking the developed variants on well-known datasets of labeled graphs, considering as distinct performance indicators the classification accuracy, computing time, and parsimony in terms of structural complexity of the synthesized classification models. The results show state-of-the-art test set accuracy and a considerable speed-up in terms of computing time.Entropy2017-05-09195Article10.3390/e190502162161099-43002017-05-09doi: 10.3390/e19050216Lorenzo Livi<![CDATA[Entropy, Vol. 19, Pages 214: Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems]]>
http://www.mdpi.com/1099-4300/19/5/214
Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system’s observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decomposing the system’s thermodynamic entropy density into a localized entropy, that is solely contained in the dynamics at a single location, and a bound entropy, that is stored in space as domains, clusters, excitations, or other emergent structures. As a concrete demonstration, we compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, on the Bethe lattice with coordination number k = 3 , and on the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that different spin motifs play (in cluster bulk, cluster edges, and the like) and how these affect the dependencies between spins.Entropy2017-05-08195Article10.3390/e190502142141099-43002017-05-08doi: 10.3390/e19050214Vikram VijayaraghavanRyan JamesJames Crutchfield<![CDATA[Entropy, Vol. 19, Pages 213: Cockroach Swarm Optimization Algorithm for Travel Planning]]>
http://www.mdpi.com/1099-4300/19/5/213
In transport planning, one should allow passengers to travel through a complicated transportation scheme with efficient use of different modes of transport. In this paper, we propose the use of a cockroach swarm optimization algorithm for determining paths with the shortest travel time. In our approach, this algorithm has been modified to work with the time-expanded model. Therefore, we present how the algorithm has to be adapted to this model, including correctly creating solutions and defining steps and movement in the search space. By introducing the proposed modifications, we are able to solve the journey planning problem. The results have shown that the performance of our approach, in terms of converging to the best solutions, is satisfactory. Moreover, we have compared our results with Dijkstra’s algorithm and a particle swarm optimization algorithm.Entropy2017-05-06195Article10.3390/e190502132131099-43002017-05-06doi: 10.3390/e19050213Joanna KwiecieńMarek Pasieka<![CDATA[Entropy, Vol. 19, Pages 210: Information Content Based Optimal Radar Waveform Design: LPI’s Purpose]]>
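The Dijkstra baseline mentioned above operates naturally on the time-expanded model: states are (stop, time) pairs and timetable connections are the edges. A minimal earliest-arrival sketch on a hypothetical timetable (the stop names and times are made up for illustration):

```python
import heapq

# Hypothetical timetable entries: (from_stop, departure, to_stop, arrival)
timetable = [
    ("A", 0, "B", 10), ("A", 5, "C", 12),
    ("B", 12, "D", 20), ("C", 14, "D", 18),
]

def earliest_arrival(timetable, source, target, start_time):
    """Dijkstra on the implicit time-expanded graph: a connection is
    usable only if its departure is no earlier than the current time."""
    best = {source: start_time}
    heap = [(start_time, source)]
    while heap:
        t, stop = heapq.heappop(heap)
        if stop == target:
            return t
        if t > best.get(stop, float("inf")):
            continue  # stale heap entry
        for frm, dep, to, arr in timetable:
            if frm == stop and dep >= t and arr < best.get(to, float("inf")):
                best[to] = arr
                heapq.heappush(heap, (arr, to))
    return None

arrival = earliest_arrival(timetable, "A", "D", 0)
```

Here the route via C (arriving at 18) beats the route via B (arriving at 20), which is the kind of optimum the swarm heuristic must also locate in the same search space.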
http://www.mdpi.com/1099-4300/19/5/210
This paper presents a low probability of interception (LPI) radar waveform design method with a fixed average power constraint based on information theory. The Kullback–Leibler divergence (KLD) between the intercept signal and background noise is presented as a practical metric to evaluate the performance of the adversary's intercept receiver. Through combining it with the radar performance metric, that is, the mutual information (MI), a multi-objective optimization model of LPI waveform design is developed. It is a trade-off between the performance of the radar and that of the enemy intercept receiver. After being transformed into a single-objective optimization problem, it can be solved by using an interior point method and a sequential quadratic programming (SQP) method. Simulation results verify the correctness and effectiveness of the proposed LPI radar waveform design method.Entropy2017-05-06195Article10.3390/e190502102101099-43002017-05-06doi: 10.3390/e19050210Jun ChenFei WangJianjiang Zhou<![CDATA[Entropy, Vol. 19, Pages 212: Boltzmann Entropy of a Newtonian Universe]]>
http://www.mdpi.com/1099-4300/19/5/212
A dynamical estimate is given for the Boltzmann entropy of the Universe, under the simplifying assumptions provided by Newtonian cosmology. We first model the cosmological fluid as the probability fluid of a quantum-mechanical system. Next, following current ideas about the emergence of spacetime, we regard gravitational equipotentials as isoentropic surfaces. Therefore, gravitational entropy is proportional to the vacuum expectation value of the gravitational potential in a certain quantum state describing the matter contents of the Universe. The entropy of the matter sector can also be computed. While providing values of the entropy that turn out to be somewhat higher than existing estimates, our results are in perfect compliance with the upper bound set by the holographic principle.Entropy2017-05-06195Article10.3390/e190502122121099-43002017-05-06doi: 10.3390/e19050212D. CabreraPedro Fernández de CórdobaJ.M. Isidro<![CDATA[Entropy, Vol. 19, Pages 209: Ensemble Averages, Soliton Dynamics and Influence of Haptotaxis in a Model of Tumor-Induced Angiogenesis]]>
http://www.mdpi.com/1099-4300/19/5/209
In this work, we present a numerical study of the influence of matrix degrading enzyme (MDE) dynamics and haptotaxis on the development of vessel networks in tumor-induced angiogenesis. Avascular tumors produce growth factors that induce nearby blood vessels to emit sprouts formed by endothelial cells. These capillary sprouts advance toward the tumor by chemotaxis (gradients of growth factor) and haptotaxis (adhesion to the tissue matrix outside blood vessels). The motion of the capillaries in this constrained space is modelled by stochastic processes (Langevin equations, branching and merging of sprouts) coupled to continuum equations for concentrations of involved substances. There is a complementary deterministic description in terms of the density of actively moving tips of vessel sprouts. The latter forms a stable soliton-like wave whose motion is influenced by the different taxis mechanisms. We show the delaying effect of haptotaxis on the advance of the angiogenic vessel network by direct numerical simulations of the stochastic process and by a study of the soliton motion.Entropy2017-05-04195Article10.3390/e190502092091099-43002017-05-04doi: 10.3390/e19050209Luis BonillaManuel CarreteroFilippo Terragni<![CDATA[Entropy, Vol. 19, Pages 204: Measures of Qualitative Variation in the Case of Maximum Entropy]]>
http://www.mdpi.com/1099-4300/19/5/204
Asymptotic behavior of qualitative variation statistics, including entropy measures, can be modeled well by normal distributions. In this study, we test the normality of various qualitative variation measures in general. We find that almost all indices tend to normality as the sample size increases, and they are highly correlated. However, for all of these qualitative variation statistics, maximum uncertainty is a serious factor that prevents normality. Among these, we study the properties of two qualitative variation statistics, VarNC and StDev, in the case of maximum uncertainty, since these two statistics show lower sampling variability and utilize all sample information. We derive probability distribution functions of these statistics and prove that they are consistent. We also discuss the relationship between VarNC and the normalized form of Tsallis (α = 2) entropy in the case of maximum uncertainty.Entropy2017-05-04195Article10.3390/e190502042041099-43002017-05-04doi: 10.3390/e19050204Atif EvrenErhan Ustaoğlu<![CDATA[Entropy, Vol. 19, Pages 208: Objective Bayesian Entropy Inference for Two-Parameter Logistic Distribution Using Upper Record Values]]>
http://www.mdpi.com/1099-4300/19/5/208
In this paper, we provide an entropy inference method that is based on an objective Bayesian approach for upper record values having a two-parameter logistic distribution. We derive the entropy that is based on the i-th upper record value and the joint entropy that is based on the upper record values. Moreover, we examine their properties. For objective Bayesian analysis, we obtain objective priors, namely, the Jeffreys and reference priors, for the unknown parameters of the logistic distribution. The priors are based on upper record values. Then, we develop an entropy inference method that is based on these objective priors. In real data analysis, we assess the quality of the proposed models under the objective priors and compare them with the model under the informative prior.Entropy2017-05-03195Article10.3390/e190502082081099-43002017-05-03doi: 10.3390/e19050208Jung SeoYongku Kim<![CDATA[Entropy, Vol. 19, Pages 206: Divergence and Sufficiency for Convex Optimization]]>
http://www.mdpi.com/1099-4300/19/5/206
Logarithmic score and information divergence appear in information theory, statistics, statistical mechanics, and portfolio theory. We demonstrate that all these topics involve some kind of optimization that leads directly to regret functions, and such regret functions are often given by Bregman divergences. If a regret function also fulfills a sufficiency condition, it must be proportional to information divergence. We will demonstrate that sufficiency is equivalent to the apparently weaker notion of locality, and that it is also equivalent to the apparently stronger notion of monotonicity. These sufficiency conditions have quite different relevance in the different areas of application, and often they are not fulfilled. Therefore, sufficiency conditions can be used to explain when results from one area can be transferred directly to another and when one will experience differences.Entropy2017-05-03195Article10.3390/e190502062061099-43002017-05-03doi: 10.3390/e19050206Peter Harremoës<![CDATA[Entropy, Vol. 19, Pages 205: About the Concept of Quantum Chaos]]>
http://www.mdpi.com/1099-4300/19/5/205
The research on quantum chaos finds its roots in the study of the spectrum of complex nuclei in the 1950s and the pioneering experiments in microwave billiards in the 1970s. Since then, a large number of new results was produced. Nevertheless, the work on the subject is, even at present, a superposition of several approaches expressed in different mathematical formalisms and weakly linked to each other. The purpose of this paper is to supply a unified framework for describing quantum chaos using the quantum ergodic hierarchy. Using the factorization property of this framework, we characterize the dynamical aspects of quantum chaos by obtaining the Ehrenfest time. We also outline a generalization of the quantum mixing level of the kicked rotator in the context of the impulsive differential equations.Entropy2017-05-03195Concept Paper10.3390/e190502052051099-43002017-05-03doi: 10.3390/e19050205Ignacio GomezMarcelo LosadaOlimpia Lombardi<![CDATA[Entropy, Vol. 19, Pages 207: Coarse Graining Shannon and von Neumann Entropies]]>
http://www.mdpi.com/1099-4300/19/5/207
The nature of coarse graining is intuitively “obvious”, but it is rather difficult to find explicit and calculable models of the coarse graining process (and the resulting entropy flow) discussed in the literature. What we would like to have at hand is some explicit and calculable process that takes an arbitrary system, with specified initial entropy S, and that monotonically and controllably drives the entropy to its maximum value. This does not have to be a physical process, in fact for some purposes it is better to deal with a gedanken-process, since then it is more obvious how the “hidden information” is hiding in the fine-grain correlations that one is simply agreeing not to look at. We shall present several simple mathematically well-defined and easy to work with conceptual models for coarse graining. We shall consider both the classical Shannon and quantum von Neumann entropies, including models based on quantum decoherence, and analyse the entropy flow in some detail. When coarse graining the quantum von Neumann entropy, we find it extremely useful to introduce an adaptation of Hawking’s super-scattering matrix. These explicit models that we shall construct allow us to quantify and keep clear track of the entropy that appears when coarse graining the system and the information that can be hidden in unobserved correlations (while not the focus of the current article, in the long run, these considerations are of interest when addressing the black hole information puzzle).Entropy2017-05-03195Article10.3390/e190502072071099-43002017-05-03doi: 10.3390/e19050207Ana Alonso-SerranoMatt Visser<![CDATA[Entropy, Vol. 19, Pages 203: Fractional Diffusion in a Solid with Mass Absorption]]>
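One explicit, calculable coarse-graining process of the kind described above is a doubly stochastic map acting on a classical probability distribution: mixing components can only increase the Shannon entropy, monotonically driving it toward its maximum. A minimal sketch (one simple model among the several the article constructs, not a reproduction of them):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def partial_mix(p, i, j, lam=0.5):
    """One doubly stochastic 'coarse graining' step mixing components
    i and j; by concavity of -x log x, entropy cannot decrease."""
    q = list(p)
    q[i] = (1 - lam) * p[i] + lam * p[j]
    q[j] = lam * p[i] + (1 - lam) * p[j]
    return q

p = [0.7, 0.2, 0.1]
p1 = partial_mix(p, 0, 1)  # mix the two largest components
h0, h1 = shannon_entropy(p), shannon_entropy(p1)
```

Iterating such mixing steps over all pairs drives any initial distribution toward the uniform one (maximum entropy), while the "hidden information" lives in the correlations with the fine-grained history that the map discards.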
http://www.mdpi.com/1099-4300/19/5/203
The space-time-fractional diffusion equation with the Caputo time-fractional derivative and Riesz fractional Laplacian is considered in the case of axial symmetry. Mass absorption (mass release) is described by a source term proportional to concentration. The integral transform technique is used. Different particular cases of the solution are studied. The numerical results are illustrated graphically.Entropy2017-05-02195Article10.3390/e190502032031099-43002017-05-02doi: 10.3390/e19050203Yuriy PovstenkoTamara KyrylychGrażyna Rygał<![CDATA[Entropy, Vol. 19, Pages 202: Kinetics of Interactions of Matter, Antimatter and Radiation Consistent with Antisymmetric (CPT-Invariant) Thermodynamics]]>
http://www.mdpi.com/1099-4300/19/5/202
This work investigates the influence of directional properties of decoherence on kinetics rate equations. The physical reality is understood as a chain of unitary and decoherence events. The former are quantum-deterministic, while the latter introduce uncertainty and increase entropy. For interactions of matter and antimatter, two approaches are considered: symmetric decoherence, which corresponds to conventional symmetric (CP-invariant) thermodynamics, and antisymmetric decoherence, which corresponds to antisymmetric (CPT-invariant) thermodynamics. Radiation, in its interactions with matter and antimatter, is shown to be decoherence-neutral. The symmetric and antisymmetric assumptions result in different interactions of radiation with matter and antimatter. The theoretical predictions for these differences are testable by comparing absorption (emission) of light by thermodynamic systems made of matter and antimatter. Canonical typicality for quantum mixtures is briefly discussed in Appendix A.Entropy2017-05-02195Article10.3390/e190502022021099-43002017-05-02doi: 10.3390/e19050202A.Y. Klimenko<![CDATA[Entropy, Vol. 19, Pages 198: Discovery of Kolmogorov Scaling in the Natural Language]]>
http://www.mdpi.com/1099-4300/19/5/198
We consider the rate R and variance σ² of Shannon information in snippets of text based on word frequencies in the natural language. We empirically identify Kolmogorov’s scaling law σ² ∝ k^(−1.66 ± 0.12) (95% c.l.) as a function of k = 1/N, measured by word count N. This result highlights a potential association of information flow in snippets, analogous to the energy cascade in turbulent eddies in fluids at high Reynolds numbers. We propose R and σ² as robust utility functions for objective ranking of concordances in efficient search for maximal information seamlessly across different languages, and as a starting point for artificial attention.Entropy2017-05-02195Letter10.3390/e190501981981099-43002017-05-02doi: 10.3390/e19050198Maurice van Putten<![CDATA[Entropy, Vol. 19, Pages 114: The Solution of Modified Fractional Bergman’s Minimal Blood Glucose-Insulin Model]]>
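A sketch of how R and σ² can be estimated from a corpus; the letter does not publish its exact estimator, so the choices here (empirical word frequencies, base-2 logs, non-overlapping snippets) are illustrative assumptions.

```python
import math
from collections import Counter

def snippet_information(words, n):
    """Mean rate R and variance sigma^2 of per-word Shannon information
    (-log2 empirical frequency) over non-overlapping n-word snippets."""
    freq = Counter(words)
    total = len(words)
    info = {w: -math.log2(c / total) for w, c in freq.items()}
    snippets = [words[i:i + n] for i in range(0, total - n + 1, n)]
    per_snippet = [sum(info[w] for w in s) / n for s in snippets]
    r = sum(per_snippet) / len(per_snippet)
    var = sum((x - r) ** 2 for x in per_snippet) / len(per_snippet)
    return r, var

# Toy corpus: two equiprobable words carry exactly 1 bit each,
# so every snippet has rate 1 and the variance across snippets is 0.
r, var = snippet_information(["a", "b"] * 50, 10)
```

The scaling law above concerns how `var` grows as the snippet length N shrinks (k = 1/N increases) in real natural-language text.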
http://www.mdpi.com/1099-4300/19/5/114
In the present paper, we use analytical techniques to solve fractional nonlinear differential equations systems that arise in Bergman’s minimal model, used to describe blood glucose and insulin metabolism, after intravenous tolerance testing. We also discuss the stability and uniqueness of the solution.Entropy2017-05-02195Article10.3390/e190501141141099-43002017-05-02doi: 10.3390/e19050114Badr AlkahtaniObaid AlgahtaniRavi DubeyPranay Goswami<![CDATA[Entropy, Vol. 19, Pages 200: Effects of Movable-Baffle on Heat Transfer and Entropy Generation in a Cavity Saturated by CNT Suspensions: Three-Dimensional Modeling]]>
http://www.mdpi.com/1099-4300/19/5/200
Convective heat transfer and entropy generation in a 3D closed cavity, equipped with an adiabatic driven baffle and filled with CNT (carbon nanotube)-water nanofluid, are numerically investigated for a range of Rayleigh numbers from 10³ to 10⁵. This research is conducted for three configurations, fixed baffle (V = 0), baffle rotating clockwise (V+) and baffle rotating counterclockwise (V−), and a range of CNT concentrations from 0 to 15%. Governing equations are formulated using the potential vector vorticity formulation in its three-dimensional form, then solved by the finite volume method. The effects of the motion direction of the inserted driven baffle and the CNT concentration on heat transfer and entropy generation are studied. It was observed that for low Rayleigh numbers, the motion of the driven baffle enhances heat transfer regardless of its direction and the CNT concentration effect is negligible. However, with an increasing Rayleigh number, adding a driven baffle increases the heat transfer only when it moves in the direction of the decreasing temperature gradient; elsewhere, convective heat transfer cannot be enhanced due to flow blockage at the corners of the baffle.Entropy2017-04-29195Article10.3390/e190502002001099-43002017-04-29doi: 10.3390/e19050200Abdullah Al-RashedWalid AichLioua KolsiOmid MahianAhmed HusseinMohamed Borjini<![CDATA[Entropy, Vol. 19, Pages 201: Utility, Revealed Preferences Theory, and Strategic Ambiguity in Iterated Games]]>
http://www.mdpi.com/1099-4300/19/5/201
Iterated games, in which the same economic interaction is repeatedly played between the same agents, are an important framework for understanding the effectiveness of strategic choices over time. To date, very little work has applied information theory to the information sets used by agents in order to decide what action to take next in such strategic situations. This article looks at the mutual information between previous game states and an agent’s next action by introducing two new classes of games: “invertible games” and “cyclical games”. By explicitly expanding out the mutual information between past states and the next action we show under what circumstances the explicit values of the utility are irrelevant for iterated games and this is then related to revealed preferences theory of classical economics. These information measures are then applied to the Traveler’s Dilemma game and the Prisoner’s Dilemma game, the Prisoner’s Dilemma being invertible, to illustrate their use. In the Prisoner’s Dilemma, a novel connection is made between the computational principles of logic gates and both the structure of games and the agents’ decision strategies. This approach is applied to the cyclical game Matching Pennies to analyse the foundations of a behavioural ambiguity between two well studied strategies: “Tit-for-Tat” and “Win-Stay, Lose-Switch”.Entropy2017-04-29195Article10.3390/e190502012011099-43002017-04-29doi: 10.3390/e19050201Michael Harré<![CDATA[Entropy, Vol. 19, Pages 199: Cooperative Particle Filtering for Tracking ERP Subcomponents from Multichannel EEG]]>
http://www.mdpi.com/1099-4300/19/5/199
In this study, we propose a novel method to investigate P300 variability over different trials. The method incorporates spatial correlation between EEG channels to form a cooperative coupled particle filtering method that tracks the P300 subcomponents, P3a and P3b, over trials. Using state space systems, the amplitude, latency, and width of each subcomponent are modeled as the main underlying parameters. With four electrodes, two coupled Rao-Blackwellised particle filter pairs are used to recursively estimate the system state over trials. A number of physiological constraints are also imposed to avoid generating invalid particles in the estimation process. Motivated by the bilateral symmetry of ERPs over the brain, the channels further share their estimates with their neighbors and combine the received information to obtain a more accurate and robust solution. The proposed algorithm is capable of estimating the P300 subcomponents in single trials and outperforms its non-cooperative counterpart.Entropy2017-04-29195Article10.3390/e190501991991099-43002017-04-29doi: 10.3390/e19050199Sadaf MonajemiDelaram JarchiSim-Heng OngSaeid Sanei<![CDATA[Entropy, Vol. 19, Pages 196: Symbolic Analysis of Brain Dynamics Detects Negative Stress]]>
http://www.mdpi.com/1099-4300/19/5/196
The electroencephalogram (EEG) is the most common tool used to study mental disorders. In the last years, the use of this recording for recognition of negative stress has been receiving growing attention. However, precise identification of this emotional state is still an interesting unsolved challenge. Nowadays, stress presents a high prevalence in developed countries and, moreover, its chronic condition often leads to concomitant physical and mental health problems. Recently, a measure of time series irregularity, such as quadratic sample entropy (QSEn), has been suggested as a promising single index for discerning between emotions of calm and stress. Unfortunately, this index only considers repetitiveness of similar patterns and, hence, it is unable to successfully quantify dynamics associated with the temporal structure of the data. With the aim of extending QSEn's ability for identification of stress from the EEG signal, permutation entropy (PEn) and its amplitude-aware modification (AAPEn) have been analyzed in the present work. These metrics assess repetitiveness of ordinal patterns, thus considering causal information within each one of them and obtaining improved estimates of predictability. Results have shown that PEn and AAPEn present a discriminant power between emotional states of calm and stress similar to QSEn, i.e., around 65%. Additionally, they have also revealed complementary dynamics to those quantified by QSEn, thus suggesting a synchronized behavior between frontal and parietal counterparts from both hemispheres of the brain. More precisely, increased stress levels have resulted in activation of the left frontal and right parietal regions and, simultaneously, in relaxing of the right frontal and left parietal areas. Taking advantage of this brain behavior, a discriminant model only based on AAPEn and QSEn computed from the EEG channels P3 and P4 has reached a diagnostic accuracy greater than 80%, which slightly improves on the current state of the art. Moreover, because this classification system is notably easier than others previously proposed, it could be used for continuous monitoring of negative stress, as well as for its regulation towards more positive moods in controlled environments.Entropy2017-04-28195Article10.3390/e190501961961099-43002017-04-28doi: 10.3390/e19050196Beatriz García-MartínezArturo Martínez-RodrigoRoberto ZangrónizJosé PastorRaúl Alcaraz<![CDATA[Entropy, Vol. 19, Pages 197: A New Kind of Permutation Entropy Used to Classify Sleep Stages from Invisible EEG Microstructure]]>
http://www.mdpi.com/1099-4300/19/5/197
Permutation entropy and order patterns in an EEG signal have been applied by several authors to study sleep, anesthesia, and epileptic absences. Here, we discuss a new version of permutation entropy, which is interpreted as distance to white noise. It has a scale similar to the well-known χ² distributions and can be supported by a statistical model. Critical values for significance are provided. Distance to white noise is used as a parameter which measures depth of sleep, where the vigilant awake state of the human EEG is interpreted as “almost white noise”. Classification of sleep stages from EEG data usually relies on delta waves and graphic elements, which can be seen on a macroscale of several seconds. The distance to white noise can anticipate such emerging waves before they become apparent, evaluating invisible tendencies of variations within 40 milliseconds. Data segments of 30 s of high-resolution EEG provide a reliable classification. Application to the diagnosis of sleep disorders is indicated.Entropy2017-04-28195Article10.3390/e190501971971099-43002017-04-28doi: 10.3390/e19050197Christoph Bandt<![CDATA[Entropy, Vol. 19, Pages 194: Criticality and Information Dynamics in Epidemiological Models]]>
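The "distance to white noise" above is a modification of classical permutation entropy, which counts ordinal patterns in the signal. A sketch of the classical statistic it builds on (the modified version in the paper is different):

```python
import math
import random

def permutation_entropy(x, order=3):
    """Normalised Shannon entropy (in [0, 1]) of the distribution of
    ordinal patterns of length `order` in the series x."""
    counts = {}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # rank pattern: permutation that sorts the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(order))

pe_trend = permutation_entropy(list(range(100)))  # one pattern only
random.seed(1)
pe_noise = permutation_entropy([random.random() for _ in range(1000)])
```

White noise uses all order-3 patterns equally often (normalised entropy near 1), while a monotone signal produces a single pattern (entropy 0); sleep EEG lies in between, which motivates measuring its distance to the white-noise extreme.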
http://www.mdpi.com/1099-4300/19/5/194
Understanding epidemic dynamics has always been a challenge. As witnessed from the ongoing Zika or the seasonal Influenza epidemics, we still need to improve our analytical methods to better understand and control epidemics. While the emergence of the complexity sciences at the turn of the millennium has led to their application in modelling epidemics, there is still a need for improving our understanding of critical dynamics in epidemics. In this study, using agent-based modelling, we simulate a Susceptible-Infected-Susceptible (SIS) epidemic on a homogeneous network. We use transfer entropy and active information storage from the information dynamics framework to characterise the critical transition in epidemiological models. Our study shows that both (bias-corrected) transfer entropy and active information storage maximise after the critical threshold (R0 = 1). This is the first step toward an information dynamics approach to epidemics. Understanding the dynamics around the criticality in epidemiological models can provide us with insights about emerging diseases and disease control.Entropy2017-04-27195Article10.3390/e190501941941099-43002017-04-27doi: 10.3390/e19050194E. ErtenJoseph LizierMahendra PiraveenanMikhail Prokopenko<![CDATA[Entropy, Vol. 19, Pages 193: Stochastic Stirling Engine Operating in Contact with Active Baths]]>
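The critical threshold R0 = 1 referenced above is easiest to see in the deterministic mean-field limit of the SIS model; the study itself uses stochastic agent-based simulation on a network, so the following is only a mean-field sketch of the transition.

```python
def sis_mean_field(beta, gamma, steps=10000, dt=0.01, i0=0.01):
    """Euler-iterate the mean-field SIS equation
        di/dt = beta * i * (1 - i) - gamma * i.
    For R0 = beta/gamma > 1 the infected fraction settles at the
    endemic value 1 - 1/R0; for R0 < 1 it decays to zero.
    """
    i = i0
    for _ in range(steps):
        i += dt * (beta * i * (1 - i) - gamma * i)
    return i

endemic = sis_mean_field(beta=1.0, gamma=0.5)   # R0 = 2 -> i* = 0.5
extinct = sis_mean_field(beta=0.4, gamma=0.5)   # R0 = 0.8 -> i* = 0
```

Information-dynamic measures such as transfer entropy are computed from the fluctuating agent-level time series near this threshold, where the mean-field picture breaks down and correlations peak.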
http://www.mdpi.com/1099-4300/19/5/193
A Stirling engine made of a colloidal particle in contact with a nonequilibrium bath is considered and analyzed with the tools of stochastic energetics. We model the bath by non-Gaussian persistent noise acting on the colloidal particle. Depending on the chosen definition of an isothermal transformation in this nonequilibrium setting, we find that either the energetics of the engine parallels that of its equilibrium counterpart or, in the simplest case, that it ends up being less efficient. Persistence, more than non-Gaussian effects, is responsible for this result.Entropy2017-04-27195Article10.3390/e190501931931099-43002017-04-27doi: 10.3390/e19050193Ruben ZakineAlexandre SolonTodd GingrichFrédéric van Wijland<![CDATA[Entropy, Vol. 19, Pages 147: Assessing Catchment Resilience Using Entropy Associated with Mean Annual Runoff for the Upper Vaal Catchment in South Africa]]>
http://www.mdpi.com/1099-4300/19/5/147
Mean annual runoff (MAR) is a hydrological variable of paramount importance for catchment planning, development and management. MAR depicts the amount of uncertainty or chaos (implicitly, the information content) of the catchment. The uncertainty associated with the MAR of quaternary catchments (QCs) in the Upper Vaal catchment of South Africa has been quantified through Shannon entropy. As a result of chaos over a period of time, the hydrological catchment behavior/response in terms of MAR could be characterized by its resilience. Uncertainty (chaos) in QCs was used as a surrogate measure of catchment resilience. MAR data on surface water resources (WR) of South Africa of 1990 (i.e., WR90), 2005 (WR2005) and 2012 (WR2012) were used in this study. A linear zoning for catchment resilience in terms of water resources sustainability was defined. Regression models (with high correlation) between the relative changes/variations in MAR data sets and relative changes in entropy were established for WR2005 and WR2012. These models were compared with similar relationships for WR90 and WR2005, previously reported. The MAR pseudo-elasticity of the uncertainty associated with MAR was derived from the regression models to characterize the resilience state of QCs. The MAR pseudo-elasticity values were relatively small to have an acceptable level of catchment resilience in the Upper Vaal catchment. Within the resilience zone, it was also shown that the effect of mean annual evaporation (MAE) was negatively significant on MAR pseudo-elasticity, compared to the effect of mean annual precipitation (MAP), which was positively insignificant.Entropy2017-04-27195Article10.3390/e190501471471099-43002017-04-27doi: 10.3390/e19050147Masengo Ilunga<![CDATA[Entropy, Vol. 19, Pages 192: Entropy in the Tangled Nature Model of Evolution]]>
http://www.mdpi.com/1099-4300/19/5/192
Applications of entropy principles to evolution and ecology are of paramount importance, given the central role spatiotemporal structuring plays in both evolution and ecological succession. We obtain here a qualitative interpretation of the role of entropy in evolving ecological systems. Our interpretation is supported by mathematical arguments using simulation data generated by the Tangled Nature Model (TNM), a stochastic model of evolving ecologies. We define two types of configurational entropy and study their empirical time dependence obtained from the data. Both entropy measures increase logarithmically with time, while the entropy per individual decreases in time, in parallel with the growth of emergent structures visible from other aspects of the simulation. We discuss the biological relevance of these entropies to describe niche space and functional space of ecosystems, as well as their use in characterizing the number of taxonomic configurations compatible with different niche partitioning and functionality. The TNM serves as an illustrative example of how to calculate and interpret these entropies, which are, however, also relevant to real ecosystems, where they can be used to calculate the number of functional and taxonomic configurations that an ecosystem can realize. Entropy 2017, 19(5), 192; Article; ISSN 1099-4300; published 2017-04-27; doi: 10.3390/e19050192. Authors: Ty Roach, James Nulton, Paolo Sibani, Forest Rohwer, Peter Salamon.<![CDATA[Entropy, Vol. 19, Pages 189: Entropy-Based Parameter Estimation for the Four-Parameter Exponential Gamma Distribution]]>
http://www.mdpi.com/1099-4300/19/5/189
Two methods based on the principle of maximum entropy (POME), the ordinary entropy method (ENT) and the parameter space expansion method (PSEM), are developed for estimating the parameters of a four-parameter exponential gamma distribution. Using six data sets for annual precipitation at the Weihe River basin in China, the PSEM was applied for estimating parameters of the four-parameter exponential gamma distribution and was compared to the method of moments (MOM) and maximum likelihood estimation (MLE). It is shown that PSEM enables the four-parameter exponential gamma distribution to fit the data well, and can further improve the estimation. Entropy 2017, 19(5), 189; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050189. Authors: Songbai Song, Xiaoyan Song, Yan Kang.<![CDATA[Entropy, Vol. 19, Pages 191: Image Bi-Level Thresholding Based on Gray Level-Local Variance Histogram]]>
http://www.mdpi.com/1099-4300/19/5/191
Thresholding is a popular method of image segmentation. Many thresholding methods utilize only the gray level information of pixels in the image, which may lead to poor segmentation performance because the spatial correlation information between pixels is ignored. To improve the performance of thresholding methods, a novel two-dimensional histogram—called the gray level-local variance (GLLV) histogram—is proposed in this paper as an entropic thresholding method to segment images with bimodal histograms. The GLLV histogram is constructed by using the gray level information of pixels and their local variance in a neighborhood. Local variance measures the dispersion of the gray level distribution of pixels in a neighborhood. If a pixel’s gray level is close to those of its neighboring pixels, its local variance is small, and vice versa. Therefore, local variance can reflect the spatial information between pixels. The GLLV histogram takes not only the gray level, but also the spatial information into consideration. Experimental results show that an entropic thresholding method based on the GLLV histogram can achieve better segmentation performance. Entropy 2017, 19(5), 191; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050191. Authors: Xiulian Zheng, Hong Ye, Yinggan Tang.<![CDATA[Entropy, Vol. 19, Pages 188: When the Map Is Better Than the Territory]]>
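The construction described above pairs each pixel's gray level with the variance of its neighborhood. A minimal sketch of building such a 2D histogram (the neighborhood size, number of variance bins, and edge handling are illustrative assumptions, not the paper's exact choices):

```python
import numpy as np

def gllv_histogram(img, k=3, var_bins=16):
    """Sketch of a gray level-local variance (GLLV) 2D histogram.

    img: 2D uint8 array. For each interior pixel, the variance of its
    k x k neighborhood is computed (border pixels skipped for brevity),
    then a 2D histogram over (gray level, quantized local variance)
    is accumulated.
    """
    h, w = img.shape
    r = k // 2
    hist = np.zeros((256, var_bins), dtype=np.int64)
    levels, variances = [], []
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = img[i - r:i + r + 1, j - r:j + r + 1].astype(float)
            levels.append(int(img[i, j]))
            variances.append(patch.var())
    vmax = max(variances) or 1.0  # avoid division by zero on flat images
    for g, v in zip(levels, variances):
        b = min(int(v / vmax * (var_bins - 1)), var_bins - 1)
        hist[g, b] += 1
    return hist

# A perfectly flat image puts every pixel in the zero-variance bin:
flat = np.full((8, 8), 100, dtype=np.uint8)
print(gllv_histogram(flat)[100, 0])  # 36 interior pixels of a 8x8 image
```

An entropic thresholding method would then search this histogram for the threshold pair maximizing the entropy of the foreground/background regions.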
http://www.mdpi.com/1099-4300/19/5/188
The causal structure of any system can be analyzed at a multitude of spatial and temporal scales. It has long been thought that while higher scale (macro) descriptions may be useful to observers, they are at best a compressed description and at worst leave out critical information and causal relationships. However, recent research applying information theory to causal analysis has shown that the causal structure of some systems can actually come into focus and be more informative at a macroscale. That is, a macroscale description of a system (a map) can be more informative than a fully detailed microscale description of the system (the territory). This has been called “causal emergence.” While causal emergence may at first seem counterintuitive, this paper grounds the phenomenon in a classic concept from information theory: Shannon’s discovery of the channel capacity. I argue that systems have a particular causal capacity, and that different descriptions of those systems take advantage of that capacity to various degrees. For some systems, only macroscale descriptions use the full causal capacity. These macroscales can either be coarse-grains, or may leave variables and states out of the model (exogenous, or “black boxed”) in various ways, which can improve the efficacy and informativeness via the same mathematical principles by which error-correcting codes take advantage of an information channel’s capacity. The causal capacity of a system can approach the channel capacity as more and different kinds of macroscales are considered. Ultimately, this provides a general framework for understanding how the causal structure of some systems cannot be fully captured by even the most detailed microscale description. Entropy 2017, 19(5), 188; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050188. Author: Erik Hoel.<![CDATA[Entropy, Vol. 19, Pages 190: On the Anonymity Risk of Time-Varying User Profiles]]>
http://www.mdpi.com/1099-4300/19/5/190
Websites and applications use personalisation services to profile their users, collect their patterns and activities, and eventually use this data to provide tailored suggestions. User preferences and social interactions are therefore aggregated and analysed. Every time a user publishes a new post or creates a link with another entity, either another user or some online resource, new information is added to the user profile. Exposing private data does not only reveal information about single users’ preferences, increasing their privacy risk, but can expose more about their network than single actors intended. This mechanism is self-evident in social networks, where users receive suggestions based on their friends’ activities. We propose an information-theoretic approach to measure the differential update of the anonymity risk of time-varying user profiles. This expresses how privacy is affected when new content is posted and how much third-party services get to know about the users when a new activity is shared. We use actual Facebook data to show how our model can be applied to a real-world scenario. Entropy 2017, 19(5), 190; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050190. Authors: Silvia Puglisi, David Rebollo-Monedero, Jordi Forné.<![CDATA[Entropy, Vol. 19, Pages 187: Multicomponent and Longitudinal Imaging Seen as a Communication Channel—An Application to Stroke]]>
http://www.mdpi.com/1099-4300/19/5/187
In longitudinal medical studies, multicomponent images of the tissues, acquired at a given stage of a disease, are used to provide information on the fate of the tissues. We propose a quantification of the predictive value of multicomponent images using information theory. To this end, we revisit the predictive information introduced for monodimensional time series and extend it to multicomponent images. The interest of this theoretical approach is illustrated on multicomponent magnetic resonance images acquired on stroke patients at acute and late stages, for which we propose an original and realistic model of noise together with a spatial encoding for the images. From this, we address very practical questions such as the impact of noise on the predictability, the optimal choice of an observation scale and the predictability gain brought by the addition of imaging components. Entropy 2017, 19(5), 187; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050187. Authors: Mathilde Giacalone, Carole Frindel, Emmanuel Grenier, David Rousseau.<![CDATA[Entropy, Vol. 19, Pages 184: Recognition of Traveling Surges in HVDC with Wavelet Entropy]]>
http://www.mdpi.com/1099-4300/19/5/184
Traveling surges are commonly adopted in protection devices of high-voltage direct current (HVDC) transmission systems. Lightning strikes can also produce large-amplitude traveling surges, which lead to the malfunction of relays. To ensure the reliable operation of protection devices, recognition of traveling surges must be considered. Wavelet entropy, which can reveal time-frequency distribution features, is a potential tool for traveling surge recognition. In this paper, the effectiveness of wavelet entropy in characterizing traveling surges is demonstrated by comparing its representations of different kinds of surges and discussing its stability under the effects of propagation distance and fault resistance. A wavelet entropy-based recognition method is proposed and tested on simulated traveling surges. The results show that wavelet entropy can discriminate fault traveling surges with a good recognition rate. Entropy 2017, 19(5), 184; Article; ISSN 1099-4300; published 2017-04-26; doi: 10.3390/e19050184. Authors: Guomin Luo, Qizhi Lin, Lin Zhou, Jinghan He.<![CDATA[Entropy, Vol. 19, Pages 186: A Quantum Description of the Stern–Gerlach Experiment]]>
http://www.mdpi.com/1099-4300/19/5/186
A detailed analysis of the classic Stern–Gerlach experiment is presented. A simple analytical solution is given for the quantum description of the translational and spin dynamics of a silver atom in a magnetic field with a gradient along a single z-direction. This description is then used to obtain an approximate quantum description of the more realistic case with a magnetic field gradient also in a second y-direction. An explicit relation is derived for how an initial off-center deviation in the y-direction affects the final result observed at the detector. This shows that the “mouth shape” pattern at the detector observed in the original Stern–Gerlach experiment is a generic consequence of the gradient in the y-direction. This is followed by a discussion of the spin dynamics during the entry of the silver atom into the magnet. An analytical relation is derived for a simplified case of a field only along the z-direction. A central question for the conceptual understanding of the Stern–Gerlach experiment has been how an initially unpolarized spin ends up in a polarized state at the detector. It is argued that this can be understood with the use of the adiabatic approximation. When the atoms first experience the magnetic field outside the magnet, there is in general a change in the spin state, which transforms from a degenerate eigenstate in the absence of a field into one of two possible non-degenerate states in the field. If the direction of the field changes during the passage through the device, there is a corresponding adiabatic change of the spin state. It is shown that an application of the adiabatic approximation in this way is consistent with the previously derived exact relations. Entropy 2017, 19(5), 186; Article; ISSN 1099-4300; published 2017-04-25; doi: 10.3390/e19050186. Authors: Håkan Wennerström, Per-Olof Westlund.<![CDATA[Entropy, Vol. 19, Pages 173: En-LDA: An Novel Approach to Automatic Bug Report Assignment with Entropy Optimized Latent Dirichlet Allocation]]>
http://www.mdpi.com/1099-4300/19/5/173
With the increasing number of bug reports coming into the open bug repository, it is impossible for software managers to triage bug reports manually. This paper proposes a novel approach called En-LDA (Entropy optimized Latent Dirichlet Allocation (LDA)) for automatic bug report assignment. Specifically, we propose using entropy to optimize the number of topics of the LDA model and further use the entropy-optimized LDA to capture the expertise and interest of developers in bug resolution. A developer’s interest in a topic is modeled by the number of the developer’s comments on bug reports of the topic divided by the number of all the developer’s comments. A developer’s expertise in a topic is modeled by the number of the developer’s comments on bug reports of the topic divided by the number of all developers’ comments on the topic. Given a new bug report, En-LDA recommends a ranked list of developers who are potentially adequate to resolve the new bug. Experiments on the Eclipse JDT and Mozilla Firefox projects show that En-LDA can achieve recall up to 84% and 58%, and precision up to 28% and 41%, respectively, which indicates promising aspects of the proposed approach. Entropy 2017, 19(5), 173; Article; ISSN 1099-4300; published 2017-04-25; doi: 10.3390/e19050173. Authors: Wen Zhang, Yangbo Cui, Taketoshi Yoshida.<![CDATA[Entropy, Vol. 19, Pages 185: Slow Dynamics and Structure of Supercooled Water in Confinement]]>
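The interest and expertise definitions quoted in the abstract above are simple ratios of comment counts. A minimal sketch of those two ratios, using a hypothetical comment log (topic labels would in practice come from the entropy-optimized LDA model, and the developer names here are invented):

```python
from collections import defaultdict

def interest_and_expertise(comments):
    """Compute the abstract's interest and expertise ratios.

    comments: list of (developer, topic) pairs, one per comment on a
    bug report assigned to that topic.
    Returns two dicts keyed by (developer, topic).
    """
    per_dev_topic = defaultdict(int)
    per_dev = defaultdict(int)
    per_topic = defaultdict(int)
    for dev, topic in comments:
        per_dev_topic[(dev, topic)] += 1
        per_dev[dev] += 1
        per_topic[topic] += 1
    # interest: developer's comments on the topic / all of that developer's comments
    interest = {k: n / per_dev[k[0]] for k, n in per_dev_topic.items()}
    # expertise: developer's comments on the topic / all developers' comments on the topic
    expertise = {k: n / per_topic[k[1]] for k, n in per_dev_topic.items()}
    return interest, expertise

# hypothetical comment log
log = [("alice", "ui"), ("alice", "ui"), ("alice", "net"), ("bob", "ui")]
interest, expertise = interest_and_expertise(log)
print(interest[("alice", "ui")])   # 2 of alice's 3 comments are on "ui"
print(expertise[("alice", "ui")])  # alice wrote 2 of the 3 "ui" comments
```

En-LDA would combine such scores over the topics of a new bug report to produce the ranked developer list.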
http://www.mdpi.com/1099-4300/19/4/185
We review our simulation results on properties of supercooled confined water. We consider two situations: water confined in a hydrophilic pore that mimics an MCM-41 environment, and water at the interface with a protein. The behavior upon cooling of the α relaxation of water in both environments is well interpreted in terms of the Mode Coupling Theory of glassy dynamics. Moreover, we find a crossover from a fragile to a strong regime. We relate this crossover to the crossing of the Widom line emanating from the liquid-liquid critical point, and in confinement we connect this crossover also to a crossover of the two-body excess entropy of water upon cooling. Hydration water exhibits a second, distinctly slower relaxation caused by its dynamical coupling with the protein. The crossover upon cooling of this long relaxation is related to the protein dynamics. Entropy 2017, 19(4), 185; Review; ISSN 1099-4300; published 2017-04-24; doi: 10.3390/e19040185. Authors: Gaia Camisasca, Margherita De Marzio, Mauro Rovere, Paola Gallo.<![CDATA[Entropy, Vol. 19, Pages 183: Low Complexity List Decoding for Polar Codes with Multiple CRC Codes]]>
http://www.mdpi.com/1099-4300/19/4/183
Polar codes are the first family of error correcting codes that provably achieve the capacity of symmetric binary-input discrete memoryless channels with low complexity. Since the development of polar codes, there have been many studies to improve their finite-length performance. As a result, polar codes are now adopted as a channel code for the control channel of 5G new radio of the 3rd generation partnership project. However, decoder implementation remains a major practical problem, and low-complexity decoding has been studied. This paper addresses low-complexity successive cancellation list decoding for polar codes utilizing multiple cyclic redundancy check (CRC) codes. While some research uses multiple CRC codes to reduce memory and time complexity, we consider the operational complexity of decoding, and reduce it by optimizing CRC positions in combination with a modified decoding operation. As a result, the proposed scheme obtains not only complexity reduction from early stopping of decoding, but also additional reduction from the reduced number of decoding paths. Entropy 2017, 19(4), 183; Article; ISSN 1099-4300; published 2017-04-24; doi: 10.3390/e19040183. Authors: Jong-Hwan Kim, Sang-Hyo Kim, Ji-Woong Jang, Young-Sik Kim.<![CDATA[Entropy, Vol. 19, Pages 182: Carnot-Like Heat Engines Versus Low-Dissipation Models]]>
http://www.mdpi.com/1099-4300/19/4/182
In this paper, a comparison between two well-known finite time heat engine models is presented: the Carnot-like heat engine, based on specific heat transfer laws between the cyclic system and the external heat baths, and the Low-Dissipation model, where irreversibilities are taken into account by explicit entropy generation laws. We analyze the mathematical relation between the natural variables of both models and, from this, the resulting thermodynamic implications. Among them, particular emphasis has been placed on the physical consistency between the heat leak and time evolution on the one side, and between parabolic and loop-like behaviors of the parametric power-efficiency plots on the other. A detailed analysis for different heat transfer laws in the Carnot-like model in terms of the maximum power efficiencies given by the Low-Dissipation model is also presented. Entropy 2017, 19(4), 182; Article; ISSN 1099-4300; published 2017-04-23; doi: 10.3390/e19040182. Authors: Julian Gonzalez-Ayala, José Roco, Alejandro Medina, Antonio Calvo Hernández.<![CDATA[Entropy, Vol. 19, Pages 180: Using Measured Values in Bell’s Inequalities Entails at Least One Hypothesis in Addition to Local Realism]]>
http://www.mdpi.com/1099-4300/19/4/180
The recent loophole-free experiments have confirmed the violation of Bell’s inequalities in nature. Yet, in order to insert measured values in Bell’s inequalities, it is unavoidable to make a hypothesis similar to “ergodicity at the hidden variables level”. This possibility opens a promising way out from the old controversy between quantum mechanics and local realism. Here, I review the reason why such a hypothesis (actually, it is one of a set of related hypotheses) in addition to local realism is necessary, and present a simple example, related to Bell’s inequalities, where the hypothesis is violated. This example shows that the violation of the additional hypothesis is necessary, but not sufficient, to violate Bell’s inequalities without violating local realism. The example also provides some clues that may reveal the violation of the additional hypothesis in an experiment. Entropy 2017, 19(4), 180; Article; ISSN 1099-4300; published 2017-04-22; doi: 10.3390/e19040180. Author: Alejandro Hnilo.<![CDATA[Entropy, Vol. 19, Pages 181: Citizen Science and Topology of Mind: Complexity, Computation and Criticality in Data-Driven Exploration of Open Complex Systems]]>
http://www.mdpi.com/1099-4300/19/4/181
Recently emerging data-driven citizen sciences need to harness an increasing amount of massive data with varying quality. This paper develops essential theoretical frameworks, example models, and a general definition of complexity measure, and examines its computational complexity for an interactive data-driven citizen science within the context of guided self-organization. We first define a conceptual model that incorporates the quality of observation in terms of accuracy and reproducibility, ranging between subjectivity, inter-subjectivity, and objectivity. Next, we examine the database’s algebraic and topological structure in relation to informational complexity measures, and evaluate its computational complexities with respect to an exhaustive optimization. Conjectures of criticality are obtained on the self-organizing processes of observation and dynamical model development. Example analysis is demonstrated with the use of a biodiversity assessment database—a process that inevitably involves human subjectivity for management within open complex systems. Entropy 2017, 19(4), 181; Article; ISSN 1099-4300; published 2017-04-22; doi: 10.3390/e19040181. Author: Masatoshi Funabashi.<![CDATA[Entropy, Vol. 19, Pages 179: On the Definition of Diversity Order Based on Renyi Entropy for Frequency Selective Fading Channels]]>
http://www.mdpi.com/1099-4300/19/4/179
Outage probabilities are important measures of the performance of wireless communication systems, but to obtain outage probabilities it is necessary to first determine detailed system parameters, followed by complicated calculations. When there are multiple candidate diversity techniques applicable for a system, the diversity order can be used to roughly but quickly compare the techniques for a wide range of operating environments. For a system transmitting over frequency selective fading channels, the diversity order can be defined as the number of multi-paths if all multi-paths have equal energy. However, diversity order may not be adequately defined when the energy values are different. In order to obtain a rough value of diversity order, one may use the number of multi-paths or the reciprocal value of the multi-path energy variance. Such definitions are not very useful for evaluating the performance of diversity techniques, since the former is meaningful only when the target outage probability is extremely small, while the latter is reasonable only when the target outage probability is very large. In this paper, we propose a new definition of diversity order for frequency selective fading channels. The proposed scheme is based on Renyi entropy, which is widely used in biology and many other fields. We provide various simulation results to show that the diversity order using the proposed definition is tightly correlated with the corresponding outage probability, and thus the proposed scheme can be used for quickly selecting the best diversity technique among multiple candidates. Entropy 2017, 19(4), 179; Article; ISSN 1099-4300; published 2017-04-20; doi: 10.3390/e19040179. Authors: Seungyeob Chae, Minjoong Rim.<![CDATA[Entropy, Vol. 19, Pages 177: Entropy in Natural Time and the Associated Complexity Measures]]>
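One standard way a Renyi-entropy-based diversity measure can behave as the abstract describes is as an "effective number" of multi-paths: exponentiating the Renyi entropy of the normalized multi-path energies yields the path count when energies are equal and degrades gracefully when they are not. The sketch below illustrates that idea; it is not the paper's exact definition, and the choice alpha = 2 is an assumption:

```python
import math

def renyi_diversity(energies, alpha=2.0):
    """Effective number of multi-paths via Renyi entropy.

    H_alpha = log(sum p_i^alpha) / (1 - alpha), with p_i the normalized
    multi-path energies; the returned value is exp(H_alpha). For
    alpha -> 1 this reduces to the exponential of the Shannon entropy.
    """
    total = sum(energies)
    p = [e / total for e in energies if e > 0]
    if alpha == 1.0:  # Shannon limit
        return math.exp(-sum(x * math.log(x) for x in p))
    h = math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)
    return math.exp(h)

print(renyi_diversity([1, 1, 1, 1]))              # equal energies: 4 paths
print(renyi_diversity([0.97, 0.01, 0.01, 0.01]))  # dominated: close to 1
```

Unlike the raw path count or the reciprocal energy variance, this measure interpolates smoothly between the two extremes the abstract criticizes.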
http://www.mdpi.com/1099-4300/19/4/177
Natural time is a new time domain introduced in 2001. The analysis of time series associated with a complex system in natural time may provide useful information and may reveal properties that are usually hidden when studying the system in conventional time. In this new time domain, an entropy has been defined, and complexity measures based on this entropy, as well as its value under time reversal, have been introduced and have found applications in various complex systems. Here, we review these applications in the electric signals that precede rupture, e.g., earthquakes, in the analysis of electrocardiograms, as well as in global atmospheric phenomena, like the El Niño/La Niña Southern Oscillation. Entropy 2017, 19(4), 177; Article; ISSN 1099-4300; published 2017-04-20; doi: 10.3390/e19040177. Author: Nicholas Sarlis.<![CDATA[Entropy, Vol. 19, Pages 178: Entropy “2”-Soft Classification of Objects]]>
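The entropy in natural time mentioned above is conventionally defined as S = ⟨χ ln χ⟩ − ⟨χ⟩ ln⟨χ⟩, where χ_k = k/N indexes the k-th of N events and the averages are weighted by the normalized event energies p_k = Q_k / ΣQ_n; comparing S with its value under time reversal (reversing the order of the Q_k) gives the complexity measures the review discusses. A sketch based on that standard definition, with hypothetical event energies:

```python
import math

def natural_time_entropy(Q):
    """Entropy in natural time: S = <chi ln chi> - <chi> ln <chi>.

    chi_k = k/N for the k-th event (k = 1..N), and averages are
    weighted by p_k = Q_k / sum(Q), where Q_k is the k-th event's
    energy. Applying this to Q reversed gives the entropy under
    time reversal.
    """
    N = len(Q)
    total = sum(Q)
    p = [q / total for q in Q]
    chi = [(k + 1) / N for k in range(N)]
    mean_chi = sum(pk * xk for pk, xk in zip(p, chi))
    mean_chi_ln = sum(pk * xk * math.log(xk) for pk, xk in zip(p, chi))
    return mean_chi_ln - mean_chi * math.log(mean_chi)

# hypothetical sequence of event energies
Q = [1.0, 2.0, 5.0, 1.0, 3.0]
print(natural_time_entropy(Q))        # entropy in natural time
print(natural_time_entropy(Q[::-1]))  # entropy under time reversal
```

The asymmetry between the two printed values is precisely what makes the entropy under time reversal informative for signals, such as those preceding rupture, that are not time-symmetric.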
http://www.mdpi.com/1099-4300/19/4/178
We propose a new method for the classification of objects of various nature, named “2”-soft classification, which allows for referring objects to one of two types with optimal entropy probability for an available collection of learning data with consideration of additive errors therein. A decision rule with randomized parameters and a probability density function (PDF) is formed, which is determined by the solution of a problem of functional entropy linear programming. A procedure for “2”-soft classification is developed, consisting of the computer simulation of the randomized decision rule with optimal entropy PDF parameters. Examples are provided. Entropy 2017, 19(4), 178; Article; ISSN 1099-4300; published 2017-04-20; doi: 10.3390/e19040178. Authors: Yuri Popkov, Zeev Volkovich, Yuri Dubnov, Renata Avros, Elena Ravve.<![CDATA[Entropy, Vol. 19, Pages 174: Leaks: Quantum, Classical, Intermediate and More]]>
http://www.mdpi.com/1099-4300/19/4/174
We introduce the notion of a leak for general process theories and identify quantum theory as a theory with minimal leakage, while classical theory has maximal leakage. We provide a construction that adjoins leaks to theories, an instance of which describes the emergence of classical theory by adjoining decoherence leaks to quantum theory. Finally, we show that defining a notion of purity for processes in general process theories has to make reference to the leaks of that theory, a feature missing in standard definitions; hence, we propose a refined definition and study the resulting notion of purity for quantum, classical and intermediate theories. Entropy 2017, 19(4), 174; Article; ISSN 1099-4300; published 2017-04-19; doi: 10.3390/e19040174. Authors: John Selby, Bob Coecke.<![CDATA[Entropy, Vol. 19, Pages 176: Multi-Scale Permutation Entropy Based on Improved LMD and HMM for Rolling Bearing Diagnosis]]>
http://www.mdpi.com/1099-4300/19/4/176
Based on the combination of improved Local Mean Decomposition (LMD), Multi-scale Permutation Entropy (MPE) and the Hidden Markov Model (HMM), the fault types of bearings are diagnosed. Improved LMD is proposed based on the self-similarity of the roller bearing vibration signal, extending the right and left sides of the original signal to suppress its edge effect. First, the vibration signals of the rolling bearing are decomposed into several product function (PF) components by improved LMD. Then, the phase space reconstruction of PF1 is carried out by using the mutual information (MI) method and the false nearest neighbor (FNN) method to calculate the delay time and the embedding dimension, and then the scale is set to obtain the MPE of PF1. After that, the MPE features of rolling bearings are extracted. Finally, the MPE features are used for HMM training and diagnosis. The experimental results show that the proposed method can effectively identify the different faults of the rolling bearing. Entropy 2017, 19(4), 176; Article; ISSN 1099-4300; published 2017-04-19; doi: 10.3390/e19040176. Authors: Yangde Gao, Francesco Villecco, Ming Li, Wanqing Song.<![CDATA[Entropy, Vol. 19, Pages 175: Second Law Analysis of a Mobile Air Conditioning System with Internal Heat Exchanger Using Low GWP Refrigerants]]>
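The MPE feature used above builds on ordinary permutation entropy: count the ordinal patterns of length-m windows of the signal and take the Shannon entropy of their distribution, then repeat on coarse-grained copies for multiple scales. A minimal single-scale sketch (the order m = 3 and delay tau = 1 are illustrative defaults, not the paper's settings):

```python
import math

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of a 1D signal.

    Each length-m window (with delay tau) is mapped to its ordinal
    pattern; the Shannon entropy of the pattern distribution is
    normalized by log(m!) so the result lies in [0, 1].
    """
    counts = {}
    n = len(x) - (m - 1) * tau
    for i in range(n):
        window = tuple(x[i + j * tau] for j in range(m))
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))

def coarse_grain(x, scale):
    """Non-overlapping averages; MPE applies permutation_entropy to
    coarse_grain(x, s) for each scale s."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

print(permutation_entropy(list(range(100))))  # monotonic signal: 0.0
```

A perfectly monotonic signal produces a single ordinal pattern and hence zero entropy, while an irregular (e.g., faulty-bearing) signal spreads probability over many patterns and yields a value near 1.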
http://www.mdpi.com/1099-4300/19/4/175
This paper investigates the results of a Second Law analysis applied to a mobile air conditioning system (MACS) integrated with an internal heat exchanger (IHX), considering R152a, R1234yf and R1234ze as low global warming potential (GWP) refrigerants and establishing R134a as the baseline. System simulation is performed considering the maximum value of entropy generated in the IHX. The maximum entropy production occurs at an effectiveness of 66% for both R152a and R134a, whereas for R1234yf and R1234ze it occurs at 55%. Sub-cooling and superheating effects are evaluated for each of the cases. It is also found that the sub-cooling effect shows the greatest impact on the cycle efficiency. The results also show the influence of isentropic efficiency on relative exergy destruction: the most affected components are the compressor and the condenser for all of the refrigerants studied herein. It is also found that the most efficient operation of the system is obtained when using the R1234ze refrigerant. Entropy 2017, 19(4), 175; Article; ISSN 1099-4300; published 2017-04-19; doi: 10.3390/e19040175. Authors: Vicente Pérez-García, Juan Belman-Flores, José Rodríguez-Muñoz, Víctor Rangel-Hernández, Armando Gallegos-Muñoz.<![CDATA[Entropy, Vol. 19, Pages 172: Multilevel Integration Entropies: The Case of Reconstruction of Structural Quasi-Stability in Building Complex Datasets]]>
http://www.mdpi.com/1099-4300/19/4/172
The emergence of complex datasets permeates diverse research disciplines, leading to the necessity to develop methods for tackling complexity through finding the patterns inherent in datasets. The challenge lies in transforming the extracted patterns into pragmatic knowledge. In this paper, new information entropy measures for the characterization of the multidimensional structure extracted from complex datasets are proposed, complementing the conventionally-applied algebraic topology methods. Derived from topological relationships embedded in datasets, multilevel entropy measures are used to track transitions in building the high-dimensional structure of datasets captured by the stratified partition of a simplicial complex. The proposed entropies are found suitable for defining and operationalizing the intuitive notions of structural relationships in a cumulative experience of a taxi driver’s cognitive map formed by origins and destinations. The comparison of multilevel integration entropies calculated after each new ride is added to the data structure indicates a slowing pace of change over time in the origin-destination structure. The repetitiveness of taxi driver rides and the stability of the origin-destination structure exhibit the relative invariance of rides in space and time. These results shed light on the taxi driver’s ride habits, as well as on the commuting of the persons whom he/she drove. Entropy 2017, 19(4), 172; Article; ISSN 1099-4300; published 2017-04-18; doi: 10.3390/e19040172. Authors: Slobodan Maletić, Yi Zhao.<![CDATA[Entropy, Vol. 19, Pages 146: Design and Implementation of SOC Prediction for a Li-Ion Battery Pack in an Electric Car with an Embedded System]]>
http://www.mdpi.com/1099-4300/19/4/146
Li-Ion batteries are widely preferred in electric vehicles. The charge status of batteries is a critical evaluation issue, and many researchers are working in this area. State of charge gives information about how much longer the battery can be used and when the charging process will be cut off. Incorrect predictions may cause overcharging or over-discharging of the battery. In this study, a low-cost embedded system is used to determine the state of charge of an electric car. A Li-Ion battery cell is trained using a feed-forward neural network via Matlab/Neural Network Toolbox. The trained cell model is adapted to the whole battery pack of the electric car and embedded via Matlab/Simulink into a low-cost microcontroller that runs the proposed system in real time. The experimental results indicated that accurate and robust estimation results could be obtained by the proposed system. Entropy 2017, 19(4), 146; Article; ISSN 1099-4300; published 2017-04-17; doi: 10.3390/e19040146. Authors: Emel Soylu, Tuncay Soylu, Raif Bayir.<![CDATA[Entropy, Vol. 19, Pages 171: Entropy Generation of Double Diffusive Forced Convection in Porous Channels with Thick Walls and Soret Effect]]>
http://www.mdpi.com/1099-4300/19/4/171
The second law performance of double diffusive forced convection in a horizontal porous channel with thick walls was considered. The Soret effect is included in the concentration equation, and a first-order chemical reaction was chosen for the concentration boundary conditions at the porous-solid wall interfaces. This investigation is focused on two principal types of boundary conditions. The first assumes a constant temperature condition at the outer surfaces of the solid walls, and the second assumes a constant heat flux at the lower wall and convection heat transfer at the upper wall. After obtaining the velocity, temperature and concentration distributions, the local and total entropy generation formulations were used to visualize the second law performance of the two cases. The results indicate that the total entropy generation rate is directly related to the lower wall thickness. Interestingly, it was observed that the total entropy generation rate for the second case reaches a minimum value if the upper and lower wall thicknesses are chosen correctly. However, this observation was not true for the first case. These analyses can be useful for the design of microreactors and microcombustor systems when the second law analysis is taken into account. Entropy 2017, 19(4), 171; Article; ISSN 1099-4300; published 2017-04-15; doi: 10.3390/e19040171. Authors: Mohsen Torabi, Mehrdad Torabi, G.P. Peterson.<![CDATA[Entropy, Vol. 19, Pages 170: Dynamic Rankings for Seed Selection in Complex Networks: Balancing Costs and Coverage]]>
http://www.mdpi.com/1099-4300/19/4/170
Information spreading processes within complex networks are usually initiated by a selection of highly influential nodes in accordance with the chosen seeding strategy. The majority of earlier studies assumed the usage of selected seeds at the beginning of the process. Our previous research revealed the advantage of using a sequence of seeds instead of a single stage approach. The current study extends sequential seeding and further improves results with the use of dynamic rankings, which are created by recalculation of the network measures used for additional seed selection during the process, instead of a static ranking computed only once at the beginning. For the calculation of network centrality measures such as degree, only non-infected nodes are taken into account. Results showed increased coverage, represented by the percentage of activated nodes, dependent on the intervals between recalculations, as well as a trade-off between outcome and computational costs. For over 90% of simulation cases, dynamic rankings with a high frequency of recalculations delivered better coverage than approaches based on static rankings. Entropy 2017, 19(4), 170; Article; ISSN 1099-4300; published 2017-04-15; doi: 10.3390/e19040170. Author: Jarosław Jankowski.<![CDATA[Entropy, Vol. 19, Pages 169: Where There is Life There is Mind: In Support of a Strong Life-Mind Continuity Thesis]]>
http://www.mdpi.com/1099-4300/19/4/169
This paper considers questions about continuity and discontinuity between life and mind. It begins by examining such questions from the perspective of the free energy principle (FEP). The FEP is becoming increasingly influential in neuroscience and cognitive science. It says that organisms act to maintain themselves in their expected biological and cognitive states, and that they can do so only by minimizing their free energy given that the long-term average of free energy is entropy. The paper then argues that there is no singular interpretation of the FEP for thinking about the relation between life and mind. Some FEP formulations express what we call an independence view of life and mind. One independence view is a cognitivist view of the FEP. It turns on information processing with semantic content, thus restricting the range of systems capable of exhibiting mentality. Other independence views exemplify what we call an overly generous non-cognitivist view of the FEP, and these appear to go in the opposite direction. That is, they imply that mentality is nearly everywhere. The paper proceeds to argue that non-cognitivist FEP, and its implications for thinking about the relation between life and mind, can be usefully constrained by key ideas in recent enactive approaches to cognitive science. We conclude that the most compelling account of the relationship between life and mind treats them as strongly continuous, and that this continuity is based on particular concepts of life (autopoiesis and adaptivity) and mind (basic and non-semantic). Entropy 2017, 19(4), 169 (Article); doi: 10.3390/e19040169; published 2017-04-14. Authors: Michael D. Kirchhoff, Tom Froese.<![CDATA[Entropy, Vol. 19, Pages 167: Application of the Fuzzy Oil Drop Model Describes Amyloid as a Ribbonlike Micelle]]>
http://www.mdpi.com/1099-4300/19/4/167
We propose a mathematical model describing the formation of micellar forms—whether spherical, globular, cylindrical, or ribbonlike—as well as its adaptation to protein structure. Our model, based on the fuzzy oil drop paradigm, assumes that in a spherical micelle the distribution of hydrophobicity produced by the alignment of polar molecules with the external water environment can be modeled by a 3D Gaussian function. Perturbing this function by changing the values of its sigma parameters leads to a variety of conformations—the model is therefore applicable to globular, cylindrical, and ribbonlike micelles. In the context of protein structures ranging from globular to ribbonlike, our model can explain the emergence of fibrillar forms, particularly amyloids. Entropy 2017, 19(4), 167 (Article); doi: 10.3390/e19040167; published 2017-04-14. Authors: Irena Roterman, Mateusz Banach, Leszek Konieczny.<![CDATA[Entropy, Vol. 19, Pages 159: A Study of the Transfer Entropy Networks on Industrial Electricity Consumption]]>
http://www.mdpi.com/1099-4300/19/4/159
We study information transfer routes among cross-industry and cross-region electricity consumption data based on transfer entropy and the MST (Minimum Spanning Tree) model. First, we characterize the information transfer routes with transfer entropy matrices, and find that the total entropy transfer of the relatively developed Guangdong Province is lower than that of the other provinces, with significant industrial clustering within the province. Furthermore, using a reshuffling method, we find that driven industries carry much more information flow than driving industries, and are more influential on the degree of order of regional industries. Finally, based on the Chu-Liu-Edmonds MST algorithm, we extract the minimum spanning trees of provincial industries. The individual MSTs show chain-like formations in developed provinces and star-like structures in developing provinces. Additionally, all MSTs rooted at the industrial sector with minimal information outflow are chain-shaped. Entropy 2017, 19(4), 159 (Article); doi: 10.3390/e19040159; published 2017-04-13. Authors: Can-Zhong Yao, Peng-Cheng Kuang, Qing-Wen Lin, Bo-Yi Sun.<![CDATA[Entropy, Vol. 19, Pages 166: Entropic Aspects of Nonlinear Partial Differential Equations: Classical and Quantum Mechanical Perspectives]]>
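The abstract above extracts minimum spanning trees from a directed graph of transfer entropies via the Chu-Liu-Edmonds algorithm. As a hedged illustration of the general idea only, the sketch below applies Kruskal's algorithm, the simpler undirected analogue, to a small symmetric distance matrix; the matrix values are invented for illustration, not taken from the paper:

```python
def mst_edges(dist):
    """Kruskal's algorithm on a symmetric distance matrix.
    Returns the (i, j) edges of a minimum spanning tree."""
    n = len(dist)
    parent = list(range(n))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # joining two components: keep the edge
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Hypothetical 4-node distance matrix; the cheapest tree is the chain 0-1-2-3.
dist = [[0, 1, 4, 4],
        [1, 0, 2, 4],
        [4, 2, 0, 3],
        [4, 4, 3, 0]]
print(mst_edges(dist))
```

In practice a distance would first be derived from the pairwise transfer entropies (e.g., larger entropy transfer mapped to smaller distance) before building the tree.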
http://www.mdpi.com/1099-4300/19/4/166
There has been increasing research activity in recent years concerning the properties and the applications of nonlinear partial differential equations that are closely related to nonstandard entropic functionals, such as the Tsallis and Rényi entropies. [...] Entropy 2017, 19(4), 166 (Editorial); doi: 10.3390/e19040166; published 2017-04-12. Author: Angelo Plastino.<![CDATA[Entropy, Vol. 19, Pages 165: An Entropy-Based Approach for Evaluating Travel Time Predictability Based on Vehicle Trajectory Data]]>
http://www.mdpi.com/1099-4300/19/4/165
With the rapid development of intelligent transportation systems (ITS), travel time prediction has attracted the interest of many researchers, and a large number of prediction methods have been developed. However, the predictability of travel time series, the basic premise of travel time prediction, has received far less attention than the prediction methodology itself. Based on an analysis of the complexity of the travel time series, this paper defines travel time predictability as the probability of correctly predicting travel time, and proposes an entropy-based method to measure the upper bound of travel time predictability. Multiscale entropy is employed to quantify the complexity of the travel time series, and the relationships between entropy and the upper bound of travel time predictability are presented. Empirical studies are conducted with vehicle trajectory data from an expressway section to characterise the features of travel time predictability. The effects of time scale, tolerance, and series length on entropy and travel time predictability are analyzed, and some valuable suggestions about the accuracy of travel time predictability are discussed. Finally, comparisons are made between travel time predictability and the actual prediction results of two prediction models, ARIMA and BPNN. Experimental results demonstrate the validity and reliability of the proposed travel time predictability. Entropy 2017, 19(4), 165 (Article); doi: 10.3390/e19040165; published 2017-04-11. Authors: Tao Xu, Xianrui Xu, Yujie Hu, Xiang Li.<![CDATA[Entropy, Vol. 19, Pages 164: Heisenberg and Entropic Uncertainty Measures for Large-Dimensional Harmonic Systems]]>
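Multiscale entropy, as used in the abstract above, coarse-grains the series at each scale factor and then computes sample entropy of the coarse-grained series. A minimal sketch, assuming Chebyshev distance and a simplified plug-in match count (not the paper's exact estimator or parameters):

```python
import math

def coarse_grain(series, scale):
    """Non-overlapping window averages, the coarse-graining step of
    multiscale entropy."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(u, m=2, r=0.2):
    """Sample entropy -ln(A/B): B counts template pairs of length m
    within Chebyshev tolerance r, A those of length m + 1."""
    def count(mm):
        templates = [u[i:i + mm] for i in range(len(u) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c

    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(series, scales, m=2, r=0.2):
    """Sample entropy of the coarse-grained series at each scale factor."""
    return [sample_entropy(coarse_grain(series, s), m, r) for s in scales]
```

The quadratic-time match count is fine for short demonstrations; production estimators use the standard asymmetric template convention and r scaled by the series' standard deviation.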
http://www.mdpi.com/1099-4300/19/4/164
The D-dimensional harmonic system (i.e., a particle moving under the action of a quadratic potential) is, together with the hydrogenic system, the main prototype of the physics of multidimensional quantum systems. In this work, we rigorously determine the leading term of the Heisenberg-like and entropy-like uncertainty measures of this system as given by the radial expectation values and the Rényi entropies, respectively, at the limit of large D. The associated multidimensional position-momentum uncertainty relations are discussed, showing that they saturate the corresponding general ones. A conjecture about the Shannon-like uncertainty relation is given, and an interesting phenomenon is observed: the Heisenberg-like and Rényi-entropy-based equality-type uncertainty relations for all of the D-dimensional harmonic oscillator states in the pseudoclassical (D → ∞) limit are the same as the corresponding ones for the hydrogenic systems, despite the very different character of the oscillator and Coulomb potentials. Entropy 2017, 19(4), 164 (Article); doi: 10.3390/e19040164; published 2017-04-09. Authors: David Puertas-Centeno, Irene Toranzo, Jesús Dehesa.<![CDATA[Entropy, Vol. 19, Pages 163: Modelling Urban Sprawl Using Remotely Sensed Data: A Case Study of Chennai City, Tamilnadu]]>
http://www.mdpi.com/1099-4300/19/4/163
Urban sprawl (US), propelled by rapid population growth, leads to the shrinkage of productive agricultural lands and pristine forests in suburban areas and, in turn, adversely affects the provision of ecosystem services. The quantification of US is thus crucial for effective urban planning and environmental management. Like many megacities in fast-growing developing countries, Chennai, the capital of Tamilnadu and one of the business hubs of India, has experienced extensive US triggered by the doubling of its total population over the past three decades. However, the extent and level of US has not yet been quantified, and a prediction of the future extent of US is lacking. We employed Random Forest (RF) classification on Landsat imagery from 1991, 2003, and 2016, and computed six landscape metrics to delineate the extent of urban areas within a 10 km suburban buffer of Chennai. The level of US was then quantified using Rényi's entropy. A land change model was subsequently used to project land cover for 2027. A 70.35% expansion in urban areas was observed, mainly towards the suburban periphery of Chennai, between 1991 and 2016. The Rényi entropy value for 2016 was 0.9, exhibiting a two-fold level of US compared to 1991. The spatial metric values indicate that the existing urban areas became denser and that suburban agricultural, forest, and particularly barren lands were transformed into fragmented urban settlements. The forecasted land cover for 2027 indicates a conversion of 13,670.33 ha (16.57% of the total landscape) of existing forests and agricultural lands into urban areas, with an associated increase in the entropy value to 1.7, indicating a tremendous level of US. Our study provides useful metrics for urban planning authorities to address the social-ecological consequences of US and to protect ecosystem services. Entropy 2017, 19(4), 163 (Article); doi: 10.3390/e19040163; published 2017-04-07. Authors: Rajchandar Padmanaban, Avit K. Bhowmik, Pedro Cabral, Alexander Zamyatin, Oraib Almegdadi, Shuangao Wang.<![CDATA[Entropy, Vol. 19, Pages 162: Situatedness and Embodiment of Computational Systems]]>
http://www.mdpi.com/1099-4300/19/4/162
In this paper, the role of the environment and physical embodiment of computational systems for explanatory purposes will be analyzed. In particular, the focus will be on cognitive computational systems, understood in terms of mechanisms that manipulate semantic information. It will be argued that the role of the environment has long been appreciated, in particular in the work of Herbert A. Simon, which has inspired the mechanistic view on explanation. From Simon’s perspective, the embodied view on cognition seems natural, but it is nowhere near as critical as its proponents suggest. The only point of difference between Simon and embodied cognition is the significance of body-based off-line cognition; however, it will be argued that this is notoriously over-appreciated in the current debate. The new mechanistic view on explanation stresses that, even though it is critical to situate a mechanism in its environment and study its physical composition, or realization, not all detail counts, and some bodily features of cognitive systems should be left out of explanations. Entropy 2017, 19(4), 162 (Article); doi: 10.3390/e19040162; published 2017-04-07. Author: Marcin Miłkowski.<![CDATA[Entropy, Vol. 19, Pages 161: P-Adic Analog of Navier–Stokes Equations: Dynamics of Fluid’s Flow in Percolation Networks (from Discrete Dynamics with Hierarchic Interactions to Continuous Universal Scaling Model)]]>
http://www.mdpi.com/1099-4300/19/4/161
Recently, p-adic (and, more generally, ultrametric) spaces representing tree-like networks of percolation, and as a special case capillary patterns in porous media, have started to be used to model the propagation of fluids (e.g., oil, water, oil-in-water, and water-in-oil emulsion). The aim of this note is to derive the p-adic dynamics described by fractional differential operators (Vladimirov operators), starting with discrete dynamics based on hierarchically-structured interactions between the fluids’ volumes concentrated at different levels of the percolation tree, and passing to the multiscale universal topology of the percolating nets. Similar systems of discrete hierarchic equations have been widely applied to the modeling of turbulence. However, in the present work this similarity is only formal since, in our model, the trees are real physical patterns with a tree-like topology of capillaries (or fractures) in random porous media (not cascade trees, as in the case of turbulence, which will be discussed elsewhere for the spinner flowmeter commonly used in the petroleum industry). By going to the “continuous limit” (with respect to the p-adic topology) we represent the dynamics on the tree-like configuration space as an evolutionary nonlinear p-adic fractional (pseudo-) differential equation, the tree-like analog of the Navier–Stokes equation. We hope that our work brings a solution of this nonlinear equation closer, taking into account the scaling, hierarchies, and formal derivations imprinted from the similar properties of the real physical world. Once this coupling is resolved, the more problematic question of information scaling in industrial applications can be addressed. Entropy 2017, 19(4), 161 (Article); doi: 10.3390/e19040161; published 2017-04-07. Authors: Klaudia Oleschko, Andrei Khrennikov, María Correa López.<![CDATA[Entropy, Vol. 19, Pages 160: Consistent Estimation of Partition Markov Models]]>
http://www.mdpi.com/1099-4300/19/4/160
The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element of the alphabet. This model aims to answer two questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated? In order to answer these questions, we build a consistent model selection strategy, which consists of the following: given a size-n realization of the process, find a model within the Partition Markov class with a minimal number of parts that represents the process law. From this strategy, we derive a measure that establishes a metric on the state space. In addition, we show that if the law of the process is Markovian, then, eventually, as n goes to infinity, L will be retrieved. We show an application to modeling internet navigation patterns. Entropy 2017, 19(4), 160 (Article); doi: 10.3390/e19040160; published 2017-04-06. Authors: Jesús García, Verónica González-López.<![CDATA[Entropy, Vol. 19, Pages 152: An Approach to the Evaluation of the Quality of Accounting Information Based on Relative Entropy in Fuzzy Linguistic Environments]]>
http://www.mdpi.com/1099-4300/19/4/152
Company stakeholders run a risk when they make decisions that treat accounting information of varying quality in the same way. To evaluate accounting information quality, this paper proposes an approach based on relative entropy in fuzzy linguistic environments. Firstly, the accounting information quality evaluation criteria are constructed, covering not only the quality of the accounting information content but also the accounting information generation environment. Since the rating values with respect to the criteria are given in linguistic forms with different granularities, a method for handling the linguistic rating values is provided. In this method, the linguistic terms are modeled with the 2-tuple linguistic model. Relative entropy is used to calculate the information consistency, from which the weights of experts and criteria are derived. Finally, an example is given to illustrate the feasibility and practicability of the proposed method. Entropy 2017, 19(4), 152 (Article); doi: 10.3390/e19040152; published 2017-04-05. Authors: Ming Li, Xiaoli Ning, Mingzhu Li, Yingcheng Xu.<![CDATA[Entropy, Vol. 19, Pages 158: Nonequilibrium Thermodynamics and Steady State Density Matrix for Quantum Open Systems]]>
http://www.mdpi.com/1099-4300/19/4/158
We consider the generic model of a finite-size quantum electron system connected to two (temperature and particle) reservoirs. The quantum open system is driven out of equilibrium by the presence of both temperature and chemical potential differences between the two reservoirs. The nonequilibrium (NE) thermodynamical properties of such a quantum open system are studied for the steady state regime. In this regime, the corresponding NE density matrix is built on the so-called generalised Gibbs ensembles. From different expressions of the NE density matrix, we can identify the terms related to the entropy production in the system. We show, for a simple model, that the entropy production rate is always a positive quantity. Alternative expressions for the entropy production are also obtained from the conventional Gibbs–von Neumann formula and discussed in detail. Our results corroborate and expand earlier works found in the literature. Entropy 2017, 19(4), 158 (Article); doi: 10.3390/e19040158; published 2017-04-02. Author: Hervé Ness.<![CDATA[Entropy, Vol. 19, Pages 157: Quadratic Mutual Information Feature Selection]]>
http://www.mdpi.com/1099-4300/19/4/157
We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses direct estimation of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second-order non-linear relations. Its main advantages are: (i) the unified analysis of discrete and continuous data, excluding any discretization; and (ii) its parameter-free design. The effectiveness of the proposed method is demonstrated through an extensive comparison with mutual information feature selection (MIFS), minimum redundancy maximum relevance (MRMR), and joint mutual information (JMI) on classification and regression problem domains. The experiments show that the proposed method performs comparably to the other methods on classification problems while being considerably faster. In the case of regression, it compares favourably to the others, but is slower. Entropy 2017, 19(4), 157 (Article); doi: 10.3390/e19040157; published 2017-04-01. Authors: Davor Sluga, Uroš Lotrič.<![CDATA[Entropy, Vol. 19, Pages 156: Use of Exergy Analysis to Quantify the Effect of Lithium Bromide Concentration in an Absorption Chiller]]>
http://www.mdpi.com/1099-4300/19/4/156
Absorption chillers present opportunities to utilize sustainable fuels in the production of chilled water. An assessment of the steam-driven absorption chiller at the University of Idaho was performed to quantify the current exergy destruction rates. Measurements of external processes and flows were used to create a mathematical model in Engineering Equation Solver, which was used to analyze and identify the major sources of exergy destruction within the chiller. It was determined that the absorber, generator, and condenser are the largest contributors to the exergy destruction, at 30%, 31%, and 28% of the total, respectively. The exergetic efficiency is found to be 16%, with a coefficient of performance (COP) of 0.65. The impact of the weak solution concentration of lithium bromide on the exergy destruction rates was evaluated using parametric studies. The studies revealed an optimum concentration: by increasing the weak solution concentration from 56% to 58.8%, a net decrease of 0.4% in the exergy destruction caused by the absorption chiller can be obtained. This 2.8% increase in lithium bromide concentration decreases the exergy destruction primarily within the absorber, with a decrease of 5.1%. The increase in concentration is also shown to decrease the maximum cooling capacity by 3% and to increase the exergy destruction of the generator by 4.9%. The study also shows that the increase in concentration changes the internal temperatures by 3 to 7 °C. Conversely, reducing the weak solution concentration is shown to increase the exergy destruction rates while potentially increasing the cooling capacity. Entropy 2017, 19(4), 156 (Article); doi: 10.3390/e19040156; published 2017-04-01. Authors: Andrew Lake, Behanz Rezaie, Steven Beyerlein.<![CDATA[Entropy, Vol. 19, Pages 155: Random Walks Associated with Nonlinear Fokker–Planck Equations]]>
http://www.mdpi.com/1099-4300/19/4/155
A nonlinear random walk related to the porous medium equation (nonlinear Fokker–Planck equation) is investigated. This random walk is such that, when the number of steps is sufficiently large, the probability of finding the walker at a given position after a given number of steps approaches a q-Gaussian distribution, G_{q,β}(x) ∝ [1 − (1 − q)βx²]^{1/(1−q)}, which is a solution of the porous medium equation. This can be seen as a verification of a generalized central limit theorem where the attractor is a q-Gaussian distribution, reducing to the Gaussian one when linearity is recovered (q → 1). In addition, motivated by this random walk, a nonlinear Markov chain is suggested. Entropy 2017, 19(4), 155 (Article); doi: 10.3390/e19040155; published 2017-04-01. Authors: Renio dos Santos Mendes, Ervin Lenzi, Luis Malacarne, Sergio Picoli, Max Jauregui.<![CDATA[Entropy, Vol. 19, Pages 153: Maxentropic Solutions to a Convex Interpolation Problem Motivated by Utility Theory]]>
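The q-Gaussian attractor named in the abstract can be written down directly. A minimal sketch of the unnormalised density, showing that it reduces to the Gaussian form exp(−βx²) in the q → 1 limit (the threshold used to switch branches is an implementation choice, not part of the paper):

```python
import math

def q_gaussian(x, q, beta):
    """Unnormalised q-Gaussian [1 - (1 - q)*beta*x^2]_+^{1/(1-q)}.
    For q -> 1 this converges to the Gaussian exp(-beta*x^2);
    for q > 1 the support is unbounded and the tails are heavy."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(-beta * x * x)          # Gaussian limit
    base = 1.0 - (1.0 - q) * beta * x * x
    # [.]_+ : the density vanishes where the bracket is negative (q < 1 case).
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```

A quick sanity check is that far from the origin the q = 2 density (a Cauchy-like law) dominates the Gaussian, which is the heavy-tail behaviour the generalized central limit theorem concerns.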
http://www.mdpi.com/1099-4300/19/4/153
Here, we consider the following inverse problem: the determination of an increasing continuous function U(x) on an interval [a, b] from the knowledge of the integrals ∫ U(x) dF_{X_i}(x) = π_i, where the X_i are random variables taking values on [a, b] and the π_i are given numbers. This is a linear integral equation with discrete data, which can be transformed into a generalized moment problem when U(x) is supposed to have a positive derivative, and it becomes a classical interpolation problem if the X_i are deterministic. In some cases, e.g., in utility theory in economics, natural growth and convexity constraints are required on the function, which makes the inverse problem more interesting. Not only that, the data may be provided in intervals and/or measured up to an additive error. It is the purpose of this work to show how the standard method of maximum entropy, as well as the method of maximum entropy in the mean, provides an efficient way to deal with these problems. Entropy 2017, 19(4), 153 (Article); doi: 10.3390/e19040153; published 2017-04-01. Authors: Henryk Gzyl, Silvia Mayoral.<![CDATA[Entropy, Vol. 19, Pages 154: Is Turbulence a State of Maximum Energy Dissipation?]]>
http://www.mdpi.com/1099-4300/19/4/154
Turbulent flows are known to enhance turbulent transport. It has even been suggested that turbulence is a state of maximum energy dissipation. In this paper, we critically re-examine this suggestion in light of several recent works on the Maximum Entropy Production principle (MEP), which has been applied to several out-of-equilibrium systems. We provide a set of four different optimization principles, based on the maximization of energy dissipation, entropy production, and Kolmogorov–Sinai entropy, and the minimization of mixing time, and study the connection between these principles using simple out-of-equilibrium models describing the mixing of a scalar quantity. We find that there is a chained relationship between the most probable stationary states of the system and their ability to obey one of the four principles. This provides an empirical justification of the Maximum Entropy Production principle in this class of systems, including some turbulent flows, for special boundary conditions. Otherwise, we claim that the minimization of the mixing time would be a more appropriate principle. We stress that this principle might actually be limited to flows where symmetry or dynamics impose pure mixing of a quantity (like angular momentum, momentum, or temperature). The claim that turbulence is a state of maximum energy dissipation, a quantity intimately related to entropy production, is therefore limited to special situations that nevertheless include classical systems such as shear flows, Rayleigh–Bénard convection, and von Kármán flows, forced with constant velocity or temperature conditions. Entropy 2017, 19(4), 154 (Article); doi: 10.3390/e19040154; published 2017-03-31. Authors: Martin Mihelich, Davide Faranda, Didier Paillard, Bérengère Dubrulle.<![CDATA[Entropy, Vol. 19, Pages 151: A Combined Entropy/Phase-Field Approach to Gravity]]>
http://www.mdpi.com/1099-4300/19/4/151
Terms related to gradients of scalar fields are introduced as scalar products into the formulation of entropy. A Lagrange density is then formulated by adding constraints based on known conservation laws. Applying the Lagrange formalism to the resulting Lagrange density leads to the Poisson equation of gravitation and also includes terms which are related to the curvature of space. The formalism further leads to terms possibly explaining nonlinear extensions known from modified Newtonian dynamics approaches. The article concludes with a short discussion of the presented methodology and provides an outlook on other phenomena which might be dealt with using this new approach. Entropy 2017, 19(4), 151 (Article); doi: 10.3390/e19040151; published 2017-03-31. Author: Georg J. Schmitz.<![CDATA[Entropy, Vol. 19, Pages 149: A Distribution Family Bridging the Gaussian and the Laplace Laws, Gram–Charlier Expansions, Kurtosis Behaviour, and Entropy Features]]>
http://www.mdpi.com/1099-4300/19/4/149
The paper devises a family of leptokurtic bell-shaped distributions which is based on the hyperbolic secant raised to a positive power, and bridges the Laplace and Gaussian laws on asymptotic arguments. Moment and cumulant generating functions are then derived and represented in terms of polygamma functions. The behaviour of shape parameters, namely kurtosis and entropy, is investigated. In addition, Gram–Charlier-type (GCT) expansions, based on the aforementioned distributions and their orthogonal polynomials, are specified, and an operational criterion is provided to meet modelling requirements in a possibly severe kurtosis and skewness environment. The role played by entropy within the kurtosis ranges of GCT expansions is also examined. Entropy 2017, 19(4), 149 (Article); doi: 10.3390/e19040149; published 2017-03-31. Authors: Mario Faliva, Maria Zoia.<![CDATA[Entropy, Vol. 19, Pages 150: Minimum Sample Size for Reliable Causal Inference Using Transfer Entropy]]>
http://www.mdpi.com/1099-4300/19/4/150
Transfer entropy has been applied to experimental datasets to unveil causality between variables. In particular, its application to non-stationary systems has posed a great challenge due to restrictions on the sample size. Here, we have investigated the minimum sample size that produces a reliable causal inference. The methodology has been applied to two prototypical models: the linear autoregressive moving average model and the non-linear logistic map. The relationship between the transfer entropy value and the sample size has been systematically examined. Additionally, we have shown how the reliable sample size depends on the strength of the coupling between the variables. Our methodology offers a realistic lower bound for the sample size needed to produce a reliable outcome. Entropy 2017, 19(4), 150 (Article); doi: 10.3390/e19040150; published 2017-03-31. Authors: Antônio Ramos, Elbert Macau.<![CDATA[Entropy, Vol. 19, Pages 148: Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information]]>
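For readers unfamiliar with the estimator, transfer entropy with history length 1 can be computed from plug-in frequency counts. A minimal sketch for discrete symbol sequences (the history length, binary alphabet, and toy coupling below are illustrative choices, not the paper's setup):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T(Y -> X) in nats with
    history length 1:
        T = sum_{x1,x0,y0} p(x1, x0, y0) * log[ p(x1|x0,y0) / p(x1|x0) ]
    i.e., how much knowing y helps predict the next value of x."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_x1_given_x0y0 = c / pairs_xy[(x0, y0)]
        p_x1_given_x0 = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * math.log(p_x1_given_x0y0 / p_x1_given_x0)
    return te

# Toy check: y is a one-step delayed copy of x, so information flows
# from x to y, and T(X -> Y) should dominate T(Y -> X).
random.seed(1)
x = [random.randrange(2) for _ in range(2000)]
y = [0] + x[:-1]
print(transfer_entropy(y, x), transfer_entropy(x, y))
```

The positive small-sample bias of this plug-in estimate is exactly why the minimum reliable sample size investigated in the paper matters.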
http://www.mdpi.com/1099-4300/19/4/148
This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert time series of the digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., with no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning. Entropy 2017, 19(4), 148 (Article); doi: 10.3390/e19040148; published 2017-03-31. Authors: Yue Li, Asok Ray.<![CDATA[Entropy, Vol. 19, Pages 145: Multiscale Cross-Approximate Entropy Analysis of Bilateral Fingertips Photoplethysmographic Pulse Amplitudes among Middle-to-Old Aged Individuals with or without Type 2 Diabetes]]>
http://www.mdpi.com/1099-4300/19/4/145
Multiscale cross-approximate entropy (MC-ApEn) between two different physiological signals can evaluate cardiovascular health in diabetes. Whether MC-ApEn analysis between two similar signals, such as the photoplethysmographic (PPG) pulse amplitudes of bilateral fingertips, can reflect diabetes status is unknown. From a middle-to-old-aged population free of prior cardiovascular disease, we selected the unaffected (no type 2 diabetes, n = 36), well-controlled diabetes (glycated hemoglobin (HbA1c) < 8%, n = 30), and poorly-controlled diabetes (HbA1c ≥ 8%, n = 26) groups. MC-ApEn indexes were calculated from 1500 simultaneous consecutive PPG pulse amplitude signals of the bilateral index fingertips. The averages over scale factors 1–5 (MC-ApEnSS) and over scale factors 6–10 (MC-ApEnLS) were defined as the small- and large-scale MC-ApEn, respectively. The MC-ApEnLS index was highest in the unaffected group, followed by the well-controlled diabetes group and then the poorly-controlled diabetes group (0.70, 0.62, and 0.53; all paired p-values < 0.05); in contrast, the MC-ApEnSS index did not differ between groups. Our findings suggest that the bilateral fingertip large-scale MC-ApEnLS index of PPG pulse amplitudes might be able to evaluate glycemic status and detect subtle vascular disease in type 2 diabetes. Entropy 2017, 19(4), 145 (Article); doi: 10.3390/e19040145; published 2017-03-30. Authors: Hsien-Tsai Wu, Cheng-Chan Yang, Gen-Min Lin, Bagus Haryadi, Shiao-Chiang Chu, Chieh-Ming Yang, Cheuk-Kwan Sun.<![CDATA[Entropy, Vol. 19, Pages 144: The Many Classical Faces of Quantum Structures]]>
http://www.mdpi.com/1099-4300/19/4/144
Interpretational problems with quantum mechanics can be phrased precisely by only talking about empirically accessible information. This prompts a mathematical reformulation of quantum mechanics in terms of classical mechanics. We survey this programme in terms of algebraic quantum theory. Entropy 2017, 19(4), 144 (Article); doi: 10.3390/e19040144; published 2017-03-29. Author: Chris Heunen.<![CDATA[Entropy, Vol. 19, Pages 143: Paradigms of Cognition]]>
http://www.mdpi.com/1099-4300/19/4/143
An abstract, quantitative theory which connects elements of information, the key ingredients of the cognitive process, is developed. Seemingly unrelated results are thereby unified. As an indication of this, consider results in classical probabilistic information theory involving information projections and so-called Pythagorean inequalities. These bear a certain resemblance to classical results in geometry carrying Pythagoras' name. By appealing to the abstract theory presented here, one has a common point of reference for these results. In fact, the new theory provides a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics, and statistical physics. Several applications are given; among them, an “explanation” of Tsallis entropy is suggested. For this, as well as for the general development of the abstract underlying theory, emphasis is placed on interpretations and associated philosophical considerations. Technically, game theory is the key tool. Entropy 2017, 19(4), 143 (Article); doi: 10.3390/e19040143; published 2017-03-27. Author: Flemming Topsøe.<![CDATA[Entropy, Vol. 19, Pages 142: A Novel Framework for Shock Filter Using Partial Differential Equations]]>
http://www.mdpi.com/1099-4300/19/4/142
Shock filters, built on dilation and erosion processes, are widely used in signal enhancement and image deblurring. Traditionally, the sign function is employed in shock filtering to reweight the edge detection in images and to decide whether a pixel should dilate to the local maximum or erode to the local minimum. Some researchers replace the sign function with the tanh or arctan function, trying to change the evolution tracks of the pixels as filtering progresses. However, the analysis here reveals that function replacement alone usually does not work. This paper first revisits shock filters and their modifications. Then, a fuzzy shock filter is proposed, in which a membership function is adopted in the shock filter model to adjust the evolution rate of image pixels. The proposed filter is a parameter-tuning system that unites several formulations of shock filters into one fuzzy framework. Experimental results show that the new filter is flexible, robust, and converges quickly.
Entropy 2017, 19(4), 142; Article; doi: 10.3390/e19040142; ISSN 1099-4300; published 2017-03-26. Authors: Chunmei Duan, Hongqian Lu.
<![CDATA[Entropy, Vol. 19, Pages 141: Permutation Entropy for the Characterisation of Brain Activity Recorded with Magnetoencephalograms in Healthy Ageing]]>
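The classical shock filter that the abstract revisits can be written in one dimension as u_t = -F(u_xx) |u_x|, where the edge detector F is the sign function (or tanh/arctan in the modified variants). Below is a minimal sketch using a minmod-limited explicit scheme; the step size, iteration count, and periodic boundaries are illustrative assumptions, and the paper's fuzzy membership function is not reproduced here.

```python
import numpy as np

def minmod(a, b):
    """Slope limiter: smaller-magnitude argument when signs agree, else 0."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def shock_filter_1d(u, steps=50, dt=0.25, edge_fn=np.sign):
    """Minimal 1-D shock filter, u_t = -F(u_xx) |u_x|.

    `edge_fn` is the edge detector F: np.sign gives the classical filter,
    while np.tanh or np.arctan gives the smoother variants mentioned in
    the abstract. Periodic boundaries, unit grid spacing.
    """
    u = np.asarray(u, float).copy()
    for _ in range(steps):
        dp = np.roll(u, -1) - u        # forward difference
        dm = u - np.roll(u, 1)         # backward difference
        lap = dp - dm                  # discrete u_xx
        grad = np.abs(minmod(dp, dm))  # limited |u_x| (prevents overshoot)
        u = u - dt * edge_fn(lap) * grad
    return u
```

Passing `edge_fn=np.tanh` or `edge_fn=np.arctan` reproduces the function-replacement variants the abstract discusses; a fuzzy membership function could be substituted the same way.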
http://www.mdpi.com/1099-4300/19/4/141
The characterisation of healthy ageing of the brain could help create a fingerprint of normal ageing that might assist in the early diagnosis of neurodegenerative conditions. This study examined changes in resting-state magnetoencephalogram (MEG) permutation entropy due to age and gender in a sample of 220 healthy participants (98 males and 122 females, aged between 7 and 84). Entropy was quantified using normalised permutation entropy and modified permutation entropy, with an embedding dimension of 5 and a lag of 1 as the input parameters for both algorithms. Effects of age were observed over five regions of the brain, i.e., anterior, central, posterior, and left and right lateral, with the anterior and central regions showing the highest permutation entropy. Statistically significant differences due to age were observed in the different brain regions for both genders, with the evolution described by fitted polynomial regressions. Nevertheless, no significant differences between the genders were observed across all ages. These results suggest that the evolution of entropy in the background brain activity, quantified with permutation entropy algorithms, might be considered an alternative illustration of a ‘nominal’ physiological rhythm.
Entropy 2017, 19(4), 141; Article; doi: 10.3390/e19040141; ISSN 1099-4300; published 2017-03-25. Authors: Elizabeth Shumbayawonda, Alberto Fernández, Michael Hughes, Daniel Abásolo.
<![CDATA[Entropy, Vol. 19, Pages 137: Impact Location and Quantification on an Aluminum Sandwich Panel Using Principal Component Analysis and Linear Approximation with Maximum Entropy]]>
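Normalised permutation entropy, used above with an embedding dimension of 5 and a lag of 1, maps each window of the signal to the permutation that sorts it and takes the Shannon entropy of the resulting pattern distribution. A minimal sketch (the modified permutation entropy variant of the study is not reproduced here):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=5, lag=1, normalise=True):
    """Normalised permutation entropy of a 1-D signal.

    Each length-m window (with delay `lag`) is mapped to the ordinal
    pattern that sorts it; the Shannon entropy of the pattern
    distribution is normalised by log(m!) so the result lies in [0, 1].
    """
    x = np.asarray(x, float)
    n = len(x) - (m - 1) * lag
    # ordinal pattern of each delayed window
    patterns = np.array([np.argsort(x[i:i + m * lag:lag]) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(m)) if normalise else h
```

A monotonic signal produces a single pattern and entropy 0, while broadband noise approaches 1, which is why the measure can track changes in background brain activity.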
http://www.mdpi.com/1099-4300/19/4/137
To avoid structural failures, it is of critical importance to detect, locate, and quantify impact damage as soon as it occurs. This can be achieved by impact identification methodologies, which continuously monitor the structure and detect, locate, and quantify impacts as they occur. This article presents an improved impact identification algorithm that uses principal component analysis (PCA) to extract features from the monitored signals and an algorithm based on linear approximation with maximum entropy to estimate the impacts. The proposed methodology is validated with two experimental applications: an aluminum plate and an aluminum sandwich panel. The results are compared with those of other impact identification algorithms available in the literature, demonstrating that the proposed method outperforms them.
Entropy 2017, 19(4), 137; Article; doi: 10.3390/e19040137; ISSN 1099-4300; published 2017-03-25. Authors: Viviana Meruane, Pablo Véliz, Enrique López Droguett, Alejandro Ortiz-Bernardin.
<![CDATA[Entropy, Vol. 19, Pages 140: Ionic Liquids Confined in Silica Ionogels: Structural, Thermal, and Dynamical Behaviors]]>
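The two stages described above, PCA feature extraction followed by an estimate built from maximum-entropy weights over training impacts, can be sketched as below. This is a simplified stand-in, not the paper's method: the weights here maximise entropy under a Gaussian locality prior (a softmax over negative squared feature distances) and omit the first-order consistency constraint and Newton solve of the full linear-approximation-with-maximum-entropy formulation; all names and parameters are illustrative.

```python
import numpy as np

def pca_features(signals, k=10):
    """Project mean-centred training signals onto their top-k principal components."""
    mu = signals.mean(axis=0)
    centred = signals - mu
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    basis = vt[:k]                       # (k, n_samples_per_signal)
    return centred @ basis.T, mu, basis  # features are (n_signals, k)

def maxent_estimate(z, train_feats, train_targets, beta=1.0):
    """Estimate impact parameters for feature vector z as a convex
    combination of training targets with maximum-entropy weights.

    beta controls locality: larger beta concentrates the weights on the
    training impacts whose PCA features are closest to z.
    """
    d2 = np.sum((train_feats - z) ** 2, axis=1)
    w = np.exp(-beta * (d2 - d2.min()))  # shift for numerical stability
    w /= w.sum()
    return w @ train_targets
```

With a database of recorded impact signals and their known locations/forces as `train_targets`, a new monitored signal is projected through the same `mu` and `basis` before calling `maxent_estimate`.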
http://www.mdpi.com/1099-4300/19/4/140
Ionogels are porous monoliths providing nanometer-scale confinement of an ionic liquid within an oxide network. Various dynamic parameters and the detailed nature of the phase transitions were investigated using a neutron scattering technique, which probes smaller time and space scales than the techniques used in earlier studies. By investigating the hydrogen mean square displacement (local mobility), qualitative information on diffusion and on the different phase transitions was obtained. The results presented herein show similar short-time molecular dynamics between pristine and confined ionic liquids, through residence time and diffusion coefficient values, thus explaining in depth the good ionic conductivity of ionogels.
Entropy 2017, 19(4), 140; Article; doi: 10.3390/e19040140; ISSN 1099-4300; published 2017-03-24. Authors: Subhankur Mitra, Carole Cerclier, Quentin Berrod, Filippo Ferdeghini, Rodrigo de Oliveira-Silva, Patrick Judeinstein, Jean le Bideau, Jean-Marc Zanotti.