Entropy
http://www.mdpi.com/journal/entropy
Latest open access articles published in Entropy at http://www.mdpi.com/journal/entropy

Entropy, Vol. 17, Pages 6169-6178: Short-Lived Lattice Quasiparticles for Strongly Interacting Fluids
http://www.mdpi.com/1099-4300/17/9/6169
It is shown that lattice kinetic theory based on short-lived quasiparticles proves very effective in simulating the complex dynamics of strongly interacting fluids (SIF). In particular, it is pointed out that the shear viscosity of lattice fluids is the sum of two contributions: one due to the usual interactions between particles (collision viscosity) and the other due to the interaction with the discrete lattice (propagation viscosity). Since the latter is negative, the sum may turn out to be orders of magnitude smaller than either of the two contributions separately, thus providing a mechanism to access SIF regimes at ordinary values of the collisional viscosity. This concept, as applied to quantum superfluids in one-dimensional optical lattices, is shown to reproduce shear viscosities consistent with the AdS-CFT holographic bound on the viscosity/entropy ratio. This shows that lattice kinetic theory continues to hold for strongly coupled hydrodynamic regimes where continuum kinetic theory may no longer be applicable.

Article. Entropy 2015, 17(9); ISSN 1099-4300; doi: 10.3390/e17096169. Published 2015-09-03. Authors: Miller Jimenez and Sauro Succi.

Entropy, Vol. 17, Pages 6150-6168: Conformal Gauge Transformations in Thermodynamics
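The collision/propagation split described in the quasiparticle abstract above can be illustrated with the standard lattice-BGK viscosity formula nu = cs^2 (tau - dt/2): the first term comes from collisions, the second from propagation on the discrete lattice and is negative. A minimal sketch in lattice units (the numerical values are illustrative, not taken from the paper):

```python
# Lattice-BGK shear viscosity in lattice units: nu = CS2 * (tau - DT/2).
CS2 = 1.0 / 3.0   # squared lattice speed of sound (e.g., the D2Q9 lattice)
DT = 1.0          # lattice time step

def viscosities(tau):
    """Return (collision, propagation, total) shear viscosity."""
    collision = CS2 * tau            # collision viscosity (positive)
    propagation = -CS2 * DT / 2.0    # propagation viscosity (negative)
    return collision, propagation, collision + propagation

# With tau close to DT/2, the total viscosity is orders of magnitude
# smaller than either contribution alone.
col, prop, total = viscosities(0.505)
print(col, prop, total)   # total ~ (1/3)*0.005, far below col ~ 0.168
```

This is exactly the mechanism the abstract points to: the negative propagation term lets the total viscosity reach values far below the collisional one.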
http://www.mdpi.com/1099-4300/17/9/6150
In this work, we show that the thermodynamic phase space is naturally endowed with a non-integrable connection, defined by all of those processes that annihilate the Gibbs one-form, i.e., reversible processes. We argue that such a connection is invariant under re-scalings of the connection one-form, whilst, as a consequence of the non-integrability of the connection, its curvature is not and, therefore, neither is the associated pseudo-Riemannian geometry. We claim that this is not surprising, since these two objects are associated with irreversible processes. Moreover, we provide the explicit form in which all of the elements of the geometric structure of the thermodynamic phase space change under a re-scaling of the connection one-form. We call this transformation of the geometric structure a conformal gauge transformation. As an example, we revisit the change of the thermodynamic representation and consider the resulting change between the two metrics on the thermodynamic phase space, which induce Weinhold's energy metric and Ruppeiner's entropy metric. As a by-product, we obtain a proof of the well-known conformal relation between Weinhold's and Ruppeiner's metrics along the equilibrium directions. Finally, we find interesting properties of the almost para-contact structure and of its eigenvectors, which may be of physical interest.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096150. Published 2015-09-02. Authors: Alessandro Bravetti, Cesar Lopez-Monsalvo and Francisco Nettel.

Entropy, Vol. 17, Pages 6129-6149: Nonlinear Predictive Control of a Hydropower System Model
http://www.mdpi.com/1099-4300/17/9/6129
A six-dimensional nonlinear hydropower system controlled by a nonlinear predictive control method is presented in this paper. For the nonlinear predictive control method, a performance index with a terminal penalty function is selected. A simple method to find an appropriate terminal penalty function is introduced and its effectiveness is proved. The input-to-state stability of the controlled system is proved using a Lyapunov function. Subsequently, a six-dimensional model of the hydropower system is presented. Unlike other hydropower system models, this model includes the hydro-turbine system, the penstock system, the generator system and the hydraulic servo system, accurately describing the operational process of a hydropower plant. Furthermore, the numerical experiments show that the six-dimensional nonlinear hydropower system controlled by the method is stable. The numerical experiments also illustrate that the nonlinear predictive control method enjoys great advantages over traditional control methods for nonlinear systems. Finally, a strategy to combine the nonlinear predictive control method with other methods is proposed to further facilitate its application in practice.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096129. Published 2015-09-01. Authors: Runfan Zhang, Diyi Chen and Xiaoyi Ma.

Entropy, Vol. 17, Pages 6110-6128: Entropic Dynamics
http://www.mdpi.com/1099-4300/17/9/6110
Entropic Dynamics is a framework in which dynamical laws are derived as an application of entropic methods of inference. No underlying action principle is postulated. Instead, the dynamics is driven by entropy subject to the constraints appropriate to the problem at hand. In this paper we review three examples of entropic dynamics. First we tackle the simpler case of a standard diffusion process, which allows us to address the central issue of the nature of time. Then we show that imposing the additional constraint that the dynamics be non-dissipative leads to Hamiltonian dynamics. Finally, considerations from information geometry naturally lead to the type of Hamiltonian that describes quantum theory.

Review. Entropy 2015, 17(9); doi: 10.3390/e17096110. Published 2015-09-01. Author: Ariel Caticha.

Entropy, Vol. 17, Pages 6093-6109: Optimal Base Wavelet Selection for ECG Noise Reduction Using a Comprehensive Entropy Criterion
http://www.mdpi.com/1099-4300/17/9/6093
The selection of an appropriate wavelet is an essential issue that should be addressed in the wavelet-based filtering of electrocardiogram (ECG) signals. Since entropy can measure the uncertainty features of the ECG signal, a novel comprehensive entropy criterion Ecom, based on multiple criteria related to entropy and energy, is proposed in this paper to search for an optimal base wavelet for a specific ECG signal. Taking account of the decomposition capability of wavelets and the similarity in information between the decomposed coefficients and the analyzed signal, the proposed Ecom criterion integrates eight criteria, i.e., energy, entropy, energy-to-entropy ratio, joint entropy, conditional entropy, mutual information, relative entropy, as well as comparison information entropy, for optimal wavelet selection. The experimental validation is conducted on ECG signals of sixteen subjects selected from the MIT-BIH Arrhythmia Database. The Ecom is compared with each of these eight criteria through four filtering performance indexes, i.e., output signal-to-noise ratio (SNRo), root mean square error (RMSE), percent root mean-square difference (PRD) and correlation coefficients. The filtering results of ninety-six ECG signals contaminated by noise have verified that Ecom outperforms the other eight criteria in selecting the best base wavelets for ECG signal filtering. The wavelet identified by the Ecom has achieved better filtering performance than the other comparative criteria. A hypothesis test also validates that SNRo, RMSE, PRD and correlation coefficients of Ecom are significantly different from those of the shape-matched approach (two-sided t-test).

Article. Entropy 2015, 17(9); doi: 10.3390/e17096093. Published 2015-09-01. Authors: Hong He, Yonghong Tan and Yuexia Wang.

Entropy, Vol. 17, Pages 6072-6092: Distributing Secret Keys with Quantum Continuous Variables: Principle, Security and Implementations
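One of the eight criteria combined into Ecom in the wavelet-selection abstract above, the energy-to-entropy ratio, is easy to sketch. The following toy illustration uses a hand-rolled multilevel Haar transform on a synthetic signal; the paper combines this criterion with seven others and evaluates real ECG records, so everything here (signal, levels, wavelet) is our own illustrative choice:

```python
import numpy as np

def haar_dwt(signal, levels=3):
    """Multilevel Haar wavelet transform: returns the detail
    coefficients of each level plus the final approximation."""
    coeffs, a = [], np.asarray(signal, dtype=float)
    for _ in range(levels):
        if len(a) % 2:                                   # pad to even length
            a = np.append(a, a[-1])
        coeffs.append((a[0::2] - a[1::2]) / np.sqrt(2))  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)             # approximation
    coeffs.append(a)
    return coeffs

def energy_entropy_ratio(coeffs):
    """Energy-to-entropy ratio criterion: total coefficient energy
    divided by the Shannon entropy of the normalized per-coefficient
    energy distribution. Larger values favor a wavelet that packs
    the signal's energy into few coefficients."""
    flat = np.concatenate(coeffs)
    energy = float(np.sum(flat**2))
    p = flat**2 / energy
    p = p[p > 0]
    return energy / float(-np.sum(p * np.log2(p)))

t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)   # stand-in for an ECG trace
print(energy_entropy_ratio(haar_dwt(signal)))   # score for this candidate wavelet
```

In a real selection loop one would compute this score (and the other criteria) for each candidate base wavelet and pick the maximizer.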
http://www.mdpi.com/1099-4300/17/9/6072
The ability to distribute secret keys between two parties with information-theoretic security, that is regardless of the capacities of a malevolent eavesdropper, is one of the most celebrated results in the field of quantum information processing and communication. Indeed, quantum key distribution illustrates the power of encoding information on the quantum properties of light and has far-reaching implications in high-security applications. Today, quantum key distribution systems operate in real-world conditions and are commercially available. As with most quantum information protocols, quantum key distribution was first designed for qubits, the individual quanta of information. However, the use of quantum continuous variables for this task presents important advantages with respect to qubit-based protocols, in particular from a practical point of view, since it allows for simple implementations that require only standard telecommunication technology. In this review article, we describe the principle of continuous-variable quantum key distribution, focusing in particular on protocols based on coherent states. We discuss the security of these protocols and report on the state-of-the-art in experimental implementations, including the issue of side-channel attacks. We conclude with promising perspectives in this research field.

Review. Entropy 2015, 17(9); doi: 10.3390/e17096072. Published 2015-08-31. Authors: Eleni Diamanti and Anthony Leverrier.

Entropy, Vol. 17, Pages 6056-6071: Self-Similar Solutions of Rényi’s Entropy and the Concavity of Its Entropy Power
http://www.mdpi.com/1099-4300/17/9/6056
We study the class of self-similar probability density functions with finite mean and variance, which maximize Rényi’s entropy. The investigation is restricted to the Schwartz space S(R^d) and to the space of l-times differentiable, compactly supported functions C_c^l(R^d). Interestingly, the solutions of this optimization problem do not coincide with the solutions of the usual porous medium equation with a Dirac point source, as occurs in the optimization of Shannon’s entropy. We also study the concavity of the entropy power in R^d with respect to time using two different methods. The first one takes advantage of the solutions determined earlier, while the second one is based on a setting that could be used for Riemannian manifolds.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096056. Published 2015-08-31. Author: Agapitos Hatzinikitas.

Entropy, Vol. 17, Pages 6044-6055: The Effect of a Long-Range Correlated-Hopping Interaction on Bariev Spin Chains
http://www.mdpi.com/1099-4300/17/9/6044
We introduce a long-range particle and spin interaction into the standard Bariev model and show that this interaction is equivalent to a phase shift in the kinetic term of the Hamiltonian. When the particles circle around the chain and across the boundary, the accumulated phase shift acts as a twist boundary condition with respect to the normal periodic boundary condition. This boundary phase term depends on the total number of particles in the system and also on the number of particles in different spin states, which relates to the spin fluctuations in the system. The model is solved exactly via a unitary transformation by the coordinate Bethe ansatz. We derive the Bethe equations and work out the energy spectrum for varying numbers of particles and spins.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096044. Published 2015-08-28. Authors: Tao Yang, Fa-Kai Wen, Kun Hao, Li-Ke Cao and Rui-Hong Yue.

Entropy, Vol. 17, Pages 6025-6043: New Exact Solutions of the New Hamiltonian Amplitude-Equation and Fokas Lenells Equation
http://www.mdpi.com/1099-4300/17/9/6025
In this paper, exact solutions of the new Hamiltonian amplitude equation and the Fokas-Lenells equation are successfully obtained. The extended trial equation method (ETEM) and the generalized Kudryashov method (GKM) are applied to find several exact solutions of both equations. We first seek exact solutions using ETEM, and then obtain dark soliton solutions using GKM. Finally, for selected parameter values, we plot two- and three-dimensional graphs of the real and imaginary parts of certain solutions found by both methods.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096025. Published 2015-08-27. Authors: Seyma Demiray and Hasan Bulut.

Entropy, Vol. 17, Pages 6007-6024: A Gloss Composition and Context Clustering Based Distributed Word Sense Representation Model
http://www.mdpi.com/1099-4300/17/9/6007
In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context clustering based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.

Article. Entropy 2015, 17(9); doi: 10.3390/e17096007. Published 2015-08-27. Authors: Tao Chen, Ruifeng Xu, Yulan He and Xuan Wang.

Entropy, Vol. 17, Pages 5995-6006: Proportionate Minimum Error Entropy Algorithm for Sparse System Identification
http://www.mdpi.com/1099-4300/17/9/5995
Sparse system identification has received a great deal of attention due to its broad applicability. The proportionate normalized least mean square (PNLMS) algorithm, as a popular tool, achieves excellent performance for sparse system identification. In previous studies, most of the cost functions used in proportionate-type sparse adaptive algorithms are based on the mean square error (MSE) criterion, which is optimal only when the measurement noise is Gaussian. However, this condition does not hold in most real-world environments. In this work, we use the minimum error entropy (MEE) criterion, an alternative to the conventional MSE criterion, to develop the proportionate minimum error entropy (PMEE) algorithm for sparse system identification, which may achieve much better performance than MSE-based methods, especially in heavy-tailed non-Gaussian situations. Moreover, we analyze the convergence of the proposed algorithm and derive a sufficient condition that ensures mean square convergence. Simulation results confirm the excellent performance of the new algorithm.

Letter. Entropy 2015, 17(9); doi: 10.3390/e17095995. Published 2015-08-27. Authors: Zongze Wu, Siyuan Peng, Badong Chen, Haiquan Zhao and Jose Principe.

Entropy, Vol. 17, Pages 5980-5994: Numerical Investigation into Natural Convection and Entropy Generation in a Nanofluid-Filled U-Shaped Cavity
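The MEE criterion named in the PMEE abstract above is commonly posed as minimizing Rényi's quadratic entropy of the error, which is equivalent to maximizing the Parzen-window "information potential" of the error samples. A minimal sketch of that quantity only (kernel width and data are our illustrative choices; this is not the authors' full PMEE update):

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Quadratic information potential V(e) = (1/N^2) sum_ij G_sigma(e_i - e_j),
    with G_sigma a Gaussian kernel. Maximizing V is equivalent to
    minimizing Renyi's quadratic entropy H2(e) = -log V(e)."""
    e = np.asarray(errors, dtype=float)
    d = e[:, None] - e[None, :]                      # pairwise differences
    return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

# Concentrated errors score higher than spread-out errors, which is why
# an MEE-trained filter drives the error samples to cluster, regardless
# of the (possibly heavy-tailed) shape of the measurement noise.
print(information_potential(np.zeros(5)))            # exactly 1.0
print(information_potential(np.array([-4.0, 0.0, 4.0])))
```

An adaptive algorithm built on this criterion ascends the gradient of V with respect to the filter weights instead of descending the MSE gradient.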
http://www.mdpi.com/1099-4300/17/9/5980
This work studies the heat transfer performance and entropy generation of natural convection in a nanofluid-filled U-shaped cavity. The flow behavior and heat transfer performance in the cavity are described by the continuity equation, momentum equations, energy equation and Boussinesq approximation, and are solved numerically using the finite-volume method and the SIMPLEC algorithm. The simulations examine the effects of the nanoparticle volume fraction, Rayleigh number and the geometry parameters of the U-shaped cavity on the mean Nusselt number and total entropy generation. The results show that the mean Nusselt number increases and the total entropy generation reduces as the volume fraction of nanoparticles increases. In addition, the results show that the mean Nusselt number and the total entropy generation both increase as the Rayleigh number increases. Finally, the results also show that the mean Nusselt number can be increased and the total entropy generation can be reduced by extending the length of the low-temperature walls or widening their width.

Article. Entropy 2015, 17(9); doi: 10.3390/e17095980. Published 2015-08-26. Authors: Ching-Chang Cho, Her-Terng Yau, Ching-Huang Chiu and Kuo-Ching Chiu.

Entropy, Vol. 17, Pages 5965-5979: Friction Signal Denoising Using Complete Ensemble EMD with Adaptive Noise and Mutual Information
http://www.mdpi.com/1099-4300/17/9/5965
During the measurement of friction force, the measured signal generally contains noise. To remove the noise and preserve the important features of the signal, a hybrid filtering method is introduced that uses mutual information and a new waveform. This new waveform is the difference between the original signal and the sum of intrinsic mode functions (IMFs), which are obtained by empirical mode decomposition (EMD) or its improved versions. To evaluate the filter performance for the friction signal, ensemble EMD (EEMD), complementary ensemble EMD (CEEMD), and complete ensemble EMD with adaptive noise (CEEMDAN) are employed in combination with the proposed filtering method. The combined methods are first used to filter synthesized signals. For the simulated signal, the filtering effect is compared under different ensemble numbers, sampling frequencies, and input signal-to-noise ratios. Results show that CEEMDAN outperforms the other signal filtering methods. In particular, this method is successful in filtering the friction signal, as evaluated by the detrended fluctuation analysis (DFA) algorithm.

Article. Entropy 2015, 17(9); doi: 10.3390/e17095965. Published 2015-08-25. Authors: Chengwei Li, Liwei Zhan and Liqun Shen.

Entropy, Vol. 17, Pages 5938-5964: Geometry of Multiscale Nonequilibrium Thermodynamics
http://www.mdpi.com/1099-4300/17/9/5938
The time evolution of macroscopic systems can be experimentally observed and mathematically described on many different levels of description. It has been conjectured that the governing equations on all levels are particular realizations of a single abstract equation. We support this conjecture by interpreting the abstract equation as a geometrical formulation of general nonequilibrium thermodynamics.

Review. Entropy 2015, 17(9); doi: 10.3390/e17095938. Published 2015-08-25. Author: Miroslav Grmela.

Entropy, Vol. 17, Pages 5920-5937: Entropy Associated with Information Storage and Its Retrieval
http://www.mdpi.com/1099-4300/17/8/5920
We provide an entropy analysis for light storage and light retrieval. In this analysis, entropy extraction and reduction in a typical light storage experiment are identified. The spatiotemporal behavior of entropy is presented for the D1 transition in cold sodium atoms. The governing equations are the reduced Maxwell field equations and the Liouville–von Neumann equation for the density matrix of the dressed atom.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085920. Published 2015-08-24. Author: Abu Alhasan.

Entropy, Vol. 17, Pages 5903-5919: Maximal Repetitions in Written Texts: Finite Energy Hypothesis vs. Strong Hilberg Conjecture
http://www.mdpi.com/1099-4300/17/8/5903
The article discusses two mutually-incompatible hypotheses about the stochastic mechanism of the generation of texts in natural language, which could be related to entropy. The first hypothesis, the finite energy hypothesis, assumes that texts are generated by a process with exponentially-decaying probabilities. This hypothesis implies a logarithmic upper bound for maximal repetition, as a function of the text length. The second hypothesis, the strong Hilberg conjecture, assumes that the topological entropy grows as a power law. This hypothesis leads to a hyperlogarithmic lower bound for maximal repetition. By a study of 35 written texts in German, English and French, it is found that the hyperlogarithmic growth of maximal repetition holds for natural language. In this way, the finite energy hypothesis is rejected, and the strong Hilberg conjecture is partly corroborated.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085903. Published 2015-08-21. Author: Łukasz Dębowski.

Entropy, Vol. 17, Pages 5888-5902: Generalised Complex Geometry in Thermodynamical Fluctuation Theory
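The maximal repetition statistic compared against the two hypotheses in the abstract above, i.e. the length of the longest substring occurring at least twice in a text, can be computed directly. A brute-force sketch of ours (a corpus-scale study like the paper's would need suffix-array or suffix-tree machinery for efficiency):

```python
def maximal_repetition(text):
    """Length of the longest substring that occurs at least twice
    (occurrences may overlap). Every prefix of a repeated substring
    also repeats, so the property is monotone in the length, which
    allows a binary search over candidate lengths."""
    def has_repeat(length):
        seen = set()
        for i in range(len(text) - length + 1):
            window = text[i:i + length]
            if window in seen:
                return True
            seen.add(window)
        return False

    lo, hi = 0, max(len(text) - 1, 0)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if has_repeat(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo

print(maximal_repetition("to be or not to be"))   # -> 5 ("to be")
```

Plotting this statistic against text length n and comparing its growth with log(n) versus a hyperlogarithmic curve is, in essence, the experiment the abstract describes.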
http://www.mdpi.com/1099-4300/17/8/5888
We present a brief overview of some key concepts in the theory of generalized complex manifolds. This new geometry interpolates, so to speak, between symplectic geometry and complex geometry. As such it provides an ideal framework to analyze thermodynamical fluctuation theory in the presence of gravitational fields. To illustrate the usefulness of generalized complex geometry, we examine a simplified version of the Unruh effect: the thermalising effect of gravitational fields on the Schrödinger wavefunction.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085888. Published 2015-08-20. Authors: P. de Córdoba and J. Isidro.

Entropy, Vol. 17, Pages 5868-5887: Detection of Causality between Process Variables Based on Industrial Alarm Data Using Transfer Entropy
http://www.mdpi.com/1099-4300/17/8/5868
In modern industrial processes, it is easier and less expensive to configure alarms by software settings rather than by wiring, which causes rapid growth in the number of alarms. Moreover, because there exist complex interactions, in particular causal relationships among different parts of the process, a fault may propagate along propagation pathways once an abnormal situation occurs, which makes it difficult for operators to identify its root cause immediately and to take proper corrective actions. Therefore, causality detection becomes a very important problem in the context of multivariate alarm analysis and design. Transfer entropy has become an effective and widely-used method to detect causality between different continuous process variables in both linear and nonlinear situations in recent years. However, such conventional methods to detect causality based on transfer entropy are computationally costly. Alternatively, using binary alarm series can be more computationally friendly and more direct, because alarm data analysis is straightforward for alarm management in practice. The methodology and implementation issues are discussed in this paper. Illustrated by several case studies, including both numerical cases and simulated industrial cases, the proposed method is demonstrated to be suitable for industrial situations contaminated by noise.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085868. Published 2015-08-20. Authors: Weijun Yu and Fan Yang.

Entropy, Vol. 17, Pages 5848-5867: A Model for Scale-Free Networks: Application to Twitter
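For binary alarm series, as in the causality-detection abstract above, a transfer entropy estimate reduces to counting symbol triples. A toy sketch with history length one on synthetic data (the coupling construction and all parameter choices are ours, not the paper's case studies):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE(Y -> X) in bits for binary series with history length 1:
    sum over p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_src = Counter(zip(x[:-1], y[:-1]))   # (x0, y0) counts
    pairs_tgt = Counter(zip(x[1:], x[:-1]))    # (x1, x0) counts
    singles = Counter(x[:-1].tolist())
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_src[(x0, y0)]
        p_cond_x = pairs_tgt[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return float(te)

# Synthetic alarms: x copies y with a one-step lag, so information
# flows from y to x but not the other way around.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 2000)
x = np.concatenate(([0], y[:-1]))
print(transfer_entropy(x, y) > transfer_entropy(y, x))   # -> True
```

Running the estimator in both directions and comparing the two values is the basic building block of the pairwise causality maps such methods produce.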
http://www.mdpi.com/1099-4300/17/8/5848
In the last few years, complex networks have become an increasingly relevant research topic due to the large number of fields of application. Particularly, complex networks are especially significant in the area of modern online social networks (OSNs). OSNs are actually a challenge for complex network analysis, as they present some characteristics that hinder topology processing. Concretely, social networks’ volume is exceedingly large, as they have a high number of nodes and links. One of the most popular and influential OSNs is Twitter. In this paper, we present a model to describe the growth of scale-free networks. This model is applied to Twitter after checking that it can be considered a “scale-free” complex network fulfilling the small-world property. Checking this property involves the calculation of the shortest path between any two nodes of the network. Given the difficulty of this computation for large networks, a new heuristic method is also proposed to find upper bounds of the path lengths instead of computing the exact lengths.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085848. Published 2015-08-17. Authors: Sofía Aparicio, Javier Villazón-Terrazas and Gonzalo Álvarez.

Entropy, Vol. 17, Pages 5829-5847: Parametric Analysis of a Two-Shaft Aeroderivate Gas Turbine of 11.86 MW
http://www.mdpi.com/1099-4300/17/8/5829
Aeroderivative gas turbines are widely used for power generation in the oil and gas industry. In offshore marine platforms, aeroderivative gas turbines provide the energy required to mechanically drive compressors, pumps and electric generators. Therefore, the study of the performance of aeroderivative gas turbines based on a parametric analysis is relevant to carry out a diagnostic of the engine, which can lead to operational as well as predictive and/or corrective maintenance actions. This work presents a methodology based on exergetic analysis to estimate the irreversibilities and exergetic efficiencies of the main components of a two-shaft aeroderivative gas turbine. The studied engine is the Solar Turbines Mars 100, which is rated to provide 11.86 MW. In this engine, the air is compressed in an axial compressor, achieving a pressure ratio of 17.7 relative to ambient conditions and a high-pressure turbine inlet temperature of 1220 °C. Even if the thermal efficiency associated with the pressure ratio of 17.7 is 1% lower than the maximum thermal efficiency, the irreversibilities related to this pressure ratio decrease by approximately 1 GW with respect to the irreversibilities at the pressure ratio that is optimal for thermal efficiency. In addition, this paper contributes a mathematical model to estimate the high-pressure turbine inlet temperature as well as the pressure ratios of the low- and high-pressure turbines.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085829. Published 2015-08-14. Authors: R. Lugo-Leyte, M. Salazar-Pereyra, H. Méndez, I. Aguilar-Adaya, J. Ambriz-García and J. Vargas.

Entropy, Vol. 17, Pages 5811-5828: Combined Power Quality Disturbances Recognition Using Wavelet Packet Entropies and S-Transform
http://www.mdpi.com/1099-4300/17/8/5811
Aiming at combined power quality disturbance recognition, an automated recognition method based on wavelet packet entropy (WPE) and modified incomplete S-transform (MIST) is proposed in this paper. By combining wavelet packet Tsallis singular entropy, energy entropy and MIST, a 13-dimensional feature vector of different power quality (PQ) disturbances, including single disturbances and combined disturbances, is extracted. Then, a ruled decision tree is designed to recognize the combined disturbances. The proposed method is tested and evaluated using a large number of simulated PQ disturbances and some real-life signals, which include voltage sag, swell, interruption, oscillation transient, impulsive transient, harmonics, voltage fluctuation and their combinations. In addition, the proposed recognition approach is compared with some existing techniques. The experimental results show that the proposed method can effectively recognize the single and combined PQ disturbances.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085811. Published 2015-08-12. Authors: Zhigang Liu, Yan Cui and Wenhui Li.

Entropy, Vol. 17, Pages 5799-5810: Entropy Bounds and Field Equations
http://www.mdpi.com/1099-4300/17/8/5799
For general metric theories of gravity, we compare the approach that describes/derives the field equations of gravity as a thermodynamic identity with the one which looks at them from entropy bounds. The comparison is made through the consideration of the matter entropy flux across (Rindler) horizons, studied by making use of the notion of a limiting thermodynamic scale l* of matter, previously introduced in the context of entropy bounds. In doing this: (i) a bound for the entropy of any lump of matter with a given energy-momentum tensor Tab is considered, in terms of a quantity which is independent of the theory of gravity that we use; this quantity is the variation of the Clausius entropy of a suitable horizon when the element of matter crosses it; (ii) by making use of the equations of motion of the theory, the same quantity is then expressed as the variation of Wald’s entropy of that horizon (and this leads to a generalized form of the generalized covariant entropy bound, applicable to general diffeomorphism-invariant theories of gravity); and (iii) a notion of l* for horizons, as well as an expression for it, is given.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085799. Published 2015-08-12. Author: Alessandro Pesci.

Entropy, Vol. 17, Pages 5784-5798: Computing and Learning Year-Round Daily Patterns of Hourly Wind Speed and Direction and Their Global Associations with Meteorological Factors
http://www.mdpi.com/1099-4300/17/8/5784
Daily wind patterns and their relational associations with other metocean (oceanographic and meteorological) variables were algorithmically computed and extracted from a year-long wind and weather dataset, which was collected hourly from an ocean buoy located in the Penghu archipelago of Taiwan. The computational algorithm is called data cloud geometry (DCG). This DCG algorithm is a clustering-based nonparametric learning approach that was constructed and developed implicitly based on various entropy concepts. Regarding the bivariate aspect of wind speed and wind direction, the resulting multiscale clustering hierarchy revealed well-known wind characteristics of year-round pattern cycles pertaining to the particular geographic location of the buoy. A wind pattern due to a set of extreme weather days was also identified. Moreover, in terms of the relational aspect of wind and other weather variables, causal patterns were revealed through applying the DCG algorithm alternatively on the row and column axes of a data matrix by iteratively adapting distance measures to computed DCG tree structures. This adaptation technically constructed and integrated a multiscale, two-sample testing into the distance measure. These computed wind patterns and pattern-based causal relationships are useful for both general sailing and competition planning.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085784. Published 2015-08-11. Authors: Hsing-Ti Wu, Hsieh Fushing and Laurence Chuang.

Entropy, Vol. 17, Pages 5771-5783: Active Control of a Chaotic Fractional Order Economic System
http://www.mdpi.com/1099-4300/17/8/5771
In this paper, a fractional order economic system is studied. An active control technique is applied to control chaos in this system. The stabilization of equilibria is obtained by both theoretical analysis and simulation results. The numerical simulations, via the improved Adams–Bashforth algorithm, show the effectiveness of the proposed controller.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085771. Published 2015-08-11. Authors: Haci Baskonus, Toufik Mekkaoui, Zakia Hammouch and Hasan Bulut.

Entropy, Vol. 17, Pages 5752-5770: Consistency of Learning Bayesian Network Structures with Continuous Variables: An Information Theoretic Approach
http://www.mdpi.com/1099-4300/17/8/5752
We consider the problem of learning a Bayesian network structure given n examples and a prior probability, based on maximizing the posterior probability. We propose an algorithm that runs in O(n log n) time and that addresses continuous variables and discrete variables without assuming any class of distribution. We prove that the decision is strongly consistent, i.e., correct with probability one as n → ∞. To date, consistency has only been obtained for discrete variables for this class of problem, and many authors have attempted to prove consistency when continuous variables are present. Furthermore, we prove that the “log n” term that appears in the penalty term of the description length can be replaced by 2(1+ε) log log n to obtain strong consistency, where ε > 0 is arbitrary, which implies that the Hannan–Quinn proposition holds.

Article. Entropy 2015, 17(8); doi: 10.3390/e17085752. Published 2015-08-10. Author: Joe Suzuki.

Entropy, Vol. 17, Pages 5729-5751: Deformed Algebras and Generalizations of Independence on Deformed Exponential Families
http://www.mdpi.com/1099-4300/17/8/5729
A deformed exponential family is a generalization of exponential families. Since the useful classes of power-law tailed distributions are described by deformed exponential families, they are important objects in the theory of complex systems. Though deformed exponential families are defined by deformed exponential functions, these functions do not satisfy the law of exponents in general. Deformed algebras have been introduced based on the deformed exponential functions. In this paper, after summarizing such deformed algebraic structures, it is clarified how deformed algebras work on deformed exponential families. In fact, deformed algebras lead to a generalization of expectations. Three kinds of expectations for random variables are introduced in this paper, and it is discussed why these generalized expectations are natural from the viewpoint of information geometry. In addition, deformed algebras lead to a generalization of independence. Whereas it is difficult to check the well-definedness of deformed independence in general, the κ-independence is always well-defined on κ-exponential families. This is one of the advantages of κ-exponential families in complex systems. Consequently, we can generalize the maximum likelihood method for the κ-exponential family from the viewpoint of information geometry.Entropy2015-08-10178Article10.3390/e17085729572957511099-43002015-08-10doi: 10.3390/e17085729Hiroshi MatsuzoeTatsuaki Wada<![CDATA[Entropy, Vol. 17, Pages 5711-5728: Fruit Classification by Wavelet-Entropy and Feedforward Neural Network Trained by Fitness-Scaled Chaotic ABC and Biogeography-Based Optimization]]>
http://www.mdpi.com/1099-4300/17/8/5711
Fruit classification is quite difficult because of the various categories and similar shapes and features of fruit. In this work, we propose two novel machine-learning based classification methods. The developed system consists of wavelet entropy (WE), principal component analysis (PCA), and a feedforward neural network (FNN) trained by fitness-scaled chaotic artificial bee colony (FSCABC) and biogeography-based optimization (BBO), respectively. K-fold stratified cross validation (SCV) was utilized for statistical analysis. The classification performance for 1653 fruit images from 18 categories showed that the proposed “WE + PCA + FSCABC-FNN” and “WE + PCA + BBO-FNN” methods achieve the same accuracy of 89.5%, higher than state-of-the-art approaches: “(CH + MP + US) + PCA + GA-FNN” of 84.8%, “(CH + MP + US) + PCA + PSO-FNN” of 87.9%, “(CH + MP + US) + PCA + ABC-FNN” of 85.4%, “(CH + MP + US) + PCA + kSVM” of 88.2%, and “(CH + MP + US) + PCA + FSCABC-FNN” of 89.1%. Moreover, our methods used only 12 features, fewer than the number of features used by other methods. Therefore, the proposed methods are effective for fruit classification.Entropy2015-08-07178Article10.3390/e17085711571157281099-43002015-08-07doi: 10.3390/e17085711Shuihua WangYudong ZhangGenlin JiJiquan YangJianguo WuLing Wei<![CDATA[Entropy, Vol. 17, Pages 5695-5710: New Region Planning in France? Better Order or More Disorder?]]>
http://www.mdpi.com/1099-4300/17/8/5695
This paper grounds the critique of the reduction of regions in a country, not only in its geographical and social context but also in its entropic space. The various recent plans leading to the reduction of the number of regions in metropolitan France are discussed, based on the distribution of the number of municipalities in the plans, and analyzed according to various distribution laws. Each case, except the present 22-region distribution on the mainland, does not seem to fit currently used theoretical models. In addition, the number of inhabitants is examined in each plan. The same conclusion holds. Therefore, a theoretical argument based on entropy considerations is proposed, thereby pointing to whether more order or less disorder is the key question—discounting political considerations.Entropy2015-08-06178Article10.3390/e17085695569557101099-43002015-08-06doi: 10.3390/e17085695Marcel Ausloos<![CDATA[Entropy, Vol. 17, Pages 5673-5694: Binary Classification with a Pseudo Exponential Model and Its Application for Multi-Task Learning]]>
http://www.mdpi.com/1099-4300/17/8/5673
In this paper, we investigate the basic properties of binary classification with a pseudo model based on the Itakura–Saito distance and reveal that the Itakura–Saito distance is a unique appropriate measure for estimation with the pseudo model in the framework of general Bregman divergence. Furthermore, we propose a novel multi-task learning algorithm based on the pseudo model in the framework of the ensemble learning method. We focus on a specific setting of multi-task learning for binary classification problems. The set of features is assumed to be common among all tasks, which are our targets of performance improvement. We consider a situation where the shared structures among the dataset are represented by divergence between underlying distributions associated with multiple tasks. We discuss statistical properties of the proposed method and investigate the validity of the proposed method with numerical experiments.Entropy2015-08-06178Article10.3390/e17085673567356941099-43002015-08-06doi: 10.3390/e17085673Takashi TakenouchiOsamu KomoriShinto Eguchi<![CDATA[Entropy, Vol. 17, Pages 5660-5672: Gaussian Network’s Dynamics Reflected into Geometric Entropy]]>
http://www.mdpi.com/1099-4300/17/8/5660
We consider a geometric entropy as a measure of complexity for Gaussian networks, namely networks having Gaussian random variables sitting on vertices and their correlations as weighted links. We then show how the network dynamics described by the well-known Ornstein–Uhlenbeck process reflects into such a measure. We unveil a crossing of the entropy time behaviors between switching on and off links. Moreover, depending on the number of links switched on or off, the entropy time behavior can be non-monotonic.Entropy2015-08-06178Article10.3390/e17085660566056721099-43002015-08-06doi: 10.3390/e17085660Domenico FeliceStefano Mancini<![CDATA[Entropy, Vol. 17, Pages 5635-5659: Unconditionally Secure Quantum Signatures]]>
http://www.mdpi.com/1099-4300/17/8/5635
Signature schemes, proposed in 1976 by Diffie and Hellman, have become ubiquitous across modern communications. They allow for the exchange of messages from one sender to multiple recipients, with the guarantees that messages cannot be forged or tampered with and that messages can also be forwarded from one recipient to another without compromising their validity. Signatures are different from, but no less important than, encryption, which ensures the privacy of a message. Commonly used signature protocols—signatures based on the Rivest–Shamir–Adleman (RSA) algorithm, the digital signature algorithm (DSA), and the elliptic curve digital signature algorithm (ECDSA)—are only computationally secure, similar to public key encryption methods. In fact, since these rely on the difficulty of computing discrete logarithms or factoring large integers, it is known that they will become completely insecure with the emergence of quantum computers. We may therefore see a shift towards signature protocols that will remain secure even in a post-quantum world. Ideally, such schemes would provide unconditional or information-theoretic security. In this paper, we aim to provide an accessible and comprehensive review of existing unconditionally secure signature schemes for signing classical messages, with a focus on unconditionally secure quantum signature schemes.Entropy2015-08-04178Review10.3390/e17085635563556591099-43002015-08-04doi: 10.3390/e17085635Ryan AmiriErika Andersson<![CDATA[Entropy, Vol. 17, Pages 5611-5634: Conspiratorial Beliefs Observed through Entropy Principles]]>
http://www.mdpi.com/1099-4300/17/8/5611
We propose a novel approach framed in terms of information theory and entropy to tackle the issue of the propagation of conspiracy theories. We represent the initial report of an event (such as the 9/11 terrorist attack) as a series of strings of information, each string classified by a two-state variable Ei = ±1, i = 1, …, N. If the values of the Ei are set to −1 for all strings, a state of minimum entropy is achieved. Comments on the report, focusing repeatedly on several strings Ek, might flip their meaning (from −1 to +1). The representation of the event is turned fuzzy, with an increased entropy value. Beyond some threshold value of entropy, chosen for simplicity to be its maximum value, meaning N/2 variables with Ei = 1, the chance is created that a conspiracy theory might be initiated/propagated. Therefore, the evolution of the associated entropy is a way to measure the degree of penetration of a conspiracy theory. Our general framework relies on online content made voluntarily available by crowds of people, in response to some news or blog articles published by official news agencies. We apply different aggregation levels (comment, person, discussion thread) and discuss the associated patterns of entropy change.Entropy2015-08-04178Article10.3390/e17085611561156341099-43002015-08-04doi: 10.3390/e17085611Nataša GoloSerge Galam<![CDATA[Entropy, Vol. 17, Pages 5593-5610: Entropy Minimization Design Approach of Supersonic Internal Passages]]>
http://www.mdpi.com/1099-4300/17/8/5593
Fluid machinery operating in the supersonic regime unveils avenues towards more compact technology. However, internal supersonic flows are associated with high aerodynamic and thermal penalties, which usually prevent their practical implementation. Indeed, both shock losses and the limited operational range represent particular challenges to aerodynamic designers that should be taken into account at the initial phase of the design process. This paper presents a design methodology for supersonic passages based on direct evaluations of the velocity field using the method of characteristics and computation of entropy generation across shock waves. This meshless function evaluation tool is then coupled to an optimization scheme, based on evolutionary algorithms, that minimizes the entropy generation across the supersonic passage. Finally, we assessed the results with 3D Reynolds-Averaged Navier–Stokes calculations.Entropy2015-08-03178Article10.3390/e17085593559356101099-43002015-08-03doi: 10.3390/e17085593Jorge SousaGuillermo Paniagua<![CDATA[Entropy, Vol. 17, Pages 5580-5592: Adaptive Fuzzy Control for Nonlinear Fractional-Order Uncertain Systems with Unknown Uncertainties and External Disturbance]]>
http://www.mdpi.com/1099-4300/17/8/5580
In this paper, the problem of robust control of nonlinear fractional-order systems in the presence of uncertainties and external disturbance is investigated. Fuzzy logic systems are used for estimating the unknown nonlinear functions. Based on the fractional Lyapunov direct method and several proposed lemmas, an adaptive fuzzy controller is designed. The proposed method can guarantee that all signals in the closed-loop system remain bounded and that the tracking errors converge to an arbitrarily small region of the origin. Lastly, an illustrative example is given to demonstrate the effectiveness of the proposed results.Entropy2015-08-03178Article10.3390/e17085580558055921099-43002015-08-03doi: 10.3390/e17085580Ling LiYeguo Sun<![CDATA[Entropy, Vol. 17, Pages 5561-5579: A New Chaotic System with Positive Topological Entropy]]>
http://www.mdpi.com/1099-4300/17/8/5561
This paper introduces a new simple system with a butterfly chaotic attractor. This system has rich and complex dynamics. With some typical parameters, its Lyapunov dimension is greater than that of other known three-dimensional chaotic systems. It exhibits chaotic behavior over a large range of parameters, and the divergence of the flow of this system is not constant. The dynamics of this new system are analyzed via the Lyapunov exponent spectrum, bifurcation diagrams, phase portraits and the Poincaré map. The compound structures of this new system are also analyzed. By means of topological horseshoe theory and numerical computation, the Poincaré map defined for the system is proved to be semi-conjugate to a 3-shift map, and thus the system has positive topological entropy.Entropy2015-08-03178Article10.3390/e17085561556155791099-43002015-08-03doi: 10.3390/e17085561Zhonglin WangJian MaZengqiang ChenQing Zhang<![CDATA[Entropy, Vol. 17, Pages 5549-5560: Convergence of a Fixed-Point Minimum Error Entropy Algorithm]]>
http://www.mdpi.com/1099-4300/17/8/5549
The minimum error entropy (MEE) criterion is an important learning criterion in information theoretic learning (ITL). However, the MEE solution cannot be obtained in closed form even for a simple linear regression problem, and one usually has to search for it in an iterative manner. Fixed-point iteration is an efficient way to find the MEE solution. In this work, we study a fixed-point MEE algorithm for linear regression, and our focus is mainly on the convergence issue. We provide a sufficient condition (although a little loose) that guarantees the convergence of the fixed-point MEE algorithm. An illustrative example is also presented.Entropy2015-08-03178Article10.3390/e17085549554955601099-43002015-08-03doi: 10.3390/e17085549Yu ZhangBadong ChenXi LiuZejian YuanJose Principe<![CDATA[Entropy, Vol. 17, Pages 5522-5548: Life’s a Gas: A Thermodynamic Theory of Biological Evolution]]>
http://www.mdpi.com/1099-4300/17/8/5522
This paper outlines a thermodynamic theory of biological evolution. Beginning with a brief summary of the parallel histories of the modern evolutionary synthesis and thermodynamics, we use four physical laws and processes (the first and second laws of thermodynamics, diffusion and the maximum entropy production principle) to frame the theory. Given that open systems such as ecosystems will move towards maximizing dispersal of energy, we expect biological diversity to increase towards a level, Dmax, representing maximum entropic production (Smax). Based on this theory, we develop a mathematical model to predict diversity over the last 500 million years. This model combines diversification, post-extinction recovery and likelihood of discovery of the fossil record. We compare the output of this model with that of the observed fossil record. The model predicts that life diffuses into available energetic space (ecospace) towards a dynamic equilibrium, driven by increasing entropy within the genetic material. This dynamic equilibrium is punctuated by extinction events, which are followed by restoration of Dmax through diffusion into available ecospace. Finally, we compare and contrast our thermodynamic theory with the MES in relation to a number of important characteristics of evolution (progress, evolutionary tempo, form versus function, biosphere architecture, competition and fitness).Entropy2015-07-31178Article10.3390/e17085522552255481099-43002015-07-31doi: 10.3390/e17085522Keith Skene<![CDATA[Entropy, Vol. 17, Pages 5503-5521: Optimization of the Changing Phase Fluid in a Carnot Type Engine for the Recovery of a Given Waste Heat Source]]>
http://www.mdpi.com/1099-4300/17/8/5503
A Carnot type engine with a phase change during the heating and the cooling is modeled together with its thermal contact with the heat source. In a first optimization, the optimal high temperature of the cycle is determined to maximize the power output. The temperature and the mass flow rate of the heat source are given. This does not take into account the converter's internal fluid and its mass flow rate; it is an exogenous optimization of the converter. In a second, endogenous optimization, the isothermal heating corresponds only to the vaporization of the selected fluid. The maximization of the power output gives the optimal vaporization temperature of the cycled fluid. Using these two optima allows connecting the temperature of the heat source to the working fluid used. For a given temperature level, mass flow rate and composition of the waste heat to recover, an optimal fluid and its temperature of vaporization are deduced. The optimal conditions also size the internal mass flow rate and the compression ratio (pump size). The optimum corresponds to the maximum of the power output and must be combined with the fluid's environmental impact and the technological constraints.Entropy2015-07-31178Article10.3390/e17085503550355211099-43002015-07-31doi: 10.3390/e17085503Mathilde BlaiseMichel FeidtDenis Maillet<![CDATA[Entropy, Vol. 17, Pages 5472-5502: The Intrinsic Cause-Effect Power of Discrete Dynamical Systems—From Elementary Cellular Automata to Adapting Animats]]>
http://www.mdpi.com/1099-4300/17/8/5472
Current approaches to characterize the complexity of dynamical systems usually rely on state-space trajectories. In this article, instead, we focus on causal structure, treating discrete dynamical systems as directed causal graphs—systems of elements implementing local update functions. This allows us to characterize the system’s intrinsic cause-effect structure by applying the mathematical and conceptual tools developed within the framework of integrated information theory (IIT). In particular, we assess the number of irreducible mechanisms (concepts) and the total amount of integrated conceptual information Φ specified by a system. We analyze: (i) elementary cellular automata (ECA); and (ii) small, adaptive logic-gate networks (“animats”), similar to ECA in structure but evolving by interacting with an environment. We show that, in general, an integrated cause-effect structure with many concepts and high Φ is likely to have high dynamical complexity. Importantly, while a dynamical analysis describes what is “happening” in a system from the extrinsic perspective of an observer, the analysis of its cause-effect structure reveals what a system “is” from its own intrinsic perspective, exposing its dynamical and evolutionary potential under many different scenarios.Entropy2015-07-31178Article10.3390/e17085472547255021099-43002015-07-31doi: 10.3390/e17085472Larissa AlbantakisGiulio Tononi<![CDATA[Entropy, Vol. 17, Pages 5450-5471: Probabilistic Forecasts: Scoring Rules and Their Decomposition and Diagrammatic Representation via Bregman Divergences]]>
http://www.mdpi.com/1099-4300/17/8/5450
A scoring rule is a device for evaluation of forecasts that are given in terms of the probability of an event. In this article we will restrict our attention to binary forecasts. We may think of a scoring rule as a penalty attached to a forecast after the event has been observed. Thus a relatively small penalty will accrue if a high probability forecast that an event will occur is followed by occurrence of the event. On the other hand, a relatively large penalty will accrue if this forecast is followed by non-occurrence of the event. Meteorologists have been foremost in developing scoring rules for the evaluation of probabilistic forecasts. Here we use a published meteorological data set to illustrate diagrammatically the Brier score and the divergence score, and their statistical decompositions, as examples of Bregman divergences. In writing this article, we have in mind environmental scientists and modellers for whom meteorological factors are important drivers of biological, physical and chemical processes of interest. In this context, we briefly draw attention to the potential for probabilistic forecasting of the within-season component of nitrous oxide emissions from agricultural soils.Entropy2015-07-31178Article10.3390/e17085450545054711099-43002015-07-31doi: 10.3390/e17085450Gareth HughesCairistiona Topp<![CDATA[Entropy, Vol. 17, Pages 5437-5449: Reaction Kinetic Parameters and Surface Thermodynamic Properties of Cu2O Nanocubes]]>
http://www.mdpi.com/1099-4300/17/8/5437
Cuprous oxide (Cu2O) nanocubes were synthesized by reducing Cu(OH)2 in the presence of sodium citrate at room temperature. The samples were characterized in detail by field-emission scanning electron microscopy, transmission electron microscopy, high-resolution transmission electron microscopy, X-ray powder diffraction, and N2 absorption (BET specific surface area). The equations for acquiring reaction kinetic parameters and surface thermodynamic properties of Cu2O nanocubes were deduced by establishing the relations between the thermodynamic functions of Cu2O nanocubes and those of bulk Cu2O. Combined with a thermochemical cycle, transition state theory, the basic theory of chemical thermodynamics, and in situ microcalorimetry, the reaction kinetic parameters, specific surface enthalpy, specific surface Gibbs free energy, and specific surface entropy of Cu2O nanocubes were successfully determined. We also introduce a universal route for obtaining reaction kinetic parameters and surface thermodynamic properties of nanomaterials.Entropy2015-07-30178Article10.3390/e17085437543754491099-43002015-07-30doi: 10.3390/e17085437Xingxing LiHuanfeng TangXianrui LuShi LinLili ShiZaiyin Huang<![CDATA[Entropy, Vol. 17, Pages 5422-5436: A Robust Planning Algorithm for Groups of Entities in Discrete Spaces]]>
http://www.mdpi.com/1099-4300/17/8/5422
Automated planning is a well-established field of artificial intelligence (AI), with applications in route finding, robotics and operational research, among others. The task of developing a plan is often solved by finding a path in a graph representing the search domain; a robust plan consists of numerous paths that can be chosen if the execution of the best (optimal) one fails. While robust planning for a single entity is rather simple, development of a robust plan for multiple entities in a common environment can lead to combinatorial explosion. This paper proposes a novel hybrid approach, joining heuristic search and the wavefront algorithm to provide a plan featuring robustness in areas where it is needed, while maintaining a low level of computational complexity.Entropy2015-07-30178Article10.3390/e17085422542254361099-43002015-07-30doi: 10.3390/e17085422Igor WojnickiSebastian ErnstWojciech Turek<![CDATA[Entropy, Vol. 17, Pages 5402-5421: Fractional State Space Analysis of Economic Systems]]>
http://www.mdpi.com/1099-4300/17/8/5402
This paper examines modern economic growth according to the multidimensional scaling (MDS) method and state space portrait (SSP) analysis. Taking GDP per capita as the main indicator of economic growth and prosperity, the long-run perspective from 1870 to 2010 identifies the main similarities among 34 world partners’ modern economic growth and exemplifies the historical waving mechanics of the largest world economy, the USA. MDS reveals two main clusters among the European countries and their old offshore territories, and SSP identifies the Great Depression as a mild challenge to American global performance when compared to the Second World War and the 2008 crisis.Entropy2015-07-29178Article10.3390/e17085402540254211099-43002015-07-29doi: 10.3390/e17085402J. MachadoMaria MataAntónio Lopes<![CDATA[Entropy, Vol. 17, Pages 5382-5401: Multifractal Dimensional Dependence Assessment Based on Tsallis Mutual Information]]>
http://www.mdpi.com/1099-4300/17/8/5382
Entropy-based tools are commonly used to describe the dynamics of complex systems. In the last few decades, non-extensive statistics based on Tsallis entropy and multifractal techniques have been shown to be useful for characterizing long-range interaction and scaling behavior. In this paper, an approach based on generalized Tsallis dimensions is used for the formulation of mutual-information-related dependence coefficients in the multifractal domain. Different versions, according to the normalizing factor as well as to the inclusion of the non-extensivity correction term, are considered and discussed. An application to the assessment of dimensional interaction in the structural dynamics of a real seismic series is carried out to illustrate the usefulness and comparative performance of the measures introduced.Entropy2015-07-29178Article10.3390/e17085382538254011099-43002015-07-29doi: 10.3390/e17085382José AnguloFrancisco Esquivel<![CDATA[Entropy, Vol. 17, Pages 5353-5381: Approximate Methods for Maximum Likelihood Estimation of Multivariate Nonlinear Mixed-Effects Models]]>
http://www.mdpi.com/1099-4300/17/8/5353
Multivariate nonlinear mixed-effects models (MNLMM) have received increasing use due to their flexibility for analyzing multi-outcome longitudinal data following possibly nonlinear profiles. This paper presents and compares five different iterative algorithms for maximum likelihood estimation of the MNLMM. These algorithmic schemes include the penalized nonlinear least squares coupled to the multivariate linear mixed-effects (PNLS-MLME) procedure, Laplacian approximation, the pseudo-data expectation conditional maximization (ECM) algorithm, the Monte Carlo EM algorithm and the importance sampling EM algorithm. When fitting the MNLMM, it is rather difficult to exactly evaluate the observed log-likelihood function in a closed-form expression, because it involves complicated multiple integrals. To address this issue, the corresponding approximations of the observed log-likelihood function under the five algorithms are presented. An expected information matrix of parameters is also provided to calculate the standard errors of model parameters. A comparison of computational performances is investigated through simulation and a real data example from an AIDS clinical study.Entropy2015-07-29178Article10.3390/e17085353535353811099-43002015-07-29doi: 10.3390/e17085353Wan-Lun Wang<![CDATA[Entropy, Vol. 17, Pages 5333-5352: Statistical Evidence Measured on a Properly Calibrated Scale across Nested and Non-nested Hypothesis Comparisons]]>
http://www.mdpi.com/1099-4300/17/8/5333
Statistical modeling is often used to measure the strength of evidence for or against hypotheses about given data. We have previously proposed an information-dynamic framework in support of a properly calibrated measurement scale for statistical evidence, borrowing some mathematics from thermodynamics, and showing how an evidential analogue of the ideal gas equation of state could be used to measure evidence for a one-sided binomial hypothesis comparison (“coin is fair” vs. “coin is biased towards heads”). Here we take three important steps forward in generalizing the framework beyond this simple example, albeit still in the context of the binomial model. We: (1) extend the scope of application to other forms of hypothesis comparison; (2) show that doing so requires only the original ideal gas equation plus one simple extension, which has the form of the Van der Waals equation; (3) begin to develop the principles required to resolve a key constant, which enables us to calibrate the measurement scale across applications, and which we find to be related to the familiar statistical concept of degrees of freedom. This paper thus moves our information-dynamic theory substantially closer to the goal of producing a practical, properly calibrated measure of statistical evidence for use in general applications.Entropy2015-07-29178Article10.3390/e17085333533353521099-43002015-07-29doi: 10.3390/e17085333Veronica VielandSang-Cheol Seok<![CDATA[Entropy, Vol. 17, Pages 5304-5332: Entropy Generation through Deterministic Spiral Structures in a Corner Boundary-Layer Flow]]>
http://www.mdpi.com/1099-4300/17/8/5304
It is shown that nonlinear interactions between boundary layers on adjacent corner surfaces produce deterministic streamwise spiral structures. The synchronization properties of nonlinear spectral velocity equations of Lorenz form yield clearly defined deterministic spiral structures at several downstream stations. The computational procedure includes Burg’s method to obtain power spectral densities, yielding the available kinetic energy dissipation rates within the spiral structures. The singular value decomposition method is applied to the nonlinear time series solutions, yielding empirical entropies, from which empirical entropic indices are then extracted. The intermittency exponents obtained from the entropic indices allow the computation of the entropy generation through the spiral structures to the final dissipation of the fluctuating kinetic energy into background thermal energy, resulting in an increase in entropy. The entropy generation rates through the spiral structures are compared with the entropy generation rates within an empirical turbulent boundary layer at several streamwise stations.Entropy2015-07-27178Article10.3390/e17085304530453321099-43002015-07-27doi: 10.3390/e17085304LaVar Isaacson<![CDATA[Entropy, Vol. 17, Pages 5288-5303: Escalation with Overdose Control is More Efficient and Safer than Accelerated Titration for Dose Finding]]>
http://www.mdpi.com/1099-4300/17/8/5288
The standard 3 + 3 or “modified Fibonacci” up-and-down (MF-UD) method of dose escalation is by far the most used design in dose-finding cancer trials. However, MF-UD has always shown inferior performance compared with its competitors regarding the number of patients treated at optimal doses. A consequence of using less effective designs is that more patients are treated with doses outside the therapeutic window. In June 2012, the U.S. Food and Drug Administration (FDA) rejected the proposal to use Escalation with Overdose Control (EWOC), an established dose-finding method which has been extensively used in FDA-approved first-in-human trials, and imposed a variation of the MF-UD, known as the accelerated titration (AT) design. This event motivated us to perform an extensive simulation study comparing the operating characteristics of AT and EWOC. We show that the AT design has poor operating characteristics relative to three versions of EWOC under several practical scenarios. From the clinical investigator’s perspective, lower bias and mean square error make EWOC designs preferable to AT designs without compromising safety. From a patient’s perspective, a uniformly higher proportion of patients receiving doses within an optimal range of the true MTD makes EWOC designs preferable to AT designs.Entropy2015-07-27178Article10.3390/e17085288528853031099-43002015-07-27doi: 10.3390/e17085288André RogatkoGalen Cook-WiensMourad TighiouartSteven Piantadosi<![CDATA[Entropy, Vol. 17, Pages 5274-5287: Prebiotic Competition between Information Variants, With Low Error Catastrophe Risks]]>
http://www.mdpi.com/1099-4300/17/8/5274
During competition for resources in primitive networks, the increased fitness of an information variant does not necessarily equate with successful elimination of its competitors. If variability is added to a system quickly, speedy replacement of pre-existing and less-efficient forms of order is required as novel information variants arrive. Otherwise, the information capacity of the system fills up with information variants (an effect referred to as “error catastrophe”). As the cost of managing the system’s exceeding complexity increases, the correlation between the performance capabilities of information variants and their competitive success decreases, and evolution of such systems toward increased efficiency slows down. This impasse impedes the understanding of evolution in prebiotic networks. We used the simulation platform Biotic Abstract Dual Automata (BiADA) to analyze how information variants compete in a resource-limited space. We analyzed the effect of energy-related features (differences in autocatalytic efficiency, energy cost of order, energy availability, transformation rates and stability of order) on this competition. We discuss circumstances and controllers allowing primitive networks to acquire novel information with minimal “error catastrophe” risks. We present a primitive mechanism for the maximization of energy flux in dynamic networks. This work helps evaluate controllers of evolution in prebiotic networks and other systems where information variants compete.Entropy2015-07-27178Article10.3390/e17085274527452871099-43002015-07-27doi: 10.3390/e17085274Radu PopaVily Cimpoiasu<![CDATA[Entropy, Vol. 17, Pages 5257-5273: A Pilot Directional Protection for HVDC Transmission Line Based on Relative Entropy of Wavelet Energy]]>
http://www.mdpi.com/1099-4300/17/8/5257
On the basis of analyzing the high-voltage direct current (HVDC) transmission system and its fault superimposed circuit, the direction of the fault components of the voltage and the current measured at one end of the transmission line is shown to be different for internal faults and external faults. As an estimate of the differences between two signals, relative entropy is an effective parameter for recognizing transient signals in HVDC transmission lines. In this paper, the relative entropy of wavelet energy is applied to distinguish internal faults from external faults. For internal faults, the directions of fault components of voltage and current are opposite at the two ends of the transmission line, indicating a huge difference in wavelet energy relative entropy; for external faults, the directions are identical, indicating a small difference. The simulation results based on PSCAD/EMTDC show that the proposed pilot protection system acts accurately for faults under different conditions, and its performance is not affected by fault type, fault location, fault resistance and noise.Entropy2015-07-27178Article10.3390/e17085257525752731099-43002015-07-27doi: 10.3390/e17085257Sheng LinShan GaoZhengyou HeYujia Deng<![CDATA[Entropy, Vol. 17, Pages 5241-5256: Neural Network Reorganization Analysis During an Auditory Oddball Task in Schizophrenia Using Wavelet Entropy]]>
http://www.mdpi.com/1099-4300/17/8/5241
The aim of the present study was to characterize the neural network reorganization during a cognitive task in schizophrenia (SCH) by means of wavelet entropy (WE). Previous studies suggest that the cognitive impairment in patients with SCH could be related to the disrupted integrative functions of neural circuits. Nevertheless, further characterization of this effect is needed, especially in the time-frequency domain. This characterization is sensitive to fast neuronal dynamics and their synchronization that may be an important component of distributed neuronal interactions; especially in light of the disconnection hypothesis for SCH and its electrophysiological correlates. In this work, the irregularity dynamics elicited by an auditory oddball paradigm were analyzed through synchronized-averaging (SA) and single-trial (ST) analyses. They provide complementary information on the spatial patterns involved in the neural network reorganization. Our results from 20 healthy controls and 20 SCH patients showed a WE decrease from baseline to response both in controls and SCH subjects. These changes were significantly more pronounced for healthy controls after ST analysis, mainly in central and frontopolar areas. On the other hand, SA analysis showed more widespread spatial differences than ST results. These findings suggest that the activation response is weakly phase-locked to stimulus onset in SCH and related to the default mode and salience networks. Furthermore, the less pronounced changes in WE from baseline to response for SCH patients suggest an impaired ability to reorganize neural dynamics during an oddball task.Entropy2015-07-27178Article10.3390/e17085241524152561099-43002015-07-27doi: 10.3390/e17085241Javier Gomez-PilarJesús PozaAlejandro BachillerCarlos GómezVicente MolinaRoberto Hornero<![CDATA[Entropy, Vol. 
17, Pages 5218-5240: An Integrated Index for the Identification of Focal Electroencephalogram Signals Using Discrete Wavelet Transform and Entropy Measures]]>
http://www.mdpi.com/1099-4300/17/8/5218
The dynamics of brain area influenced by focal epilepsy can be studied using focal and non-focal electroencephalogram (EEG) signals. This paper presents a new method to detect focal and non-focal EEG signals based on an integrated index, termed the focal and non-focal index (FNFI), developed using discrete wavelet transform (DWT) and entropy features. The DWT decomposes the EEG signals up to six levels, and various entropy measures are computed from approximate and detail coefficients of sub-band signals. The computed entropy measures are average wavelet, permutation, fuzzy and phase entropies. The proposed FNFI developed using permutation, fuzzy and Shannon wavelet entropies is able to clearly discriminate focal and non-focal EEG signals using a single number. Furthermore, these entropy measures are ranked using different techniques, namely the Bhattacharyya space algorithm, Student’s t-test, the Wilcoxon test, the receiver operating characteristic (ROC) and entropy. These ranked features are fed to various classifiers, namely k-nearest neighbour (KNN), probabilistic neural network (PNN), fuzzy classifier and least squares support vector machine (LS-SVM), for automated classification of focal and non-focal EEG signals using the minimum number of features. The identification of the focal EEG signals can be helpful to locate the epileptogenic focus.Entropy2015-07-27178Article10.3390/e17085218521852401099-43002015-07-27doi: 10.3390/e17085218Rajeev SharmaRam PachoriU. Acharya<![CDATA[Entropy, Vol. 17, Pages 5199-5217: Generalized Combination Complex Synchronization for Fractional-Order Chaotic Complex Systems]]>
http://www.mdpi.com/1099-4300/17/8/5199
Based on two fractional-order chaotic complex drive systems and one fractional-order chaotic complex response system with different dimensions, we propose generalized combination complex synchronization. In this new synchronization scheme, there are two complex scaling matrices that are non-square matrices. On the basis of the stability theory of fractional-order linear systems, we design a general controller via active control. Additionally, by virtue of two complex scaling matrices, generalized combination complex synchronization between fractional-order chaotic complex systems and real systems is investigated. Finally, three typical examples are given to demonstrate the effectiveness and feasibility of the schemes.Entropy2015-07-24178Article10.3390/e17085199519952171099-43002015-07-24doi: 10.3390/e17085199Cuimei JiangShutang LiuDa Wang<![CDATA[Entropy, Vol. 17, Pages 5171-5198: Information-Theoretic Characterization and Undersampling Ratio Determination for Compressive Radar Imaging in a Simulated Environment]]>
http://www.mdpi.com/1099-4300/17/8/5171
Assuming sparsity or compressibility of the underlying signals, compressed sensing or compressive sampling (CS) exploits the informational efficiency of under-sampled measurements for increased efficiency yet acceptable accuracy in information gathering, transmission and processing, though it often incurs extra computational cost in signal reconstruction. Shannon information quantities and theorems, such as source rate-distortion, trans-information and rate distortion theorem concerning lossy data compression, provide a coherent framework, which is complementary to classic CS theory, for analyzing informational quantities and for determining the necessary number of measurements in CS. While there exists some information-theoretic research in the past on CS in general and compressive radar imaging in particular, systematic research is needed to handle issues related to scene description in cluttered environments and trans-information quantification in complex sparsity-clutter-sampling-noise settings. The novelty of this paper lies in furnishing a general strategy for information-theoretic analysis of scene compressibility, trans-information of radar echo data about the scene and the targets of interest, respectively, and limits to undersampling ratios necessary for scene reconstruction subject to distortion given sparsity-clutter-noise constraints. A computational experiment was performed to demonstrate informational analysis regarding the scene-sampling-reconstruction process and to generate phase transition diagrams showing relations between undersampling ratios and sparsity-clutter-noise-distortion constraints. The strategy proposed in this paper is valuable for information-theoretic analysis and undersampling theorem developments in compressive radar imaging and other computational imaging applications.Entropy2015-07-24178Article10.3390/e17085171517151981099-43002015-07-24doi: 10.3390/e17085171Jingxiong ZhangKe YangFengzhu LiuYing Zhang<![CDATA[Entropy, Vol. 
17, Pages 5157-5170: Non-Fourier Heat Transfer with Phonons and Electrons in a Circular Thin Layer Surrounding a Hot Nanodevice]]>
http://www.mdpi.com/1099-4300/17/8/5157
A nonlocal model for heat transfer with phonons and electrons is applied to infer the steady-state radial temperature profile in a circular layer surrounding an inner hot component. Such a profile, obtained from the numerical solution of the heat equation, predicts that the temperature behaves in an anomalous way: for radial distances from the heat source smaller than the mean free path of phonons and electrons, it increases with increasing distance. The compatibility of this temperature behavior with the second law of thermodynamics is investigated by calculating numerically the local entropy production as a function of the radial distance. It turns out that such a production is positive and strictly decreasing with the radial distance.Entropy2015-07-24178Article10.3390/e17085157515751701099-43002015-07-24doi: 10.3390/e17085157Vito CimmelliIsabella CarlomagnoAntonio Sellitto<![CDATA[Entropy, Vol. 17, Pages 5145-5156: Scale-Invariant Rotating Black Holes in Quadratic Gravity]]>
http://www.mdpi.com/1099-4300/17/8/5145
Black hole solutions in pure quadratic theories of gravity are interesting since they allow the formulation of a set of scale-invariant thermodynamics laws. Recently, we have proven that static scale-invariant black holes have a well-defined entropy, which characterizes equivalent classes of solutions. In this paper, we generalize these results and explore the thermodynamics of rotating black holes in pure quadratic gravity.Entropy2015-07-23178Article10.3390/e17085145514551561099-43002015-07-23doi: 10.3390/e17085145Guido CognolaMassimiliano RinaldiLuciano Vanzo<![CDATA[Entropy, Vol. 17, Pages 5133-5144: Setting Diverging Colors for a Large-Scale Hypsometric Lunar Map Based on Entropy]]>
http://www.mdpi.com/1099-4300/17/7/5133
A hypsometric map is a type of map used to represent topographic characteristics by filling different map areas with diverging colors. The setting of appropriate diverging colors is essential for the map to reveal topographic details. When real lunar environmental exploration programs are performed, large-scale hypsometric maps with a high resolution and greater topographic detail are helpful. Compared to the situation on Earth, fewer lunar exploration objects are available, and the topographic waviness is smaller at a large scale, indicating that presenting the topographic details using traditional hypsometric map-making methods may be difficult. To solve this problem, we employed the Chang’E2 (CE2) topographic and imagery data with a resolution of 7 m and developed a new hypsometric map-making method by setting the diverging colors based on information entropy. The resulting map showed that this method is suitable for presenting the topographic details and might be useful for developing a better understanding of the environment of the lunar surface.Entropy2015-07-22177Article10.3390/e17075133513351441099-43002015-07-22doi: 10.3390/e17075133Xingguo ZengLingli MuJianjun LiuYiman Yang<![CDATA[Entropy, Vol. 17, Pages 5117-5132: An Entropy-Based Approach to Path Analysis of Structural Generalized Linear Models: A Basic Idea]]>
http://www.mdpi.com/1099-4300/17/7/5117
A path analysis method for causal systems based on generalized linear models is proposed by using entropy. A practical example is introduced, and a brief explanation of the entropy coefficient of determination is given. Direct and indirect effects of explanatory variables are discussed as log odds ratios, i.e., relative information, and a method for summarizing the effects is proposed. The example dataset is re-analyzed by using the method.Entropy2015-07-22177Article10.3390/e17075117511751321099-43002015-07-22doi: 10.3390/e17075117Nobuoki EshimaMinoru TabataClaudio BorroniYutaka Kano<![CDATA[Entropy, Vol. 17, Pages 5101-5116: Analytic Exact Upper Bound for the Lyapunov Dimension of the Shimizu–Morioka System]]>
http://www.mdpi.com/1099-4300/17/7/5101
In applied investigations, the invariance of the Lyapunov dimension under a diffeomorphism is often used. However, in the case of irregular linearization, this fact was not strictly considered in the classical works. In the present work, the invariance of the Lyapunov dimension under diffeomorphism is demonstrated in the general case. This fact is used to obtain the analytic exact upper bound of the Lyapunov dimension of an attractor of the Shimizu–Morioka system.Entropy2015-07-22177Article10.3390/e17075101510151161099-43002015-07-22doi: 10.3390/e17075101Gennady LeonovTatyana AlexeevaNikolay Kuznetsov<![CDATA[Entropy, Vol. 17, Pages 5085-5100: Averaged Extended Tree Augmented Naive Classifier]]>
http://www.mdpi.com/1099-4300/17/7/5085
This work presents a new general purpose classifier named Averaged Extended Tree Augmented Naive Bayes (AETAN), which is based on combining the advantageous characteristics of Extended Tree Augmented Naive Bayes (ETAN) and Averaged One-Dependence Estimator (AODE) classifiers. We describe the main properties of the approach and algorithms for learning it, along with an analysis of its computational time complexity. Empirical results with numerous data sets indicate that the new approach is superior to ETAN and AODE in terms of both zero-one classification accuracy and log loss. It also compares favourably against weighted AODE and hidden Naive Bayes. The learning phase of the new approach is slower than that of its competitors, while the time complexity for the testing phase is similar. Such characteristics suggest that the new classifier is ideal in scenarios where online learning is not required.Entropy2015-07-21177Article10.3390/e17075085508551001099-43002015-07-21doi: 10.3390/e17075085Aaron MeehanCassio de Campos<![CDATA[Entropy, Vol. 17, Pages 5063-5084: Minimal Rényi–Ingarden–Urbanik Entropy of Multipartite Quantum States]]>
http://www.mdpi.com/1099-4300/17/7/5063
We study the entanglement of a pure state of a composite quantum system consisting of several subsystems with d levels each. It can be described by the Rényi–Ingarden–Urbanik entropy Sq of a decomposition of the state in a product basis, minimized over all local unitary transformations. In the case q = 0, this quantity becomes a function of the rank of the tensor representing the state, while in the limit q → ∞, the entropy becomes related to the overlap with the closest separable state and the geometric measure of entanglement. For any bipartite system, the entropy S1 coincides with the standard entanglement entropy. We analyze the distribution of the minimal entropy for random states of three- and four-qubit systems. In the former case, the distribution of the three-tangle is studied and some of its moments are evaluated, while in the latter case, we analyze the distribution of the hyperdeterminant. The behavior of the maximum overlap of a three-qudit system with the closest separable state is also investigated in the asymptotic limit.Entropy2015-07-20177Article10.3390/e17075063506350841099-43002015-07-20doi: 10.3390/e17075063Marco EnríquezZbigniew PuchałaKarol Życzkowski<![CDATA[Entropy, Vol. 17, Pages 5047-5062: Evaluation of the Atmospheric Chemical Entropy Production of Mars]]>
http://www.mdpi.com/1099-4300/17/7/5047
Thermodynamic disequilibrium is a necessary situation in a system in which complex emergent structures are created and maintained. It is known that most of the chemical disequilibrium, a particular type of thermodynamic disequilibrium, in Earth’s atmosphere is a consequence of life. We have developed a thermochemical model for the Martian atmosphere to analyze the disequilibrium by chemical reactions calculating the entropy production. It follows from the comparison with the Earth atmosphere that the magnitude of the entropy produced by the recombination reaction forming O3 (O + O2 + CO2 ⥦ O3 + CO2) in the atmosphere of the Earth is larger than the entropy produced by the dominant set of chemical reactions considered for Mars, as a consequence of the low density and the poor variety of species of the Martian atmosphere. If disequilibrium is needed to create and maintain self-organizing structures in a system, we conclude that the current Martian atmosphere is unable to support large physico-chemical structures, such as those created on Earth.Entropy2015-07-20177Article10.3390/e17075047504750621099-43002015-07-20doi: 10.3390/e17075047Alfonso Delgado-BonalF. Martín-Torres<![CDATA[Entropy, Vol. 17, Pages 5043-5046: Reply to C. Tsallis’ “Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems”]]>
http://www.mdpi.com/1099-4300/17/7/5043
In a recent PRL (2013, 111, 180604), we invoked the Shore and Johnson axioms which demonstrate that the least-biased way to infer probability distributions {pi} from data is to maximize the Boltzmann-Gibbs entropy. We then showed which biases are introduced in models obtained by maximizing nonadditive entropies. A rebuttal of our work appears in Entropy (2015, 17, 2853) and argues that the Shore and Johnson axioms are inapplicable to a wide class of complex systems. Here we highlight the errors in this reasoning.Entropy2015-07-17177Reply10.3390/e17075043504350461099-43002015-07-17doi: 10.3390/e17075043Steve PresséKingshuk GhoshJulian LeeKen Dill<![CDATA[Entropy, Vol. 17, Pages 5022-5042: The Critical Point Entanglement and Chaos in the Dicke Model]]>
http://www.mdpi.com/1099-4300/17/7/5022
Ground state properties and level statistics of the Dicke model for a finite number of atoms are investigated based on a progressive diagonalization scheme (PDS). Particle number statistics, the entanglement measure and the Shannon information entropy at the resonance point in cases with a finite number of atoms as functions of the coupling parameter are calculated. It is shown that the entanglement measure defined in terms of the normalized von Neumann entropy of the reduced density matrix of the atoms reaches its maximum value at the critical point of the quantum phase transition where the system is most chaotic. Noticeable change in the Shannon information entropy near or at the critical point of the quantum phase transition is also observed. In addition, the quantum phase transition may be observed not only in the ground state mean photon number and the ground state atomic inversion as shown previously, but also in fluctuations of these two quantities in the ground state, especially in the atomic inversion fluctuation.Entropy2015-07-16177Article10.3390/e17075022502250421099-43002015-07-16doi: 10.3390/e17075022Lina BaoFeng PanJing LuJerry Draayer<![CDATA[Entropy, Vol. 17, Pages 5000-5021: A Novel Method for Seismogenic Zoning Based on Triclustering: Application to the Iberian Peninsula]]>
http://www.mdpi.com/1099-4300/17/7/5000
A previous definition of seismogenic zones is required to perform a probabilistic seismic hazard analysis for areas of spread and low seismic activity. Traditional zoning methods are based on the available seismic catalog and the geological structures. It is admitted that thermal and resistant parameters of the crust provide better criteria for zoning. Nonetheless, working out the rheological profiles introduces great uncertainty. This has generated inconsistencies, as different zones have been proposed for the same area. A new method for seismogenic zoning by means of triclustering is proposed in this research. The main advantage is that it is solely based on seismic data. Almost no human decision is made, and therefore, the method is nearly unbiased. To assess its performance, the method has been applied to the Iberian Peninsula, which is characterized by the occurrence of small to moderate magnitude earthquakes. The catalog of the National Geographic Institute of Spain has been used. The output map is checked for validity with the geology. Moreover, a geographic information system has been used for two purposes. First, the obtained zones have been depicted within it. Second, the data have been used to calculate the seismic parameters (b-value, annual rate). Finally, the results have been compared to Kohonen’s self-organizing maps.Entropy2015-07-16177Article10.3390/e17075000500050211099-43002015-07-16doi: 10.3390/e17075000Francisco Martínez-ÁlvarezDavid Gutiérrez-AvilésAntonio Morales-EstebanJorge ReyesJosé Amaro-MelladoCristina Rubio-Escudero<![CDATA[Entropy, Vol. 17, Pages 4986-4999: Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data]]>
http://www.mdpi.com/1099-4300/17/7/4986
A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.Entropy2015-07-15177Article10.3390/e17074986498649991099-43002015-07-15doi: 10.3390/e17074986Jayajit Sayak MukherjeeSusan Hodge<![CDATA[Entropy, Vol. 17, Pages 4974-4985: Lag Synchronization of Complex Lorenz System with Applications to Communication]]>
http://www.mdpi.com/1099-4300/17/7/4974
In communication, the signal at the receiver end at time t is the signal from the transmitter side at time t − τ, where τ ≥ 0 is the time lag of transmission. Therefore, lag synchronization (LS) is more accurate than complete synchronization for designing communication schemes. Taking the complex Lorenz system as an example, we design the LS controller according to error feedback. Using chaotic masking, we propose a communication scheme based on LS and independent component analysis (ICA). It is suitable for transmitting multiple messages with all kinds of amplitudes and has the ability of anti-noise. Numerical simulations verify the feasibility and effectiveness of the presented schemes.Entropy2015-07-15177Article10.3390/e17074974497449851099-43002015-07-15doi: 10.3390/e17074974Fangfang Zhang<![CDATA[Entropy, Vol. 17, Pages 4959-4973: Broad Niche Overlap between Invasive Nile Tilapia Oreochromis niloticus and Indigenous Congenerics in Southern Africa: Should We be Concerned?]]>
http://www.mdpi.com/1099-4300/17/7/4959
This study developed niche models for the native ranges of Oreochromis andersonii, O. mortimeri, and O. mossambicus, and assessed how much of their range is climatically suitable for the establishment of O. niloticus, and then reviewed the conservation implications for indigenous congenerics as a result of overlap with O. niloticus based on documented congeneric interactions. The predicted potential geographical range of O. niloticus reveals a broad climatic suitability over most of southern Africa and overlaps with all the endemic congenerics. This is of major conservation concern because six of the eight river systems predicted to be suitable for O. niloticus have already been invaded and now support established populations. Oreochromis niloticus has been implicated in reducing the abundance of indigenous species through competitive exclusion and hybridisation. Despite these well-documented adverse ecological effects, O. niloticus remains one of the most widely cultured and propagated fish species in aquaculture and stock enhancements in the southern Africa sub-region. Aquaculture is perceived as a means of protein security, poverty alleviation, and economic development and, as such, any future decisions on its introduction will be based on the trade-off between socio-economic benefits and potential adverse ecological effects.Entropy2015-07-14177Article10.3390/e17074959495949731099-43002015-07-14doi: 10.3390/e17074959Tsungai ZengeyaAnthony BoothChristian Chimimba<![CDATA[Entropy, Vol. 17, Pages 4940-4958: Identity Authentication over Noisy Channels]]>
http://www.mdpi.com/1099-4300/17/7/4940
Identity authentication is the process of verifying users’ validity. Unlike classical key-based authentications, which are built on noiseless channels, this paper introduces a general analysis and design framework for identity authentication over noisy channels. Specifically, the authentication scenarios of single time and multiple times are investigated. For each scenario, the lower bound on the opponent’s success probability is derived, and it is smaller than that of classical identity authentication. In addition, it can remain the same even if the secret key is reused. Remarkably, the Cartesian authentication code proves to be helpful for hiding the secret key to maximize the secrecy performance. Finally, we show a potential application of this authentication technique.Entropy2015-07-14177Article10.3390/e17074940494049581099-43002015-07-14doi: 10.3390/e17074940Fanfan ZhengZhiqing XiaoShidong ZhouJing WangLianfen Huang<![CDATA[Entropy, Vol. 17, Pages 4918-4939: Fisher Information Properties]]>
http://www.mdpi.com/1099-4300/17/7/4918
A set of Fisher information properties are presented in order to draw a parallel with similar properties of Shannon differential entropy. Already known properties are presented together with new ones, which include: (i) a generalization of mutual information for Fisher information; (ii) a new proof that Fisher information increases under conditioning; (iii) showing that Fisher information decreases in Markov chains; and (iv) bounding estimation error using Fisher information. This last result is especially important, because it completes Fano’s inequality, i.e., a lower bound for estimation error, by showing that Fisher information can be used to define an upper bound for this error. In this way, it is shown that Shannon’s differential entropy, which quantifies the behavior of the random variable, and the Fisher information, which quantifies the internal structure of the density function that defines the random variable, can be used to characterize the estimation error.Entropy2015-07-13177Article10.3390/e17074918491849391099-43002015-07-13doi: 10.3390/e17074918Pablo Zegers<![CDATA[Entropy, Vol. 17, Pages 4891-4917: Informational and Causal Architecture of Discrete-Time Renewal Processes]]>
http://www.mdpi.com/1099-4300/17/7/4891
Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We use the resulting formulae to analyze the output of the parametrized Simple Nonunifilar Source, generated by a simple two-state hidden Markov model, but with an infinite-state machine presentation. All in all, the results lay the groundwork for analyzing more complex processes with infinite statistical complexity and infinite excess entropy.Entropy2015-07-13177Article10.3390/e17074891489149171099-43002015-07-13doi: 10.3390/e17074891Sarah MarzenJames Crutchfield<![CDATA[Entropy, Vol. 17, Pages 4863-4890: Entropy, Information and Complexity or Which Aims the Arrow of Time?]]>
http://www.mdpi.com/1099-4300/17/7/4863
In this article, we analyze the interrelationships among such notions as entropy, information, complexity, order and chaos and show, using the theory of categories, how to generalize the second law of thermodynamics as a law of increasing generalized entropy or a general law of complification. This law could be applied to any system with morphisms, including all of our universe and its subsystems. We discuss how such a general law and other laws of nature drive the evolution of the universe, including physicochemical and biological evolutions. In addition, we identify eliminating selection in physicochemical evolution as an extremely simplified prototype of natural selection. Laws of nature do not allow complexity and entropy to reach maximal values by generating structures. One could consider them as a kind of “breeder” of such selection.Entropy2015-07-10177Article10.3390/e17074863486348901099-43002015-07-10doi: 10.3390/e17074863George MikhailovskyAlexander Levich<![CDATA[Entropy, Vol. 17, Pages 4838-4862: Faster Together: Collective Quantum Search]]>
http://www.mdpi.com/1099-4300/17/7/4838
Joining independent quantum searches provides novel collective modes of quantum search (merging) by utilizing the algorithm’s underlying algebraic structure. If n quantum searches, each targeting a single item, join the domains of their classical oracle functions and sum their Hilbert spaces (merging), instead of acting independently (concatenation), then they achieve a reduction of the search complexity by factor O(√n).Entropy2015-07-10177Article10.3390/e17074838483848621099-43002015-07-10doi: 10.3390/e17074838Demosthenes EllinasChristos Konstandakis<![CDATA[Entropy, Vol. 17, Pages 4809-4837: Power-Type Functions of Prediction Error of Sea Level Time Series]]>
http://www.mdpi.com/1099-4300/17/7/4809
This paper gives the quantitative relationship between prediction error and given past sample size in our research on sea level time series. The present result shows that the prediction error of sea level time series in terms of given past sample size follows decaying power functions, providing a quantitative guideline for the quality control of sea level prediction.Entropy2015-07-09177Article10.3390/e17074809480948371099-43002015-07-09doi: 10.3390/e17074809Ming LiYuanchun LiJianxing Leng<![CDATA[Entropy, Vol. 17, Pages 4786-4808: Modeling and Analysis of Entropy Generation in Light Heating of Nanoscaled Silicon and Germanium Thin Films]]>
http://www.mdpi.com/1099-4300/17/7/4786
In this work, the irreversible processes in light heating of Silicon (Si) and Germanium (Ge) thin films are examined. Each film is exposed to light irradiation with radiative and convective boundary conditions. Heat, electron and hole transport and generation-recombination processes of electron-hole pairs are studied in terms of a phenomenological model obtained from basic principles of irreversible thermodynamics. We present an analysis of the contributions to the entropy production in the stationary state due to the dissipative effects associated with electron and hole transport, generation-recombination of electron-hole pairs as well as heat transport. The most significant contribution to the entropy production comes from the interaction of light with the medium in both Si and Ge. This interaction includes two processes, namely, the generation of electron-hole pairs and the transferring of energy from the absorbed light to the lattice. In Si, the next contribution in magnitude comes from heat transport. In Ge, all the remaining contributions to entropy production have nearly the same order of magnitude. The results are compared and explained by addressing the differences in the magnitude of the thermodynamic forces, Onsager’s coefficients and transport properties of Si and Ge.Entropy2015-07-09177Article10.3390/e17074786478648081099-43002015-07-09doi: 10.3390/e17074786José Nájera-CarpioFederico VázquezAldo Figueroa<![CDATA[Entropy, Vol. 17, Pages 4775-4785: Fractional Differential Texture Descriptors Based on the Machado Entropy for Image Splicing Detection]]>
http://www.mdpi.com/1099-4300/17/7/4775
Image splicing is a common operation in image forgery. Different techniques of image splicing detection have been utilized to regain people’s trust. This study introduces a texture enhancement technique involving the use of fractional differential masks based on the Machado entropy. The masks slide over the tampered image, and each pixel of the tampered image is convolved with the fractional mask weight window in eight directions. Consequently, the fractional differential texture descriptors are extracted using the gray-level co-occurrence matrix for image splicing detection. A support vector machine is used as a classifier to distinguish between authentic and spliced images. The results show that the improvements achieved by the proposed algorithm are comparable with those of other splicing detection methods.Entropy2015-07-08177Article10.3390/e17074775477547851099-43002015-07-08doi: 10.3390/e17074775Rabha IbrahimZahra MoghaddasiHamid JalabRafidah Noor<![CDATA[Entropy, Vol. 17, Pages 4762-4774: H∞ Control for Markov Jump Systems with Nonlinear Noise Intensity Function and Uncertain Transition Rates]]>
http://www.mdpi.com/1099-4300/17/7/4762
The problem of robust H∞ control is investigated for Markov jump systems with nonlinear noise intensity function and uncertain transition rates. A robust H∞ performance criterion is developed for the given systems for the first time. Based on the developed performance criterion, the desired H∞ state-feedback controller is also designed, which guarantees the robust H∞ performance of the closed-loop system. All the conditions are in terms of linear matrix inequalities (LMIs), and hence they can be readily solved by any LMI solver. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed methods.Entropy2015-07-06177Article10.3390/e17074762476247741099-43002015-07-06doi: 10.3390/e17074762Xiaonian WangYafeng Guo<![CDATA[Entropy, Vol. 17, Pages 4744-4761: Energetic and Exergetic Analysis of an Ejector-Expansion Refrigeration Cycle Using the Working Fluid R32]]>
http://www.mdpi.com/1099-4300/17/7/4744
The performance characteristics of an ejector-expansion refrigeration cycle (EEC) using R32 have been investigated in comparison with those of a cycle using R134a. The coefficient of performance (COP), the exergy destruction, the exergy efficiency and the suction nozzle pressure drop (SNPD) are discussed. The results show that the application of an ejector instead of a throttle valve in the R32 cycle decreases the cycle’s total exergy destruction by 8.84%–15.84% in comparison with the basic cycle (BC). The R32 EEC provides 5.22%–13.77% COP improvement and 5.13%–13.83% exergy efficiency improvement over the BC for the given ranges of evaporating and condensing temperatures. There exists an optimum SNPD which gives a maximum system COP and volumetric cooling capacity (VCC) under a specified condition. The value of the optimum SNPD mainly depends on the efficiencies of the ejector components, but is virtually independent of evaporating temperature and condensing temperature. In addition, the improvement of the component efficiencies, especially those of the diffusion nozzle and the motive nozzle, can enhance the EEC performance.Entropy2015-07-06177Article10.3390/e17074744474447611099-43002015-07-06doi: 10.3390/e17074744Zhenying ZhangLirui TongLi ChangYanhua ChenXingguo Wang<![CDATA[Entropy, Vol. 17, Pages 4701-4743: Asymptotic Description of Neural Networks with Correlated Synaptic Weights]]>
http://www.mdpi.com/1099-4300/17/7/4701
We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons and the averaged law (with respect to the synaptic weights) of the trajectories of the solutions to the equations of the network of neurons. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function which is shown to have a unique global minimum. Our analysis of the rate function allows us also to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.Entropy2015-07-06177Article10.3390/e17074701470147431099-43002015-07-06doi: 10.3390/e17074701Olivier FaugerasJames MacLaurin<![CDATA[Entropy, Vol. 17, Pages 4684-4700: Geometric Interpretation of Surface Tension Equilibrium in Superhydrophobic Systems]]>
http://www.mdpi.com/1099-4300/17/7/4684
Surface tension and surface energy are closely related, although not identical, concepts. Surface tension is a generalized force; unlike a conventional mechanical force, it is not applied to any particular body or point. Using this notion, we suggest a simple geometric interpretation of the Young, Wenzel, Cassie, Antonoff and Girifalco–Good equations for the equilibrium during wetting. This approach extends the traditional concept of Neumann’s triangle. Substances are presented as points, while tensions are vectors connecting the points; the equations and inequalities of wetting equilibrium then acquire a simple geometric meaning, with the surface roughness effect interpreted as stretching of the corresponding vectors, surface heterogeneity as their linear combination, and contact angle hysteresis as rotation. We discuss energy dissipation mechanisms during wetting due to contact angle hysteresis, superhydrophobicity and the possible entropic nature of surface tension.Entropy2015-07-06177Article10.3390/e17074684468447001099-43002015-07-06doi: 10.3390/e17074684Michael NosonovskyRahul Ramachandran<![CDATA[Entropy, Vol. 17, Pages 4664-4683: A New Feature Extraction Method Based on the Information Fusion of Entropy Matrix and Covariance Matrix and Its Application in Face Recognition]]>
http://www.mdpi.com/1099-4300/17/7/4664
The classic principal components analysis (PCA), kernel PCA (KPCA) and linear discriminant analysis (LDA) feature extraction methods evaluate the importance of components according to their covariance contribution, not considering the entropy contribution, which is important supplementary information for the covariance. To further improve covariance-based methods such as PCA (or KPCA), this paper first proposed an entropy matrix to load the uncertainty information of random variables, similar to the covariance matrix loading the variation information in PCA. Then an entropy-difference matrix was used as a weighting matrix for transforming the original training images. This entropy-difference weighting (EW) matrix not only made good use of the local information of the training samples, in contrast to the global method of PCA, but also considered the category information, similar to the LDA idea. The EW method was then integrated with PCA (or KPCA) to form a new feature extraction method. The new method was used for face recognition with the nearest neighbor classifier. The experimental results based on the ORL and Yale databases showed that the proposed method with proper threshold parameters reached higher recognition rates than the usual PCA (or KPCA) methods.Entropy2015-07-03177Article10.3390/e17074664466446831099-43002015-07-03doi: 10.3390/e17074664Shunfang WangPing Liu<![CDATA[Entropy, Vol. 17, Pages 4654-4663: Continuous Variable Quantum Key Distribution with a Noisy Laser]]>
http://www.mdpi.com/1099-4300/17/7/4654
Existing experimental implementations of continuous-variable quantum key distribution require shot-noise limited operation, achieved with shot-noise limited lasers. However, loosening this requirement on the laser source would allow for cheaper, potentially integrated systems. Here, we implement a theoretically proposed prepare-and-measure continuous-variable protocol and experimentally demonstrate its robustness against preparation noise stemming, for instance, from technical laser noise. Provided that direct reconciliation techniques are used in the post-processing, we show that, for small distances, large amounts of preparation noise can be tolerated, in contrast to reverse reconciliation, where the key rate quickly drops to zero. Our experiment thereby demonstrates that quantum key distribution with non-shot-noise limited laser diodes might be feasible.Entropy2015-07-03177Article10.3390/e17074654465446631099-43002015-07-03doi: 10.3390/e17074654Christian JacobsenTobias GehringUlrik Andersen<![CDATA[Entropy, Vol. 17, Pages 4644-4653: Quantifying Redundant Information in Predicting a Target Random Variable]]>
http://www.mdpi.com/1099-4300/17/7/4644
We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure, and propose new measures with some desirable properties.Entropy2015-07-02177Article10.3390/e17074644464446531099-43002015-07-02doi: 10.3390/e17074644Virgil GriffithTracey Ho<![CDATA[Entropy, Vol. 17, Pages 4627-4643: Differentiating Interictal and Ictal States in Childhood Absence Epilepsy through Permutation Rényi Entropy]]>
http://www.mdpi.com/1099-4300/17/7/4627
Permutation entropy (PE) has been widely exploited to measure the complexity of the electroencephalogram (EEG), especially when complexity is linked to diagnostic information embedded in the EEG. Recently, the authors proposed a spatial-temporal analysis of the EEG recordings of absence epilepsy patients based on PE. The goal here is to improve the ability of PE in discriminating interictal states from ictal states in absence seizure EEG. For this purpose, a parametrical definition of permutation entropy is introduced here in the field of epileptic EEG analysis: the permutation Rényi entropy (PEr). PEr has been extensively tested against PE by tuning the involved parameters (order, delay time and alpha). The achieved results demonstrate that PEr outperforms PE, as there is a statistically-significant, wider gap between the PEr levels during the interictal states and PEr levels observed in the ictal states compared to PE. PEr also outperformed PE as the input to a classifier aimed at discriminating interictal from ictal states.Entropy2015-07-02177Article10.3390/e17074627462746431099-43002015-07-02doi: 10.3390/e17074627Nadia MammoneJonas Duun-HenriksenTroels KjaerFrancesco Morabito<![CDATA[Entropy, Vol. 17, Pages 4602-4626: A New Robust Regression Method Based on Minimization of Geodesic Distances on a Probabilistic Manifold: Application to Power Laws]]>
http://www.mdpi.com/1099-4300/17/7/4602
In regression analysis for deriving scaling laws that occur in various scientific disciplines, usually standard regression methods have been applied, of which ordinary least squares (OLS) is the most popular. In many situations, the assumptions underlying OLS are not fulfilled, and several other approaches have been proposed. However, most techniques address only part of the shortcomings of OLS. We here discuss a new and more general regression method, which we call geodesic least squares regression (GLS). The method is based on minimization of the Rao geodesic distance on a probabilistic manifold. For the case of a power law, we demonstrate the robustness of the method on synthetic data in the presence of significant uncertainty on both the data and the regression model. We then show good performance of the method in an application to a scaling law in magnetic confinement fusion.Entropy2015-07-01177Article10.3390/e17074602460246261099-43002015-07-01doi: 10.3390/e17074602Geert Verdoolaege<![CDATA[Entropy, Vol. 17, Pages 4582-4601: Applications of the Fuzzy Sumudu Transform for the Solution of First Order Fuzzy Differential Equations]]>
http://www.mdpi.com/1099-4300/17/7/4582
In this paper, we study the classical Sumudu transform in a fuzzy environment, referred to as the fuzzy Sumudu transform (FST). We also propose some results on the properties of the FST, such as linearity, preserving, fuzzy derivative, shifting and convolution theorem. In order to show the capability of the FST, we provide a detailed procedure for solving fuzzy differential equations (FDEs). A numerical example is provided to illustrate the usage of the FST.Entropy2015-07-01177Article10.3390/e17074582458246011099-43002015-07-01doi: 10.3390/e17074582Norazrizal RahmanMuhammad Ahmad<![CDATA[Entropy, Vol. 17, Pages 4563-4581: A Possible Cosmological Application of Some Thermodynamic Properties of the Black Body Radiation in n-Dimensional Euclidean Spaces]]>
http://www.mdpi.com/1099-4300/17/7/4563
In this work, we present the generalization of some thermodynamic properties of the black body radiation (BBR) towards an n-dimensional Euclidean space. For this case, the Planck function and the Stefan–Boltzmann law have already been given by Landsberg and de Vos, with some adjustments by Menon and Agrawal. However, since then, not much more has been done on this subject, and we believe there are some relevant aspects yet to explore. In addition to the results previously found, we calculate the thermodynamic potentials, the efficiency of the Carnot engine, the law for adiabatic processes and the heat capacity at constant volume. There is a region in which an interesting behavior of the thermodynamic potentials arises: maxima and minima appear for the n-dimensional BBR system at very high temperatures and low dimensionality, suggesting a possible application to cosmology. Finally, we propose that an optimality criterion in a thermodynamic framework could be related to the 3-dimensional nature of the universe.Entropy2015-06-29177Article10.3390/e17074563456345811099-43002015-06-29doi: 10.3390/e17074563Julian Gonzalez-AyalaJennifer Perez-OregonRubén CorderoFernando Angulo-Brown<![CDATA[Entropy, Vol. 17, Pages 4547-4562: Noiseless Linear Amplifiers in Entanglement-Based Continuous-Variable Quantum Key Distribution]]>
http://www.mdpi.com/1099-4300/17/7/4547
We propose a method to improve the performance of two entanglement-based continuous-variable quantum key distribution protocols using noiseless linear amplifiers. The two entanglement-based schemes consist of an entanglement distribution protocol with an untrusted source and an entanglement swapping protocol with an untrusted relay. Simulation results show that the noiseless linear amplifiers can improve the performance of these two protocols, in terms of maximal transmission distances, when we consider small amounts of entanglement, as typical in realistic setups.Entropy2015-06-26177Article10.3390/e17074547454745621099-43002015-06-26doi: 10.3390/e17074547Yichen ZhangZhengyu LiChristian WeedbrookKevin MarshallStefano PirandolaSong YuHong Guo<![CDATA[Entropy, Vol. 17, Pages 4533-4546: Reliability Analysis Based on a Jump Diffusion Model with Two Wiener Processes for Cloud Computing with Big Data]]>
http://www.mdpi.com/1099-4300/17/7/4533
At present, many cloud services are managed using open source software, such as OpenStack and Eucalyptus, because of the unified management of data, cost reduction, quick delivery and work savings. The operation phase of cloud computing has unique features, such as the provisioning processes, the network-based operation and the diversity of data, because it changes depending on many external factors. We propose a jump diffusion model with two-dimensional Wiener processes in order to consider the interesting aspects of network traffic and big data on cloud computing. In particular, we assess the stability of cloud software by using the sample paths obtained from the jump diffusion model with two-dimensional Wiener processes. Moreover, we discuss the optimal maintenance problem based on the proposed jump diffusion model. Furthermore, we analyze actual data to show numerical examples of dependability optimization based on the software maintenance cost considering big data on cloud computing.Entropy2015-06-26177Article10.3390/e17074533453345461099-43002015-06-26doi: 10.3390/e17074533Yoshinobu TamuraShigeru Yamada<![CDATA[Entropy, Vol. 17, Pages 4519-4532: Estimating Portfolio Value at Risk in the Electricity Markets Using an Entropy Optimized BEMD Approach]]>
http://www.mdpi.com/1099-4300/17/7/4519
In this paper, we propose a new entropy-optimized bivariate empirical mode decomposition (BEMD)-based model for estimating portfolio value at risk (PVaR). It reveals and analyzes the different components of the price fluctuation. These components are decomposed and distinguished, by their different behavioral patterns and fluctuation ranges, using the BEMD model. Entropy theory is introduced for the identification of the model parameters during the modeling process. The decomposed bivariate data components are calculated with DCC-GARCH models. Empirical studies suggest that the proposed model outperforms the benchmark multivariate exponential weighted moving average (MEWMA) and DCC-GARCH models in terms of conventional out-of-sample performance evaluation criteria for model accuracy.Entropy2015-06-26177Article10.3390/e17074519451945321099-43002015-06-26doi: 10.3390/e17074519Yingchao ZouLean YuKaijian He<![CDATA[Entropy, Vol. 17, Pages 4500-4518: Clausius’ Disgregation: A Conceptual Relic that Sheds Light on the Second Law]]>
http://www.mdpi.com/1099-4300/17/7/4500
The present work analyzes the cognitive process that led Clausius towards the translation of the Second Law of Thermodynamics into mathematical expressions. We show that Clausius’ original formal expression of the Second Law was achieved by making extensive use of the concept of disgregation, a quantity which has subsequently disappeared from the thermodynamic language. Our analysis demonstrates that disgregation stands as a crucial logical step of this process and sheds light on the comprehension of this fundamental relation. The introduction of entropy—which occurred three years after the first formalization of the Second Law—was aimed at making the Second Law exploitable in practical contexts. The reasons for the disappearance of disgregation, as well as of other “pre-modern” quantities, from the thermodynamic language are discussed.Entropy2015-06-25177Article10.3390/e17074500450045181099-43002015-06-25doi: 10.3390/e17074500Emilio PellegrinoElena GhibaudiLuigi Cerruti<![CDATA[Entropy, Vol. 17, Pages 4485-4499: On Monotone Embedding in Information Geometry]]>
http://www.mdpi.com/1099-4300/17/7/4485
A paper was published (Harsha and Subrahamanian Moosath, 2014) in which the authors claimed to have discovered an extension to Amari's \(\alpha\)-geometry through a general monotone embedding function. It will be pointed out here that this so-called \((F, G)\)-geometry (which includes \(F\)-geometry as a special case) is identical to Zhang's (2004) extension to the \(\alpha\)-geometry, where the pair of monotone embedding functions was denoted \(\rho\) and \(\tau\) instead of the \(F\) and \(H\) used in Harsha and Subrahamanian Moosath (2014). Their weighting function \(G\) for the Riemannian metric appears cosmetically due to a rewrite of the score function in log-representation as opposed to the \((\rho, \tau)\)-representation in Zhang (2004). It is further shown here that the resulting metric and \(\alpha\)-connections obtained by Zhang (2004) through arbitrary monotone embeddings are a unique extension of the \(\alpha\)-geometric structure. As a special case, Naudts' (2004) \(\phi\)-logarithm embedding (using the so-called \(\log_\phi\) function) is recovered with the identification \(\rho=\phi, \, \tau=\log_\phi\), with the \(\phi\)-exponential \(\exp_\phi\) given by the associated convex function linking the two representations.Entropy2015-06-25177Article10.3390/e17074485448544991099-43002015-06-25doi: 10.3390/e17074485Jun Zhang<![CDATA[Entropy, Vol. 17, Pages 4454-4484: Modeling Soil Moisture Profiles in Irrigated Fields by the Principle of Maximum Entropy]]>
http://www.mdpi.com/1099-4300/17/6/4454
Vertical soil moisture profiles based on the principle of maximum entropy (POME) were validated using field and model data and applied to guide an irrigation cycle over a maize field in north central Alabama (USA). The results demonstrate that a simple two-constraint entropy model under the assumption of a uniform initial soil moisture distribution can simulate most soil moisture profiles that occur in the particular soil and climate regime that prevails in the study area. The results of the irrigation simulation demonstrated that the POME model produced a very efficient irrigation strategy with minimal losses (about 1.9% of total applied water). However, the results for finely-textured (silty clay) soils were problematic in that some plant stress did develop due to insufficient applied water. Soil moisture states in these soils fell to around 31% of available moisture content, but only on the last day of the drying side of the irrigation cycle. Overall, the POME approach showed promise as a general strategy to guide irrigation in humid environments, such as the Southeastern United States.Entropy2015-06-23176Article10.3390/e17064454445444841099-43002015-06-23doi: 10.3390/e17064454Vikalp MishraWalter EllenburgOsama Al-HamdanJosh BruceJames Cruise<![CDATA[Entropy, Vol. 17, Pages 4439-4453: Analysis of the Keller–Segel Model with a Fractional Derivative without Singular Kernel]]>
http://www.mdpi.com/1099-4300/17/6/4439
Using some investigations based on information theory, the model proposed by Keller and Segel was extended to the concept of fractional derivative using the derivative with fractional order without singular kernel recently proposed by Caputo and Fabrizio. We present in detail the existence of the coupled-solutions using the fixed-point theorem. A detailed analysis of the uniqueness of the coupled-solutions is also presented. Using an iterative approach, we derive special coupled-solutions of the modified system and we present some numerical simulations to see the effect of the fractional order.Entropy2015-06-23176Article10.3390/e17064439443944531099-43002015-06-23doi: 10.3390/e17064439Abdon AtanganaBadr Alkahtani<![CDATA[Entropy, Vol. 17, Pages 4413-4438: Detecting Chronotaxic Systems from Single-Variable Time Series with Separable Amplitude and Phase]]>
http://www.mdpi.com/1099-4300/17/6/4413
The recent introduction of chronotaxic systems provides the means to describe nonautonomous systems with stable yet time-varying frequencies which are resistant to continuous external perturbations. This approach facilitates realistic characterization of the oscillations observed in living systems, including the observation of transitions in dynamics which were not considered previously. The novelty of this approach necessitated the development of a new set of methods for the inference of the dynamics and interactions present in chronotaxic systems. These methods, based on Bayesian inference and detrended fluctuation analysis, can identify chronotaxicity in phase dynamics extracted from a single time series. Here, they are applied to numerical examples and real experimental electroencephalogram (EEG) data. We also review the current methods, including their assumptions and limitations, elaborate on their implementation, and discuss future perspectives.Entropy2015-06-23176Article10.3390/e17064413441344381099-43002015-06-23doi: 10.3390/e17064413Gemma LancasterPhilip ClemsonYevhen SuprunenkoTomislav StankovskiAneta Stefanovska<![CDATA[Entropy, Vol. 17, Pages 4364-4412: Intransitivity in Theory and in the Real World]]>
http://www.mdpi.com/1099-4300/17/6/4364
This work considers reasons for and implications of discarding the assumption of transitivity—the fundamental postulate in the utility theory of von Neumann and Morgenstern, the adiabatic accessibility principle of Caratheodory and most other theories related to preferences or competition. The examples of intransitivity are drawn from different fields, such as law, biology and economics. This work is intended as a common platform that allows us to discuss intransitivity in the context of different disciplines. The basic concepts and terms that are needed for consistent treatment of intransitivity in various applications are presented and analysed in a unified manner. The analysis points out conditions that necessitate appearance of intransitivity, such as multiplicity of preference criteria and imperfect (i.e., approximate) discrimination of different cases. The present work observes that with increasing presence and strength of intransitivity, thermodynamics gradually fades away leaving space for more general kinetic considerations. Intransitivity in competitive systems is linked to complex phenomena that would be difficult or impossible to explain on the basis of transitive assumptions. Human preferences that seem irrational from the perspective of the conventional utility theory, become perfectly logical in the intransitive and relativistic framework suggested here. The example of competitive simulations for the risk/benefit dilemma demonstrates the significance of intransitivity in cyclic behaviour and abrupt changes in the system. The evolutionary intransitivity parameter, which is introduced in the Appendix, is a general measure of intransitivity, which is particularly useful in evolving competitive systems.Entropy2015-06-19176Article10.3390/e17064364436444121099-43002015-06-19doi: 10.3390/e17064364Alexander Klimenko<![CDATA[Entropy, Vol. 17, Pages 4323-4363: Information Geometry Formalism for the Spatially Homogeneous Boltzmann Equation]]>
http://www.mdpi.com/1099-4300/17/6/4323
Information Geometry generalizes to infinite dimension by modeling the tangent space of the relevant manifold of probability densities with exponential Orlicz spaces. We review here several properties of the exponential manifold on a suitable set Ɛ of mutually absolutely continuous densities. We study in particular the fine properties of the Kullback-Leibler divergence in this context. We also show that this setting is well-suited to the study of the spatially homogeneous Boltzmann equation if Ɛ is a set of positive densities with finite relative entropy with respect to the Maxwell density. More precisely, we analyze the Boltzmann operator in the geometric setting from the point of view of its Maxwell weak form, as a composition of elementary operations in the exponential manifold, namely tensor product, conditioning and marginalization, and we prove the basic facts, i.e., the H-theorem, in a geometric way. We also illustrate the robustness of our method by discussing, besides the Kullback-Leibler divergence, the Hyvärinen divergence. This requires us to generalize our approach to Orlicz–Sobolev spaces to include derivatives.Entropy2015-06-19176Article10.3390/e17064323432343631099-43002015-06-19doi: 10.3390/e17064323Bertrand LodsGiovanni Pistone<![CDATA[Entropy, Vol. 17, Pages 4293-4322: Concurrence Measurement for the Two-Qubit Optical and Atomic States]]>
http://www.mdpi.com/1099-4300/17/6/4293
Concurrence provides us an effective approach to quantify entanglement, which is quite important in quantum information processing applications. In this paper, we mainly review some direct concurrence measurement protocols for two-qubit optical or atomic systems. We first introduce the concept of concurrence for a two-qubit system. Second, we explain the approaches to concurrence measurement in both linear and nonlinear optical systems. Third, we introduce some protocols for measuring the concurrence of atomic entanglement systems.Entropy2015-06-19176Review10.3390/e17064293429343221099-43002015-06-19doi: 10.3390/e17064293Lan ZhouYu-Bo Sheng<![CDATA[Entropy, Vol. 17, Pages 4271-4292: A Hybrid Physical and Maximum-Entropy Landslide Susceptibility Model]]>
http://www.mdpi.com/1099-4300/17/6/4271
The clear need for accurate landslide susceptibility mapping has led to multiple approaches. Physical models are easily interpreted and have high predictive capabilities but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical methods can include other factors influencing slope stability such as distance to roads, but rely on good landslide inventories. The maximum entropy (MaxEnt) model has been widely and successfully used in species distribution mapping, because data on absence are often uncertain. Similarly, knowledge about the absence of landslides is often limited due to mapping scale or methodology. In this paper a hybrid approach is described that combines the physically-based landslide susceptibility model “Stability INdex MAPping” (SINMAP) with MaxEnt. This method is tested in a coastal watershed in Pacifica, CA, USA, with a well-documented landslide history including 3 inventories of 154 scars on 1941 imagery, 142 in 1975, and 253 in 1983. Results indicate that SINMAP alone overestimated susceptibility due to insufficient data on root cohesion. Models were compared using the SINMAP stability index (SI) or slope alone, and SI or slope in combination with other environmental factors: curvature, a 50-m trail buffer, vegetation, and geology. For 1941 and 1975, using slope alone was similar to using SI alone; however, in 1983 SI alone produces an area under the receiver operating characteristic curve (AUC) of 0.785, compared with 0.749 for slope alone. In maximum-entropy models created using all environmental factors, the stability index from SINMAP represented the greatest contributions in all three years (1941: 48.1%; 1975: 35.3%; and 1983: 48%), with AUC of 0.795, 0.822, and 0.859, respectively; however, using slope instead of SI created similar overall AUC values, likely due to the combined effect with plan curvature indicating focused hydrologic inputs and vegetation identifying the effect of root cohesion.
The combined approach, using either stability index or slope, highlights the importance of additional environmental variables in modeling landslide initiation.Entropy2015-06-19176Article10.3390/e17064271427142921099-43002015-06-19doi: 10.3390/e17064271Jerry DavisLeonhard Blesius<![CDATA[Entropy, Vol. 17, Pages 4255-4270: New Hyperbolic Function Solutions for Some Nonlinear Partial Differential Equation Arising in Mathematical Physics]]>
http://www.mdpi.com/1099-4300/17/6/4255
In this study, we investigate some new analytical solutions to the (1 + 1)-dimensional nonlinear Dispersive Modified Benjamin–Bona–Mahony equation and the (2 + 1)-dimensional cubic Klein–Gordon equation by using the generalized Kudryashov method. After presenting the general properties of the generalized Kudryashov method in Section 2, we apply the method to these problems in Section 3 to obtain some new analytical solutions, such as rational function solutions, exponential function solutions and hyperbolic function solutions. Afterwards, we draw two- and three-dimensional surfaces of the analytical solutions by using Wolfram Mathematica 9.Entropy2015-06-19176Article10.3390/e17064255425542701099-43002015-06-19doi: 10.3390/e17064255Haci BaskonusHasan Bulut