Entropy
http://www.mdpi.com/journal/entropy
Latest open access articles published in Entropy at http://www.mdpi.com/journal/entropy

Entropy, Vol. 18, Pages 174: Quantum Errors and Disturbances: Response to Busch, Lahti and Werner
http://www.mdpi.com/1099-4300/18/5/174
Busch, Lahti and Werner (BLW) have recently criticized the operator approach to the description of quantum errors and disturbances. Their criticisms are justified to the extent that the physical meaning of the operator definitions has not hitherto been adequately explained. We rectify that omission. We then examine BLW’s criticisms in the light of our analysis. We argue that, although the approach BLW favour (based on the Wasserstein two-deviation) has its uses, there are important physical situations where an operator approach is preferable. We also discuss why the error-disturbance relation is still giving rise to controversies almost a century after Heisenberg first stated his microscope argument. We argue that the source of the difficulties is the problem of interpretation, which is not so wholly disconnected from experimental practicalities as is sometimes supposed.

Entropy 2016, 18(5), 174 (Article); ISSN 1099-4300; doi: 10.3390/e18050174; published 2016-05-06. Author: David Appleby.

Entropy, Vol. 18, Pages 173: Magnetically-Driven Quantum Heat Engines: The Quasi-Static Limit of Their Efficiency
http://www.mdpi.com/1099-4300/18/5/173
The concept of a quantum heat engine (QHEN) has been discussed in the literature, not only due to its intrinsic scientific interest, but also as an alternative to efficiently recover, on a nanoscale device, thermal energy in the form of useful work. The quantum character of a QHEN relies, for instance, on the fact that any of its intermediate states is determined by a density matrix operator. In particular, this matrix can represent a mixed state. For a classical heat engine, a theoretical upper bound for its efficiency is obtained by analyzing its quasi-static operation along a cycle drawn by a sequence of quasi-equilibrium states. A similar analysis can be carried out for a quantum engine, where quasi-static processes are driven by the evolution of ensemble-averaged observables, via variation of the corresponding operators or of the density matrix itself with respect to a tunable physical parameter. We recently proposed two new conceptual designs for a magnetically-driven quantum engine, where the tunable parameter is the intensity of an external magnetic field. In this article, we present the general quantum thermodynamics formalism developed to analyze this type of QHEN, and we apply it to describe the theoretical efficiency of two different practical implementations of this concept: an array of semiconductor quantum dots and an ensemble of graphene flakes subjected to mechanical tension.

Entropy 2016, 18(5), 173 (Article); ISSN 1099-4300; doi: 10.3390/e18050173; published 2016-05-06. Authors: Enrique Muñoz, Francisco Peña, Alejandro González.

Entropy, Vol. 18, Pages 172: Stochastic Resonance, Self-Organization and Information Dynamics in Multistable Systems
http://www.mdpi.com/1099-4300/18/5/172
A class of complex self-organizing systems subjected to fluctuations of environmental or intrinsic origin and to nonequilibrium constraints in the form of an external periodic forcing is analyzed from the standpoint of information theory. Conditions under which the response of information entropy and related quantities to the nonequilibrium constraint can be optimized via a stochastic resonance-type mechanism are identified, and the role of key parameters is assessed.

Entropy 2016, 18(5), 172 (Article); ISSN 1099-4300; doi: 10.3390/e18050172; published 2016-05-04. Authors: Grégoire Nicolis, Catherine Nicolis.

Entropy, Vol. 18, Pages 170: Forecasting Energy Value at Risk Using Multiscale Dependence Based Methodology
http://www.mdpi.com/1099-4300/18/5/170
In this paper, we propose a multiscale dependence-based methodology to analyze the dependence structure and to estimate the downside portfolio risk measures in the energy markets. More specifically, under this methodology, we formulate a new bivariate Empirical Mode Decomposition (EMD) copula-based approach to analyze and model the multiscale dependence structure in the energy markets. The proposed model constructs the copula-based dependence structure formulation in the Bivariate Empirical Mode Decomposition (BEMD)-based multiscale domain. Results from the empirical studies using typical Australian electricity daily prices show that there exists a multiscale dependence structure between different regional markets across different scales. The proposed model, taking into account the multiscale dependence structure, demonstrates statistically significant improvements in both accuracy and reliability measures.

Entropy 2016, 18(5), 170 (Article); ISSN 1099-4300; doi: 10.3390/e18050170; published 2016-05-04. Authors: Kaijian He, Rui Zha, Yanhui Chen, Kin Lai.

Entropy, Vol. 18, Pages 169: Pressure and Compressibility of Conformal Field Theories from the AdS/CFT Correspondence
http://www.mdpi.com/1099-4300/18/5/169
The equation of state associated with N = 4 supersymmetric Yang–Mills in four dimensions, for SU(N) in the large N limit, is investigated using the AdS/CFT correspondence. An asymptotically AdS black hole on the gravity side provides a thermal background for the Yang–Mills theory on the boundary in which the cosmological constant is equivalent to a volume. The thermodynamic variable conjugate to the cosmological constant is a pressure, and the P–V diagram of the quark-gluon plasma is studied. It is known that there is a critical point where the heat capacity diverges, and this is reflected in the isothermal compressibility. Critical exponents are derived and found to be mean field in the large N limit. The same analysis applied to three- and six-dimensional conformal field theories again yields mean field exponents associated with the compressibility at the critical point.

Entropy 2016, 18(5), 169 (Article); ISSN 1099-4300; doi: 10.3390/e18050169; published 2016-05-03. Author: Brian Dolan.

Entropy, Vol. 18, Pages 158: Low-Complexity Iterative Approximated Water-Filling Based Power Allocation in an Ultra-Dense Network
http://www.mdpi.com/1099-4300/18/5/158
It is highly possible that future wireless communication systems will adopt ultra-dense deployment to cope with the increasing demand on spectrum efficiency and energy efficiency. The pivotal issue in achieving the potential benefits of the ultra-dense network is dealing with the complex inter-site interference. In this paper, in order to maximize the spectrum efficiency of the system, we first make a reasonable approximation on the inter-site interference to convert the problem into a convex optimization problem. Then, the Lagrange multiplier method is adopted to obtain the expression of the optimum power allocation, and the water-filling algorithm, one of the classical algorithms of information theory, can be applied to maximize the sum rate or spectrum efficiency of the system. Since the classical iterative water-filling algorithm needs many iterations to converge to the optimal solution, we develop a low-complexity iterative approximated water-filling algorithm. Simulation results show that the developed algorithm achieves performance very close to the classical iterative water-filling based power allocation with only a few iterations under different scenarios, which leads to a significant complexity reduction.

Entropy 2016, 18(5), 158 (Article); ISSN 1099-4300; doi: 10.3390/e18050158; published 2016-05-03. Authors: Xin Su, Bei Liu, Xiaopeng Zhu, Jie Zeng, Chiyang Xiao.

Entropy, Vol. 18, Pages 163: Quantum Private Query Protocol Based on Two Non-Orthogonal States
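The classical water-filling step that the low-complexity scheme in the power-allocation abstract above approximates can be sketched in a few lines; the bisection search, example channel gains, and power budget below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def water_filling(gains, total_power, tol=1e-9):
    """Classical water-filling: spread total_power over channels with SNR
    gains g_i by finding a water level mu such that
    sum(max(0, mu - 1/g_i)) == total_power."""
    inv = 1.0 / np.asarray(gains, dtype=float)   # per-channel "floor" 1/g_i
    lo, hi = inv.min(), inv.max() + total_power  # bracket the water level
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu   # level too high: allocated more than the budget
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# Strong channels get more power; very weak channels may get none.
p = water_filling([2.0, 1.0, 0.5], total_power=1.0)
```

With these gains the water level settles at mu = 1.25, giving allocations [0.75, 0.25, 0]: the weakest channel is shut off entirely, which is exactly the behavior the iterative schemes in the paper must reproduce.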
http://www.mdpi.com/1099-4300/18/5/163
We propose a loss tolerant quantum private query (QPQ) protocol based on two non-orthogonal states and unambiguous state discrimination (USD) measurement. By analyzing a two-point attack by a third party, we find that our protocol has a stronger ability to resist external attacks than G-protocol and Y-protocol. Our protocol requires a smaller number of compressions than G-protocol (Gao et al., Opt. Express 2012, 20, 17411–17420) and Y-protocol (Yan et al., Quantum Inf. Process. 2014, 13, 805–813), which means less post-processing. Our protocol shows better database security and user privacy compared with G-protocol.

Entropy 2016, 18(5), 163 (Article); ISSN 1099-4300; doi: 10.3390/e18050163; published 2016-05-03. Authors: Yan Chang, Shibin Zhang, Guihua Han, Zhiwei Sheng, Lili Yan, Jinxin Xiong.

Entropy, Vol. 18, Pages 168: Scaling-Up Quantum Heat Engines Efficiently via Shortcuts to Adiabaticity
http://www.mdpi.com/1099-4300/18/5/168
The finite-time operation of a quantum heat engine that uses a single particle as a working medium generally increases the output power at the expense of inducing friction that lowers the cycle efficiency. We propose to scale up a quantum heat engine utilizing a many-particle working medium in combination with the use of shortcuts to adiabaticity to boost the nonadiabatic performance by eliminating quantum friction and reducing the cycle time. To this end, we first analyze the finite-time thermodynamics of a quantum Otto cycle implemented with a quantum fluid confined in a time-dependent harmonic trap. We show that nonadiabatic effects can be controlled and tailored to match the adiabatic performance using a variety of shortcuts to adiabaticity. As a result, the nonadiabatic dynamics of the scaled-up many-particle quantum heat engine exhibits no friction, and the cycle can be run at maximum efficiency with a tunable output power. We demonstrate our results with a working medium consisting of particles with inverse-square pairwise interactions that includes non-interacting and hard-core bosons as limiting cases.

Entropy 2016, 18(5), 168 (Article); ISSN 1099-4300; doi: 10.3390/e18050168; published 2016-04-30. Authors: Mathieu Beau, Juan Jaramillo, Adolfo del Campo.

Entropy, Vol. 18, Pages 156: Orthogonal Vector Computations
http://www.mdpi.com/1099-4300/18/5/156
Quantum computation is the suitable orthogonal encoding of possibly holistic functional properties into state vectors, followed by a projective measurement.

Entropy 2016, 18(5), 156 (Article); ISSN 1099-4300; doi: 10.3390/e18050156; published 2016-04-29. Author: Karl Svozil.

Entropy, Vol. 18, Pages 166: Performance of Continuous Quantum Thermal Devices Indirectly Connected to Environments
http://www.mdpi.com/1099-4300/18/5/166
A general quantum thermodynamics network is composed of thermal devices connected to environments through quantum wires. The coupling between the devices and the wires may introduce additional decay channels which modify the system performance with respect to the directly-coupled device. We analyze this effect in a quantum three-level device connected to a heat bath or to a work source through a two-level wire. The steady state heat currents are decomposed into the contributions of the set of simple circuits in the graph representing the master equation. Each circuit is associated with a mechanism in the device operation and the system performance can be described by a small number of circuit representatives of those mechanisms. Although in the limit of weak coupling between the device and the wire the new irreversible contributions can become small, they prevent the system from reaching the Carnot efficiency.

Entropy 2016, 18(5), 166 (Article); ISSN 1099-4300; doi: 10.3390/e18050166; published 2016-04-28. Authors: J. González, Daniel Alonso, José Palao.

Entropy, Vol. 18, Pages 162: From Steam Engines to Chemical Reactions: Gibbs’ Contribution to the Extension of the Second Law
http://www.mdpi.com/1099-4300/18/5/162
The present work analyzes the foundations of Gibbs’ thermodynamic equilibrium theory, with the general aim of understanding how the Second Law—as formulated by Clausius in 1865—has been embodied into Gibbs’ formal system and extended to processes involving chemical reactions. We show that Gibbs’ principle of maximal entropy (and minimal energy) is the implicit expression of Clausius’ Second Law. In addition, by making explicit some implicit passages of Gibbs’ logical path, we provide an original formal justification of Gibbs’ principle. Finally, we provide an analysis of how Gibbs’ principle—conceived for homogeneous isolated systems with fixed chemical composition—has come to be applied to systems entailing chemical transformations.

Entropy 2016, 18(5), 162 (Article); ISSN 1099-4300; doi: 10.3390/e18050162; published 2016-04-28. Authors: Emilio Pellegrino, Luigi Cerruti, Elena Ghibaudi.

Entropy, Vol. 18, Pages 165: A Cost/Speed/Reliability Tradeoff to Erasing
http://www.mdpi.com/1099-4300/18/5/165
We present a Kullback–Leibler (KL) control treatment of the fundamental problem of erasing a bit. We introduce notions of reliability of information storage via a reliability timescale τ_r, and speed of erasing via an erasing timescale τ_e. Our problem formulation captures the tradeoff between speed, reliability, and the KL cost required to erase a bit. We show that rapid erasing of a reliable bit costs at least log 2 − log(1 − e^(−τ_e/τ_r)) > log 2, which goes to (1/2) log(2τ_r/τ_e) when τ_r ≫ τ_e.

Entropy 2016, 18(5), 165 (Article); ISSN 1099-4300; doi: 10.3390/e18050165; published 2016-04-28. Author: Manoj Gopalkrishnan.

Entropy, Vol. 18, Pages 167: Estimation of Tsunami Bore Forces on a Coastal Bridge Using an Extreme Learning Machine
http://www.mdpi.com/1099-4300/18/5/167
This paper proposes a procedure to estimate tsunami wave forces on coastal bridges through a novel method based on the Extreme Learning Machine (ELM) and laboratory experiments. This research included three water depths, ten wave heights, and four bridge models with a variety of girders, providing a total of 120 cases. The research was designed and adapted to estimate tsunami bore forces, including horizontal force, vertical uplift, and overturning moment, on a coastal bridge. The experiments were carried out on 1:40 scaled concrete bridge models in a wave flume with dimensions of 24 m × 1.5 m × 2 m. Two six-axis load cells and four pressure sensors were mounted on the base plate to measure forces. In the numerical procedure, estimation and prediction results of the ELM model were compared with Genetic Programming (GP) and Artificial Neural Network (ANN) models. The experimental results showed that improved predictive accuracy and generalization capability could be achieved by the ELM approach in comparison with GP and ANN. Moreover, the results indicated that the ELM models developed could be used with confidence for further work on formulating a novel model predictive strategy for tsunami bore forces on a coastal bridge. The results also indicated that the new algorithm could produce good generalization performance in most cases and could learn thousands of times faster than conventional popular learning algorithms. Therefore, ELM can be regarded as a promising alternative approach to estimating the tsunami bore forces on a coastal bridge.

Entropy 2016, 18(5), 167 (Article); ISSN 1099-4300; doi: 10.3390/e18050167; published 2016-04-28. Authors: Iman Mazinani, Zubaidah Ismail, Shahaboddin Shamshirband, Ahmad Hashim, Marjan Mansourvar, Erfan Zalnezhad.

Entropy, Vol. 18, Pages 164: Finding Influential Users in Social Media Using Association Rule Learning
http://www.mdpi.com/1099-4300/18/5/164
Influential users play an important role in online social networks since users tend to have an impact on one another. Therefore, the proposed work analyzes users and their behavior in order to identify influential users and predict user participation. Normally, the success of a social media site depends on the activity level of the participating users. For both online social networking sites and individual users, it is of interest to find out whether a topic will be interesting or not. In this article, we propose association rule learning to detect relationships between users. In order to verify the findings, several experiments were executed based on social network analysis, in which the most influential users identified from association rule learning were compared to the results from Degree Centrality and PageRank Centrality. The results clearly indicate that it is possible to identify the most influential users using association rule learning. In addition, the results also indicate a lower execution time compared to state-of-the-art methods.

Entropy 2016, 18(5), 164 (Article); ISSN 1099-4300; doi: 10.3390/e18050164; published 2016-04-27. Authors: Fredrik Erlandsson, Piotr Bródka, Anton Borg, Henric Johnson.

Entropy, Vol. 18, Pages 160: Fractal Information by Means of Harmonic Mappings and Some Physical Implications
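The association-rule step behind the influential-user abstract above can be illustrated with a toy pairwise mining pass over user co-occurrence data; the interaction sets, the `min_support` threshold, and the `pair_rules` helper are hypothetical, not the authors' pipeline:

```python
from itertools import combinations
from collections import Counter

# Hypothetical interaction data: each set holds the users active on one post.
posts = [{"a", "b", "c"}, {"a", "b"}, {"b", "c"}, {"a", "b", "d"}]

def pair_rules(transactions, min_support=0.5):
    """Mine pairwise rules u -> v: keep pairs whose support (fraction of
    transactions containing both) meets min_support, and report the
    confidence count(u, v) / count(u). A toy Apriori-style single pass."""
    n = len(transactions)
    singles = Counter(u for t in transactions for u in t)
    pairs = Counter(p for t in transactions for p in combinations(sorted(t), 2))
    rules = {}
    for (u, v), c in pairs.items():
        if c / n >= min_support:
            rules[(u, v)] = c / singles[u]   # conf(u -> v)
            rules[(v, u)] = c / singles[v]   # conf(v -> u)
    return rules

rules = pair_rules(posts)
```

Here "a" appears in 3 posts, always together with "b", so conf(a → b) = 1.0 while conf(b → a) = 0.75; a user who appears on the left of many high-confidence rules is a candidate influential user.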
http://www.mdpi.com/1099-4300/18/5/160
Considering that the motions of the complex system structural units take place on continuous, but non-differentiable curves, in the frame of the extended scale relativity model (in its Schrödinger-type variant), it is proven that the imaginary part of a scalar potential of velocities can be correlated with the fractal information and, implicitly, with a tensor of “tensions”, which is fundamental in the construction of the constitutive laws of materials. In this way, a specific differential geometry based on a Poincaré-type metric of the Lobachevsky plane (which is invariant to the homographic group of transformations) and also a specific variational principle (whose field equations represent a harmonic map from the usual space into the Lobachevsky plane) are generated. Moreover, fractal information (which is made explicit at any scale resolution) is produced, so that the field variables define a gravitational field. This latter situation is specific to a variational principle in the sense of Matzner–Misner and to certain Ernst-type field equations, the fractal information being contained in the material structure and, thus, in its own space associated with it.

Entropy 2016, 18(5), 160 (Article); ISSN 1099-4300; doi: 10.3390/e18050160; published 2016-04-26. Authors: Maricel Agop, Alina Gavriluţ, Viorel Păun, Dumitru Filipeanu, Florin Luca, Constantin Grecea, Liliana Topliceanu.

Entropy, Vol. 18, Pages 161: Unified Approach to Thermodynamic Optimization of Generic Objective Functions in the Linear Response Regime
http://www.mdpi.com/1099-4300/18/5/161
While many efforts have been devoted to optimizing the power output for a finite-time thermodynamic process, thermodynamic optimization under realistic situations is not necessarily concerned with power alone; rather, it may be of great relevance to optimize generic objective functions that are combinations of power, entropy production, and/or efficiency. One can optimize the objective function for a given model; generally the obtained results are strongly model dependent. However, if the thermodynamic process in question is operated in the linear response regime, then we show in this work that it is possible to adopt a unified approach to optimizing the objective function, thanks to Onsager’s theory of linear irreversible thermodynamics. A dissipation bound is derived, and based on it, the efficiency associated with the optimization problem, which is universal in the linear response regime and irrespective of model details, can be obtained in a unified way. Our results are in good agreement with previous findings. Moreover, we unveil that the ratio between the stopping time of a finite-time process and the optimized duration time plays a pivotal role in determining the corresponding efficiency in the case of linear response.

Entropy 2016, 18(5), 161 (Article); ISSN 1099-4300; doi: 10.3390/e18050161; published 2016-04-26. Author: Yan Wang.

Entropy, Vol. 18, Pages 159: On the Measurement of Randomness (Uncertainty): A More Informative Entropy
http://www.mdpi.com/1099-4300/18/5/159
As a measure of randomness or uncertainty, the Boltzmann–Shannon entropy H has become one of the most widely used summary measures of a variety of attributes (characteristics) in different disciplines. This paper points out an often overlooked limitation of H: comparisons between differences in H-values are not valid. An alternative entropy H_K is introduced as a preferred member of a new family of entropies for which difference comparisons are proved to be valid by satisfying a given value-validity condition. H_K is shown to have the appropriate properties for a randomness (uncertainty) measure, including a close linear relationship to a measurement criterion based on the Euclidean distance between probability distributions. This last point is demonstrated by means of computer generated random distributions. The results are also compared with those of another member of the entropy family. A statistical inference procedure for the entropy H_K is formulated.

Entropy 2016, 18(5), 159 (Article); ISSN 1099-4300; doi: 10.3390/e18050159; published 2016-04-26. Author: Tarald Kvålseth.

Entropy, Vol. 18, Pages 131: Analytical Modeling of MHD Flow over a Permeable Rotating Disk in the Presence of Soret and Dufour Effects: Entropy Analysis
http://www.mdpi.com/1099-4300/18/5/131
The main concern of the present article is to study steady magnetohydrodynamic (MHD) flow, heat transfer and entropy generation past a permeable rotating disk using a semi-numerical/analytical method named the Homotopy Analysis Method (HAM). The results of the present study are compared with numerical quadrature solutions employing a shooting technique, with excellent correlation in special cases. The entropy generation equation is derived as a function of velocity, temperature and concentration gradients. Effects of flow physical parameters, including the magnetic interaction parameter, suction parameter, Prandtl number, Schmidt number, and Soret and Dufour numbers, on the fluid velocity, temperature and concentration distributions as well as the entropy generation number are analysed and discussed in detail. Results show that increasing the Soret number or decreasing the Dufour number tends to decrease the temperature distribution while the concentration distribution is enhanced. The averaged entropy generation number increases with increasing magnetic interaction parameter, suction parameter, Prandtl number, and Schmidt number.

Entropy 2016, 18(5), 131 (Article); ISSN 1099-4300; doi: 10.3390/e18050131; published 2016-04-26. Authors: Navid Freidoonimehr, Mohammad Rashidi, Shirley Abelman, Giulio Lorenzini.

Entropy, Vol. 18, Pages 157: Logical Entropy of Fuzzy Dynamical Systems
http://www.mdpi.com/1099-4300/18/4/157
Recently, logical entropy was suggested by D. Ellerman (2013) as a new information measure. The present paper studies logical entropy and logical mutual information and their properties in a fuzzy probability space. In particular, chain rules for the logical entropy and for the logical mutual information of fuzzy partitions are established. Using the concept of the logical entropy of a fuzzy partition, we define the logical entropy of fuzzy dynamical systems. Finally, it is proved that the logical entropy of fuzzy dynamical systems is invariant under isomorphism of fuzzy dynamical systems.

Entropy 2016, 18(4), 157 (Article); ISSN 1099-4300; doi: 10.3390/e18040157; published 2016-04-23. Authors: Dagmar Markechová, Beloslav Riečan.

Entropy, Vol. 18, Pages 155: Analyses of the Instabilities in the Discretized Diffusion Equations via Information Theory
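For the logical-entropy abstract above: Ellerman's logical entropy of a probability distribution p is h(p) = 1 − Σ p_i², the probability that two independent draws from p land in different blocks. A minimal sketch (the function name is ours):

```python
def logical_entropy(p):
    """Ellerman's logical entropy h(p) = 1 - sum(p_i^2): the chance that
    two independent draws from the distribution p give different outcomes."""
    return 1.0 - sum(pi * pi for pi in p)

h_fair = logical_entropy([0.5, 0.5])        # maximal for two blocks: 0.5
h_sure = logical_entropy([1.0])             # a certain outcome: 0.0
h_skew = logical_entropy([0.9, 0.05, 0.05]) # low: one block dominates
```

Unlike Shannon entropy, h is bounded by 1 and is a true probability, which is the property the fuzzy-partition chain rules in the paper build on.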
http://www.mdpi.com/1099-4300/18/4/155
In a previous investigation (Bigerelle and Iost, 2004), we proposed a physical interpretation of the instability λ = Δt/Δx² > 1/2 of parabolic partial differential equations when solved by finite differences. However, our results were obtained using integration techniques based on erf functions, meaning that no statistical fluctuation was introduced into the mathematical background. In this paper, we show that the diffusive system can be divided into sub-systems onto which a Brownian motion is applied. Monte Carlo simulations are carried out to reproduce the macroscopic diffusive system. It is shown that the amount of information, characterized by the compression ratio of information of the system, is pertinent for quantifying the entropy of the system according to concepts introduced by the authors (Bigerelle and Iost, 2007). Thanks to this mesoscopic discretization, it is proved that information on each sub-cell of the diffusion map decreases with time below the instability threshold λ = 1/2 and increases beyond it, implying an increase in negentropy, i.e., a decrease in entropy, contrary to the second law of thermodynamics.

Entropy 2016, 18(4), 155 (Article); ISSN 1099-4300; doi: 10.3390/e18040155; published 2016-04-21. Authors: Maxence Bigerelle, Hakim Naceur, Alain Iost.

Entropy, Vol. 18, Pages 153: Measuring Electromechanical Coupling in Patients with Coronary Artery Disease and Healthy Subjects
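The λ = Δt/Δx² > 1/2 instability discussed in the diffusion abstract above is easy to reproduce with the explicit (FTCS) finite-difference scheme; the grid size, step count, and spike initial condition below are illustrative:

```python
import numpy as np

def ftcs_diffusion(lam, n=50, steps=200):
    """Run the explicit (FTCS) scheme for u_t = u_xx,
    u[i] <- u[i] + lam * (u[i+1] - 2*u[i] + u[i-1]),  lam = dt/dx^2,
    on a spike initial condition with fixed zero boundaries.
    Returns max |u| after `steps` iterations."""
    u = np.zeros(n)
    u[n // 2] = 1.0  # initial spike
    for _ in range(steps):
        # NumPy evaluates the whole right-hand side before assigning,
        # so this is a proper simultaneous (Jacobi-style) update.
        u[1:-1] = u[1:-1] + lam * (u[2:] - 2 * u[1:-1] + u[:-2])
    return np.abs(u).max()

stable = ftcs_diffusion(0.4)    # lam <= 1/2: the spike diffuses and decays
unstable = ftcs_diffusion(0.6)  # lam > 1/2: sawtooth modes grow without bound
```

The highest spatial mode is amplified by a factor of roughly |1 − 4λ| per step, which crosses 1 exactly at λ = 1/2, the threshold the paper reinterprets thermodynamically.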
http://www.mdpi.com/1099-4300/18/4/153
Coronary artery disease (CAD) is the most common cause of death globally. Detecting CAD noninvasively at an early stage, before clinical symptoms occur, remains challenging. Analysis of the variation of the heartbeat interval (RRI) opens a new avenue for evaluating the functional change of the cardiovascular system, which is accepted to occur at the subclinical stage of CAD. In addition, the systolic time interval (STI) and diastolic time interval (DTI) also show potential. There may be coupling in these electromechanical time series due to their physiological connection. However, to the best of our knowledge, no publication has systematically investigated how the coupling can be measured and how it changes in CAD patients. In this study, we enrolled 39 CAD patients and 36 healthy subjects; for each subject, the electrocardiogram (ECG) and photoplethysmography (PPG) signals were recorded simultaneously for 5 min. The RRI, STI, and DTI series were then constructed. We used linear cross correlation (CC) and coherence function (CF), as well as nonlinear mutual information (MI), cross conditional entropy (XCE), cross sample entropy (XSampEn), and cross fuzzy entropy (XFuzzyEn), to analyse the bivariate RRI-DTI, RRI-STI, and STI-DTI coupling, respectively. Our results suggest that the linear CC and CF generally show no significant difference between the two groups for all three types of bivariate coupling. The MI shows only a weak change in RRI-DTI coupling. By comparison, the three entropy-based coupling measurements show significantly decreased coupling in CAD patients, except XSampEn for RRI-DTI coupling (less significant) and XCE for STI-DTI and RRI-STI coupling (not significant). Additionally, XFuzzyEn performs best, as it remained significant after the Bonferroni correction was applied in our statistical analysis. Our study indicates that the intrinsic electromechanical coupling is most probably nonlinear and is better captured by nonlinear entropy-based measurements, especially XFuzzyEn. Moreover, CAD is accompanied by a loss of electromechanical coupling. Our results suggest that cardiac electromechanical coupling may potentially serve as a noninvasive diagnostic tool for CAD.

Entropy 2016, 18(4), 153 (Article); ISSN 1099-4300; doi: 10.3390/e18040153; published 2016-04-21. Authors: Lizhen Ji, Peng Li, Chengyu Liu, Xinpei Wang, Jing Yang, Changchun Liu.

Entropy, Vol. 18, Pages 134: Heat Transfer Enhancement and Entropy Generation of Nanofluids Laminar Convection in Microchannels with Flow Control Devices
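Of the coupling measures compared in the electromechanical-coupling study above, cross sample entropy admits a compact sketch: count template matches of length m between the two standardized series, then of length m + 1, and take −ln of the ratio. The template length m, tolerance r, and synthetic test series below are illustrative assumptions; the published algorithms may normalize differently:

```python
import numpy as np

def cross_sample_entropy(x, y, m=2, r=0.2):
    """XSampEn(m, r) = -ln(A / B), where B is the number of length-m template
    pairs (one from x, one from y) within Chebyshev distance r, and A the
    corresponding count for length m + 1. Lower values mean stronger coupling."""
    x = (x - np.mean(x)) / np.std(x)   # standardize so r is in SD units
    y = (y - np.mean(y)) / np.std(y)
    def matches(k):
        tx = np.array([x[i:i + k] for i in range(len(x) - k + 1)])
        ty = np.array([y[i:i + k] for i in range(len(y) - k + 1)])
        d = np.max(np.abs(tx[:, None, :] - ty[None, :, :]), axis=2)
        return np.count_nonzero(d <= r)
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
s = rng.standard_normal(200)
coupled = cross_sample_entropy(s, s + 0.05 * rng.standard_normal(200))
uncoupled = cross_sample_entropy(s, rng.standard_normal(200))
```

A strongly coupled pair yields many m + 1 matches for every m match (ratio near 1, entropy near 0), while independent series lose most matches at length m + 1, giving a larger entropy; this is the direction of the group difference the paper reports.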
http://www.mdpi.com/1099-4300/18/4/134
The heat transfer enhancement and entropy generation of Al2O3-water nanofluid laminar convective flow in microchannels with flow control devices (cylinder, rectangle, protrusion, and v-groove) were investigated in this research. The effects of the geometrical structure of the microchannel, nanofluid concentration φ (0%–3%), and Reynolds number Re (50–300) were comparatively studied by means of performance parameters, as well as the limiting streamlines and temperature contours on the modified heated surfaces. The results reveal that the relative Fanning friction factors f/f0 of the microchannels with rectangle and protrusion devices are much larger and smaller than the others, respectively. As the nanofluid concentration increases, f/f0 increases accordingly. For the microchannel with rectangle ribs, there is a transition Re for obtaining the largest heat transfer. The relative Nusselt numbers Nu/Nu0 of the cases with larger nanofluid concentrations are greater. The microchannels with cylinder and v-groove profiles have better heat transfer performance, especially at larger Re, while the microchannel with the protrusion devices is better from an entropy generation minimization perspective. Furthermore, the variation of the relative entropy generation S′/S′0 is influenced not only by the changes of Nu/Nu0 and f/f0, but also by the physical parameters of the working substances.

Entropy 2016, 18(4), 134 (Article); ISSN 1099-4300; doi: 10.3390/e18040134; published 2016-04-21. Authors: Ping Li, Yonghui Xie, Di Zhang, Gongnan Xie.

Entropy, Vol. 18, Pages 154: Energetic and Exergetic Analysis of a Heat Exchanger Integrated in a Solid Biomass-Fuelled Micro-CHP System with an Ericsson Engine
http://www.mdpi.com/1099-4300/18/4/154
A specific heat exchanger has been developed to transfer heat from flue gas to the working fluid (hot air) of the Ericsson engine of a solid biomass-fuelled micro combined heat and power (micro-CHP) system. In this paper, the theoretical and experimental energetic analyses of this heat exchanger are compared. The experimental performances are described considering energetic and exergetic parameters, in particular the effectiveness on both the hot and cold sides. A new exergetic parameter called the exergetic effectiveness is introduced, which allows a comparison between the real and the ideal heat exchanger considering the Second Law of Thermodynamics. A global analysis of exergetic fluxes in the whole micro-CHP system is presented, showing the repartition of the exergy destruction among the components.

Entropy 2016, 18(4), 154 (Article); ISSN 1099-4300; doi: 10.3390/e18040154; published 2016-04-20. Authors: Marie Creyx, Eric Delacourt, Céline Morin, Sylvain Lalot, Bernard Desmet.

Entropy, Vol. 18, Pages 152: An Evolutionary Game Theoretic Approach to Multi-Sector Coordination and Self-Organization
http://www.mdpi.com/1099-4300/18/4/152
Coordination games provide ubiquitous interaction paradigms to frame human behavioral features, such as information transmission, conventions and languages as well as socio-economic processes and institutions. By using a dynamical approach, such as Evolutionary Game Theory (EGT), one is able to follow, in detail, the self-organization process by which a population of individuals coordinates into a given behavior. Real socio-economic scenarios, however, often involve the interaction between multiple co-evolving sectors, with specific options of their own, that call for generalized and more sophisticated mathematical frameworks. In this paper, we explore a general EGT approach to deal with coordination dynamics in which individuals from multiple sectors interact. Starting from a two-sector, consumer/producer scenario, we investigate the effects of including a third co-evolving sector that we call public. We explore the changes in the self-organization process of all sectors, given the feedback that this new sector imparts on the other two.

Entropy 2016, 18(4), 152 (Article); ISSN 1099-4300; doi: 10.3390/e18040152; published 2016-04-20. Authors: Fernando Santos, Sara Encarnação, Francisco Santos, Juval Portugali, Jorge Pacheco.

Entropy, Vol. 18, Pages 151: Computational Principle and Performance Evaluation of Coherent Ising Machine Based on Degenerate Optical Parametric Oscillator Network
http://www.mdpi.com/1099-4300/18/4/151
We present the operational principle of a coherent Ising machine (CIM) based on a degenerate optical parametric oscillator (DOPO) network. A quantum theory of the CIM is formulated, and the computational ability of the CIM is evaluated by numerical simulation based on c-number stochastic differential equations. We also discuss the advanced CIM with quantum measurement-feedback control and various problems which can be solved by the CIM.

Entropy 2016, 18(4), 151 (Article); ISSN 1099-4300; doi: 10.3390/e18040151; published 2016-04-19. Authors: Yoshitaka Haribara, Shoko Utsunomiya, Yoshihisa Yamamoto.

Entropy, Vol. 18, Pages 150: On the Approximate Solutions of Local Fractional Differential Equations with Local Fractional Operators
http://www.mdpi.com/1099-4300/18/4/150
In this paper, we consider the local fractional decomposition method, variational iteration method, and differential transform method for analytic treatment of linear and nonlinear local fractional differential equations, homogeneous or nonhomogeneous. The operators are taken in the local fractional sense. Some examples are given to demonstrate the simplicity and the efficiency of the presented methods.Entropy2016-04-19184Article10.3390/e180401501501099-43002016-04-19doi: 10.3390/e18040150Hossein JafariHassan JassimFairouz TchierDumitru Baleanu<![CDATA[Entropy, Vol. 18, Pages 149: Interference Energy Spectrum of the Infinite Square Well]]>
http://www.mdpi.com/1099-4300/18/4/149
Certain superposition states of the 1-D infinite square well have transient zeros at locations other than the nodes of the eigenstates that comprise them. It is shown that if an infinite potential barrier is suddenly raised at some or all of these zeros, the well can be split into multiple adjacent infinite square wells without affecting the wavefunction. This effects a change of the energy eigenbasis of the state to a basis that does not commute with the original, and a subsequent measurement of the energy now reveals a completely different spectrum, which we call the interference energy spectrum of the state. This name is appropriate because the same splitting procedure applied at the stationary nodes of any eigenstate does not change the measurable energy of the state. Of particular interest, this procedure can result in measurable energies that are greater than the energy of the highest mode in the original superposition, raising questions about the conservation of energy akin to those that have been raised in the study of superoscillations. An analytic derivation is given for the interference spectrum of a given wavefunction Ψ(x, t) with N known zeros located at points s_i = (x_i, t_i). Numerical simulations were used to verify that a barrier can be rapidly raised at a zero of the wavefunction without significantly affecting it. The interpretation of this result with respect to the conservation of energy and the energy-time uncertainty relation is discussed, and the idea of alternate energy eigenbases is fleshed out. The question of whether or not a preferred discrete energy spectrum is an inherent feature of a particle’s quantum state is examined.Entropy2016-04-19184Article10.3390/e180401491491099-43002016-04-19doi: 10.3390/e18040149Mordecai WaegellYakir AharonovTaylor Patti<![CDATA[Entropy, Vol. 
18, Pages 148: Entropy Generation and Heat Transfer Performances of Al2O3-Water Nanofluid Transitional Flow in Rectangular Channels with Dimples and Protrusions]]>
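The splitting argument of the infinite-square-well article above can be checked numerically. The sketch below (grid size and mode cutoff are arbitrary choices) expands full-well eigenstates in the eigenbasis of the left half-well: splitting at the stationary node of n = 2 leaves a single half-well mode with the same energy, while splitting the nodeless ground state spreads it over many half-well modes, all of whose energies 4m²E₁ exceed E₁.

```python
import numpy as np

L = 1.0
x = np.linspace(0.0, L / 2, 20001)    # grid on the left half-well [0, L/2]
dx = x[1] - x[0]

def integrate(f):
    # simple trapezoidal rule on the fixed grid
    return float(np.sum((f[:-1] + f[1:]) * 0.5) * dx)

def psi_full(n):
    """Eigenstate n of the full infinite well of width L."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

def phi_half(m):
    """Eigenstate m of the half-well [0, L/2]; its energy is 4*m^2*E_1."""
    return np.sqrt(4.0 / L) * np.sin(2.0 * m * np.pi * x / L)

def coefficients(n, m_max=50):
    # expansion coefficients of the full-well state in the left half-well basis
    return np.array([integrate(phi_half(m) * psi_full(n))
                     for m in range(1, m_max + 1)])

# Splitting at the stationary node of n = 2: only m = 1 survives on the left.
c2 = coefficients(2)

# Splitting the nodeless ground state: weight spreads over many half-well
# modes -- the "interference energy spectrum" of the state.
c1 = coefficients(1)
print(c2[0] ** 2, np.sum(c1 ** 2))    # ~0.5 each (the other half sits in the right well)
```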
http://www.mdpi.com/1099-4300/18/4/148
As an effective cooling medium, nanofluid has great potential for heat transfer enhancement and entropy generation decrease. Effects of Al2O3-water nanofluid flow on entropy generation and heat transfer performance in a rectangular conventional channel are numerically investigated in this study. Four different volume fractions are considered, and the boundary condition with a constant heat flux is adopted. The flow Reynolds number covers laminar flow, transitional flow and turbulent flow. The influences of the flow regime and nanofluid volume fraction are examined. Furthermore, dimples and protrusions are employed, and their impacts on heat transfer characteristics and entropy generation are acquired. It is found that the average heat transfer entropy generation rate descends and the average friction entropy generation rate rises with an increasing nanofluid volume fraction. The effect of nanofluid on the average heat transfer entropy generation rate declines as the Reynolds number ascends, while the inverse holds for the average friction entropy generation rate. The average wall temperature and temperature uniformity both drop, accompanied by increasing pumping power, with the growth in nanofluid volume fraction. The employment of dimples and protrusions significantly decreases the average entropy generation rate and improves the heat transfer performance. The effect of the dimple case differs greatly from that of the protrusion case.Entropy2016-04-19184Article10.3390/e180401481481099-43002016-04-19doi: 10.3390/e18040148Yonghui XieLu ZhengDi ZhangGongnan Xie<![CDATA[Entropy, Vol. 18, Pages 144: Exploration of Quantum Interference in Document Relevance Judgement Discrepancy]]>
http://www.mdpi.com/1099-4300/18/4/144
Quantum theory has been applied in a number of fields outside physics, e.g., cognitive science and information retrieval (IR). Recently, it has been shown that quantum theory can subsume various key IR models into a single mathematical formalism of Hilbert vector spaces. While a series of quantum-inspired IR models has been proposed, limited effort has been devoted to verifying the existence of the quantum-like phenomenon in real users’ information retrieval processes, from a real user study perspective. In this paper, we aim to explore and model the quantum interference in users’ relevance judgements about documents, caused by the presentation order of the documents. A user study in the context of IR tasks has been carried out. The existence of quantum interference is tested by the violation of the law of total probability and the validity of the order effect. Our main findings are: (1) there is an apparent judging discrepancy across different users and document presentation orders, and the empirical data violate the law of total probability; (2) most search trials recorded in the user study show the existence of the order effect, and the incompatible decision perspectives in the quantum question (QQ) model are valid in some trials. We further explain the judgement discrepancy in more depth, in terms of four effects (comparison, unfamiliarity, attraction and repulsion), and also analyse the dynamics of document relevance judgement in terms of the evolution of the information need subspace.Entropy2016-04-18184Article10.3390/e180401441441099-43002016-04-18doi: 10.3390/e18040144Benyou WangPeng ZhangJingfei LiDawei SongYuexian HouZhenguo Shang<![CDATA[Entropy, Vol. 18, Pages 146: A Quantum Query Expansion Approach for Session Search]]>
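The law-of-total-probability test described above amounts to simple arithmetic. A toy check with invented judgement probabilities (all values assumed, purely for illustration):

```python
# Hypothetical judgement probabilities from an order-effect experiment.
p_observed = 0.70    # P(relevant) measured directly
p_a = 0.5            # each presentation context occurs half the time
p_given_a = 0.80     # P(relevant | context A shown first)
p_given_b = 0.50     # P(relevant | context B shown first)

# Law of total probability: these two should agree classically.
classical = p_a * p_given_a + (1 - p_a) * p_given_b   # = 0.65

# A non-zero residual is the interference term the QQ model accounts for.
interference = p_observed - classical                  # = 0.05
print(classical, interference)
```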
http://www.mdpi.com/1099-4300/18/4/146
Recently, Quantum Theory (QT) has been employed to advance the theory of Information Retrieval (IR). Various analogies between QT and IR have been established. Among them, a typical one is applying the idea of photon polarization in IR tasks, e.g., for document ranking and query expansion. In this paper, we aim to further extend this work by constructing a new superposed state of each document in the information need space, based on which we can incorporate the quantum interference idea in query expansion. We then apply the new quantum query expansion model to session search, which is a typical Web search task. Empirical evaluation on the large-scale Clueweb12 dataset has shown that the proposed model is effective in the session search tasks, demonstrating the potential of developing novel and effective IR models based on intuitions and formalisms of QT.Entropy2016-04-18184Article10.3390/e180401461461099-43002016-04-18doi: 10.3390/e18040146Peng ZhangJingfei LiBenyou WangXiaozhao ZhaoDawei SongYuexian HouMassimo Melucci<![CDATA[Entropy, Vol. 18, Pages 145: Training Concept, Evolution Time, and the Maximum Entropy Production Principle]]>
http://www.mdpi.com/1099-4300/18/4/145
The maximum entropy production principle (MEPP) is a type of entropy optimization which demands that complex non-equilibrium systems should organize such that the rate of the entropy production is maximized. Our take on this principle is that to prove or disprove the validity of the MEPP and to test the scope of its applicability, it is necessary to conduct experiments in which the entropy produced per unit time is measured with high precision. Thus, we study electric-field-induced self-assembly in suspensions of carbon nanotubes and realize precise measurements of the entropy production rate (EPR). As a strong voltage is applied, the suspended nanotubes merge together into a conducting cloud which produces Joule heat and, correspondingly, entropy. We introduce two types of EPR, which have qualitatively different significance: the global EPR (g-EPR) and the entropy production rate of the dissipative cloud itself (DC-EPR). The following results are obtained: (1) As the system reaches the maximum of the DC-EPR, it becomes stable because the applied voltage acts as a stabilizing thermodynamic potential; (2) We discover metastable states characterized by high, near-maximum values of the DC-EPR. Under certain conditions, such efficient entropy-producing regimes can only be achieved if the system is allowed to initially evolve under mildly non-equilibrium conditions, namely at a reduced voltage; (3) Without such a “training” period, the system typically is not able to reach the allowed maximum of the DC-EPR if the bias is high; (4) We observe that the DC-EPR maximum is achieved within a time, T_e, the evolution time, which scales as a power-law function of the applied voltage; (5) Finally, we present a clear example in which the g-EPR theoretical maximum can never be achieved. 
Yet, under a wide range of conditions, the system can self-organize and achieve a dissipative regime in which the DC-EPR equals its theoretical maximum.Entropy2016-04-18184Article10.3390/e180401451451099-43002016-04-18doi: 10.3390/e18040145Alexey BezryadinErik Kountz<![CDATA[Entropy, Vol. 18, Pages 147: Numerical Simulation of Williamson Combined Natural and Forced Convective Fluid Flow between Parallel Vertical Walls with Slip Effects and Radiative Heat Transfer in a Porous Medium]]>
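The global entropy production rate of a Joule-heating element, as in the nanotube-cloud experiment above, is simply the dissipated power divided by the bath temperature. A sketch with illustrative (assumed) values:

```python
# Global entropy production rate of a resistive (Joule-heating) element,
# g-EPR = P / T = V * I / T.  All numbers below are illustrative.
V = 20.0        # applied bias, volts (assumed)
I = 0.5e-3      # current through the dissipative cloud, amperes (assumed)
T = 300.0       # bath temperature, kelvin

g_epr = V * I / T      # watts per kelvin
print(g_epr)
```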
http://www.mdpi.com/1099-4300/18/4/147
A numerical study of the slip effects and radiative heat transfer on a steady-state, fully developed Williamson flow of an incompressible fluid between parallel vertical walls of a microchannel with isothermal walls in a porous medium is performed. The slip effects are considered at both boundaries. A radiative, highly absorbing medium is modeled by the Rosseland approximation. The non-dimensional governing Navier–Stokes and energy coupled partial differential equations, which form a boundary value problem, are solved numerically using a fourth-order Runge–Kutta algorithm by means of a shooting method. Numerical outcomes for the skin friction coefficient and the rate of heat transfer, represented by the local Nusselt number, are presented, while the velocity and temperature profiles are illustrated graphically and analyzed. The effects of the temperature number, Grashof number, thermal radiation parameter, Reynolds number, velocity slip length, Darcy number, and temperature jump on the flow field and temperature field, and their effects on the boundaries, are presented and discussed.Entropy2016-04-18184Article10.3390/e180401471471099-43002016-04-18doi: 10.3390/e18040147Mohammad Abdollahzadeh JamalabadiPayam HooshmandNavid BagheriHamidReza KhakRahMajid Dousti<![CDATA[Entropy, Vol. 18, Pages 142: Reproducibility Probability Estimation and RP-Testing for Some Nonparametric Tests]]>
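The Runge–Kutta/shooting strategy used above can be illustrated on a much simpler boundary value problem. The sketch below solves a toy constant-forcing channel flow (not the paper's Williamson equations): the unknown initial slope is found by bisection so that the far-wall no-slip condition u(1) = 0 is met.

```python
import numpy as np

G = 2.0  # constant forcing; exact solution is u(y) = G * y * (1 - y) / 2

def rhs(y, state):
    # first-order system for u'' = -G
    u, du = state
    return np.array([du, -G])

def rk4_solve(slope0, n=100):
    """Integrate from y = 0 to 1 with initial slope u'(0) = slope0; return u(1)."""
    h = 1.0 / n
    state = np.array([0.0, slope0])
    y = 0.0
    for _ in range(n):
        k1 = rhs(y, state)
        k2 = rhs(y + h / 2, state + h / 2 * k1)
        k3 = rhs(y + h / 2, state + h / 2 * k2)
        k4 = rhs(y + h, state + h * k3)
        state = state + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        y += h
    return state[0]

# Shooting: find the initial slope so that u(1) = 0, by bisection.
lo, hi = 0.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if rk4_solve(mid) > 0.0:
        hi = mid
    else:
        lo = mid
slope = 0.5 * (lo + hi)
print(slope)   # exact value is G / 2 = 1.0
```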
http://www.mdpi.com/1099-4300/18/4/142
Several reproducibility probability (RP)-estimators for the binomial, sign, Wilcoxon signed rank and Kendall tests are studied. Their behavior in terms of MSE is investigated, as well as their performances for RP-testing. Two classes of estimators are considered: the semi-parametric one, where RP-estimators are derived from the expression of the exact or approximated power function, and the non-parametric one, whose RP-estimators are obtained on the basis of the nonparametric plug-in principle. In order to evaluate the precision of RP-estimators for each test, the MSE is computed, and the best overall estimator turns out to belong to the semi-parametric class. Then, in order to evaluate the RP-testing performances provided by RP estimators for each test, the disagreement between the RP-testing decision rule, i.e., “accept H0 if the RP-estimate is lower than, or equal to, 1/2, and reject H0 otherwise”, and the classical one (based on the critical value or on the p-value) is obtained. It is shown that the RP-based testing decision for some semi-parametric RP estimators exactly replicates the classical one. In many situations, the RP-estimator replicating the classical decision rule also provides the best MSE.Entropy2016-04-16184Article10.3390/e180401421421099-43002016-04-16doi: 10.3390/e18040142Lucio De CapitaniDaniele De Martini<![CDATA[Entropy, Vol. 18, Pages 143: Mixed Diffusive-Convective Relaxation of a Warm Beam of Energetic Particles in Cold Plasma]]>
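The RP-testing equivalence stated above is easy to see for a one-sided Z-test, where the semi-parametric RP-estimate is the power function evaluated at the observed statistic. A sketch (the Z-test here stands in for the nonparametric tests of the paper):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

z_alpha = 1.6448536269514722   # critical value of a one-sided 5% test

def rp_estimate(z_obs):
    """Semi-parametric RP-estimate: plug z_obs into the power function."""
    return phi(z_obs - z_alpha)

# RP-testing ("reject H0 iff the RP-estimate exceeds 1/2") replicates the
# classical rule "reject iff z_obs > z_alpha":
for z_obs in (0.5, 1.2, 1.65, 2.3):
    assert (rp_estimate(z_obs) > 0.5) == (z_obs > z_alpha)
print("decision rules agree")
```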
http://www.mdpi.com/1099-4300/18/4/143
This work addresses the features of fast particle transport in the bump-on-tail problem for varying widths of the fluctuation spectrum, in view of possible applications to studies of energetic particle transport in fusion plasmas. Our analysis is built around the idea that strongly-shaped beams do not relax through diffusion only and that there exists an intermediate time scale where the relaxations are convective (ballistic-like). We cast this idea in the form of a self-consistent nonlinear dynamical model, which extends the classic equations of the quasi-linear theory to “broad” beams with internal structure. We also present numerical simulation results of the relaxation of a broad beam of energetic particles in cold plasma. These generally demonstrate the mixed diffusive-convective features of supra-thermal particle transport, essentially depending on nonlinear wave-particle interactions and phase-space structures. Taking into account the modes of the stable linear spectrum is crucial for the self-consistent evolution of the distribution function and the fluctuation intensity spectrum.Entropy2016-04-16184Article10.3390/e180401431431099-43002016-04-16doi: 10.3390/e18040143Nakia CarlevaroAlexander MilovanovMatteo FalessiGiovanni MontaniDavide TerzaniFulvio Zonca<![CDATA[Entropy, Vol. 18, Pages 139: Progress in Finite Time Thermodynamic Studies for Internal Combustion Engine Cycles]]>
http://www.mdpi.com/1099-4300/18/4/139
On the basis of introducing the origin and development of finite time thermodynamics (FTT), this paper reviews the progress in FTT optimization for internal combustion engine (ICE) cycles from the following four aspects: the studies on the optimum performances of air standard endoreversible (with only the irreversibility of heat resistance) and irreversible ICE cycles, including Otto, Diesel, Atkinson, Brayton, Dual, Miller, Porous Medium and Universal cycles with constant specific heats, variable specific heats, and variable specific heat ratios of the conventional and quantum working fluids (WFs); the studies on the optimum piston motion (OPM) trajectories of ICE cycles, including Otto and Diesel cycles with Newtonian and other heat transfer laws; the studies on the performance limits of ICE cycles with non-uniform WF with Newtonian and other heat transfer laws; as well as the studies on the performance simulation of ICE cycles. In these studies, the optimization objectives include work, power, power density, efficiency, entropy generation rate, ecological function, and so on. Further directions for these studies are explored.Entropy2016-04-15184Review10.3390/e180401391391099-43002016-04-15doi: 10.3390/e18040139Yanlin GeLingen ChenFengrui Sun<![CDATA[Entropy, Vol. 18, Pages 141: Testing a Quantum Heat Pump with a Two-Level Spin]]>
http://www.mdpi.com/1099-4300/18/4/141
Once in its non-equilibrium steady state, a nanoscale system coupled to several heat baths may be thought of as a “quantum heat pump”. Depending on the direction of its stationary heat flows, it may function as, e.g., a refrigerator or a heat transformer. These continuous heat devices can be arbitrarily complex multipartite systems, and yet, their working principle is always the same: they are made up of several elementary three-level stages operating in parallel. As a result, it is possible to devise external “black-box” testing strategies to learn about their functionality and performance regardless of any internal details. In particular, one such heat pump can be tested by coupling a two-level spin to one of its “contact transitions”. The steady state of this external probe contains information about the presence of heat leaks and internal dissipation in the device and, also, about the direction of its steady-state heat currents. Provided that the irreversibility of the heat pump is low, one can further estimate its coefficient of performance. These techniques may find applications in the emerging field of quantum thermal engineering, as they facilitate the diagnosis and design optimization of complex thermodynamic cycles.Entropy2016-04-15184Article10.3390/e180401411411099-43002016-04-15doi: 10.3390/e18040141Luis CorreaMohammad Mehboudi<![CDATA[Entropy, Vol. 18, Pages 140: Open Markov Processes: A Compositional Perspective on Non-Equilibrium Steady States in Biology]]>
http://www.mdpi.com/1099-4300/18/4/140
In recent work, Baez, Fong and the author introduced a framework for describing Markov processes equipped with a detailed balanced equilibrium as open systems of a certain type. These “open Markov processes” serve as the building blocks for more complicated processes. In this paper, we describe the potential application of this framework in the modeling of biological systems as open systems maintained away from equilibrium. We show that non-equilibrium steady states emerge in open systems of this type, even when the rates of the underlying process are such that a detailed balanced equilibrium is permitted. It is shown that these non-equilibrium steady states minimize a quadratic form which we call “dissipation”. In some circumstances, the dissipation is approximately equal to the rate of change of relative entropy plus a correction term. On the other hand, Prigogine’s principle of minimum entropy production generally fails for non-equilibrium steady states. We use a simple model of membrane transport to illustrate these concepts.Entropy2016-04-15184Article10.3390/e180401401401099-43002016-04-15doi: 10.3390/e18040140Blake Pollard<![CDATA[Entropy, Vol. 18, Pages 137: Operational Complexity of Supplier-Customer Systems Measured by Entropy—Case Studies]]>
http://www.mdpi.com/1099-4300/18/4/137
This paper discusses a unified entropy-based approach for the quantitative measurement of the operational complexity of company supplier-customer relations. Classical Shannon entropy is utilized. Besides this quantification tool, we also explore the relations between Shannon entropy and (c,d)-entropy in more detail. An analytic description of so-called iso-quant curves is given, too. We present five case studies, albeit in an anonymous setting, describing various details of general procedures for measuring the operational complexity of supplier-customer systems. In general, we assume a problem-oriented database exists, which contains detailed records of all product forecasts, orders and deliveries, both in quantity and time, scheduled and realized. Data processing detects important flow variations both in volumes and times, e.g., order vs. forecast, delivery vs. order, and actual vs. scheduled production. The unifying quantity used for entropy computation is the time gap between the actual delivery time and the order issue time, which is nothing else but the lead time in inventory control models. After data consistency checks, histograms and empirical distribution functions are constructed. Finally, the entropy, an information-theoretic measure of supplier-customer operational complexity, is calculated. Basic steps of the algorithm are mentioned briefly, too. Results of supplier-customer system analysis from selected Czech small and medium-sized enterprises (SMEs) are presented in various computational and managerial decision-making details. An enterprise is classified as an SME if it has at most 250 employees and its turnover does not exceed 50 million USD per year, or, alternatively, its balance sheet total does not exceed 43 million USD per year.Entropy2016-04-14184Article10.3390/e180401371371099-43002016-04-14doi: 10.3390/e18040137Ladislav LukášJiří Hofman<![CDATA[Entropy, Vol. 18, Pages 138: The Free Energy Requirements of Biological Organisms; Implications for Evolution]]>
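The final entropy computation step described above reduces to the Shannon entropy of the empirical lead-time distribution. A minimal sketch with invented lead-time data:

```python
import math
from collections import Counter

# Hypothetical lead times (days) between order issue and actual delivery,
# as would be extracted from a supplier-customer database.
lead_times = [5, 5, 6, 7, 5, 8, 6, 5, 7, 5, 6, 9]

counts = Counter(lead_times)
n = len(lead_times)

# Empirical distribution -> Shannon entropy in bits.
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
print(round(entropy, 3))
```

A wider spread of lead times (more operational irregularity) raises the entropy, which is what makes it usable as a complexity measure.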
http://www.mdpi.com/1099-4300/18/4/138
Recent advances in nonequilibrium statistical physics have provided unprecedented insight into the thermodynamics of dynamic processes. The author recently used these advances to extend Landauer’s semi-formal reasoning concerning the thermodynamics of bit erasure, to derive the minimal free energy required to implement an arbitrary computation. Here, I extend this analysis, deriving the minimal free energy required by an organism to run a given (stochastic) map π from its sensor inputs to its actuator outputs. I use this result to calculate the input-output map π of an organism that optimally trades off the free energy needed to run π with the phenotypic fitness that results from implementing π. I end with a general discussion of the limits imposed on the rate of the terrestrial biosphere’s information processing by the flux of sunlight on the Earth.Entropy2016-04-13184Article10.3390/e180401381381099-43002016-04-13doi: 10.3390/e18040138David Wolpert<![CDATA[Entropy, Vol. 18, Pages 105: Generalized Analysis of a Distribution Separation Method]]>
http://www.mdpi.com/1099-4300/18/4/105
Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.Entropy2016-04-13184Article10.3390/e180401051051099-43002016-04-13doi: 10.3390/e18040105Peng ZhangQian YuYuexian HouDawei SongJingfei LiBin Hu<![CDATA[Entropy, Vol. 18, Pages 135: On the Stability of Classical Orbits of the Hydrogen Ground State in Stochastic Electrodynamics]]>
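Under the linear combination assumption, the separation step of DSM is purely algebraic. A toy sketch with assumed term distributions and a known mixing weight (estimating that weight from data is the harder part addressed in the paper):

```python
import numpy as np

# Toy vocabulary-term distributions (assumed for illustration).
relevance_true = np.array([0.5, 0.3, 0.1, 0.1])
irrelevance    = np.array([0.1, 0.1, 0.4, 0.4])
lam = 0.3                                   # mixing weight of irrelevance
mixture = lam * irrelevance + (1 - lam) * relevance_true

def separate(mixture, irrelevance, lam):
    # Linear combination assumption: M = lam * I + (1 - lam) * R,
    # so the relevance distribution follows algebraically.
    r = (mixture - lam * irrelevance) / (1.0 - lam)
    r = np.clip(r, 0.0, None)               # guard against negative estimates
    return r / r.sum()

recovered = separate(mixture, irrelevance, lam)
print(recovered)    # matches relevance_true up to floating point
```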
http://www.mdpi.com/1099-4300/18/4/135
De la Peña (1980) and Puthoff (1987) show that circular orbits in the hydrogen problem of Stochastic Electrodynamics connect to a stable situation, where the electron neither collapses onto the nucleus nor gets expelled from the atom. Although the Cole-Zou (2003) simulations support the stability, our recent numerics always lead to self-ionisation. Here the de la Peña-Puthoff argument is extended to elliptic orbits. For very eccentric orbits with energy close to zero and angular momentum below some not-small value, there is on average a net gain in energy for each revolution, which explains the self-ionisation. Next, a 1/r^2 potential is added, which could stem from a dipolar deformation of the nuclear charge by the electron at its moving position. This shape retains the analytical solvability. When it is sufficiently repulsive, the ground state of this modified hydrogen problem is predicted to be stable. The same conclusions hold for positronium.Entropy2016-04-13184Article10.3390/e180401351351099-43002016-04-13doi: 10.3390/e18040135Theodorus Nieuwenhuizen<![CDATA[Entropy, Vol. 18, Pages 136: Correction: Kay, B.S. Entropy and Quantum Gravity. Entropy 2015, 17, 8174–8186]]>
http://www.mdpi.com/1099-4300/18/4/136
The following corrections should be made to the published paper [1]: First, the paragraph beginning with “One might argue that . . . ” and ending with “. . . increase monotonically with time.”[...]Entropy2016-04-13184Correction10.3390/e180401361361099-43002016-04-13doi: 10.3390/e18040136Bernard Kay<![CDATA[Entropy, Vol. 18, Pages 133: Correction on Liu, X.; Jiang, A.; Xu, N.; Xue, J. Increment Entropy as a Measure of Complexity for Time Series. Entropy 2016, 18, 22]]>
http://www.mdpi.com/1099-4300/18/4/133
The authors wish to make the following correction to their paper [1].[...]Entropy2016-04-13184Correction10.3390/e180401331331099-43002016-04-13doi: 10.3390/e18040133Xiaofeng LiuAimin JiangNing XuJianru Xue<![CDATA[Entropy, Vol. 18, Pages 132: Anti-Icing Superhydrophobic Surfaces: Controlling Entropic Molecular Interactions to Design Novel Icephobic Concrete]]>
http://www.mdpi.com/1099-4300/18/4/132
Tribology involves the study of friction, wear, lubrication, and adhesion, including biomimetic superhydrophobic and icephobic surfaces. The three aspects of icephobicity are low ice adhesion, repulsion of incoming water droplets prior to freezing, and delayed frost formation. Although superhydrophobic surfaces are not always icephobic, the theoretical mechanisms behind icephobicity are similar to the entropically driven hydrophobic interactions. The growth of ice crystals in saturated vapor is partially governed by entropically driven diffusion of water molecules to definite locations, similarly to hydrophobic interactions. The ice crystal formation can be compared to protein folding controlled by hydrophobic forces. Surface topography and surface energy can affect both icephobicity and hydrophobicity. By controlling these properties, a micro/nanostructured icephobic concrete was developed. The concrete showed an ice adhesion strength one order of magnitude lower than regular concrete and could repel incoming water droplets at −5 °C. The icephobic performance of the concrete can be optimized by controlling the sand and polyvinyl alcohol fiber content.Entropy2016-04-12184Article10.3390/e180401321321099-43002016-04-12doi: 10.3390/e18040132Rahul RamachandranMarina KozhukhovaKonstantin SobolevMichael Nosonovsky<![CDATA[Entropy, Vol. 18, Pages 130: An Informed Framework for Training Classifiers from Social Media]]>
http://www.mdpi.com/1099-4300/18/4/130
Extracting information from social media has become a major focus of companies and researchers in recent years. Aside from the study of the social aspects, it has also been found feasible to exploit the collaborative strength of crowds to help solve classical machine learning problems like object recognition. In this work, we focus on the generally underappreciated problem of building effective datasets for training classifiers by automatically assembling data from social media. We detail some of the challenges of this approach and outline a framework that uses expanded search queries to retrieve more qualified data. In particular, we concentrate on collaboratively tagged media on the social platform Flickr, and on the problem of image classification to evaluate our approach. Finally, we describe a novel entropy-based method to incorporate an information-theoretic principle to guide our framework. Experimental validation against well-known public datasets shows the viability of this approach and marks an improvement over the state of the art in terms of simplicity and performance.Entropy2016-04-09184Article10.3390/e180401301301099-43002016-04-09doi: 10.3390/e18040130Dong ChengSami Abdulhak<![CDATA[Entropy, Vol. 18, Pages 129: The Effect of Threshold Values and Weighting Factors on the Association between Entropy Measures and Mortality after Myocardial Infarction in the Cardiac Arrhythmia Suppression Trial (CAST)]]>
http://www.mdpi.com/1099-4300/18/4/129
Heart rate variability (HRV) is a non-invasive measurement based on the intervals between normal heart beats that characterize cardiac autonomic function. Decreased HRV is associated with increased risk of cardiovascular events. Characterizing HRV using only moment statistics fails to capture abnormalities in regulatory function that are important aspects of disease risk. Thus, entropy measures are a promising approach to quantify HRV for risk stratification. The purpose of this study was to investigate this potential for approximate, corrected approximate, sample, fuzzy, and fuzzy measure entropy and their dependence on parameter selection. Recently published parameter sets and further parameter combinations were investigated. Heart rate data were obtained from the “Cardiac Arrhythmia Suppression Trial (CAST) RR Interval Sub-Study Database” (PhysioNet). Corresponding outcomes and clinical data were provided by one of the investigators. The use of previously reported parameter sets on the pre-treatment data did not significantly add to the identification of patients at risk for cardiovascular death on follow-up. After arrhythmia suppression treatment, several parameter sets predicted outcomes for all patients and for patients without coronary artery bypass grafting (CABG). The strongest results were seen using the threshold parameter as a multiple of the data’s standard deviation (r = 0.2·σ). Approximate and sample entropy provided significant hazard ratios for patients without CABG and without diabetes for an entropy-maximizing threshold approximation. Additional parameter combinations did not improve the results for pre-treatment data. 
The results of this study illustrate the influence of parameter selection on entropy measures’ potential for cardiovascular risk stratification and support the potential use of entropy measures in future studies.Entropy2016-04-08184Article10.3390/e180401291291099-43002016-04-08doi: 10.3390/e18040129Christopher MayerMartin BachlerAndreas HolzingerPhyllis SteinSiegfried Wassertheurer<![CDATA[Entropy, Vol. 18, Pages 127: Exergy Analysis of Complex Ship Energy Systems]]>
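As an illustration of the threshold parameter discussed above, sample entropy with r = 0.2·σ (the setting that gave the strongest results) can be sketched as follows. This is a simplified variant of the standard estimator, written for clarity rather than efficiency:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Simplified sample entropy with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(m):
        # Embed the series in m dimensions and count template matches
        # (Chebyshev distance <= r, excluding self-matches).
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        count = 0
        for i in range(len(emb)):
            d = np.max(np.abs(emb - emb[i]), axis=1)
            count += np.sum(d <= r) - 1     # exclude the self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular series has low sample entropy; white noise scores higher.
rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
noisy = rng.standard_normal(400)
se_regular = sample_entropy(regular)
se_noisy = sample_entropy(noisy)
print(se_regular, se_noisy)
```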
http://www.mdpi.com/1099-4300/18/4/127
With multiple primary and secondary energy converters (diesel engines, steam turbines, waste heat recovery (WHR) and oil-fired boilers, etc.) and extensive energy networks (steam, cooling water, exhaust gases, etc.), ships may be considered as complex energy systems. Understanding and optimizing such systems requires advanced holistic energy modeling. This modeling can be done in two ways: The simpler approach focuses on energy flows, and has already been tested, approved and presented; a new, more complicated approach, focusing on energy quality, i.e., exergy, is presented in this paper. Exergy analysis has rarely been applied to ships, and, as a general rule, the shipping industry is not familiar with this tool. This paper tries to fill this gap. We start by giving a short reminder of what exergy is and describe the principles of exergy modeling to explain what kind of results should be expected from such an analysis. We then apply these principles to the analysis of a large two-stroke diesel engine with its cooling and exhaust systems. Simulation results are then presented along with the exergy analysis. Finally, we propose solutions for energy and exergy saving which could be applied to marine engines and ships in general.Entropy2016-04-08184Article10.3390/e180401271271099-43002016-04-08doi: 10.3390/e18040127Pierre MartyJean-François HétetDavid ChaletPhilippe Corrignan<![CDATA[Entropy, Vol. 18, Pages 126: Reply to Jay Lawrence. Comments on Piero Quarati et al. Negentropy in Many-Body Quantum Systems. Entropy 2016, 18, 63]]>
http://www.mdpi.com/1099-4300/18/4/126
The Comments are explicitly related to contents of two published papers: actual [1] and [2].[...]Entropy2016-04-07184Reply10.3390/e180401261261099-43002016-04-07doi: 10.3390/e18040126Piero QuaratiMarcello LissiaAntonio Scarfone<![CDATA[Entropy, Vol. 18, Pages 110: Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families]]>
http://www.mdpi.com/1099-4300/18/4/110
We present a simple computational approach to assigning a measure of complexity and information/entropy to families of natural languages, based on syntactic parameters and the theory of error correcting codes. We associate to each language a binary string of syntactic parameters and to a language family a binary code, with code words given by the binary strings associated to the languages. We then evaluate the code parameters (rate and relative minimum distance) and the position of these parameters with respect to the asymptotic bound of error correcting codes and the Gilbert–Varshamov bound. These bounds are, respectively, related to the Kolmogorov complexity and the Shannon entropy of the code, and this gives us a computationally simple way to obtain estimates on the complexity and information, not of individual languages but of language families. This notion of complexity is related, from the linguistic point of view, to the degree of variability of syntactic parameters across languages belonging to the same (historical) family.Entropy2016-04-07184Article10.3390/e180401101101099-43002016-04-07doi: 10.3390/e18040110Matilde Marcolli<![CDATA[Entropy, Vol. 18, Pages 125: Comments on Piero Quarati et al. Negentropy in Many-Body Quantum Systems. Entropy 2016, 18, 63]]>
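The code parameters named in this abstract (rate and relative minimum distance) are straightforward to compute. A minimal sketch follows; the three "languages" and their five binary parameters are invented for illustration, not real syntactic data from the paper.

```python
from itertools import combinations
from math import log2

def code_parameters(codewords):
    """Rate and relative minimum distance of a (possibly nonlinear) binary
    code whose code words are equal-length 0/1 strings."""
    n = len(codewords[0])                      # block length = number of syntactic parameters
    rate = log2(len(codewords)) / n            # R = log2(#code words) / n
    # minimum Hamming distance over all pairs of code words, normalized by n
    d_min = min(sum(a != b for a, b in zip(u, v))
                for u, v in combinations(codewords, 2))
    return rate, d_min / n

# Three hypothetical "languages", each encoded by five binary parameters
family = ["10110", "10011", "11010"]
R, delta = code_parameters(family)             # here d_min = 2, so delta = 2/5
```

Plotting (R, delta) for a family against the asymptotic and Gilbert–Varshamov bounds then gives the complexity/entropy placement the abstract describes.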
http://www.mdpi.com/1099-4300/18/4/125
The purpose of this note is to express my concerns about the published paper by Quarati et al. (Entropy 2016, 18, 63). It is hoped that these comments will stimulate a constructive discussion of the issues involved.Entropy2016-04-07184Comment10.3390/e180401251251099-43002016-04-07doi: 10.3390/e18040125Jay Lawrence<![CDATA[Entropy, Vol. 18, Pages 123: Entropy Generation on MHD Casson Nanofluid Flow over a Porous Stretching/Shrinking Surface]]>
http://www.mdpi.com/1099-4300/18/4/123
In this article, entropy generation on MHD Casson nanofluid flow over a porous stretching/shrinking surface has been investigated. The influences of nonlinear thermal radiation and chemical reaction have also been taken into account. The governing Casson nanofluid flow problem consists of the momentum equation, the energy equation and the nanoparticle concentration equation. Similarity transformation variables have been used to transform the governing coupled partial differential equations into ordinary differential equations. The resulting highly nonlinear coupled ordinary differential equations have been solved numerically with the help of the successive linearization method (SLM) and the Chebyshev spectral collocation method. The impacts of various pertinent parameters of interest are discussed for the velocity, temperature, concentration and entropy profiles. The expressions for the local Nusselt number and the local Sherwood number are also analyzed and discussed with the help of tables. Furthermore, a comparison with existing results is also made as a special case of our study.Entropy2016-04-06184Article10.3390/e180401231231099-43002016-04-06doi: 10.3390/e18040123Jia QingMuhammad BhattiMunawwar AbbasMohammad RashidiMohamed Ali<![CDATA[Entropy, Vol. 18, Pages 91: Bayesian Genetic Association Test when Secondary Phenotypes Are Available Only in the Case Group]]>
http://www.mdpi.com/1099-4300/18/4/91
In many case-control genetic association studies, a secondary phenotype that may share common genetic factors with disease status can be identified. When information on the secondary phenotype is available only for the case group due to cost and different data sources, fitting a linear regression model that ignores the supplementary phenotype data may provide limited knowledge regarding the genetic association. We set up a joint model and use a Bayesian framework to estimate and test the effect of genetic covariates on disease status, treating the secondary phenotype as an instrumental variable. The application of our proposed procedure is demonstrated through the rheumatoid arthritis data provided by the 16th Genetic Analysis Workshop.Entropy2016-04-06184Article10.3390/e18040091911099-43002016-04-06doi: 10.3390/e18040091Yongku KimMinjung Kwak<![CDATA[Entropy, Vol. 18, Pages 124: Quantum Heat Machines Equivalence, Work Extraction beyond Markovianity, and Strong Coupling via Heat Exchangers]]>
http://www.mdpi.com/1099-4300/18/4/124
Various engine types are thermodynamically equivalent in the quantum limit of small “engine action”. Our previous derivation of the equivalence is restricted to Markovian heat baths and to implicit classical work repository (e.g., laser light in the semi-classical approximation). In this paper, all the components, baths, batteries, and engines, are explicitly taken into account. To neatly treat non-Markovian dynamics, we use mediating particles that function as a heat exchanger. We find that, on top of the previously observed equivalence, there is a higher degree of equivalence that cannot be achieved in the Markovian regime. Next, we focus on the quality of the battery charging process. A condition for positive energy increase and zero entropy increase (work) is given. Moreover, it is shown that, in the strong coupling regime, it is possible to super-charge a battery. With super-charging, the energy of the battery is increased while its entropy is being reduced at the same time.Entropy2016-04-06184Article10.3390/e180401241241099-43002016-04-06doi: 10.3390/e18040124Raam UzdinAmikam LevyRonnie Kosloff<![CDATA[Entropy, Vol. 18, Pages 116: The Effect of Spin Squeezing on the Entanglement Entropy of Kicked Tops]]>
http://www.mdpi.com/1099-4300/18/4/116
In this paper, we investigate the effects of spin squeezing on two-coupled quantum kicked tops, which have been previously shown to exhibit a quantum signature of chaos in terms of entanglement dynamics. Our results show that initial spin squeezing can lead to an enhancement in both the entanglement rate and the asymptotic entanglement for kicked tops when the initial state resides in the regular islands within a mixed classical phase space. On the other hand, we found a reduction in these two quantities if we were to choose the initial state deep inside the chaotic sea. More importantly, we have uncovered that an application of periodic spin squeezing can yield the maximum attainable entanglement entropy, albeit this is achieved at a reduced entanglement rate.Entropy2016-04-02184Article10.3390/e180401161161099-43002016-04-02doi: 10.3390/e18040116Ernest OngLock Chew<![CDATA[Entropy, Vol. 18, Pages 112: A Hybrid EEMD-Based SampEn and SVD for Acoustic Signal Processing and Fault Diagnosis]]>
http://www.mdpi.com/1099-4300/18/4/112
Acoustic signals are an ideal source of diagnosis data thanks to their intrinsic non-directional coverage, sensitivity to incipient defects, and insensitivity to structural resonance characteristics. However, prevailing signal de-noising and feature extraction methods suffer from high computational cost, low signal-to-noise ratio (S/N), and difficulty in extracting the compound acoustic emissions of various failure types. To address these challenges, we propose a hybrid signal processing technique to depict the embedded signal using generally effective features. The ensemble empirical mode decomposition (EEMD) is adopted as the fundamental pre-processor, which is integrated with the sample entropy (SampEn), singular value decomposition (SVD), and statistic feature processing (SFP) methods. The SampEn and SVD are identified as the condition indicators for periodical and irregular signals, respectively. Moreover, such a hybrid module is self-adaptive and robust to different signals, which ensures the generality of its performance. The hybrid signal processor is further integrated with a probabilistic classifier, the pairwise-coupled relevance vector machine (PCRVM), to construct a new fault diagnosis system. Experimental verifications for industrial equipment show that the proposed diagnostic system is superior to prior methods in computational efficiency and in the capability of simultaneously processing non-stationary and nonlinear condition monitoring signals.Entropy2016-04-01184Article10.3390/e180401121121099-43002016-04-01doi: 10.3390/e18040112Zhi-Xin YangJian-Hua Zhong<![CDATA[Entropy, Vol. 18, Pages 122: Many-Body-Localization Transition in the Strong Disorder Limit: Entanglement Entropy from the Statistics of Rare Extensive Resonances]]>
http://www.mdpi.com/1099-4300/18/4/122
The space of one-dimensional disordered interacting quantum models displaying a many-body localization (MBL) transition seems sufficiently rich to produce critical points with level statistics interpolating continuously between the Poisson statistics of the localized phase and the Wigner–Dyson statistics of the delocalized phase. In this paper, we consider the strong disorder limit of the MBL transition, where the level statistics at the MBL critical point is close to the Poisson statistics. We analyze a one-dimensional quantum spin model, in order to determine the statistical properties of the rare extensive resonances that are needed to destabilize the MBL phase. At criticality, we find that the entanglement entropy can grow with an exponent 0 < α < 1 anywhere between the area law α = 0 and the volume law α = 1, as a function of the resonances properties, while the entanglement spectrum follows the strong multifractality statistics. In the MBL phase near criticality, we obtain the simple value ν = 1 for the correlation length exponent. Independently of the strong disorder limit, we explain why, for the many-body localization transition concerning individual eigenstates, the correlation length exponent ν is not constrained by the usual Harris inequality ν ≥ 2/d, so that there is no theoretical inconsistency with the best numerical measure ν = 0.8(3) obtained by Luitz et al. (2015).Entropy2016-04-01184Article10.3390/e180401221221099-43002016-04-01doi: 10.3390/e18040122Cécile Monthus<![CDATA[Entropy, Vol. 18, Pages 117: Entropy Generation on MHD Blood Flow of Nanofluid Due to Peristaltic Waves]]>
http://www.mdpi.com/1099-4300/18/4/117
The present study describes the entropy generation on magnetohydrodynamic (MHD) blood flow of a nanofluid induced by peristaltic waves. The governing continuity, momentum, nanoparticle concentration and entropy equations are solved by neglecting the inertial forces and taking the long wavelength approximation. The resulting highly non-linear coupled partial differential equations have been solved analytically with the help of the perturbation method. Mathematical and graphical results for all the physical parameters for velocity, concentration, temperature, and entropy are also presented. Numerical computation has been used to evaluate the expressions for the pressure rise and friction forces. Currently, magnetohydrodynamics is used in pumping fluids for pulsating and non-pulsating continuous flows in different microchannel designs, and it is also very helpful in controlling the flow.Entropy2016-04-01184Article10.3390/e180401171171099-43002016-04-01doi: 10.3390/e18040117Mohammad RashidiMuhammad BhattiMunawwar AbbasMohamed Ali<![CDATA[Entropy, Vol. 18, Pages 118: Nonlinear Phenomena of Ultracold Atomic Gases in Optical Lattices: Emergence of Novel Features in Extended States]]>
http://www.mdpi.com/1099-4300/18/4/118
The system of a cold atomic gas in an optical lattice is governed by two factors: nonlinearity originating from the interparticle interaction, and the periodicity of the system set by the lattice. The high level of controllability associated with such an arrangement allows for the study of the competition and interplay between these two, and gives rise to a whole range of interesting and rich nonlinear effects. This review covers the basic idea and overview of such nonlinear phenomena, especially those corresponding to extended states. This includes “swallowtail” loop structures of the energy band, Bloch states with multiple periodicity, and those in “nonlinear lattices”, i.e., systems with the nonlinear interaction term itself being a periodic function in space.Entropy2016-03-31184Review10.3390/e180401181181099-43002016-03-31doi: 10.3390/e18040118Gentaro WatanabeB. VenkateshRaka Dasgupta<![CDATA[Entropy, Vol. 18, Pages 121: A Kinetic Perspective on k‒ε Turbulence Model and Corresponding Entropy Production]]>
http://www.mdpi.com/1099-4300/18/4/121
In this paper, we present an alternative derivation of the entropy production in turbulent flows, based on a formal analogy with the kinetic theory of rarefied gas. This analogy allows for proving that the celebrated \(k - \epsilon\) model for turbulent flows is nothing more than a set of coupled BGK (Bhatnagar–Gross–Krook)-like equations with a proper forcing. This opens a novel perspective on this model, which may help in sorting out the heuristic assumptions essential for its derivation, such as the balance between turbulent kinetic energy production and dissipation. The entropy production is an essential condition for the design and optimization of devices where turbulent flows are involved.Entropy2016-03-31184Article10.3390/e180401211211099-43002016-03-31doi: 10.3390/e18040121Pietro AsinariMatteo FasanoEliodoro Chiavazzo<![CDATA[Entropy, Vol. 18, Pages 119: Generalized Einstein’s Equations from Wald Entropy]]>
http://www.mdpi.com/1099-4300/18/4/119
We derive the gravitational equations of motion of general theories of gravity from thermodynamics applied to a local Rindler horizon through any point in spacetime. Specifically, for a given theory of gravity, we substitute the corresponding Wald entropy into the Clausius relation. Our approach works for all diffeomorphism-invariant theories of gravity in which the Lagrangian is a polynomial in the Riemann tensor.Entropy2016-03-31184Article10.3390/e180401191191099-43002016-03-31doi: 10.3390/e18040119Maulik ParikhSudipta Sarkar<![CDATA[Entropy, Vol. 18, Pages 120: An Exact Efficiency Formula for Holographic Heat Engines]]>
http://www.mdpi.com/1099-4300/18/4/120
Further consideration is given to the efficiency of a class of black hole heat engines that perform mechanical work via the pdV terms present in the First Law of extended gravitational thermodynamics. It is noted that, when the engine cycle is a rectangle with sides parallel to the (p,V) axes, the efficiency can be written simply in terms of the mass of the black hole evaluated at the corners. Since an arbitrary cycle can be approximated to any desired accuracy by a tiling of rectangles, a general geometrical algorithm for computing the efficiency of such a cycle follows. A simple generalization of the algorithm renders it applicable to broader classes of heat engine, even beyond the black hole context.Entropy2016-03-31184Article10.3390/e180401201201099-43002016-03-31doi: 10.3390/e18040120Clifford Johnson<![CDATA[Entropy, Vol. 18, Pages 115: A Comparison of Classification Methods for Telediagnosis of Parkinson’s Disease]]>
http://www.mdpi.com/1099-4300/18/4/115
Parkinson’s disease (PD) is a progressive and chronic nervous system disease that impairs speech, gait, and complex muscle-and-nerve actions. Early diagnosis of PD is quite important for alleviating the symptoms. Cost-effective and convenient telemedicine technology helps to distinguish patients with PD from healthy people using variations of dysphonia, gait or motor skills. In this study, a novel telemedicine technology was developed to detect PD remotely using dysphonia features. Feature transformation and several machine learning (ML) methods with 2-, 5- and 10-fold cross-validations were implemented on the vocal features. It was observed that the combination of principal component analysis (PCA) as a feature transformation (FT) and k-nearest neighbor (k-NN) as a classifier with 10-fold cross-validation yields the best accuracy, 99.1%. All ML processes were applied to the prerecorded PD dataset using a newly created program named ParkDet 2.0. Additionally, a blind test interface was created in ParkDet so that users can detect new patients with PD in the future. Clinicians or medical technicians, without any knowledge of ML, will be able to use the blind test interface to detect PD at a clinic or remote location, utilizing the internet as a telemedicine application.Entropy2016-03-30184Article10.3390/e180401151151099-43002016-03-30doi: 10.3390/e18040115Haydar Ozkan<![CDATA[Entropy, Vol. 18, Pages 113: Second Law Analysis of Adiabatic and Non-Adiabatic Pipeline Flows of Unstable and Surfactant-Stabilized Emulsions]]>
http://www.mdpi.com/1099-4300/18/4/113
Entropy generation, and hence exergy destruction, in adiabatic flow of unstable and surfactant-stabilized emulsions was investigated experimentally in different diameter pipes. Four types of emulsion systems are investigated covering a broad range of the dispersed-phase concentration: (a) unstable oil-in-water (O/W) emulsions without surfactant; (b) surfactant-stabilized O/W emulsions; (c) unstable water-in-oil (W/O) emulsions without surfactant; and (d) surfactant-stabilized W/O emulsions. The entropy generation rate per unit pipe length is affected by the type of the emulsion as well as its stability. Unstable emulsions without any surfactant present at the interface generate less entropy in the turbulent regime as compared with the surfactant-stabilized emulsions of the same viscosity and density. The effect of surfactant is particularly severe in the case of W/O emulsions. In the turbulent regime, the rate of entropy generation in unstable W/O emulsions is much lower in comparison with that observed in the stable W/O emulsions. A significant delay in the transition from laminar to turbulent regime is also observed in the case of unstable W/O emulsion. Finally, the analysis and simulation results are presented on non-adiabatic pipeline flow of emulsions.Entropy2016-03-30184Article10.3390/e180401131131099-43002016-03-30doi: 10.3390/e18040113Rajinder Pal<![CDATA[Entropy, Vol. 18, Pages 114: Statistical Evidence Measured on a Properly Calibrated Scale for Multinomial Hypothesis Comparisons]]>
http://www.mdpi.com/1099-4300/18/4/114
Measurement of the strength of statistical evidence is a primary objective of statistical analysis throughout the biological and social sciences. Various quantities have been proposed as definitions of statistical evidence, notably the likelihood ratio, the Bayes factor and the relative belief ratio. Each of these can be motivated by direct appeal to intuition. However, for an evidence measure to be reliably used for scientific purposes, it must be properly calibrated, so that one “degree” on the measurement scale always refers to the same amount of underlying evidence, and the calibration problem has not been resolved for these familiar evidential statistics. We have developed a methodology for addressing the calibration issue itself, and previously applied this methodology to derive a calibrated evidence measure E in application to a broad class of hypothesis contrasts in the setting of binomial (single-parameter) likelihoods. Here we substantially generalize previous results to include the m-dimensional multinomial (multiple-parameter) likelihood. In the process we further articulate our methodology for addressing the measurement calibration issue, and we show explicitly how the more familiar definitions of statistical evidence are patently not well behaved with respect to the underlying evidence. We also continue to see striking connections between the calculating equations for E and equations from thermodynamics as we move to more complicated forms of the likelihood.Entropy2016-03-30184Article10.3390/e180401141141099-43002016-03-30doi: 10.3390/e18040114Veronica VielandSang-Cheol Seok<![CDATA[Entropy, Vol. 18, Pages 106: Complexity Analysis of Surface EMG for Overcoming ECG Interference toward Proportional Myoelectric Control]]>
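Of the familiar evidence measures this abstract contrasts with its calibrated measure E, the likelihood ratio is the simplest to illustrate. The sketch below computes a binomial log likelihood ratio; the hypotheses p0, p1 and the data are invented for the example and are not from the paper.

```python
import math

def binomial_log_lr(k, n, p1, p0):
    """Log likelihood ratio for k successes in n Bernoulli trials,
    comparing success probability p1 against p0."""
    return k * math.log(p1 / p0) + (n - k) * math.log((1 - p1) / (1 - p0))

# Identical hypotheses carry no evidence either way:
neutral = binomial_log_lr(5, 10, 0.5, 0.5)     # 0.0
# Data with 7/10 successes favor p1 = 0.7 over p0 = 0.5:
favoring = binomial_log_lr(7, 10, 0.7, 0.5)    # > 0
```

The paper's point is precisely that one "unit" of such a ratio does not correspond to a fixed amount of underlying evidence across settings, which is what the calibration methodology addresses.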
http://www.mdpi.com/1099-4300/18/4/106
Electromyographic (EMG) signals from muscles in the body torso are often contaminated by electrocardiography (ECG) interferences, which consequently compromise EMG intensity estimation. The ECG interference has become a barrier to proportional control of myoelectric prosthesis using a neural machine interface called targeted muscle reinnervation (TMR), which involves transferring the residual amputated nerves to nonfunctional muscles (typically pectoralis muscles for high level amputations). This study investigates a novel approach toward implementation of proportional myoelectric control by applying sample entropy (SampEn) analysis of surface EMG signals for robust intensity estimation in the presence of significant ECG interference. Surface EMG data from able-bodied and TMR amputee subjects with different degrees of ECG interference were used for performance evaluation. The results showed that the SampEn analysis had high correlation with surface EMG amplitude measurement but low sensitivity to different degrees of ECG interference. Taking this advantage, SampEn analysis of surface EMG signal can be used to facilitate implementation of proportional myoelectric control against ECG interference. This is particularly important for TMR prosthesis users.Entropy2016-03-30184Article10.3390/e180401061061099-43002016-03-30doi: 10.3390/e18040106Xu ZhangXiaoting RenXiaoping GaoXiang ChenPing Zhou<![CDATA[Entropy, Vol. 18, Pages 111: Identifying the Probability Distribution of Fatigue Life Using the Maximum Entropy Principle]]>
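For readers unfamiliar with the SampEn analysis used here, a minimal pure-Python sketch follows. The defaults m = 2, r = 0.2 and the test signal are illustrative; this is a generic textbook formulation, not the authors' implementation.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that two
    subsequences similar for m points (Chebyshev distance within r * std,
    self-matches excluded) remain similar for m + 1 points."""
    n = len(x)
    mean = sum(x) / n
    tol = r * (sum((v - mean) ** 2 for v in x) / n) ** 0.5  # population std

    def matches(length):
        # Use the same number of templates (n - m) for lengths m and m + 1
        t = [x[i:i + length] for i in range(n - m)]
        return sum(1 for i in range(len(t)) for j in range(i + 1, len(t))
                   if max(abs(a - b) for a, b in zip(t[i], t[j])) <= tol)

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

# A strictly periodic signal is perfectly predictable, so SampEn is 0
se = sample_entropy([1, 2] * 10)
```

Low sensitivity of this quantity to the quasi-periodic ECG component, as reported above, is what makes it usable for EMG intensity estimation under interference.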
http://www.mdpi.com/1099-4300/18/4/111
It is well known that the fatigue lives of materials and structures exhibit a considerable amount of scatter, which is commonly suggested to be considered in engineering design. In order to reduce the introduction of subjective uncertainties and obtain rational probability distributions, a computational method based on the maximum entropy principle is proposed in this paper for identifying the probability distribution of fatigue life. The first four statistical moments of fatigue life are used to formulate the constraints in the maximum entropy optimization problem. An accurate algorithm is also presented to find the Lagrange multipliers of the maximum entropy distribution, which avoids the numerical singularity that can arise when solving the system of equations. Two fit indexes are used to measure the goodness-of-fit of the proposed method. The rationality and effectiveness of the proposed method are demonstrated on two groups of fatigue data sets available in the literature. Comparisons among the proposed method, the lognormal distribution and the three-parameter Weibull distribution are also carried out for the investigated groups of fatigue data sets.Entropy2016-03-30184Article10.3390/e180401111111099-43002016-03-30doi: 10.3390/e18040111Hongshuang LiDebing WenZizi LuYu WangFeng Deng<![CDATA[Entropy, Vol. 18, Pages 109: An Estimator of Mutual Information and its Application to Independence Testing]]>
http://www.mdpi.com/1099-4300/18/4/109
This paper proposes a novel estimator of mutual information for discrete and continuous variables. The main feature of this estimator is that it is zero for a large sample size n if and only if the two variables are independent. The estimator can be used to construct several histograms, compute estimations of mutual information, and choose the maximum value. We prove that the number of histograms constructed has an upper bound of O(log n) and apply this fact to the search. We compare the performance of the proposed estimator with an estimator of the Hilbert-Schmidt independence criterion (HSIC), though the proposed method is based on the minimum description length (MDL) principle and the HSIC provides a statistical test. The proposed method completes the estimation in O(n log n) time, whereas the HSIC kernel computation requires O(n³) time. We also present examples in which the HSIC fails to detect independence but the proposed method successfully detects it.Entropy2016-03-29184Article10.3390/e180401091091099-43002016-03-29doi: 10.3390/e18040109Joe Suzuki<![CDATA[Entropy, Vol. 18, Pages 108: Introduction to Supersymmetric Theory of Stochastics]]>
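For orientation, a plain plug-in histogram estimate of mutual information looks as follows. Note this sketch uses a single fixed-width histogram, whereas the paper's estimator searches over O(log n) histogram resolutions with an MDL penalty; the function name and bin count are illustrative assumptions.

```python
import math
from collections import Counter

def plugin_mi(xs, ys, bins=4):
    """Plug-in mutual information estimate (in nats) from one equal-width
    2-D histogram of the paired samples."""
    def digitize(vs):
        lo, hi = min(vs), max(vs)
        span = (hi - lo) or 1.0
        return [min(int((v - lo) / span * bins), bins - 1) for v in vs]

    bx, by = digitize(xs), digitize(ys)
    n = len(xs)
    pxy, px, py = Counter(zip(bx, by)), Counter(bx), Counter(by)
    # I(X;Y) = sum p(x,y) log[p(x,y) / (p(x) p(y))]
    return sum((c / n) * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

xs = [0, 1, 2, 3] * 25
dependent = plugin_mi(xs, xs)                                  # = H(X) = log 4
independent = plugin_mi([0, 0, 1, 1] * 25, [0, 1, 0, 1] * 25)  # 0 by construction
```

The paper's key refinement is that this plug-in estimate is biased upward at finite n; the MDL-penalized search is what makes the estimator vanish exactly under independence for large n.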
http://www.mdpi.com/1099-4300/18/4/108
Many natural and engineered dynamical systems, including all living objects, exhibit signatures of what can be called spontaneous dynamical long-range order (DLRO). This order’s omnipresence has long been recognized by the scientific community, as evidenced by a myriad of related concepts, theoretical and phenomenological frameworks, and experimental phenomena such as turbulence, 1/f noise, dynamical complexity, chaos and the butterfly effect, the Richter scale for earthquakes and the scale-free statistics of other sudden processes, self-organization and pattern formation, self-organized criticality, etc. Although several successful approaches to various realizations of DLRO have been established, the universal theoretical understanding of this phenomenon remained elusive. The possibility of constructing a unified theory of DLRO has emerged recently within the approximation-free supersymmetric theory of stochastics (STS). There, DLRO is the spontaneous breakdown of the topological or de Rham supersymmetry that all stochastic differential equations (SDEs) possess. This theory may be interesting to researchers with very different backgrounds because the ubiquitous DLRO is a truly interdisciplinary entity. The STS is also an interdisciplinary construction. This theory is based on dynamical systems theory, cohomological field theories, the theory of pseudo-Hermitian operators, and the conventional theory of SDEs. Reviewing the literature on all these mathematical disciplines can be time consuming. As such, a concise and self-contained introduction to the STS, the goal of this paper, may be useful.Entropy2016-03-28184Review10.3390/e180401081081099-43002016-03-28doi: 10.3390/e18040108Igor Ovchinnikov<![CDATA[Entropy, Vol. 18, Pages 107: Thermodynamic Analysis of the Irreversibilities in Solar Absorption Refrigerators]]>
http://www.mdpi.com/1099-4300/18/4/107
A thermodynamic analysis of the irreversibilities in solar absorption refrigerators is presented. Under the hierarchical decomposition and the hypothesis of an endoreversible model, several functional and practical domains are defined. The effects of the external heat source temperature on the entropy rate and on the inverse specific cooling load (ISCL), i.e., the total refrigerator area per unit cooling load, A/Qe, are studied. This may help a designer to properly dimension the solar machine under an optimal technico-economical criterion A/Qe and with reasonable irreversibility in the refrigerator. The effects of the solar concentrator temperature on the total exchanged area, on the technico-economical ratio A/Qe, and on the internal entropy rate are illustrated and discussed. The originality of these results is that they allow a conceptual study of a solar absorption refrigeration cycle.Entropy2016-03-24184Article10.3390/e180401071071099-43002016-03-24doi: 10.3390/e18040107Emma Berrich BetoucheAli FellahAmmar Ben BrahimFethi AlouiMichel Feidt<![CDATA[Entropy, Vol. 18, Pages 100: Chaos on the Vallis Model for El Niño with Fractional Operators]]>
http://www.mdpi.com/1099-4300/18/4/100
The Vallis model for El Niño is an important model describing a very interesting physical problem. The aim of this paper is to investigate and compare the model using both integer- and non-integer-order derivatives. We first studied the model with the local derivative, presenting for the first time the exact solution for the equilibrium points, and then we presented the exact solutions together with numerical simulations. We further examined the model within the scope of fractional-order derivatives. The fractional derivatives used here are of the Caputo and Caputo–Fabrizio type. Within the scope of fractional derivatives, we establish the existence and uniqueness of solutions of the model. We derive special solutions of both models with the Caputo and Caputo–Fabrizio derivatives. Some numerical simulations are presented to compare the models. We obtained more chaotic behavior from the model with the Caputo–Fabrizio derivative than from those with the local and Caputo derivatives. Comparing the three models, we realized that the Caputo derivative plays the role of a low-band filter, while the Caputo–Fabrizio derivative reveals information that is not captured in the model with the local derivative.Entropy2016-03-23184Article10.3390/e180401001001099-43002016-03-23doi: 10.3390/e18040100Badr AlkahtaniAbdon Atangana<![CDATA[Entropy, Vol. 18, Pages 104: On the Path to Optimizing the Al-Co-Cr-Cu-Fe-Ni-Ti High Entropy Alloy Family for High Temperature Applications]]>
http://www.mdpi.com/1099-4300/18/4/104
The most commonly investigated high entropy alloy, AlCoCrCuFeNi, has been chosen for optimization of its microstructural and mechanical properties by means of compositional changes and heat treatments. Among the different available optimization paths, the decrease of the segregating element Cu, the increase of the oxidation-protective elements Al and Cr and the approach towards a γ-γ′ microstructure like that in Ni-based superalloys have been probed and compared. Microscopical observations have been made for every optimization step. Vickers microhardness measurements and/or tensile/compression tests have been carried out where appropriate. Five derived alloys AlCoCrFeNi, Al23Co15Cr23Cu8Fe15Ni16, Al8Co17Cr17Cu8Fe17Ni33, Al8Co17Cr14Cu8Fe17Ni34.8Mo0.1Ti1W0.1 and Al10Co25Cr8Fe15Ni36Ti6 (all at.%) have been compared to the original AlCoCrCuFeNi and the most promising one has been selected for further investigation.Entropy2016-03-23184Article10.3390/e180401041041099-43002016-03-23doi: 10.3390/e18040104Anna ManzoniSheela SinghHaneen DaoudRobert PoppRainer VölklUwe GlatzelNelia Wanderka<![CDATA[Entropy, Vol. 18, Pages 103: Assessment of Nociceptive Responsiveness Levels during Sedation-Analgesia by Entropy Analysis of EEG]]>
http://www.mdpi.com/1099-4300/18/3/103
The level of sedation in patients undergoing medical procedures is chosen to assure unconsciousness and prevent pain. Monitors of depth of anesthesia, based on the analysis of the electroencephalogram (EEG), have been progressively introduced into daily practice to provide additional information about the state of the patient. However, the quantification of analgesia still remains an open problem. The purpose of this work was to analyze the capability of predicting nociceptive responses based on refined multiscale entropy (RMSE) and the auto mutual information function (AMIF) applied to EEG signals recorded in 378 patients scheduled to undergo ultrasonographic endoscopy under sedation-analgesia. Two observed categorical responses after the application of painful stimulation were analyzed: the evaluation of the Ramsay Sedation Scale (RSS) after nail bed compression and the presence of the gag reflex (GAG) during endoscopy tube insertion. In addition, the bispectral index (BIS), heart rate (HR), and predicted concentrations of propofol (CeProp) and remifentanil (CeRemi) were annotated with a resolution of 1 s. Results showed that functions based on RMSE, AMIF, HR and CeRemi predicted the different stimulation responses during sedation better than BIS.Entropy2016-03-18183Article10.3390/e180301031031099-43002016-03-18doi: 10.3390/e18030103José ValenciaUmberto MeliaMontserrat VallverdúXavier BorratMathieu JospinErik JensenAlberto PortaPedro GambúsPere Caminal<![CDATA[Entropy, Vol. 18, Pages 99: System Entropy Measurement of Stochastic Partial Differential Systems]]>
http://www.mdpi.com/1099-4300/18/3/99
System entropy describes the dispersal of a system’s energy and is an indication of the disorder of a physical system. Several system entropy measurement methods have been developed for dynamic systems. However, most real physical systems are modeled using stochastic partial differential dynamic equations in the spatio-temporal domain, and no efficient method currently exists that can calculate the system entropy of stochastic partial differential systems (SPDSs) while accounting for the effects of intrinsic random fluctuation and compartment diffusion. In this study, a novel indirect measurement method is proposed for calculating the system entropy of SPDSs using a Hamilton–Jacobi integral inequality (HJII)-constrained optimization method. In other words, we solve a nonlinear HJII-constrained optimization problem to measure the system entropy of nonlinear stochastic partial differential systems (NSPDSs). To simplify the system entropy measurement of NSPDSs, the global linearization technique and a finite difference scheme were employed to approximate the nonlinear stochastic spatial state space system. This allows the nonlinear HJII-constrained optimization problem for the system entropy measurement to be transformed into an equivalent linear matrix inequality (LMI)-constrained optimization problem, which can be easily solved using the MATLAB LMI toolbox (MATLAB R2014a, version 8.3). Finally, several examples are presented to illustrate the system entropy measurement of SPDSs.Entropy2016-03-18183Article10.3390/e18030099991099-43002016-03-18doi: 10.3390/e18030099Bor-Sen ChenChao-Yi HsiehShih-Ju Ho<![CDATA[Entropy, Vol. 18, Pages 101: A Complexity-Based Approach for the Detection of Weak Signals in Ocean Ambient Noise]]>
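The finite-difference step mentioned in the abstract above, approximating a spatial PDE by an ODE state-space system, can be illustrated for the simplest case. This is a sketch for a 1-D heat equation with zero Dirichlet boundaries, not the paper's SPDS model; the grid size and diffusivity are invented.

```python
def heat_equation_matrix(n, kappa=1.0, ds=0.1):
    """Finite-difference system matrix A for dx/dt = kappa * d^2 x / ds^2
    on n interior grid points with zero Dirichlet boundaries, so that the
    PDE is approximated by the linear state-space system dx/dt = A x."""
    c = kappa / ds ** 2
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = -2.0 * c              # central second-difference stencil
        if i > 0:
            A[i][i - 1] = c
        if i < n - 1:
            A[i][i + 1] = c
    return A
```

Once the spatial operator is in this matrix form, LMI-based machinery (as in the paper) can be applied to the resulting finite-dimensional system.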
http://www.mdpi.com/1099-4300/18/3/101
Numerous studies show a constant increase in the ocean ambient noise level and an ever-growing demand for algorithms that detect weak signals in ambient noise. In this study, we utilize dynamical and statistical complexity to detect the presence of weak ship noise embedded in ambient noise. The ambient noise and ship noise were recorded in the South China Sea. The multiscale entropy (MSE) method and the complexity-entropy causality plane (C-H plane) were used to quantify the dynamical and statistical complexity of the measured time series, respectively. We generated signals with varying signal-to-noise ratio (SNR) by varying the amplification of a ship signal. The simulation results indicate that the complexity is sensitive to changes in the information content of the ambient noise and to changes in SNR, a finding that enables the detection of weak ship signals in strong background ambient noise. The simulation results also illustrate that the complexity-based approach outperforms the traditional spectrogram method, and is particularly effective for detecting low-SNR signals in ambient noise. In addition, the complexity-based MSE and C-H plane methods are simple and robust and do not assume any underlying dynamics in the time series, making them well suited to practical situations.Entropy2016-03-18183Article10.3390/e180301011011099-43002016-03-18doi: 10.3390/e18030101Shashidhar SiddagangaiahYaan LiXijing GuoXiao ChenQunfei ZhangKunde YangYixin Yang<![CDATA[Entropy, Vol. 18, Pages 102: Development of a Refractory High Entropy Superalloy]]>
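The multiscale entropy used in the ocean-noise abstract above combines coarse-graining with sample entropy. Below is a minimal, unoptimized sketch (quadratic in the series length); the tolerance `r` and embedding dimension `m` are typical textbook defaults, not the authors' settings.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    def count_matches(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):   # exclude self-matches
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """Coarse-grain the series at each scale (non-overlapping averages),
    then compute the sample entropy of each coarse-grained series."""
    out = []
    for s in scales:
        coarse = [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]
        out.append(sample_entropy(coarse, m, r))
    return out
```

A regular signal yields low values at every scale, while a complex signal keeps its entropy across scales, which is the contrast the detection scheme exploits.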
http://www.mdpi.com/1099-4300/18/3/102
Microstructure, phase composition and mechanical properties of a refractory high entropy superalloy, AlMo0.5NbTa0.5TiZr, are reported in this work. The alloy consists of a nano-scale mixture of two phases produced by the decomposition of a high-temperature body-centered cubic (BCC) phase. The first phase is present in the form of cuboidal-shaped nano-precipitates aligned in rows along &lt;100&gt;-type directions, has a disordered BCC crystal structure with the lattice parameter a1 = 326.9 ± 0.5 pm and is rich in Mo, Nb and Ta. The second phase is present in the form of channels between the cuboidal nano-precipitates, has an ordered B2 crystal structure with the lattice parameter a2 = 330.4 ± 0.5 pm and is rich in Al, Ti and Zr. Both phases are coherent and have the same crystallographic orientation within the former grains. The formation of this modulated nano-phase structure is discussed in the framework of nucleation-and-growth and spinodal decomposition mechanisms. The yield strength of this refractory high entropy superalloy is superior to the yield strength of Ni-based superalloys in the temperature range of 20 °C to 1200 °C.Entropy2016-03-17183Article10.3390/e180301021021099-43002016-03-17doi: 10.3390/e18030102Oleg SenkovDieter IsheimDavid SeidmanAdam Pilchak<![CDATA[Entropy, Vol. 18, Pages 98: Riemannian Laplace Distribution on the Space of Symmetric Positive Definite Matrices]]>
http://www.mdpi.com/1099-4300/18/3/98
The Riemannian geometry of the space Pm, of m × m symmetric positive definite matrices, has provided effective tools to the fields of medical imaging, computer vision and radar signal processing. Still, an open challenge remains, which consists of extending these tools to correctly handle the presence of outliers (or abnormal data), arising from excessive noise or faulty measurements. The present paper tackles this challenge by introducing new probability distributions, called Riemannian Laplace distributions on the space Pm. First, it shows that these distributions provide a statistical foundation for the concept of the Riemannian median, which offers improved robustness in dealing with outliers (in comparison to the more popular concept of the Riemannian center of mass). Second, it describes an original expectation-maximization algorithm, for estimating mixtures of Riemannian Laplace distributions. This algorithm is applied to the problem of texture classification, in computer vision, which is considered in the presence of outliers. It is shown to give significantly better performance with respect to other recently-proposed approaches.Entropy2016-03-16183Article10.3390/e18030098981099-43002016-03-16doi: 10.3390/e18030098Hatem HajriIoana IleaSalem SaidLionel BombrunYannick Berthoumieu<![CDATA[Entropy, Vol. 18, Pages 97: Constrained Inference When the Sampled and Target Populations Differ]]>
http://www.mdpi.com/1099-4300/18/3/97
In the analysis of contingency tables, one often faces two difficulties: the sampled and target populations are not identical, and prior information translates into general linear inequality restrictions. For these situations, we present new models for estimating cell probabilities, related to four well-known methods of estimation. We prove that each model yields maximum likelihood estimators under those restrictions. The performance ranking of these methods under equality restrictions is known; we compare them under inequality restrictions in a simulation study, which reveals that the methods may rank differently under inequality restrictions than under equality. The four methods are also compared in an analysis of US census data.Entropy2016-03-16183Article10.3390/e18030097971099-43002016-03-16doi: 10.3390/e18030097Huijun YiBhaskar Bhattacharya<![CDATA[Entropy, Vol. 18, Pages 96: Preference Inconsistence-Based Entropy]]>
http://www.mdpi.com/1099-4300/18/3/96
Preference analysis is an important class of problems in ordinal decision making. As the available information is usually obtained from different evaluation criteria or experts, the derived preference decisions may be inconsistent and uncertain. Shannon entropy is a suitable measure of such uncertainty. This work proposes the concepts of the preference inconsistence set and the preference inconsistence degree, and then introduces preference inconsistence entropy by combining the preference inconsistence degree with Shannon entropy. A number of properties and theorems, as well as two applications, are discussed: feature selection is used for attribute reduction, and sample condensation aims to obtain a consistent preference system. A forward feature selection algorithm, a backward feature selection algorithm and a sample condensation algorithm are developed. The experimental results show that the proposed model is an effective solution for preference analysis.Entropy2016-03-15183Article10.3390/e18030096961099-43002016-03-15doi: 10.3390/e18030096Wei PanKun ShePengyuan Wei<![CDATA[Entropy, Vol. 18, Pages 95: A Cross-Entropy-Based Admission Control Optimization Approach for Heterogeneous Virtual Machine Placement in Public Clouds]]>
http://www.mdpi.com/1099-4300/18/3/95
Virtualization technologies make it possible for cloud providers to consolidate multiple IaaS provisions into a single server in the form of virtual machines (VMs). Additionally, in order to fulfill the divergent service requirements from multiple users, a cloud provider needs to offer several types of VM instances, which are associated with varying configurations and performance, as well as different prices. In such a heterogeneous virtual machine placement process, one significant problem faced by a cloud provider is how to optimally accept and place multiple VM service requests into its cloud data centers to achieve revenue maximization. To address this issue, in this paper, we first formulate such a revenue maximization problem during VM admission control as a multiple-dimensional knapsack problem, which is known to be NP-hard to solve. Then, we propose to use a cross-entropy-based optimization approach to address this revenue maximization problem, by obtaining a near-optimal eligible set for the provider to accept into its data centers, from the waiting VM service requests in the system. Finally, through extensive experiments and measurements in a simulated environment with the settings of VM instance classes derived from real-world cloud systems, we show that our proposed cross-entropy-based admission control optimization algorithm is efficient and effective in maximizing cloud providers’ revenue in a public cloud computing environment.Entropy2016-03-15183Article10.3390/e18030095951099-43002016-03-15doi: 10.3390/e18030095Li PanDatao Wang<![CDATA[Entropy, Vol. 18, Pages 94: Dark Energy: The Shadowy Reflection of Dark Matter?]]>
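The cross-entropy method described in the VM-placement abstract above can be illustrated on a toy 0/1 knapsack instance: sample selection vectors from independent Bernoulli probabilities, keep an elite fraction, and move the probabilities toward the elite mean. This is a sketch of the general technique, not the paper's VM model; the instance data, sample sizes and the smoothing factor are invented.

```python
import random

def cross_entropy_knapsack(values, weights, capacity,
                           n_samples=100, elite_frac=0.2, iters=30,
                           alpha=0.7, seed=0):
    """Cross-entropy method for 0/1 knapsack: infeasible samples score -1,
    and the Bernoulli parameters are smoothed toward the elite mean."""
    rng = random.Random(seed)
    n = len(values)
    p = [0.5] * n
    best_val, best_sol = 0, [0] * n

    def value(sol):
        w = sum(wi for wi, s in zip(weights, sol) if s)
        return sum(vi for vi, s in zip(values, sol) if s) if w <= capacity else -1

    for _ in range(iters):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(n_samples)]
        samples.sort(key=value, reverse=True)
        elite = samples[:max(1, int(elite_frac * n_samples))]
        if value(elite[0]) > best_val:
            best_val, best_sol = value(elite[0]), elite[0]
        mean = [sum(s[i] for s in elite) / len(elite) for i in range(n)]
        p = [alpha * m + (1 - alpha) * pi for m, pi in zip(mean, p)]
    return best_val, best_sol
```

The smoothing step (`alpha`) keeps sampling diversity so that the probabilities do not collapse prematurely, which matters more as the number of item types grows.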
http://www.mdpi.com/1099-4300/18/3/94
In this article, we review a series of recent theoretical results regarding a conventional approach to the dark energy (DE) concept. This approach is distinguished among others for its simplicity and its physical relevance. By reconciling General Relativity (GR) with thermodynamics at the cosmological scale, we end up with a model without DE. Instead, the Universe we are proposing is filled with a perfect fluid of self-interacting dark matter (DM), the volume elements of which perform hydrodynamic flows. To the best of our knowledge, it is the first time in a cosmological framework that the energy of the cosmic fluid internal motions is also taken into account as a source of the universal gravitational field. As we demonstrate, this form of energy may compensate for the DE needed to account for spatial flatness, while, depending on the particular type of thermodynamic processes occurring in the interior of the DM fluid (isothermal or polytropic), the Universe is found to be either decelerating or accelerating (respectively). In both cases, there is no disagreement between observations and the theoretical prediction of the distribution of distant Type Ia supernovae (SNe). In fact, the cosmological model with matter content in the form of a thermodynamically-involved DM fluid not only interprets the observational data associated with the recent history of Universe expansion, but also successfully confronts every major cosmological issue (such as the age and coincidence problems). In this way, depending on the type of thermodynamic processes in it, such a model may serve either as a conventional DE cosmology or as a viable alternative one.Entropy2016-03-12183Review10.3390/e18030094941099-43002016-03-12doi: 10.3390/e18030094Kostas KleidisNikolaos Spyrou<![CDATA[Entropy, Vol. 18, Pages 93: Selected Remarks about Computer Processing in Terms of Flow Control and Statistical Mechanics]]>
http://www.mdpi.com/1099-4300/18/3/93
Despite the fact that much has been said about processing in computer science, much remains to be done. The classical approach assumes that the computations done by computers are a kind of mathematical operation (the calculation of function values) with no special relation to energy transformation and flow. However, it is possible to take a new view of selected topics, and as a special case the sorting problem is presented. Many different sorting algorithms are known, including some with complexity O(n lg(n)), which means that the problem is algorithmically closed; yet it is also possible to examine sorting in terms of flow control, entropy and statistical mechanics. This is done in relation to the existing definitions of sorting, the connections between sorting and ordering, and some important aspects of computer processing understood as a flow that are not taken into account in many theoretical considerations in computer science. The proposed new view is an attempt to change the paradigm in the description of algorithms’ performance by computational complexity and processing, taking into account the existing references between the idea of Turing machines and their physical implementations. This proposal can be expressed as a physics of computer processing, a reference point for further analysis of algorithmic and interactive processing in computer systems.Entropy2016-03-12183Article10.3390/e18030093931099-43002016-03-12doi: 10.3390/e18030093Dominik Strzałka<![CDATA[Entropy, Vol. 18, Pages 90: Analysis of Entropy Generation in the Flow of Peristaltic Nanofluids in Channels With Compliant Walls]]>
http://www.mdpi.com/1099-4300/18/3/90
Entropy generation during peristaltic flow of nanofluids in a non-uniform two-dimensional channel with compliant walls has been studied. The mathematical modelling of the governing flow problem is obtained under the approximation of long wavelength and zero Reynolds number (creeping flow regime). The resulting non-linear partial differential equations are solved with the help of a perturbation method. The analytic and numerical results for different parameters are demonstrated mathematically and graphically. The present analysis provides a theoretical model to estimate the characteristics of several Newtonian and non-Newtonian fluid flows, such as peristaltic transport of blood.Entropy2016-03-11183Article10.3390/e18030090901099-43002016-03-11doi: 10.3390/e18030090Munawwar AbbasYanqin BaiMohammad RashidiMuhammad Bhatti<![CDATA[Entropy, Vol. 18, Pages 92: Long-Range Electron Transport Donor-Acceptor in Nonlinear Lattices]]>
http://www.mdpi.com/1099-4300/18/3/92
We study here several simple models of the electron transfer (ET) in a one-dimensional nonlinear lattice between a donor and an acceptor and propose a new fast mechanism of electron surfing on soliton-like excitations along the lattice. The nonlinear lattice is modeled as a classical one-dimensional Morse chain and the dynamics of the electrons are considered in the tight-binding approximation. This model is applied to the processes along a covalent bridge connecting donors and acceptors. First, it is shown that the electron forms bound states with the solitonic excitations in the lattice. These so-called solectrons may move with supersonic speed. In a heated system, the electron transfer between a donor and an acceptor is modeled as a diffusion-like process. We study in detail the role of thermal factors on the electron transfer. Then, we develop a simple model based on the classical Smoluchowski–Chandrasekhar picture of diffusion-controlled reactions as stochastic processes with emitters and absorbers. Acceptors are modeled by an absorbing boundary. Finally, we compare the new ET mechanisms described here with known ET data. We conclude that electron surfing on solitons could be a special fast way for ET over quite long distances.Entropy2016-03-11183Article10.3390/e18030092921099-43002016-03-11doi: 10.3390/e18030092Alexander ChetverikovWerner EbelingManuel Velarde<![CDATA[Entropy, Vol. 18, Pages 68: A Novel Weak Fuzzy Solution for Fuzzy Linear System]]>
http://www.mdpi.com/1099-4300/18/3/68
This article proposes a novel weak fuzzy solution for the fuzzy linear system. Specifically, we define the right-hand side column of the fuzzy linear system as a piecewise fuzzy function to overcome a shortcoming of previous findings. The strong point of this proposal is that the weak fuzzy solution is always a fuzzy number vector. Two complex and non-complex linear systems under uncertainty are tested to validate the effectiveness and correctness of the presented method.Entropy2016-03-11183Article10.3390/e18030068681099-43002016-03-11doi: 10.3390/e18030068Soheil SalahshourAli AhmadianFudziah IsmailDumitru Baleanu<![CDATA[Entropy, Vol. 18, Pages 73: Selecting Video Key Frames Based on Relative Entropy and the Extreme Studentized Deviate Test]]>
http://www.mdpi.com/1099-4300/18/3/73
This paper studies the relative entropy and its square root as distance measures between neighboring video frames for video key frame extraction. We develop a novel approach handling both common and wavelet video sequences, in which the extreme Studentized deviate test is exploited to identify shot boundaries for segmenting a video sequence into shots. Then, video shots can be divided into different sub-shots, according to whether the video content change is large or not, and key frames are extracted from sub-shots. The proposed technique is general, effective and efficient for dealing with video sequences of any kind. Our new approach can offer optional additional multiscale summarizations of video data, achieving a balance between having more details and maintaining less redundancy. Extensive experimental results show that the new scheme obtains very encouraging results in video key frame extraction, in terms of both objective evaluation metrics and subjective visual perception.Entropy2016-03-09183Article10.3390/e18030073731099-43002016-03-09doi: 10.3390/e18030073Yuejun GuoQing XuShihua SunXiaoxiao LuoMateu Sbert<![CDATA[Entropy, Vol. 18, Pages 88: Maximizing Diversity in Biology and Beyond]]>
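The frame-distance idea in the key-frame abstract above can be sketched with relative entropy between normalized frame histograms. This is a minimal illustration only; the symmetrization and the smoothing constant `eps` are our assumptions, not necessarily the authors' exact formulation.

```python
import math

def kl_divergence(p, q, eps=1e-9):
    """Relative entropy D(p || q) between two histograms (normalized here),
    with additive smoothing so empty bins do not blow up the logarithm."""
    p = [v + eps for v in p]
    q = [v + eps for v in q]
    sp, sq = sum(p), sum(q)
    return sum((a / sp) * math.log((a / sp) / (b / sq)) for a, b in zip(p, q))

def frame_distance(hist_a, hist_b):
    """Square root of the symmetrized relative entropy, usable as a
    distance measure between histograms of neighboring frames."""
    return math.sqrt(kl_divergence(hist_a, hist_b) + kl_divergence(hist_b, hist_a))
```

Large distances between consecutive frames then flag candidate shot boundaries for the outlier test.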
http://www.mdpi.com/1099-4300/18/3/88
Entropy, under a variety of names, has long been used as a measure of diversity in ecology, as well as in genetics, economics and other fields. There is a spectrum of viewpoints on diversity, indexed by a real parameter q giving greater or lesser importance to rare species. Leinster and Cobbold (2012) proposed a one-parameter family of diversity measures taking into account both this variation and the varying similarities between species. Because of this latter feature, diversity is not maximized by the uniform distribution on species. So it is natural to ask: which distributions maximize diversity, and what is its maximum value? In principle, both answers depend on q, but our main theorem is that neither does. Thus, there is a single distribution that maximizes diversity from all viewpoints simultaneously, and any list of species has an unambiguous maximum diversity value. Furthermore, the maximizing distribution(s) can be computed in finite time, and any distribution maximizing diversity from some particular viewpoint q &gt; 0 actually maximizes diversity for all q. Although we phrase our results in ecological terms, they apply very widely, with applications in graph theory and metric geometry.Entropy2016-03-09183Article10.3390/e18030088881099-43002016-03-09doi: 10.3390/e18030088Tom LeinsterMark Meckes<![CDATA[Entropy, Vol. 18, Pages 89: Two Universality Properties Associated with the Monkey Model of Zipf’s Law]]>
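The one-parameter family of diversity measures discussed in the abstract above reduces, when all species are completely dissimilar, to the classical Hill numbers. A minimal sketch of that similarity-free special case:

```python
import math

def hill_diversity(p, q):
    """Diversity of order q (Hill number) of a probability distribution p:
    D_q = (sum_i p_i^q)^(1/(1-q)), with the q -> 1 limit exp(Shannon entropy).
    This is the similarity-free special case of the Leinster-Cobbold family."""
    p = [pi for pi in p if pi > 0]
    if abs(q - 1.0) < 1e-9:
        return math.exp(-sum(pi * math.log(pi) for pi in p))
    return sum(pi ** q for pi in p) ** (1.0 / (1.0 - q))
```

For the uniform distribution on n species, D_q = n for every q, a simple instance of a distribution whose diversity is viewpoint-independent; with varying inter-species similarities, the paper's theorem shows a (generally non-uniform) maximizer with the same q-independence property.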
http://www.mdpi.com/1099-4300/18/3/89
The distribution of word probabilities in the monkey model of Zipf’s law is associated with two universality properties: (1) the exponent in the approximate power law approaches −1 as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on [0,1] ; and (2), on a logarithmic scale the version of the model with a finite word length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem from Shao and Hahn for the logarithm of sample spacings constructed on [0,1] and the second property follows from Anscombe’s central limit theorem for a random number of independent and identically distributed (i.i.d.) random variables. The finite word length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.Entropy2016-03-09183Article10.3390/e18030089891099-43002016-03-09doi: 10.3390/e18030089Richard PerlineRon Perline<![CDATA[Entropy, Vol. 18, Pages 87: Entropy Production in the Theory of Heat Conduction in Solids]]>
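The first universality property in the abstract above can be made concrete in the simpler equiprobable-letter case (the abstract's result covers random letter probabilities, which we do not reproduce here). With alphabet size m and space probability s, a word of length L has probability s((1-s)/m)^L while ranks grow by roughly a factor of m per extra letter, so the log-log slope is -alpha with alpha = log(m/(1-s))/log(m), which tends to 1 as m grows. The value s = 0.2 below is an arbitrary choice.

```python
import math

def monkey_zipf_exponent(m, s=0.2):
    """Zipf exponent magnitude for the equiprobable monkey model:
    each extra letter multiplies the word probability by (1-s)/m while
    the rank grows by roughly a factor of m, giving a log-log slope of
    -alpha with alpha = log(m / (1 - s)) / log(m) = 1 + log(1/(1-s))/log(m)."""
    return math.log(m / (1.0 - s)) / math.log(m)
```

Evaluating this for growing alphabets shows the exponent approaching the Zipf value of 1 from above.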
http://www.mdpi.com/1099-4300/18/3/87
The evolution of the entropy production in solids due to heat transfer is usually associated with Prigogine’s minimum entropy production principle. In this paper, we propose a critical review of the results of Prigogine and some comments on the succeeding literature. We suggest a characterization of the evolution of the entropy production of the system through the generalized Fourier modes, showing that they are the only states with a time-independent entropy production. The variational approach and a Lyapunov functional of the temperature, monotonically decreasing with time, are discussed. We describe the analytic properties of the entropy production as a function of time in terms of the generalized Fourier coefficients of the system. Analytical tools are used throughout the paper, and numerical examples support the statements.Entropy2016-03-08183Article10.3390/e18030087871099-43002016-03-08doi: 10.3390/e18030087Federico Zullo<![CDATA[Entropy, Vol. 18, Pages 86: Exergy and Thermoeconomic Analysis for an Underground Train Station Air-Conditioning Cooling System]]>
http://www.mdpi.com/1099-4300/18/3/86
The necessity of air-conditioning accounts for the enormous energy use of underground train stations. Exergy and thermoeconomic analysis is applied to the annual operation of the air-conditioning system of a large underground train station in Taiwan. The current installation and the monitored data are taken as the base case, which is then compared to three different optimized designs. The total revenue requirement levelized cost rate and the total exergy destruction rate are used to evaluate the merits of each design. The results show that the cost optimization objective obtains a lower total revenue requirement levelized cost rate, but at the expense of a higher total exergy destruction rate. Optimization of thermodynamic efficiency, however, leads to a lower total exergy destruction rate, but increases the total revenue requirement levelized cost rate significantly. It is shown that multi-objective optimization results in a small marginal increase in the total revenue requirement levelized cost rate, but achieves a significantly lower total exergy destruction rate. Results in terms of the normalized total revenue requirement levelized cost rate and the normalized total exergy destruction rate are also presented. Second-law analysis applied to underground train stations thus shows that lower annual energy use and lower CO2 emissions can be achieved.Entropy2016-03-07183Article10.3390/e18030086861099-43002016-03-07doi: 10.3390/e18030086Ke LiaoYew Chuah<![CDATA[Entropy, Vol. 18, Pages 83: iDoRNA: An Interacting Domain-based Tool for Designing RNA-RNA Interaction Systems]]>
http://www.mdpi.com/1099-4300/18/3/83
RNA-RNA interactions play a crucial role in gene regulation in living organisms. They have gained increasing interest in the field of synthetic biology because of their potential applications in medicine and biotechnology. However, few novel regulators based on RNA-RNA interactions with desired structures and functions have been developed, owing to the challenges of developing design tools. Recently, we proposed a novel tool, called iDoDe, for designing RNA-RNA interacting sequences by first decomposing RNA structures into interacting domains and then designing each domain using a stochastic algorithm. However, iDoDe did not provide an optimal solution because it lacked a mechanism to optimize the design. In this work, we have further developed the tool by incorporating a genetic algorithm (GA) to find an RNA solution with maximized structural similarity and minimized hybridized RNA energy, and renamed the tool iDoRNA. A set of suitable parameters for the genetic algorithm was determined: a weighting factor of 0.7, a crossover rate of 0.9, a mutation rate of 0.1, and eight individuals per population. We demonstrated the performance of iDoRNA in comparison with iDoDe by using six RNA-RNA interaction models. It was found that iDoRNA could efficiently generate all models of interacting RNAs with far more accuracy and far less computational time than iDoDe. Moreover, we compared the design performance of our tool against existing design tools using forty-four RNA-RNA interaction models. The results showed that iDoRNA performs better than RiboMaker when considering the ensemble defect, the fitness score and computation time usage. However, it appears that iDoRNA is outperformed by NUPACK and RNAiFold 2.0 when considering the ensemble defect. Nevertheless, iDoRNA can still be a useful alternative tool for designing novel RNA-RNA interactions in synthetic biology research.
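The GA parameters reported in the abstract above (crossover rate 0.9, mutation rate 0.1, population of 8) can be slotted into a generic genetic algorithm skeleton. This is a sketch of the general technique only: the binary genome and the toy fitness used in the test below stand in for iDoRNA's actual sequence representation and its structure-similarity/energy objective.

```python
import random

def genetic_algorithm(fitness, length, pop_size=8, crossover_rate=0.9,
                      mutation_rate=0.1, generations=300, seed=1):
    """Generic elitist GA over binary genomes with the parameter settings
    reported for iDoRNA; the fitness function is supplied by the caller."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def pick():                       # binary tournament selection
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        nxt = [best[:]]                   # elitism: carry the best forward
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, length)
                child = p1[:cut] + p2[cut:]      # one-point crossover
            else:
                child = p1[:]
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best
```

With such a small population, the elitism step is what keeps the best-so-far design from being lost between generations.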
The source code of iDoRNA can be downloaded from the site http://synbio.sbi.kmutt.ac.th.Entropy2016-03-07183Article10.3390/e18030083831099-43002016-03-07doi: 10.3390/e18030083Jittrawan ThaiprasitBoonserm KaewkamnerdpongDujduan Waraho-ZhmayevSupapon CheevadhanarakAsawin Meechai<![CDATA[Entropy, Vol. 18, Pages 85: Assessing the Robustness of Thermoeconomic Diagnosis of Fouled Evaporators: Sensitivity Analysis of the Exergetic Performance of Direct Expansion Coils]]>
http://www.mdpi.com/1099-4300/18/3/85
Thermoeconomic diagnosis of refrigeration systems is a pioneering approach to the diagnosis of malfunctions, which has recently been proven to perform well in the detection of specific faults. Being an exergy-based diagnostic technique, its performance is influenced by the trends of the exergy functions in the “design” and “abnormal” conditions. In this paper, the sensitivity of the performance of thermoeconomic diagnosis in detecting a fouled direct expansion coil, and in quantifying the additional consumption it induces, is investigated; this fault is critical due to the simultaneous air cooling and dehumidification occurring in the coil, which induce variations in both the chemical and thermal fractions of air exergy. The examined parameters are the temperature and humidity of the inlet air, the humidity of the reference state and the sensible/latent heat ratio (varied by considering different coil depths). The exergy analysis reveals that, due to the more intense dehumidification occurring in the presence of fouling, the exergy efficiency of the evaporator coil eventually increases. When the diagnostic technique is based only on the thermal fraction of air exergy, the results suggest that its performance increases when the inlet air has a lower absolute humidity, as evident from the “optimal performance” regions identified on a psychrometric chart.Entropy2016-03-05183Article10.3390/e18030085851099-43002016-03-05doi: 10.3390/e18030085Antonio PiacentinoPietro Catrini<![CDATA[Entropy, Vol. 18, Pages 84: Entropy and Fractal Antennas]]>
http://www.mdpi.com/1099-4300/18/3/84
The entropies of Shannon, Rényi and Kolmogorov are analyzed and compared together with their main properties. The entropy of some particular antennas with a pre-fractal shape, also called fractal antennas, is studied. In particular, their entropy is linked with the fractal geometrical shape and the physical performance.Entropy2016-03-04183Article10.3390/e18030084841099-43002016-03-04doi: 10.3390/e18030084Emanuel Guariglia<![CDATA[Entropy, Vol. 18, Pages 82: Hierarchical Decomposition Thermodynamic Approach for the Study of Solar Absorption Refrigerator Performance]]>
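For reference, the Shannon and Rényi entropies compared in the fractal-antenna abstract above are, for a probability distribution p = (p_1, ..., p_n), the standard expressions

```latex
H(p) = -\sum_{i=1}^{n} p_i \log p_i ,
\qquad
H_{\alpha}(p) = \frac{1}{1-\alpha}\,\log \sum_{i=1}^{n} p_i^{\alpha},
\quad \alpha > 0,\ \alpha \neq 1 .
```

The Rényi entropy recovers the Shannon entropy in the limit α → 1, while the Kolmogorov (Kolmogorov–Sinai) entropy is defined for dynamical systems as the supremum, over finite partitions of phase space, of the Shannon entropy rate.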
http://www.mdpi.com/1099-4300/18/3/82
A thermodynamic approach based on the hierarchical decomposition commonly used in mechanical structure engineering is proposed and applied to an absorption refrigeration cycle. A thermodynamic analysis of the performance of solar absorption refrigerators is thus presented. Under the hypothesis of an endoreversible model, the effects of the generator, solar concentrator and solar converter temperatures on the coefficient of performance (COP) are presented and discussed. In particular, the variation of the COP with the ratio of the heat-transfer area Ah of the high-temperature part (the thermal engine 2) to the heat-transfer area Ar of the low-temperature part (the thermal receptor) is studied in this paper. For low values of Ah relative to Ar, for example Ah equal to 30% of Ar, the coefficient of performance is relatively high (approximately 65%). For an equal-area distribution, corresponding to an area ratio of 50%, the COP is approximately 35%. The originality of this approach is that it allows a conceptual study of the solar absorption cycle.Entropy2016-03-04183Article10.3390/e18030082821099-43002016-03-04doi: 10.3390/e18030082Emma Berrich BetoucheAli FellahAmmar Ben BrahimFethi AlouiMichel Feidt<![CDATA[Entropy, Vol. 18, Pages 81: Phase Transitions in Equilibrium and Non-Equilibrium Models on Some Topologies]]>
http://www.mdpi.com/1099-4300/18/3/81
On some regular and non-regular topologies, we studied the critical properties of models that present up-down symmetry, like the equilibrium Ising model and the nonequilibrium majority vote model. These are investigated on networks such as Apollonian (AN), Barabási–Albert (BA), small-world (SW), Voronoi–Delaunay (VD) and Erdös–Rényi (ER) random graphs. The review here covers phase transitions, critical points, exponents and universality classes, which are compared to the results obtained for these models on regular square lattices (SL).Entropy2016-03-03183Article10.3390/e18030081811099-43002016-03-03doi: 10.3390/e18030081Francisco De Sousa Lima<![CDATA[Entropy, Vol. 18, Pages 80: Minimal Length, Measurability and Gravity]]>
http://www.mdpi.com/1099-4300/18/3/80
The present work is a continuation of the author’s previous papers on the subject. In terms of the measurability (or measurable quantities) notion introduced in a minimal length theory, consideration is first given to a quantum theory in the momentum representation. The same terms are then used to consider the Markov gravity model, which here illustrates the general approach to studies of gravity in terms of measurable quantities.Entropy2016-03-02183Article10.3390/e18030080801099-43002016-03-02doi: 10.3390/e18030080Alexander Shalyt-Margolin<![CDATA[Entropy, Vol. 18, Pages 79: Entanglement Entropy in a Triangular Billiard]]>
http://www.mdpi.com/1099-4300/18/3/79
The Schrödinger equation for a quantum particle in a two-dimensional triangular billiard can be written as the Helmholtz equation with a Dirichlet boundary condition. We numerically explore the quantum entanglement of the eigenfunctions of the triangular billiard and its relation to the irrationality of the triangular geometry. We also study the entanglement dynamics of a coherent state with its center chosen at the centroid of each triangle configuration. Using the von Neumann entropy of entanglement, we quantify the quantum entanglement appearing in the eigenfunctions of the triangular domain. We see a clear correspondence between the irrationality of the triangle and the average entanglement of the eigenfunctions. The entanglement dynamics of the coherent state show a dependence on the geometry of the triangle. The effect of quantum squeezing on the coherent state is analyzed, and it can be utilized to enhance or decrease the entanglement entropy in a triangular billiard.Entropy2016-03-01183Article10.3390/e18030079791099-43002016-03-01doi: 10.3390/e18030079Sijo JosephMiguel Sanjuán<![CDATA[Entropy, Vol. 18, Pages 78: Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways]]>
http://www.mdpi.com/1099-4300/18/3/78
In this paper, a diagnosis scheme is proposed to address the detection and isolation of open-switch faults in the traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault-detection methods, namely the discrete wavelet transform and the discrete wavelet packet transform. The traditional methods cannot efficiently detect open-switch faults in traction inverters because of the low resolution or the sudden changes of the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open-switch fault are evaluated by an evaluation parameter. Comparison experiments are carried out to select the best entropy form for traction inverter open-switch fault detection. In addition, the DC component is adopted to isolate the faulty Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed scheme can diagnose single and simultaneous open-switch faults correctly and in a timely manner. Entropy 2016, 18(3), 78; Article; ISSN 1099-4300; doi: 10.3390/e18030078; published 2016-03-01. Keting Hu, Zhigang Liu, Shuangshuang Lin<![CDATA[Entropy, Vol. 18, Pages 75: The Impact of Entropy Production and Emission Mitigation on Economic Growth]]>
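The idea behind WPESE can be sketched as follows (this is not the paper's implementation: the Haar wavelet, depth, and test signals are illustrative assumptions). A wavelet packet transform splits the current signal into sub-bands; the Shannon entropy of the normalised sub-band energies is then the fault indicator:

```python
import math
import random

def haar_split(signal):
    """One Haar analysis step: low-pass (approximation) and
    high-pass (detail) halves of the signal."""
    lo = [(signal[i] + signal[i + 1]) / math.sqrt(2)
          for i in range(0, len(signal) - 1, 2)]
    hi = [(signal[i] - signal[i + 1]) / math.sqrt(2)
          for i in range(0, len(signal) - 1, 2)]
    return lo, hi

def wavelet_packet_energy_entropy(signal, depth):
    """WPESE: Shannon entropy of the normalised energies of the
    2**depth wavelet-packet sub-bands at the given depth."""
    nodes = [signal]
    for _ in range(depth):
        nodes = [half for node in nodes for half in haar_split(node)]
    energies = [sum(x * x for x in node) for node in nodes]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

rng = random.Random(0)
n = 512
healthy = [math.sin(2 * math.pi * 5 * i / n) for i in range(n)]
faulty = [s + rng.gauss(0.0, 0.5) for s in healthy]  # distorted current

print(wavelet_packet_energy_entropy(healthy, depth=3))
print(wavelet_packet_energy_entropy(faulty, depth=3))
```

A healthy, nearly sinusoidal phase current concentrates its energy in a few sub-bands (low entropy), while a distorted post-fault current spreads energy across sub-bands (higher entropy); thresholding such an indicator is what the detection step exploits.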
http://www.mdpi.com/1099-4300/18/3/75
Entropy production in industrial economies involves heat currents, driven by gradients of temperature, and particle currents, driven by specific external forces and by gradients of temperature and chemical potentials. Pollution functions are constructed for the associated emissions. They reduce the output elasticities of the production factors capital, labor, and energy in the growth equation of the capital-labor-energy-creativity model when the emissions approach their critical limits. These limits are set by, e.g., health hazards or threats to ecological and climate stability. By definition, the limits oblige the economic actors to dedicate shares of the available production factors to emission mitigation, or to adjustment to the emission-induced changes in the biosphere. Since these shares are no longer available for producing the goods and services that consumers and investors would otherwise receive, the “conventional” output of the economy shrinks. The resulting losses of conventional output are estimated for two classes of scenarios: (1) energy conservation; and (2) nuclear exit and subsidies to photovoltaics. The data of the scenarios refer to Germany in the 1980s and after 11 March 2011. For the energy-conservation scenarios, a method of computing the reduction of output elasticities by emission abatement is proposed. Entropy 2016, 18(3), 75; Article; ISSN 1099-4300; doi: 10.3390/e18030075; published 2016-02-27. Reiner Kümmel<![CDATA[Entropy, Vol. 18, Pages 77: Tea Category Identification Using a Novel Fractional Fourier Entropy and Jaya Algorithm]]>
http://www.mdpi.com/1099-4300/18/3/77
This work proposes a tea-category identification (TCI) system, which can automatically determine the tea category from images captured by a three charge-coupled-device (3-CCD) digital camera. Three hundred tea images were acquired as the dataset. Apart from the 64 traditional color histogram features, we also introduced a relatively new feature known as fractional Fourier entropy (FRFE) and extracted 25 FRFE features from each tea image. Furthermore, kernel principal component analysis (KPCA) was harnessed to reduce the 64 + 25 = 89 features to four. These four reduced features were fed into a feedforward neural network (FNN), whose optimal weights were obtained by the Jaya algorithm. The 10 × 10-fold stratified cross-validation (SCV) showed that our TCI system obtains an overall average sensitivity of 97.9%, higher than seven existing approaches. In addition, we used only four features, no more than the state-of-the-art approaches. Our proposed system is thus efficient in terms of tea-category identification. Entropy 2016, 18(3), 77; Article; ISSN 1099-4300; doi: 10.3390/e18030077; published 2016-02-27. Yudong Zhang, Xiaojun Yang, Carlo Cattani, Ravipudi Rao, Shuihua Wang, Preetha Phillips<![CDATA[Entropy, Vol. 18, Pages 64: Self-Replicating Spots in the Brusselator Model and Extreme Events in the One-Dimensional Case with Delay]]>
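The evaluation protocol, stratified k-fold cross-validation, can be sketched as below (not the authors' code; the three equal-sized categories are an illustrative assumption, since the abstract does not state the class breakdown of the 300 images):

```python
import random
from collections import Counter, defaultdict

def stratified_kfold(labels, k, rng):
    """Return k folds of sample indices such that each fold preserves
    the per-class proportions of `labels` (one round of stratified CV)."""
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for i, idx in enumerate(idxs):
            folds[i % k].append(idx)   # round-robin after shuffling
    return folds

# 300 images; three categories of 100 each is an assumed split.
labels = ["green"] * 100 + ["black"] * 100 + ["oolong"] * 100
rng = random.Random(0)
folds = stratified_kfold(labels, k=10, rng=rng)

sizes = [len(fold) for fold in folds]
per_class = Counter(labels[i] for i in folds[0])
print(sizes)       # each fold holds 30 samples
print(per_class)   # 10 of each category in every fold
```

The "10 × 10" part of the protocol simply repeats this construction with ten different shuffles and averages the resulting sensitivities, which reduces the variance of the reported 97.9% figure.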
http://www.mdpi.com/1099-4300/18/3/64
We consider the paradigmatic Brusselator model for the study of dissipative structures in far-from-equilibrium systems. In two dimensions, we show the occurrence of a self-replication phenomenon leading to the fragmentation of a single localized spot into four daughter spots. This instability affects the new spots and leads to splitting behavior until the system reaches a hexagonal stationary pattern. This phenomenon occurs in the absence of delayed feedback. In addition, we incorporate a time-delayed feedback loop into the Brusselator model. In one dimension, we show that the delayed feedback induces extreme events in a chemical reaction–diffusion system. We characterize their formation by computing the probability distribution of the pulse height. The long-tailed statistical distribution, often considered a signature of the presence of rogue waves, appears for sufficiently strong feedback intensity. The generality of our analysis suggests that the feedback-induced instability leading to the spontaneous formation of rogue waves in a controllable way is a universal phenomenon. Entropy 2016, 18(3), 64; Article; ISSN 1099-4300; doi: 10.3390/e18030064; published 2016-02-27. Mustapha Tlidi, Yerali Gandica, Giorgio Sonnino, Etienne Averlant, Krassimir Panajotov
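A minimal explicit-Euler sketch of the one-dimensional Brusselator reaction–diffusion system (du/dt = a − (b+1)u + u²v + Dᵤ∇²u, dv/dt = bu − u²v + Dᵥ∇²v) is given below, without the paper's delayed-feedback term; all parameter values are illustrative, not taken from the article:

```python
def brusselator_step(u, v, a, b, du, dv, dt, dx):
    """One explicit Euler step of the 1-D Brusselator reaction-diffusion
    system with periodic boundary conditions."""
    n = len(u)
    nu, nv = u[:], v[:]
    for i in range(n):
        lap_u = (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) / dx**2
        lap_v = (v[(i - 1) % n] - 2 * v[i] + v[(i + 1) % n]) / dx**2
        react_u = a - (b + 1) * u[i] + u[i] ** 2 * v[i]
        react_v = b * u[i] - u[i] ** 2 * v[i]
        nu[i] = u[i] + dt * (react_u + du * lap_u)
        nv[i] = v[i] + dt * (react_v + dv * lap_v)
    return nu, nv

a, b = 2.0, 3.0
u = [a] * 64        # homogeneous steady state u* = a
v = [b / a] * 64    # homogeneous steady state v* = b / a
for _ in range(100):
    u, v = brusselator_step(u, v, a, b, du=1.0, dv=8.0, dt=1e-3, dx=0.5)
print(max(abs(x - a) for x in u))
```

Initialized exactly at the homogeneous steady state, reaction and diffusion terms vanish and the state remains stationary; localized spots and, with a hypothetical delayed term such as f·(u(t − τ) − u(t)) added to react_u, the extreme-event regime emerge only from perturbed initial conditions and suitable parameters.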