Entropy
http://www.mdpi.com/journal/entropy
Latest open access articles published in Entropy (ISSN 1099-4300) at http://www.mdpi.com/journal/entropy

Entropy, Vol. 17, Pages 2706-2722: Identifying the Most Relevant Lag with Runs
http://www.mdpi.com/1099-4300/17/5/2706
In this paper, we propose a nonparametric statistical tool to identify the most relevant lag in the model description of a time series. It is also shown that it can be used for model identification. The statistic is based on the number of runs when the time series is symbolized according to its empirical quantiles. With a Monte Carlo simulation, we show the size and power performance of our new test statistic under linear and nonlinear data generating processes. From the theoretical point of view, it is the first time that symbolic analysis and runs have been proposed to identify characteristic lags and also to help in the identification of univariate time series models. From a more applied point of view, the results show the power and competitiveness of the proposed tool with respect to other techniques without presuming or specifying a model.

Entropy 2015, 17(5), 2706–2722; Article; doi:10.3390/e17052706; published 2015-04-28. Authors: Úrsula Faura, Matilde Lafuente, Mariano Matilla-García, Manuel Ruiz.

Entropy, Vol. 17, Pages 2688-2705: A Fuzzy Logic-Based Approach for Estimation of Dwelling Times of Panama Metro Stations
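The runs statistic described in the lag-identification abstract above might be sketched roughly as follows; the function name and the simple two-symbol (median) symbolization are assumptions for illustration, not the paper's exact construction:

```python
import numpy as np

def runs_at_lag(x, lag, q=0.5):
    """Symbolize a series by an empirical quantile, read it off at the given
    lag, and count runs (maximal blocks of identical symbols). Illustrative
    sketch only; the paper's statistic may use a finer symbolization."""
    symbols = (np.asarray(x, dtype=float) >= np.quantile(x, q)).astype(int)
    s = symbols[::lag]                        # subsample at the candidate lag
    return 1 + int(np.sum(s[1:] != s[:-1]))   # runs = number of symbol changes + 1
```

For a period-2 series such as 1, 2, 1, 2, …, lag 1 produces the maximum number of runs while lag 2 collapses the symbolized series into a single run, which is the kind of contrast a lag-relevance test can exploit.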
http://www.mdpi.com/1099-4300/17/5/2688
Passenger flow modeling and station dwelling time estimation are significant elements for railway mass transit planning, but system operators usually have limited information to model the passenger flow. In this paper, an artificial-intelligence technique known as fuzzy logic is applied to the estimation of the elements of the origin-destination matrix and the dwelling time of stations in a railway transport system. The fuzzy inference engine used in the algorithm is based on the principle of maximum entropy. The approach considers passengers’ preferences to assign a level of congestion to each car of the train as a function of the properties of the station platforms. This approach is implemented to estimate the passenger flow and dwelling times of the recently opened Line 1 of the Panama Metro. The dwelling times obtained from the simulation are compared to real measurements to validate the approach.

Entropy 2015, 17(5), 2688–2705; Article; doi:10.3390/e17052688; published 2015-04-27. Authors: Aranzazu Alvarez, Fernando Merchan, Francisco Poyo, Rony George.

Entropy, Vol. 17, Pages 2677-2687: Projective Synchronization of Chaotic Discrete Dynamical Systems via Linear State Error Feedback Control
http://www.mdpi.com/1099-4300/17/5/2677
A projective synchronization scheme for a kind of n-dimensional discrete dynamical system is proposed by means of a linear feedback control technique. The scheme consists of master and slave discrete dynamical systems coupled by linear state error variables. A novel kind of 3-D chaotic discrete system is constructed, to which the test for chaos is applied. By using the stability principles of an upper or lower triangular matrix, two controllers for achieving projective synchronization are designed and illustrated with the novel systems. Lastly, some numerical simulations are employed to validate the effectiveness of the proposed projective synchronization scheme.

Entropy 2015, 17(5), 2677–2687; Article; doi:10.3390/e17052677; published 2015-04-27. Authors: Baogui Xin, Zhiheng Wu.

Entropy, Vol. 17, Pages 2655-2676: State Feedback with Memory for Constrained Switched Positive Linear Systems
http://www.mdpi.com/1099-4300/17/5/2655
In this paper, the stabilization problem for switched linear systems with time-varying delay under state and control constraints is investigated. The synthesis of bounded state-feedback controllers with memory ensures that the closed-loop state is positive and stable. Firstly, synthesis with a sign-restricted (nonnegative or negative) control is considered for general switched systems; then, the stabilization issue under bounded controls, including asymmetrically bounded controls and state constraints, is addressed. In addition, the results are extended to systems with interval and polytopic uncertainties. All the proposed conditions are solvable in terms of linear programming. Numerical examples illustrate the applicability of the results.

Entropy 2015, 17(5), 2655–2676; Article; doi:10.3390/e17052655; published 2015-04-27. Authors: Jinjin Liu, Kanjian Zhang.

Entropy, Vol. 17, Pages 2642-2654: Stochastic Processes via the Pathway Model
http://www.mdpi.com/1099-4300/17/5/2642
After collecting data from observations or experiments, the next step is to analyze the data to build an appropriate mathematical or stochastic model to describe the data so that further studies can be done with the help of the model. In this article, the input-output type mechanism is considered first, where reaction, diffusion, reaction-diffusion, and production-destruction type physical situations can fit in. Techniques are then described to produce thicker or thinner tails (power-law behavior) in stochastic models. Next, the pathway idea is described, whereby one can switch among different functional forms of the probability density function through a parameter called the pathway parameter. The paper is a continuation of related solar neutrino research published previously in this journal.

Entropy 2015, 17(5), 2642–2654; Article; doi:10.3390/e17052642; published 2015-04-24. Authors: Arak Mathai, Hans Haubold.

Entropy, Vol. 17, Pages 2624-2641: Recurrence Plot Based Damage Detection Method by Integrating Control Chart
http://www.mdpi.com/1099-4300/17/5/2624
Because of the importance of damage detection in manufacturing systems and other areas, many fault detection methods have been developed that are based on a vibration signal. Little work, however, has been reported in the literature on using a recurrence plot method to analyze the vibration signal for damage detection. In this paper, we develop a recurrence plot based fault detection method by integrating a statistical process control technique. The recurrence plots of the vibration signals are derived by using the recurrence plot (RP) method. Five types of features are extracted from the recurrence plots to quantify the vibration signals’ characteristics. Then the control chart, a multivariate statistical process control technique, is used to monitor these features. The control chart technique, however, assumes that the data follow a normal distribution. An RP-based bootstrap control chart is therefore proposed to estimate the control chart parameters. The performance of the proposed RP-based bootstrap control chart is evaluated by a simulation study and compared with other univariate bootstrap control charts based on recurrence plot features. A real case study of rolling element bearing fault detection demonstrates that the proposed fault detection method achieves a very good performance.

Entropy 2015, 17(5), 2624–2641; Article; doi:10.3390/e17052624; published 2015-04-24. Authors: Cheng Zhou, Weidong Zhang.

Entropy, Vol. 17, Pages 2606-2623: Uncovering Discrete Non-Linear Dependence with Information Theory
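A recurrence plot of the kind used in the damage-detection abstract above reduces, in its simplest form, to a thresholded distance matrix; the sketch below uses embedding dimension 1 for brevity (a real analysis would delay-embed the signal first), and the feature shown, the recurrence rate, is only one of the five the paper extracts:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot of a 1-D signal: R[i, j] = 1 when |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def recurrence_rate(R):
    """Recurrence rate, the simplest RP feature: the fraction of recurrent points."""
    return float(R.mean())
```

A control chart would then monitor such features window by window, flagging windows whose feature vector drifts outside the bootstrap-estimated control limits.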
http://www.mdpi.com/1099-4300/17/5/2606
In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measurements are evaluated on toy examples and applied to high frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.

Entropy 2015, 17(5), 2606–2623; Article; doi:10.3390/e17052606; published 2015-04-23. Authors: Anton Golub, Gregor Chliamovitch, Alexandre Dupuis, Bastien Chopard.

Entropy, Vol. 17, Pages 2590-2605: Entropy and Recurrence Measures of a Financial Dynamic System by an Interacting Voter System
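The basic ingredients of the Markov/KL construction described in the abstract above (for the first-order case) can be sketched as follows; the helper names and the row-averaged form of the divergence are illustrative assumptions, not the paper's exact estimator:

```python
import numpy as np

def transition_matrix(seq, k):
    """Empirical first-order transition matrix of a state sequence over
    states 0..k-1 (every state must occur at least once as a source)."""
    counts = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def kl_rows(p, q, eps=1e-12):
    """Row-averaged Kullback-Leibler divergence D(P || Q) between two
    stochastic matrices; eps guards the logarithm against exact zeros."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    return float(np.mean(np.sum(p * np.log(p / q), axis=1)))
```

Comparing a known transition matrix against its sample estimate with such a divergence is what lets the paper calibrate an approximate null distribution and, from it, choose the optimal Markov order.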
http://www.mdpi.com/1099-4300/17/5/2590
A financial time series agent-based model is reproduced and investigated by means of a statistical physics system, the finite-range interacting voter system. The voter system originally describes the collective behavior of voters who constantly update their positions on a particular topic, which is a continuous-time Markov process. In the proposed model, the fluctuations of stock price changes are attributed to the market information interaction amongst the traders and certain similarities of investors’ behaviors. Further, the complexity of the return series of the financial model is studied in comparison with two real stock indexes, the Shanghai Stock Exchange Composite Index and the Hang Seng Index, by composite multiscale entropy analysis and recurrence analysis. The empirical research shows that the simulation data for the proposed model can grasp some natural features of actual markets to some extent.

Entropy 2015, 17(5), 2590–2605; Article; doi:10.3390/e17052590; published 2015-04-23. Authors: Hong-Li Niu, Jun Wang.

Entropy, Vol. 17, Pages 2573-2589: Source Localization by Entropic Inference and Backward Renormalization Group Priors
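The multiscale entropy analysis mentioned in the voter-model abstract above rests on two building blocks, coarse-graining and sample entropy; this is a plain, unoptimized sketch (the composite variant in the paper additionally averages over shifted coarse-grainings):

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages used at each scale of multiscale entropy."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Plain O(n^2) sample entropy SampEn(m, r*std); sketch, not optimized."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        total = 0
        for i in range(len(t)):
            d = np.max(np.abs(t - t[i]), axis=1)   # Chebyshev distance to all templates
            total += int(np.sum(d <= tol)) - 1     # exclude the self-match
        return total
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

Evaluating `sample_entropy(coarse_grain(x, s))` over a range of scales `s` yields the MSE curve that the paper compares between simulated returns and the two real index series.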
http://www.mdpi.com/1099-4300/17/5/2573
A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors in finer scales from posteriors in coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by Maximum Entropy. The resulting inference method, backward RG (BRG) priors, is tested by doing simulations of a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working in the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are as low as ~ −25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish whether the method is failing, due to very high levels of noise, or whether identification of the sources is at least partially possible.

Entropy 2015, 17(5), 2573–2589; Article; doi:10.3390/e17052573; published 2015-04-23. Author: Nestor Caticha.

Entropy, Vol. 17, Pages 2556-2572: Optimum Accelerated Degradation Tests for the Gamma Degradation Process Case under the Constraint of Total Cost
http://www.mdpi.com/1099-4300/17/5/2556
An accelerated degradation test (ADT) is regarded as an effective alternative to an accelerated life test in the sense that an ADT can provide more accurate information on product reliability, even when few or no failures may be expected before the end of a practical test period. In this paper, statistical methods for optimally designing ADT plans are developed assuming that the degradation characteristic follows a gamma process (GP). The GP-based approach has the advantage that it can deal with more frequently encountered situations in which the degradation should always be nonnegative and strictly increasing over time. The optimal ADT plan is developed under the total experimental cost constraint by determining the optimal settings of variables such as the number of measurements, the measurement times, the test stress levels and the number of units allocated to each stress level, such that the asymptotic variance of the maximum likelihood estimator of the q-th quantile of the lifetime distribution at the use condition is minimized. In addition, compromise plans are developed to provide means to check the adequacy of the assumed acceleration model. Finally, sensitivity analysis procedures for assessing the effects of the uncertainties in the pre-estimates of unknown parameters are illustrated with an example.

Entropy 2015, 17(5), 2556–2572; Article; doi:10.3390/e17052556; published 2015-04-23. Author: Heonsang Lim.

Entropy, Vol. 17, Pages 2544-2555: Thermodynamic Analysis of Double-Stage Compression Transcritical CO2 Refrigeration Cycles with an Expander
http://www.mdpi.com/1099-4300/17/4/2544
Four different double-compression CO2 transcritical refrigeration cycles are studied: the double-compression external intercooler cycle (DCEI), the double-compression external intercooler cycle with an expander (DCEIE), the double-compression flash intercooler cycle (DCFI), and the double-compression flash intercooler cycle with an expander (DCFIE). The results show that the optimum gas cooler pressure and optimum intermediate pressure of the flash intercooler cycles are lower than those of the external intercooler cycle. The use of an expander in the DCEI cycle leads to a decrease of the optimum gas cooler pressure and little variation of the optimum intermediate pressure. However, the replacement of the throttle valve with an expander in the DCFI cycle results in little variation of the optimal gas cooler pressure and an increase of the optimum intermediate pressure. The DCFI cycle outperforms the DCEI cycle under all the chosen operating conditions. The DCEIE cycle outperforms the DCFIE cycle when the evaporating temperature exceeds 0 °C or the gas cooler outlet temperature surpasses 35 °C. When the gas cooler exit temperature varies from 32 °C to 48 °C, the DCEI, DCEIE, DCFI and DCFIE cycles yield average COP improvements of 4.6%, 29.2%, 12.9% and 22.3%, respectively, over the basic cycle.

Entropy 2015, 17(4), 2544–2555; Article; doi:10.3390/e17042544; published 2015-04-22. Authors: Zhenying Zhang, Lirui Tong, Xingguo Wang.

Entropy, Vol. 17, Pages 2459-2543: Justifying Objective Bayesianism on Predicate Languages
http://www.mdpi.com/1099-4300/17/4/2459
Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.

Entropy 2015, 17(4), 2459–2543; Article; doi:10.3390/e17042459; published 2015-04-22. Authors: Jürgen Landes, Jon Williamson.

Entropy, Vol. 17, Pages 2432-2458: Information Geometry on Complexity and Stochastic Interaction
http://www.mdpi.com/1099-4300/17/4/2432
Interdependencies of stochastically interacting units are usually quantified by the Kullback-Leibler divergence of a stationary joint probability distribution on the set of all configurations from the corresponding factorized distribution. This is a spatial approach which does not describe the intrinsically temporal aspects of interaction. In the present paper, the setting is extended to a dynamical version where temporal interdependencies are also captured by using information geometry of Markov chain manifolds.

Entropy 2015, 17(4), 2432–2458; Article; doi:10.3390/e17042432; published 2015-04-21. Author: Nihat Ay.

Entropy, Vol. 17, Pages 2409-2431: Collaborative Performance Research on Multi-level Hospital Management Based on Synergy Entropy-HoQ
http://www.mdpi.com/1099-4300/17/4/2409
Because of the general lack of research on the effectiveness of multi-level hospital management collaboration performance, this paper proposes a multi-level hospital management Synergy Entropy-House of Quality (HoQ) measurement model by innovatively combining the HoQ measurement model with a Synergy Entropy computing principle. Triangular fuzzy functions are used to determine the importance-degree parameter of each hospital management element; combined with the results from the Synergy Entropy evaluation of the hospital management elements, these yield a comprehensive collaborative computation result for the various elements, ensuring the objectivity of the results. Finally, the analysis of the collaborative research on multi-level hospital management demonstrates the scientific effectiveness of the hospital management Synergy Entropy-HoQ measurement model.

Entropy 2015, 17(4), 2409–2431; Article; doi:10.3390/e17042409; published 2015-04-20. Authors: Lei Chen, Xuedong Liang, Tao Li.

Entropy, Vol. 17, Pages 2367-2408: An Entropy-Based Network Anomaly Detection Method
http://www.mdpi.com/1099-4300/17/4/2367
Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in network traffic. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method; (ii) implementation of the method; (iii) preparation of an original dataset; (iv) evaluation of the method.

Entropy 2015, 17(4), 2367–2408; Article; doi:10.3390/e17042367; published 2015-04-20. Authors: Przemysław Bereziński, Bartosz Jasiul, Marcin Szpyrka.

Entropy, Vol. 17, Pages 2355-2366: A Criterion for Topological Close-Packed Phase Formation in High Entropy Alloys
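The core of an entropy-based detector like the one in the anomaly-detection abstract above is simple: track the Shannon entropy of a traffic feature per time window and flag large deviations from a learned baseline. The sketch below is a minimal illustration under that assumption; the feature choice, baseline model and k-sigma threshold are ours, not the article's:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of one traffic
    feature, e.g. the destination ports observed in a time window."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def is_anomalous(window, baseline_mean, baseline_std, k=3.0):
    """Flag a window whose feature entropy deviates more than k standard
    deviations from the baseline (illustrative thresholding scheme)."""
    return abs(shannon_entropy(window) - baseline_mean) > k * baseline_std
```

Botnet-like activity tends to either concentrate a feature (entropy collapses, e.g. one port scanned everywhere) or disperse it (entropy spikes), which is why deviations in both directions are flagged.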
http://www.mdpi.com/1099-4300/17/4/2355
The stability of topological close-packed (TCP) phases was found to be well related to the average value of the d-orbital energy level \( \overline{Md} \) for most reported high entropy alloys (HEAs). Excluding some HEAs that contain high levels of the elements aluminum and vanadium, the results of this study indicated that the TCP phases form at \( \overline{Md} \) > 1.09. This criterion, as a semi-empirical method, can play a key role in designing and preparing HEAs with high amounts of transition elements.

Entropy 2015, 17(4), 2355–2366; Article; doi:10.3390/e17042355; published 2015-04-20. Authors: Yiping Lu, Yong Dong, Li Jiang, Tongmin Wang, Tingju Li, Yong Zhang.

Entropy, Vol. 17, Pages 2341-2354: Multi-State Quantum Dissipative Dynamics in Sub-Ohmic Environment: The Strong Coupling Regime
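Applying the \( \overline{Md} \) criterion from the abstract above amounts to a composition-weighted average followed by a threshold test; a minimal sketch, in which the element Md values are placeholders rather than the tabulated d-orbital energy levels:

```python
def md_average(composition, md_of_element):
    """Composition-weighted average d-orbital energy level, Md-bar.
    `composition` maps element -> atomic fraction (need not sum to 1);
    the Md values supplied here are placeholders, not tabulated data."""
    total = sum(composition.values())
    return sum(frac * md_of_element[el]
               for el, frac in composition.items()) / total

def forms_tcp(composition, md_of_element, threshold=1.09):
    """Semi-empirical criterion of the abstract: TCP phases predicted
    when Md-bar exceeds the threshold (Al/V-rich alloys excluded)."""
    return md_average(composition, md_of_element) > threshold
```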
http://www.mdpi.com/1099-4300/17/4/2341
We study the dissipative quantum dynamics and the asymptotic behavior of a particle in a bistable potential interacting with a sub-Ohmic broadband environment. The reduced dynamics, in the intermediate to strong dissipation regime, is obtained beyond the two-level system approximation by using a real-time path integral approach. We find a crossover dynamic regime with damped intra-well oscillations and incoherent tunneling and a completely incoherent regime at strong damping. Moreover, a nonmonotonic behavior of the left/right well population difference is found as a function of the damping strength.

Entropy 2015, 17(4), 2341–2354; Article; doi:10.3390/e17042341; published 2015-04-17. Authors: Luca Magazzù, Davide Valenti, Angelo Carollo, Bernardo Spagnolo.

Entropy, Vol. 17, Pages 2328-2340: Exergy Analysis of a Ground-Coupled Heat Pump Heating System with Different Terminals
http://www.mdpi.com/1099-4300/17/4/2328
In order to evaluate and improve the performance of a ground-coupled heat pump (GCHP) heating system with radiant floors as terminals, an exergy analysis based on test results is performed in this study. The system is divided into four subsystems, and the exergy loss and exergy efficiency of each subsystem are calculated using expressions derived from exergy balance equations. The average values of the measured parameters are used for the exergy analysis. The analysis results show that the two largest exergy losses occur in the heat pump and the terminals, with losses of 55.3% and 22.06%, respectively, and that the lowest exergy efficiency occurs in the ground heat exchange system. Therefore, GCHP system designers should pay close attention to the selection of heat pumps and terminals, and especially to the design of ground heat exchange systems. Compared with the scenario system in which fan coil units (FCUs) are substituted for the radiant floors, the adoption of radiant floors results in a decrease of 12% in heating load, an increase of 3.24% in the exergy efficiency of the terminals and an increase of 1.18% in the total exergy efficiency of the system. The results indicate directions and means for optimizing GCHP systems.

Entropy 2015, 17(4), 2328–2340; Article; doi:10.3390/e17042328; published 2015-04-17. Authors: Xiao Chen, Xiaoli Hao.

Entropy, Vol. 17, Pages 2304-2327: Information-Theoretic Inference of Common Ancestors
http://www.mdpi.com/1099-4300/17/4/2304
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.

Entropy 2015, 17(4), 2304–2327; Article; doi:10.3390/e17042304; published 2015-04-16. Authors: Bastian Steudel, Nihat Ay.

Entropy, Vol. 17, Pages 2281-2303: Some Comments on the Entropy-Based Criteria for Piping
http://www.mdpi.com/1099-4300/17/4/2281
This paper is an extension of previous work which characterises soil behaviours using the grading entropy diagram. The present work looks at the piping process in granular soils, considering some new data from flood-protection dikes. The piping process is divided into three parts here: particle movement at the micro scale to segregate free water; sand boil development (which is the initiation of the pipe); and pipe growth. In the first part of the process, which occurs during the rising flood, the increase in shear stress along the dike base may cause segregation of water into micro pipes if the subsoil in the dike base is relatively loose. This occurs in the zone of maximum dike base shear stress level (the ratio of shear stress to strength), which is close to the toe. In the second part of the process, the shear strain increment causes a sudden, asymmetric slide and cracking of the dike, leading to localized excess pore pressure, liquefaction and the formation of a sand boil. In the third part of the process, the soil erosion initiated through the sand boil continues, and the pipe grows. Piping in the Hungarian dikes often occurs in a two-layer system, where the base layer is coarser with higher permeability and the cover layer is finer with lower permeability. The new data presented here show that the soils ejected from the sand boils are generally silty sands and sands, which are prone to both erosion (on the basis of the entropy criterion) and liquefaction. They originate from the cover layer, which is basically identical to the soil used in the Dutch backward erosion experiments.

Entropy 2015, 17(4), 2281–2303; Article; doi:10.3390/e17042281; published 2015-04-15. Authors: Emöke Imre, Laszlo Nagy, Janos Lőrincz, Negar Rahemi, Tom Schanz, Vijay Singh, Stephen Fityus.

Entropy, Vol. 17, Pages 2253-2280: Integrating Entropy and Copula Theories for Hydrologic Modeling and Analysis
http://www.mdpi.com/1099-4300/17/4/2253
Entropy is a measure of uncertainty and has been commonly used for various applications, including probability inferences in hydrology. Copulas have been widely used for constructing joint distributions to model the dependence structure of multivariate hydrological random variables. Integration of entropy and copula theories provides new insights in hydrologic modeling and analysis, though its development and application are still in their infancy. Two broad branches of integration of the two concepts, entropy copula and copula entropy, are introduced in this study. On the one hand, the entropy theory can be used to derive new families of copulas based on information content matching. On the other hand, the copula entropy provides attractive alternatives for nonlinear dependence measurement, even in higher dimensions. We introduce in this study the integration of entropy and copula theories in dependence modeling and analysis to illustrate the potential applications in hydrology and water resources.

Entropy 2015, 17(4), 2253–2280; Review; doi:10.3390/e17042253; published 2015-04-15. Authors: Zengchao Hao, Vijay Singh.

Entropy, Vol. 17, Pages 2228-2252: A Community-Based Approach to Identifying Influential Spreaders
http://www.mdpi.com/1099-4300/17/4/2228
Identifying influential spreaders in complex networks has a significant impact on the understanding and control of spreading processes in networks. In this paper, we introduce a new centrality index to identify influential spreaders in a network based on the community structure of the network. The community-based centrality (CbC) considers both the number and the sizes of communities that are directly linked by a node. We discuss correlations between CbC and other classical centrality indices. Based on simulations of a single source of infection with the Susceptible-Infected-Recovered (SIR) model, we find that CbC can help to identify some critical influential nodes that other indices cannot find. We also investigate the stability of CbC.

Entropy 2015, 17(4), 2228–2252; Article; doi:10.3390/e17042228; published 2015-04-14. Authors: Zhiying Zhao, Xiaofan Wang, Wei Zhang, Zhiliang Zhu.

Entropy, Vol. 17, Pages 2218-2227: Cryptographic Aspects of Quantum Reading
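One plausible reading of the CbC index described in the spreaders abstract above, that a node's score combines the number and sizes of the distinct communities its neighbours belong to, can be sketched as below; the exact weighting in the paper may differ, so treat this purely as an illustration of the idea:

```python
def community_based_centrality(adj, community_of, community_size):
    """Illustrative community-based score: for each node, sum the sizes of the
    distinct communities found among its neighbours. `adj` maps node -> list of
    neighbours; this is one plausible reading of CbC, not the paper's formula."""
    return {v: sum(community_size[c] for c in {community_of[u] for u in nbrs})
            for v, nbrs in adj.items()}
```

Under this reading, a node bridging several large communities scores higher than a node of equal degree whose neighbours all sit in one community, which matches the intuition that bridges make potent spreaders.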
http://www.mdpi.com/1099-4300/17/4/2218
Besides achieving secure communication between two spatially-separated parties, another important issue in modern cryptography is related to secure communication in time, i.e., the possibility to confidentially store information in a memory for later retrieval. Here we explore this possibility in the setting of quantum reading, which exploits quantum entanglement to efficiently read data from a memory where classical strategies (e.g., based on coherent states or their mixtures) cannot retrieve any information. From this point of view, the technique of quantum reading can provide a new form of technological security for data storage.

Entropy 2015, 17(4), 2218–2227; Article; doi:10.3390/e17042218; published 2015-04-13. Author: Gaetana Spedalieri.

Entropy, Vol. 17, Pages 2198-2217: Entropic-Skins Geometry to Describe Wall Turbulence Intermittency
http://www.mdpi.com/1099-4300/17/4/2198
In order to describe the phenomenon of intermittency in wall turbulence and, more particularly, the behaviour of moments and intermittency exponents ζp with the order p and the distance to the wall, we developed a new geometrical framework called “entropic-skins geometry”, based on the notion of scale-entropy, which is here applied to an experimental database of boundary layer flows. Each moment has its own spatial multi-scale support Ωp (“skin”). The model assumes the existence of a hierarchy of multi-scale sets Ωp ranging from the “bulk” to the “crest”. The crest characterizes the geometrical support where the most intermittent (the highest) fluctuations in energy dissipation occur; the bulk is the geometrical support for the whole range of fluctuations. The model then assumes the existence of a dynamical flux through the hierarchy of skins. The specific case where skins display a fractal structure is investigated. The bulk fractal dimension and the crest dimension are linked by a scale-entropy flux defining a reversibility efficiency (d is the embedding dimension). The model, initially developed for homogeneous and isotropic turbulent flows, is applied here to wall-bounded turbulence, where intermittency exponents are measured by extended self-similarity. We obtained an analytical expression for the intermittency exponents with γ ≈ 0.36, in agreement with experimental results.

Entropy 2015, 17(4), 2198–2217; Article; doi:10.3390/e17042198; published 2015-04-13. Authors: Diogo Queiros-Conde, Johan Carlier, Lavinia Grosu, Michel Stanislas.

Entropy, Vol. 17, Pages 2184-2197: Implications of Non-Differentiable Entropy on a Space-Time Manifold
http://www.mdpi.com/1099-4300/17/4/2184
Assuming that the motions of a complex system’s structural units take place on continuous, but non-differentiable curves of a space-time manifold, the scale relativity model with arbitrary constant fractal dimension (the hydrodynamic and wave function versions) is built. For non-differentiability through stochastic processes of the Markov type, the non-differentiable entropy concept on a space-time manifold in the hydrodynamic version and its correspondence with motion variables (energy, momentum, etc.) are established. Moreover, for the same non-differentiability type, through a scale resolution dependence of a fundamental length and wave function independence with respect to the proper time, a non-differentiable Klein–Gordon-type equation in the wave function version is obtained. For a phase-amplitude functional dependence on the wave function, the non-differentiable spontaneous symmetry breaking mechanism implies pattern generation in the form of Cooper non-differentiable-type pairs, while its non-differentiable topology implies some fractal logic elements (fractal bit, fractal gates, etc.).

Entropy 2015, 17(4), 2184–2197; Article; doi:10.3390/e17042184; published 2015-04-13. Authors: Maricel Agop, Alina Gavriluţ, Gavril Ştefan, Bogdan Doroftei.

Entropy, Vol. 17, Pages 2170-2183: High-Speed Spindle Fault Diagnosis with the Empirical Mode Decomposition and Multiscale Entropy Method
http://www.mdpi.com/1099-4300/17/4/2170
The root mean square (RMS) value of a vibration signal is an important indicator used to represent the amplitude of vibrations in evaluating the quality of high-speed spindles. However, RMS is unable to detect a number of common fault characteristics that occur prior to bearing failure. Extending the operational life and quality of spindles requires reliable fault diagnosis techniques for the analysis of vibration signals from three axes. This study used empirical mode decomposition to decompose signals into intrinsic mode functions, whose zero-crossing rate and energy represent the characteristics of rotating elements. The multiscale entropy (MSE) curve was then used to identify a number of characteristic defects. The purpose of this research was to analyze vibration signals along three axes with the aim of extending the operational life of devices included in the product line of an actual spindle manufacturing company.

Entropy 2015, 17(4), 2170–2183; Article; doi:10.3390/e17042170; published 2015-04-13. Authors: Nan-Kai Hsieh, Wei-Yen Lin, Hong-Tsu Young.

Entropy, Vol. 17, Pages 2140-2169: Deep Belief Network-Based Approaches for Link Prediction in Signed Social Networks
http://www.mdpi.com/1099-4300/17/4/2140
In some online social network services (SNSs), members are allowed to label their relationships with others, and such relationships can be represented as links with signed values (positive or negative). Networks containing such relations are called signed social networks (SSNs), and some real-world complex systems can also be modeled with SSNs. Given the observed structure of an SSN, link prediction aims to estimate the values of the unobserved links. Most previous approaches to link prediction are based on members’ similarity and supervised learning; however, the hidden principles that drive the behaviors of social members have rarely been investigated. In this paper, deep belief network (DBN)-based approaches for link prediction are proposed, including an unsupervised link prediction model, a feature representation method and a DBN-based link prediction method. Experiments are conducted on datasets from three SNSs in different domains, and the results show that our methods can predict the values of the links with high performance and have good generalization ability across these datasets.Entropy2015-04-10174Article10.3390/e17042140214021691099-43002015-04-10doi: 10.3390/e17042140Feng LiuBingquan LiuChengjie SunMing LiuXiaolong Wang<![CDATA[Entropy, Vol. 17, Pages 2117-2139: Image Encryption Using Chebyshev Map and Rotation Equation]]>
http://www.mdpi.com/1099-4300/17/4/2117
We propose a novel image encryption algorithm based on two pseudorandom bit generators: one based on the Chebyshev map and one based on a rotation equation. The first is used for permutation operations, and the second for substitution operations. A detailed security analysis of the novel image encryption algorithm is provided, using visual testing, key space evaluation, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, a key sensitivity test, and computational and complexity analysis. Based on the theoretical and empirical results, the novel image encryption scheme demonstrates an excellent level of security.Entropy2015-04-09174Article10.3390/e17042117211721391099-43002015-04-09doi: 10.3390/e17042117Borislav StoyanovKrasimir Kordov<![CDATA[Entropy, Vol. 17, Pages 2094-2116: Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy]]>
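As background for the abstract above, the Chebyshev map mentioned as the first pseudorandom source can be iterated very simply. The sketch below is illustrative only: the update rule x → cos(k·arccos(x)) is the standard Chebyshev map, but the degree k, the seed, and the sign-based bit extraction are hypothetical choices, not the generator described in the paper.

```python
import math

# Illustrative sketch (hypothetical parameters, not the paper's generator):
# iterate the Chebyshev map x_{n+1} = cos(k * arccos(x_n)) on [-1, 1]
# and extract one pseudorandom bit per iterate from the sign of x.
def chebyshev_bits(x0, k, n):
    x, bits = x0, []
    for _ in range(n):
        x = math.cos(k * math.acos(x))  # stays in [-1, 1] by construction
        bits.append(1 if x > 0 else 0)
    return bits

bits = chebyshev_bits(0.3, 4, 64)  # seed and degree are arbitrary examples
print(bits[:8])
```

In a permutation stage, such a bit stream would typically be consumed to shuffle pixel positions; the statistical tests listed in the abstract (histogram, entropy, correlation) are what justify any particular parameter choice.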
http://www.mdpi.com/1099-4300/17/4/2094
Complexity is an important factor throughout the software life cycle. With increasing complexity, it becomes ever more difficult to guarantee software quality, cost and development progress. Excessive complexity is one of the main reasons for the failure of software projects, so the effective recognition, measurement and control of complexity are key to project management. This paper first systematically analyzes the current state of research on software complexity and points out its existing problems. It then proposes a WSR framework of software complexity, which divides software complexity into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, yet current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of personnel organization hierarchy and the complexity of personnel communication information, and it analyzes and validates the soundness and rationality of this measurement method through a large number of cases.Entropy2015-04-08174Article10.3390/e17042094209421161099-43002015-04-08doi: 10.3390/e17042094Rong Jiang<![CDATA[Entropy, Vol. 17, Pages 2082-2093: Target Detection and Ranging through Lossy Media using Chaotic Radar]]>
http://www.mdpi.com/1099-4300/17/4/2082
A chaotic radar system has been developed for through-wall detection and ranging of targets. The chaotic signal generated by an improved Colpitts oscillator is used as the probe signal. Ranging to the target is achieved by cross-correlating the time-delayed reflected return signal with a replica of the transmitted chaotic signal. In this paper, we explore the performance of the chaotic radar system for target detection and ranging through lossy media. Experimental results show that the designed chaotic radar offers high range resolution and an unambiguous correlation profile, and can be used for through-wall target detection and sensing.Entropy2015-04-08174Article10.3390/e17042082208220931099-43002015-04-08doi: 10.3390/e17042082Bingjie WangHang XuPeng YangLi LiuJingxia Li<![CDATA[Entropy, Vol. 17, Pages 2062-2081: Kappa and q Indices: Dependence on the Degrees of Freedom]]>
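The ranging step described in this abstract — locating the peak of the cross-correlation between the echo and a replica of the transmitted signal — can be sketched numerically. This is an illustrative NumPy sketch with a synthetic noise-like probe and hypothetical sampling parameters, not the authors' hardware implementation.

```python
import numpy as np

# Illustrative sketch (not the authors' implementation): estimate target
# range from the lag of the peak cross-correlation between a transmitted
# probe signal and its delayed echo.
def estimate_range(tx, rx, sample_rate, c=3e8):
    # Full cross-correlation; index (len(tx) - 1) corresponds to zero lag
    corr = np.correlate(rx, tx, mode="full")
    lag = int(np.argmax(np.abs(corr))) - (len(tx) - 1)
    delay = lag / sample_rate          # round-trip delay in seconds
    return c * delay / 2.0             # one-way distance in meters

# Synthetic example: a noise-like probe echoed back after 200 samples
rng = np.random.default_rng(0)
tx = rng.standard_normal(4096)
rx = np.concatenate([np.zeros(200), tx])[: len(tx)]  # delayed, truncated echo
fs = 1e9  # hypothetical 1 GHz sampling rate
print(estimate_range(tx, rx, fs))  # ≈ 30 m for a 200-sample round trip
```

The sharp, single-peaked autocorrelation of a chaotic (noise-like) waveform is what makes the correlation profile unambiguous, as the abstract notes.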
http://www.mdpi.com/1099-4300/17/4/2062
The kappa distributions, or their equivalent, the q-exponential distributions, are the natural generalization of the classical Boltzmann-Maxwell distributions, applied to the study of the particle populations in collisionless space plasmas. A huge step in the development of the theory of kappa distributions and their applications in space plasma physics has been achieved with the discovery that the observed kappa distributions are connected with the solid statistical background of non-extensive statistical mechanics. Now that the statistical framework has been identified, it is straightforward to improve our understanding of the nature of the kappa index (or the entropic q-index) that governs these distributions. One critical topic is the dependence of the kappa index on the degrees of freedom. In this paper, we first show how this specific dependence emerges naturally, using the formalism of the N-particle kappa distribution of velocities. The result is then extended to the presence of potential energies. It is shown that the kappa index is simply related to the kinetic and potential degrees of freedom. In addition, it is shown that various problems of non-extensive statistical mechanics, such as (i) the dependence of correlations on the total number of particles; and (ii) the normalization divergence for finite kappa indices, are resolved by considering the dependence of the kappa index on the degrees of freedom.Entropy2015-04-08174Article10.3390/e17042062206220811099-43002015-04-08doi: 10.3390/e17042062George Livadiotis<![CDATA[Entropy, Vol. 17, Pages 2039-2061: Experimental and Thermoeconomic Analysis of Small-Scale Solar Organic Rankine Cycle (SORC) System]]>
http://www.mdpi.com/1099-4300/17/4/2039
A small-scale solar organic Rankine cycle (ORC) is a promising renewable energy-driven power generation technology that can be used in the rural areas of developing countries. A prototype was developed and tested for its performance characteristics under a range of solar source temperatures. The solar ORC system power output was calculated based on the thermal and solar collector efficiency. The maximum solar power output was observed in April. The solar ORC unit power output ranged from 0.4 kW to 1.38 kW during the year. The highest power output was obtained when the expander inlet pressure was 13 bar and the solar source temperature was 120 °C. The area of the collector for the investigation was calculated based on the meteorological conditions of Busan City (South Korea). In the second part, economic and thermoeconomic analyses were carried out to determine the cost of energy per kWh from the solar ORC. The selling price of electricity was found to be $0.68/kWh for the prototype and $0.39/kWh for a low-cost solar ORC. A sensitivity analysis was carried out to identify the economic parameters that most influence the net present value (NPV). Finally, the sustainability index was calculated to assess the sustainable development of the solar ORC system.Entropy2015-04-07174Article10.3390/e17042039203920611099-43002015-04-07doi: 10.3390/e17042039Suresh BaralDokyun KimEunkoo YunKyung Kim<![CDATA[Entropy, Vol. 17, Pages 2025-2038: A Method to Derive the Definition of Generalized Entropy from Generalized Exergy for Any State in Many-Particle Systems]]>
http://www.mdpi.com/1099-4300/17/4/2025
The literature reports the proofs that entropy is an inherent property of any system in any state and governs thermal energy, which depends on temperature and is transferred by heat interactions. A first novelty proposed in the present study is that mechanical energy, determined by pressure and transferred by work interactions, is also characterized by the entropy property. The second novelty is that a generalized definition of entropy relating to temperature, chemical potential and pressure of many-particle systems, is established to calculate the thermal, chemical and mechanical entropy contribution due to heat, mass and work interactions. The expression of generalized entropy is derived from generalized exergy, which in turn depends on temperature, chemical potential and pressure of the system, and by the entropy-exergy relationship constituting the basis of the method adopted to analyze the available energy and its transfer interactions with a reference system which may be external or constitute a subsystem. This method is underpinned by the Second Law statement enunciated in terms of existence and uniqueness of stable equilibrium for each value of energy content of the system. The equality of chemical potential and equality of pressure are assumed, in addition to equality of temperature, to be necessary conditions for stable equilibrium.Entropy2015-04-07174Article10.3390/e17042025202520381099-43002015-04-07doi: 10.3390/e17042025Pierfrancesco Palazzo<![CDATA[Entropy, Vol. 17, Pages 2010-2024: Resource Requirements and Speed versus Geometry of Unconditionally Secure Physical Key Exchanges]]>
http://www.mdpi.com/1099-4300/17/4/2010
The imperative need for unconditionally secure key exchange is underscored by the increasing connectivity of networks and by the increasing number and sophistication of cyberattacks. Two concepts that are information-theoretically secure are quantum key distribution (QKD) and Kirchhoff-Law-Johnson-Noise (KLJN) key exchange. However, these concepts require a dedicated connection between hosts in peer-to-peer (P2P) networks, which can be impractical and/or cost-prohibitive. A practical and cost-effective method is to have each host share its cable(s) with other hosts, such that two remote hosts can realize a secure key exchange without the need for an additional cable or key exchanger. In this article we analyze the cost complexities of cable, key exchangers, and time required in the star network. We discuss the reliability of the star network and compare it with other network geometries. We also propose a protocol and an equation for the number of secure bit exchange periods needed in a star network. We then outline other network geometries and trade-off possibilities that seem interesting to explore.Entropy2015-04-03174Article10.3390/e17042010201020241099-43002015-04-03doi: 10.3390/e17042010Elias GonzalezRobert BalogLaszlo Kish<![CDATA[Entropy, Vol. 17, Pages 1971-2009: Translation of Ludwig Boltzmann’s Paper “On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium” Sitzungberichte der Kaiserlichen Akademie der Wissenschaften. Mathematisch-Naturwissen Classe. Abt. II, LXXVI 1877, pp 373-435 (Wien. Ber. 1877, 76:373-435). Reprinted in Wiss. Abhandlungen, Vol. II, reprint 42, p. 164-223, Barth, Leipzig, 1909]]>
http://www.mdpi.com/1099-4300/17/4/1971
Translation of the seminal 1877 paper by Ludwig Boltzmann which for the first time established the probabilistic basis of entropy. Includes a scientific commentary.Entropy2015-04-02174Article10.3390/e17041971197120091099-43002015-04-02doi: 10.3390/e17041971Kim SharpFranz Matschinsky<![CDATA[Entropy, Vol. 17, Pages 1958-1970: Assessing Coupling Dynamics from an Ensemble of Time Series]]>
http://www.mdpi.com/1099-4300/17/4/1958
Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.Entropy2015-04-02174Article10.3390/e17041958195819701099-43002015-04-02doi: 10.3390/e17041958Germán Gómez-HerreroWei WuKalle RutanenMiguel SorianoGordon PipaRaul Vicente<![CDATA[Entropy, Vol. 17, Pages 1946-1957: A Simple Decoder for Topological Codes]]>
http://www.mdpi.com/1099-4300/17/4/1946
Here we study an efficient algorithm for decoding topological codes. It is a simple form of hard-decision renormalization group (HDRG) decoder, which could be straightforwardly generalized to complex decoding problems. Specific results are obtained for the planar code with both i.i.d. and spatially correlated errors. The method is shown to compare well with existing ones, despite its simplicity.Entropy2015-04-01174Article10.3390/e17041946194619571099-43002015-04-01doi: 10.3390/e17041946James Wootton<![CDATA[Entropy, Vol. 17, Pages 1936-1945: On Nonlinear Complexity and Shannon’s Entropy of Finite Length Random Sequences]]>
http://www.mdpi.com/1099-4300/17/4/1936
Pseudorandom binary sequences have important uses in many fields, such as spread spectrum communications, statistical sampling and cryptography. There are two kinds of methods for evaluating the properties of sequences: one is based on probability measures, and the other on deterministic complexity measures. However, the relationship between these two kinds of methods remains an interesting open problem. In this paper, we focus on the widely used nonlinear complexity of random sequences, studying its distribution, expectation and variance for memoryless sources. Furthermore, the relationship between nonlinear complexity and Shannon’s entropy is also established. The results show that Shannon’s entropy decreases strictly monotonically with nonlinear complexity.Entropy2015-04-01174Article10.3390/e17041936193619451099-43002015-04-01doi: 10.3390/e17041936Lingfeng LiuSuoxia MiaoBocheng Liu<![CDATA[Entropy, Vol. 17, Pages 1916-1935: Pressure Tensor of Nanoscopic Liquid Drops]]>
http://www.mdpi.com/1099-4300/17/4/1916
This study describes the structure of an inhomogeneous fluid of one or several components that forms a spherical interface. Using the stress tensor of Percus–Romero, which depends on the one-particle density and the intermolecular potential, we provide an analytical development leading to microscopic expressions for the pressure differences and the interfacial properties of both systems. The results are compared with a previous study and agree with the mean-field description.Entropy2015-04-01174Article10.3390/e17041916191619351099-43002015-04-01doi: 10.3390/e17041916José G. Segovia-LópezAdrian Carbajal-Domínguez<![CDATA[Entropy, Vol. 17, Pages 1896-1915: Kinetic Theory Modeling and Efficient Numerical Simulation of Gene Regulatory Networks Based on Qualitative Descriptions]]>
http://www.mdpi.com/1099-4300/17/4/1896
In this work, we begin by considering the qualitative modeling of biological regulatory systems using process hitting, from which we define its probabilistic counterpart by considering the chemical master equation within a kinetic theory framework. The latter equation is efficiently solved by considering a separated representation within the proper generalized decomposition framework, which allows circumventing the so-called curse of dimensionality. Finally, model parameters can be added as extra coordinates in order to obtain a parametric solution of the model.Entropy2015-04-01174Article10.3390/e17041896189619151099-43002015-04-01doi: 10.3390/e17041896Francisco ChinestaMorgan MagninOlivier RouxAmine AmmarElias Cueto<![CDATA[Entropy, Vol. 17, Pages 1882-1895: Statistical Correlations of the N-particle Moshinsky Model]]>
http://www.mdpi.com/1099-4300/17/4/1882
We study the correlations of the ground state of an N-particle Moshinsky model by computing the Shannon entropy in both position and momentum spaces. We derive analytical forms of the Shannon entropy and the mutual information for the N-particle Moshinsky model, which allows us to test the entropic uncertainty principle. The Shannon entropy in position space decreases as the interaction strength increases; the Shannon entropy in momentum space shows the opposite trend. The Shannon entropy of the whole system satisfies the equality of the entropic uncertainty principle. Our results also indicate that, independent of the sizes of the two subsystems, the mutual information increases monotonically as the interaction strength increases.Entropy2015-03-31174Article10.3390/e17041882188218951099-43002015-03-31doi: 10.3390/e17041882Hsuan PengYew Ho<![CDATA[Entropy, Vol. 17, Pages 1850-1881: Computing Bi-Invariant Pseudo-Metrics on Lie Groups for Consistent Statistics]]>
http://www.mdpi.com/1099-4300/17/4/1850
In computational anatomy, organ shapes are often modeled as deformations of a reference shape, i.e., as elements of a Lie group. To analyze the variability of human anatomy in this framework, we need to perform statistics on Lie groups. A Lie group is a manifold with a consistent group structure. Statistics on Riemannian manifolds have been well studied, but to use the Riemannian statistical framework on Lie groups, one needs to define a Riemannian metric compatible with the group structure: a bi-invariant metric. However, it is known that Lie groups which are not a direct product of compact and abelian groups have no bi-invariant metric. But what about bi-invariant pseudo-metrics? In other words: could we remove the positivity assumption on the metric and obtain consistent statistics on Lie groups through the pseudo-Riemannian framework? Our contribution is two-fold. First, we present an algorithm that constructs bi-invariant pseudo-metrics on a given Lie group, when one exists. Then, by running the algorithm on commonly used Lie groups, we show that most of them do not admit any bi-invariant (pseudo-)metric. We thus conclude that the (pseudo-)Riemannian setting is too limited for the definition of consistent statistics on general Lie groups.Entropy2015-03-31174Article10.3390/e17041850185018811099-43002015-03-31doi: 10.3390/e17041850Nina MiolaneXavier Pennec<![CDATA[Entropy, Vol. 17, Pages 1814-1849: Geometry of Fisher Information Metric and the Barycenter Map]]>
http://www.mdpi.com/1099-4300/17/4/1814
Geometry of Fisher metric and geodesics on a space of probability measures defined on a compact manifold is discussed and is applied to geometry of a barycenter map associated with Busemann function on an Hadamard manifold \(X\). We obtain an explicit formula of geodesic and then several theorems on geodesics, one of which asserts that any two probability measures can be joined by a unique geodesic. Using Fisher metric and thus obtained properties of geodesics, a fibre space structure of barycenter map and geodesical properties of each fibre are discussed. Moreover, an isometry problem on an Hadamard manifold \(X\) and its ideal boundary \(\partial X\)—for a given homeomorphism \(\Phi\) of \(\partial X\) find an isometry of \(X\) whose \(\partial X\)-extension coincides with \(\Phi\)—is investigated in terms of the barycenter map.Entropy2015-03-30174Article10.3390/e17041814181418491099-43002015-03-30doi: 10.3390/e17041814Mitsuhiro ItohHiroyasu Satoh<![CDATA[Entropy, Vol. 17, Pages 1795-1813: Preclinical Diagnosis of Magnetic Resonance (MR) Brain Images via Discrete Wavelet Packet Transform with Tsallis Entropy and Generalized Eigenvalue Proximal Support Vector Machine (GEPSVM)]]>
http://www.mdpi.com/1099-4300/17/4/1795
Background: Developing an accurate computer-aided diagnosis (CAD) system for MR brain images is essential for medical interpretation and analysis. In this study, we propose a novel automatic CAD system to distinguish abnormal brains from normal brains in MRI scans. Methods: The proposed method simplifies the task to a binary classification problem. We used the discrete wavelet packet transform (DWPT) to extract wavelet packet coefficients from MR brain images. Next, Shannon entropy (SE) and Tsallis entropy (TE) were harnessed to obtain entropy features from the DWPT coefficients. Finally, the generalized eigenvalue proximal support vector machine (GEPSVM), and GEPSVM with a radial basis function (RBF) kernel, were employed as classifiers. We tested the four proposed diagnosis methods (DWPT + SE + GEPSVM, DWPT + TE + GEPSVM, DWPT + SE + GEPSVM + RBF, and DWPT + TE + GEPSVM + RBF) on three benchmark datasets: Dataset-66, Dataset-160, and Dataset-255. Results: Over 10 repetitions of k-fold stratified cross-validation, the proposed DWPT + TE + GEPSVM + RBF method outperformed not only the other three proposed classifiers but also existing state-of-the-art methods in terms of classification accuracy. It achieved accuracies of 100%, 100%, and 99.53% on Dataset-66, Dataset-160, and Dataset-255, respectively. For Dataset-255, offline learning took 8.4430 s and online prediction merely 0.1059 s. Conclusions: We have demonstrated the effectiveness of the proposed method, which achieved nearly 100% accuracy over three benchmark datasets.Entropy2015-03-30174Article10.3390/e17041795179518131099-43002015-03-30doi: 10.3390/e17041795Yudong ZhangZhengchao DongShuihua WangGenlin JiJiquan Yang<![CDATA[Entropy, Vol. 17, Pages 1775-1794: Multidimensional Scaling Visualization Using Parametric Similarity Indices]]>
http://www.mdpi.com/1099-4300/17/4/1775
In this paper, we apply multidimensional scaling (MDS) and parametric similarity indices (PSI) in the analysis of complex systems (CS). Each CS is viewed as a dynamical system, exhibiting an output time-series to be interpreted as a manifestation of its behavior. We start by adopting a sliding window to sample the original data into several consecutive time periods. Second, we define a given PSI for tracking pieces of data. We then compare the windows for different values of the parameter, and we generate the corresponding MDS maps of ‘points’. Third, we use Procrustes analysis to linearly transform the MDS charts for maximum superposition and to build a global MDS map of “shapes”. This final plot captures the time evolution of the phenomena and is sensitive to the PSI adopted. The generalized correlation, the Minkowski distance and four entropy-based indices are tested. The proposed approach is applied to the Dow Jones Industrial Average stock market index and the Europe Brent Spot Price FOB time-series.Entropy2015-03-30174Article10.3390/e17041775177517941099-43002015-03-30doi: 10.3390/e17041775J. Tenreiro MachadoAntónio LopesAlexandra Galhano<![CDATA[Entropy, Vol. 17, Pages 1755-1774: Generalized Remote Preparation of Arbitrary m-qubit Entangled States via Genuine Entanglements]]>
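The MDS step at the core of the pipeline above — turning a matrix of pairwise distances between windows into a 2-D map of 'points' — can be sketched with classical (Torgerson) MDS. This is an illustrative NumPy sketch of that generic step only, with a toy configuration; it is not the paper's full sliding-window/PSI/Procrustes pipeline.

```python
import numpy as np

# Illustrative sketch (not the paper's full pipeline): classical MDS embeds
# items in 2-D from a pairwise distance matrix D via double centering and
# an eigendecomposition of the resulting Gram matrix.
def classical_mds(D, dims=2):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)             # ascending eigenvalues
    order = np.argsort(vals)[::-1][:dims]      # keep the largest ones
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Toy check: four corners of a unit square should be recovered exactly
pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D)
D_rec = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(np.allclose(D, D_rec))  # distances preserved up to rotation/reflection
```

In the paper's setting, D would hold PSI-derived dissimilarities between sliding windows, and Procrustes analysis would then align the per-parameter maps.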
http://www.mdpi.com/1099-4300/17/4/1755
Herein, we present a feasible, general protocol for quantum communication within a network via generalized remote preparation of an arbitrary m-qubit entangled state, designed with genuine tripartite Greenberger–Horne–Zeilinger-type entangled resources. During the implementations, we construct novel collective unitary operations; these operations are tasked with performing the necessary phase transfers during remote state preparations. We have distilled our implementation methods into a five-step procedure, which can be used to faithfully recover the desired state during transfer. Compared to previously existing schemes, our methodology features a greatly increased success probability. After the consumption of auxiliary qubits and the performance of collective unitary operations, the probability of successful state transfer is increased four-fold and eight-fold for arbitrary two- and three-qubit entanglements, respectively, when compared to other methods in the literature. We conclude this paper with a discussion of the presented scheme for state preparation, including: success probabilities, reducibility and generalizability.Entropy2015-03-30174Article10.3390/e17041755175517741099-43002015-03-30doi: 10.3390/e17041755Dong WangRoss HoehnLiu YeSabre Kais<![CDATA[Entropy, Vol. 17, Pages 1734-1754: Research on the Stability of Open Financial System]]>
http://www.mdpi.com/1099-4300/17/4/1734
We propose a new herd mechanism and embed it into an open financial market system, which allows traders to enter and exit the system according to certain transition rates. Moreover, the novel mechanism avoids the disappearance of volatility as the population scale increases. There are three kinds of heterogeneous agents in the system: optimistic, pessimistic and fundamental. Interactions occur among three different groups of agents instead of two, which makes the artificial financial market closer to the real one. Simulation results of this complex system explain stylized facts such as volatility clustering and identify the key parameters behind market bubbles and market collapses.Entropy2015-03-27174Article10.3390/e17041734173417541099-43002015-03-27doi: 10.3390/e17041734Haijun YangLin LiDeshen Wang<![CDATA[Entropy, Vol. 17, Pages 1701-1733: Synchronicity from Synchronized Chaos]]>
http://www.mdpi.com/1099-4300/17/4/1701
The synchronization of loosely-coupled chaotic oscillators, a phenomenon investigated intensively for the last two decades, may realize the philosophical concept of “synchronicity”—the commonplace notion that related events mysteriously occur at the same time. When extended to continuous media and/or large discrete arrays, and when general (non-identical) correspondences are considered between states, intermittent synchronous relationships indeed become ubiquitous. Meaningful synchronicity follows naturally if meaningful events are identified with coherent structures, defined by internal synchronization between remote degrees of freedom; a condition that has been posited as necessary for synchronizability with an external system. The important case of synchronization between mind and matter is realized if mind is analogized to a computer model, synchronizing with a sporadically observed system, as in meteorological data assimilation. Evidence for the ubiquity of synchronization is reviewed along with recent proposals that: (1) synchronization of different models of the same objective process may be an expeditious route to improved computational modeling and may also describe the functioning of conscious brains; and (2) the nonlocality in quantum phenomena implied by Bell’s theorem may be explained in a variety of deterministic (hidden variable) interpretations if the quantum world resides on a generalized synchronization “manifold”.Entropy2015-03-27174Article10.3390/e17041701170117331099-43002015-03-27doi: 10.3390/e17041701Gregory Duane<![CDATA[Entropy, Vol. 17, Pages 1690-1700: Maximum Entropy and Probability Kinematics Constrained by Conditionals]]>
http://www.mdpi.com/1099-4300/17/4/1690
Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.Entropy2015-03-27174Article10.3390/e17041690169017001099-43002015-03-27doi: 10.3390/e17041690Stefan Lukits<![CDATA[Entropy, Vol. 17, Pages 1673-1689: Analysis of Data Complexity in Human DNA for Gene-Containing Zone Prediction]]>
http://www.mdpi.com/1099-4300/17/4/1673
This study delves further into the analysis of genomic data by computing a variety of complexity measures. We analyze the effect of window size and evaluate the precision and recall of the prediction of gene zones, aided with a much larger dataset (full chromosomes). A technique based on the separation of two cases (gene-containing and non-gene-containing) has been developed as a basic gene predictor for automated DNA analysis. This predictor was tested on various sequences of human DNA obtained from public databases, in a set of three experiments. The first one covers window size and other parameters; the second one corresponds to an analysis of a full human chromosome (198 million nucleic acids); and the last one tests subject variability (with five different individual subjects). All three experiments have high-quality results, in terms of recall and precision, thus indicating the effectiveness of the predictor.Entropy2015-03-27174Article10.3390/e17041673167316891099-43002015-03-27doi: 10.3390/e17041673Ricardo MongeJuan Crespo<![CDATA[Entropy, Vol. 17, Pages 1660-1672: Evolutionary Voluntary Prisoner’s Dilemma Game under Deterministic and Stochastic Dynamics]]>
http://www.mdpi.com/1099-4300/17/4/1660
The voluntary prisoner’s dilemma (VPD) game has sparked interest in various fields since it was proposed as an effective mechanism to incentivize cooperative behavior. Current studies show that the inherent cyclic dominance among the strategies of the VPD game results in periodic oscillations in the population. This paper investigates the influence of the level of individual rationality and the size of the population on the evolutionary dynamics of the VPD game. Different deterministic dynamics for the evolutionary VPD game, such as the replicator dynamic, the Smith dynamic, the Brown-von Neumann-Nash (BNN) dynamic and the best response (BR) dynamic, were modeled and simulated. A stochastic evolutionary dynamic based on a quasi-birth-and-death (QBD) process is proposed for the evolutionary VPD game and compared with the deterministic dynamics. The results indicate that as the loners’ fixed payoff increases, the loner strategy is more likely to persist in the stable state of the VPD game under any of the dynamics mentioned above. However, the speeds of motion through the dominance cycle proved to differ across evolutionary dynamics and to be highly sensitive to the rationality of individuals in the population. Furthermore, in the QBD stochastic dynamics, the size of the population has a remarkable effect on the probability distribution. When the population size increases, the limiting distribution of the QBD process agrees with the results of the deterministic dynamics.Entropy2015-03-26174Article10.3390/e17041660166016721099-43002015-03-26doi: 10.3390/e17041660Qian YuRan ChenXiaoyan Wen<![CDATA[Entropy, Vol. 17, Pages 1634-1659: Quantum Discord and Information Deficit in Spin Chains]]>
http://www.mdpi.com/1099-4300/17/4/1634
We examine the behavior of quantum correlations of spin pairs in a finite anisotropic XY spin chain immersed in a transverse magnetic field, through the analysis of the quantum discord and the conventional and quadratic one-way information deficits. We first provide a brief review of these measures, showing that the last ones can be obtained as particular cases of a generalized information deficit based on general entropic forms. All of these measures coincide with an entanglement entropy in the case of pure states, but can be non-zero in separable mixed states, vanishing just for classically correlated states. It is then shown that their behavior in the exact ground state of the chain exhibits similar features, deviating significantly from that of the pair entanglement below the critical field. In contrast with entanglement, they reach full range in this region, becoming independent of the pair separation and coupling range in the immediate vicinity of the factorizing field. It is also shown, however, that significant differences between the quantum discord and the information deficits arise in the local minimizing measurement that defines them. Both analytical and numerical results are provided.Entropy2015-03-26174Article10.3390/e17041634163416591099-43002015-03-26doi: 10.3390/e17041634Norma CanosaLeonardo CilibertiRaúl Rossignoli<![CDATA[Entropy, Vol. 17, Pages 1606-1633: A Fundamental Scale of Descriptions for Analyzing Information Content of Communication Systems]]>
http://www.mdpi.com/1099-4300/17/4/1606
The complexity of the description of a system is a function of the entropy of its symbolic description. Prior to computing the entropy of the system’s description, an observation scale has to be assumed. In texts written in artificial and natural languages, typical scales are bits, characters, and words. However, considering languages as structures built around a certain preconceived set of symbols, like words or characters, limits the level of complexity that can be revealed analytically. This study introduces the notion of the fundamental description scale to analyze the essence of the structure of a language. The concept of Fundamental Scale is tested for English and musical instrument digital interface (MIDI) music texts using an algorithm developed to split a text into a collection of sets of symbols that minimizes the observed entropy of the system. This Fundamental Scale reflects more details of the complexity of the language than using bits, characters or words. Results show that this Fundamental Scale makes it possible to compare completely different languages, such as English and MIDI-coded music, with regard to their structural entropy. This comparative power facilitates the study of the complexity of the structure of different communication systems.Entropy2015-03-25174Article10.3390/e17041606160616331099-43002015-03-25doi: 10.3390/e17041606Gerardo FebresKlaus Jaffe<![CDATA[Entropy, Vol. 17, Pages 1581-1605: Kählerian Information Geometry for Signal Processing]]>
http://www.mdpi.com/1099-4300/17/4/1581
We prove the correspondence between the information geometry of a signal filter and a Kähler manifold. The information geometry of a minimum-phase linear system with a finite complex cepstrum norm is a Kähler manifold. The square of the complex cepstrum norm of the signal filter corresponds to the Kähler potential. The Hermitian structure of the Kähler manifold is explicitly emergent if and only if the impulse response function of the highest degree in z is constant in model parameters. The Kählerian information geometry takes advantage of more efficient calculation steps for the metric tensor and the Ricci tensor. Moreover, α-generalization on the geometric tensors is linear in α. Finding Bayesian predictive priors, such as superharmonic priors, is also more robust, because Laplace–Beltrami operators on Kähler manifolds take much simpler forms than those on non-Kähler manifolds. Several time series models are studied in the Kählerian information geometry.Entropy2015-03-25174Article10.3390/e17041581158116051099-43002015-03-25doi: 10.3390/e17041581Jaehyung ChoiAndrew Mullhaupt<![CDATA[Entropy, Vol. 17, Pages 1558-1580: High Recharge Areas in the Choushui River Alluvial Fan (Taiwan) Assessed from Recharge Potential Analysis and Average Storage Variation Indexes]]>
http://www.mdpi.com/1099-4300/17/4/1558
High recharge areas significantly influence the groundwater quality and quantity in regional groundwater systems. Many studies have applied recharge potential analysis (RPA) to estimate groundwater recharge potential (GRP) and have delineated high recharge areas based on the estimated GRP. However, most of these studies define the RPA parameters by supposition, and this represents a major source of uncertainty in applying RPA. To define the RPA parameter values objectively, this study proposes a systematic method based on the theory of parameter identification. A surrogate variable, namely the average storage variation (ASV) index, is developed to calibrate the RPA parameters, because of the lack of direct GRP observations. The study results show that the correlations between the ASV indexes and computed GRP values improved from 0.67 before calibration to 0.85 after calibration, thus indicating that the calibrated RPA parameters represent the recharge characteristics of the study area well; these data also highlight how defining the RPA parameters with ASV indexes helps to improve accuracy. The calibrated RPA parameters were used to estimate the GRP distribution of the study area, and the GRP values were graded into five levels. Areas at the high and excellent levels are defined as high recharge areas, which together compose 7.92% of the study area. Overall, this study demonstrates that the developed approach can objectively define the RPA parameters and high recharge areas of the Choushui River alluvial fan, and the results should serve as valuable references for the Taiwanese government in its efforts to conserve the groundwater quality and quantity of the study area.Entropy2015-03-24174Article10.3390/e17041558155815801099-43002015-03-24doi: 10.3390/e17041558Jui-Pin TsaiYu-Wen ChenLiang-Cheng ChangYi-Ming KuoYu-Hsuan TuChen-Che Pan<![CDATA[Entropy, Vol. 17, Pages 1549-1557: Thermodynamics in Curved Space-Time and Its Application to Holography]]>
http://www.mdpi.com/1099-4300/17/4/1549
The thermodynamic behaviors of a system living in a curved space-time are different from those of a system in a flat space-time. We have investigated the thermodynamics of a system consisting of relativistic massless bosons. We show that a strongly curved metric produces a large enhancement of the degrees of freedom in the formulae for the energy and entropy of the system, compared to the case of a flat space-time. We are mainly concerned with its implications for holography, including the derivations of the holographic entropy and the holographic screen.Entropy2015-03-24174Article10.3390/e17041549154915571099-43002015-03-24doi: 10.3390/e17041549Yong XiaoLi-Hua FengLi Guan<![CDATA[Entropy, Vol. 17, Pages 1535-1548: Clustering Heterogeneous Data with k-Means by Mutual Information-Based Unsupervised Feature Transformation]]>
http://www.mdpi.com/1099-4300/17/3/1535
Traditional centroid-based clustering algorithms for heterogeneous data with numerical and non-numerical features result in varying degrees of clustering inaccuracy. This is because the Hamming distance used for dissimilarity measurement of non-numerical values does not provide optimal distances between different values, and problems arise from attempts to combine the Euclidean distance and the Hamming distance. In this study, the mutual information (MI)-based unsupervised feature transformation (UFT), which can transform non-numerical features into numerical features without information loss, was utilized with the conventional k-means algorithm for heterogeneous data clustering. For the original non-numerical features, UFT provides numerical values which preserve the structure of the original non-numerical features while also being continuous-valued. Experiments and analysis of real-world datasets showed that the integrated UFT-k-means clustering algorithm outperformed the others for heterogeneous data with both numerical and non-numerical features.Entropy2015-03-23173Article10.3390/e17031535153515481099-43002015-03-23doi: 10.3390/e17031535Min WeiTommy ChowRosa Chan<![CDATA[Entropy, Vol. 17, Pages 1508-1534: Space-Time Quantum Imaging]]>
http://www.mdpi.com/1099-4300/17/3/1508
We report on an experimental and theoretical investigation of quantum imaging where the images are stored in both space and time. Ghost images of remote objects are produced with either one or two beams of chaotic laser light generated by a rotating ground glass and two sensors measuring the reference field and bucket field at different space-time points. We further observe that the ghost images translate depending on the time delay between the sensor measurements. The ghost imaging experiments are performed both with and without turbulence. A discussion of the physics of the space-time imaging is presented in terms of quantum nonlocal two-photon analysis to support the experimental results. The theoretical model includes certain phase factors of the rotating ground glass. These experiments demonstrated a means to investigate the time and space aspects of ghost imaging and showed that ghost imaging contains more information per measured photon than was previously recognized, since multiple ghost images are stored within the same ghost imaging data sets. This suggests new pathways to explore quantum information stored not only in multi-photon coincidence information but also in time delayed multi-photon interference. The research is applicable to making enhanced space-time quantum images and videos of moving objects where the images are stored in both space and time.Entropy2015-03-23173Article10.3390/e17031508150815341099-43002015-03-23doi: 10.3390/e17031508Ronald MeyersKeith Deacon<![CDATA[Entropy, Vol. 17, Pages 1477-1507: Application of Divergence Entropy to Characterize the Structure of the Hydrophobic Core in DNA Interacting Proteins]]>
http://www.mdpi.com/1099-4300/17/3/1477
The fuzzy oil drop model, a tool which can be used to study the structure of the hydrophobic core in proteins, has been applied in the analysis of proteins belonging to the jumonji group—JARID2, JARID1A, JARID1B and JARID1D—proteins that share the property of being able to interact with DNA. Their ARID and PHD domains, when analyzed in the context of the fuzzy oil drop model, are found to exhibit structural variability regarding the status of their secondary folds, including the β-hairpin which determines their biological function. Additionally, the structure of disordered fragments which are present in jumonji proteins (as confirmed by the DisProt database) is explained on the grounds of the hydrophobic core model, suggesting that such fragments contribute to tertiary structural stabilization. This conclusion is supported by divergence entropy measurements, expressing the degree of ordering in each protein’s hydrophobic core.Entropy2015-03-23173Article10.3390/e17031477147715071099-43002015-03-23doi: 10.3390/e17031477Barbara KalinowskaMateusz BanachLeszek KoniecznyIrena Roterman<![CDATA[Entropy, Vol. 17, Pages 1466-1476: The Solute-Exclusion Zone: A Promising Application for Microfluidics]]>
http://www.mdpi.com/1099-4300/17/3/1466
While unique phenomena exist at fluid-solid phase intersections, many interfacial phenomena manifest solely on limited scales—i.e., the nm-mm ranges—which stifles their application potential. Here, we constructed microfluidic chips that utilize the unique long-distance interface effects of the Solute-Exclusion Zone (EZ) phenomenon to mix, separate, and guide samples in desired directions within microfluidic channels. On our “EZ Chip”, we utilized the interfacial force generated by EZs to transport specimens across streamlines without the need of an off-chip power source. The advantages of easy-integration, low fabrication cost, and no off-chip energy input make the EZ suitable for independent, portable lab-on-chip system applications.Entropy2015-03-23173Article10.3390/e17031466146614761099-43002015-03-23doi: 10.3390/e17031466Chi-Shuo ChenErik FarrJesse AnayaEric ChenWei-Chun Chin<![CDATA[Entropy, Vol. 17, Pages 1452-1465: Thermodynamic Analysis of a Waste Heat Driven Vuilleumier Cycle Heat Pump]]>
http://www.mdpi.com/1099-4300/17/3/1452
A Vuilleumier (VM) cycle heat pump is a closed gas cycle driven by heat energy. It has the highest performance among all known heat driven technologies. In this paper, two thermodynamic analyses, including energy and exergy analysis, are carried out to evaluate the application of a VM cycle heat pump for waste heat utilization. For a prototype VM cycle heat pump, equations for theoretical and actual cycles are established. Under the given conditions, the exergy efficiency for the theoretical cycle is 0.23 compared to 0.15 for the actual cycle. This is due to losses taking place in the actual cycle. Reheat losses and flow friction losses account for almost 83% of the total losses. Investigation of the effect of heat source temperature, cycle pressure and speed on the exergy efficiency indicate that the low temperature waste heat is a suitable heat source for a VM cycle heat pump. The selected cycle pressure should be higher than 100 MPa, and 200–300 rpm is the optimum speed.Entropy2015-03-20173Article10.3390/e17031452145214651099-43002015-03-20doi: 10.3390/e17031452Yingbai XieXuejie Sun<![CDATA[Entropy, Vol. 17, Pages 1441-1451: Approximated Information Analysis in Bayesian Inference]]>
http://www.mdpi.com/1099-4300/17/3/1441
In models with nuisance parameters, Bayesian procedures based on Markov Chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures require burdensome computations related to the use of MCMC, approximation and convergence in these procedures are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.Entropy2015-03-20173Article10.3390/e17031441144114511099-43002015-03-20doi: 10.3390/e17031441Jung SeoYongku Kim<![CDATA[Entropy, Vol. 17, Pages 1425-1440: A Comparison of Nonlinear Measures for the Detection of Cardiac Autonomic Neuropathy from Heart Rate Variability]]>
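The Kullback–Leibler divergence used in this abstract as the information measure for posterior sensitivity has a direct discrete implementation; the following sketch (our own function name, not the paper's code) computes it for two probability vectors over the same support.

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) in nats.

    Terms with p_i = 0 contribute nothing; q_i is assumed positive wherever
    p_i > 0 (otherwise the divergence is infinite).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

In a Gibbs-sensitivity setting like the abstract's, `p` and `q` would be the posterior of interest computed with the full conditional of the nuisance parameter versus its approximation.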
http://www.mdpi.com/1099-4300/17/3/1425
In this work we compare three multiscale measures for their ability to discriminate between participants having cardiac autonomic neuropathy (CAN) and aged controls. CAN is a disease that involves nerve damage leading to abnormal control of the heart rate, so one would expect disease progression to manifest in changes to heart rate variability (HRV). We applied multiscale entropy (MSE), multifractal detrended fluctuation analysis (MFDFA), and Renyi entropy (RE) to recorded datasets of RR intervals. The latter measure provided the best separation (lowest p-value in Mann–Whitney tests) between classes of participants having CAN, early CAN or no CAN (controls). This comparison suggests the efficacy of RE as a measure for the diagnosis of CAN and its progression, when compared to the other multiscale measures.Entropy2015-03-19173Article10.3390/e17031425142514401099-43002015-03-19doi: 10.3390/e17031425David CornforthHerbert JelinekMika Tarvainen<![CDATA[Entropy, Vol. 17, Pages 1411-1424: Entropy Generation Analysis for a CNT Suspension Nanofluid in Plumb Ducts with Peristalsis]]>
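The Renyi entropy highlighted in this abstract can be estimated from a histogram of RR intervals; the sketch below shows the standard formula of order alpha. The binning scheme and function name are our own assumptions, and the paper's multiscale procedure is not reproduced.

```python
from collections import Counter
import math

def renyi_entropy(samples, alpha, n_bins=10):
    """Renyi entropy of order alpha from an equal-width histogram (nats)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0  # degenerate data: single bin
    bins = Counter(min(int((s - lo) / width), n_bins - 1) for s in samples)
    probs = [c / len(samples) for c in bins.values()]
    if alpha == 1.0:  # limiting case: Shannon entropy
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)
```

For alpha > 1 the measure weights the most probable RR patterns more heavily, one reason it can separate patient groups differently than Shannon-based MSE.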
http://www.mdpi.com/1099-4300/17/3/1411
The purpose of the current investigation was to present an entropy generation analysis for a carbon nanotube (CNT) suspension nanofluid in a plumb duct with peristalsis. The entropy generation number due to heat transfer and fluid friction is formulated. The velocity and temperature distributions across the tube are presented along with the pressure attributes. Exact analytical solutions for the velocity and temperature profiles are obtained. It is found that the entropy generation number attains high values in the region close to the walls of the tube, while it attains low values near the center of the tube.Entropy2015-03-19173Article10.3390/e17031411141114241099-43002015-03-19doi: 10.3390/e17031411Noreen Akbar<![CDATA[Entropy, Vol. 17, Pages 1387-1410: Applied Cryptography Using Chaos Function for Fast Digital Logic-Based Systems in Ubiquitous Computing]]>
http://www.mdpi.com/1099-4300/17/3/1387
Recently, chaotic dynamics-based data encryption techniques for wired and wireless networks have become a topic of active research in computer science and network security, with applications such as robotic systems, encryption, and communication. The main aim of deploying a chaos-based cryptosystem is to provide encryption with several advantages over traditional encryption algorithms, such as high security, speed, and reasonable computational overhead and computational power requirements. These challenges have motivated researchers to explore novel chaos-based data encryption techniques with digital logic for hiding information in fast, secure communication networks. This work provides an overview of how traditional data encryption techniques are revised and improved to achieve good performance in a secure communication network environment. A comprehensive survey of existing chaos-based data encryption techniques and their application areas is presented. The comparative tables can be used as a guideline to select an encryption technique suitable for the application at hand. Based on the limitations of the existing techniques, an adaptive chaos-based data encryption framework of secure communication for future research is proposed.Entropy2015-03-19173Review10.3390/e17031387138714101099-43002015-03-19doi: 10.3390/e17031387Piyush ShuklaAnkur KhareMurtaza RizviShalini StalinSanjay Kumar<![CDATA[Entropy, Vol. 17, Pages 1379-1386: The Optimal Fix-Free Code for Anti-Uniform Sources]]>
http://www.mdpi.com/1099-4300/17/3/1379
An \(n\)-symbol source which has a Huffman code with codelength vector \(L_{n}=(1,2,3,\cdots,n-2,n-1,n-1)\) is called an anti-uniform source. In this paper, it is shown that for this class of sources, the optimal fix-free code and symmetric fix-free code are \(C_{n}^{*}=(0,11,101,1001,\cdots,1\overbrace{0\cdots0}^{n-2}1)\).Entropy2015-03-19173Article10.3390/e17031379137913861099-43002015-03-19doi: 10.3390/e17031379Ali ZaghianAdel AghajanT. Gulliver<![CDATA[Entropy, Vol. 17, Pages 1358-1378: Hidden State Conditional Random Field for Abnormal Activity Recognition in Smart Homes]]>
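The codeword pattern stated in this abstract is easy to generate and check; the sketch below builds \(C_{n}^{*}\) and verifies the fix-free property (no codeword is a prefix or suffix of another). Function names are our own; the optimality proof is of course in the paper, not here.

```python
def anti_uniform_fix_free_code(n):
    """Return the n codewords 0, 11, 101, 1001, ..., 1 0^(n-2) 1."""
    code = ["0"]
    for k in range(2, n + 1):
        code.append("1" + "0" * (k - 2) + "1")
    return code

def is_fix_free(code):
    """A code is fix-free if no codeword is a prefix or a suffix of another."""
    for a in code:
        for b in code:
            if a != b and (b.startswith(a) or b.endswith(a)):
                return False
    return True
```

Note the codelengths are 1, 2, ..., n, matching the anti-uniform shape of the Huffman lengths up to the unavoidable cost of the fix-free constraint.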
http://www.mdpi.com/1099-4300/17/3/1358
As the number of elderly people has increased worldwide, there has been a surge of research into assistive technologies to provide them with better care by recognizing their normal and abnormal activities. However, existing abnormal activity recognition (AAR) algorithms rarely consider sub-activity relations when recognizing abnormal activities. This paper presents an application of the Hidden State Conditional Random Field (HCRF) method to detect and assess abnormal activities that often occur in elderly persons’ homes. Based on HCRF, this paper designs two AAR algorithms, and validates them by comparing them with a feature-vector-distance-based algorithm in two experiments. The results demonstrate that the proposed algorithms favorably outperform the competitor, especially when abnormal activities involve the same sensor types and numbers of sensors as normal activities.Entropy2015-03-18173Article10.3390/e17031358135813781099-43002015-03-18doi: 10.3390/e17031358Yu TongRong ChenJian Gao<![CDATA[Entropy, Vol. 17, Pages 1347-1357: Geometric Shrinkage Priors for Kählerian Signal Filters]]>
http://www.mdpi.com/1099-4300/17/3/1347
We construct geometric shrinkage priors for Kählerian signal filters. Based on the characteristics of Kähler manifolds, an efficient and robust algorithm for finding superharmonic priors which outperform the Jeffreys prior is introduced. Several ansätze for the Bayesian predictive priors are also suggested. In particular, the ansätze related to Kähler potential are geometrically intrinsic priors to the information manifold of which the geometry is derived from the potential. The implication of the algorithm to time series models is also provided.Entropy2015-03-17173Article10.3390/e17031347134713571099-43002015-03-17doi: 10.3390/e17031347Jaehyung ChoiAndrew Mullhaupt<![CDATA[Entropy, Vol. 17, Pages 1329-1346: Metriplectic Algebra for Dissipative Fluids in Lagrangian Formulation]]>
http://www.mdpi.com/1099-4300/17/3/1329
The dynamics of dissipative fluids in Eulerian variables may be derived from an algebra of Leibniz brackets of observables, the metriplectic algebra, that extends the Poisson algebra of the frictionless limit of the system via a symmetric semidefinite component, encoding dissipative forces. The metriplectic algebra includes the conserved total Hamiltonian H, generating the non-dissipative part of dynamics, and the entropy S of those microscopic degrees of freedom draining energy irreversibly, which generates dissipation. This S is a Casimir invariant of the Poisson algebra to which the metriplectic algebra reduces in the frictionless limit. The role of S is as paramount as that of H, but this fact may be underestimated in the Eulerian formulation because S is not the only Casimir of the symplectic non-canonical part of the algebra. Instead, when the dynamics of the non-ideal fluid is written through the parcel variables of the Lagrangian formulation, the fact that entropy is symplectically invariant clearly appears to be related to its dependence on the microscopic degrees of freedom of the fluid, that are themselves in involution with the position and momentum of the parcel.Entropy2015-03-16173Article10.3390/e17031329132913461099-43002015-03-16doi: 10.3390/e17031329Massimo Materassi<![CDATA[Entropy, Vol. 17, Pages 1309-1328: A Link between Nano- and Classical Thermodynamics: Dissipation Analysis (The Entropy Generation Approach in Nano-Thermodynamics)]]>
http://www.mdpi.com/1099-4300/17/3/1309
The interest in designing nanosystems is continuously growing. Engineers apply a great number of optimization methods to design macroscopic systems. If these methods could be introduced into the design of small systems, a great improvement in nanotechnologies could be achieved. To do so, however, it is necessary to extend classical thermodynamic analysis to small systems, but irreversibility is also present in small systems, as the Loschmidt paradox highlighted. Here, the use of the recent improvement of the Gouy-Stodola theorem to complex systems (GSGL approach), based on the use of entropy generation, is suggested to obtain the extension of classical thermodynamics to nanothermodynamics. The result is a new approach to nanosystems which avoids the difficulties highlighted in the usual analysis of the small systems, such as the definition of temperature for nanosystems.Entropy2015-03-16173Article10.3390/e17031309130913281099-43002015-03-16doi: 10.3390/e17031309Umberto Lucia<![CDATA[Entropy, Vol. 17, Pages 1278-1308: Ricci Curvature, Isoperimetry and a Non-additive Entropy]]>
http://www.mdpi.com/1099-4300/17/3/1278
Searching for the dynamical foundations of the Havrda-Charvát/Daróczy/Cressie-Read/Tsallis non-additive entropy, we come across a covariant quantity called, alternatively, a generalized Ricci curvature, an N-Ricci curvature or a Bakry-Émery-Ricci curvature in the configuration/phase space of a system. We explore some of the implications of this tensor and its associated curvature and present a connection with the non-additive entropy under investigation. We present an isoperimetric interpretation of the non-extensive parameter and comment on further features of the system that can be probed through this tensor.Entropy2015-03-16173Article10.3390/e17031278127813081099-43002015-03-16doi: 10.3390/e17031278Nikos Kalogeropoulos<![CDATA[Entropy, Vol. 17, Pages 1273-1277: Symmetry, Probability, Entropy: Synopsis of the Lecture at MAXENT 2014]]>
http://www.mdpi.com/1099-4300/17/3/1273
In this discussion, we indicate possibilities for (homological and non-homological) linearization of basic notions of the probability theory and also for replacing the real numbers as values of probabilities by objects of suitable combinatorial categories.Entropy2015-03-13173Meeting Report10.3390/e17031273127312771099-43002015-03-13doi: 10.3390/e17031273Misha Gromov<![CDATA[Entropy, Vol. 17, Pages 1253-1272: Entropic Measures of Complexity of Short-Term Dynamics of Nocturnal Heartbeats in an Aging Population]]>
http://www.mdpi.com/1099-4300/17/3/1253
Two entropy-based approaches are investigated to study patterns describing differences in time intervals between consecutive heartbeats. The first method explores matrices arising from networks of transitions constructed following events represented by a time series. The second method considers distributions of ordinal patterns of length three, whereby patterns with repeated values are counted as different patterns. Both methods provide estimators of dynamical aspects of short-term heartbeat signals obtained from nocturnal Holter electrocardiogram (ECG) recordings of healthy people of different ages and genders. The deceleration capacity, arising from the adjacency matrix of the network, and the entropy rate, resulting from the transition matrix of the network, are also calculated, and both significantly decay with aging. As people age, the permutation entropy grows, due to the increase in patterns with repeated values. All of these estimators describe in a consistent way changes in the beat-to-beat heart period dynamics caused by aging. An overall slowing down of heart period changes is observed, and an increase of permutation entropy results from the progressive increase of patterns with repeated values. This result points to the sympathetic drive becoming dominant in cardiac regulation of nocturnal heart rate with age.Entropy2015-03-13173Article10.3390/e17031253125312721099-43002015-03-13doi: 10.3390/e17031253Danuta MakowiecAgnieszka KaczkowskaDorota WejerMarta Żarczyńska-BuchowieckaZbigniew Struzik<![CDATA[Entropy, Vol. 17, Pages 1236-1252: Analysis of the Magnetocaloric Effect in Heusler Alloys: Study of Ni50CoMn36Sn13 by Calorimetric Techniques]]>
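The second approach in this abstract, counting ordinal patterns of length three with repeated values treated as distinct patterns, can be sketched directly: ranks are assigned so that equal values get equal ranks, and the pattern distribution feeds a Shannon entropy. An illustrative sketch under our own naming; the paper's transition-network method, deceleration capacity, and entropy rate are not reproduced.

```python
from collections import Counter
import math

def ordinal_pattern(window):
    """Rank each value; equal values receive equal ranks, preserving ties.

    E.g. (5, 5, 3) -> (1, 1, 0), a different pattern than (5, 4, 3) -> (2, 1, 0).
    """
    sorted_vals = sorted(set(window))
    return tuple(sorted_vals.index(v) for v in window)

def pattern_entropy(series, length=3):
    """Shannon entropy (bits) of the ordinal-pattern distribution."""
    counts = Counter(
        ordinal_pattern(series[i:i + length])
        for i in range(len(series) - length + 1)
    )
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

Keeping tied patterns distinct matters for heartbeat data, where the abstract's finding is precisely that patterns with repeated values become more frequent with age.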
http://www.mdpi.com/1099-4300/17/3/1236
Direct determinations of the isothermal entropy increment, \(-\Delta S_T\), in the Heusler alloy Ni\(_{50}\)CoMn\(_{36}\)Sn\(_{13}\) on demagnetization gave positive values, corresponding to a normal magnetocaloric effect. These values contradict the results derived from heat-capacity measurements and also previous results obtained from magnetization measurements, which indicated an inverse magnetocaloric effect, but with different values depending on the technique employed. The puzzle is solved, and the apparent incompatibilities are quantitatively explained considering the hysteresis, the width of the martensitic transition and the detailed protocol followed to obtain each datum. The results show that these factors should be analyzed in detail when dealing with Heusler alloys.Entropy2015-03-12173Article10.3390/e17031236123612521099-43002015-03-12doi: 10.3390/e17031236Elias PalaciosJuan BartoloméGaofeng WangRamon BurrielKonstantin SkokovSergey TaskaevVladimir Khovaylo<![CDATA[Entropy, Vol. 17, Pages 1218-1235: Information Hiding Method Using Best DCT and Wavelet Coefficients and Its Watermark Competition]]>
http://www.mdpi.com/1099-4300/17/3/1218
In recent years, information hiding and its evaluation criteria have been developed by the IHC (Information Hiding and its Criteria) Committee of Japan. This committee was established in 2011 with the aim of defining standard evaluation criteria for robust watermarks. In this study, we developed an information hiding method that satisfies the IHC evaluation criteria. The proposed method uses the differences of the frequency coefficients derived from a discrete cosine transform or a discrete wavelet transform. The algorithm employs a statistical analysis to find the best positions in the frequency domains for watermark insertion. In particular, we use the BCH (Bose-Chaudhuri-Hocquenghem) (511,31,109) code to error-correct the watermark bits and the BCH (63,16,11) code as the sync signal to withstand JPEG (Joint Photographic Experts Group) compression and cropping attacks. Our experimental results showed that there were no errors in 10 HDTV-size areas after the second decompression. It should be noted that after the second compression, the file size should be less than 1/25 of the original size to satisfy the IHC evaluation criteria.Entropy2015-03-12173Article10.3390/e17031218121812351099-43002015-03-12doi: 10.3390/e17031218Hyunho KangKeiichi Iwamura<![CDATA[Entropy, Vol. 17, Pages 1204-1217: Information Geometry on the \(\kappa\)-Thermostatistics]]>
http://www.mdpi.com/1099-4300/17/3/1204
We explore the information geometric structure of the statistical manifold generated by the \(\kappa\)-deformed exponential family. The dually-flat manifold is obtained as a dualistic Hessian structure by introducing suitable generalization of the Fisher metric and affine connections. As a byproduct, we obtain the fluctuation-response relations in the \(\kappa\)-formalism based on the \(\kappa\)-generalized exponential family.Entropy2015-03-12173Article10.3390/e17031204120412171099-43002015-03-12doi: 10.3390/e17031204Tatsuaki WadaAntonio Scarfone<![CDATA[Entropy, Vol. 17, Pages 1197-1203: Generalized Multiscale Entropy Analysis: Application to Quantifying the Complex Volatility of Human Heartbeat Time Series]]>
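As background to this abstract, the \(\kappa\)-deformed exponential generating the family is the standard Kaniadakis form \(\exp_\kappa(x) = (\sqrt{1+\kappa^2 x^2} + \kappa x)^{1/\kappa}\), which reduces to \(e^x\) as \(\kappa \to 0\). A minimal numerical sketch (our own function name, not code from the paper):

```python
import math

def kappa_exp(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) when kappa == 0."""
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)
```

The deformation replaces the exponential tails of the ordinary exponential family with power-law tails of index roughly 1/\(\kappa\), which is what makes the generalized Fisher metric and fluctuation-response relations of the abstract nontrivial.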
http://www.mdpi.com/1099-4300/17/3/1197
We introduce a generalization of multiscale entropy (MSE) analysis. The method is termed MSE_n, where the subscript denotes the moment used to coarse-grain a time series. MSE_μ, described previously, uses the mean value (first moment). Here, we focus on MSE_σ², which uses the second moment, i.e., the variance. MSE_σ² quantifies the dynamics of the volatility (variance) of a signal over multiple time scales. We use the method to analyze the structure of heartbeat time series. We find that the dynamics of the volatility of heartbeat time series obtained from healthy young subjects are highly complex. Furthermore, we find that the multiscale complexity of the volatility, not only the multiscale complexity of the mean heart rate, degrades with aging and pathology. The “bursty” behavior of the dynamics may be related to intermittency in energy and information flows, as part of multiscale cycles of activation and recovery. Generalized MSE may also be useful in quantifying the dynamical properties of other physiologic and non-physiologic time series.Entropy2015-03-12173Communication10.3390/e17031197119712031099-43002015-03-12doi: 10.3390/e17031197Madalena CostaAry Goldberger<![CDATA[Entropy, Vol. 17, Pages 1181-1196: Entropy of Quantum Measurement]]>
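The coarse-graining step that distinguishes the variance-based variant from the mean-based one can be sketched as follows: at scale tau, each coarse-grained point is the variance (second moment) of a non-overlapping window of length tau, rather than its mean. Function names are ours, and the sample-entropy stage of full MSE is omitted.

```python
def coarse_grain_variance(series, tau):
    """Variance-based coarse-graining at scale tau (second moment)."""
    out = []
    for i in range(0, len(series) - tau + 1, tau):
        window = series[i:i + tau]
        mean = sum(window) / tau
        out.append(sum((x - mean) ** 2 for x in window) / tau)
    return out

def coarse_grain_mean(series, tau):
    """Mean-based coarse-graining at scale tau (first moment), as in MSE_mu."""
    return [sum(series[i:i + tau]) / tau
            for i in range(0, len(series) - tau + 1, tau)]
```

Running sample entropy on the variance-coarse-grained series across scales would then yield the volatility complexity curves the abstract discusses.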
http://www.mdpi.com/1099-4300/17/3/1181
A notion of entropy of a normal state on a finite von Neumann algebra in Segal’s sense is considered, and its superadditivity is proven together with a necessary and sufficient condition for its additivity. Bounds on the entropy of the state after measurement are obtained, and it is shown that a weakly repeatable measurement gives minimal entropy and that a minimal state entropy measurement satisfying some natural additional conditions is repeatable.Entropy2015-03-12173Article10.3390/e17031181118111961099-43002015-03-12doi: 10.3390/e17031181Hanna Podsȩdkowska<![CDATA[Entropy, Vol. 17, Pages 1165-1180: Distributed Consensus for Metamorphic Systems Using a Gossip Algorithm for CAT(0) Metric Spaces]]>
http://www.mdpi.com/1099-4300/17/3/1165
We present an application of distributed consensus algorithms to metamorphic systems. A metamorphic system is a set of identical units that can self-assemble to form a rigid structure. For instance, one can think of a robotic arm composed of multiple links connected by joints. The system can change its shape in order to adapt to different environments via reconfiguration of its constituting units. We assume in this work that several metamorphic systems form a network: two systems are connected whenever they are able to communicate with each other. The aim of this paper is to propose a distributed algorithm that synchronizes all of the systems in the network. Synchronizing means that all of the systems should end up having the same configuration. This aim is achieved in two steps: (i) we cast the problem as a consensus problem on a metric space; and (ii) we use a recent distributed consensus algorithm that only makes use of metrical notions.
Entropy 2015, Vol. 17, Issue 3, Pages 1165-1180; Article; doi: 10.3390/e17031165; published 2015-03-12. Author: Anass Bellachehab.

<![CDATA[Entropy, Vol. 17, Pages 1146-1164: Maximum Relative Entropy Updating and the Value of Learning]]>
http://www.mdpi.com/1099-4300/17/3/1146
We examine the possibility of justifying the principle of maximum relative entropy (MRE), considered as an updating rule, by looking at the value of learning theorem established in classical decision theory. This theorem captures an intuitive requirement for learning: learning should lead to new degrees of belief that are expected to be helpful and never harmful in making decisions. We call this requirement the value of learning. We consider the extent to which learning rules by MRE could satisfy this requirement and so could be a rational means for pursuing practical goals. First, by representing MRE updating as a conditioning model, we show that MRE satisfies the value of learning in cases where learning prompts a complete redistribution of one’s degrees of belief over a partition of propositions. Second, we show that the value of learning may not be generally satisfied by MRE updates in cases of updating on a change in one’s conditional degrees of belief. We explain that this is so because, contrary to what the value of learning requires, one’s prior degrees of belief might not be equal to the expectation of one’s posterior degrees of belief. This, in turn, points towards a more general moral: that the justification of MRE updating in terms of the value of learning may be sensitive to the context of a given learning experience. Moreover, this lends support to the idea that MRE is neither a universal nor a mechanical updating rule, but rather a rule whose application and justification may be context-sensitive.
Entropy 2015, Vol. 17, Issue 3, Pages 1146-1164; Article; doi: 10.3390/e17031146; published 2015-03-11. Author: Patryk Dziurosz-Serafinowicz.

<![CDATA[Entropy, Vol. 17, Pages 1135-1145: Comparing Security Notions of Secret Sharing Schemes]]>
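The requirement the abstract turns on can be stated compactly. In our notation (not necessarily the paper's), with a partition {E_i} of possible learning outcomes, acts a, states s and a utility u, the prior-as-expectation (martingale) property and Good's value-of-learning inequality read:

```latex
% Prior equals the expectation of the posterior over learning outcomes:
\[
  P(A) \;=\; \sum_i P(E_i)\, P(A \mid E_i)
\]
% Value of learning (Good, 1967): deciding after learning is never worse in expectation:
\[
  \sum_i P(E_i)\, \max_a \sum_s P(s \mid E_i)\, u(a,s)
  \;\ge\;
  \max_a \sum_s P(s)\, u(a,s)
\]
```

When an update rule violates the first identity, the second inequality is no longer guaranteed, which is the failure mode the abstract describes for some MRE updates.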
http://www.mdpi.com/1099-4300/17/3/1135
Different security notions of secret sharing schemes have been proposed using different information measures. Entropies, such as Shannon entropy and min entropy, are frequently used to formulate security notions for secret sharing schemes. In contrast to the entropy-based approach, Kolmogorov complexity has also been defined and used to study the security of individual instances of secret sharing schemes. This paper is concerned with the security notions for secret sharing schemes defined by these various measures, including Shannon entropy, guessing probability, min entropy and Kolmogorov complexity.
Entropy 2015, Vol. 17, Issue 3, Pages 1135-1145; Article; doi: 10.3390/e17031135; published 2015-03-10. Authors: Songsong Dai, Donghui Guo.

<![CDATA[Entropy, Vol. 17, Pages 1123-1134: Projective Synchronization for a Class of Fractional-Order Chaotic Systems with Fractional-Order in the (1, 2) Interval]]>
http://www.mdpi.com/1099-4300/17/3/1123
In this paper, a projective synchronization approach for a class of fractional-order chaotic systems with fractional order 1 < q < 2 is demonstrated. The projective synchronization approach is established through rigorous theoretical analysis. To illustrate the effectiveness of the proposed scheme, we discuss two examples: (1) the fractional-order Lorenz chaotic system with fractional order q = 1.1; (2) the fractional-order modified Chua’s chaotic system with fractional order q = 1.02. The numerical simulations show the validity and feasibility of the proposed scheme.
Entropy 2015, Vol. 17, Issue 3, Pages 1123-1134; Article; doi: 10.3390/e17031123; published 2015-03-10. Authors: Ping Zhou, Rongji Bai, Jiming Zheng.

<![CDATA[Entropy, Vol. 17, Pages 1103-1122: Weakest-Link Scaling and Extreme Events in Finite-Sized Systems]]>
http://www.mdpi.com/1099-4300/17/3/1103
Weakest-link scaling is used in the reliability analysis of complex systems. It is characterized by the extensivity of the hazard function instead of the entropy. The Weibull distribution is the archetypical example of weakest-link scaling, and it describes variables such as the fracture strength of brittle materials, maximal annual rainfall, wind speed and earthquake return times. We investigate two new distributions that exhibit weakest-link scaling, i.e., a Weibull generalization known as the κ-Weibull and a modified gamma probability function that we propose herein. We show that in contrast with the Weibull and the modified gamma, the hazard function of the κ-Weibull is non-extensive, which is a signature of inter-dependence between the links. We also investigate the impact of heterogeneous links, modeled by means of a stochastic Weibull scale parameter, on the observed probability distribution.
Entropy 2015, Vol. 17, Issue 3, Pages 1103-1122; Article; doi: 10.3390/e17031103; published 2015-03-09. Authors: Dionissios Hristopulos, Manolis Petrakis, Giorgio Kaniadakis.

<![CDATA[Entropy, Vol. 17, Pages 1090-1102: Speed Gradient and MaxEnt Principles for Shannon and Tsallis Entropies]]>
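The extensivity of the hazard function referred to above can be made concrete for the Weibull case; the following derivation uses standard notation of our choosing:

```latex
% Weakest-link scaling: a chain of N independent links fails at its weakest link,
% so the system survival function is the product of the link survival functions:
\[
  S_N(x) = \left[S_1(x)\right]^N
  \quad\Longleftrightarrow\quad
  H_N(x) = N\, H_1(x),
\]
% where H = -\ln S is the cumulative hazard (extensivity). For the Weibull,
% H_1(x) = (x/\lambda)^k, hence
\[
  H_N(x) = N \left(\frac{x}{\lambda}\right)^{k}
         = \left(\frac{x}{\lambda_N}\right)^{k},
  \qquad \lambda_N = \lambda\, N^{-1/k},
\]
% i.e., the functional form is preserved with a rescaled scale parameter.
```

A hazard function that does not satisfy this proportionality, as the paper argues for the κ-Weibull, cannot arise from independent links, which is why non-extensivity signals inter-dependence.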
http://www.mdpi.com/1099-4300/17/3/1090
In this paper we consider dynamics of non-stationary processes that follow the MaxEnt principle. We derive a set of equations describing the dynamics of a system for Shannon and Tsallis entropies. Systems with discrete probability distributions are considered under mass conservation and energy conservation constraints. The existence and uniqueness of the solution are established, and the asymptotic stability of the equilibrium is proved. The equations are derived based on the speed-gradient principle, which originated in control theory.
Entropy 2015, Vol. 17, Issue 3, Pages 1090-1102; Article; doi: 10.3390/e17031090; published 2015-03-06. Authors: Alexander Fradkov, Dmitry Shalymov.

<![CDATA[Entropy, Vol. 17, Pages 1063-1089: Fully Bayesian Experimental Design for Pharmacokinetic Studies]]>
http://www.mdpi.com/1099-4300/17/3/1063
Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future dataset drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature, which rapidly obtains samples from the posterior, is importance sampling, using the prior as the importance distribution. However, importance sampling from the prior will tend to break down if there is a reasonable number of experimental observations. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution, to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study, which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near-optimal plasma sampling times that produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
Entropy 2015, Vol. 17, Issue 3, Pages 1063-1089; Article; doi: 10.3390/e17031063; published 2015-03-05. Authors: Elizabeth Ryan, Christopher Drovandi, Anthony Pettitt.

<![CDATA[Entropy, Vol. 17, Pages 1054-1062: The Hosoya Entropy of a Graph]]>
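The idea of using a Laplace approximation as the importance distribution can be sketched on a toy one-parameter model. Everything below (the Poisson likelihood, the prior, the data) is an illustrative assumption of ours, not the paper's pharmacokinetic model:

```python
import numpy as np
from scipy import optimize, stats

# Toy data: observed counts, modeled as Poisson with rate exp(theta),
# theta given a Normal(0, 2) prior.
y = np.array([3, 5, 4, 6, 2])

def log_post(theta):
    """Unnormalized log posterior of theta = log(rate)."""
    rate = np.exp(theta)
    return stats.poisson.logpmf(y, rate).sum() + stats.norm.logpdf(theta, 0.0, 2.0)

# Laplace approximation: posterior mode and curvature of the log posterior.
map_est = optimize.minimize_scalar(lambda t: -log_post(t)).x
h = 1e-4
hess = (log_post(map_est + h) - 2 * log_post(map_est) + log_post(map_est - h)) / h**2
sd = np.sqrt(-1.0 / hess)

# Importance sampling with N(MAP, sd^2) as the proposal instead of the prior.
rng = np.random.default_rng(0)
draws = rng.normal(map_est, sd, 5000)
log_w = np.array([log_post(t) for t in draws]) - stats.norm.logpdf(draws, map_est, sd)
w = np.exp(log_w - log_w.max())
w /= w.sum()                                 # self-normalized weights
post_mean_rate = np.sum(w * np.exp(draws))   # posterior mean of the rate
```

Because the proposal is centered on the posterior mode rather than the prior, the weights stay well-behaved even when the data are informative, which is the efficiency gain the paper exploits in the design loop.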
http://www.mdpi.com/1099-4300/17/3/1054
This paper demonstrates properties of Hosoya entropy, a quantitative measure of graph complexity based on a decomposition of the vertices linked to partial Hosoya polynomials. Connections between the information content of a graph and Hosoya entropy are established, and the special case of Hosoya entropy of trees is investigated.
Entropy 2015, Vol. 17, Issue 3, Pages 1054-1062; Article; doi: 10.3390/e17031054; published 2015-03-05. Authors: Abbe Mowshowitz, Matthias Dehmer.

<![CDATA[Entropy, Vol. 17, Pages 1042-1053: Tone Entropy Analysis of Foetal Heart Rate Variability]]>
http://www.mdpi.com/1099-4300/17/3/1042
Development of the foetal autonomic nervous system can be indirectly understood by looking at the changes in beat-to-beat variability in foetal heart rates. This study presents Tone-Entropy (T-E) analysis of foetal heart rate variability (HRV) at multiple lags (1–8) to understand the influence of gestational age (early and late) on the development of the foetal autonomic nervous system (ANS). The analysis was based on foetal electrocardiograms (FECGs) of 46 healthy foetuses of 20–32 weeks (early group) and 22 foetuses of 35–41 weeks (late group). Tone represents sympatho-vagal balance and entropy the total autonomic activity. Results show that tone increases and entropy decreases at all lags for the late foetus group. On the other hand, tone decreases and entropy increases at lags 1–4 in the early foetus group. Increasing tone in late foetuses might represent significant maturation of the sympathetic nervous system, because foetuses approaching delivery need increased sympathetic activity. T-E could be a quantitative clinical index for distinguishing early foetuses from late ones on the basis of the maturation of the autonomic nervous system.
Entropy 2015, Vol. 17, Issue 3, Pages 1042-1053; Article; doi: 10.3390/e17031042; published 2015-03-02. Authors: Ahsan Khandoker, Chandan Karmakar, Yoshitaka Kimura, Miyuki Endo, Sayaka Oshio, Marimuthu Palaniswami.

<![CDATA[Entropy, Vol. 17, Pages 1023-1041: Mining Informative Hydrologic Data by Using Support Vector Machines and Elucidating Mined Data according to Information Entropy]]>
http://www.mdpi.com/1099-4300/17/3/1023
The support vector machine is used as a data mining technique to extract informative hydrologic data on the basis of a strong relationship between error tolerance and the number of support vectors. Hydrologic data of flash flood events in the Lan-Yang River basin in Taiwan were used for the case study. Various percentages (from 50% to 10%) of hydrologic data, including flood stage and rainfall data, were mined and used as informative data to characterize a flood hydrograph. Information in these mined hydrologic data sets was quantified using entropy indices, namely marginal entropy, joint entropy, transinformation, and conditional entropy. Analytical results obtained using the entropy indices showed that the mined informative data could be hydrologically interpreted and meaningfully explained in terms of information entropy. Estimates of marginal and joint entropies showed that, in view of flood forecasting, the flood stage was a more informative variable than rainfall. In addition, hydrologic models with variables containing more total information were preferable to those with variables containing less total information. Analysis of transinformation showed that approximately 30% of the information on the flood stage could be derived from the upstream flood stage and 10% to 20% from the rainfall. Elucidating the mined hydrologic data by applying information theory enabled using the entropy indices to interpret various hydrologic processes.
Entropy 2015, Vol. 17, Issue 3, Pages 1023-1041; Article; doi: 10.3390/e17031023; published 2015-03-02. Author: Shien-Tsung Chen.

<![CDATA[Entropy, Vol. 17, Pages 1007-1022: Entropy Measures in the Assessment of Heart Rate Variability in Patients with Cardiodepressive Vasovagal Syncope]]>
http://www.mdpi.com/1099-4300/17/3/1007
Sample entropy (SampEn) was reported to be useful in the assessment of the complexity of heart rate dynamics. Permutation entropy (PermEn) is a new measure based on the concept of order and was previously shown to be accurate for short, non-stationary datasets. The aim of the present study is to assess whether SampEn and PermEn obtained from baseline recordings might differentiate patients with various outcomes of the head-up tilt test (HUTT). Time-domain heart rate variability (HRV) indices and several nonlinear parameters were calculated using 500-RR-interval-long ECG recordings made before tilting in patients with a history suggesting vasovagal syncope. Groups of patients with so-called cardiodepressive vasovagal syncope (VVS_2) during HUTT and patients who did not faint during the test were compared. Two types of HUT tests were analyzed: with spontaneous (SB) or controlled breathing (CB). In our study, SampEn was higher in VVS_2 patients during SB, and PermEn was higher in VVS_2 patients during CB. Irrespective of the type of breathing during the test, SampEn and PermEn were similar in patients with the same type of reaction during HUTT. The use of several entropy-based parameters seems to be useful in HRV assessment in patients with vasovagal fainting.
Entropy 2015, Vol. 17, Issue 3, Pages 1007-1022; Article; doi: 10.3390/e17031007; published 2015-03-02. Authors: Beata Graff, Grzegorz Graff, Danuta Makowiec, Agnieszka Kaczkowska, Dorota Wejer, Szymon Budrejko, Dariusz Kozłowski, Krzysztof Narkiewicz.

<![CDATA[Entropy, Vol. 17, Pages 984-1006: Phytotoponyms, Geographical Features and Vegetation Coverage in Western Hubei, China]]>
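Sample entropy, one of the two measures used above, admits a compact (if naive, O(n²)) sketch. The defaults m = 2 and r = 0.2·SD are conventional choices, not necessarily the paper's settings:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    sequences similar for m points remain similar at m+1 points (naive sketch)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(mm):
        # All templates of length mm; Chebyshev distance; self-matches excluded.
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(d <= r)) - 1
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal (e.g., a slowly sampled sine) yields a low SampEn, while white noise yields a high one, which is the sense in which SampEn quantifies complexity of RR-interval series.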
http://www.mdpi.com/1099-4300/17/3/984
The purpose of this paper is to present and exploit fundamental information, such as semantic meanings and geographical features, of phytotoponyms (a type of toponym that includes plant names) in Western Hubei (China). Long-term vegetation degradation is also estimated. Toponym data for this study were obtained from the place names database of Hubei Province at the Civil Affairs Department of Hubei. In total, 1259 instances of phytotoponyms were recognised; 898 (71.3%) were woody plant toponyms, and 361 (28.7%) were herbaceous plant toponyms. Subsequently, we randomly selected a similar number (1250) of non-phytotoponyms to compare with the phytotoponyms. All toponyms were localised and geo-referenced. The results showed that the most common plant names recognisable in place names are common plants that have a close connection with daily life and positive morals in Chinese culture and literature. The occurrence of plant names can reflect the characteristic plants of a city. The vegetation coverage rate where phytotoponyms are located is higher than that in non-phytotoponym areas. Altitude has a stronger correlation with the number of phytotoponyms than slope and vegetation coverage degree. The identification of long-term vegetation degradation based on phytotoponyms is presented for reference only, and other methods and materials are needed to validate these results.
Entropy 2015, Vol. 17, Issue 3, Pages 984-1006; Article; doi: 10.3390/e17030984; published 2015-03-02. Authors: Guanghui Shi, Fu Ren, Qingyun Du, Nan Gao.

<![CDATA[Entropy, Vol. 17, Pages 968-983: Do Transitive Preferences Always Result in Indifferent Divisions?]]>
http://www.mdpi.com/1099-4300/17/3/968
The transitivity of preferences is one of the basic assumptions used in the theory of games and decisions. It is often equated with the rationality of choice and is considered useful in building rankings. Intransitive preferences are considered paradoxical and undesirable. This problem is discussed by many social and natural scientists. A simple model of a sequential game in which two players choose one of two elements in each iteration is discussed in this paper. The players make their decisions in different contexts defined by the rules of the game. It appears that the optimal strategy of one of the players can only be intransitive (the so-called relevant intransitive strategy)! On the other hand, the optimal strategy for the second player can be either transitive or intransitive. A quantum model of the game using pure one-qubit strategies is considered. In this model, an increase in the importance of intransitive strategies is observed: there is a certain course of the game where intransitive strategies are the only optimal strategies for both players. The study of decision-making models using quantum information theory tools may shed some new light on the understanding of mechanisms that drive the formation of types of preferences.
Entropy 2015, Vol. 17, Issue 3, Pages 968-983; Article; doi: 10.3390/e17030968; published 2015-03-02. Authors: Marcin Makowski, Edward Piotrowski, Jan Sładkowski.

<![CDATA[Entropy, Vol. 17, Pages 950-967: Entropy Rate Maps of Complex Excitable Dynamics in Cardiac Monolayers]]>
http://www.mdpi.com/1099-4300/17/3/950
The characterization of spatiotemporal complexity remains a challenging task. This holds in particular for the analysis of data from fluorescence imaging (optical mapping), which allows for the measurement of membrane potential and intracellular calcium at high spatial and temporal resolutions and, therefore, allows for an investigation of cardiac dynamics. Dominant frequency maps and the analysis of phase singularities are frequently used for this type of excitable media. These methods address some important aspects of cardiac dynamics; however, they only consider very specific properties of excitable media. To extend the scope of the analysis, we present a measure based on entropy rates for determining spatiotemporal complexity patterns of excitable media. Simulated data generated by the Aliev–Panfilov model and the cubic Barkley model are used to validate this method. Then, we apply it to optical mapping data from monolayers of cardiac cells from chicken embryos and compare our findings with dominant frequency maps and the analysis of phase singularities. The studies indicate that entropy rate maps provide additional information about local complexity, the origins of wave breakup and the development of patterns governing unstable wave propagation.
Entropy 2015, Vol. 17, Issue 3, Pages 950-967; Article; doi: 10.3390/e17030950; published 2015-02-26. Authors: Alexander Schlemmer, Sebastian Berg, T. Shajahan, Stefan Luther, Ulrich Parlitz.

<![CDATA[Entropy, Vol. 17, Pages 928-949: Instantaneous 3D EEG Signal Analysis Based on Empirical Mode Decomposition and the Hilbert–Huang Transform Applied to Depth of Anaesthesia]]>
http://www.mdpi.com/1099-4300/17/3/928
Depth of anaesthesia (DoA) is an important measure for assessing the degree to which the central nervous system of a patient is depressed by a general anaesthetic agent, depending on the potency and concentration with which anaesthesia is administered during surgery. We can monitor the DoA by observing the patient’s electroencephalography (EEG) signals during the surgical procedure. Typically, high-frequency EEG signals indicate that the patient is conscious, while low-frequency signals mean the patient is in a general anaesthetic state. If the anaesthetist is able to observe the instantaneous frequency changes of the patient’s EEG signals during surgery, this can help to better regulate and monitor DoA, reducing surgical and post-operative risks. This paper describes an approach towards the development of a 3D real-time visualization application which can show the instantaneous frequency and instantaneous amplitude of EEG simultaneously by using empirical mode decomposition (EMD) and the Hilbert–Huang transform (HHT). HHT uses the EMD method to decompose a signal into so-called intrinsic mode functions (IMFs). The Hilbert spectral analysis method is then used to obtain instantaneous frequency data. The HHT provides a new method of analyzing non-stationary and nonlinear time series data. We investigate this approach by analyzing EEG data collected from patients undergoing surgical procedures. The results show that the EEG differences between three distinct surgical stages computed by using sample entropy (SampEn) are consistent with the expected differences between these stages based on the bispectral index (BIS), which has been shown to be a quantifiable measure of the effect of anaesthetics on the central nervous system. Also, the proposed filtering approach is more effective than the standard filtering method in filtering out signal noise, producing more consistent results than those provided by the BIS. The proposed approach is therefore able to distinguish between key operational stages related to DoA, which is consistent with the clinical observations. SampEn can also be viewed as a useful index for evaluating and monitoring the DoA of a patient when used in combination with this approach.
Entropy 2015, Vol. 17, Issue 3, Pages 928-949; Article; doi: 10.3390/e17030928; published 2015-02-20. Authors: Mu-Tzu Shih, Faiyaz Doctor, Shou-Zen Fan, Kuo-Kuang Jen, Jiann-Shing Shieh.

<![CDATA[Entropy, Vol. 17, Pages 914-927: Application of the Permutation Entropy over the Heart Rate Variability for the Improvement of Electrocardiogram-based Sleep Breathing Pause Detection]]>
http://www.mdpi.com/1099-4300/17/3/914
In this paper the permutation entropy (PE) obtained from heart rate variability (HRV) is analyzed in a statistical model. In this model we also integrate other feature extraction techniques: the cepstrum coefficients derived from the same HRV and a set of band powers obtained from the electrocardiogram-derived respiratory (EDR) signal. The aim of the model is detecting obstructive sleep apnea (OSA) events. For this purpose, we apply two statistical classification methods: Logistic Regression (LR) and Quadratic Discriminant Analysis (QDA). For testing the models we use seventy ECG recordings from the Physionet database, which are divided into equal-size learning and testing sets. Both sets consist of 35 recordings, each containing a single ECG signal. In our experiments we have found that the features extracted from the EDR signal present a sensitivity of 65.6% and specificity of 87.7% (auc = 85) in the LR classifier, and sensitivity of 59.4% and specificity of 90.3% (auc = 83.9) in the QDA classifier. The HRV-based cepstrum coefficients present a sensitivity of 63.8% and specificity of 89.2% (auc = 86) in the LR classifier, and sensitivity of 67.2% and specificity of 86.8% (auc = 86.9) in the QDA. Subsequent tests show that the contribution of the permutation entropy increases the performance of the classifiers, implying that the complexity of the RR interval time series plays an important role in breathing pause detection. Particularly, when all features are jointly used, the quantification task reaches a sensitivity of 71.9% and specificity of 92.1% (auc = 90.3) for LR. Similarly, for QDA the sensitivity is 75.1% and the specificity is 90.5% (auc = 91.7).
Entropy 2015, Vol. 17, Issue 3, Pages 914-927; Article; doi: 10.3390/e17030914; published 2015-02-20. Authors: Antonio Ravelo-García, Juan Navarro-Mesa, Ubay Casanova-Blancas, Sofia Martin-Gonzalez, Pedro Quintana-Morales, Iván Guerra-Moreno, José Canino-Rodríguez, Eduardo Hernández-Pérez.

<![CDATA[Entropy, Vol. 17, Pages 903-913: Thermophysical Characteristics of the Ferrofluid in a Vertical Rectangle]]>
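Permutation entropy, the feature highlighted in the sleep-apnea study above, follows the Bandt–Pompe construction: count ordinal patterns of embedded windows and take the Shannon entropy of their frequencies. A minimal sketch (our parameter defaults, not the paper's):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series (minimal sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))      # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    h = -np.sum(p * np.log2(p))
    # Normalize by log2(order!) so that 0 <= PE <= 1.
    return h / np.log2(factorial(order)) if normalize else h
```

A monotonic series exhibits a single ordinal pattern (PE = 0), while white noise visits all order! patterns nearly uniformly (PE close to 1), which is why PE serves as a complexity feature for RR-interval series.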
http://www.mdpi.com/1099-4300/17/2/903
The article aimed to analytically investigate the thermophysical behavior of a ferrofluid in a vertical rectangle under variation of the intensity of the magnetic field, the viscosity of the ferrofluid and the boundary conditions. The governing equations of the ferrofluid include the continuity, momentum and energy equations, which describe the thermal-fluidic behavior of the ferrofluid; the Maxwell equation and the magnetization equation are also added to account for the rotational effect of the nano-sized particles. The flow behavior and heat transfer characteristics of the ferrofluid under different magnetic field intensities, ferrofluid viscosities and boundary conditions were analyzed through isotherms, velocity profiles and both mean and local Nusselt numbers. As a result, the isotherms of the ferrofluid in the vertical rectangle increased with increasing magnetite volume fraction and magnetic field intensity. In addition, the mean Nusselt numbers increased with increasing magnetite volume fraction at all magnetic field intensities because of the combined effects of heat conduction by the magnetite and the magnetic volume force.
Entropy 2015, Vol. 17, Issue 2, Pages 903-913; Communication; doi: 10.3390/e17020903; published 2015-02-16. Authors: Jae-Hyeong Seo, Byoung-Hee You, Sang-Seuk Kwen, Dong-Yeon Lee, Moo-Yeon Lee.

<![CDATA[Entropy, Vol. 17, Pages 885-902: On Analytical Solutions of the Fractional Differential Equation with Uncertainty: Application to the Basset Problem]]>
http://www.mdpi.com/1099-4300/17/2/885
In this paper, we apply the concept of Caputo’s H-differentiability, constructed based on the generalized Hukuhara difference, to solve the fuzzy fractional differential equation (FFDE) with uncertainty. This is in contrast to conventional solutions that either require a quantity of fractional derivatives of the unknown solution at the initial point (Riemann–Liouville) or a solution with increasing length of its support (Hukuhara difference). Then, in order to solve the FFDE analytically, we introduce the fuzzy Laplace transform of the Caputo H-derivative. To the best of our knowledge, there is limited research devoted to analytical methods for solving the FFDE under fuzzy Caputo fractional differentiability. An analytical solution is presented to confirm the capability of the proposed method.
Entropy 2015, Vol. 17, Issue 2, Pages 885-902; Article; doi: 10.3390/e17020885; published 2015-02-16. Authors: Soheil Salahshour, Ali Ahmadian, Norazak Senu, Dumitru Baleanu, Praveen Agarwal.

<![CDATA[Entropy, Vol. 17, Pages 882-884: Entropy Best Paper Award 2015]]>
http://www.mdpi.com/1099-4300/17/2/882
We are pleased to announce the “Entropy Best Paper Award” for 2015. Nominations were selected by the Editor-in-Chief and designated Editorial Board Members from all the papers published in 2011. Reviews and research papers were evaluated separately. We gladly announce that the following three papers have won the Entropy Best Paper Award in 2015: [...]
Entropy 2015, Vol. 17, Issue 2, Pages 882-884; Editorial; doi: 10.3390/e17020882; published 2015-02-16. Author: Kevin Knuth.

<![CDATA[Entropy, Vol. 17, Pages 866-881: Optimal Design of Magnetohydrodynamic Mixed Convection Flow in a Vertical Channel with Slip Boundary Conditions and Thermal Radiation Effects by Using an Entropy Generation Minimization Method]]>
http://www.mdpi.com/1099-4300/17/2/866
Investigation of the effect of thermal radiation on a fully developed magnetohydrodynamic (MHD) convective flow of a Newtonian, incompressible and electrically conducting fluid in a vertical microchannel bounded by two infinite vertical parallel plates with constant-temperature walls, subject to a lateral magnetic field of uniform strength, is presented. The Rosseland model for conduction-radiation heat transfer in an absorbing medium and two plates with slip-flow and no-slip conditions are assumed. In addition, the induced magnetic field is neglected due to the assumption of a small magnetic Reynolds number. The non-dimensional governing equations are solved numerically using the Runge–Kutta–Fehlberg method with a shooting technique. The channel is optimized based on the Second Law of Thermodynamics by varying parameters such as the thermal radiation parameter, the temperature parameter, the Hartmann number, the Grashof-to-Reynolds ratio, the velocity slip length, and the temperature jump.
Entropy 2015, Vol. 17, Issue 2, Pages 866-881; Article; doi: 10.3390/e17020866; published 2015-02-13. Authors: Mohamad Abdollahzadeh Jamalabadi, Jae Park, Chang Lee.

<![CDATA[Entropy, Vol. 17, Pages 852-865: Relational Probabilistic Conditionals and Their Instantiations under Maximum Entropy Semantics for First-Order Knowledge Bases]]>
http://www.mdpi.com/1099-4300/17/2/852
For conditional probabilistic knowledge bases with conditionals based on propositional logic, the principle of maximum entropy (ME) is well-established, determining a unique model inductively completing the explicitly given knowledge. On the other hand, there is no general agreement on how to extend the ME principle to relational conditionals containing free variables. In this paper, we focus on two approaches to ME semantics that have been developed for first-order knowledge bases: aggregating semantics and a grounding semantics. Since they use different variants of conditionals, we define the logic PCI, which covers both approaches as special cases and provides a framework where the effects of both approaches can be studied in detail. While the ME models under PCI-grounding and PCI-aggregating semantics are different in general, we point out that parametric uniformity of a knowledge base ensures that both semantics coincide. Using some concrete knowledge bases, we illustrate the differences and common features of both approaches, looking in particular at the ground instances of the given conditionals.
Entropy 2015, Vol. 17, Issue 2, Pages 852-865; Article; doi: 10.3390/e17020852; published 2015-02-13. Authors: Christoph Beierle, Marc Finthammer, Gabriele Kern-Isberner.

<![CDATA[Entropy, Vol. 17, Pages 841-851: Probabilistic Three-Party Sharing of Operation on a Remote Qubit]]>
http://www.mdpi.com/1099-4300/17/2/841
A probabilistic tripartite single-qubit operation sharing scheme is put forward by utilizing a two-qubit and a three-qubit non-maximally entangled state as quantum channels. Some specific comparisons between our scheme and another probabilistic scheme are made. It is found that, if the product of the two minimal coefficients characterizing channel entanglements is greater than 3/16, our scheme is superior to the other one. Nonetheless, the price is that more classical and quantum resources are consumed, and the operational difficulty is increased. Moreover, some important features of the scheme, such as its security, probability and sharer symmetry, are revealed through concrete discussions. Additionally, the experimental feasibility of our scheme is analyzed and subsequently confirmed according to current experimental techniques.
Entropy 2015, Vol. 17, Issue 2, Pages 841-851; Article; doi: 10.3390/e17020841; published 2015-02-12. Authors: Chuanmei Xie, Yimin Liu, Hang Xing, Zhanjun Zhang.