Entropy doi: 10.3390/e20100791

Authors: Heinz Herwig

In order to teach heat transfer systematically and with a clear physical background, it is recommended that entropy not be ignored as a fundamental quantity. Heat transfer processes are characterized by introducing the so-called “entropic potential” of the transferred energy, and an assessment number based on this new quantity is introduced.

Entropy doi: 10.3390/e20100790

Authors: Jiří Náprstek, Cyril Fischer

In this study, we consider a method for investigating the stochastic response of a nonlinear dynamical system affected by a random seismic process. We present the solution of the probability density of a single/multiple-degree of freedom (SDOF/MDOF) system with several statically stable equilibrium states and with possible jumps of the snap-through type. The system is a Hamiltonian system with weak damping excited by a system of non-stationary Gaussian white noise. The solution based on the Gibbs principle of the maximum entropy of probability could potentially be implemented in various branches of engineering. The search for the extreme of the Gibbs entropy functional is formulated as a constrained optimization problem. The secondary constraints follow from the Fokker–Planck equation (FPE) for the system considered or from the system of ordinary differential equations for the stochastic moments of the response derived from the relevant FPE. In terms of the application type, this strategy is most suitable for SDOF/MDOF systems containing polynomial type nonlinearities. Thus, the solution links up with the customary formulation of the finite elements discretization for strongly nonlinear continuous systems.

Entropy doi: 10.3390/e20100789

Authors: Sylvain Barbay, Saliya Coulibaly, Marcel G. Clerc

Out-of-equilibrium systems exhibit complex spatiotemporal behaviors when they present a secondary bifurcation to an oscillatory instability. Here, we investigate the complex dynamics shown by a pulsing regime in an extended, one-dimensional semiconductor microcavity laser whose cavity is composed of integrated gain and saturable absorber media. This system is known, both experimentally and theoretically, to give rise to extreme events characterized by rare, high-amplitude optical pulses following the onset of spatiotemporal chaos. Based on a theoretical model, we reveal a dynamical behavior characterized by the chaotic alternation of phase and amplitude turbulence. The highest-amplitude pulses, i.e., the extreme events, are observed in the phase turbulence zones. This chaotic alternation between different turbulent regimes contrasts with what is usually observed in a generic amplitude equation model such as the Ginzburg–Landau model. Hence, these regimes provide some insight into the poorly known properties of the complex spatiotemporal dynamics exhibited by secondary instabilities of an Andronov–Hopf bifurcation.

Entropy doi: 10.3390/e20100788

Authors: Xiao Zhang, Xia Liu, Yanyan Yang

The information entropy developed by Shannon is an effective measure of uncertainty in data, and rough set theory is a useful computational tool for dealing with vague and uncertain data. Information entropy has been extensively applied in rough set theory, and different information entropy models have been proposed in rough sets. In this paper, building on an existing feature selection method that uses a fuzzy rough set-based information entropy, we provide a corresponding fast algorithm for efficient implementation, in which the fuzzy rough set-based information entropy taken as the evaluation measure for selecting features is computed by an improved mechanism with lower complexity. The essence of the acceleration is to use iteratively reduced instances to compute the lambda-conditional entropy. Numerical experiments are conducted to show the performance of the proposed fast algorithm, and the results demonstrate that it selects the same feature subset as its original counterpart, but in significantly less time.
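The paper's lambda-conditional entropy is specific to the fuzzy rough setting, but the underlying idea (scoring a feature by the expected entropy of the decision labels within the instance blocks it induces) can be sketched with the classical, crisp Shannon conditional entropy. This is an illustrative simplification, not the authors' algorithm; the function names and toy data are invented for the example.

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Shannon entropy H(D) of a sequence of discrete labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(feature, labels):
    """H(D | A): expected entropy of the decision labels D within each
    block of instances sharing the same value of feature A."""
    n = len(labels)
    blocks = defaultdict(list)
    for f, d in zip(feature, labels):
        blocks[f].append(d)
    return sum(len(b) / n * entropy(b) for b in blocks.values())

# Toy data: feature A perfectly determines the decision D,
# so the conditional entropy drops to zero.
feature = ['a', 'a', 'b', 'b']
labels = [0, 0, 1, 1]
print(entropy(labels))                       # 1.0
print(conditional_entropy(feature, labels))  # 0.0
```

A feature-selection loop would greedily keep the feature minimizing H(D | A); the fuzzy rough version replaces the crisp blocks with fuzzy similarity classes, which is where the paper's improved computation mechanism applies.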

Entropy doi: 10.3390/e20100787

Authors: Hervé Bergeron, Jean-Pierre Gazeau

Any quantization linearly maps functions on a phase space to symmetric operators in a Hilbert space. Covariant integral quantization combines an operator-valued measure with the symmetry group of the phase space. Covariant means that the quantization map intertwines classical (geometric operations) and quantum (unitary transformations) symmetries. Integral means that we use all the resources of integral calculus in order to implement the method when we apply it to singular functions or distributions, for which integral calculus is an essential ingredient. We first review this quantization scheme before revisiting the cases where symmetry covariance is described by the Weyl-Heisenberg group and the affine group, respectively, and we emphasize the fundamental role played by the Fourier transform in both cases. As an original outcome of our generalisations of the Wigner-Weyl transform, we show that many properties of the Weyl integral quantization, commonly viewed as optimal, are actually shared by a large family of integral quantizations.

Entropy doi: 10.3390/e20100786

Authors: William Cruz-Santos, Salvador E. Venegas-Andraca, Marco Lanzagorta

In this paper, we propose a methodology to solve the stereo matching problem through quantum annealing optimization. Our proposal takes advantage of the existing Min-Cut/Max-Flow network formulation of computer vision problems. Based on this network formulation, we construct a quadratic pseudo-Boolean function and then optimize it through the use of the D-Wave quantum annealing technology. Experimental validation using two kinds of stereo image pairs, random dot stereograms and gray-scale images, shows that our methodology is effective.

Entropy doi: 10.3390/e20100785

Authors: Matteo Bruno, Fabio Saracco, Tiziano Squartini, Marco Dueñas

In this paper, we analyse the bipartite Colombian firms-products network over a period of five years, from 2010 to 2014. Our analysis depicts a strongly modular system, with several groups of firms specializing in the export of specific categories of products. These clusters have been detected by running the bipartite variant of traditional modularity maximization, revealing a bi-modular structure. Interestingly, this finding is refined by applying a recently proposed algorithm for projecting bipartite networks onto the layer of interest and then running the Louvain algorithm on the resulting monopartite representations. Important structural differences emerge upon comparing the Colombian firms-products network with the World Trade Web; in particular, the bipartite representation of the latter is not characterized by a similar block structure, as modularity maximization fails to reveal (bipartite) node clusters. This points out that economic systems behave differently at different scales: while countries tend to diversify their production, potentially exporting a large number of different products, firms specialize in exporting (substantially very limited) baskets of basically homogeneous products.

Entropy doi: 10.3390/e20100784

Authors: Peter Harremoës

We study entropy inequalities for variables that are related by functional dependencies. Although the powerset on four variables is the smallest Boolean lattice with non-Shannon inequalities, there exist lattices with many more variables where the Shannon inequalities are sufficient. We search for conditions that exclude the existence of non-Shannon inequalities. The existence of non-Shannon inequalities is related to the question of whether a lattice is isomorphic to a lattice of subgroups of a group. In order to formulate and prove the results, one has to bridge lattice theory, group theory, the theory of functional dependencies and the theory of conditional independence. It is demonstrated that the Shannon inequalities are sufficient for planar modular lattices. The proof applies a gluing technique that uses the fact that if the Shannon inequalities are sufficient for the pieces, then they are also sufficient for the whole lattice. It is conjectured that the Shannon inequalities are sufficient if and only if the lattice does not contain a special lattice as a sub-semilattice.

Entropy doi: 10.3390/e20100783

Authors: Vito D. P. Servedio, Paolo Buttà, Dario Mazzilli, Andrea Tacchella, Luciano Pietronero

We present a new metric estimating the fitness of countries and the complexity of products by exploiting a non-linear, non-homogeneous map applied to publicly available information on the goods exported by a country. The non-homogeneous terms guarantee both convergence and stability. After a suitable rescaling of the relevant quantities, the non-homogeneous terms are eventually set to zero, so that this new metric is parameter-free. This new map almost reproduces the results of the original homogeneous metrics already defined in the literature and allows for an approximate analytic solution in the case of actual binarized matrices based on the Revealed Comparative Advantage (RCA) indicator. This solution is connected with a new quantity describing the neighborhood of nodes in bipartite graphs, representing in this work the relations between countries and exported products. Moreover, we define a new indicator of country net-efficiency quantifying how efficiently a country invests in capabilities able to generate innovative, complex, high-quality products. Finally, we demonstrate analytically the local convergence of the algorithm involved.
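The homogeneous metric that this map reduces to, when the non-homogeneous terms are set to zero, is the fitness-complexity iteration of the economic complexity literature. A minimal sketch follows, assuming a binary country-product matrix already obtained by RCA thresholding; the toy matrix and iteration count are illustrative choices, not taken from the paper.

```python
def fitness_complexity(M, n_iter=20):
    """Iterate the homogeneous fitness-complexity map on a binary
    country x product matrix M, normalizing each vector to mean 1."""
    nc, npr = len(M), len(M[0])
    F = [1.0] * nc   # country fitness
    Q = [1.0] * npr  # product complexity
    for _ in range(n_iter):
        # Fitness: sum of the complexities of the exported products.
        F_new = [sum(M[c][p] * Q[p] for p in range(npr)) for c in range(nc)]
        # Complexity: penalized by the least-fit exporters of the product.
        Q_new = [1.0 / sum(M[c][p] / F[c] for c in range(nc)) for p in range(npr)]
        sF, sQ = sum(F_new), sum(Q_new)
        F = [x * nc / sF for x in F_new]
        Q = [x * npr / sQ for x in Q_new]
    return F, Q

# Toy nested (triangular) matrix: country 0 exports everything,
# country 2 only the most ubiquitous product.
M = [[1, 1, 1],
     [1, 1, 0],
     [1, 0, 0]]
F, Q = fitness_complexity(M)
print(F, Q)
```

On this matrix the diversified country ends up with the highest fitness and the product exported by everyone with the lowest complexity; the paper's non-homogeneous terms are precisely what tames the convergence of this map for poorly diversified countries, whose fitness the homogeneous iteration drives toward zero.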

Entropy doi: 10.3390/e20100782

Authors: Christos Papadimitriou, Georgios Piliouras

In 1950, Nash proposed a natural equilibrium solution concept for games, since called the Nash equilibrium, and proved that all finite games have at least one. The proof is through a simple yet ingenious application of Brouwer’s (or, in another version, Kakutani’s) fixed point theorem, the most sophisticated result of his era’s topology; in fact, recent algorithmic work has established that Nash equilibria are computationally equivalent to fixed points. In this paper, we propose a new class of universal non-equilibrium solution concepts arising from an important theorem in the topology of dynamical systems that was unavailable to Nash. This approach starts with both a game and a learning dynamics, defined over mixed strategies. The Nash equilibria are fixpoints of the dynamics, but the system behavior is captured by an object far more general than the Nash equilibrium, known in dynamical systems theory as the chain recurrent set. Informally, once we focus on this solution concept (this notion of “the outcome of the game”), every game behaves like a potential game, with the dynamics converging to these states. In other words, unlike Nash equilibria, this solution concept is algorithmic in the sense that it has a constructive proof of existence. We characterize this solution for simple benchmark games under replicator dynamics, arguably the best-known evolutionary dynamics in game theory. For (weighted) potential games, the new concept coincides with the fixpoints/equilibria of the dynamics. However, in (variants of) zero-sum games with fully mixed (i.e., interior) Nash equilibria, it covers the whole state space, as the dynamics satisfy specific information-theoretic constants of motion. We discuss numerous novel computational, as well as structural, combinatorial questions raised by this chain recurrence conception of games.
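The claim that in a potential game the dynamics converge to the fixpoints can be illustrated with a minimal Euler-discretized simulation of replicator dynamics on a 2x2 coordination game; the payoff matrix, step size, and starting point are arbitrary choices for the sketch, not taken from the paper.

```python
def replicator_step(x, A, dt=0.01):
    """One Euler step of the replicator dynamics
    dx_i/dt = x_i * ((A x)_i - x . A x)."""
    n = len(x)
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    avg = sum(x[i] * Ax[i] for i in range(n))
    return [x[i] + dt * x[i] * (Ax[i] - avg) for i in range(n)]

# A 2x2 coordination game: a potential game with two pure Nash equilibria
# and an unstable interior equilibrium at x = (1/3, 2/3).
A = [[2.0, 0.0],
     [0.0, 1.0]]
x = [0.6, 0.4]  # start in the basin of the first pure equilibrium
for _ in range(5000):
    x = replicator_step(x, A)
print(x)  # converges toward the pure Nash equilibrium (1, 0)
```

In the zero-sum variants discussed in the abstract, the same dynamics would instead cycle, conserving an information-theoretic quantity rather than converging, which is exactly why the chain recurrent set, and not the equilibrium set, captures the long-run behavior.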

Entropy doi: 10.3390/e20100781

Authors: Wojciech Chmiel, Joanna Kwiecień

The paper focuses on the application of a quantum-inspired evolutionary algorithm to determining the minimal cost of assignment in the quadratic assignment problem. The idea behind the paper is to present how the algorithm has to be adapted to this problem, including crossover and mutation operators, and how quantum principles are introduced in particular procedures. The results have shown that the performance of our approach in terms of converging to the best solutions is satisfactory. Moreover, we have presented the effects of selected parameters of the approach on the quality of the obtained solutions.

Entropy doi: 10.3390/e20100780

Authors: Mohamed Hatifi, Ralph Willox, Samuel Colin, Thomas Durt

Recently, the properties of bouncing oil droplets, also known as “walkers,” have attracted much attention because they are thought to offer a gateway to a better understanding of quantum behavior. They indeed constitute a macroscopic realization of wave-particle duality, in the sense that their trajectories are guided by a self-generated surrounding wave. The aim of this paper is to try to describe walker phenomenology in terms of de Broglie–Bohm dynamics and of a stochastic version thereof. In particular, we first study how a stochastic modification of the de Broglie pilot-wave theory, à la Nelson, affects the process of relaxation to quantum equilibrium, and we prove an H-theorem for the relaxation to quantum equilibrium under Nelson-type dynamics. We then compare the onset of equilibrium in the stochastic and the de Broglie–Bohm approaches and we propose some simple experiments by which one can test the applicability of our theory to the context of bouncing oil droplets. Finally, we compare our theory to actual observations of walker behavior in a 2D harmonic potential well.

Entropy doi: 10.3390/e20100779

Authors: Renaldas Urniezius, Vytautas Galvanauskas, Arnas Survyla, Rimvydas Simutis, Donatas Levisauskas

For historical reasons, due to industrial knowledge of reproducibility and to restrictions imposed by regulations, open-loop feeding control approaches dominate in industrial fed-batch cultivation processes. In this study, a generic gray box biomass modeling procedure uses relative entropy as a key to approach the posterior similarly to how the prior distribution approaches the posterior distribution along the multivariate path of Lagrange multipliers, for which a description of a nuisance time is introduced. The ultimate purpose of this study was to develop a numerical semi-global convex optimization procedure dedicated to the calculation of feeding rate time profiles during fed-batch cultivation processes. The proposed numerical semi-global convex optimization of relative entropy is restricted neither to the gray box model nor to the bioengineering application. From the bioengineering application perspective, the proposed bioprocess design technique has benefits for both regular feed-forward control and advanced adaptive control systems, in which a model for biomass growth prediction is compulsory. After identification of the gray box model parameters, the options and alternatives in controllable industrial biotechnological processes are described. The main aim of this work is to achieve high reproducibility, controllability, and the desired process performance. Glucose concentration measurements, which were used for the development of the model, become unnecessary for the development of the desired microbial cultivation process.

Entropy doi: 10.3390/e20100778

Authors: Yeqiang Bu, Shenyou Peng, Shiwei Wu, Yujie Wei, Gang Wang, Jiabin Liu, Hongtao Wang

The bulk high-entropy alloys (HEAs) exhibit similar deformation behaviours as traditional metals. These bulk behaviours are likely an averaging of the behaviours exhibited at the nanoscale. Herein, in situ atomic-scale observation of deformation behaviours in nanoscaled CoCrCuFeNi face-centred cubic (FCC) HEA was performed. The deformation behaviours of this nanoscaled FCC HEA (i.e., nanodisturbances and phase transformations) were distinct from those of nanoscaled traditional FCC metals and corresponding bulk HEA. First-principles calculations revealed an obvious fluctuation of the stacking fault energy and stability difference at the atomic scale in the HEA. The stability difference was highlighted only in the nanoscaled HEA and induced unconventional deformation behaviours. Our work suggests that the nanoscaled HEA may provide more chances to discover the long-expected essential distinction between the HEAs and traditional metals.

Entropy doi: 10.3390/e20100777

Authors: Matúš Medo, Manuel Mariani, Linyuan Lü

Real networks typically studied in various research fields—ecology and economic complexity, for example—often exhibit a nested topology, which means that the neighborhoods of high-degree nodes tend to include the neighborhoods of low-degree nodes. Focusing on nested networks, we study the problem of link prediction in complex networks, which aims at identifying likely candidates for missing links. We find that a new method that takes network nestedness into account outperforms well-established link-prediction methods not only when the input networks are sufficiently nested, but also for networks where the nested structure is imperfect. Our study paves the way to search for optimal methods for link prediction in nested networks, which might be beneficial for World Trade and ecological network analysis.

Entropy doi: 10.3390/e20100776

Authors: Angelica Sbardella, François Perruchas, Lorenzo Napolitano, Nicolò Barbieri, Davide Consoli

The present study provides an analysis of empirical regularities in the development of green technology. We use patent data to examine inventions that can be traced to the environment-related catalogue (ENV-Tech) covering technologies in environmental management, water-related adaptation and climate change mitigation. Furthermore, we employ the Economic Fitness-Complexity (EFC) approach to assess their development and geographical distribution across countries between 1970 and 2010. This allows us to identify three typologies of countries: leaders, laggards and catch-up. While, as expected, there is a direct relationship between GDP per capita and invention capacity, we also document the remarkable growth of East Asian countries that started from the periphery and rapidly established themselves as key actors. This geographical pattern coincides with higher integration across domains so that, while the relative development of individual areas may have peaked, there is now demand for greater interoperability across green technologies.

Entropy doi: 10.3390/e20100775

Authors: Michiel Straat, Fthi Abadi, Christina Göpfert, Barbara Hammer, Michael Biehl

We introduce a modeling framework for the investigation of on-line machine learning processes in non-stationary environments. We exemplify the approach in terms of two specific model situations: In the first, we consider the learning of a classification scheme from clustered data by means of prototype-based Learning Vector Quantization (LVQ). In the second, we study the training of layered neural networks with sigmoidal activations for the purpose of regression. In both cases, the target, i.e., the classification or regression scheme, is considered to change continuously while the system is trained from a stream of labeled data. We extend and apply methods borrowed from statistical physics which have been used frequently for the exact description of training dynamics in stationary environments. Extensions of the approach allow for the computation of typical learning curves in the presence of concept drift in a variety of model situations. First results are presented and discussed for stochastic drift processes in classification and regression problems. They indicate that LVQ is capable of tracking a classification scheme under drift to a non-trivial extent. Furthermore, we show that concept drift can cause the persistence of sub-optimal plateau states in gradient based training of layered neural networks for regression.

Entropy doi: 10.3390/e20100774

Authors: Yimin Yin, Xiaojun Duan

In this paper, a rigorous formalism of information transfer within a multi-dimensional deterministic dynamic system is established for both continuous flows and discrete mappings. The underlying mechanism is derived from entropy change and transfer during the evolutions of multiple components. While this work is mainly focused on three-dimensional systems, the analysis of information transfer among state variables can be generalized to high-dimensional systems. Explicit formulas are given and verified in the classical Lorenz and Chua’s systems. The uncertainty of information transfer is quantified for all variables, with which a dynamic sensitivity analysis could be performed statistically as an additional benefit. The generalized formalisms can be applied to study dynamical behaviors as well as asymptotic dynamics of the system. The simulation results can help to reveal some underlying information for understanding the system better, which can be used for prediction and control in many diverse fields.

Entropy doi: 10.3390/e20100773

Authors: Octavio Obregón, José Luis López, Marco Ortega-Cruz

We explore some important consequences of the quantum ideal Bose gas, the properties of which are described by a non-extensive entropy. We consider in particular two entropies that depend only on the probability. These entropies are defined in the framework of superstatistics, and in this context, such entropies arise when a system is exposed to non-equilibrium conditions, whose general effects can be described by a generalized Boltzmann factor and, correspondingly, by a generalized probability distribution defining a different statistics. We generalize the usual statistics to their quantum counterparts, and we focus on the properties of the corresponding generalized quantum ideal Bose gas. The most important consequence of the generalized Bose gas is that the critical temperature predicted for condensation changes in comparison with that of the usual quantum Bose gas. Conceptual differences arise when comparing our results with those previously reported regarding the q-generalized Bose–Einstein condensation. As the entropies analyzed here depend only on the probability, our results cannot be adjusted by any parameter. Even though these results are close to those of non-extensive statistical mechanics for q ∼ 1, they differ and cannot be matched for any q.
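For reference, the baseline against which such a shift is measured is the standard textbook critical temperature of Bose–Einstein condensation for an ideal gas of particle mass m and number density n:

```latex
k_B T_c = \frac{2\pi\hbar^2}{m}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
```

where ζ is the Riemann zeta function, ζ(3/2) ≈ 2.612. The generalized entropies modify this value, and, as the abstract stresses, the modification involves no adjustable parameter.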

Entropy doi: 10.3390/e20100772

Authors: Juan César Flores

In desiccated films, particularly in old paintings, molecular bonds may break to create intricate patterns of macroscopic cracks. The resulting crack directions enable a quantitative evaluation of the entropy and degree of disorder in the network. Experimental tests on prepared samples and a two-interacting-variables model allow the evolution of entropy to be tracked. Calculations were performed, primarily using data from the painting Girl with a Pearl Earring by Vermeer, revealing that the left side of the girl’s face features a crack structure with higher entropy (or less order) than the right side. Other old paintings were also considered. The extrapolation of the experiments to these old paintings confirms that saturation has still not been reached.

Entropy doi: 10.3390/e20100771

Authors: Rohit Saini, Nader Karimi, Lian Duan, Amsini Sadiki, Amirfarhang Mehdizadeh

The present study aims to assess the effects of two different underlying RANS models on overall behavior of the IDDES methodology when applied to different flow configurations ranging from fully attached (plane channel flow) to separated flows (periodic hill flow). This includes investigating prediction accuracy of first and second order statistics, response to grid refinement, grey area dynamics and triggering mechanism. Further, several criteria have been investigated to assess reliability and quality of the methodology when operating in scale resolving mode. It turns out that irrespective of the near wall modeling strategy, the IDDES methodology does not satisfy all criteria required to make this methodology reliable when applied to various flow configurations at different Reynolds numbers with different grid resolutions. Further, it is found that using more advanced underlying RANS model to improve prediction accuracy of the near wall dynamics results in extension of the grey area, which may delay the transition to scale resolving mode. This systematic study for attached and separated flows suggests that the shortcomings of IDDES methodology mostly lie in inaccurate prediction of the dynamics inside the grey area and demands further investigation in this direction to make this methodology capable of dealing with different flow situations reliably.

Entropy doi: 10.3390/e20100770

Authors: Todd A. Howe, Anthony G. Pollman, Anthony J. Gannon

This paper presents the results of an ideal theoretical energy and exergy analysis for a combined, building scale Liquid Air Energy Storage (LAES) and expansion turbine system. This work identifies the upper bounds of energy and exergy efficiency for the combined LAES-expansion system which has not been investigated. The system uses the simple Linde-Hampson and pre-cooled Linde-Hampson cycles for the liquefaction subsystem and direct expansion method, with and without heating above ambient temperature, for the energy production subsystem. In addition, the paper highlights the effectiveness of precooling air for liquefaction and heating air beyond ambient temperature for energy production. Finally, analysis of the system components is presented with an aim toward identifying components that have the greatest impact on energy and exergy efficiencies in an ideal environment. This work highlights the engineering trade-space and serves as a prescription for determining the merit or measures of effectiveness for an engineered LAES system in terms of energy and exergy. The analytical approach presented in this paper may be applied to other LAES configurations in order to identify optimal operating points in terms of energy and exergy efficiencies.

Entropy doi: 10.3390/e20100769

Authors: Donghua Chen, Runtong Zhang, Xiaomin Zhu

This study aimed to propose a mapping framework with entropy-based metrics for validating the effectiveness of the transition between International Classification of Diseases 10th revision (ICD-10)-coded datasets and a new context of ICD-11. Firstly, we used tabular lists and mapping tables of ICD-11 to establish the framework. Then, we leveraged Shannon entropy to propose validation methods to evaluate information changes during the transition from the perspectives of single-code, single-disease, and multiple-disease datasets. Novel metrics, namely, standardizing rate (SR), uncertainty rate (UR), and information gain (IG), were proposed for the validation. Finally, validation results from an ICD-10-coded dataset with 377,589 records indicated that the proposed metrics reduced the complexity of transition evaluation. The results with the SR in the transition indicated that approximately 60% of the ICD-10 codes in the dataset were unable to map the codes to standard ICD-10 codes released by WHO. The validation results with the UR provided 86.21% of the precise mapping. Validation results of the IG in the dataset, before and after the transition, indicated that approximately 57% of the records tended to increase uncertainty when mapped from ICD-10 to ICD-11. The new features of ICD-11 involved in the transition can promote a reliable and effective mapping between two coding systems.

Entropy doi: 10.3390/e20100768

Authors: Jian-Hong Lin, Claudio Tessone, Manuel Mariani

Nestedness refers to the structural property of complex networks that the neighborhood of a given node is a subset of the neighborhoods of better-connected nodes. Following the seminal work by Patterson and Atmar (1986), ecologists have been long interested in revealing the configuration of maximal nestedness of spatial and interaction matrices of ecological communities. In ecology, the BINMATNEST genetic algorithm can be considered as the state-of-the-art approach for this task. On the other hand, the fitness-complexity ranking algorithm has been recently introduced in the economic complexity literature with the original goal to rank countries and products in World Trade export networks. Here, by bringing together quantitative methods from ecology and economic complexity, we show that the fitness-complexity algorithm is highly effective in the nestedness maximization task. More specifically, it generates matrices that are more nested than the optimal ones by BINMATNEST for 61.27% of the analyzed mutualistic networks. Our findings on ecological and World Trade data suggest that beyond its applications in economic complexity, the fitness-complexity algorithm has the potential to become a standard tool in nestedness analysis.

Entropy doi: 10.3390/e20100767

Authors: Gabriel Barrios, Francisco Peña, Francisco Albarrán-Arriagada, Patricio Vargas, Juan Retamal

We consider a purely mechanical quantum cycle comprised of adiabatic and isoenergetic processes. In the latter, the system interacts with an energy bath keeping constant the expectation value of the Hamiltonian. In this work, we study the performance of the quantum cycle for a system described by the quantum Rabi model for the case of controlling the coupling strength parameter, the resonator frequency, and the two-level system frequency. For the cases of controlling either the coupling strength parameter or the resonator frequency, we find that it is possible to closely approach to maximal unit efficiency when the parameter is sufficiently increased in the first adiabatic stage. In addition, for the first two cases the maximal work extracted is obtained at parameter values corresponding to high efficiency, which constitutes an improvement over current proposals of this cycle.

Entropy doi: 10.3390/e20100766

Authors: Gianni Valerio Vinci, Roberto Benzi

In this paper we study the causal relation between a country’s Economic Fitness Fc and its Gross Domestic Product per capita (GDP). Using Takens’ theorem, as first suggested in (Sugihara, G. et al. 2012), we show that there exists reasonable evidence of a causal correlation between GDP and Fc for relatively rich countries. This is not the case for relatively poor countries, where Fc and GDP do not show any significant causal relation. We also present some preliminary results to understand whether GDP or Fc is the driving factor for economic growth.

Entropy doi: 10.3390/e20100765

Authors: Antonio Peruzzi, Fabiana Zollo, Walter Quattrociocchi, Antonio Scala

The claim of Cambridge Analytica, a political consulting firm, that it was possible to influence voting behavior by using data mined from the social platform Facebook created a sudden fear in its users of being manipulated; consequently, even the market price of the social platform was shocked. We propose a case study analyzing the effect of this data scandal not only on the Facebook stock price, but also on the whole stock market. To this end, we consider 15-minute prices and returns of the NASDAQ-100 components before and after the Cambridge Analytica case. We analyze correlations and Mutual Information among components, finding that assets become more correlated and their Mutual Information grows higher. We also observe that correlation and Mutual Information increase together and seem to follow a master curve. Hence, the market appears more fragile after the Cambridge Analytica event. In fact, as is well known in finance, an increase in the average value of correlations augments the systemic risk (i.e., the whole market can collapse) and decreases the possibility of allocating a safe investment portfolio.
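A simple histogram (binning) estimate of the Mutual Information between two return series, of the kind used for such pairwise analyses, can be sketched as follows; the equal-width binning scheme and bin count are arbitrary illustrative choices, not the estimator of the paper.

```python
import math
from collections import Counter

def mutual_information(x, y, bins=8):
    """Equal-width-binning estimate of I(X;Y) in bits
    between two equally long series x and y."""
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0  # guard against a constant series
        return [min(int((t - lo) / w), bins - 1) for t in v]
    bx, by = digitize(x), digitize(y)
    n = len(x)
    px, py, pxy = Counter(bx), Counter(by), Counter(zip(bx, by))
    return sum(c / n * math.log2((c / n) / (px[i] / n * py[j] / n))
               for (i, j), c in pxy.items())

# A series shares maximal information with itself (about log2(bins) bits)
# and none with a constant series.
x = [i / 100 for i in range(100)]
print(mutual_information(x, x))            # close to 3 bits
print(mutual_information(x, [0.5] * 100))  # 0.0
```

Unlike the linear correlation coefficient, this quantity also captures non-linear dependence, which is why the abstract tracks both measures side by side.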

Entropy doi: 10.3390/e20100764

Authors: John D. McCamley, William Denton, Andrew Arnold, Peter C. Raffalt, Jennifer M. Yentes

Sample entropy (SE) has relative consistency for biologically derived, discrete data with &gt;500 data points. For certain populations, collecting this quantity of data is not feasible, and continuous data have been used instead. The effect of using continuous versus discrete data on SE is unknown, as are the relative effects of sampling rate and the input parameters m (comparison vector length) and r (tolerance). Eleven subjects walked for 10 minutes, and continuous joint angles (480 Hz) were calculated for each lower-extremity joint. Data were downsampled (240, 120, 60 Hz) and discrete range-of-motion was calculated. SE was quantified for angles and range-of-motion at all sampling rates and multiple combinations of parameters. A differential relationship between joints was observed between range-of-motion and joint angles. Range-of-motion SE showed no difference, whereas joint angle SE significantly decreased from ankle to knee to hip. To confirm the findings from biological data, continuous signals with manipulations to frequency, amplitude, and both were generated and underwent a similar analysis. In general, changes to m, r, and sampling rate had a greater effect on continuous than on discrete data. Discrete data were robust to sampling rate and m. It is recommended that different data types not be compared and that discrete data be used for SE.
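For readers unfamiliar with the measure, a minimal quadratic-time sample entropy implementation following the Richman-Moorman definition can be sketched as below; the demonstration signals are invented, and the study's gait data are of course not reproduced.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that two
    subsequences matching for m points (Chebyshev distance <= r) also
    match for m + 1 points. Self-matches are excluded."""
    def matches(length):
        templates = [series[i:i + length] for i in range(len(series) - m)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    B, A = matches(m), matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float('inf')

# A strictly periodic signal is perfectly predictable: SampEn of 0.
periodic = [0.0, 1.0] * 50
# A chaotic logistic-map series is far less regular: SampEn > 0.
chaotic = [0.3]
for _ in range(199):
    chaotic.append(4.0 * chaotic[-1] * (1.0 - chaotic[-1]))
print(sample_entropy(periodic), sample_entropy(chaotic))
```

In practice r is usually expressed as a fraction of the series' standard deviation; the fixed r = 0.2 here matches the unit-amplitude toy signals only, and the choice of m and r is exactly the sensitivity the abstract investigates.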

]]>Entropy doi: 10.3390/e20100763

Authors: Ana Costa Roope Uola Otfried Gühne

The effect of quantum steering describes a possible action at a distance via local measurements. Whereas many attempts at characterizing steerability have been pursued, answering the question as to whether a given state is steerable or not remains a difficult task. Here, we investigate the applicability of a recently proposed method for building steering criteria from generalized entropic uncertainty relations. This method works for any entropy which satisfies the properties of (i) (pseudo-)additivity for independent distributions; (ii) a state-independent entropic uncertainty relation (EUR); and (iii) joint convexity of a corresponding relative entropy. Our study extends the former analysis to Tsallis and Rényi entropies on bipartite and tripartite systems. As examples, we investigate the steerability of the three-qubit GHZ and W states.

]]>Entropy doi: 10.3390/e20100762

Authors: Joanna Kwiecień

Swarm intelligence draws its inspiration from the collective behaviour of many individual agents interacting with both one another and their environment. This paper explores the possibility of applying a swarm-based algorithm, modelled after the behaviour of individuals operating within a group who move around in a manner intended to avoid mutual collisions, to create the most challenging maze on a board of given dimensions. When solving such a problem, two complexity measures are used. Firstly, the complexity of the path was taken as a quality criterion, depending on the number of bends and the length of the path between two set points, and subjected to maximisation. Secondly, we focus on the well-known concept of maze complexity, given as the total complexity of the path and all branches. Owing to the uniqueness of the problem, which consists in modifying the maze, a methodology was developed to make it possible for the individuals of the population to make various types of movements, e.g., approach the best individual within the range of visibility, or relocate randomly. The test results presented here indicate a potential for applying swarm-based methods to generate increasingly challenging two-dimensional mazes.

]]>Entropy doi: 10.3390/e20100761

Authors: Matthieu Cristelli Andrea Tacchella Masud Cader

Does the infrastructure stock catalyze the development of new capabilities and ultimately of new products, or vice versa? Here we want to quantify the interplay between these two dimensions from a temporal dynamics perspective and, in particular, to address whether the interaction occurs predominantly in a specific direction. We therefore need to measure the complexity of an economy (i.e., its capability stock) and the infrastructure stock of a country. For the former, we leverage a previously proposed metric, the Economic Fitness (Tacchella, A.; et al. Sci. Rep. 2012, 2, 723). For the latter, we propose a new, purely statistical indicator: the first principal component of the 47 infrastructure indicators published by the World Bank. The proposed indicator still belongs to the class of linear combinations of relevant indicators but, differently from standard economic indicators of the same type, such as the Connectivity Index, the HDI, etc., the weights of the linear combination are not subjectively chosen or re-calibrated on a regular basis; rather, they are those which capture the highest fraction of the information encoded in the initial dataset. The two metrics allow the study of the dynamics in the Economic Fitness-Infrastructure plane and reveal the existence of two regimes: one for low Fitness, where the infrastructure and the complexity of an economy are unrelated, and a second regime where the two dimensions are tightly related. To quantify the interplay of the two dimensions in this latter regime, we assume a parsimonious linear dynamic model, and the emerging picture is that: (i) the feedback occurs in both directions; (ii) on the short term (&lt;3 years) the predominant direction of interaction is from infrastructure to capability stock; (iii) for longer time scales (&gt;3 years) the interaction is reversed, and new capabilities lead to increasing infrastructure stock.
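The principal-component construction of the indicator can be sketched as below; standardizing the columns before the decomposition is an illustrative assumption about the preprocessing, not a detail stated in the abstract:

```python
import numpy as np

def infrastructure_indicator(X):
    """First principal component of a country-by-indicator matrix X:
    the linear combination of (standardized) indicators capturing the
    largest share of the variance in the dataset."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize columns
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    weights = Vt[0]                 # data-driven weights, not hand-chosen
    return weights, Z @ weights     # one composite score per country
```

Unlike a hand-weighted composite index, the weights here are determined entirely by the data, which is the point made in the abstract.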

]]>Entropy doi: 10.3390/e20100760

Authors: Johan Anderson Sara Moradi Tariq Rafiq

The numerical solutions to a non-linear Fractional Fokker–Planck (FFP) equation are studied by estimating the generalized diffusion coefficients. The aim is to model anomalous diffusion using an FFP description with fractional velocity derivatives and Langevin dynamics, where Lévy fluctuations are introduced to model the effect of non-local transport due to fractional diffusion in velocity space. Distribution functions are found numerically, for varying degrees of fractionality of the stable Lévy distribution, as solutions to the FFP equation. The statistical properties of the distribution functions are assessed by a generalized normalized expectation measure, entropy, and a modified transport coefficient. The transport coefficient significantly increases with decreasing fractionality, which is corroborated by analysis of experimental data.

]]>Entropy doi: 10.3390/e20100759

Authors: Cheng Ye Richard Wilson Luca Rossi Andrea Torsello Edwin Hancock

The problem of how to represent networks and, from this representation, derive succinct characterizations of network structure, in particular of how this structure evolves with time, is of central importance in complex network analysis. This paper tackles the problem by proposing a thermodynamic framework to represent the structure of time-varying complex networks. More importantly, such a framework provides a powerful tool for better understanding the network time evolution. Specifically, the method uses a recently-developed approximation of the network von Neumann entropy and interprets it as the thermodynamic entropy for networks. With an appropriately-defined internal energy in hand, the temperature between networks at consecutive time points can be readily derived, computed as the ratio of the change in entropy to the change in energy. One of the main advantages of the proposed method is that all these thermodynamic variables can be computed in terms of simple network statistics, such as network size and degree statistics. To demonstrate the usefulness of the thermodynamic framework, the paper uses real-world network data extracted from time-evolving complex systems in the financial and biological domains. The experimental results successfully illustrate that critical events, including abrupt changes and distinct periods in the evolution of complex networks, can be effectively characterized.
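The degree-statistics computation can be sketched as follows. The entropy expression below is the quadratic degree-based approximation of the network von Neumann entropy reported in the authors' earlier work (assumed here), and using the edge count as internal energy is purely an illustrative stand-in for the paper's actual definition:

```python
from collections import Counter

def vn_entropy(edges, n):
    """Degree-based approximation of the network von Neumann entropy
    of an undirected graph given as an edge list on n nodes:
    S ~ 1 - 1/n - (1/n^2) * sum over edges (u,v) of 1/(d_u * d_v)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    s = sum(1.0 / (deg[u] * deg[v]) for u, v in edges)
    return 1.0 - 1.0 / n - s / n ** 2

def network_temperature(edges_prev, edges_next, n):
    """Temperature between consecutive snapshots as the ratio of the
    entropy change to the energy change, with the edge count standing
    in for internal energy (an illustrative choice)."""
    dS = vn_entropy(edges_next, n) - vn_entropy(edges_prev, n)
    dU = len(edges_next) - len(edges_prev)
    return dS / dU
```

Both quantities reduce to sums over degrees and edges, which is the "simple network statistics" property emphasized in the abstract.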

]]>Entropy doi: 10.3390/e20100758

Authors: Yair Neuman Navot Israeli Dan Vilenchik Yochai Cohen

To optimize its performance, a competitive team, such as a soccer team, must maintain a delicate balance between organization and disorganization. On the one hand, the team should maintain organized patterns of behavior to maximize the cooperation between its members. On the other hand, the team&rsquo;s behavior should be disordered enough to mislead its opponent and to maintain enough degrees of freedom. In this paper, we have analyzed this dynamic in the context of soccer games and examined whether it is correlated with the team&rsquo;s performance. We measured the organization associated with the behavior of a soccer team through the Tsallis entropy of ball passes between the players. Analyzing data taken from the English Premier League (2015/2016), we show that the team&rsquo;s position at the end of the season is correlated with the team&rsquo;s entropy as measured with a super-additive entropy index. Moreover, the entropy score of a team significantly contributes to the prediction of the team&rsquo;s position at the end of the season beyond the prediction gained by the team&rsquo;s position at the end of the previous season.
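The entropy measure used on the pass distribution can be sketched as below; treating the raw pass counts between player pairs as the distribution is an illustrative reading of the abstract:

```python
import numpy as np

def tsallis_entropy(counts, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) of a
    distribution given by raw counts (e.g., passes between player
    pairs). q < 1 gives a super-additive index; q -> 1 recovers
    the Shannon entropy."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    if q == 1.0:
        return -np.sum(p * np.log(p))   # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

A team that spreads its passes evenly over many player pairs scores a higher entropy than one funneling play through a single pair, which is the organization/disorganization balance discussed above.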

]]>Entropy doi: 10.3390/e20100757

Authors: Panayiotis Varotsos Nicholas Sarlis Efthimios Skordas

The observed earthquake scaling laws indicate the existence of phenomena closely associated with the proximity of the system to a critical point. Taking this view that earthquakes are critical phenomena (dynamic phase transitions), here we investigate whether in this case the Lifshitz–Slyozov–Wagner (LSW) theory for phase transitions showing that the characteristic size of the minority phase droplets grows with time as t 1 / 3 is applicable. To achieve this goal, we analyzed the Japanese seismic data in a new time domain termed natural time and find that an LSW behavior is actually obeyed by a precursory change of seismicity and in particular by the fluctuations of the entropy change of seismicity under time reversal before the Tohoku earthquake of magnitude 9.0 that occurred on 11 March 2011 in Japan. Furthermore, the Tsallis entropic index q is found to exhibit a precursory increase.

]]>Entropy doi: 10.3390/e20100756

Authors: Iztok Fajfar Tadej Tuma

The problem of the creation of numerical constants has haunted the Genetic Programming (GP) community for a long time and is still considered one of the principal open research issues. Many problems tackled by GP include finding mathematical formulas, which often contain numerical constants. It is, however, a great challenge for GP to create highly accurate constants as their values are normally continuous, while GP is intrinsically suited for combinatorial optimization. The prevailing attempts to resolve this issue either employ separate real-valued local optimizers or special numeric mutations. While the former yield better accuracy than the latter, they add to implementation complexity and significantly increase computational cost. In this paper, we propose a special numeric crossover operator for use with Robust Gene Expression Programming (RGEP). RGEP is a type of genotype/phenotype evolutionary algorithm closely related to GP, but employing linear chromosomes. Using normalized least squares error as a fitness measure, we show that the proposed operator is significantly better in finding highly accurate solutions than the existing numeric mutation operators on several symbolic regression problems. Another two important advantages of the proposed operator are that it is extremely simple to implement, and it comes at no additional computational cost. The latter is true because the operator is integrated into an existing crossover operator and does not call for an additional cost function evaluation.

]]>Entropy doi: 10.3390/e20100755

Authors: Ryan John Cubero Matteo Marsili Yasser Roudi

In the Minimum Description Length (MDL) principle, learning from the data is equivalent to an optimal coding problem. We show that the codes that achieve optimal compression in MDL are critical in a very precise sense. First, when they are taken as generative models of samples, they generate samples with broad empirical distributions and with a high value of the relevance, defined as the entropy of the empirical frequencies. These results are derived for different statistical models (Dirichlet model, independent and pairwise dependent spin models, and restricted Boltzmann machines). Second, MDL codes sit precisely at a second order phase transition point where the symmetry between the sampled outcomes is spontaneously broken. The order parameter controlling the phase transition is the coding cost of the samples. The phase transition is a manifestation of the optimality of MDL codes, and it arises because codes that achieve a higher compression do not exist. These results suggest a clear interpretation of the widespread occurrence of statistical criticality as a characterization of samples which are maximally informative on the underlying generative process.
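The "relevance" of a sample, described above as the entropy of the empirical frequencies, admits a compact sketch; the formalization below (entropy of the frequency-of-frequencies distribution) is one common reading and is assumed rather than taken from the paper:

```python
from collections import Counter
import math

def relevance(sample):
    """Entropy of the empirical frequencies: with k_s the count of
    state s and m_k the number of states observed exactly k times,
    relevance = - sum over k of (k * m_k / M) * log(k * m_k / M),
    where M is the sample size."""
    M = len(sample)
    counts = Counter(sample)                  # k_s per observed state
    freq_of_freq = Counter(counts.values())   # m_k
    return -sum((k * m / M) * math.log(k * m / M)
                for k, m in freq_of_freq.items())
```

A sample in which every outcome is distinct carries zero relevance, while a sample with a broad spread of frequencies, as generated by the critical MDL codes discussed above, carries a high value.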

]]>Entropy doi: 10.3390/e20100754

Authors: Filippos Vallianatos Georgios Chatzopoulos

Observational indications support the hypothesis that many large earthquakes are preceded by accelerating-decelerating seismic release rates which are described by a power law time-to-failure relation. In the present work, a unified theoretical framework is discussed based on the ideas of non-extensive statistical physics along with fundamental principles of physics such as energy conservation in a faulted crustal volume undergoing stress loading. We define a generalized Benioff strain function Ω_ξ(t) = ∑_{i=1}^{n(t)} E_i^ξ, where E_i is the earthquake energy and 0 ≤ ξ ≤ 1, and a time-to-failure power law of Ω_ξ(t) is derived for a fault system that obeys a hierarchical distribution law extracted from Tsallis entropy. In the time-to-failure power law followed by Ω_ξ(t), the existence of a common exponent m_ξ, which is a function of the non-extensive entropic parameter q, is demonstrated. An analytic expression that connects m_ξ with the Tsallis entropic parameter q and the b value of the Gutenberg–Richter law is derived. In addition, the range of q and b values that could drive the system into an accelerating stage and to failure is discussed, along with precursory variations of m_ξ resulting from the precursory b-value anomaly. Finally, our calculations based on Tsallis entropy and energy conservation give a new view of the empirical laws derived in the literature, relate the average generalized Benioff strain rate during the accelerating period to the background rate, and connect the model parameters with the expected magnitude of the main shock.
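The generalized Benioff strain defined above is a simple cumulative sum over the event catalog. In the sketch below, the usual magnitude-energy scaling E ∝ 10^(1.5 M) is a standard assumption added for illustration:

```python
import numpy as np

def benioff_strain(magnitudes, xi):
    """Cumulative generalized Benioff strain after each event:
    Omega_xi(t) = sum over the n(t) events of E_i^xi.
    xi = 0 counts events, xi = 0.5 gives the classical Benioff
    strain, and xi = 1 sums the released energy."""
    energies = 10.0 ** (1.5 * np.asarray(magnitudes, dtype=float))
    return np.cumsum(energies ** xi)
```

Fitting a power law of the time to failure to this cumulative curve is then the step that yields the exponent m_ξ discussed in the abstract.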

]]>Entropy doi: 10.3390/e20100753

Authors: Kirstin Roster Luciana Harrington Masud Cader

We leverage a new complexity framework called Economic Fitness, which characterizes an economy&rsquo;s level of diversification and its capabilities to produce more complex products. It can be used to predict economic growth and competitiveness. This paper describes an application of Economic Fitness called the Country Opportunity Spotlight (COS) that assesses a country&rsquo;s current level of capabilities and demonstrates which industries have upgrade and diversification potential given those capabilities. It helps unlock the explanatory and predictive power of Economic Fitness for policymakers. COS results serve as a starting point for policymakers to shape and validate priorities, compare countries, assess the capabilities needed in specific industries and begin identifying constraints to growth. We showcase the use of this framework for Mexico and Brazil. These countries provide an interesting case study, as they have similar growth outlooks yet demonstrate different productive capabilities. Examining Mexico and Brazil side by side illustrates the value this analysis can have in deciphering structural change and decision making, and at the same time reinforces the need for a nuanced consideration of each country&rsquo;s unique context.

]]>Entropy doi: 10.3390/e20100752

Authors: Francesca Tria Vittorio Loreto Vito D. P. Servedio

Zipf&rsquo;s, Heaps&rsquo; and Taylor&rsquo;s laws are ubiquitous in many different systems where innovation processes are at play. Together, they represent a compelling set of stylized facts regarding the overall statistics, the innovation rate and the scaling of fluctuations for systems as diverse as written texts and cities, ecological systems and stock markets. Many modeling schemes have been proposed in the literature to explain those laws, but only recently has a modeling framework been introduced that accounts for the emergence of all of them without deducing one law from the others and without ad hoc assumptions. This modeling framework is based on the concept of the adjacent possible space and its key feature of being dynamically restructured as its boundaries get explored, i.e., conditional on the occurrence of novel events. Here, we illustrate this approach and show how this simple modeling framework, instantiated through a modified P&oacute;lya&rsquo;s urn model, is able to reproduce Zipf&rsquo;s, Heaps&rsquo; and Taylor&rsquo;s laws within a unique self-consistent scheme. In addition, the same modeling scheme embraces other less common evolutionary laws (Hoppe&rsquo;s model and Dirichlet processes) as particular cases.

]]>Entropy doi: 10.3390/e20100751

Authors: Shuo Shao Tie Liu Chao Tian Cong Shen

The problem of multilevel diversity coding with secure regeneration (MDC-SR) is considered, which includes the problems of multilevel diversity coding with regeneration (MDC-R) and secure regenerating code (SRC) as special cases. Two outer bounds are established, showing that separate coding can achieve the minimum-bandwidth-regeneration (MBR) point of the achievable normalized storage-capacity repair-bandwidth trade-off regions for the general MDC-SR problem. The core of the new converse results is an exchange lemma, which can be established using Han&rsquo;s subset inequality.

]]>Entropy doi: 10.3390/e20100750

Authors: Jerry Gibson

Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the quantity defined by Shannon in 1948, the entropy rate power, and show that the log ratio of entropy powers equals the difference in the differential entropy of the two processes. Furthermore, we use the log ratio of entropy powers to analyze the change in mutual information as the model order is increased for autoregressive processes. We examine when we can substitute the minimum mean squared prediction error for the entropy power in the log ratio of entropy powers, thus greatly simplifying the calculations to obtain the differential entropy and the change in mutual information and therefore increasing the utility of the approach. Applications to speech processing and coding are given and potential applications to seismic signal processing, EEG classification, and ECG classification are described.
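With the usual definitions, the identity discussed above can be checked directly: the entropy power is Q = exp(2h)/(2πe), so 0.5 ln(Q₁/Q₂) = h₁ − h₂, and for a Gaussian the entropy power equals the variance, which is what motivates substituting the minimum mean squared prediction error. A minimal numeric sketch:

```python
import math

def entropy_power(h):
    """Entropy power Q = exp(2h) / (2*pi*e) of a process with
    differential entropy h in nats."""
    return math.exp(2.0 * h) / (2.0 * math.pi * math.e)

def gaussian_h(var):
    """Differential entropy of a Gaussian: h = 0.5 * ln(2*pi*e*var)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

# Half the log ratio of entropy powers recovers the differential
# entropy difference: 0.5 * ln(Q1 / Q2) = h1 - h2.  For a Gaussian,
# Q equals the variance, so the MMSE prediction error can stand in
# for the entropy power.
```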

]]>Entropy doi: 10.3390/e20100749

Authors: Saurav Talukdar Shreyas Bhaban James Melbourne Murti Salapaka

This article analyzes the effect of imperfections in physically realizable memory. Motivated by the realization of a bit as a Brownian particle within a double well potential, we investigate the energetics of an erasure protocol under a Gaussian mixture model. We obtain sharp quantitative entropy bounds that not only give rigorous justification for heuristics utilized in prior works, but also provide a guide toward the minimal scale at which an erasure protocol can be performed. We also compare the results obtained with the mean escape times from double wells to ensure reliability of the memory. The article quantifies the effect of overlap of two Gaussians on the loss of interpretability of the state of a one-bit memory, the required heat dissipated in partially successful erasures, and the reliability of information stored in a memory bit.

]]>Entropy doi: 10.3390/e20100748

Authors: Shazia Saqib Syed Asad Raza Kazmi

Multimedia information requires large repositories of audio-video data. Retrieval and delivery of video content is a very time-consuming process and a great challenge for researchers. An efficient approach for faster browsing of large video collections and more efficient content indexing and access is video summarization. Compression of the data through extraction of keyframes is a solution to these challenges. A keyframe is a frame representative of the salient features of the video, and the output frames must represent the original video in temporal order. The proposed research presents a method of keyframe extraction using the mean of consecutive k frames of video data: a sliding window of size k/2 is employed to select the frame that matches the median entropy value of the window. This is called the Median of Entropy of Mean Frames (MME) method: mean-based keyframe selection using the median entropy of the sliding window. The method was tested on more than 500 videos of sign language gestures and showed satisfactory results.
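The pipeline described above can be sketched as follows; the histogram-based frame entropy and the exact windowing details are illustrative assumptions, since the abstract does not fully specify them:

```python
import numpy as np

def frame_entropy(frame, bins=32):
    """Shannon entropy (nats) of a frame's intensity histogram."""
    hist, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mme_keyframes(frames, k=8):
    """Sketch of Median-of-Entropy-of-Mean-Frames selection: average
    each run of k frames, then slide a window of k // 2 mean frames
    and keep the one whose entropy is closest to the window median."""
    means = [np.mean(frames[i:i + k], axis=0)
             for i in range(0, len(frames) - k + 1, k)]
    keyframes = []
    w = max(k // 2, 1)
    for i in range(0, len(means) - w + 1, w):
        window = means[i:i + w]
        ents = np.array([frame_entropy(f) for f in window])
        med = np.median(ents)
        keyframes.append(window[int(np.argmin(np.abs(ents - med)))])
    return keyframes
```

The selected frames preserve the temporal order of the original video, as the abstract requires.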

]]>Entropy doi: 10.3390/e20100747

Authors: Bo Wu Yangde Gao Songlin Feng Theerasak Chanwimalueang

To reduce maintenance costs and safeguard machinery operation, remaining useful life (RUL) prediction is very important for long-term health monitoring. In this paper, we introduce a novel hybrid method for RUL prediction in health management. Firstly, the sparse reconstruction algorithm of the optimized Lasso and the Least Square QR-factorization (Lasso-LSQR) is applied to compressed sensing (CS), which can realize the sparse optimization of long-term health monitoring data. After the sparse signal is reconstructed, minimum entropy de-convolution (MED) is used to identify the fault characteristics and to obtain significant fault information from the machinery operation. Health indicators based on Skip-over, sample entropy and approximate entropy are then computed to track the degradation of the machinery; the performance of the Skip-over indicator is superior to that of the other indicators. Finally, a Fractal Autoregressive Integrated Moving Average (FARIMA) model is employed to predict the Skip-over indicator using the R/S method. The analysis results show that the novel hybrid method yields good performance, achieving highly accurate RUL prediction and safeguarding machinery operation in long-term monitoring.

]]>Entropy doi: 10.3390/e20100746

Authors: Mirosław Kordos Krystian Łapa

The purpose of instance selection is to reduce the data size while preserving as much of the useful information stored in the data as possible, and detecting and removing the erroneous and redundant information. In this work, we analyze instance selection in regression tasks and apply the NSGA-II multi-objective evolutionary algorithm to direct the search for the optimal subset of the training dataset, with the k-NN algorithm evaluating the solutions during the selection process. A key advantage of the method is that it yields a pool of solutions situated on the Pareto front, each of which is the best for a certain RMSE-compression balance. We discuss the different parameters of the process and their influence on the results, and put special effort into reducing the computational complexity of our approach. The experimental evaluation proves that the proposed method achieves good performance in terms of minimization of prediction error and minimization of dataset size.

]]>Entropy doi: 10.3390/e20100745

Authors: Marco Enríquez Francisco Delgado Karol Życzkowski

We study the entanglement properties of generic three-qubit pure states. First, we obtain the distributions of both the coefficients and the only phase in the five-term decomposition of Ac&iacute;n et al. for an ensemble of random pure states generated by the Haar measure on U(8). Furthermore, we analyze the probability distributions of two sets of polynomial invariants. One of these sets allows us to classify three-qubit pure states into four classes. Entanglement in each class is characterized using the minimal R&eacute;nyi-Ingarden-Urbanik entropy. Moreover, the fidelity of a three-qubit random state with the closest state in each entanglement class is investigated. We also present a characterization of these classes in terms of the corresponding entanglement polytope. The entanglement classes related to stochastic local operations and classical communication (SLOCC) are analyzed as well from this geometric perspective. The numerical findings suggest some conjectures relating some of those invariants with entanglement properties, to be grounded in future analytical work.

]]>Entropy doi: 10.3390/e20100744

Authors: Fabio Anza

The unitary dynamics of isolated quantum systems does not allow a pure state to thermalize. Because of that, if an isolated quantum system equilibrates, it will do so to the predictions of the so-called &ldquo;diagonal ensemble&rdquo; &rho; DE . Building on the intuition provided by Jaynes&rsquo; maximum entropy principle, in this paper we present a novel technique to generate progressively better approximations to &rho; DE . As an example, we write down a hierarchical set of ensembles which can be used to describe the equilibrium physics of small isolated quantum systems, going beyond the &ldquo;thermal ansatz&rdquo; of Gibbs ensembles.

]]>Entropy doi: 10.3390/e20100743

Authors: Ruben Krantz Valerio Gemmetto Diego Garlaschelli

The concepts of economic fitness and complexity, based on iterative and interdependent definitions of the quality of exporting countries and exported products, have led to novel insights into the dynamics of production and trade. A key step in the calculation of these quantities is the preliminary identification of statistically relevant country-product pairs. In this paper, we propose a method that could improve the current practice of filtering based on the revealed comparative advantage, by employing the maximum-entropy principle to construct an unbiased link weight probability distribution that, unlike the traditional thresholding method, allows for the statistical assessment of empirical trade volumes. The result is an adjusted geometric distribution for trade links that refines the revealed comparative advantage approach. This allows us to define the statistical significance of each trade link weight, leading to statistically supported trade link filtering decisions. Using this statistically justified filtering method, we have obtained results that are similar in nature to those found without it, even though there are significant deviations in the details. In addition, the statistical information thus obtained on each trade link allows us to perform a spectral analysis of the export portfolio of individual economies.
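The baseline the paper improves upon, thresholding on the revealed comparative advantage, can be sketched as below; the toy export matrix is purely illustrative:

```python
import numpy as np

def rca(X):
    """Revealed comparative advantage for a country-by-product export
    volume matrix X: each country's product share divided by the
    world's product share."""
    X = np.asarray(X, dtype=float)
    country_share = X / X.sum(axis=1, keepdims=True)
    world_share = X.sum(axis=0) / X.sum()
    return country_share / world_share

# The traditional filter keeps links with RCA >= 1; the paper replaces
# this hard threshold with a maximum-entropy (geometric) null model
# that instead assigns each observed trade volume a significance level.
binary = rca([[10, 0, 5], [2, 8, 1]]) >= 1.0
```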

]]>Entropy doi: 10.3390/e20100742

Authors: Borja Camino-Pontes Ibai Diez Antonio Jimenez-Marin Javier Rasero Asier Erramuzpe Paolo Bonifazi Sebastiano Stramaglia Stephan Swinnen Jesus M. Cortes

Interaction Information (II) generalizes the univariate Shannon entropy to triplets of variables, allowing the detection of redundant (R) or synergetic (S) interactions in dynamical networks. Here, we calculated II from functional magnetic resonance imaging data and asked whether R or S vary across brain regions and along the lifespan. Preserved along the lifespan, we found a high overlap between the pattern of high R and the default mode network, whereas high values of S overlapped with different cognitive domains, such as spatial and temporal memory, emotion processing and motor skills. Moreover, we found a robust balance between R and S across different age intervals, indicating informational compensatory mechanisms in brain networks.
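For discrete variables, the interaction information reduces to an alternating sum of joint entropies, sketched below; the sign convention (positive for redundancy, negative for synergy) varies in the literature and is chosen here to match the R/S reading above:

```python
import numpy as np

def _H(p):
    """Shannon entropy (bits) of a flat probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def interaction_information(p_xyz):
    """II(X;Y;Z) = I(X;Y) - I(X;Y|Z) from a joint distribution over
    three discrete variables:
    II = H(X)+H(Y)+H(Z) - H(XY)-H(XZ)-H(YZ) + H(XYZ)."""
    p = np.asarray(p_xyz, dtype=float)
    def H(*sum_axes):
        return _H(p.sum(axis=sum_axes).ravel() if sum_axes else p.ravel())
    return (H(1, 2) + H(0, 2) + H(0, 1)   # marginal entropies
            - H(2) - H(1) - H(0)          # pairwise joint entropies
            + H())                        # full joint entropy
```

Two extreme cases anchor the interpretation: three copies of one bit give II = +1 (pure redundancy), while Z = X XOR Y with independent X, Y gives II = −1 (pure synergy).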

]]>Entropy doi: 10.3390/e20100741

Authors: Andreas Schlatter

We construct a type of thermal quantum clock and show that various interesting relations between energy, entropy and geometry in space&ndash;time follow directly from partially synchronizing such clocks, in the sense of making them march in step with photon clocks.

]]>Entropy doi: 10.3390/e20100740

Authors: Wolfgang Muschik

Meixner&rsquo;s historical remark in 1969, &ldquo;... it can be shown that the concept of entropy in the absence of equilibrium is in fact not only questionable but that it cannot even be defined....&rdquo;, is investigated from today&rsquo;s perspective. Several statements&mdash;such as the three laws of phenomenological thermodynamics, the embedding theorem and the adiabatical uniqueness&mdash;are used to get rid of non-equilibrium entropy as a primitive concept. In this framework, the Clausius inequality of open systems can be derived by use of the defining inequalities which establish the non-equilibrium quantities of contact temperature and non-equilibrium molar entropy, and which allow one to describe the interaction between the Schottky system and its controlling equilibrium environment.

]]>Entropy doi: 10.3390/e20100739

Authors: Alberto Beretta Claudia Battistin Clélia de Mulatier Iacopo Mastromatteo Matteo Marsili

Models can be simple for different reasons: because they yield a simple and computationally efficient interpretation of a generic dataset (e.g., in terms of pairwise dependencies)&mdash;as in statistical learning&mdash;or because they capture the laws of a specific phenomenon&mdash;as, e.g., in physics&mdash;leading to non-trivial falsifiable predictions. In information theory, the simplicity of a model is quantified by the stochastic complexity, which measures the number of bits needed to encode its parameters. In order to understand what simple models look like, we study the stochastic complexity of spin models with interactions of arbitrary order. We show that bijections within the space of possible interactions preserve the stochastic complexity, which allows us to partition the space of all models into equivalence classes. We thus find that the simplicity of a model is not determined by the order of the interactions, but rather by their mutual arrangements. Models in which statistical dependencies are localized on non-overlapping groups of few variables are simple, affording predictions on independencies that are easy to falsify. On the contrary, fully connected pairwise models, which are often used in statistical learning, appear to be highly complex, because of their extended set of interactions, and they are hard to falsify.

]]>Entropy doi: 10.3390/e20100738

Authors: Xinyu Yang Haijiang He Jun Xu Yikun Wei Hua Zhang

Entropy generation rates in two-dimensional Rayleigh&ndash;Taylor (RT) turbulence mixing are investigated by numerical calculation. We mainly focus on the time evolution of the global thermal and viscous entropy generation rates during RT turbulence mixing. Our results indicate that, as time evolves, intense viscous entropy generation S_u and intense thermal entropy generation S_θ occur in regions of large velocity gradients and at the interfaces between hot and cold fluids in the RT mixing process. Furthermore, the mixed changing gradient of the two quantities from the center of the region to both sides decreases as time evolves, while the volume-averaged viscous entropy generation rate ⟨S_u⟩_V and thermal entropy generation rate ⟨S_θ⟩_V constantly increase with time; the thermal contribution ⟨S_θ⟩_V always dominates the entropy generation in the RT mixing region. It is further found that the spatially averaged entropy generation rates follow a smooth power law ⟨S_u⟩_V ∼ t^{1/2} and a linear law ⟨S_θ⟩_V ∼ t, respectively, in the RT mixing process.

]]>Entropy doi: 10.3390/e20100737

Authors: Fernando C. Pérez-Cárdenas Lorenzo Resca Ian L. Pegg

We show that coarse graining produces significant and predictable effects on the entropy of states of equilibrium when the scale of coarse graining becomes comparable to that of density fluctuations. We demonstrate that a coarse-grained entropy typically evolves toward a state of effective equilibrium with a lower value than that of the state of maximum entropy theoretically possible. The finer the coarse graining, the greater the drop in effective entropy, and the more relevant the fluctuations around it. Fundamental considerations allow us to derive a remarkable power law that relates coarse graining to the effective entropy gap. Another power law is found that precisely relates the noise range of effective entropy fluctuations to coarse graining. We test both power laws with numerical simulations based on a well-studied two-dimensional lattice gas model. As expected, the effects of these power laws diminish as our description approaches a macroscopic level, eventually disappearing in the thermodynamic limit, where the maximum entropy principle is reasserted.
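One simple way to coarse-grain a lattice gas configuration and measure the resulting entropy is sketched below; the block-occupation construction is an illustrative choice, and the paper's precise definition may differ:

```python
import numpy as np

def coarse_grained_entropy(lattice, b):
    """Shannon entropy (nats) of the occupation-number distribution
    over b x b blocks of a square two-dimensional lattice gas
    configuration (0/1 occupancies)."""
    n = lattice.shape[0]
    # Sum occupancies within each b x b block
    blocks = lattice.reshape(n // b, b, n // b, b).sum(axis=(1, 3))
    counts = np.bincount(blocks.ravel().astype(int), minlength=b * b + 1)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```

Varying the block size b while tracking this entropy is the kind of experiment through which the power laws described above could be probed.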

]]>Entropy doi: 10.3390/e20100736

Authors: Alicia Rodriguez-Carrion Carlos Garcia-Rubio Celeste Campo

Correctly estimating the features characterizing human mobility from mobile phone traces is a key factor to improve the performance of mobile networks, as well as for mobility model design and urban planning. Most related works found their conclusions on location data based on the cells where each user sends or receives calls or messages, data known as Call Detail Records (CDRs). In this work, we test if such data sets provide enough detail on users&rsquo; movements so as to accurately estimate some of the most studied mobility features. We perform the analysis using two different data sets, comparing CDRs with respect to an alternative data collection approach. Furthermore, we propose three filtering techniques to reduce the biases detected in the fraction of visits per cell, entropy and entropy rate distributions, and predictability. The analysis highlights the need for contextualizing mobility results with respect to the data used, since the conclusions are biased by the mobile phone traces collection approach.

]]>Entropy doi: 10.3390/e20100735

Authors: Gianluca Teza Michele Caraglio Attilio L. Stella

The dynamics of imports plus exports of 226 product classes by the G7 countries between 1962 and 2000 is described in terms of stochastic differential equations. The model allows interesting comparisons among the different economies related to the compositions of the national baskets. Synthetic solutions can also be used to estimate hidden and unexploited growth potentials. These prerogatives are strictly connected with the fact that a network structure is at the basis of the model. Such a network expresses the mutual influences of different products through resource transfers, and is a key ingredient producing cooperative growth effects which can be quantified and distinguished from those generated by deterministic drifts and representing direct resource inputs. An analysis of this network, which differs substantially from those previously considered within the economic complexity approach, allows to estimate the centrality of different products in each national basket, highlighting the most essential commodities for each economy. Solutions of the model give the possibility of performing counterfactual analyses aimed at estimating how much the growth of each country could have profited from a general strengthening, or weakening, of the links in the same products network.

]]>Entropy doi: 10.3390/e20100734

Authors: Guanrong Chen Marius-F. Danca Xiaosong Yang Genaro J. Martinez Hai Yu

In recent years, as the natural and social sciences rapidly evolve, classical chaos theory and modern complex network studies are gradually interacting with each other, with great joint development [...]

]]>Entropy doi: 10.3390/e20100733

Authors: Yikun Wei Zhengdao Wang Yuehong Qian Wenjing Guo

A numerical investigation has been carried out to understand the mechanism of the rotation effect on bifurcation and dual solutions in natural convection within a horizontal annulus. A thermal immersed boundary-lattice Boltzmann method was used to resolve the annular flow domain covered by a Cartesian mesh. The Rayleigh number based on the gap width is fixed at 10^4. The rotation effect on the natural convection is analyzed by streamlines, isotherms, phase portraits and a bifurcation diagram. Our results manifest the existence of three convection patterns in a horizontal annulus with a rotating inner cylinder, which affect the heat transfer in different ways, and the linear speed (U_i^*) determines the proportion of each convection pattern. Comparison of the average Nusselt number versus the linear speed of the inner cylinder indicates the existence of three different mechanisms which drive the convection in a rotating system. The convection pattern caused by rotation reduces the heat transfer efficiency. Our results in phase portraits also reveal the differences among the convection patterns.

]]>Entropy doi: 10.3390/e20100732

Authors: Qiang Han Deren Yang

Under an infrastructure of three gradually deepening layers (System, Service and Software), the information entropy of a Trustworthy Workflow Management System (TWfMS) evolves from more precise to more undetermined as a series of exception events X occurs on certain components (ExCs) over the TWfMS life cycle, which passes through the original, as-is, to-be, and agile-consistent stages; it then becomes precise again when the system returns from the agile-consistent stage to the original state through self-autonomous improvement. With special emphasis on the system layer, and to assure the trustworthiness of WfMS, this paper first introduces the preliminary knowledge of the hierarchical information entropy model and related theories. After illustrating the fundamental principle, the transformation rule is deduced step by step, followed by a case study, which is conducive to generating discussions and conclusions in the different research areas of TWfMS. Overall, we argue that the trustworthiness maintenance of WfMS can be analyzed and computed by viewing the various states of TWfMS as transformations between WfMS and its trustworthiness-compensation components, whose information entropy fluctuates repeatedly and complies with the law of dissipative structure systems.

]]>Entropy doi: 10.3390/e20100731

Authors: Leonardo Neves Graciana Puentes

We present a review of photonic implementations of discrete-time quantum walks (DTQW) in the spatial and temporal domains, based on spatial- and time-multiplexing techniques, respectively. Additionally, we propose a detailed novel scheme for photonic DTQW, using transverse spatial modes of single photons and programmable spatial light modulators (SLM) to manipulate them. Unlike all previous mode-multiplexed implementations, this scheme enables simulation of an arbitrary step of the walker, only limited, in principle, by the SLM resolution. We discuss current applications of such photonic DTQW architectures in quantum simulation of topological effects and the use of non-local coin operations based on two-photon hybrid entanglement.
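For readers unfamiliar with the underlying model, the textbook one-dimensional DTQW with a Hadamard coin (the walk that the photonic, SLM-based schemes reviewed above implement physically) can be simulated in a few lines; this is a generic sketch, not the proposed optical architecture:

```python
import numpy as np

def dtqw(steps):
    """One-dimensional discrete-time quantum walk with a Hadamard coin:
    each step applies the coin to the internal (coin) state, then shifts
    the walker right or left conditioned on the coin outcome."""
    n = 2 * steps + 1
    psi = np.zeros((n, 2), dtype=complex)
    psi[n // 2, 0] = 1.0                      # walker starts at the center
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                       # coin operation
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]          # coin |0> moves right
        shifted[:-1, 1] = psi[1:, 1]          # coin |1> moves left
        psi = shifted
    return (np.abs(psi) ** 2).sum(axis=1)     # position distribution
```

The resulting distribution spreads ballistically rather than diffusively, which is the feature the multiplexed photonic implementations aim to reproduce for arbitrary step numbers.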

]]>Entropy doi: 10.3390/e20100730

Authors: Li Sun Qinghe Du

With the uninterrupted revolution of communications technologies and the great-leap-forward development of emerging applications, the ubiquitous deployment of the Internet of Things (IoT) is imperative to accommodate constantly growing user demands and market scales. Communication security is critically important for the operations of IoT. Among the communication security provisioning techniques, physical layer security (PLS), which can provide unbreakable, provable, and quantifiable secrecy from an information-theoretic point of view, has drawn considerable attention from both academia and industry. However, the unique features of IoT, such as low cost, wide-range coverage, massive connections, and diversified services, impose great challenges on PLS protocol design in IoT. In this article, we present a comprehensive review of PLS techniques for IoT applications. The basic principle of PLS is first briefly introduced, followed by a survey of existing PLS techniques. Afterwards, the characteristics of IoT are identified, based on which the challenges faced by PLS protocol design are summarized. Then, three newly-proposed PLS solutions are highlighted, which match the features of IoT well and are expected to be applied in the near future. Finally, we conclude the paper and point out some further research directions.

]]>Entropy doi: 10.3390/e20100729

Authors: Amonrat Prasitsupparote Norio Konno Junji Shikata

Many cryptographic systems require random numbers, and the use of weak random numbers leads to insecure systems. In the modern world, there are several techniques for generating random numbers, of which the most fundamental and important methods are the deterministic extractors proposed by von Neumann, Elias, and Peres. Elias&rsquo;s extractor achieves the optimal rate (i.e., the information-theoretic upper bound) h(p) as the block size tends to infinity, where h(&middot;) is the binary entropy function and p is the probability of each input bit being 1. Peres&rsquo;s extractor achieves the optimal rate h(p) if the length of the input and the number of iterations tend to infinity. Previous research on both extractors has made no reference to practical aspects, including running time and memory size, for finite input sequences. In this paper, based on some heuristics, we derive a lower bound on the maximum redundancy of Peres&rsquo;s extractor, and we show that Elias&rsquo;s extractor is better than Peres&rsquo;s in terms of maximum redundancy (or rate) if time and space complexity are disregarded. In addition, we perform a numerical, non-asymptotic analysis of both extractors on finite input sequences with arbitrary bias under the same environment. To do so, we implemented both extractors on a general PC in simple environments. Our empirical results show that Peres&rsquo;s extractor is much better than Elias&rsquo;s for given finite input sequences under a very similar running time. As a consequence, Peres&rsquo;s extractor would be more suitable for generating uniformly random sequences in practice, for applications such as cryptographic systems.
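The baseline of the constructions compared above, von Neumann's extractor, is simple enough to sketch (assuming i.i.d. input bits with bias p); Elias's and Peres's extractors refine it to approach the h(p) bound, which is also shown:

```python
import math

def binary_entropy(p):
    """h(p) = -p*log2(p) - (1-p)*log2(1-p), the information-theoretic
    upper bound on the extraction rate."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def von_neumann_extract(bits):
    """Map non-overlapping pairs: (0,1) -> 0, (1,0) -> 1, discard (0,0)
    and (1,1). Output bits are unbiased for any i.i.d. input bias p."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

Von Neumann's scheme extracts only p(1-p) output bits per input bit on average, well below h(p); the gap is what the iterated (Peres) and block-based (Elias) constructions close.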

]]>Entropy doi: 10.3390/e20100728

Authors: Xiangluo Wang Chunlei Yang Guo-Sen Xie Zhonghua Liu

Aiming to implement image segmentation precisely and efficiently, we exploit new ways to encode images and achieve the optimal thresholding on quantum state space. Firstly, the state vector and density matrix are adopted for the representation of pixel intensities and their probability distribution, respectively. Then, the method based on global quantum entropy maximization (GQEM) is proposed, which has an equivalent object function to Otsu&rsquo;s, but gives a more explicit physical interpretation of image thresholding in the language of quantum mechanics. To reduce the time consumption for searching for optimal thresholds, the method of quantum lossy-encoding-based entropy maximization (QLEEM) is presented, in which the eigenvalues of density matrices can give direct clues for thresholding, and then, the process of optimal searching can be avoided. Meanwhile, the QLEEM algorithm achieves two additional effects: (1) the upper bound of the thresholding level can be implicitly determined according to the eigenvalues; and (2) the proposed approaches ensure that the local information in images is retained as much as possible, and simultaneously, the inter-class separability is maximized in the segmented images. Both of them contribute to the structural characteristics of images, which the human visual system is highly adapted to extract. Experimental results show that the proposed methods are able to achieve a competitive quality of thresholding and the fastest computation speed compared with the state-of-the-art methods.
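Since the GQEM objective is stated above to be equivalent to Otsu's, the classical Otsu criterion is a useful reference point; a minimal sketch of that classical baseline (not the quantum formulation) is:

```python
import numpy as np

def otsu_threshold(image, levels=256):
    """Classical Otsu thresholding: choose the gray level t that
    maximizes the between-class variance of the two resulting classes."""
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                       # class-0 probability up to t
    mu = np.cumsum(p * np.arange(levels))      # partial first moment
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)           # empty classes score zero
    return int(np.argmax(sigma_b))
```

GQEM reaches an equivalent optimum by maximizing a quantum entropy over density matrices, while QLEEM avoids the search entirely by reading thresholds off the eigenvalues.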

]]>Entropy doi: 10.3390/e20100727

Authors: William Bruce Sherwin

This article discusses how entropy/information methods are well-suited to analyzing and forecasting the four processes of innovation, transmission, movement, and adaptation, which are the common basis to ecology and evolution. Macroecologists study assemblages of differing species, whereas micro-evolutionary biologists study variants of heritable information within species, such as DNA and epigenetic modifications. These two different modes of variation are both driven by the same four basic processes, but approaches to these processes sometimes differ considerably. For example, macroecology often documents patterns without modeling underlying processes, with some notable exceptions. On the other hand, evolutionary biologists have a long history of deriving and testing mathematical genetic forecasts, previously focusing on entropies such as heterozygosity. Macroecology calls this Gini–Simpson, and has borrowed the genetic predictions, but sometimes this measure has shortcomings. Therefore it is important to note that predictive equations have now been derived for molecular diversity based on Shannon entropy and mutual information. As a result, we can now forecast all major types of entropy/information, creating a general predictive approach for the four basic processes in ecology and evolution. Additionally, the use of these methods will allow seamless integration with other studies such as the physical environment, and may even extend to assisting with evolutionary algorithms.
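The two diversity measures contrasted above have standard closed forms over variant (allele or species) frequencies; a minimal sketch:

```python
import math

def shannon_entropy(freqs):
    """Shannon entropy H = -sum p*ln(p) over variant/species frequencies,
    the basis of the newer molecular-diversity forecasts."""
    return -sum(p * math.log(p) for p in freqs if p > 0)

def gini_simpson(freqs):
    """Gini-Simpson index 1 - sum p^2: the probability that two randomly
    drawn individuals differ; equals expected heterozygosity when the
    frequencies are allele frequencies."""
    return 1.0 - sum(p * p for p in freqs)
```

For four equally frequent variants, Shannon entropy is ln 4 while Gini-Simpson is 0.75; the two measures weight rare variants differently, which is one source of the shortcomings noted above.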

]]>Entropy doi: 10.3390/e20100726

Authors: Rita Iotti Fausto Rossi

Energy dissipation and decoherence in state-of-the-art quantum nanomaterials and related nanodevices are routinely described and simulated via local scattering models, namely relaxation-time and Boltzmann-like schemes. The incorporation of such local scattering approaches within the Wigner-function formalism may lead to anomalous results, such as suppression of intersubband relaxation, incorrect thermalization dynamics, and violation of probability-density positivity. The primary goal of this article is to investigate a recently proposed quantum-mechanical (nonlocal) generalization (Phys. Rev. B 2017, 96, 115420) of semiclassical (local) scattering models, extending such treatment to carrier–carrier interaction, and focusing in particular on the nonlocal character of Pauli-blocking contributions. In order to concretely show the intrinsic limitations of local scattering models, a few simulated experiments of energy dissipation and decoherence in a prototypical quantum-well semiconductor nanostructure are also presented.

]]>Entropy doi: 10.3390/e20100725

Authors: Fernando Hermosillo-Reynoso Deni Torres-Roman Jayro Santiago-Paz Julio Ramirez-Pacheco

Lane detection for traffic surveillance in intelligent transportation systems is a challenge for vision-based systems. In this paper, a novel pixel-entropy-based algorithm for the automatic detection of the number of lanes and their centers, as well as the formation of their division lines, is proposed. Using as input a video from a static camera, each pixel&rsquo;s behavior in the gray color space is modeled by a time series; then, for a time period &tau;, its histogram and subsequently its entropy are calculated. Three types of theoretical pixel-entropy behavior can be distinguished: (1) the pixel-entropy at the lane center shows a high value; (2) the pixel-entropy at the lane division line shows a low value; and (3) a pixel not belonging to the road has an entropy value close to zero. From the road video, several small rectangular areas are captured, each with only a few full rows of pixels. For each pixel of these areas, the entropy is calculated; then for each area or row an entropy curve is produced, which, when smoothed, has as many local maxima as lanes and one more local minimum than lane division lines. For testing purposes, several real traffic scenarios under different weather conditions, with other moving objects present, were used; these background objects, which are off the road, were filtered out. Compared to algorithms based on vehicle trajectories, ours shows the following advantages: (1) the lowest computational time for lane detection (only 32 s with a traffic flow of one vehicle/s per lane); and (2) better results under high traffic flow with congestion and vehicle occlusion. Instead of detecting road markings, it forms lane-dividing lines. Here, the entropies of Shannon and Tsallis were used, but the Tsallis entropy, for a suitable q selected from a finite set, achieved the best results.
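The per-pixel statistic can be sketched as follows; the bin count and the Tsallis q below are illustrative parameters, not the paper's tuned values:

```python
import math
from collections import Counter

def pixel_entropy(series, bins=16, q=None):
    """Histogram a pixel's gray-level time series over a period tau, then
    return Shannon entropy (q is None) or Tsallis entropy
    S_q = (1 - sum p^q) / (q - 1). A constant pixel (off-road) gives 0;
    a lane-center pixel, seeing both road and vehicles, gives a high value."""
    counts = Counter(min(v * bins // 256, bins - 1) for v in series)
    n = len(series)
    probs = [c / n for c in counts.values()]
    if q is None:
        return -sum(p * math.log(p) for p in probs)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)
```

Sweeping this statistic across a row of pixels yields the entropy curve whose smoothed maxima and minima locate lane centers and division lines.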

]]>Entropy doi: 10.3390/e20100724

Authors: Dmytro Velychko Benjamin Knopp Dominik Endres

We describe a sparse, variational posterior approximation to the Coupled Gaussian Process Dynamical Model (CGPDM), which is a latent space coupled dynamical model in discrete time. The purpose of the approximation is threefold: first, to reduce training time of the model; second, to enable modular re-use of learned dynamics; and, third, to store these learned dynamics compactly. Our target applications here are human movement primitive (MP) models, where an MP is a reusable spatiotemporal component, or &ldquo;module&rdquo; of a human full-body movement. Besides re-usability of learned MPs, compactness is crucial, to allow for the storage of a large library of movements. We first derive the variational approximation, illustrate it on toy data, test its predictions against a range of other MP models and finally compare movements produced by the model against human perceptual expectations. We show that the variational CGPDM outperforms several other MP models on movement trajectory prediction. Furthermore, human observers find its movements nearly indistinguishable from replays of natural movement recordings for a very compact parameterization of the approximation.

]]>Entropy doi: 10.3390/e20100723

Authors: Cenker Bicer

The geometric process (GP) is a simple and direct approach to modeling successive inter-arrival time data with a monotonic trend. In addition, it is an important alternative to the non-homogeneous Poisson process. In the present paper, the parameter estimation problem for the GP is considered when the distribution of the first occurrence time is Power Lindley with parameters &alpha; and &lambda;. To solve this problem, the maximum likelihood, modified moments, modified L-moments and modified least-squares estimators are obtained for the parameters a, &alpha; and &lambda;. The mean, bias and mean squared error (MSE) values associated with these estimators are evaluated for small, moderate and large sample sizes using Monte Carlo simulations. Furthermore, two illustrative examples using real data sets are presented.
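A geometric process with ratio a can be simulated directly from its definition, X_k = Y_k / a^(k-1) with {Y_k} i.i.d. renewal times; in this sketch exponential Y_k are substituted for the Power Lindley distribution of the paper, whose sampler is more involved:

```python
import random

def simulate_gp(a, n, first_mean=1.0, seed=0):
    """Simulate n inter-arrival times of a geometric process with ratio a:
    X_k = Y_k / a**(k-1), where Y_k are i.i.d. renewal times. Exponential
    Y_k are used here for simplicity (a stand-in for Power Lindley);
    a > 1 gives stochastically decreasing inter-arrival times."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / first_mean) / a ** (k - 1)
            for k in range(1, n + 1)]
```

With a > 1 the process models improving or deteriorating systems with a monotonic trend, which is the setting of the estimation problem above.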

]]>Entropy doi: 10.3390/e20100721

Authors: Kalliopi Chochlaki Georgios Michas Filippos Vallianatos

The Yellowstone Park volcanic field is one of the most active volcanic systems in the world, presenting intense seismic activity that is characterized by several earthquake swarms over the last decades. In the present work, we focus on the spatiotemporal properties of the recent earthquake swarms that occurred in December 2008&ndash;January 2009 and the 2010 Madison Plateau swarm, using the approach of Non-Extensive Statistical Physics (NESP). Our approach is based on Tsallis entropy and is used to describe the behavior of complex systems where fracturing and strong correlations exist, such as in tectonic and volcanic environments. This framework is based on the maximization of the non-additive Tsallis entropy Sq, introducing the q-exponential function and the entropic parameter q that expresses the degree of non-extensivity of the system. The estimated q-parameters can be used as a measure of the degree of correlation among the events in the spatiotemporal evolution of seismicity. Using the seismic data provided by the University of Utah Seismograph Stations (UUSS), we analyzed the inter-event time (T) and distance (r) distributions of successive earthquakes that occurred during the two swarms, fitting the observed data with the q-exponential function and thus estimating the Tsallis entropic parameters qT and qr for the inter-event time and distance distributions, respectively. Furthermore, we studied the magnitude-frequency distribution of the released earthquake energies E as formulated in the frame of NESP, which results in the estimation of the qE parameter. Our analysis provides the triplet (qE, qT, qr) that describes the magnitude-frequency distribution and the spatiotemporal scaling properties of each of the studied earthquake swarms. In addition, the spatial variability of qE throughout the Yellowstone Park volcanic area is presented and correlated with the existence of the regional hydrothermal features.
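The q-exponential used for the fits has a simple closed form (a sketch of the standard definition; q -> 1 recovers the ordinary exponential):

```python
import math

def q_exponential(x, q, beta=1.0):
    """Tsallis q-exponential exp_q(-beta*x) = [1 - (1-q)*beta*x]^(1/(1-q))
    when the base is positive, 0 otherwise; reduces to exp(-beta*x) as
    q -> 1. For q > 1 it decays as a power law, which is what makes it
    suitable for fitting inter-event time and distance distributions."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(-beta * x)
    base = 1.0 - (1.0 - q) * beta * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```

Fitting this curve to the empirical distributions yields the entropic parameters qT and qr reported above.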

]]>Entropy doi: 10.3390/e20100722

Authors: Rabha W. Ibrahim Maslina Darus

In this paper, we study Tsallis&rsquo; fractional entropy (TFE) in a complex domain by applying the definition of the complex probability functions. We study the upper and lower bounds of TFE based on some special functions. Moreover, applications in complex neural networks (CNNs) are illustrated to recognize the accuracy of CNNs.

]]>Entropy doi: 10.3390/e20100720

Authors: Adel Ouannas Xiong Wang Amina-Aicha Khennaoui Samir Bendoukha Viet-Thanh Pham Fawaz E. Alsaadi

In this paper, we investigate the dynamics of a fractional order chaotic map corresponding to a recently developed standard map that exhibits a chaotic behavior with no fixed point. This is the first study to explore a fractional chaotic map without a fixed point. In our investigation, we use phase plots and bifurcation diagrams to examine the dynamics of the fractional map and assess the effect of varying the fractional order. We also use the approximate entropy measure to quantify the level of chaos in the fractional map. In addition, we propose a one-dimensional stabilization controller and establish its asymptotic convergence by means of the linearization method.
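The approximate entropy measure used above compares the regularity of templates of length m and m + 1; a standard sketch follows (m and the absolute tolerance r are conventional illustrative defaults, not values from the paper):

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """Approximate entropy ApEn(m, r): larger values indicate less
    regular, more chaotic dynamics. r is an absolute tolerance here;
    it is often scaled by the series' standard deviation in practice."""
    n = len(series)
    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(matches / len(templates))
        return total / len(templates)
    return phi(m) - phi(m + 1)
```

A strictly periodic orbit scores near zero, while a chaotic orbit of the logistic map scores substantially higher, which is how the level of chaos in the fractional map can be quantified against its order.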

]]>Entropy doi: 10.3390/e20090719

Authors: Jesús Gutiérrez-Gutiérrez Marta Zárraga-Rodríguez Pedro M. Crespo Xabier Insausti

In this paper, we obtain an integral formula for the rate distortion function (RDF) of any Gaussian asymptotically wide sense stationary (AWSS) vector process. Applying this result, we also obtain an integral formula for the RDF of Gaussian moving average (MA) vector processes and of Gaussian autoregressive MA (ARMA) AWSS vector processes.

]]>Entropy doi: 10.3390/e20090718

Authors: Hao Liao Xiao-Min Huang Alexandre Vidmer Yi-Cheng Zhang Ming-Yang Zhou

The Belt and Road initiative (BRI) was announced in 2013 by the Chinese government. Its goal is to promote cooperation between European and Asian countries, as well as to enhance trust between members and unify the market. Since its creation, more and more developing countries have joined the initiative. Based on the geographical location characteristics of the countries in this initiative, we propose an improvement of a popular recommendation algorithm that includes geographic location information. This recommendation algorithm is able to make suitable recommendations of products for countries in the BRI. Then, Fitness and Complexity metrics are used to evaluate the impact of the recommendation results and measure each country&rsquo;s competitiveness. The aim of this work is to provide countries with insights on the ideal development direction. By following the recommendations, countries can quickly increase their international competitiveness.

]]>Entropy doi: 10.3390/e20090717

Authors: Maël Dugast Guillaume Bouleux Eric Marcon

In this work, we introduce a new vision of stochastic processes through the geometry induced by dilation. The dilation matrices of a given process are obtained by composing rotation matrices built with respect to partial correlation coefficients. Particularly interesting is the fact that the dilation matrices can be obtained regardless of the stationarity of the underlying process. When the process is stationary, a single dilation matrix is obtained, corresponding to the Naimark dilation. When the process is nonstationary, a set of dilation matrices is obtained, corresponding to the Kolmogorov decomposition. In this work, the nonstationary class of periodically correlated processes is of interest. The underlying periodicity of the correlation coefficients is then transmitted to the set of dilation matrices. Because this set lives on the Lie group of rotation matrices, we can see the matrices as points of a closed curve on the Lie group. Geometrical aspects can then be investigated through the shape of the obtained curves; to give a complete insight into the space of curves, a metric and the derived geodesic equations are provided. The general results are adapted to the more specific case where the base manifold is the Lie group of rotation matrices, and because the metric in the space of curves naturally extends to the space of shapes, this enables a comparison between curves&rsquo; shapes and then allows the classification of random processes&rsquo; measures.

]]>Entropy doi: 10.3390/e20090716

Authors: Shuqin Zhu Congxu Zhu Wenhong Wang

In order to overcome the difficulty of key management in &ldquo;one time pad&rdquo; encryption schemes and to resist chosen-plaintext attacks, a new image encryption algorithm based on chaos and SHA-256 is proposed in this paper. The architecture of confusion and diffusion is adopted. Firstly, the plaintext image is surrounded by a sequence generated from the SHA-256 hash value of the plaintext, ensuring that each encryption result is different. Secondly, the image is scrambled according to a random sequence obtained by adding a plaintext-dependent disturbance term to the chaotic sequence. Thirdly, a cyphertext (plaintext) feedback mechanism with a dynamic index is adopted in the diffusion stage; that is, the location index of the cyphertext (plaintext) used for feedback is dynamic. These measures ensure that the algorithm can resist chosen-plaintext attacks and overcome the difficulty of key management in &ldquo;one time pad&rdquo; encryption schemes. Experimental results, including key space analysis, key sensitivity analysis, differential analysis, histograms, information entropy, and correlation coefficients, show that the image encryption algorithm is safe and reliable, with high application potential.
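The plaintext-dependence idea (a hash of the image seeding a chaotic keystream, so that each plaintext encrypts differently) can be sketched as below. This is an illustrative toy, not the paper's full confusion-diffusion scheme: the seed derivation, the logistic parameter 3.99, and the simple XOR diffusion are all assumptions made for the sketch.

```python
import hashlib

def keystream(plaintext_bytes, length):
    """Derive a logistic-map keystream whose initial condition depends on
    the SHA-256 hash of the plaintext (toy plaintext-dependence sketch)."""
    digest = hashlib.sha256(plaintext_bytes).digest()
    # Map 8 hash bytes into a valid logistic-map seed in (0, 1).
    x = (int.from_bytes(digest[:8], "big") % 10**8) / 10**8 * 0.99 + 0.005
    stream = []
    for _ in range(length):
        x = 3.99 * x * (1.0 - x)          # chaotic logistic map iteration
        stream.append(int(x * 256) % 256)
    return bytes(stream)

def xor_cipher(data, ks):
    """Toy stand-in for the diffusion stage: XOR with the keystream."""
    return bytes(a ^ b for a, b in zip(data, ks))
```

Note that decryption requires reproducing the keystream, so a scheme of this kind must transmit the plaintext hash (or equivalent seed material) alongside the cyphertext.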

]]>Entropy doi: 10.3390/e20090715

Authors: Ming Zhang Jinghong Zhou Runjuan Zhou

The sustainability of regional water resources provides important supporting data for establishing policies on the sustainable development of the social economy. The purpose of this paper is to propose an assessment method that accurately reflects the sustainability of regional water resources in various areas. The method is based on the relative entropy of information entropy theory and proceeds as follows. Firstly, the evaluation sample data are pretreated; then the relative entropy of each standard evaluation sample and evaluation grade (SEG) is calculated to obtain the entropy weight of each evaluation index. After this, the entropy-weighted comprehensive index (WCI) of each standard evaluation grade sample is obtained. The functional relation between WCI and SEG can be fitted by a cubic polynomial to construct the evaluation function. Using the above steps, a generalized entropy method (GEM) for the sustainability assessment of regional water resources is established and used to evaluate the sustainability of water resources in the Pingba and Huai River areas in China. The results show that the proposed GEM model can accurately reflect the sustainability of water resources in the two regions. Compared with other evaluation models, such as the Shepherd method, Artificial Neural Network and Fuzzy comprehensive evaluation, the GEM model produces more differentiated, and more reasonable, evaluation results. Thus, the proposed GEM model can provide scientific data support for coordinating the relationship between the sustainable development and utilization of regional water resources, in order to improve the development of regional population, society and economy.
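The entropy-weighting step can be sketched with the generic entropy weight method (shown here for comparison; the paper derives its weights via relative entropy to standard grades, and normalization details may differ):

```python
import math

def entropy_weights(samples):
    """Entropy weight method over a samples x indices matrix of positive
    values: an index whose values vary more across samples has lower
    entropy and therefore receives a higher weight."""
    n = len(samples)                       # number of samples
    m = len(samples[0])                    # number of evaluation indices
    weights = []
    for j in range(m):
        col = [row[j] for row in samples]
        s = sum(col)
        probs = [v / s for v in col]
        e = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n)
        weights.append(1.0 - e)            # divergence degree of index j
    total = sum(weights)
    return [w / total for w in weights]
```

An index that is identical across all samples carries no discriminating information and gets weight zero, which is the behavior the weighted comprehensive index (WCI) relies on.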

]]>Entropy doi: 10.3390/e20090714

Authors: Emanuel Guariglia

The aim of this paper is to investigate the generalization of the Sierpinski gasket through the harmonic metric. In particular, this work presents an antenna based on such a generalization. In fact, the harmonic Sierpinski gasket is used as a geometric configuration of small antennas. As with fractal antennas and R&eacute;nyi entropy, their performance is characterized by the associated entropy that is studied and discussed here.

]]>Entropy doi: 10.3390/e20090713

Authors: Dor Cohen Ofer Strichman

We present a new characterization of propositional formulas called entropy, which approximates the freedom we have in assigning the variables. Like several other such measures (e.g., back-door and back-door-key variables), it is computationally expensive to compute. Nevertheless, for small and medium-size satisfiable formulas, it enables us to study the effect of this freedom on the impact of various SAT heuristics, following up on a recent study by C. Oh (Oh, SAT&rsquo;15, LNCS 9340, 307&ndash;323). Oh&rsquo;s findings were that the expected success of various heuristics depends on whether the input formula is satisfiable or not. With entropy, and also with the measure of solution density, we are able to refine these findings for the case of satisfiable formulas. Specifically, we found empirically that satisfiable formulas with small entropy &ldquo;behave&rdquo; similarly to unsatisfiable formulas.

]]>Entropy doi: 10.3390/e20090712

Authors: Edward Bormashenko

The notion of three-phase (line) tension remains one of the most disputable notions in surface science. A very broad range of its values has been reported, and experts do not even agree on the sign of line tension. The polymer-chain-like model of the three-phase (triple) line enables a rough estimation of the entropic input into the value of line tension, estimated as &Gamma;_en &cong; k_B T / d_m &cong; 10^{&minus;11} N, where d_m is the diameter of the liquid molecule. The introduction of the polymer-chain-like model of the triple line is justified by the &ldquo;water string&rdquo; model of the liquid state, predicting strong orientation effects for liquid molecules located near hydrophobic moieties. The estimated value of the entropic input into the line tension is close to experimental findings reported by various groups, and seems to be relevant for the understanding of the elastic properties of biological membranes.

]]>Entropy doi: 10.3390/e20090711

Authors: Yuze Su Xiangru Meng Qiaoyan Kang Xiaoyang Han

Network virtualization can offer more flexibility and better manageability for the next generation Internet. With the increasing deployment of virtual networks in military and commercial networks, a major challenge is to ensure virtual network survivability against hybrid multiple failures. In this paper, we study the problem of recovering virtual networks affected by hybrid multiple failures in substrate networks and provide an integer linear programming formulation to solve it. We propose a heuristic algorithm to tackle the complexity of the integer linear programming formulation, which includes a faulty virtual network reconfiguration ranking method based on weighted relative entropy, a hybrid multiple failures ranking algorithm, and a virtual node migration method based on weighted relative entropy. In the two weighted-relative-entropy-based methods, multiple ranking indicators are combined in a suitable way via weighted relative entropy. In the hybrid multiple failures ranking algorithm, the virtual node and its connected virtual links are re-embedded first. Evaluation results show that our heuristic method not only has the best acceptance ratio and normal operation ratio, but also achieves the highest long-term average revenue-to-cost ratio compared with other virtual network reconfiguration methods.

]]>Entropy doi: 10.3390/e20090710

Authors: Samir Bendoukha Adel Ouannas Xiong Wang Amina-Aicha Khennaoui Viet-Thanh Pham Giuseppe Grassi Van Van Huynh

This paper is concerned with the co-existence of different synchronization types for fractional-order discrete-time chaotic systems with different dimensions. In particular, we show that through appropriate nonlinear control, projective synchronization (PS), full state hybrid projective synchronization (FSHPS), and generalized synchronization (GS) can be achieved simultaneously. A second nonlinear control scheme is developed whereby inverse full state hybrid projective synchronization (IFSHPS) and inverse generalized synchronization (IGS) are shown to co-exist. Numerical examples are presented to confirm the findings.

]]>Entropy doi: 10.3390/e20090709

Authors: Anton M. Unakafov Karsten Keller

This paper is devoted to change-point detection using only the ordinal structure of a time series. A statistic based on the conditional entropy of ordinal patterns, characterizing the local up-and-down behavior of a time series, is introduced and investigated. The statistic requires only minimal a priori information on the given data and shows good performance in numerical experiments. By the nature of ordinal patterns, the proposed method does not detect pure level changes but changes in the intrinsic pattern structure of a time series, and so it could be interesting in combination with other methods.
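
A minimal sketch of the statistic the paper builds on (my illustration, not the authors&rsquo; code): the empirical conditional entropy of successive ordinal patterns of a fixed order, which is low where the up-and-down structure is predictable and rises when the pattern structure changes:

```python
import math

def ordinal_pattern(window):
    # Rank-based encoding: the permutation of indices that sorts the window.
    return tuple(sorted(range(len(window)), key=window.__getitem__))

def conditional_entropy_of_patterns(x, order=3):
    """Empirical conditional entropy H(next pattern | current pattern),
    in bits, of successive ordinal patterns of the given order."""
    pats = [ordinal_pattern(x[i:i + order]) for i in range(len(x) - order + 1)]
    pair_counts, first_counts = {}, {}
    for a, b in zip(pats, pats[1:]):
        pair_counts[(a, b)] = pair_counts.get((a, b), 0) + 1
        first_counts[a] = first_counts.get(a, 0) + 1
    n = len(pats) - 1  # number of consecutive pattern pairs
    h = 0.0
    for (a, b), c in pair_counts.items():
        p_ab = c / n                      # joint probability of the pair
        p_b_given_a = c / first_counts[a]  # transition probability
        h -= p_ab * math.log2(p_b_given_a)
    return h
```

Computed over sliding windows, a pronounced change in this statistic between adjacent windows is then a candidate change point.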

]]>Entropy doi: 10.3390/e20090708

Authors: Jonatan Zischg Wolfgang Rauch Robert Sitzenfrei

Cities and their infrastructure networks are always in motion, permanently changing in structure and function. This paper presents a methodology for automatically creating future water distribution networks (WDNs) that are stressed step-by-step by the disconnection and connection of WDN parts. The associated effects of demand shifting and flow rearrangement are simulated and assessed with hydraulic performances. With the methodology, it is possible to test various planning and adaptation options for the future WDN, where the unknown (future) network is approximated via the co-located and known (future) road network, and hence different topological characteristics (branched vs. strongly looped layout) can be investigated. The reliability of the planning options is evaluated with the flow entropy, a measure based on Shannon&rsquo;s informational entropy. Uncertainties regarding future water consumption and water loss management are included in a scenario analysis. To avoid insufficient water supply to customers during the transition process from an initial to a final WDN state, an adaptation concept is proposed in which critical WDN components are replaced over time. Finally, the method is applied to the drastic urban transition of Kiruna, Sweden. Results show that, without adaptation measures, severe performance drops will occur after the 2023 WDN state, mainly caused by the disconnection of WDN parts. However, with low adaptation efforts, considering 2&ndash;3% pipe replacement, sufficient pressure performance is achieved. Furthermore, the best planning options are determined by using an entropy-cost comparison.
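
For readers unfamiliar with flow entropy, the following toy sketch (an assumption-laden simplification showing only the nodal term of a Tanyimboh&ndash;Templeman-style flow entropy) illustrates the Shannon-entropy idea: a node supplied evenly by several pipes scores higher, i.e., is more robust against single-path failures, than one dominated by a single pipe:

```python
import math

def flow_entropy(inflows):
    """Shannon entropy (in nats) of the fractions of total flow carried
    by each pipe supplying a node; higher values indicate flow spread
    more evenly over alternative supply paths."""
    total = sum(inflows)
    fractions = [q / total for q in inflows if q > 0]
    return -sum(p * math.log(p) for p in fractions)
```

Here `flow_entropy([5, 5])` equals ln 2, while `flow_entropy([9, 1])` is lower and `flow_entropy([10])` is zero, reflecting decreasing redundancy.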

]]>Entropy doi: 10.3390/e20090707

Authors: Matthew E. Quenneville David A. Sivak

A stochastic system under the influence of a stochastic environment is correlated with both present and future states of the environment. Such a system can be seen as implicitly implementing a predictive model of future environmental states. The non-predictive model complexity has been shown to lower-bound the thermodynamic dissipation. Here we explore these statistical and physical quantities at steady state in simple models. We show that under quasi-static driving this model complexity saturates the dissipation. Beyond the quasi-static limit, we demonstrate a lower bound on the ratio of this model complexity to total dissipation, which is realized in the limit of weak driving.

]]>Entropy doi: 10.3390/e20090706

Authors: Khalid Sayood

We examine how information theory has been used to study cognition over the last seven decades. After an initial burst of activity in the 1950s, the backlash that followed stopped most work in this area. The last couple of decades has seen both a revival of interest and a more firmly grounded, experimentally justified use of information theory. We can view cognition as the process of transforming perceptions into information&mdash;where we use information in the colloquial sense of the word. This last clarification is one of the problems we run into when trying to use information theoretic principles to understand or analyze cognition. Information theory is mathematical, while cognition is a subjective phenomenon. It is relatively easy to discern a subjective connection between cognition and information; it is a different matter altogether to apply the rigor of information theory to the process of cognition. In this paper, we look at the many ways in which people have tried to alleviate this problem. These approaches range from narrowing the focus to only the quantifiable aspects of cognition, to borrowing conceptual machinery from information theory to address issues of cognition. We describe applications of information theory across a range of cognition research, from neural coding to cognitive control and predictive coding.

]]>Entropy doi: 10.3390/e20090705

Authors: Juan López-Sauceda Jorge López-Ortega Gerardo Abel Laguna Sánchez Jacobo Sandoval Gutiérrez Ana Paola Rojas Meza José Luis Aragón

A basic pattern in the body plan architecture of many animals, plants and some molecular and cellular systems is five-part units. This pattern has been understood as a result of genetic blueprints in development and as a widely conserved evolutionary character. Despite some efforts, a definitive explanation of the abundance of pentagonal symmetry at so many levels of complexity is still missing. Based on both a computational platform and a statistical spatial organization argument, we show that five-fold morphology is substantially different from other abundant symmetries, like three-fold, four-fold and six-fold symmetries, in terms of interacting spatial elements. We develop a measuring system to determine levels of spatial organization in 2D polygons (homogeneous or heterogeneous partition of defined areas) based on principles of regularity in a morphospace. We found that the spatial organization of five-fold symmetry is statistically higher than that of all other symmetries studied here (3- to 10-fold symmetries) in terms of spatial homogeneity. The significance of our findings is based on the statistical constancy of geometrical constraints derived from the spatial organization of shapes, beyond the material or complexity level of the many different systems where pentagonal symmetry occurs.

]]>Entropy doi: 10.3390/e20090704

Authors: Jürn Schmelzer Timur Tropin

A response is given to a comment of Zanotto and Mauro on our paper published in Entropy 20, 103 (2018). The arguments presented in our paper are largely ignored by them, and no new considerations are outlined in the comment that would require a revision of our conclusions. For this reason, we restrict ourselves here to a brief response, supplementing it with some additional arguments in favor of our point of view not included in the above-cited paper.

]]>Entropy doi: 10.3390/e20090703

Authors: Edgar D. Zanotto John C. Mauro

In a recent article, Schmelzer and Tropin [Entropy 2018, 20, 103] presented a critique of several aspects of modern glass science, including various features of glass transition and relaxation, crystallization, and the definition of glass itself. We argue that these criticisms are at odds with well-accepted knowledge in the field from both theory and experiments. The objective of this short comment is to clarify several of these issues.

]]>Entropy doi: 10.3390/e20090702

Authors: Binghan Liu Zhongguang Fu Pengkai Wang Lu Liu Manda Gao Ji Liu

The energy use analysis of coal-fired power plant units is of significance for energy conservation and consumption reduction. One of the most serious problems attributed to Chinese coal-fired power plants is coal waste. Several units in one plant may experience a practical rated output situation at the same time, which may increase the coal consumption of the power plant. Here, we propose a new hybrid methodology for plant-level load optimization to minimize coal consumption for coal-fired power plants. The proposed methodology includes two parts. One part determines the reference values of the controllable operating parameters for net coal consumption under typical load conditions, based on an improved K-means algorithm and the Hadoop platform. The other part utilizes a support vector machine to determine the sensitivity coefficients of the various operating parameters for net coal consumption under different load conditions. Additionally, the fuzzy rough set attribute reduction method was employed to obtain a minimal set of attributes and thus reduce the complexity of the dataset. This work is based on continuously measured information system data from a 600 MW coal-fired power plant in China. The results show that the proposed strategy achieves high energy conservation performance. Taking the 600 MW load optimization value as an example, the optimized power supply coal consumption is 307.95 g/(kW&middot;h), compared to the actual operating value of 313.45 g/(kW&middot;h). It is important for coal-fired power plants to reduce their coal consumption.

]]>Entropy doi: 10.3390/e20090701

Authors: Beige Ye Taorong Qiu Xiaoming Bai Ping Liu

In view of the nonlinear characteristics of electroencephalography (EEG) signals collected in driving fatigue state recognition research, and given that the recognition accuracy of EEG-based driving fatigue recognition methods is still unsatisfactory, this paper proposes a driving fatigue recognition method based on sample entropy (SE) and kernel principal component analysis (KPCA). The method combines the high recognition accuracy of sample entropy with the strengths of KPCA in nonlinear dimensionality reduction and nonlinear processing. Using a support vector machine (SVM) classifier, the proposed method (called SE_KPCA) is tested on EEG data and compared with methods based on fuzzy entropy (FE), combination entropy (CE), and each of the three entropies (SE, FE and CE) merged with KPCA. Experimental results show that the method is effective.
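
Sample entropy itself is a standard, well-defined statistic; as a compact reference sketch (quadratic-time, absolute tolerance r, Chebyshev distance), it can be written as:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D sequence: the negative log of
    the conditional probability that sequences matching for m points
    (within tolerance r, Chebyshev distance) also match for m + 1 points."""
    n = len(x)

    def count_matches(length):
        # Same template start indices for both lengths, as in SampEn.
        templates = [x[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)      # matching template pairs of length m
    a = count_matches(m + 1)  # matching template pairs of length m + 1
    return -math.log(a / b)
```

A perfectly periodic signal yields SampEn = 0, while irregular signals yield larger values; in EEG practice, r is usually set relative to the signal&rsquo;s standard deviation.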

]]>Entropy doi: 10.3390/e20090700

Authors: Michail Vlysidis Yiannis N. Kaznessis

The time evolution of stochastic reaction networks can be modeled with the chemical master equation of the probability distribution. Alternatively, the numerical problem can be reformulated in terms of probability moment equations. Herein we present a new alternative method for numerically solving the time evolution of stochastic reaction networks. Based on the assumption that the entropy of the reaction network is maximal, Lagrange multipliers are introduced. The proposed method derives equations that model the time derivatives of these Lagrange multipliers. We present detailed steps to transform moment equations into Lagrange multiplier equations. In order to demonstrate the method, we present examples of non-linear stochastic reaction networks of varying degrees of complexity, including multistable and oscillatory systems. We find that the new approach is as accurate as, and significantly more efficient than, Gillespie&rsquo;s original exact algorithm for systems with a small number of interacting species. This work is a step towards solving stochastic reaction networks accurately and efficiently.
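
One standard way to write the transformation from moment equations to Lagrange multiplier equations (a sketch of the general idea; the paper&rsquo;s exact ansatz and closure may differ) is:

```latex
% Maximum-entropy ansatz with M Lagrange multipliers \lambda_i
% constraining the first M moments of the copy-number distribution:
P(x;\lambda) = \frac{1}{Z(\lambda)}
  \exp\!\Big(-\sum_{i=1}^{M} \lambda_i x^i\Big),
\qquad
Z(\lambda) = \sum_{x} \exp\!\Big(-\sum_{i=1}^{M} \lambda_i x^i\Big).

% The moment equations derived from the master equation become ODEs
% for the multipliers via the chain rule:
\frac{d\langle x^k\rangle}{dt}
  = \sum_{i=1}^{M}
    \frac{\partial \langle x^k\rangle}{\partial \lambda_i}\,
    \frac{d\lambda_i}{dt}
  = f_k\big(\langle x\rangle,\dots,\langle x^M\rangle\big),
\qquad k = 1,\dots,M.
```

Solving this linear system for the d&lambda;<sub>i</sub>/dt at each time step propagates the multipliers, and hence the full distribution, forward in time.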

]]>Entropy doi: 10.3390/e20090699

Authors: Guolong Chen

The Koch curve exciting coil eddy current sensor is a novel flexible planar eddy current probe. In this study, an intersection angle spectrum entropy index and a radial direction energy spectrum entropy index were proposed to evaluate the eddy current distribution. Eddy current distributions induced by one turn of a circular coil and one turn of a second-order Koch curve coil, fed with alternating currents of different exciting frequencies and at different lift-off distances, were simulated, and the eddy current distributions varying with lift-off distance at different exciting frequencies were compared using the two proposed indices. With increasing lift-off distance or decreasing exciting frequency, the similarity between the shape of the Koch curve and the eddy current distribution weakened, and the eddy current distribution in the specimen under the exciting coil became less concentrated.
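
The two indices are not given in closed form in the abstract; a plausible minimal sketch of a normalized spectrum entropy (my assumption of the general form: Shannon entropy of an energy distribution over angular or radial bins, normalized by its maximum) is:

```python
import math

def spectrum_entropy(energies):
    """Normalized Shannon entropy of an energy spectrum (e.g. eddy-current
    energy binned by intersection angle or radial distance). Dividing by
    log(number of bins) maps the result to [0, 1]; values near 1 indicate
    a spread-out (less concentrated) distribution. Requires >= 2 bins."""
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(energies))
```

A uniform spectrum gives 1.0; the more the energy concentrates in a few bins, the closer the index drops toward 0, matching the concentration behaviour described above.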

]]>Entropy doi: 10.3390/e20090698

Authors: Alberto Muñoz Nicolás Hernández Javier M. Moguerza Gabriel Martos

The combination of different sources of information is a problem that arises in several situations, for instance, when data are analysed using different similarity measures. Often, each source of information is given as a similarity, distance, or a kernel matrix. In this paper, we propose a new class of methods which consists of producing, for anomaly detection purposes, a single Mercer kernel (that acts as a similarity measure) from a set of local entropy kernels and, at the same time, avoids the task of model selection. This kernel is used to build an embedding of data in a variety that will allow the use of a (modified) one-class Support Vector Machine to detect outliers. We study several information combination schemes and their limiting behaviour when the data sample size increases within an Information Geometry context. In particular, we study the variety of the given positive definite kernel matrices to obtain the desired kernel combination as belonging to that variety. The proposed methodology has been evaluated on several real and artificial problems.

]]>Entropy doi: 10.3390/e20090697

Authors: Jiri Petrzela

This paper presents an analysis of a multiple-valued memory system (MVMS) composed of a pair of resonant tunneling diodes (RTD). The ampere&ndash;voltage characteristic (AVC) of both diodes is approximated in the operational voltage range, as is common in practice, by a polynomial scalar function. The mathematical model of the MVMS is an autonomous deterministic dynamical system with three degrees of freedom and a smooth vector field. Based on very recent results achieved for the piecewise-linear MVMS, numerical values of the parameters are calculated such that funnel and double-spiral chaotic attractors are observed. The existence of these types of strange attractors is proved both numerically, using the concept of the largest Lyapunov exponent (LLE), and experimentally, by computer-aided simulation of the designed lumped circuit using only commercially available active elements.

]]>Entropy doi: 10.3390/e20090696

Authors: Sergio Davis Diego González Gonzalo Gutiérrez

A general framework for inference in dynamical systems is described, based on the language of Bayesian probability theory and making use of the maximum entropy principle. Taking the concept of a path as fundamental, the continuity equation and Cauchy&rsquo;s equation for fluid dynamics arise naturally, while the specific information about the system can be included using the maximum caliber (or maximum path entropy) principle.

]]>Entropy doi: 10.3390/e20090695

Authors: Gamaliel Blé Domingo González

This paper discusses some properties of the topological entropy of systems generated by polynomials of degree d on their Hubbard trees. An optimization of Thurston&rsquo;s core entropy algorithm is developed for a family of polynomials of degree d.

]]>Entropy doi: 10.3390/e20090694

Authors: Teresa C. M. Dias Marcio A. Diniz Carlos A. de B. Pereira Adriano Polpo

The 37th edition of MaxEnt was held in Brazil, hosting several distinguished researchers and students. The workshop offered four tutorials, nine invited talks, twenty-four oral presentations and twenty-seven poster presentations. All submissions received their first choice between oral and poster presentation. The event held a celebration of Julio Stern&rsquo;s 60th anniversary and awarded two prizes to young researchers. As customary, the workshop had one free afternoon, in which participants visited the city&rsquo;s surroundings and experienced Brazilian food and traditions.

]]>Entropy doi: 10.3390/e20090693

Authors: Juan Wang Qun Ding

Based on the keyword and abstract extraction function of the Natural Language Processing and Information Retrieval Sharing Platform (NLPIR), this paper presents the design of a dynamic rounds chaotic block cipher that takes into account both security and efficiency. The cipher combines chaotic theory with the Feistel-structure block cipher, and uses the randomness of a chaotic sequence and the nonlinearity of a chaotic S-box to dynamically generate the number of encryption rounds: more rounds are applied to the important information marked by NLPIR, and fewer rounds to the unmarked, non-important information. Through linear and differential cryptanalysis, ciphertext information entropy, &ldquo;0&ndash;1&rdquo; balance and National Institute of Standards and Technology (NIST) tests, and comparison with other traditional and lightweight block ciphers, the results indicate that the dynamic variation of encryption rounds can achieve different levels of encryption for different information, enhancing the anti-attack ability while reducing the number of encryption rounds. Therefore, the dynamic rounds chaotic block cipher can guarantee the security of information transmission and realize a lightweight cryptographic algorithm.

]]>Entropy doi: 10.3390/e20090692

Authors: Margarita A. Man’ko Vladimir I. Man’ko

We study an analog of Bayes&rsquo; formula and the nonnegativity property of mutual information for systems with one random variable. For single-qudit states, we present new entropic inequalities in the form of the subadditivity condition corresponding to hidden correlations in quantum systems. We present qubit states in the quantum suprematism picture, where these states are identified with three probability distributions describing the states of three classical coins, and illustrate the states by a Triada of Malevich&rsquo;s squares with areas satisfying the quantum constraints. We consider arbitrary quantum states belonging to an N-dimensional Hilbert space as (N^2 &minus; 1) fair probability distributions describing the states of (N^2 &minus; 1) classical coins. We illustrate the geometrical properties of the qudit states by a set of Triadas of Malevich&rsquo;s squares. We obtain new entropic inequalities for the matrix elements of an arbitrary N&times;N density matrix of qudit systems using the constructed maps of the density matrix onto a set of probability distributions. In addition, to construct the bijective map of the qudit state onto the set of probabilities describing the positions of classical coins, we show that there exists a bijective map of any quantum observable onto a set of dichotomous classical random variables with statistics determined by the above classical probabilities. Finally, we discuss the physical meaning and the possibility of checking the derived inequalities in experiments with superconducting circuits based on Josephson junction devices.
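
For concreteness, the qubit-to-coins map used in the quantum suprematism picture can be written as follows (a sketch of the standard relations in the Man&rsquo;ko formulation):

```latex
% Each Pauli direction k = 1, 2, 3 (x, y, z) defines one classical coin,
% with "heads" probability given by the spin-projection statistics:
p_k = \operatorname{Tr}\!\left(\rho\,\frac{1+\sigma_k}{2}\right)
    = \frac{1 + \langle\sigma_k\rangle}{2}, \qquad k = 1,2,3.

% Positivity of the density matrix \rho constrains the three coins:
\left(p_1-\tfrac{1}{2}\right)^2 + \left(p_2-\tfrac{1}{2}\right)^2
  + \left(p_3-\tfrac{1}{2}\right)^2 \;\le\; \tfrac{1}{4}.
```

The three probabilities determine the sizes of the Triada of Malevich&rsquo;s squares, and the inequality above is the quantum constraint their areas must satisfy.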

]]>