
Table of Contents

Entropy, Volume 21, Issue 9 (September 2019)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: We investigated the quantum adiabatic pumping effect in an interferometer attached to two [...]
Open Access Article
Evidential Decision Tree Based on Belief Entropy
Entropy 2019, 21(9), 897; https://doi.org/10.3390/e21090897 - 16 Sep 2019
Abstract
Decision trees are widely applied in many areas, such as classification and recognition. Traditional information entropy and Pearson’s correlation coefficient are often used as splitting measures to find the best splitting attribute. However, these measures cannot handle uncertainty, since they capture neither the relation between attributes nor the degree of disorder within an attribute. Deng entropy, by contrast, can measure the uncertainty of a basic belief assignment (BBA) in uncertain problems. In this paper, Deng entropy is used as the splitting measure to construct an evidential decision tree for fuzzy dataset classification. Compared with the traditional combination rules used to combine BBAs, the evidential decision tree can be applied to classification directly, which efficiently reduces the complexity of the algorithm. In addition, experiments are conducted on the Iris dataset to build an evidential decision tree that achieves more accurate classification. Full article
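Deng entropy is a published, standard formula, E_d(m) = −Σ_A m(A) log2(m(A) / (2^|A| − 1)), where A ranges over the focal elements of the BBA. The sketch below (illustrative Python, not the authors' implementation) computes it for a small BBA:

```python
import math

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment (BBA).

    `bba` maps each focal element (a frozenset of hypotheses) to its
    mass. For singleton-only BBAs this reduces to Shannon entropy;
    larger focal elements add uncertainty via the 2^|A| - 1 term.
    """
    total = 0.0
    for focal, mass in bba.items():
        if mass > 0:
            total -= mass * math.log2(mass / (2 ** len(focal) - 1))
    return total

# A BBA over {a, b}: half the mass on {a}, half on the ambiguous {a, b}.
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.5}
print(deng_entropy(m))  # exceeds the Shannon entropy of (0.5, 0.5)
```

The ambiguous focal element {a, b} contributes 0.5·log2(6) ≈ 1.29 bits, which is why the result is larger than the 1 bit a purely probabilistic split would report.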
Open Access Editorial
Entropy in Dynamic Systems
Entropy 2019, 21(9), 896; https://doi.org/10.3390/e21090896 - 16 Sep 2019
Abstract
In order to measure and quantify the complex behavior of real-world systems, either novel mathematical approaches or modifications of classical ones are required to precisely predict, monitor and control complicated chaotic and stochastic processes [...] Full article
(This article belongs to the Special Issue Entropy in Dynamic Systems)
Open Access Feature Paper Article
Universality and Exact Finite-Size Corrections for Spanning Trees on Cobweb and Fan Networks
Entropy 2019, 21(9), 895; https://doi.org/10.3390/e21090895 - 15 Sep 2019
Abstract
The concept of universality is a cornerstone of theories of critical phenomena. It is very well understood in most systems, especially in the thermodynamic limit. Finite-size systems present additional challenges. Even in low dimensions, universality of the edge and corner contributions to free energies and response functions is less investigated and less well understood. In particular, the question arises of how universality is maintained in corrections to scaling in systems of the same universality class but with very different corner geometries. Two-dimensional geometries deliver the simplest such examples that can be constructed with and without corners. To investigate how the presence and absence of corners manifest universality, we analyze the spanning tree generating function on two different finite systems, namely the cobweb and fan networks. The corner free energies of these configurations have stimulated significant interest precisely because of expectations regarding their universal properties, and we address how this can be delivered given that the finite-size cobweb has no corners while the fan has four. To answer, we appeal to the Ivashkevich–Izmailian–Hu approach, which unifies the generating functions of distinct networks in terms of a single partition function with twisted boundary conditions. This unified approach shows that the contributions to the individual corner free energies of the fan network sum to zero, so that it precisely matches that of the cobweb. It therefore also matches conformal theory (in which the central charge is found to be c = −2) and finite-size scaling predictions. Correspondence in each case with results established by alternative means for both networks verifies the soundness of the Ivashkevich–Izmailian–Hu algorithm. Its broad range of usefulness is demonstrated by its application to hitherto unsolved problems, namely the exact asymptotic expansions of the logarithms of the generating functions and the conformal partition functions for fan and cobweb geometries. We also investigate strip geometries, again confirming the predictions of conformal field theory. Thus, the resolution of a universality puzzle demonstrates the power of the algorithm and opens up new applications in the future. Full article
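The combinatorial object behind these generating functions is the spanning-tree count, given by Kirchhoff's matrix-tree theorem: the number of spanning trees equals any cofactor of the graph Laplacian. A small exact-arithmetic sketch (an illustrative aside; the paper works with asymptotic expansions, not direct enumeration):

```python
from fractions import Fraction

def spanning_trees(n, edges):
    """Count spanning trees of an undirected graph on nodes 0..n-1
    via the matrix-tree theorem: delete one row/column of the
    Laplacian L = D - A and take the determinant."""
    L = [[Fraction(0)] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1
        L[v][v] += 1
        L[u][v] -= 1
        L[v][u] -= 1
    # Reduced Laplacian: drop row and column 0.
    M = [row[1:] for row in L[1:]]
    det = Fraction(1)
    for i in range(n - 1):
        # Exact Gaussian elimination with partial pivoting.
        pivot = next((r for r in range(i, n - 1) if M[r][i] != 0), None)
        if pivot is None:
            return 0  # graph is disconnected
        if pivot != i:
            M[i], M[pivot] = M[pivot], M[i]
            det = -det
        det *= M[i][i]
        for r in range(i + 1, n - 1):
            factor = M[r][i] / M[i][i]
            for c in range(i, n - 1):
                M[r][c] -= factor * M[i][c]
    return int(det)

# Complete graph K4: Cayley's formula gives 4^(4-2) = 16 spanning trees.
k4 = [(u, v) for u in range(4) for v in range(u + 1, 4)]
print(spanning_trees(4, k4))  # 16
```

Exact rationals (`Fraction`) keep the determinant free of floating-point error, which matters once the counts grow combinatorially large.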
Open Access Article
Application of a Speedy Modified Entropy Method in Assessing the Complexity of Baroreflex Sensitivity for Age-Controlled Healthy and Diabetic Subjects
Entropy 2019, 21(9), 894; https://doi.org/10.3390/e21090894 - 14 Sep 2019
Abstract
The percussion entropy index (PEIoriginal) was recently introduced to assess the complexity of baroreflex sensitivity. This study aimed to investigate the ability of a speedy modified PEI (i.e., PEInew) application to distinguish among age-controlled subjects with or without diabetes. This was carried out using simultaneous photoplethysmography (PPG) pulse amplitude series and R wave-to-R wave interval (RRI) series acquired from healthy subjects (Group 1, n = 42), subjects diagnosed with type 2 diabetes mellitus and satisfactory blood sugar control (Group 2, n = 38), and type 2 diabetic patients with poor blood sugar control (Group 3, n = 35). Results from PEIoriginal and multiscale cross-approximate entropy (MCAE) were also computed on the same datasets for comparison. The results show that the optimal prolongation between the amplitude series and the RRI series could be delayed by one to three heartbeat cycles for Group 2 patients and by one to four heartbeat cycles for Group 3 patients, whereas Group 1 subjects showed prolongation of only one heartbeat cycle. This study not only demonstrates the sensitivity of PEInew and PEIoriginal, compared with MCAE, in differentiating between Groups 2 and 3, highlighting the feasibility of percussion entropy applications in autonomic nervous function assessment, but also shows that PEInew considerably reduces the computational time required for such processes. Full article
Open Access Article
Monitoring Autonomic and Central Nervous System Activity by Permutation Entropy during Short Sojourn in Antarctica
Entropy 2019, 21(9), 893; https://doi.org/10.3390/e21090893 - 14 Sep 2019
Abstract
The aim of this study was to monitor acute response patterns of autonomic and central nervous system activity during an encounter with Antarctica by synchronously recording heart rate variability (HRV) and electroencephalography (EEG). At three different time-points during the two-week sea journey, EEG and HRV were recorded from nine male scientists who participated in “The First Turkish Antarctic Research Expedition”. The recordings were performed in a relaxed state with the eyes open, with the eyes closed, and during a space quantity perception test. The EEG was recorded with the wireless 14-channel EPOC-Emotiv device, and the HRV with a Polar S810i heart rate monitor. The HRV data were analyzed by time/frequency domain parameters and ordinal pattern statistics. For the EEG data, spectral band power in the conventional frequency bands as well as permutation entropy values were calculated. Regarding HRV, neither the conventional nor the permutation entropy calculations produced significant differences across the journey time-points, but only permutation entropy was able to differentiate between the testing conditions: during the cognitive test, permutation entropy values increased significantly, whereas the conventional HRV parameters did not show any significant differences. In the EEG analysis, the ordinal pattern statistics revealed significant transitions in the course of the sea voyage as permutation entropy values decreased, whereas spectral band power analysis could not detect any significant difference. Permutation entropy analysis was further able to differentiate between the three testing conditions as well as between the brain regions. In the conventional spectral band power analysis, alpha band power could separate the three testing conditions and brain regions, and beta band power could do so only for the brain regions. This superiority of permutation entropy in discerning subtle differences in the autonomic and central nervous system’s responses to an overwhelming subjective experience renders it suitable as an analysis tool for biomonitoring in extreme environments. Full article
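For readers unfamiliar with the method, Bandt–Pompe permutation entropy counts the relative frequencies of ordinal patterns in a time series and takes their Shannon entropy. A minimal generic sketch (the study's embedding parameters and preprocessing are not reproduced here; `order=3, delay=1` are illustrative defaults):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D sequence.

    Each window of `order` samples (spaced `delay` apart) is mapped to
    the ordinal pattern of its values; the entropy of the pattern
    distribution is returned, optionally normalized to [0, 1] by
    log2(order!)."""
    patterns = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = series[i : i + order * delay : delay]
        # Ordinal pattern: argsort of the window values.
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        patterns[pattern] += 1
    probs = [count / n for count in patterns.values()]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(math.factorial(order)) if normalize else h

print(permutation_entropy(list(range(20))))          # ordered signal: entropy ~ 0
print(permutation_entropy([0, 1, 0, 1, 0, 1, 0, 1]))  # periodic: low but nonzero
```

A strictly monotonic signal produces a single ordinal pattern (entropy 0), while noisier signals spread mass across patterns and push the normalized value toward 1.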
Open Access Article
Functional Linear and Nonlinear Brain–Heart Interplay during Emotional Video Elicitation: A Maximum Information Coefficient Study
Entropy 2019, 21(9), 892; https://doi.org/10.3390/e21090892 - 14 Sep 2019
Abstract
Brain and heart continuously interact through anatomical and biochemical connections. Although several brain regions are known to be involved in autonomic control, the functional brain–heart interplay (BHI) during emotional processing is not yet fully characterized. To this aim, we investigate BHI during emotional elicitation in healthy subjects. The functional linear and nonlinear couplings are quantified using the maximum information coefficient calculated between time-varying electroencephalography (EEG) power spectra within the canonical bands (δ, θ, α, β, and γ) and time-varying low-frequency and high-frequency powers from heartbeat dynamics. Experimental data were gathered from 30 healthy volunteers whose emotions were elicited through pleasant and unpleasant high-arousing videos. Results demonstrate that functional BHI increases during videos with respect to a resting state through EEG oscillations not including the γ band (>30 Hz). Functional linear coupling seems associated with high-arousing positive elicitation, with preferred EEG oscillations in the θ band ([4, 8) Hz), especially over the left-temporal and parietal cortices. Differential functional nonlinear coupling between emotional valences seems to occur mainly through EEG oscillations in the δ, θ, and α bands and sympathovagal dynamics, as well as through δ, α, and β oscillations and parasympathetic activity, mainly over the right hemisphere. Functional BHI through δ and α oscillations over the prefrontal region seems primarily nonlinear. This study provides novel insights into synchronous heartbeat and cortical dynamics during emotional video elicitation, and suggests that a nonlinear analysis is needed to fully characterize functional BHI. Full article
(This article belongs to the Special Issue Information Dynamics in Brain and Physiological Networks)
Open Access Article
Service-Oriented Model Encapsulation and Selection Method for Complex System Simulation Based on Cloud Architecture
Entropy 2019, 21(9), 891; https://doi.org/10.3390/e21090891 - 14 Sep 2019
Abstract
With the rise of cloud computing architectures, the development of service-oriented simulation models has gradually become a prominent topic in the field of complex system simulation. To support the distributed sharing of simulation models with large computational requirements and the selection of the optimal service model for constructing complex system simulation applications, this paper proposes a service-oriented model encapsulation and selection method. The method encapsulates models as shared simulation services, supports the distributed scheduling of model services in the network, and provides a semantic search framework that lets users search for models according to model correlation. An optimization selection algorithm based on quality of service (QoS) is proposed to let users customize the weights of QoS indices and obtain an ordered candidate model set by weighted comparison. The experimental results showed that the parallel operation of service models can effectively improve the execution efficiency of complex system simulation applications, with performance increased by 19.76% compared with the scatter distribution strategy. The QoS-weighted model selection method based on semantic search can support the effective search and selection of simulation models in the cloud environment according to the user’s preferences. Full article
(This article belongs to the Special Issue Computation in Complex Networks)
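A weighted QoS comparison of the kind described above can be sketched as follows. The index names, weights, and min-max normalization with cost-type inversion are illustrative assumptions, not the paper's exact algorithm:

```python
def rank_services(candidates, weights, cost_indices=frozenset()):
    """Rank candidate model services by a user-weighted QoS score.

    `candidates` maps service name -> {index: raw value}; `weights`
    gives the user's preference per index. Each index is min-max
    normalized across candidates; "cost-type" indices (lower is
    better, e.g. latency) are inverted after normalization."""
    names = list(candidates)
    scores = {name: 0.0 for name in names}
    for idx, weight in weights.items():
        values = [candidates[n][idx] for n in names]
        lo, hi = min(values), max(values)
        for n in names:
            norm = 0.5 if hi == lo else (candidates[n][idx] - lo) / (hi - lo)
            if idx in cost_indices:
                norm = 1.0 - norm
            scores[n] += weight * norm
    return sorted(names, key=scores.get, reverse=True)

services = {
    "model_a": {"reliability": 0.99, "latency_ms": 120},
    "model_b": {"reliability": 0.95, "latency_ms": 40},
}
# A latency-sensitive user weights response time highest.
print(rank_services(services, {"reliability": 0.3, "latency_ms": 0.7},
                    cost_indices={"latency_ms"}))  # model_b ranked first
```

Changing the weights reorders the candidate set, which is the "ordered candidate model set by weighted comparison" idea in miniature.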
Open Access Article
Thermodynamics Beyond Molecules: Statistical Thermodynamics of Probability Distributions
Entropy 2019, 21(9), 890; https://doi.org/10.3390/e21090890 - 13 Sep 2019
Abstract
Statistical thermodynamics has a universal appeal that extends beyond molecular systems, and yet, as its tools are being transplanted to fields outside physics, the fundamental question of what thermodynamics is has remained unanswered. We answer this question here. Generalized statistical thermodynamics is a variational calculus of probability distributions. It is independent of physical hypotheses but provides the means to incorporate our knowledge, assumptions and physical models about the stochastic process that gives rise to the probability in question. We derive the familiar calculus of thermodynamics via a probabilistic argument that makes no reference to physics. At the heart of the theory is a space of distributions and a special functional that assigns probabilities to this space. The maximization of this functional generates the mathematical network of thermodynamic relationships. We obtain statistical mechanics as a special case and make contact with information theory and Bayesian inference. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
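As a minimal worked example of this variational viewpoint, maximizing Shannon entropy at fixed mean energy yields the canonical (Gibbs) distribution, and the familiar identity S = β⟨E⟩ + ln Z follows. A short sketch of this textbook special case (not the paper's general construction):

```python
import math

def gibbs(energies, beta):
    """Canonical distribution obtained by maximizing Shannon entropy
    subject to a fixed mean energy; beta is the Lagrange multiplier,
    Z the normalizing partition function."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights], z

energies = [0.0, 1.0, 2.0]
p, z = gibbs(energies, beta=1.0)
mean_e = sum(pi * e for pi, e in zip(p, energies))
entropy = -sum(pi * math.log(pi) for pi in p)
# The thermodynamic identity S = beta*<E> + ln Z falls out of the
# variational construction, with no reference to physics.
print(abs(entropy - (1.0 * mean_e + math.log(z))) < 1e-12)  # True
```

The same machinery applies to any constraint functional, which is the sense in which the calculus extends "beyond molecules".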
Open Access Article
Frequency Dependence of the Entanglement Entropy Production in a System of Coupled Driven Nonlinear Oscillators
Entropy 2019, 21(9), 889; https://doi.org/10.3390/e21090889 - 13 Sep 2019
Abstract
Driven nonlinear systems have attracted great interest owing to their applications in quantum technologies such as quantum information, where entanglement is a vital resource that can be measured by entropy in bipartite systems. In this paper, we investigate the impact of the driving frequency on entanglement in a bipartite system of two coupled driven nonlinear oscillators. It is numerically found that the time evolution of the entanglement entropy between the subsystems depends significantly on the driving frequency: the dependence of the entropy production on the driving frequency exhibits a pronounced peak, meaning that the entanglement between the subsystems can be greatly increased by tuning the driving frequency. Further analyses show that the enhancement of the entropy production by the driving frequency is closely related to the energy levels involved in the quantum evolution. This is confirmed by results for the quantum spectrum and the dispersion of the wave function in phase space. Our work gives a convenient way to enhance entanglement in driven nonlinear systems and sheds light on the role of driven nonlinear systems in quantum information technologies. Full article
(This article belongs to the Special Issue The Ubiquity of Entropy)
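For a pure bipartite state, the entanglement entropy is simply the Shannon entropy of the squared Schmidt coefficients. A minimal sketch of this measure (the paper's driven-oscillator dynamics are not simulated here):

```python
import math

def entanglement_entropy(schmidt_coefficients):
    """Von Neumann entropy (bits) of either subsystem of a pure
    bipartite state written in Schmidt form:
    S = -sum(c^2 * log2(c^2)) over the Schmidt coefficients c."""
    probs = [c * c for c in schmidt_coefficients]
    assert abs(sum(probs) - 1.0) < 1e-9, "state must be normalized"
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entanglement_entropy([1.0]))            # product state: no entanglement
print(entanglement_entropy([2 ** -0.5] * 2))  # Bell state: maximal 1 bit for a qubit
```

A single Schmidt coefficient means a product state (zero entropy); equal coefficients give the maximal value log2 of the Schmidt rank, which is the quantity the peak in the paper's entropy-production curve enhances.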
Open Access Article
On the Security of a Latin-Bit Cube-Based Image Chaotic Encryption Algorithm
Entropy 2019, 21(9), 888; https://doi.org/10.3390/e21090888 - 12 Sep 2019
Abstract
In this paper, a security analysis of an image chaotic encryption algorithm based on Latin cubes and bit cubes is given. The analyzed algorithm adopts a three-stage first-scrambling, diffusion, second-scrambling encryption scheme. First, a finite field is constructed using chaotic sequences. Then, Latin cubes are generated by finite field operations and used for image chaotic encryption. In addition, according to the statistical characteristics of the diffusion image in the diffusion stage, the algorithm uses different Latin cube combinations to scramble the diffusion image a second time. However, the generation of the Latin cubes in this algorithm is independent of the plain image, while, in the diffusion stage, when any one bit in the plain image changes, the corresponding bits in the cipher image change with obvious regularity. Thus, the equivalent secret keys can be obtained by a chosen-plaintext attack. Theoretical analysis and experimental results indicate that a maximum of only 2.5 × w × h / 3 + 6 plain images are needed to crack a cipher image with w × h resolution. The size of the equivalent keys deciphered by the method proposed in this paper is much smaller than that of other general cryptanalysis methods for similar encryption schemes. Full article
(This article belongs to the Special Issue Entropy in Image Analysis II)
Open Access Article
Towards Quantum-Secured Permissioned Blockchain: Signature, Consensus, and Logic
Entropy 2019, 21(9), 887; https://doi.org/10.3390/e21090887 - 12 Sep 2019
Abstract
While blockchain technology is universally considered a significant technology for the near future, some of its pillars are under threat from another thriving technology: quantum computing. In this paper, we propose important safeguard measures against this threat by developing a framework for a quantum-secured, permissioned blockchain called Logicontract (LC). LC adopts a digital signature scheme based on Quantum Key Distribution (QKD) mechanisms and a vote-based consensus algorithm to achieve consensus on the blockchain. The main contributions of this paper are the development of: (1) an unconditionally secure signature scheme for LC, which makes it immune to attack by quantum computers; (2) a scalable consensus protocol used by LC; (3) a logic-based scripting language for the creation of smart contracts on LC; (4) a quantum-resistant lottery protocol, which illustrates the power and usage of LC. Full article
(This article belongs to the Special Issue Blockchain: Security, Challenges, and Opportunities)
Open Access Article
Resilience of Urban Technical Networks
Entropy 2019, 21(9), 886; https://doi.org/10.3390/e21090886 - 12 Sep 2019
Abstract
We argue for the need to overcome the treatment of urban technical infrastructures in isolation from one another according to the nature of the flows they transfer. The operation of urban technical networks is affected by endogenous and exogenous random events with consequences for users. Having identified these operational risks and the difficulties of estimating their impact on the performance of urban technical networks, the authors chose to study risk management through a concise expression: engineering resilience and its connections with vulnerability. The research is then confined to the case of urban traffic networks, for which resilience is expressed by the networks’ capabilities of resistance and risk absorption (both rooted in the redundancy in design and execution). The dynamics of the network, in correlation with the resistance and absorption capacities, are introduced via three states for which a signal graph is built. In the stationary regime, the probability of each state is computed. These probabilities allow the calculation of the entropy of the network, which is relevant for assessing the preservation of network functionality. Full article
(This article belongs to the Special Issue Entropy and Scale-Dependence in Urban Modelling)
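The last step, an entropy over stationary state probabilities, can be sketched generically. The three states and the transition probabilities below are illustrative placeholders, not the paper's data or model:

```python
import math

def stationary_distribution(P, iterations=10_000):
    """Stationary probabilities of a finite Markov chain by power
    iteration on the row-stochastic transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Three illustrative network states: normal operation, degraded
# (shock-absorbing), and failed; rows are transition probabilities.
P = [[0.90, 0.08, 0.02],
     [0.50, 0.40, 0.10],
     [0.60, 0.00, 0.40]]
pi = stationary_distribution(P)
print(shannon_entropy(pi))  # low entropy: the network is mostly "normal"
```

A low stationary entropy indicates the network spends most of its time in one state (here, normal operation), which is the sense in which entropy tracks the preservation of functionality.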
Open Access Article
Channel Capacity of Concurrent Probabilistic Programs
Entropy 2019, 21(9), 885; https://doi.org/10.3390/e21090885 - 12 Sep 2019
Abstract
Programs are under continuous attack for disclosing secret information, and defending against these attacks is becoming increasingly vital. An attractive approach to protection is to measure the amount of secret information that might leak to attackers. A fundamental question in computing information leakage is: given a program and attackers with varying knowledge of the secret information, what is the maximum amount of leakage of the program? This maximum is called the channel capacity. In this paper, two notions of capacity are defined for concurrent probabilistic programs using information theory. These definitions take intermediate leakage and the effect of the scheduler into account. Computing the capacities reduces to a constrained nonlinear optimization problem, so an evolutionary algorithm is proposed to compute them. The single preference voting and dining cryptographers protocols are analyzed as case studies to show how the proposed approach can automatically compute the capacities. The results demonstrate that there are attackers who can learn the whole secret of both the single preference protocol and the dining cryptographers protocol. The proposed evolutionary algorithm is a general approach for computing any type of capacity in any kind of program. Full article
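For the classical special case of a discrete memoryless channel, capacity can be computed exactly by the Blahut-Arimoto alternating optimization; the sketch below illustrates what "channel capacity" means (the paper's concurrent-program setting instead requires the evolutionary approach described above):

```python
import math

def blahut_arimoto(W, iterations=500):
    """Capacity (bits/use) of a discrete memoryless channel with
    transition matrix W[x][y] = P(y|x), via Blahut-Arimoto:
    alternate between the induced output distribution q and an
    input distribution p reweighted by exp(D(W[x] || q))."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx
    for _ in range(iterations):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        scores = []
        for x in range(nx):
            d = sum(W[x][y] * math.log(W[x][y] / q[y])
                    for y in range(ny) if W[x][y] > 0)
            scores.append(p[x] * math.exp(d))
        total = sum(scores)
        p = [s / total for s in scores]
    # Mutual information I(p; W) at the optimized input distribution.
    q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(p[x] * W[x][y] * math.log2(W[x][y] / q[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# Binary symmetric channel, crossover 0.1: C = 1 - H2(0.1) ~ 0.531 bits.
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(blahut_arimoto(bsc))
```

Here the "attacker's channel" analogue is the map from secret inputs to observable outputs; capacity is the worst-case leakage over all input (prior) distributions.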
Open Access Tutorial
An Introduction to the Non-Equilibrium Steady States of Maximum Entropy Spike Trains
Entropy 2019, 21(9), 884; https://doi.org/10.3390/e21090884 - 11 Sep 2019
Abstract
Although most biological processes are characterized by a strong temporal asymmetry, several popular mathematical models neglect this issue. Maximum entropy methods provide a principled way of addressing time irreversibility, one which leverages powerful results and ideas from the literature of non-equilibrium statistical mechanics. This tutorial provides a comprehensive overview of these issues, with a focus on the case of spike train statistics. We provide a detailed account of the mathematical foundations and work out examples to illustrate the key concepts and results from non-equilibrium statistical mechanics. Full article
(This article belongs to the Special Issue Entropy Production and Its Applications: From Cosmology to Biology)
Open Access Article
Pragmatic Hypotheses in the Evolution of Science
Entropy 2019, 21(9), 883; https://doi.org/10.3390/e21090883 - 11 Sep 2019
Abstract
This paper introduces pragmatic hypotheses and relates this concept to the spiral of scientific evolution. Previous works determined a characterization of logically consistent statistical hypothesis tests and showed that the modal operators obtained from such tests can be represented in the hexagon of oppositions. However, despite the importance of precise hypotheses in science, they cannot be accepted by logically consistent tests. Here, we show that this dilemma can be overcome by the use of pragmatic versions of precise hypotheses. These pragmatic versions allow a level of imprecision in the hypothesis that is small relative to other experimental conditions. The introduction of pragmatic hypotheses allows the evolution of scientific theories based on statistical hypothesis testing to be interpreted using the narratological structure of hexagonal spirals, as defined by Pierre Gallais. Full article
Open Access Article
Quantifying the Variability in Resting-State Networks
Entropy 2019, 21(9), 882; https://doi.org/10.3390/e21090882 - 11 Sep 2019
Abstract
Recent precision functional mapping of individual human brains has shown that individual brain organization is qualitatively different from group average estimates and that individuals exhibit distinct brain network topologies. How this variability affects the connectivity within individual resting-state networks remains an open question. This is particularly important since certain resting-state networks, such as the default mode network (DMN) and the fronto-parietal network (FPN), play an important role in the early detection of neurophysiological diseases like Alzheimer’s, Parkinson’s, and attention deficit hyperactivity disorder. Using different types of similarity measures, including conditional mutual information, we show here that the backbone of the functional connectivity and the direct connectivity within both the DMN and the FPN does not vary significantly between healthy individuals for the AAL brain atlas. Weaker connections do vary, however, having a particularly pronounced effect on the cross-connections between the DMN and FPN. Our findings suggest that the link topology of single resting-state networks is quite robust if a fixed brain atlas is used and the recordings are sufficiently long, even if the whole-brain network topology varies between individuals. Full article
(This article belongs to the Special Issue Complex Networks from Information Measures)
Open Access Article
The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology
Entropy 2019, 21(9), 881; https://doi.org/10.3390/e21090881 - 10 Sep 2019
Abstract
Previous works established that entropy is characterized uniquely as the first cohomology class in a topos and described some of its applications to the unsupervised classification of gene expression modules or cell types. These studies raised important questions regarding the statistical meaning of the resulting cohomology of information and its interpretation or consequences with respect to usual data analysis and statistical physics. This paper aims to present the computational methods of information cohomology and to propose its interpretations in terms of statistical physics and machine learning. In order to further underline the cohomological nature of information functions and chain rules, the computation of the cohomology in low degrees is detailed to show more directly that the k multivariate mutual information ( I k ) are ( k - 1 ) -coboundaries. The ( k - 1 ) -cocycles condition corresponds to I k = 0 , which generalizes statistical independence to arbitrary degree k. Hence, the cohomology can be interpreted as quantifying the statistical dependences and the obstruction to factorization. I develop the computationally tractable subcase of simplicial information cohomology represented by entropy H k and information I k landscapes and their respective paths, allowing investigation of Shannon’s information in the multivariate case without the assumptions of independence or of identically distributed variables. I give an interpretation of this cohomology in terms of phase transitions in a model of k-body interactions, holding both for statistical physics without mean field approximations and for data points. The I 1 components define a self-internal energy functional U k and ( - 1 ) k I k , k 2 components define the contribution to a free energy functional G k (the total correlation) of the k-body interactions. 
A basic mean-field model is developed and computed on genetic data, reproducing the usual free energy landscapes with phase transition and sustaining the analogy of clustering with condensation. The set of information paths in simplicial structures is in bijection with the symmetric group and with random processes, providing a trivial topological expression of the second law of thermodynamics. The local minima of free energy, related to conditional information negativity and conditional independence, characterize a minimum free energy complex. This complex formalizes the minimum free-energy principle in topology, provides a definition of a complex system, and characterizes a multiplicity of local minima that quantifies the diversity observed in biology. I give an interpretation of this complex in terms of unsupervised deep learning, where the neural network architecture is given by the chain complex, and conclude by discussing future supervised applications. Full article
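To make the low-degree statements above concrete, the multivariate mutual information admits the standard alternating sum of joint entropies (usual conventions; the paper's sign conventions may differ):

```latex
I_k(X_1;\dots;X_k) \;=\; \sum_{i=1}^{k} (-1)^{i-1} \sum_{I \subseteq \{1,\dots,k\},\; |I| = i} H(X_I),
\qquad
I_2(X;Y) \;=\; H(X) + H(Y) - H(X,Y).
```

For k = 2, the cocycle condition I_2 = 0 is exactly ordinary statistical independence of X and Y, the degree-2 instance of the generalization to arbitrary degree k described above.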
Open Access Article
Fabrication of Nanocrystalline AlCoCrFeNi High Entropy Alloy through Shock Consolidation and Mechanical Alloying
Entropy 2019, 21(9), 880; https://doi.org/10.3390/e21090880 - 10 Sep 2019
Abstract
High entropy alloys (HEAs) are usually fabricated by arc melting, which has the disadvantages of high cost and limitations on the shape and size of the final products. Recently, however, considerable research has been carried out on alternative fabrication techniques for HEAs that yield better properties, such as mechanical alloying and rapid solidification. In this paper, an AlCoCrFeNi high entropy alloy was successfully fabricated by the shock consolidation technique. In this method, the starting powders were mixed by mechanical alloying, and a shock wave was then applied to the compacted powders by explosion. High levels of residual stress existed in samples fabricated by shock consolidation; therefore, after fabrication, heat treatment was used to relieve the residual stress and improve the mechanical properties. The microstructures of the samples before and after heat treatment were examined by XRD, SEM and electron backscatter diffraction (EBSD). Both the as-consolidated and heat-treated samples exhibited a nanostructure. After heat treatment, the hardness of the sample decreased from 715 HV to 624 HV; however, the failure strength increased and, as expected, the ductility of the sample improved. Full article
Open Access Communication
An Objective Non-Reference Metric Based on Arimoto Entropy for Assessing the Quality of Fused Images
Entropy 2019, 21(9), 879; https://doi.org/10.3390/e21090879 - 10 Sep 2019
Abstract
Image fusion is receiving increasing attention in many technologies; consequently, how to objectively assess the quality of fused images and the performance of different fusion algorithms is of great significance. In this paper, we propose a novel objective non-reference measure for evaluating image fusion. This metric employs the properties of Arimoto entropy, a generalization of Shannon entropy, to measure the amount of information that the fused image contains about the two input images. Preliminary experiments on multi-focus and multi-modal images were carried out using the average fusion algorithm, contrast pyramid, principal component analysis, Laplacian pyramid, guided filtering and discrete cosine transform. In addition, a comparison was conducted with other relevant image fusion quality metrics such as mutual information, normalized mutual information, Tsallis divergence and the Petrovic measure. The experimental results illustrate that the presented metric correlates better with subjective evaluations of these fused images. Full article
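To make the underlying quantity concrete — not the authors' full fusion metric, which builds a mutual-information-style measure between the fused and input images from it — here is a sketch of the Arimoto entropy of order α in its commonly used α-norm form, which recovers Shannon entropy as α → 1 (function names are mine):

```python
import numpy as np

def arimoto_entropy(p, alpha):
    """Arimoto entropy of order alpha (alpha > 0, alpha != 1) of a discrete
    distribution p, in the alpha-norm form; tends to Shannon entropy as
    alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return alpha / (1.0 - alpha) * (np.sum(p ** alpha) ** (1.0 / alpha) - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats, for comparison in the alpha -> 1 limit."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```

Like Shannon entropy, this quantity is maximized by the uniform distribution, which is what makes it usable as an information measure between fused and input images.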
Open Access Article
Two Tests for Dependence (of Unknown Form) between Time Series
Entropy 2019, 21(9), 878; https://doi.org/10.3390/e21090878 - 09 Sep 2019
Abstract
This paper proposes two new nonparametric tests for independence between time series. Both tests are based on symbolic analysis, specifically on the symbolic correlation integral, in order to be robust to potentially unknown nonlinearities. The first test is developed for a scenario in which each considered time series is serially independent, so the interest is to ascertain whether two internally independent time series share a relationship of unknown form. This is especially relevant because the test is nuisance-parameter free, as proved in the paper. The second proposed statistic tests for independence among variables while allowing the time series to exhibit within-dependence. Monte Carlo experiments are conducted to show the empirical properties of the tests. Full article
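The symbolization step can be sketched as follows — a toy chi-square-type statistic on ordinal-pattern co-occurrences, not the authors' symbolic correlation integral statistic, and not a calibrated test (overlapping windows violate the independence assumptions behind the chi-square distribution); names are mine:

```python
import numpy as np
from itertools import permutations

def symbolize(x, m=3):
    """Map each length-m window of x to the index of its ordinal
    (permutation) pattern, making the analysis robust to monotone
    transformations and hence to many unknown nonlinearities."""
    pats = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([pats[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def joint_symbol_stat(x, y, m=3):
    """Chi-square-type statistic on the joint distribution of the two
    series' ordinal symbols; large values suggest dependence of unknown
    (possibly nonlinear) form."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    k = len(list(permutations(range(m))))
    joint = np.zeros((k, k))
    for a, b in zip(sx, sy):
        joint[a, b] += 1
    n = joint.sum()
    px, py = joint.sum(axis=1) / n, joint.sum(axis=0) / n
    expected = n * np.outer(px, py)
    mask = expected > 0
    return np.sum((joint[mask] - expected[mask]) ** 2 / expected[mask])
```

On two independent white-noise series the statistic stays small, while adding even a simple dependence between the series inflates it by orders of magnitude.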
Open Access Article
Decision-Making of Irrigation Scheme for Soybeans in the Huaibei Plain Based on Grey Entropy Weight and Grey Relation–Projection Pursuit
Entropy 2019, 21(9), 877; https://doi.org/10.3390/e21090877 - 09 Sep 2019
Abstract
To provide a scientific reference for formulating an effective soybean irrigation schedule in the Huaibei Plain, potted water-deficit experiments with nine alternative irrigation schemes were conducted during the 2015 and 2016 seasons. An irrigation scheme decision-making index system was established from the aspects of crop water consumption, crop growth process and crop water use efficiency. Moreover, a grey entropy weight method and a grey relation–projection pursuit model were proposed to calculate the weight of each decision-making index. The nine alternative schemes were then ranked according to the comprehensive grey relation degree of each scheme in the two seasons. The results showed that, when using the entropy weight method or the projection pursuit model to determine index weights, it was more direct and effective to obtain the corresponding entropy value or projection eigenvalue according to the sequence of the actual study object. The decision-making results, from the perspective of actual soybean growth responses at each stage under the various irrigation schemes, were mostly consistent between 2015 and 2016. Specifically, for the integrated target of lower water consumption and stable biomass yields, the scheme with moderate-deficit irrigation at the soybean branching or seedling stage and adequate irrigation at the flowering-podding and seed-filling stages is relatively optimal. Full article
(This article belongs to the Special Issue Entropy Applications in Environmental and Water Engineering II)
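The entropy-weight step can be sketched as follows — the standard entropy weight method on a hypothetical scheme-by-index decision matrix (the paper's grey variants add grey relation analysis on top of this):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows of X are alternative schemes, columns are
    (benefit-type) decision indices.  Indices whose values vary more across
    schemes have lower entropy and therefore receive larger weights."""
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)                      # column-normalized proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)         # entropy of each index, in [0, 1]
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()                         # weights summing to 1
```

An index that is identical across all schemes has maximal entropy and therefore receives (near-)zero weight.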
Open Access Article
Model Order Reduction: A Comparison between Integer and Non-Integer Order Systems Approaches
Entropy 2019, 21(9), 876; https://doi.org/10.3390/e21090876 - 09 Sep 2019
Abstract
In this paper, classical and non-integer model order reduction methodologies are compared. Non-integer-order calculus has been used to generalize many classical control strategies. The ability to compress information when modelling systems distributed in time and space, and the capability of describing long-term memory effects in dynamical systems, are two features that also suggest applying fractional calculus to model order reduction. In the paper, an open-loop balanced realization is compared with three approaches based on a non-integer representation of the reduced system. Several case studies are considered and compared. The results confirm the capability of fractional-order systems to capture and compress the dynamics of high-order systems. Full article
(This article belongs to the Special Issue The Fractional View of Complexity)
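For reference, the classical open-loop balanced realization used as the baseline can be sketched with the textbook square-root algorithm (a small-system sketch with a dense Kronecker Lyapunov solver; the fractional-order approaches are not shown, and the helper names are mine):

```python
import numpy as np

def lyap(A, Q):
    """Solve A X + X A^T + Q = 0 via Kronecker products (fine for small n).
    With row-major vectorisation, vec(AX) = kron(A, I) vec(X) and
    vec(X A^T) = kron(I, A) vec(X)."""
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(A, I) + np.kron(I, A)
    X = np.linalg.solve(K, -Q.flatten()).reshape(n, n)
    return (X + X.T) / 2.0          # symmetrize against round-off

def balanced_truncation(A, B, C, r):
    """Truncate an open-loop balanced realisation to order r and return the
    reduced (Ar, Br, Cr) plus the Hankel singular values of the full system."""
    P = lyap(A, B @ B.T)            # controllability Gramian
    Q = lyap(A.T, C.T @ C)          # observability Gramian
    L = np.linalg.cholesky(P)
    s2, U = np.linalg.eigh(L.T @ Q @ L)   # eigenvalues = Hankel SVs squared
    order = np.argsort(s2)[::-1]
    s, U = np.sqrt(s2[order]), U[:, order]
    T = L @ U / np.sqrt(s)          # balancing transformation (scaled columns)
    Ti = np.linalg.inv(T)
    Ab, Bb, Cb = Ti @ A @ T, Ti @ B, C @ T
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], s
```

For a stiff two-mode example, the order-1 truncation keeps the dominant mode and the DC-gain error stays within the usual 2σ₂ bound on the truncation error.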
Open Access Article
Differential Effect of the Physical Embodiment on the Prefrontal Cortex Activity as Quantified by Its Entropy
Entropy 2019, 21(9), 875; https://doi.org/10.3390/e21090875 - 08 Sep 2019
Abstract
Computer-mediated communication (CMC) research suggests that unembodied media can surpass in-person communication due to their ability to bypass nonverbal components of verbal communication, such as physical presence and facial expressions. However, recent results on communicative humanoids suggest the importance of the physical embodiment of conversational partners. These contradictory findings are compounded by the fact that almost all of these results are based on subjective assessments of the behavioural impacts of these systems. To investigate these opposing views of the potential role of embodiment during communication, we compare the effect of a physically embodied medium that is remotely controlled by a human operator with that of such unembodied media as telephones and video-chat systems on the frontal brain activity of human subjects, given the pivotal role of this region in social cognition and verbal comprehension. Our results provide evidence that communicating through a physically embodied medium affects the frontal brain activity of humans in patterns that potentially resemble those of in-person communication. These findings argue for the significance of embodiment in naturalistic scenarios of social interaction, such as storytelling and verbal comprehension, and for the potential application of brain information as a promising sensory gateway for characterizing behavioural responses in human–robot interaction. Full article
Open Access Article
Thermodynamic Analysis of Transcritical CO2 Ejector Expansion Refrigeration Cycle with Dedicated Mechanical Subcooling
Entropy 2019, 21(9), 874; https://doi.org/10.3390/e21090874 - 08 Sep 2019
Abstract
A new configuration of a transcritical CO2 ejector expansion refrigeration cycle combined with a dedicated mechanical subcooling cycle (EMS) is proposed. Three mass ratios of R32/R1234ze(Z) (0.4/0.6, 0.6/0.4, and 0.8/0.2) were selected as the refrigerants of the mechanical subcooling cycle (MS) to further explore the possibility of improving the EMS cycle’s performance. The thermodynamic performance of the new cycle was evaluated using energetic and exergetic methods and compared with that of the transcritical CO2 ejector expansion cycle integrated with a thermoelectric subcooling system (ETS). The results showed that the proposed cycle presents significant advantages over the ETS cycle in terms of ejector performance and system energetic and exergetic performance. Taking the EMS cycle using R32/R1234ze(Z) (0.6/0.4) as the MS refrigerant as an example, the improvements in the coefficient of performance and system exergy efficiency reached up to 10.27% and 15.56%, respectively, at an environmental temperature of 35 °C and an evaporation temperature of −5 °C. Additionally, the advantages of the EMS cycle were more pronounced at higher environmental temperatures. Full article
(This article belongs to the Section Thermodynamics)
Open Access Article
Thermodynamic Rarity and Recyclability of Raw Materials in the Energy Transition: The Need for an In-Spiral Economy
Entropy 2019, 21(9), 873; https://doi.org/10.3390/e21090873 - 08 Sep 2019
Abstract
This paper presents a thermodynamic vision of the depletion of mineral resources. It demonstrates how raw materials can be better assessed using exergy, based on thermodynamic rarity, which considers both scarcity in the crust and the energy required to extract and refine minerals. An exergy analysis of the energy transition reveals that, to approach a decarbonized economy by 2050, mineral exergy must be greater than that of fossil fuels, nuclear energy, and even all renewables. This is because clean technologies require huge amounts of many different raw materials. The rapid exhaustion of mines necessitates increased recycling and reuse, that is, a “circular economy”. As seen in the automobile industry, society is far from closing even the first cycle, and absolute circularity does not exist. The Second Law dictates that, in each cycle, some quantity and quality of materials is unavoidably lost (there are no circles, only spirals). For a rigorous recyclability analysis, we elaborate exergy indicators to be used in assessing the true circularity of recycling processes. We aim to strive toward an advanced economy focused on separation techniques and circularity audits, an economy that inspires new solutions: an in-spiral economy. Full article
(This article belongs to the Special Issue Thermodynamics of Sustainability)
Open Access Article
Integrating Classical Preprocessing into an Optical Encryption Scheme
Entropy 2019, 21(9), 872; https://doi.org/10.3390/e21090872 - 07 Sep 2019
Abstract
Traditionally, cryptographic protocols rely on mathematical assumptions and results to establish security guarantees. Quantum cryptography has demonstrated how physical properties of a communication channel can be leveraged in the design of cryptographic protocols, too. Our starting point is the AlphaEta protocol, which was designed to exploit properties of coherent states of light to transmit data securely over an optical channel. AlphaEta aims to draw security from the uncertainty of any measurement of the transmitted coherent states due to intrinsic quantum noise. We present a technique to combine AlphaEta with classical preprocessing, taking into account error correction for the optical channel. This enables us to establish strong provable security guarantees. In addition, the type of hybrid encryption we suggest enables trade-offs between invoking an (inexpensive) classical communication channel and a (more complex to implement) optical channel, without jeopardizing security. Our design can easily incorporate fast state-of-the-art authenticated encryption, but in this case the security analysis requires heuristic reasoning. Full article
Open Access Article
New Nonlinear Active Element Dedicated to Modeling Chaotic Dynamics with Complex Polynomial Vector Fields
Entropy 2019, 21(9), 871; https://doi.org/10.3390/e21090871 - 06 Sep 2019
Abstract
This paper describes the evolution of a new active element that significantly simplifies the design of lumped chaotic oscillators, especially if the concept of an analog computer or a state-space description is adopted. The major advantage of the proposed active device lies in the incorporation of two fundamental mathematical operations, differentiation and multiplication, into a single five-port voltage-input current-output element. The developed active device is verified in three different synthesis scenarios: circuit realization of a third-order cyclically symmetrical vector field, a hyperchaotic system based on the Lorenz equations, and fourth- and fifth-order hyperjerk functions. The mentioned cases represent complicated vector fields that cannot be implemented without utilizing many active elements. Captured oscilloscope screenshots are compared with numerically integrated trajectories to demonstrate good agreement between theory and measurement. Full article
Open Access Article
Malevich’s Suprematist Composition Picture for Spin States
Entropy 2019, 21(9), 870; https://doi.org/10.3390/e21090870 - 06 Sep 2019
Abstract
This paper proposes an alternative geometric representation of single-qudit states based on probability simplexes to describe the quantum properties of noncomposite systems. In contrast to the known high-dimensional pictures, we present a planar picture of quantum states using elementary geometry. The approach is based on the so-called Malevich square representation of the single-qubit state. It is shown that the quantum statistics of a single qudit with spin j and its observables are formally equivalent to the statistics of a classical system with N²−1 random vector variables and N²−1 classical probability distributions obeying special constraints found in this study. We present a universal inequality that describes the quantumness of single-qudit states. The inequality provides a possibility to experimentally check the entanglement of the system in terms of the classical probabilities. A simulation study for single-qutrit and ququad systems, using the Metropolis Monte Carlo method, is presented. The geometrical representation of single-qudit states presented in the paper is useful for visualizing quantum states and illustrating their difference from classical ones. Full article
(This article belongs to the Section Quantum Information)
Open Access Article
Topological Information Data Analysis
Entropy 2019, 21(9), 869; https://doi.org/10.3390/e21090869 - 06 Sep 2019
Abstract
This paper presents methods that quantify the structure of statistical interactions within a given data set and that were applied in a previous article. It establishes new results on the k-multivariate mutual information (I_k), inspired by the topological formulation of information introduced in a series of studies. In particular, we show that the vanishing of all I_k for 2 ≤ k ≤ n of n random variables is equivalent to their statistical independence. Pursuing the work of Hu Kuo Ting and Te Sun Han, we show that information functions provide coordinates for binary variables, and that they are analytically independent of the probability simplex for any set of finite variables. The maximal positive I_k identifies the variables that co-vary the most in the population, whereas the minimal negative I_k identifies synergistic clusters and the variables that differentiate–segregate the most in the population. Finite data size effects and estimation biases severely constrain the effective computation of the information topology on data, and we provide simple statistical tests for the undersampling bias and the k-dependences. We give an example application of these methods to genetic expression and unsupervised cell-type classification. The methods unravel biologically relevant subtypes, with a sample size of 41 genes and with few errors. This establishes generic basic methods to quantify epigenetic information storage and a unified epigenetic unsupervised learning formalism. We propose that higher-order statistical interactions and non-identically distributed variables are constitutive characteristics of biological systems that should be estimated in order to unravel their significant statistical structure and diversity. The topological information data analysis presented here allows for precisely estimating this higher-order structure characteristic of biological systems. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
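A minimal empirical sketch of the I_k computation described above — the alternating sum of joint entropies over all non-empty variable subsets (Hu Kuo Ting's formula) — using plug-in estimates from data rows; the function names are mine, and the finite-size bias corrections discussed in the abstract are deliberately omitted:

```python
import numpy as np
from itertools import combinations

def entropy_of(cols, data):
    """Shannon entropy (nats) of the joint distribution of the given columns,
    estimated from the empirical frequencies of the rows of `data`."""
    _, counts = np.unique(data[:, list(cols)], axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def multivariate_mi(vars_, data):
    """k-multivariate mutual information I_k as the alternating sum of joint
    entropies over all non-empty subsets of the chosen variables."""
    k = len(vars_)
    total = 0.0
    for i in range(1, k + 1):
        sign = (-1) ** (i - 1)
        for sub in combinations(vars_, i):
            total += sign * entropy_of(sub, data)
    return total
```

On the uniform XOR triple (Z = X ⊕ Y), I_2(X;Y) vanishes while I_3(X;Y;Z) = −ln 2: a negative, synergistic value of the kind the abstract associates with synergistic clusters.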
Open Access Article
Permutation Entropy and Irreversibility in Gait Kinematic Time Series from Patients with Mild Cognitive Decline and Early Alzheimer’s Dementia
Entropy 2019, 21(9), 868; https://doi.org/10.3390/e21090868 - 06 Sep 2019
Abstract
Gait is a basic cognitive, purposeful action that has been shown to be altered in late stages of neurodegenerative dementias. Nevertheless, alterations are less clear in mild forms of dementia, and the potential use of gait analysis as a biomarker of initial cognitive decline has hitherto mostly been neglected. Herein, we report the results of a study of gait kinematic time series for two groups of patients (mild cognitive impairment and mild Alzheimer’s disease) and a group of matched control subjects. Two metrics based on permutation patterns are considered, respectively measuring the complexity and the irreversibility of the time series. Results indicate that kinematic disorganisation is present in early phases of cognitive impairment; in addition, they depict a rich scenario in which some joint movements display increased complexity and irreversibility, while others show a marked decrease. Beyond their potential use as biomarkers, complexity and irreversibility metrics can open a new door to understanding the role of the nervous system in gait, as well as its adaptation and compensatory mechanisms. Full article
(This article belongs to the Special Issue Permutation Entropy: Theory and Applications)
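Of the two permutation-pattern metrics mentioned, the complexity one is typically some form of permutation entropy; a minimal sketch (not necessarily the authors' exact estimator, and the function name is mine):

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, m=3, normalize=True):
    """Shannon entropy of the ordinal (permutation) patterns of order m,
    normalized by log(m!) so that 1 means fully disordered and 0 means a
    single repeated pattern (e.g. a monotonic series)."""
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - m + 1):
        counts[tuple(np.argsort(x[i:i + m]))] += 1
    c = np.array([v for v in counts.values() if v > 0], dtype=float)
    p = c / c.sum()
    h = float(-np.sum(p * np.log(p)))
    return h / log(factorial(m)) if normalize else h
```

A monotonic series produces a single ordinal pattern (entropy 0), while white noise spreads mass over all m! patterns (entropy near 1); the irreversibility metric in the paper likewise builds on ordinal-pattern distributions, comparing forward and time-reversed series.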