
Entropy, Volume 22, Issue 7 (July 2020) – 88 articles

Cover Story (view full-size image): Data-driven methods for the analysis of complex dynamical systems have gained significant attention over the last few years. The goal is to extract global information about the system’s behavior by analyzing time-series data, which can stem from either simulations or measurements. Meanwhile, kernel methods have become a valuable tool in the field of machine learning, as they make it possible to implicitly use large feature spaces, while the resulting numerical problems are easily set up. In this study, we exploit the derivative reproducing property of sufficiently smooth kernel functions to derive kernel approximations to certain differential operators, including the Koopman generator and Schrödinger operator. We illustrate their capabilities by analyzing stochastic differential equations, basic quantum mechanics problems, and examples of manifold learning.
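The derivative reproducing property mentioned above admits a compact numerical illustration. The following sketch (a toy under assumed parameters, not code from the study) fits f(x) = sin x with a Gaussian-kernel interpolant and then differentiates the interpolant analytically through the kernel's closed-form gradient:

```python
import numpy as np

X = np.linspace(0, 2 * np.pi, 50)          # training points
y = np.sin(X)
sigma = 0.5                                # kernel bandwidth (assumed)

def k(a, b):                               # Gaussian kernel matrix
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

def dk(a, b):                              # derivative of k w.r.t. its first argument
    return -(a[:, None] - b[None, :]) / sigma ** 2 * k(a, b)

# interpolation weights (small ridge term for numerical stability)
alpha = np.linalg.solve(k(X, X) + 1e-6 * np.eye(len(X)), y)
x_test = np.array([1.0, 2.0, 3.0])
f_prime = dk(x_test, X) @ alpha            # analytic derivative of the interpolant
```

The same closed-form kernel derivatives are what allow differential operators such as the Koopman generator to be approximated directly in the kernel's feature space.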
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Multi-Label Feature Selection Based on High-Order Label Correlation Assumption
Entropy 2020, 22(7), 797; https://doi.org/10.3390/e22070797 - 21 Jul 2020
Cited by 1 | Viewed by 872
Abstract
Multi-label data often involve features with high dimensionality and complicated label correlations, resulting in a great challenge for multi-label learning. Feature selection plays an important role in multi-label learning to address multi-label data. Exploring label correlations is crucial for multi-label feature selection. Previous information-theoretic methods employ a cumulative-summation approximation to evaluate candidate features, which considers only low-order label correlations. In fact, there exist high-order label correlations in the label set: labels naturally cluster into several groups, with similar labels tending to fall into the same group and dissimilar labels into different groups. However, the cumulative-summation approximation tends to select features related to the groups containing more labels while ignoring the classification information of groups containing fewer labels. As a result, many features related to similar labels are selected, which leads to poor classification performance. To this end, a Max-Correlation term that accounts for high-order label correlations is proposed. Additionally, we combine the Max-Correlation term with a feature-redundancy term to ensure that the selected features are relevant to different label groups. Finally, a new method named Multi-label Feature Selection considering Max-Correlation (MCMFS) is proposed. Experimental results demonstrate the classification superiority of MCMFS in comparison to eight state-of-the-art multi-label feature selection methods. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
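The information-theoretic scoring that methods of this family build on can be sketched as follows (an illustrative toy with synthetic data, not the MCMFS implementation): each binary feature is scored by its empirical mutual information with a label.

```python
import numpy as np

def mutual_information(x, y):
    # Empirical mutual information (in nats) between two discrete variables.
    mi = 0.0
    for xv in set(x):
        for yv in set(y):
            p_xy = np.mean((x == xv) & (y == yv))
            p_x, p_y = np.mean(x == xv), np.mean(y == yv)
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 500)                               # synthetic label
informative = (y ^ (rng.random(500) < 0.1)).astype(int)   # noisy copy of the label
noise = rng.integers(0, 2, 500)                           # independent of the label

# An informative feature should score much higher than pure noise.
assert mutual_information(informative, y) > mutual_information(noise, y)
```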

Article
Evolution Equations for Quantum Semi-Markov Dynamics
Entropy 2020, 22(7), 796; https://doi.org/10.3390/e22070796 - 21 Jul 2020
Cited by 4 | Viewed by 862
Abstract
Using a newly introduced connection between the local and non-local description of open quantum system dynamics, we investigate the relationship between these two characterisations in the case of quantum semi-Markov processes. This class of quantum evolutions, which is a direct generalisation of the corresponding classical concept, guarantees mathematically well-defined master equations, while accounting for a wide range of phenomena, possibly in the non-Markovian regime. In particular, we analyse the emergence of a dephasing term when moving from one type of master equation to the other, by means of several examples. We also investigate the corresponding Redfield-like approximated dynamics, which are obtained after a coarse graining in time. Relying on general properties of the associated classical random process, we conclude that such an approximation always leads to a Markovian evolution for the considered class of dynamics. Full article
(This article belongs to the Special Issue Open Quantum Systems (OQS) for Quantum Technologies)

Article
Bekenstein’s Entropy Bound-Particle Horizon Approach to Avoid the Cosmological Singularity
Entropy 2020, 22(7), 795; https://doi.org/10.3390/e22070795 - 21 Jul 2020
Viewed by 849
Abstract
The cosmological singularity of infinite density, temperature, and spacetime curvature is the classical limit of Friedmann’s general relativity solutions extrapolated to the origin of the standard model of cosmology. Jacob Bekenstein suggested in a 1989 paper that thermodynamics excludes the possibility of such a singularity. We propose a re-examination of his particle horizon approach in the early radiation-dominated universe and verify it as a feasible alternative to the classical inevitability of the singularity. We argue that this minimum-radius particle horizon determined from Bekenstein’s entropy bound, necessarily quantum in nature as a quantum particle horizon (QPH), precludes the singularity, just as quantum mechanics provided the solution for singularities in atomic transitions as the radius r → 0. An initial radius of zero can never be attained quantum mechanically. This avoids the spacetime singularity, supporting Bekenstein’s assertion that Friedmann models cannot be extrapolated to the very beginning of the universe but only to a boundary that is ‘something like a particle horizon’. The universe may have begun in a bright flash and quantum flux of radiation and particles at a minimum, irreducible quantum particle horizon rather than at the classical mathematical limit and unrealizable state of an infinite singularity. Full article
(This article belongs to the Section Astrophysics, Cosmology, and Black Holes)
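For reference, Bekenstein's entropy bound invoked in this abstract takes, in its standard form from the literature (stated here for context, not quoted from the paper), the shape

$$ S \le \frac{2\pi k_B E R}{\hbar c}, $$

so that for a fixed entropy content S and energy E the radius R is bounded below by a nonzero minimum, which is the kind of minimum-radius horizon the authors argue for.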

Article
Modelling and Recognition of Protein Contact Networks by Multiple Kernel Learning and Dissimilarity Representations
Entropy 2020, 22(7), 794; https://doi.org/10.3390/e22070794 - 21 Jul 2020
Cited by 4 | Viewed by 1171
Abstract
Multiple kernel learning is a paradigm which employs a properly constructed chain of kernel functions able to simultaneously analyse different data or different representations of the same data. In this paper, we propose a hybrid classification system based on a linear combination of multiple kernels defined over multiple dissimilarity spaces. The core of the training procedure is the joint optimisation of kernel weights and representative selection in the dissimilarity spaces. This equips the system with a two-fold knowledge discovery phase: by analysing the weights, it is possible to check which representations are more suitable for solving the classification problem, whereas the pivotal patterns selected as representatives can give further insights on the modelled system, possibly with the help of field experts. The proposed classification system is tested on real proteomic data in order to predict proteins’ functional role starting from their folded structure: specifically, a set of eight representations is drawn from the graph-based protein folded description. The proposed multiple kernel-based system has also been benchmarked against a clustering-based classification system that can likewise exploit multiple dissimilarities simultaneously. Computational results show remarkable classification capabilities, and the knowledge discovery analysis is in line with current biological knowledge, suggesting the reliability of the proposed system. Full article
(This article belongs to the Special Issue Computation in Complex Networks)
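The linear kernel combination at the heart of such a system can be sketched in a few lines (illustrative only; the weights would in practice be learned jointly with the representative selection, and the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(2)
X1 = rng.normal(size=(6, 3))   # representation 1 of six patterns
X2 = rng.normal(size=(6, 5))   # representation 2 of the same patterns

def rbf(X, gamma=1.0):
    # RBF kernel matrix over one representation.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

w = np.array([0.7, 0.3])                 # kernel weights (assumed, normally learned)
K = w[0] * rbf(X1) + w[1] * rbf(X2)      # combined kernel over both spaces

# A convex combination of positive semidefinite kernels is itself a valid kernel.
assert np.linalg.eigvalsh(K).min() > -1e-8
```

Larger weights flag the representations that matter most for the classification task, which is the knowledge-discovery angle the abstract describes.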

Article
Entropy and the Second Law of Thermodynamics—The Nonequilibrium Perspective
Entropy 2020, 22(7), 793; https://doi.org/10.3390/e22070793 - 21 Jul 2020
Viewed by 1281
Abstract
An alternative to the Carnot-Clausius approach for introducing entropy and the second law of thermodynamics is outlined that establishes entropy as a nonequilibrium property from the outset. Five simple observations lead to entropy for nonequilibrium and equilibrium states, and to its balance. Thermodynamic temperature is identified, and its positivity follows from the stability of the rest state. It is shown that the equations of engineering thermodynamics are valid for the case of local thermodynamic equilibrium, with inhomogeneous states. The main findings are accompanied by examples and additional discussion to firmly embed classical and engineering thermodynamics into nonequilibrium thermodynamics. Full article
(This article belongs to the Special Issue The Foundations of Thermodynamics)

Article
MADFU: An Improved Malicious Application Detection Method Based on Features Uncertainty
Entropy 2020, 22(7), 792; https://doi.org/10.3390/e22070792 - 20 Jul 2020
Viewed by 853
Abstract
Millions of Android applications (apps) are widely used today. Meanwhile, the number of malicious apps has increased exponentially. Currently, there are many security detection technologies for Android apps, such as static detection and dynamic detection. However, the uncertainty of the features used in detection is not considered sufficiently in these technologies. Permissions play an important role in the security detection of Android apps. In this paper, a malicious application detection model based on features uncertainty (MADFU) is proposed. MADFU uses a logistic regression function to describe the relationship between the input (permissions) and the output (labels). Moreover, it uses the Markov chain Monte Carlo (MCMC) algorithm to solve the features’ uncertainty. After experimenting with 2037 samples, for malware detection, MADFU achieves an accuracy of up to 95.5%, and the false positive rate (FPR) is 1.2%. MADFU’s detection accuracy is higher than that of directly using the 24 dangerous permissions. The results also indicate a detection accuracy of 92.7% on unknown/new samples. Compared to other state-of-the-art approaches, the proposed method is more effective and efficient at detecting malware. Full article
(This article belongs to the Section Signal and Data Analysis)
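The MCMC step the abstract describes can be sketched as follows (a hedged toy: the data, prior, and sampler settings are assumptions, not MADFU's): Bayesian logistic regression on binary permission indicators with a random-walk Metropolis sampler, so that weight uncertainty is explored rather than point-estimated.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 400, 3
X = rng.integers(0, 2, size=(n, d)).astype(float)    # permission indicators
true_w = np.array([2.0, -1.0, 0.0])                  # synthetic ground truth
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ true_w)))).astype(float)

def log_post(w):
    # Bernoulli log-likelihood plus a weak Gaussian prior N(0, 10).
    z = X @ w
    return float(np.sum(y * z - np.log1p(np.exp(z))) - w @ w / 20.0)

w, samples = np.zeros(d), []
lp = log_post(w)
for _ in range(4000):
    prop = w + 0.1 * rng.normal(size=d)              # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
        w, lp = prop, lp_prop
    samples.append(w.copy())
post_mean = np.mean(samples[2000:], axis=0)          # posterior mean after burn-in
```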

Article
Information Search and Financial Markets under COVID-19
Entropy 2020, 22(7), 791; https://doi.org/10.3390/e22070791 - 20 Jul 2020
Cited by 4 | Viewed by 1597
Abstract
The discovery and sudden spread of the novel coronavirus (COVID-19) exposed individuals to a great uncertainty about the potential health and economic ramifications of the virus, which triggered a surge in demand for information about COVID-19. To understand financial market implications of individuals’ behavior upon such uncertainty, we explore the relationship between Google search queries related to COVID-19—information search that reflects one’s level of concern or risk perception—and the performance of major financial indices. The empirical analysis based on the Bayesian inference of a structural vector autoregressive model shows that a one-unit increase in the popularity of COVID-19-related global search queries, after controlling for COVID-19 cases, results in a cumulative decline of 0.038–0.069% in global financial indices after one day and a cumulative decline of 0.054–0.150% after one week. Full article

Article
A Protocol Design Paradigm for Batched Sparse Codes
Entropy 2020, 22(7), 790; https://doi.org/10.3390/e22070790 - 20 Jul 2020
Cited by 1 | Viewed by 801
Abstract
Internet of Things (IoT) connects billions of everyday objects to the Internet. The mobility of devices can be facilitated by employing multiple wireless links. However, packet loss is a common phenomenon in wireless communications, where the traditional forwarding strategy suffers severe performance issues in a multi-hop wireless network. One solution is to apply batched sparse (BATS) codes. A fundamental difference from the traditional strategy is that BATS codes require the intermediate network nodes to perform recoding, which generates recoded packets by network coding operations. The literature shows that advanced recoding schemes can enhance, and burst packet loss can diminish, the performance of BATS codes. However, the existing protocols for BATS codes cannot handle both at the same time. In this paper, we propose a paradigm of protocol design for BATS codes. Our design can be applied in different layers of the network stack and is compatible with the existing network infrastructure. The modular nature of the protocol can support different recoding techniques and different ways to handle burst packet loss. We also give some examples to demonstrate how to use the protocol. Full article
(This article belongs to the Special Issue Information Theory and Network Coding)
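The recoding operation central to BATS codes can be sketched as follows (illustrative only: GF(2) arithmetic is assumed for simplicity, whereas BATS codes typically operate over larger finite fields). An intermediate node forms new packets as random linear combinations of the packets of one batch that survived the lossy link:

```python
import numpy as np

rng = np.random.default_rng(4)
batch = rng.integers(0, 2, size=(4, 16))   # 4 received packets, 16 bits each

def recode(batch, n_out, rng):
    # Random linear recoding over GF(2): each output packet is a random
    # combination of the received packets of the batch.
    C = rng.integers(0, 2, size=(n_out, batch.shape[0]))   # random coefficients
    return C @ batch % 2

out = recode(batch, n_out=6, rng=rng)      # 6 recoded packets to forward
```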

Article
Looking at Extremes without Going to Extremes: A New Self-Exciting Probability Model for Extreme Losses in Financial Markets
Entropy 2020, 22(7), 789; https://doi.org/10.3390/e22070789 - 20 Jul 2020
Cited by 1 | Viewed by 1057
Abstract
Forecasting market risk lies at the core of modern empirical finance. We propose a new self-exciting probability peaks-over-threshold (SEP-POT) model for forecasting the extreme loss probability and the value at risk. The model draws from the point-process approach to the POT methodology but is built under a discrete-time framework. Thus, time is treated as an integer value and the days of extreme loss could occur upon a sequence of indivisible time units. The SEP-POT model can capture the self-exciting nature of extreme event arrival, and hence, the strong clustering of large drops in financial prices. The triggering effect of recent events on the probability of extreme losses is specified using a discrete weighting function based on the at-zero-truncated Negative Binomial (NegBin) distribution. The serial correlation in the magnitudes of extreme losses is also taken into consideration using the generalized Pareto distribution enriched with the time-varying scale parameter. In this way, recent events affect the size of extreme losses more than distant events. The accuracy of SEP-POT value at risk (VaR) forecasts is backtested on seven stock indexes and three currency pairs and is compared with existing well-recognized methods. The results remain in favor of our model, showing that it constitutes a real alternative for forecasting extreme quantiles of financial returns. Full article
(This article belongs to the Special Issue Complexity in Economic and Social Systems)
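The discrete-time self-exciting intensity described above can be sketched as follows (all parameter values are made up for illustration): past extreme-loss days excite today's extreme-loss probability through a zero-truncated Negative Binomial weighting of the elapsed lag.

```python
import numpy as np
from math import comb

def negbin_pmf(k, r, p):
    # Negative Binomial pmf, k = 0, 1, 2, ...
    return comb(k + r - 1, k) * (1 - p) ** r * p ** k

def weight(lag, r=2, p=0.5):
    # Zero-truncated NegBin weight of an elapsed lag (lag >= 1).
    return negbin_pmf(lag, r, p) / (1 - negbin_pmf(0, r, p))

mu, phi = 0.01, 0.4                    # baseline intensity and branching ratio
event_days = [3, 10, 11]               # past extreme-loss days (made up)
t = 15
intensity = mu + phi * sum(weight(t - s) for s in event_days)
prob_extreme = 1 - np.exp(-intensity)  # probability of an extreme loss today
```

Clustered recent events (days 10 and 11) contribute more to the intensity than the distant one (day 3), which is the clustering mechanism the abstract describes.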

Article
On Gap-Based Lower Bounding Techniques for Best-Arm Identification
Entropy 2020, 22(7), 788; https://doi.org/10.3390/e22070788 - 20 Jul 2020
Viewed by 1012
Abstract
In this paper, we consider techniques for establishing lower bounds on the number of arm pulls for best-arm identification in the multi-armed bandit problem. While a recent divergence-based approach was shown to provide improvements over an older gap-based approach, we show that the latter can be refined to match the former (up to constant factors) in many cases of interest under Bernoulli rewards, including the case that the rewards are bounded away from zero and one. Together with existing upper bounds, this indicates that the divergence-based and gap-based approaches are both effective for establishing sample complexity lower bounds for best-arm identification. Full article
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science II)
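The reason the two bounding approaches can match for Bernoulli rewards bounded away from zero and one is that the KL divergence then behaves like the squared gap up to constants. A quick numeric check (not from the paper):

```python
import numpy as np

def kl_bern(p, q):
    # KL divergence between Bernoulli(p) and Bernoulli(q), in nats.
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

p, q = 0.5, 0.4
gap_sq = (p - q) ** 2
ratio = kl_bern(p, q) / gap_sq   # bounded by constants when p, q stay in, say, [0.1, 0.9]
```

Pinsker's inequality gives the lower constant, kl_bern(p, q) >= 2 * (p - q) ** 2, and a matching upper constant holds when the means are bounded away from 0 and 1.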
Article
Decoding Analysis of Alpha Oscillation Networks on Maintaining Driver Alertness
Entropy 2020, 22(7), 787; https://doi.org/10.3390/e22070787 - 18 Jul 2020
Viewed by 961
Abstract
The countermeasure of driver fatigue is valuable for reducing the risk of accidents caused by vigilance failure during prolonged driving. Listening to the radio (RADIO) has been proven to be a relatively effective “in-car” countermeasure. However, the connectivity analysis that can be used to investigate its alerting effect is subject to the issue of signal mixing. In this study, we propose a novel framework based on clustering and entropy to improve the performance of the connectivity analysis and reveal the effect of RADIO in maintaining driver alertness. In addition to reducing signal mixing, we introduce a clustering algorithm to classify the functional connections and their nodes into different categories to mine the effective information of the alerting effect. Differential entropy (DE) is employed to measure the information content in different brain regions after clustering. Compared with the Louvain-based community detection method, the proposed method shows a superior ability to reveal the RADIO effect in confused functional connection matrices. Our experimental results reveal that the active connection clusters distinguished by the proposed method gradually move from the frontal region to the parieto-occipital region with the progress of fatigue, consistent with the alpha energy changes in these two brain areas. The active class of the clusters in the parieto-occipital region significantly decreases and the most active clusters remain in the frontal region when RADIO is applied. The estimation results of DE confirm the significant change (p < 0.05) of information content due to the cluster movements. Hence, preventing the movement of the active clusters from the frontal region to the parieto-occipital region may correlate with maintaining driver alertness. The revelation of the alerting effect is helpful for the targeted upgrade of fatigue countermeasures. Full article
(This article belongs to the Special Issue Entropy in Brain Networks)
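The differential entropy measure used above has a simple closed form under a Gaussian assumption, DE = ½ log(2πeσ²), which the following sketch (with simulated signals, not EEG data) illustrates:

```python
import numpy as np

def de_gaussian(x):
    # Differential entropy of a signal under a Gaussian assumption (nats).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

rng = np.random.default_rng(5)
quiet = rng.normal(0, 1.0, 10_000)    # low-power band activity (simulated)
active = rng.normal(0, 2.0, 10_000)   # higher-power band activity (simulated)

# Higher band power implies higher differential entropy.
assert de_gaussian(active) > de_gaussian(quiet)
```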

Article
Microstructure Evolution and Mechanical Properties of FeCoCrNiCuTi0.8 High-Entropy Alloy Prepared by Directional Solidification
Entropy 2020, 22(7), 786; https://doi.org/10.3390/e22070786 - 18 Jul 2020
Cited by 1 | Viewed by 852
Abstract
A CoCrCuFeNiTi0.8 high-entropy alloy was prepared using directional solidification techniques at different withdrawal rates (50 μm/s, 100 μm/s, 500 μm/s). The results showed that the microstructure was dendritic at all withdrawal rates. As the withdrawal rate increased, the dendrite orientation became more uniform. Additionally, the accumulation of Cr and Ti elements at the solid/liquid interface caused the formation of dendrites. Through the measurement of the primary dendrite spacing (λ1) and the secondary dendrite spacing (λ2), it was concluded that the dendrite structure was obviously refined as the withdrawal rate increased to 500 μm/s. The maximum compressive strength reached 1449.8 MPa, and the maximum hardness was 520 HV. Moreover, the plastic strain of the alloy without directional solidification was 2.11%, while that of the directionally solidified alloy was 12.57% at 500 μm/s. These results prove that directional solidification technology can effectively improve the mechanical properties of the CoCrCuFeNiTi0.8 high-entropy alloy. Full article

Article
Quantum Correlation Dynamics in Controlled Two-Coupled-Qubit Systems
Entropy 2020, 22(7), 785; https://doi.org/10.3390/e22070785 - 18 Jul 2020
Cited by 2 | Viewed by 1029
Abstract
We study and compare the time evolutions of concurrence and quantum discord in a driven system of two interacting qubits prepared in a generic Werner state. The corresponding quantum dynamics is exactly treated and manifests the appearance and disappearance of entanglement. Our analytical treatment transparently unveils the physical reasons for the occurrence of such a phenomenon, relating it to the dynamical invariance of the X structure of the initial state. The quantum correlations which asymptotically emerge in the system are investigated in detail in terms of the time evolution of the fidelity of the initial Werner state. Full article
(This article belongs to the Special Issue Quantum Entanglement)
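The concurrence of a Werner state, the entanglement measure tracked in such studies, can be computed with the standard Wootters formula; the sketch below (standard textbook material, not the paper's code) checks it against the known closed form max(0, (3p-1)/2):

```python
import numpy as np

def concurrence(rho):
    # Wootters concurrence of a two-qubit density matrix.
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)
    R = rho @ Y @ rho.conj() @ Y
    ev = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
    return max(0.0, ev[0] - ev[1] - ev[2] - ev[3])

psi_minus = np.array([0, 1, -1, 0]) / np.sqrt(2)   # singlet Bell state

def werner(p):
    # Werner state: singlet with weight p, maximally mixed remainder.
    return p * np.outer(psi_minus, psi_minus) + (1 - p) * np.eye(4) / 4

# Closed form: C = max(0, (3p - 1) / 2); separable for p <= 1/3.
assert abs(concurrence(werner(0.8)) - 0.7) < 1e-8
```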

Article
On the Capacity Regions of Degraded Relay Broadcast Channels with and without Feedback
Entropy 2020, 22(7), 784; https://doi.org/10.3390/e22070784 - 17 Jul 2020
Viewed by 738
Abstract
The four-node relay broadcast channel (RBC) is considered, in which a transmitter communicates with two receivers with the assistance of a relay node. We first investigate three types of physically degraded RBCs (PDRBCs) based on different degradation orders among the relay and the receivers’ observed signals. For the discrete memoryless (DM) case, only the capacity region of the second type of PDRBC is already known, while for the Gaussian case, only the capacity region of the first type of PDRBC is already known. In this paper, we step forward and make the following progress: (1) for the first type of DM-PDRBC, a new outer bound is established, which has the same rate expression as an existing inner bound, with only a slight difference on the input distributions; (2) for the second type of Gaussian PDRBC, the capacity region is established; (3) for the third type of PDRBC, the capacity regions are established both for DM and Gaussian cases. We also consider the RBC with relay feedback, where the relay node can send a feedback signal to the transmitter. A new coding scheme based on a hybrid relay strategy and layered Marton coding is proposed. It is shown that our scheme can strictly enlarge Behboodi and Piantanida’s rate region, which is tight for the second type of DM-PDRBC. Moreover, we show that the capacity regions of the second and third types of PDRBCs are exactly the same as those without feedback, which means feedback cannot enlarge the capacity regions for these types of RBCs. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)

Article
Do Liquidity Proxies Based on Daily Prices and Quotes Really Measure Liquidity?
Entropy 2020, 22(7), 783; https://doi.org/10.3390/e22070783 - 17 Jul 2020
Cited by 2 | Viewed by 1170
Abstract
This paper examines whether liquidity proxies based on different daily prices and quotes approximate latent liquidity. We compare percent-cost daily liquidity proxies with liquidity benchmarks as well as with realized variance estimates. Both benchmarks and volatility measures are obtained from high-frequency data. Our results show that liquidity proxies based on high-low-open-close prices are more correlated and display higher mutual information with volatility estimates than with liquidity benchmarks. The only percent-cost proxy that indicates higher dependency with liquidity benchmarks than with volatility estimates is the Closing Quoted Spread based on the last bid and ask quotes within a day. We consider different sampling frequencies for calculating realized variance and liquidity benchmarks, and find that our results are robust to this choice. Full article
(This article belongs to the Special Issue Complexity in Economic and Social Systems)
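The Closing Quoted Spread singled out above is simple to compute; a sketch with made-up quotes (the proxy definition is standard, the numbers are not from the paper):

```python
import numpy as np

def closing_quoted_spread(bid, ask):
    # Average relative bid-ask spread at the close: (ask - bid) / midquote.
    bid, ask = np.asarray(bid), np.asarray(ask)
    mid = (bid + ask) / 2
    return float(np.mean((ask - bid) / mid))

bids = [99.5, 100.1, 98.9]    # last bid quotes of three days (made up)
asks = [100.5, 100.9, 99.7]   # last ask quotes of the same days
cqs = closing_quoted_spread(bids, asks)   # a percent-cost liquidity proxy
```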

Article
Discrete Information Dynamics with Confidence via the Computational Mechanics Bootstrap: Confidence Sets and Significance Tests for Information-Dynamic Measures
Entropy 2020, 22(7), 782; https://doi.org/10.3390/e22070782 - 17 Jul 2020
Cited by 1 | Viewed by 904
Abstract
Information dynamics and computational mechanics provide a suite of measures for assessing the information- and computation-theoretic properties of complex systems in the absence of mechanistic models. However, both approaches lack a core set of inferential tools needed to make them more broadly useful for analyzing real-world systems, namely reliable methods for constructing confidence sets and hypothesis tests for their underlying measures. We develop the computational mechanics bootstrap, a bootstrap method for constructing confidence sets and significance tests for information-dynamic measures via confidence distributions using estimates of ϵ-machines inferred via the Causal State Splitting Reconstruction (CSSR) algorithm. Via Monte Carlo simulation, we compare the inferential properties of the computational mechanics bootstrap to a Markov model bootstrap. The computational mechanics bootstrap is shown to have desirable inferential properties for a collection of model systems and generally outperforms the Markov model bootstrap. Finally, we perform an in silico experiment to assess the computational mechanics bootstrap’s performance on a corpus of ϵ-machines derived from the activity patterns of fifteen thousand Twitter users. Full article
(This article belongs to the Special Issue Information Theory for Human and Social Processes)
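A plain Markov-chain bootstrap, the baseline the paper compares against (sketched here with synthetic data instead of the CSSR-based method), resamples surrogate sequences from an estimated transition matrix and forms a confidence interval for a measure such as the entropy rate:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(P, n, rng):
    # Sample a binary Markov chain with transition matrix P.
    s, out = 0, []
    for _ in range(n):
        s = int(rng.random() < P[s, 1])   # move to state 1 with prob P[s, 1]
        out.append(s)
    return np.array(out)

def transition_counts(seq):
    counts = np.ones((2, 2))              # +1 smoothing avoids log(0)
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts

def entropy_rate(seq):
    # Plug-in entropy rate (bits/symbol) of a first-order binary chain.
    counts = transition_counts(seq)
    P = counts / counts.sum(axis=1, keepdims=True)
    pi = counts.sum(axis=1) / counts.sum()
    return float(-np.sum(pi[:, None] * P * np.log2(P)))

data = simulate(np.array([[0.9, 0.1], [0.3, 0.7]]), 2000, rng)
counts = transition_counts(data)
P_hat = counts / counts.sum(axis=1, keepdims=True)
boots = [entropy_rate(simulate(P_hat, 2000, rng)) for _ in range(100)]
lo, hi = np.percentile(boots, [2.5, 97.5])   # 95% bootstrap interval
```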

Article
Entropy-Based Solutions for Ecological Inference Problems: A Composite Estimator
Entropy 2020, 22(7), 781; https://doi.org/10.3390/e22070781 - 17 Jul 2020
Viewed by 663
Abstract
Information-based estimation techniques are becoming more popular in the field of Ecological Inference. Within this branch of estimation techniques, two alternative approaches can be pointed out. The first one is the Generalized Maximum Entropy (GME) approach, based on a matrix adjustment problem where the only observable information is given by the margins of the target matrix. An alternative approach is based on a distributionally weighted regression (DWR) equation. These two approaches have so far been studied as completely separate streams, even though there are clear connections between them. In this paper we present these connections explicitly. More specifically, we show that under certain conditions the generalized cross-entropy (GCE) solution for a matrix adjustment problem and the GME estimator of a DWR equation differ only in terms of the a priori information considered. We then move a step forward and propose a composite estimator that combines the two priors considered in both approaches. Finally, we present a numerical experiment and an empirical application based on Spanish data for the year 2010. Full article
(This article belongs to the Special Issue The Statistical Foundations of Entropy)
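The matrix adjustment problem underlying the GME/GCE approach can be illustrated with classic iterative proportional fitting (used here as a simple stand-in for entropy programming; the margins and prior are made up): recover an inner matrix consistent with known row and column margins, starting from an a priori matrix.

```python
import numpy as np

def ipf(prior, row_m, col_m, iters=200):
    # Iterative proportional fitting: alternately rescale rows and columns of
    # the prior until both sets of observed margins are matched.
    X = prior.astype(float).copy()
    for _ in range(iters):
        X *= (row_m / X.sum(axis=1))[:, None]   # match row margins
        X *= col_m / X.sum(axis=0)              # match column margins
    return X

prior = np.array([[1.0, 1.0], [1.0, 2.0]])      # a priori structure (assumed)
row_m = np.array([10.0, 20.0])                  # observed row totals
col_m = np.array([12.0, 18.0])                  # observed column totals
X = ipf(prior, row_m, col_m)
```

The fitted X is the margin-consistent matrix closest to the prior in a cross-entropy sense, which is the role the a priori information plays in the GCE formulation.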
Article
Pseudo-Yang-Lee Edge Singularity Critical Behavior in a Non-Hermitian Ising Model
Entropy 2020, 22(7), 780; https://doi.org/10.3390/e22070780 - 17 Jul 2020
Cited by 1 | Viewed by 842
Abstract
The quantum phase transition of a one-dimensional transverse field Ising model in an imaginary longitudinal field is studied. A new order parameter M is introduced to describe the critical behavior near the Yang-Lee edge singularity (YLES). M does not diverge at the YLES point, a behavior different from that of other usual parameters. We term this unusual critical behavior around the YLES the pseudo-YLES. To investigate the static and driven dynamics of M, the (1+1)-dimensional ferromagnetic-paramagnetic phase transition ((1+1)D FPPT) critical region, the (0+1)D YLES critical region and the (1+1)D YLES critical region of the model are selected. Our numerical study shows that the (1+1)D FPPT, (0+1)D YLES and (1+1)D YLES scaling theories are all applicable to the critical behavior of M, demonstrating that M can be a good indicator for detecting the phase transition around the YLES. Since M takes finite values around the YLES, it is expected that M can be quantitatively measured in experiments. Full article
(This article belongs to the Section Statistical Physics)
Article
A Novel Image-Encryption Scheme Based on a Non-Linear Cross-Coupled Hyperchaotic System with the Dynamic Correlation of Plaintext Pixels
Entropy 2020, 22(7), 779; https://doi.org/10.3390/e22070779 - 17 Jul 2020
Viewed by 764
Abstract
Based on a logistic map and Feigenbaum map, we proposed a logistic Feigenbaum non-linear cross-coupled hyperchaotic map (LF-NCHM) model. Experimental verification showed that the system is a hyperchaotic system. Compared with the existing cross-coupled mapping, LF-NCHM demonstrated a wider hyperchaotic range, better ergodicity and richer dynamic behavior. A hyperchaotic sequence with the same number of image pixels was generated by LF-NCHM, and a novel image-encryption algorithm with permutation that is dynamically related to plaintext pixels was proposed. In the scrambling stage, the position of the first scrambled pixel was related to the sum of the plaintext pixel values, and the positions of the remaining scrambled pixels were related to the pixel values after the previous scrambling. The scrambling operation also had a certain diffusion effect. In the diffusion phase, using the same chaotic sequence as in the scrambling stage increased the usage rate of the hyperchaotic sequence and improved the calculation efficiency of the algorithm. A large number of experimental simulations and cryptanalyses were performed, and the results proved that the algorithm had outstanding security and extremely high encryption efficiency. In addition, LF-NCHM could effectively resist statistical analysis attacks, differential attacks and chosen-plaintext attacks. Full article
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
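The plaintext-dependent scrambling idea can be sketched with a simplified stand-in for the LF-NCHM: two cross-coupled logistic maps generate a keystream whose initial condition is perturbed by the sum of the plaintext pixels, and the argsort of the stream gives the permutation. This illustrates the mechanism only, with no security claims; the actual LF-NCHM map, key schedule and diffusion stage are in the paper:

```python
import numpy as np

def cross_coupled_stream(x0, y0, n, mu=3.99):
    """Toy keystream from two cross-coupled logistic maps: each map is updated
    from the *other* map's previous state (a stand-in for the LF-NCHM)."""
    x, y = x0, y0
    out = np.empty(n)
    for k in range(n):
        x, y = mu * y * (1.0 - y), mu * x * (1.0 - x)
        out[k] = x
    return out

def scramble(img, key=(0.123, 0.456)):
    """Permute the pixels with the chaotic stream; the initial condition is
    perturbed by the plaintext pixel sum, so each image gets its own permutation."""
    flat = img.ravel()
    shift = float(flat.sum() % 997) * 1e-9         # plaintext-dependent perturbation
    stream = cross_coupled_stream(key[0] + shift, key[1], flat.size)
    perm = np.argsort(stream)                      # chaotic ordering -> permutation
    return flat[perm].reshape(img.shape), perm

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc, perm = scramble(img)

# Decryption inverts the permutation.
dec = np.empty_like(enc).ravel()
dec[perm] = enc.ravel()
dec = dec.reshape(img.shape)
```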
Article
Lattice–Gas–Automaton Modeling of Income Distribution
Entropy 2020, 22(7), 778; https://doi.org/10.3390/e22070778 - 17 Jul 2020
Cited by 1 | Viewed by 839
Abstract
A simple and effective lattice–gas–automaton (LGA) economic model is proposed for the income distribution. It consists of four stages: random propagation, economic transaction, income tax, and charity. Two types of discrete models are introduced: the two-dimensional four-neighbor model (D2N4) and D2N8. In the former, an agent either remains motionless or travels to one of its four neighboring empty sites at random. In the latter, the agent may also travel to one of the four diagonal sites. Afterwards, an economic transaction takes place randomly when two agents are located in the nearest (plus the diagonal) neighboring sites for the D2N4 (D2N8). During the exchange, the Matthew effect can be taken into account in the sense that the rich have a higher probability of earning money than the poor. Moreover, two kinds of income tax model are incorporated: detailed taxable income brackets and rates, and a simplified tax model based on a fitted power function. Meanwhile, charity is considered under the assumption that a richer agent donates a part of its income to charity with a certain probability. Finally, the LGA economic model is validated using two kinds of benchmarks: the income distributions of individual agents and two-earner families in a free market, and the shares of total income in the USA and UK, respectively. In addition, the impacts of the Matthew effect, income tax and charity upon the redistribution of income are investigated. It is confirmed that the model has the potential to offer valuable references for formulating financial laws and regulations. Full article
(This article belongs to the Section Statistical Physics)
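In aggregate, the transaction-tax dynamics reduce to random pairwise exchanges plus uniform redistribution. Below is a deliberately stripped-down sketch (no lattice, no charity stage; all parameters invented) that conserves total wealth, includes a crude Matthew effect, and lets one measure the resulting inequality via the Gini coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_income(n_agents=1000, steps=10000, tax_rate=0.1, matthew=0.1):
    """Stripped-down exchange economy: a random pair trades 10% of the poorer
    agent's wealth, the richer party wins with probability 0.5 + matthew
    (Matthew effect), and a flat tax on each transfer is redistributed
    uniformly. Total wealth is conserved."""
    w = np.ones(n_agents)
    for _ in range(steps):
        i, j = rng.choice(n_agents, size=2, replace=False)
        rich, poor = (i, j) if w[i] >= w[j] else (j, i)
        amount = 0.1 * w[poor]
        winner = rich if rng.random() < 0.5 + matthew else poor
        loser = poor if winner == rich else rich
        w[winner] += amount
        w[loser] -= amount
        tax = tax_rate * amount            # tax the transfer ...
        w[winner] -= tax
        w += tax / n_agents                # ... and redistribute it uniformly
    return w

w = simulate_income()
# Gini coefficient of the resulting distribution.
gini = np.abs(w[:, None] - w[None, :]).mean() / (2.0 * w.mean())
```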
Erratum
Erratum: Wu, Z., Zhang, W. Fractional Refined Composite Multiscale Fuzzy Entropy of International Stock Indices. Entropy 2019, 21(9), 914
Entropy 2020, 22(7), 777; https://doi.org/10.3390/e22070777 - 16 Jul 2020
Viewed by 691
Abstract
We have found an error in the grant number given in the Funding section published in Entropy [...] Full article
Article
Blind Witnesses Quench Quantum Interference without Transfer of Which-Path Information
Entropy 2020, 22(7), 776; https://doi.org/10.3390/e22070776 - 16 Jul 2020
Viewed by 860
Abstract
Quantum computation is often limited by environmentally-induced decoherence. We examine the loss of coherence for a two-branch quantum interference device in the presence of multiple witnesses, representing an idealized environment. Interference oscillations are visible in the output as the magnetic flux through the branches is varied. Quantum double-dot witnesses are field-coupled and symmetrically attached to each branch. The global system—device and witnesses—undergoes unitary time evolution with no increase in entropy. Witness states entangle with the device state, but for these blind witnesses, which-path information is not able to be transferred to the quantum state of witnesses—they cannot “see” or make a record of which branch is traversed. The system which-path information leaves no imprint on the environment. Yet, the presence of a multiplicity of witnesses rapidly quenches quantum interference. Full article
(This article belongs to the Special Issue Physical Information and the Physical Foundations of Computation)
Article
Image-Based Methods to Investigate Synchronization between Time Series Relevant for Plasma Fusion Diagnostics
Entropy 2020, 22(7), 775; https://doi.org/10.3390/e22070775 - 16 Jul 2020
Viewed by 827
Abstract
Advanced time series analysis and causality detection techniques have been successfully applied to the assessment of synchronization experiments in tokamaks, such as Edge Localized Mode (ELM) and sawtooth pacing. Lag synchronization is a typical strategy for fusion plasma instability control by pace-making techniques. The major difficulty in evaluating the efficiency of the pacing methods is the coexistence of the causal effects with the periodic or quasi-periodic nature of the plasma instabilities. In the present work, a set of methods based on the image representation of time series is investigated as tools for evaluating the efficiency of the pace-making techniques. The main options rely on the Gramian Angular Field (GAF) and the Markov Transition Field (MTF), previously used for time series classification, and the Chaos Game Representation (CGR), employed for the visualization of large collections of long time series. The paper proposes an original variation of the Markov Transition Matrix, defined for a couple of time series. Additionally, a recently proposed method, based on the mapping of time series as cross-visibility networks and their representation as images, is included in this study. The performance of the methods is evaluated on synthetic data and applied to JET measurements. Full article
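Of the image representations mentioned, the Gramian Angular Field is the simplest to reproduce. Below is a minimal sketch of the summation variant (GASF) following the usual construction (rescale to [-1, 1], map to polar angles, form the pairwise cosine-sum matrix); conventions such as the rescaling may differ from the implementation used in the paper:

```python
import numpy as np

def gasf(series):
    """Gramian Angular Summation Field: rescale the series to [-1, 1],
    map each value to a polar angle phi = arccos(x), and form the
    image-like matrix G[i, j] = cos(phi_i + phi_j)."""
    x = np.asarray(series, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1   # rescale to [-1, 1]
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])

ts = np.sin(np.linspace(0, 2 * np.pi, 32))            # toy "diagnostic" signal
G = gasf(ts)
```

The resulting matrix is symmetric and bounded in [-1, 1], so it can be rendered directly as an image and fed to standard image-analysis tools.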
Article
Thermal Resonance and Cell Behavior
Entropy 2020, 22(7), 774; https://doi.org/10.3390/e22070774 - 16 Jul 2020
Cited by 7 | Viewed by 832
Abstract
From a thermodynamic point of view, the life of a living cell is no more than a cyclic process. It starts with the newly separated daughter cells and restarts when the next generation grows as free entities. During this cycle, the cell changes its entropy. In cancer, growth control is damaged. In this paper, we analyze the role of the volume–area ratio of the cell in relation to the heat exchange between the cell and its environment, in order to point out its effect on cancer growth. The result points to a possible control of cancer growth based on the heat exchanged by the cancer with its environment and on the variation of the membrane potential, with the consequence of controlling the ion fluxes and the related biochemical reactions. This second-law approach could represent a starting point for possible future support of anticancer therapies, in order to improve their effectiveness against untreatable cancers. Full article
(This article belongs to the Special Issue Thermodynamics of Life: Cells, Organisms and Evolution)
Article
Development of Stock Networks Using Part Mutual Information and Australian Stock Market Data
Entropy 2020, 22(7), 773; https://doi.org/10.3390/e22070773 - 15 Jul 2020
Cited by 3 | Viewed by 1106
Abstract
Complex networks are a powerful tool for discovering important information from various types of big data. Although substantial studies have been conducted on the development of stock relation networks, the correlation coefficient is dominantly used to measure the relationship between stock pairs. Information theory is much less discussed for this important topic, even though mutual information is able to measure nonlinear pairwise relationships. In this work we propose to use part mutual information for developing stock networks. The path-consistency algorithm is used to filter out redundant relationships. Using Australian stock market data, we develop four stock relation networks using different orders of part mutual information. Compared with the widely used planar maximally filtered graph (PMFG), we can generate networks with cliques of larger size. In addition, the large cliques show consistency with the structure of industrial sectors. We also analyze the connectivity and degree distributions of the generated networks. The results suggest that the proposed method is an effective approach for developing stock relation networks using information theory. Full article
(This article belongs to the Special Issue Information Theory and Economic Network)
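Part mutual information requires conditioning on the remaining variables, but the plain pairwise mutual information that underlies it is easy to sketch with a histogram (plug-in) estimator. The series and bin count below are synthetic stand-ins for stock returns, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of the mutual information I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
    nz = pxy > 0                             # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

a = rng.normal(size=5000)                    # stand-in "return" series
b = a + 0.1 * rng.normal(size=5000)          # strongly dependent on a
c = rng.normal(size=5000)                    # independent of a

mi_dep = mutual_information(a, b)
mi_ind = mutual_information(a, c)
```

Pairwise MI like this captures nonlinear dependence but not the indirect relationships that part mutual information and the path-consistency algorithm are designed to remove.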
Article
Secure Image Encryption Algorithm Based on Hyperchaos and Dynamic DNA Coding
Entropy 2020, 22(7), 772; https://doi.org/10.3390/e22070772 - 15 Jul 2020
Cited by 8 | Viewed by 980
Abstract
In this paper, we construct a five-dimensional continuous hyperchaotic system and propose an image encryption scheme based on it, adopting a DNA dynamic coding mechanism and the classical scrambling–diffusion encryption structure. In the diffusion stage, two rounds of diffusion are adopted and the rules of DNA encoding (DNA decoding) are dynamically changed according to the pixel values of the plaintext image; that is, the rules used to encrypt different images are different, which enables the algorithm to resist chosen-plaintext attacks. The encryption (decryption) key is only the initial value of the chaotic system, which overcomes the difficulty of key management in “one-time pad” encryption systems. The experimental results and security analysis show that the algorithm has a large key space, produces ciphertext with no obvious statistical characteristics, is sensitive to the plaintext and the key, and is able to resist differential attacks and chosen-plaintext attacks. It has good application prospects. Full article
(This article belongs to the Section Multidisciplinary Applications)
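The DNA dynamic coding mechanism can be illustrated as follows: each byte is split into four 2-bit pairs, and the pair-to-base mapping (one of the eight complement-respecting coding rules) is chosen per byte. In the paper the rule index is driven by the chaotic sequence and the plaintext; here a simple plaintext-derived index stands in for that selection:

```python
# The eight complement-respecting DNA coding rules (00/11 and 01/10 map to
# Watson-Crick complementary bases); the list below is one standard enumeration.
RULES = ["ACGT", "AGCT", "CATG", "CTAG", "GATC", "GTAC", "TCGA", "TGCA"]

def dna_encode(byte, rule):
    """Split a byte into four 2-bit pairs (MSB first) and map each to a base."""
    return "".join(RULES[rule][(byte >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_decode(bases, rule):
    """Invert dna_encode under the same rule."""
    v = 0
    for ch in bases:
        v = (v << 2) | RULES[rule].index(ch)
    return v

# "Dynamic" coding: the rule changes per byte. The rule index here is derived
# from the byte itself as a stand-in for the paper's chaotic selection.
plain = [0, 37, 255]
encoded = [dna_encode(b, rule=b % 8) for b in plain]
decoded = [dna_decode(s, rule=b % 8) for s, b in zip(encoded, plain)]
```

Because encode and decode must agree on the rule, a receiver can only invert the coding after regenerating the same chaotic selection sequence from the key.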
Article
Discrete-Time Fractional, Variable-Order PID Controller for a Plant with Delay
Entropy 2020, 22(7), 771; https://doi.org/10.3390/e22070771 - 14 Jul 2020
Cited by 2 | Viewed by 1020
Abstract
In this paper, we discuss the implementation and tuning algorithms of a variable-, fractional-order Proportional–Integral–Derivative (PID) controller based on Grünwald–Letnikov difference definition. All simulations are executed for the third-order plant with a delay. The results of a unit step response for all described implementations are presented in a graphical and tabular form. As the qualitative criteria, we use three different error values, which are the following: a summation of squared error (SSE), a summation of squared time weighted error (SSTE) and a summation of squared time-squared weighted error (SST2E). Besides three types of error values, obtained results are additionally evaluated on the basis of an overshoot and a rise time of the output signals achieved by systems with the designed controllers. Full article
(This article belongs to the Section Signal and Data Analysis)
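The Grünwald–Letnikov difference at the heart of the controller is defined by binomial coefficients that obey a simple recurrence, so a discrete fractional derivative can be sketched directly. For f(t) = t, the order-1 case reduces to the backward difference and the order-0.5 case should approach 2*sqrt(t/pi); this is a generic sketch of the GL operator, not the paper's PID tuning code:

```python
import numpy as np

def gl_coeffs(alpha, n):
    """Grunwald-Letnikov coefficients (-1)^j * binom(alpha, j), generated by
    the recurrence c_j = c_{j-1} * (1 - (alpha + 1) / j), c_0 = 1."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def gl_derivative(f, alpha, h):
    """GL approximation of the order-alpha derivative of the samples f,
    with the lower terminal at the first sample."""
    c = gl_coeffs(alpha, len(f))
    out = np.empty(len(f))
    for k in range(len(f)):
        out[k] = np.dot(c[:k + 1], f[k::-1]) / h ** alpha
    return out

h = 0.01
t = np.arange(0.0, 1.0, h)
d1 = gl_derivative(t, 1.0, h)      # alpha = 1 recovers the backward difference
dhalf = gl_derivative(t, 0.5, h)   # half derivative of f(t) = t is 2*sqrt(t/pi)
```

A variable-order controller changes alpha from sample to sample, which amounts to regenerating (or switching between) coefficient sequences at run time.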
Article
Intelligent Sea States Identification Based on Maximum Likelihood Evidential Reasoning Rule
Entropy 2020, 22(7), 770; https://doi.org/10.3390/e22070770 - 14 Jul 2020
Viewed by 745
Abstract
It is necessary to frequently switch the control strategies of the propulsion system according to changes in sea state in order to ensure the stability and safety of navigation. Therefore, identifying the current sea state in a timely and effective manner is of great significance for ship safety. To this end, a reasoning model based on the maximum likelihood evidential reasoning (MAKER) rule is developed to identify the propeller ventilation type, and the result is used as the basis for sea state identification. First, a data-driven MAKER model is constructed that fully considers the interdependence between the input features. Second, a genetic algorithm (GA) is used to optimize the parameters of the MAKER model in order to improve the evaluation accuracy. Finally, a simulation is built to obtain experimental data to train the MAKER model, and the validity of the model is verified. The results show that the intelligent sea state identification model based on the MAKER rule can identify the propeller ventilation type accurately and thus realize intelligent identification of sea states. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
Article
Susceptible-Infected-Susceptible Epidemic Discrete Dynamic System Based on Tsallis Entropy
Entropy 2020, 22(7), 769; https://doi.org/10.3390/e22070769 - 14 Jul 2020
Cited by 1 | Viewed by 884
Abstract
This investigation deals with a discrete dynamic system of the susceptible-infected-susceptible epidemic (SISE) using the Tsallis entropy. We investigate the positive and maximal solutions of the system, and study its stability and equilibria. Moreover, based on the Tsallis entropy, we formulate a new design for the basic reproductive ratio. Finally, we apply the results to live data regarding COVID-19. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
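The two ingredients of the paper, a discrete SIS iteration and the Tsallis entropy, can each be sketched in a few lines. Their coupling into a Tsallis-based reproductive ratio is the paper's contribution and is not reproduced here; the parameters below are arbitrary:

```python
import numpy as np

def sis_step(s, i, beta=0.3, gamma=0.1):
    """One step of a discrete SIS iteration on population fractions
    (s + i = 1): beta*s*i new infections, gamma*i recoveries back to
    the susceptible pool."""
    new_inf = beta * s * i
    rec = gamma * i
    return s - new_inf + rec, i + new_inf - rec

def tsallis_entropy(p, q=1.5):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

s, i = 0.99, 0.01
for _ in range(500):
    s, i = sis_step(s, i)
# With beta/gamma = 3 > 1 this iteration settles at the endemic point
# i* = 1 - gamma/beta = 2/3.
```

As q approaches 1 the Tsallis entropy recovers the ordinary Boltzmann-Gibbs-Shannon entropy, which is why it is used as a one-parameter generalization.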
Article
A New Multi-Attribute Emergency Decision-Making Algorithm Based on Intuitionistic Fuzzy Cross-Entropy and Comprehensive Grey Correlation Analysis
Entropy 2020, 22(7), 768; https://doi.org/10.3390/e22070768 - 14 Jul 2020
Cited by 3 | Viewed by 876
Abstract
Intuitionistic fuzzy distance measurement is an effective method for studying multi-attribute emergency decision-making (MAEDM) problems. Unfortunately, the traditional intuitionistic fuzzy distance measurement cannot accurately reflect the difference between membership and non-membership data, which can easily cause information confusion. Therefore, starting from the intuitionistic fuzzy number (IFN), this paper constructs a decision-making model based on intuitionistic fuzzy cross-entropy and a comprehensive grey correlation analysis algorithm. For MAEDM problems with completely unknown or partially known attribute weights, the method establishes a grey correlation analysis algorithm based on the objective evaluation values and the subjective preference values of the decision makers (DMs), which makes up for the information loss of traditional models and greatly improves the accuracy of MAEDM. Finally, taking the Wenchuan Earthquake of 12 May 2008 as a case study, this paper constructs and solves the problem of ranking shelters. A sensitivity analysis shows that as the grey resolution coefficient increases from 0.4 to 1.0, the ranking of shelters remains stable. Compared to the traditional intuitionistic fuzzy distance, the method is shown to be more reliable. Full article
(This article belongs to the Special Issue Data Science: Measuring Uncertainties)
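The grey correlation (grey relational) analysis step can be sketched with Deng's classical formulation: deviations from a reference sequence are turned into relational coefficients via the resolution coefficient rho and averaged into a grade per alternative. The shelter scores below are invented illustrations, not the paper's Wenchuan data:

```python
import numpy as np

def grey_relational_grades(ref, alts, rho=0.5):
    """Deng's grey relational analysis: deviation sequences from the reference,
    two-level min/max, relational coefficients with resolution coefficient rho,
    averaged into one grade per alternative (row of `alts`)."""
    D = np.abs(np.asarray(alts, float) - np.asarray(ref, float))
    dmin, dmax = D.min(), D.max()
    xi = (dmin + rho * dmax) / (D + rho * dmax)
    return xi.mean(axis=1)

ideal = np.array([1.0, 1.0, 1.0])         # ideal attribute scores (made up)
alts = np.array([[0.9, 0.8, 0.95],        # shelter A
                 [0.6, 0.7, 0.5]])        # shelter B
grades = grey_relational_grades(ideal, alts)
```

Re-running with rho anywhere in 0.4 to 1.0 leaves the A-over-B ranking unchanged, the kind of stability under the resolution coefficient that the abstract reports for its shelter ranking.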