
Entropy, Volume 23, Issue 5 (May 2021) – 106 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
Open Access Article
A Hybridization of Dragonfly Algorithm Optimization and Angle Modulation Mechanism for 0-1 Knapsack Problems
Entropy 2021, 23(5), 598; https://doi.org/10.3390/e23050598 - 12 May 2021
Viewed by 124
Abstract
The dragonfly algorithm (DA) is a new intelligent algorithm based on the theory of dragonfly foraging and evading predators. DA exhibits excellent performance in solving multimodal continuous functions and engineering problems. To make this algorithm work in the binary space, this paper introduces an angle modulation mechanism on DA (called AMDA) to generate bit strings, that is, to give alternative solutions to binary problems, and uses DA to optimize the coefficients of the trigonometric function. Further, to improve the algorithm stability and convergence speed, an improved AMDA, called IAMDA, is proposed by adding one more coefficient to adjust the vertical displacement of the cosine part of the original generating function. To test the performance of IAMDA and AMDA, 12 zero-one knapsack problems are considered along with 13 classic benchmark functions. Experimental results prove that IAMDA has a superior convergence speed and solution quality as compared to other algorithms. Full article
(This article belongs to the Special Issue Swarm Models: From Biological and Social to Artificial Systems)
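The angle-modulation trick described in this abstract is easy to illustrate in isolation. The sketch below is a minimal stand-in, not the authors' code: it decodes a small real-valued coefficient vector into a bit string with the standard four-coefficient generating function, plus an assumed fifth coefficient for the vertical shift of the cosine term that the abstract attributes to IAMDA; random search stands in for the dragonfly optimizer, and the knapsack instance is a toy.

```python
import numpy as np

def am_bitstring(coeffs, n_bits):
    """Map a real-valued coefficient vector to an n-bit string via the
    angle-modulation generating function
    g(x) = sin(2*pi*(x-a)*b*(cos(2*pi*(x-a)*c) + e)) + d,
    where e (if given) shifts the cosine part vertically, mimicking the
    extra coefficient the abstract attributes to IAMDA."""
    a, b, c, d = coeffs[:4]
    e = coeffs[4] if len(coeffs) > 4 else 0.0
    x = np.arange(n_bits)
    g = np.sin(2*np.pi*(x - a) * b * (np.cos(2*np.pi*(x - a) * c) + e)) + d
    return (g > 0).astype(int)

# A continuous optimizer (DA in the paper, plain random search here) evolves
# the coefficients; fitness is evaluated on the decoded bit string.
rng = np.random.default_rng(0)
values  = np.array([10, 7, 4, 9, 3])          # toy 0-1 knapsack instance
weights = np.array([4, 3, 2, 5, 2]); cap = 9

def fitness(bits):
    w = weights @ bits
    return values @ bits if w <= cap else 0   # infeasible -> zero value

best = max((rng.uniform(-1, 1, 5) for _ in range(2000)),
           key=lambda c: fitness(am_bitstring(c, len(values))))
print(am_bitstring(best, len(values)), fitness(am_bitstring(best, len(values))))
```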
Open Access Article
Light-Front Field Theory on Current Quantum Computers
Entropy 2021, 23(5), 597; https://doi.org/10.3390/e23050597 - 12 May 2021
Viewed by 134
Abstract
We present a quantum algorithm for simulation of the quantum field theory in light-front formulation and demonstrate how existing quantum devices can be used to study the structure of bound states in relativistic nuclear physics. Specifically, we apply the Variational Quantum Eigensolver algorithm to find the ground state of the light-front Hamiltonian obtained within the Basis Light-Front Quantization (BLFQ) framework. The BLFQ formulation of the quantum field theory allows one to readily import techniques developed for digital quantum simulation of quantum chemistry. This provides a method that can be scaled up to the simulation of full, relativistic quantum field theories in the quantum advantage regime. As an illustration, we calculate the mass, mass radius, decay constant, electromagnetic form factor, and charge radius of the pion on the IBM Vigo chip. This is the first time that the light-front approach to the quantum field theory has been used to enable simulation of a real physical system on a quantum computer. Full article
(This article belongs to the Special Issue Noisy Intermediate-Scale Quantum Technologies (NISQ))
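The variational principle behind the Variational Quantum Eigensolver can be previewed classically. The toy sketch below is not the authors' BLFQ setup and involves no quantum hardware: it simply minimizes the energy expectation of a small Hermitian matrix over a one-parameter ansatz, which is the same idea VQE implements with a parameterized quantum circuit.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy 2x2 Hermitian matrix standing in for a truncated light-front Hamiltonian.
H = np.array([[1.0, 0.4],
              [0.4, -0.6]])

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for the ansatz
    |psi> = (cos theta, sin theta)."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return psi @ H @ psi

res = minimize_scalar(energy, bounds=(0, np.pi), method="bounded")
exact = np.linalg.eigvalsh(H)[0]
print(f"variational minimum {res.fun:.6f} vs exact ground state {exact:.6f}")
```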
Open Access Article
Sentiment Analysis of Persian Movie Reviews Using Deep Learning
Entropy 2021, 23(5), 596; https://doi.org/10.3390/e23050596 - 12 May 2021
Viewed by 260
Abstract
Sentiment analysis aims to automatically classify the subject’s sentiment (e.g., positive, negative, or neutral) towards a particular aspect such as a topic, product, movie, news, etc. Deep learning has recently emerged as a powerful machine learning technique to tackle the growing demand for accurate sentiment analysis. However, the majority of research efforts are devoted to the English language only, while information of great importance is also available in other languages. This paper presents a novel, context-aware, deep-learning-driven, Persian sentiment analysis approach. Specifically, the proposed deep-learning-driven automated feature-engineering approach classifies Persian movie reviews as having positive or negative sentiments. Two deep learning algorithms, convolutional neural networks (CNN) and long short-term memory (LSTM), are applied and compared with our previously proposed manual-feature-engineering-driven, SVM-based approach. Simulation results demonstrate that LSTM obtained better performance than multilayer perceptron (MLP), autoencoder, support vector machine (SVM), logistic regression and CNN algorithms. Full article
Open Access Article
Uncertainty Relation between Detection Probability and Energy Fluctuations
Entropy 2021, 23(5), 595; https://doi.org/10.3390/e23050595 - 11 May 2021
Viewed by 162
Abstract
A classical random walker starting on a node of a finite graph will always reach any other node since the search is ergodic, namely it fully explores space, hence the arrival probability is unity. For quantum walks, destructive interference may induce effectively non-ergodic features in such search processes. Under repeated projective local measurements, made on a target state, the final detection of the system is not guaranteed since the Hilbert space is split into a bright subspace and an orthogonal dark one. Using this we find an uncertainty relation for the deviations of the detection probability from its classical counterpart, in terms of the energy fluctuations. Full article
(This article belongs to the Special Issue Axiomatic Approaches to Quantum Mechanics)
Open Access Article
Mimicking Complexity of Structured Data Matrix’s Information Content: Categorical Exploratory Data Analysis
Entropy 2021, 23(5), 594; https://doi.org/10.3390/e23050594 - 11 May 2021
Viewed by 156
Abstract
We develop Categorical Exploratory Data Analysis (CEDA) with mimicking to explore and exhibit the complexity of information content that is contained within any data matrix: categorical, discrete, or continuous. Such complexity is shown through visible and explainable serial multiscale structural dependency with heterogeneity. CEDA is developed upon all features’ categorical nature via histograms, and it is guided by all features’ associative patterns (order-2 dependence) in a mutual conditional entropy matrix. Higher-order structural dependency of k (≥3) features is exhibited through block patterns within heatmaps that are constructed by permuting contingency-kD-lattices of counts. By growing k, the resultant heatmap series contains global and large scales of structural dependency that constitute the data matrix’s information content. When involving continuous features, the principal component analysis (PCA) extracts fine-scale information content from each block in the final heatmap. Our mimicking protocol coherently simulates this heatmap series by preserving global-to-fine-scale structural dependency. At every step of the mimicking process, each accepted simulated heatmap is subject to constraints with respect to all of the reliable observed categorical patterns. For reliability and robustness in sciences, CEDA with mimicking enhances data visualization by revealing deterministic and stochastic structures within each scale-specific structural dependency. For inferences in Machine Learning (ML) and Statistics, it clarifies on which scales which covariate feature-groups have major vs. minor predictive powers on response features. For the social justice of Artificial Intelligence (AI) products, it checks whether a data matrix incompletely prescribes the targeted system. Full article
(This article belongs to the Special Issue Information Complexity in Structured Data)
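As a rough illustration of the order-2 associative patterns mentioned in the abstract, the sketch below computes one plausible form of a mutual conditional entropy matrix for categorical features; the exact definition used in the paper may differ, and the demo data are synthetic.

```python
import numpy as np
import pandas as pd

def cond_entropy(x, y):
    """H(x | y) in bits for two categorical series."""
    joint = pd.crosstab(x, y).to_numpy().astype(float)
    p_xy = joint / joint.sum()
    p_y = p_xy.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        return -np.nansum(p_xy * np.log2(p_xy / p_y))

def mutual_cond_entropy_matrix(df):
    """Symmetric matrix of averaged conditional entropies between all
    (categorized) feature pairs -- one plausible reading of the
    'mutual conditional entropy matrix' that guides CEDA."""
    cols = df.columns
    m = np.zeros((len(cols), len(cols)))
    for i, a in enumerate(cols):
        for j, b in enumerate(cols):
            if i != j:
                m[i, j] = 0.5 * (cond_entropy(df[a], df[b]) +
                                 cond_entropy(df[b], df[a]))
    return pd.DataFrame(m, index=cols, columns=cols)

# Continuous features would first be binned into histogram categories.
rng = np.random.default_rng(1)
demo = pd.DataFrame({"A": rng.integers(0, 3, 500),
                     "B": rng.integers(0, 4, 500)})
demo["C"] = (demo["A"] + rng.integers(0, 2, 500)) % 3   # depends on A
print(mutual_cond_entropy_matrix(demo).round(3))
```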
Open Access Article
Spectral and Informational Analysis of Temperature and Chemical Composition of Solfatara Fumaroles (Campi Flegrei, Italy)
Entropy 2021, 23(5), 593; https://doi.org/10.3390/e23050593 - 11 May 2021
Viewed by 141
Abstract
Temperature and composition at fumaroles are controlled by several volcanic and exogenous processes that operate on various time-space scales. Here, we analyze fluctuations of temperature and chemical composition recorded at fumarolic vents in Solfatara (Campi Flegrei caldera, Italy) from December 1997 to December 2015, in order to better understand their source(s) and driving processes. Applying singular spectrum analysis, we found that the trends explain a large part of the variance of the geochemical series but not of the temperature series. On the other hand, a common source, also shared by other geo-indicators (ground deformation, seismicity, hydrogeological and meteorological data), seems to be linked with the oscillatory structure of the investigated signals. The informational characteristics of temperature and geochemical compositions, analyzed by using the Fisher–Shannon method, appear to be a sort of fingerprint of the different periodic structure. In fact, the oscillatory components were characterized by a wide range of significant, nearly equally powerful periodicities that show a higher degree of entropy, indicating that changes are influenced by overlapping processes occurring at different scales with a rather similar intensity. The present study represents an advancement in the understanding of the dominant driving mechanisms of volcanic signals at fumaroles, which might also be valid for other volcanic areas. Full article
(This article belongs to the Section Signal and Data Analysis)
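For readers unfamiliar with singular spectrum analysis, the sketch below shows the basic embed–SVD–reconstruct pipeline used to separate a trend from oscillatory components; it is a generic textbook implementation applied to synthetic data, not the authors' processing chain.

```python
import numpy as np

def ssa_components(x, window, n_components):
    """Basic singular spectrum analysis: embed the series into a trajectory
    matrix, take its SVD, and reconstruct the leading components by
    diagonal (Hankel) averaging. Grouping of components is left to the user."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for r in range(n_components):
        Xr = s[r] * np.outer(U[:, r], Vt[r])                    # rank-1 piece
        # anti-diagonal averaging back to a series of length n
        comp = np.array([np.mean(Xr[::-1, :].diagonal(i - (window - 1)))
                         for i in range(n)])
        comps.append(comp)
    return np.array(comps)

# Synthetic monthly-like series: trend + annual cycle + noise (illustrative only).
t = np.arange(216)
x = 0.02 * t + np.sin(2 * np.pi * t / 12) \
    + 0.3 * np.random.default_rng(2).normal(size=t.size)
trend = ssa_components(x, window=36, n_components=1)[0]
print(np.corrcoef(trend, 0.02 * t)[0, 1])   # leading component tracks the trend
```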
Open Access Article
EEG Fractal Analysis Reflects Brain Impairment after Stroke
Entropy 2021, 23(5), 592; https://doi.org/10.3390/e23050592 - 11 May 2021
Viewed by 139
Abstract
Stroke is the commonest cause of disability. Novel treatments require an improved understanding of the underlying mechanisms of recovery. Fractal approaches have demonstrated that a single metric can describe the complexity of seemingly random fluctuations of physiological signals. We hypothesize that fractal algorithms applied to electroencephalographic (EEG) signals may track brain impairment after stroke. Sixteen stroke survivors were studied in the hyperacute (<48 h) and acute (∼1 week after stroke) phases, and 35 stroke survivors during the early subacute phase (from 8 to 32 days and after ∼2 months post-stroke). We compared resting-state EEG fractal measures (i.e., Higuchi index, tortuosity) with those of 11 healthy controls. Both Higuchi index and tortuosity values were significantly lower after a stroke throughout the acute and early subacute stages compared to healthy subjects, reflecting brain activity that is significantly less complex. These indices may be promising metrics to track behavioral changes in the very early stage after stroke. Our findings might contribute to the neurorehabilitation quest in identifying reliable biomarkers for a better tailoring of rehabilitation pathways. Full article
(This article belongs to the Special Issue Fractal and Multifractal Analysis of Complex Networks)
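The Higuchi index mentioned above is straightforward to compute. The sketch below is a standard Higuchi fractal dimension estimator (a generic implementation, not the authors' code); white noise should come out near 2 and a Brownian path near 1.5.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension of a 1-D signal: average curve lengths L(k)
    over decimated sub-series and fit log L(k) ~ -D log k."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.sum(np.abs(np.diff(x[idx])))
            norm = (n - 1) / ((len(idx) - 1) * k)   # Higuchi normalization
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    slope, _ = np.polyfit(np.log(ks), np.log(lengths), 1)
    return -slope

rng = np.random.default_rng(3)
print(higuchi_fd(rng.normal(size=2000)))            # white noise: close to 2
print(higuchi_fd(np.cumsum(rng.normal(size=2000)))) # Brownian path: close to 1.5
```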
Open Access Article
Pulse Coupled Neural Network-Based Multimodal Medical Image Fusion via Guided Filtering and WSEML in NSCT Domain
Entropy 2021, 23(5), 591; https://doi.org/10.3390/e23050591 - 11 May 2021
Viewed by 182
Abstract
Multimodal medical image fusion aims to fuse images with complementary multisource information. In this paper, we propose a novel multimodal medical image fusion method using a pulse coupled neural network (PCNN) and a weighted sum of eight-neighborhood-based modified Laplacian (WSEML) integrating guided image filtering (GIF) in the non-subsampled contourlet transform (NSCT) domain. Firstly, the source images are decomposed by NSCT, and several low- and high-frequency sub-bands are generated. Secondly, the PCNN-based fusion rule is used to process the low-frequency components, and the GIF-WSEML fusion model is used to process the high-frequency components. Finally, the fused image is obtained by integrating the fused low- and high-frequency sub-bands. The experimental results demonstrate that the proposed method can achieve better performance in terms of multimodal medical image fusion. The proposed algorithm also has clear advantages in the objective evaluation indices VIFF, QW, API, SD, and EN, as well as in time consumption. Full article
(This article belongs to the Special Issue Advances in Image Fusion)
Open Access Article
Multiple Observations for Secret-Key Binding with SRAM PUFs
Entropy 2021, 23(5), 590; https://doi.org/10.3390/e23050590 - 11 May 2021
Viewed by 217
Abstract
We present a new Multiple-Observations (MO) helper data scheme for secret-key binding to an SRAM-PUF. This MO scheme binds a single key to multiple enrollment observations of the SRAM-PUF. Performance is improved in comparison to classic schemes which generate helper data based on a single enrollment observation. The performance increase can be explained by the fact that the reliabilities of the different SRAM cells are modeled (implicitly) in the helper data. We prove that the scheme achieves secret-key capacity for any number of enrollment observations, and, therefore, it is optimal. We evaluate performance of the scheme using Monte Carlo simulations, where an off-the-shelf LDPC code is used to implement the linear error-correcting code. Another scheme that models the reliabilities of the SRAM cells is the so-called Soft-Decision (SD) helper data scheme. The SD scheme considers the one-probabilities of the SRAM cells as an input, which in practice are not observable. We present a new strategy for the SD scheme that considers the binary SRAM-PUF observations as an input instead and show that the new strategy is optimal and achieves the same reconstruction performance as the MO scheme. Finally, we present a variation on the MO helper data scheme that updates the helper data sequentially after each successful reconstruction of the key. As a result, the error-correcting performance of the scheme is improved over time. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
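For context, the classic single-enrollment code-offset construction that the MO scheme improves upon can be sketched in a few lines; the repetition code below stands in for the off-the-shelf LDPC code used in the paper, and the cell-flip probability is an arbitrary illustrative value.

```python
import numpy as np

rng = np.random.default_rng(4)
R = 5                                    # repetition factor (stand-in for LDPC)

def encode(key_bits):                    # repetition code: repeat each key bit R times
    return np.repeat(key_bits, R)

def decode(codeword):                    # majority vote per key bit
    return (codeword.reshape(-1, R).sum(axis=1) > R // 2).astype(int)

# Enrollment: bind a key to one SRAM-PUF observation (code-offset construction).
puf_enroll = rng.integers(0, 2, 200)     # reference SRAM start-up values
key = rng.integers(0, 2, 200 // R)
helper = encode(key) ^ puf_enroll        # public helper data

# Reconstruction: a noisy re-measurement flips a few unreliable cells.
noise = rng.random(200) < 0.03
puf_noisy = puf_enroll ^ noise
key_hat = decode(helper ^ puf_noisy)
print("key recovered:", np.array_equal(key, key_hat))
```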
Open Access Article
On Explaining Quantum Correlations: Causal vs. Non-Causal
Entropy 2021, 23(5), 589; https://doi.org/10.3390/e23050589 - 10 May 2021
Viewed by 167
Abstract
At the basis of the problem of explaining non-local quantum correlations lies the tension between two factors: on the one hand, the natural interpretation of correlations as the manifestation of a causal relation; on the other, the resistance on the part of the physics underlying said correlations to adjust to the most essential features of a pre-theoretic notion of causation. In this paper, I argue for the rejection of the first horn of the dilemma, i.e., the assumption that quantum correlations call for a causal explanation. The paper is divided into two parts. The first, destructive, part provides a critical overview of the enterprise of causally interpreting non-local quantum correlations, with the aim of warning against the temptation of an account of causation claiming to cover such correlations ‘for free’. The second, constructive, part introduces the so-called structural explanation (a variety of non-causal explanation that shows how the explanandum is the manifestation of a fundamental structure of the world) and argues that quantum correlations might be explained structurally in the context of an information-theoretic approach to QT. Full article
(This article belongs to the Special Issue Time, Causality, and Entropy)
Open Access Article
Training Data Selection and Optimal Sensor Placement for Deep-Learning-Based Sparse Inertial Sensor Human Posture Reconstruction
Entropy 2021, 23(5), 588; https://doi.org/10.3390/e23050588 - 10 May 2021
Viewed by 174
Abstract
Although commercial motion-capture systems have been widely used in various applications, the complex setup limits their application scenarios for ordinary consumers. To overcome the drawbacks of wearability, human posture reconstruction based on a few wearable sensors has been actively studied in recent years. In this paper, we propose a deep-learning-based sparse inertial sensor human posture reconstruction method. This method uses a bidirectional recurrent neural network (Bi-RNN) to build an a priori model of human motion from a large motion dataset, through which low-dimensional motion measurements are mapped to whole-body posture. To improve the motion reconstruction performance for specific application scenarios, two fundamental problems in the model construction are investigated: training data selection and sparse sensor placement. The deep-learning training data selection problem is to select, from the accumulated imbalanced motion dataset, independent and identically distributed (IID) data with sufficient information for a certain scenario. We formulate the data selection into an optimization problem to obtain continuous and IID data segments, which comply with a small reference dataset collected from the target scenario. A two-step heuristic algorithm is proposed to solve the data selection problem. On the other hand, the optimal sensor placement problem is studied to extract the most information from partial observations of human movement. A method for evaluating the motion information amount of any group of wearable inertial sensors based on mutual information is proposed, and a greedy search method is adopted to obtain an approximately optimal sensor placement for a given number of sensors, so that maximum motion information and minimum redundancy are achieved. Finally, the human posture reconstruction performance is evaluated with different training data and sensor placement selection methods, and experimental results show that the proposed method has advantages in both posture reconstruction accuracy and model training time. In the six-sensor configuration, the posture reconstruction errors of our model for walking, running, and playing basketball are 7.25, 8.84, and 14.13, respectively. Full article
Open Access Article
Unveiling Operator Growth Using Spin Correlation Functions
Entropy 2021, 23(5), 587; https://doi.org/10.3390/e23050587 - 10 May 2021
Viewed by 184
Abstract
In this paper, we study non-equilibrium dynamics induced by a sudden quench of strongly correlated Hamiltonians with all-to-all interactions. By relying on a Sachdev-Ye-Kitaev (SYK)-based quench protocol, we show that the time evolution of simple spin-spin correlation functions is highly sensitive to the degree of k-locality of the corresponding operators, once an appropriate set of fundamental fields is identified. By tracking the time-evolution of specific spin-spin correlation functions and their decay, we argue that it is possible to distinguish between operator-hopping and operator growth dynamics; the latter being a hallmark of quantum chaos in many-body quantum systems. Such an observation, in turn, could constitute a promising tool to probe the emergence of chaotic behavior, rather accessible in state-of-the-art quench setups. Full article
(This article belongs to the Special Issue Entropy and Complexity in Quantum Dynamics)
Open Access Editorial
Artificial Intelligence and Computational Methods in the Modeling of Complex Systems
Entropy 2021, 23(5), 586; https://doi.org/10.3390/e23050586 - 10 May 2021
Viewed by 176
Abstract
Based on the increased attention, the Special Issue aims to investigate the modeling of complex systems using artificial intelligence and computational methods [...] Full article
Open Access Article
Self-Assembled Structures of Colloidal Dimers and Disks on a Spherical Surface
Entropy 2021, 23(5), 585; https://doi.org/10.3390/e23050585 - 09 May 2021
Viewed by 253
Abstract
We study self-assembly on a spherical surface of a model for a binary mixture of amphiphilic dimers in the presence of guest particles via Monte Carlo (MC) computer simulation. All particles had a hard core, but one monomer of the dimer also interacted with the guest particle by means of a short-range attractive potential. We observed the formation of aggregates of various shapes as a function of the composition of the mixture and of the size of guest particles. Our MC simulations are a further step towards a microscopic understanding of experiments on colloidal aggregation over curved surfaces, such as oil droplets. Full article
(This article belongs to the Special Issue Statistical Mechanics and Thermodynamics of Liquids and Crystals)
Open Access Article
Representation of the Universe as a Dendrogramic Hologram Endowed with Relational Interpretation
Entropy 2021, 23(5), 584; https://doi.org/10.3390/e23050584 - 08 May 2021
Viewed by 251
Abstract
A proposal for a fundamental theory is described in which classical and quantum physics are unified in a representation of the universe as a gigantic dendrogram. The latter is the explicate order structure corresponding to the purely number-theoretical implicate order structure given by p-adic numbers. This number field is zero-dimensional, totally disconnected, and disordered. Physical systems (such as electrons, photons) are sub-dendrograms of the universal dendrogram. The measurement process is described as interactions among dendrograms; in particular, quantum measurement problems can be resolved using this process. The theory is realistic, but realism is expressed via the Leibniz principle of the Identity of Indiscernibles. The classical-quantum interplay is based on the degree of indistinguishability between dendrograms (in which the ergodicity assumption is removed). Depending on this degree, some physical quantities behave more or less in a quantum manner (versus a classical manner). Conceptually, our theory is very close to Smolin’s dynamics of difference and Rovelli’s relational quantum mechanics. The presence of classical behavior in nature implies the finiteness of the Universe-dendrogram. (An infinite Universe is considered to be purely quantum.) Reconstruction of events in four-dimensional space-time is based on the holographic principle. Our model reproduces Bell-type correlations in the dendrogramic framework. By adjusting dendrogram complexity, violation of the Bell inequality can be made larger or smaller. Full article
(This article belongs to the Special Issue The Philosophy of Quantum Physics)
Open Access Article
Implications of Noise on Neural Correlates of Consciousness: A Computational Analysis of Stochastic Systems of Mutually Connected Processes
Entropy 2021, 23(5), 583; https://doi.org/10.3390/e23050583 - 08 May 2021
Viewed by 201
Abstract
Random fluctuations in neuronal processes may contribute to variability in perception and increase the information capacity of neuronal networks. Various sources of random processes have been characterized in the nervous system on different levels. However, in the context of neural correlates of consciousness, the robustness of mechanisms of conscious perception against inherent noise in neural dynamical systems is poorly understood. In this paper, a stochastic model is developed to study the implications of noise on dynamical systems that mimic neural correlates of consciousness. We computed power spectral densities and spectral entropy values for dynamical systems that contain a number of mutually connected processes. Interestingly, we found that spectral entropy decreases linearly as the number of processes within the system doubles. Further, power spectral density frequencies shift to higher values as system size increases, revealing an increasing impact of negative feedback loops and regulations on the dynamics of larger systems. Overall, our stochastic modeling and analysis results reveal that large dynamical systems of mutually connected and negatively regulated processes are more robust against inherent noise than small systems. Full article
(This article belongs to the Special Issue Integrated Information Theory and Consciousness)
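Spectral entropy, the main quantity tracked in this study, is simply the Shannon entropy of a normalized power spectral density. A generic sketch follows (not the authors' model); a broadband signal scores near 1 after normalization, while a strongly rhythmic one scores much lower.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs, normalize=True):
    """Shannon entropy of the normalized Welch power spectral density."""
    f, pxx = welch(x, fs=fs, nperseg=min(1024, len(x)))
    p = pxx / pxx.sum()
    n_bins = len(p)
    p = p[p > 0]
    h = -np.sum(p * np.log2(p))
    return h / np.log2(n_bins) if normalize else h

fs = 250.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(5)
broadband = rng.normal(size=t.size)                         # noise-dominated process
rhythmic = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.normal(size=t.size)
print(spectral_entropy(broadband, fs), spectral_entropy(rhythmic, fs))
```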
Open Access Article
A Two-Stage Hybrid Default Discriminant Model Based on Deep Forest
Entropy 2021, 23(5), 582; https://doi.org/10.3390/e23050582 - 08 May 2021
Viewed by 192
Abstract
Background: the credit scoring model is an effective tool for banks and other financial institutions to distinguish potential default borrowers. The credit scoring model represented by machine learning methods such as deep learning performs well in terms of the accuracy of default discrimination, but the model itself also has many shortcomings, such as many hyperparameters and a heavy dependence on big data. There is still a lot of room to improve its interpretability and robustness. Methods: the deep forest, or multi-Grained Cascade Forest (gcForest), is a decision-tree-based deep model built on the random forest algorithm. Using multidimensional scanning and cascading processing, gcForest can effectively identify and process high-dimensional feature information. At the same time, gcForest has fewer hyperparameters and strong robustness. Therefore, this paper constructs a two-stage hybrid default discrimination model based on multiple feature selection methods and the gcForest algorithm, and it optimizes the parameters with the lowest type II error as the first principle and the highest AUC and accuracy as the second and third principles. GcForest can not only reflect the advantages of traditional statistical models in terms of interpretability and robustness but also take into account the advantages of deep learning models in terms of accuracy. Results: the validity of the hybrid default discrimination model is verified on three real, open credit datasets (Australian, Japanese, and German) from the UCI database. Conclusions: the performance of gcForest is better than that of currently popular single classifiers such as ANN, common ensemble classifiers such as LightGBM, and CNNs in terms of type II error, AUC, and accuracy. Moreover, in comparison with other similar research results, the robustness and effectiveness of this model are further verified. Full article
Open Access Review
The Phase Space Model of Nonrelativistic Quantum Mechanics
Entropy 2021, 23(5), 581; https://doi.org/10.3390/e23050581 - 08 May 2021
Viewed by 236
Abstract
We focus on several questions arising during the modelling of quantum systems on a phase space. First, we discuss the choice of phase space and its structure. We include an interesting case of discrete phase space. Then, we introduce the respective algebras of functions containing quantum observables. We also consider the possibility of performing strict calculations and indicate cases where only formal considerations can be performed. We analyse alternative realisations of strict and formal calculi, which are determined by different kernels. Finally, two classes of Wigner functions as representations of states are investigated. Full article
(This article belongs to the Special Issue Quantum Mechanics and Its Foundations)
Open Access Article
Word and Face Recognition Processing Based on Response Times and Ex-Gaussian Components
Entropy 2021, 23(5), 580; https://doi.org/10.3390/e23050580 - 08 May 2021
Viewed by 181
Abstract
The face is a fundamental feature of our identity. In humans, the existence of specialized processing modules for faces is now widely accepted. However, identifying the processes involved for proper names is more problematic. The aim of the present study is to examine which of the two processes occurs earlier and whether social abilities have an influence. We selected 100 university students divided into two groups: Spanish and USA students. They had to recognize famous faces or names by using a masked priming task. An analysis of variance of the reaction times (RT) was used to determine whether significant differences could be observed in word or face recognition and between the Spanish and USA groups. Additionally, to examine the role of outliers, an exponentially modified Gaussian (ex-Gaussian) distribution was used. Famous faces were recognized faster than names, and differences were observed between Spanish and North American participants, but not for unknown distracting faces. The current results suggest that face processing might be faster than name recognition, which supports the idea that the two differ in processing nature. Full article
(This article belongs to the Special Issue Information Propagation in Psychological Networks)
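The ex-Gaussian decomposition used above is available off the shelf: SciPy's exponnorm distribution is the exponentially modified Gaussian, so the usual mu/sigma/tau parameters can be recovered by maximum likelihood. A minimal sketch with simulated response times (illustrative values only, not the study's data):

```python
import numpy as np
from scipy.stats import exponnorm

# Simulate response times (ms) as Gaussian + exponential tail, then recover
# the ex-Gaussian parameters (mu, sigma, tau) by maximum likelihood.
rng = np.random.default_rng(6)
mu, sigma, tau = 550.0, 60.0, 120.0
rt = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

K, loc, scale = exponnorm.fit(rt)      # scipy parameterization: tau = K * scale
print({"mu": loc, "sigma": scale, "tau": K * scale})
```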
Open Access Article
A Theoretical Perspective of the Photochemical Potential in the Spectral Performance of Photovoltaic Cells
Entropy 2021, 23(5), 579; https://doi.org/10.3390/e23050579 - 08 May 2021
Viewed by 159
Abstract
We present a novel theoretical approach to the problem of light energy conversion in thermostated semiconductor junctions. Using the classical model of a two-level atom, we deduced formulas for the spectral response and the quantum efficiency in terms of the input photons’ non-zero chemical potential. We also calculated the spectral entropy production and the global efficiency parameter in the thermodynamic limit. The heat transferred to the thermostat results in a dissipative loss that appreciably controls the spectral quantities’ behavior and, therefore, the cell’s performance. The application of the obtained formulas to data extracted from photovoltaic cells enabled us to accurately interpolate experimental data for the spectral response and the quantum efficiency of cells based on Si, GaAs, and CdTe, among others. Full article
(This article belongs to the Special Issue Entropy-based Methods in In and Out of Equilibrium Systems)
Open Access Article
Hybrid Control of Digital Baker Map with Application to Pseudo-Random Number Generator
Entropy 2021, 23(5), 578; https://doi.org/10.3390/e23050578 - 08 May 2021
Viewed by 198
Abstract
Dynamical degradation occurs when chaotic systems are implemented on digital devices, which seriously threatens the security of chaos-based cryptosystems. The existing solutions mainly focus on the compensation of dynamical properties rather than on the elimination of the inherent biases of chaotic systems. In this paper, a unidirectional hybrid control method is proposed to improve the dynamical properties and to eliminate the biases of digital chaotic maps. A continuous chaotic system is introduced to provide external feedback control of the given digital chaotic map. Three different control modes are investigated, and the influence of control parameter on the properties of the controlled system is discussed. The experimental results show that the proposed method can not only improve the dynamical degradation of the digital chaotic map but also make the controlled digital system produce outputs with desirable performances. Finally, a pseudorandom number generator (PRNG) is proposed. Statistical analysis shows that the PRNG has good randomness and almost ideal entropy values. Full article
(This article belongs to the Special Issue Nonlinear Dynamics and Analysis)
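The dynamical degradation referred to above is easy to reproduce: once a chaotic map is quantized to finite precision, every trajectory collapses onto a short cycle. The sketch below uses the logistic map rather than the baker map purely for brevity, and it illustrates the problem only, not the hybrid control remedy proposed in the paper.

```python
def logistic_fixed(x_int, bits):
    """One step of x -> 4x(1-x) on a 'bits'-bit fixed-point grid (integers 0..2**bits)."""
    scale = 1 << bits
    x = x_int / scale
    return round(4 * x * (1 - x) * scale)

def cycle_length(bits, x0=0.37, max_iter=1_000_000):
    """Iterate until a state repeats and return the period of the reached cycle."""
    seen, state = {}, round(x0 * (1 << bits))
    for i in range(max_iter):
        if state in seen:
            return i - seen[state]
        seen[state] = i
        state = logistic_fixed(state, bits)

# Cycle lengths stay tiny compared with the 2**bits available states,
# which is exactly the degradation a chaos-based cryptosystem must avoid.
for b in (8, 16, 24, 32):
    print(f"{b}-bit precision -> cycle length {cycle_length(b)}")
```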
Open Access Article
Improving the Reversible LSB Matching Scheme Based on the Likelihood Re-encoding Strategy
Entropy 2021, 23(5), 577; https://doi.org/10.3390/e23050577 - 08 May 2021
Viewed by 236
Abstract
In 2018, Tseng et al. proposed a dual-image reversible embedding method based on the modified Least Significant Bit matching (LSB matching) method. This method improved on the dual-image LSB matching method proposed by Lu et al. In Lu et al.’s scheme, there are seven situations that cannot be restored and need to be modified. Furthermore, the scheme uses two pixels to conceal four secret bits. The maximum modification of each pixel, in Lu et al.’s scheme, is two. To decrease the modification, Tseng et al. use one pixel to embed two secret bits and allow the maximum modification to decrease from two to one such that the image quality can be improved. This study enhances Tseng et al.’s method by re-encoding the modified rule table based on the probability of each hiding combination. The scheme analyzes the frequency occurrence of each combination and sets the lowest modified codes to the highest frequency case to significantly reduce the amount of modification. Experimental results show that better image quality is obtained using our method under the same amount of hiding payload. Full article
(This article belongs to the Special Issue Entropy Based Data Hiding and Its Applications)
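For orientation, plain (non-reversible) LSB matching, the building block that the dual-image schemes above refine, can be sketched as follows; the restoration rules and the likelihood re-encoding strategy of the paper are not reproduced here.

```python
import numpy as np

def lsb_match_embed(pixels, bits, rng):
    """Classic LSB matching: if a pixel's LSB already equals the secret bit it
    is left alone, otherwise the pixel is randomly moved by +1 or -1 so the
    LSB flips. Maximum modification per pixel is therefore one."""
    stego = pixels.astype(np.int16).copy()
    for i, b in enumerate(bits):
        if stego[i] % 2 != b:
            step = rng.choice((-1, 1))
            if stego[i] == 0:       # keep the value inside the 8-bit range
                step = 1
            elif stego[i] == 255:
                step = -1
            stego[i] += step
    return stego.astype(np.uint8)

def lsb_extract(stego, n_bits):
    return stego[:n_bits] % 2

rng = np.random.default_rng(7)
cover = rng.integers(0, 256, 64, dtype=np.uint8)
secret = rng.integers(0, 2, 32)
stego = lsb_match_embed(cover, secret, rng)
print("recovered:", np.array_equal(lsb_extract(stego, 32), secret))
print("max modification per pixel:",
      np.max(np.abs(stego.astype(int) - cover.astype(int))))
```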
Open Access Article
Generalized Structure Functions and Multifractal Detrended Fluctuation Analysis Applied to Vegetation Index Time Series: An Arid Rangeland Study
Entropy 2021, 23(5), 576; https://doi.org/10.3390/e23050576 - 07 May 2021
Viewed by 287
Abstract
Estimates suggest that more than 70% of the world’s rangelands are degraded. The Normalized Difference Vegetation Index (NDVI) is commonly used by ecologists and agriculturalists to monitor vegetation and contribute to more sustainable rangeland management. This paper aims to explore the scaling character of NDVI and NDVI anomaly (NDVIa) time series by applying three fractal analyses: generalized structure function (GSF), multifractal detrended fluctuation analysis (MF-DFA), and Hurst index (HI). The study was conducted in four study areas in Southeastern Spain. Results suggest a multifractal character influenced by different land uses and spatial diversity. MF-DFA indicated an antipersistent character in study areas, while GSF and HI results indicated a persistent character. Different behaviors of generalized Hurst and scaling exponents were found between herbaceous and tree dominated areas. MF-DFA and surrogate and shuffle series allow us to study multifractal sources, reflecting the importance of long-range correlations in these areas. Two types of long-range correlation appear to be in place due to short-term memory reflecting seasonality and longer-term memory based on a time scale of a year or longer. The comparison of these series also provides us with a differentiating profile to distinguish among our four study areas that can improve land use and risk management in arid rangelands. Full article
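The Hurst-type exponents discussed above can be estimated with detrended fluctuation analysis; the sketch below is the monofractal (q = 2) special case of MF-DFA applied to synthetic series, not the NDVI data of the study.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Monofractal DFA: integrate the demeaned series, detrend it linearly in
    windows of several sizes, and read the Hurst-like exponent off the
    log-log slope of the fluctuation function F(s)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                        # profile
    if scales is None:
        scales = np.unique(np.geomspace(8, len(x) // 4, 12).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
               for seg in segs]                        # local linear detrending
        flucts.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(8)
print(dfa_exponent(rng.normal(size=4096)))               # white noise: ~0.5
print(dfa_exponent(np.cumsum(rng.normal(size=4096))))    # random walk: ~1.5
```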
Open Access Article
The Stochastic Nature of Functional Responses
Entropy 2021, 23(5), 575; https://doi.org/10.3390/e23050575 - 07 May 2021
Viewed by 239
Abstract
Functional responses are non-linear functions commonly used to describe the variation in the rate of consumption of resources by a consumer. They have been widely used in both theoretical and empirical studies, but a comprehensive understanding of their parameters at different levels of description remains elusive. Here, by depicting consumers and resources as stochastic systems of interacting particles, we present a minimal set of reactions for consumer resource dynamics. We rigorously derived the corresponding system of ODEs, from which we obtained via asymptotic expansions classical 2D consumer-resource dynamics, characterized by different functional responses. We also derived functional responses by focusing on the subset of reactions describing only the feeding process. This involves fixing the total number of consumers and resources, which we call chemostatic conditions. By comparing these two ways of deriving functional responses, we showed that classical functional response parameters in effective 2D consumer-resource dynamics differ from the same parameters obtained by measuring (or deriving) functional responses for typical feeding experiments under chemostatic conditions, which points to potential errors in interpreting empirical data. We finally discuss possible generalizations of our models to systems with multiple consumers and more complex population structures, including spatial dynamics. Our stochastic approach builds on fundamental ecological processes and has natural connections to basic ecological theory. Full article
(This article belongs to the Special Issue Statistical Physics of Living Systems)
Open Access Article
An Indoor Localization System Using Residual Learning with Channel State Information
Entropy 2021, 23(5), 574; https://doi.org/10.3390/e23050574 - 07 May 2021
Viewed by 178
Abstract
With the increasing demand for location-based services, neural network (NN)-based intelligent indoor localization has attracted great interest due to its high localization accuracy. However, deep NNs are usually affected by degradation and gradient vanishing. To address this, we propose a novel indoor localization system, including a denoising NN and a residual network (ResNet), to predict the location of a moving object from the channel state information (CSI). In the ResNet, to prevent overfitting, we replace all the residual blocks with stochastic residual blocks. Specifically, we explore the long-range stochastic shortcut connection (LRSSC) to solve the degradation and gradient vanishing problems. To obtain a large receptive field without losing information, we leverage dilated convolution at the rear of the ResNet. Experimental results are presented to confirm that our system outperforms state-of-the-art methods in a representative indoor environment. Full article
(This article belongs to the Section Signal and Data Analysis)
Open Access Communication
Can Quantum Correlations Lead to Violation of the Second Law of Thermodynamics?
Entropy 2021, 23(5), 573; https://doi.org/10.3390/e23050573 - 07 May 2021
Viewed by 176
Abstract
Quantum entanglement can cause the efficiency of a heat engine to be greater than the efficiency of the Carnot cycle. However, this does not mean a violation of the second law of thermodynamics, since there is no local equilibrium for pure quantum states, and, in the absence of local equilibrium, thermodynamics cannot be formulated correctly. Von Neumann entropy is not a thermodynamic quantity, although it can characterize the ordering of a system. In the case of the entanglement of the particles of the system with the environment, the concept of an isolated system should be refined. In any case, quantum correlations cannot lead to a violation of the second law of thermodynamics in any of its formulations. This article is devoted to a technical discussion of the expected results on the role of quantum entanglement in thermodynamics. Full article
Open Access Article
Dynamics Analysis of a Wireless Rechargeable Sensor Network for Virus Mutation Spreading
Entropy 2021, 23(5), 572; https://doi.org/10.3390/e23050572 - 06 May 2021
Viewed by 211
Abstract
Virus spreading problems in wireless rechargeable sensor networks (WSNs) are becoming a hot topic and have been studied and discussed in recent years. Many epidemic spreading models have been introduced to reveal how a virus spreads and how it can be suppressed. However, most of them assume that the sensors are not rechargeable. In addition, most existing works do not consider virus mutation. This paper proposes a novel epidemic model, including susceptible, infected, variant, low-energy and dead states, which considers rechargeable sensors and the virus mutation factor. The stability of the proposed model is first analyzed by adopting the characteristic equation and constructing Lyapunov functions. Then, an optimal control problem is formulated to control the virus spread and decrease the cost of the networks by applying Pontryagin’s maximum principle. Finally, all of the theoretical results are confirmed by numerical simulation. Full article
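The abstract does not give the model equations, so the sketch below is only a hypothetical parameterization showing how a five-compartment susceptible/infected/variant/low-energy/dead system of this kind can be written down and integrated numerically; the rate structure and parameter values are illustrative guesses, not the paper's system.

```python
import numpy as np
from scipy.integrate import odeint

def sivld(y, t, beta, mu, gamma_i, gamma_v, eps, rho, delta):
    """Hypothetical susceptible/infected/variant/low-energy/dead model;
    the rate structure below is an illustrative guess, not the paper's system."""
    S, I, V, L, D = y
    dS = -beta * S * (I + V) - eps * S + rho * L   # infection; recharged nodes return
    dI = beta * S * I - mu * I - gamma_i * I       # infection, mutation, energy drain
    dV = beta * S * V + mu * I - gamma_v * V       # variant spread and energy drain
    dL = gamma_i * I + gamma_v * V + eps * S - rho * L - delta * L
    dD = delta * L                                 # depleted nodes that die
    return [dS, dI, dV, dL, dD]

t = np.linspace(0, 100, 1000)
y0 = [0.95, 0.05, 0.0, 0.0, 0.0]                   # node fractions per state
params = (0.6, 0.05, 0.1, 0.12, 0.02, 0.08, 0.01)  # beta, mu, g_i, g_v, eps, rho, delta
sol = odeint(sivld, y0, t, args=params)
print("final state fractions:", np.round(sol[-1], 3))
```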
Open Access Article
Electrophysiological Properties from Computations at a Single Voltage: Testing Theory with Stochastic Simulations
Entropy 2021, 23(5), 571; https://doi.org/10.3390/e23050571 - 06 May 2021
Viewed by 195
Abstract
We use stochastic simulations to investigate the performance of two recently developed methods for calculating the free energy profiles of ion channels and their electrophysiological properties, such as current–voltage dependence and reversal potential, from molecular dynamics simulations at a single applied voltage. These methods require neither knowledge of the diffusivity nor simulations at multiple voltages, which greatly reduces the computational effort required to probe the electrophysiological properties of ion channels. They can be used to determine the free energy profiles from either forward or backward one-sided properties of ions in the channel, such as ion fluxes, density profiles, committor probabilities, or from their two-sided combination. By generating large sets of stochastic trajectories, which are individually designed to mimic the molecular dynamics crossing statistics of models of channels of trichotoxin, p7 from hepatitis C and a bacterial homolog of the pentameric ligand-gated ion channel, GLIC, we find that the free energy profiles obtained from stochastic simulations corresponding to molecular dynamics simulations of even a modest length are burdened with statistical errors of only 0.3 kcal/mol. Even with many crossing events, applying two-sided formulas substantially reduces statistical errors compared to one-sided formulas. With a properly chosen reference voltage, the current–voltage curves can be reproduced with good accuracy from simulations at a single voltage in a range extending for over 200 mV. If possible, the reference voltages should be chosen not simply to drive a large current in one direction, but to observe crossing events in both directions. Full article
Open Access Article
Visual Secure Image Encryption Scheme Based on Compressed Sensing and Regional Energy
Entropy 2021, 23(5), 570; https://doi.org/10.3390/e23050570 - 06 May 2021
Viewed by 185
Abstract
The network security transmission of digital images needs to solve the dual security problems of content and appearance. In this paper, a visually secure image compression and encryption scheme is proposed by combining compressed sensing (CS) and regional energy. The plain image is compressed and encrypted into a secret image by CS and zigzag confusion. Then, according to the regional energy, the secret image is embedded into a carrier image to obtain the final visually secure cipher image. A method of hour hand printing (HHP) scrambling is proposed to increase pixel irrelevance. Regional energy embedding reduces the damage to the visual quality of the carrier image, and the different embedding positions between images greatly enhance the security of the encryption algorithm. Furthermore, the hyperchaotic multi-character system (MCS) is utilized to construct the measurement matrix and control pixels. Simulation results and security analyses demonstrate the effectiveness, security and robustness of the proposed algorithm. Full article
Open Access Article
Multi-Class Cost-Constrained Random Coding for Correlated Sources over the Multiple-Access Channel
Entropy 2021, 23(5), 569; https://doi.org/10.3390/e23050569 - 03 May 2021
Viewed by 326
Abstract
This paper studies a generalized version of multi-class cost-constrained random-coding ensemble with multiple auxiliary costs for the transmission of N correlated sources over an N-user multiple-access channel. For each user, the set of messages is partitioned into classes and codebooks are generated according to a distribution depending on the class index of the source message and under the constraint that the codewords satisfy a set of cost functions. Proper choices of the cost functions recover different coding schemes including message-dependent and message-independent versions of independent and identically distributed, independent conditionally distributed, constant-composition and conditional constant composition ensembles. The transmissibility region of the scheme is related to the Cover-El Gamal-Salehi region. A related family of correlated-source Gallager source exponent functions is also studied. The achievable exponents are compared for correlated and independent sources, both numerically and analytically. Full article
(This article belongs to the Special Issue Finite-Length Information Theory)