
Table of Contents

Entropy, Volume 22, Issue 2 (February 2020) – 114 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: In recent years, analysis of the organization and performance of football teams has undergone a [...]
Open Access Article
Entropy-Based Measures of Hypnopompic Heart Rate Variability Contribute to the Automatic Prediction of Cardiovascular Events
Entropy 2020, 22(2), 241; https://doi.org/10.3390/e22020241 - 20 Feb 2020
Abstract
Surges in sympathetic activity are thought to be a major contributor to the frequent occurrence of cardiovascular events towards the end of nocturnal sleep. We aimed to investigate whether the analysis of hypnopompic heart rate variability (HRV) could assist in the prediction of cardiovascular disease (CVD). A total of 2217 baseline CVD-free subjects were identified and divided into a CVD group and a non-CVD group, according to the presence of CVD at a follow-up visit. HRV measures derived from time-domain analysis, frequency-domain analysis, and nonlinear analysis were employed to characterize cardiac functioning. Machine learning models for both long-term and short-term CVD prediction were then constructed, based on hypnopompic HRV metrics and other typical CVD risk factors. CVD was associated with significant alterations in hypnopompic HRV. An accuracy of 81.4% was achieved in short-term prediction of CVD, a 10.7% increase compared with long-term prediction. Without HRV metrics, the predictive performance for short-term CVD outcomes declined by more than 6%. The complexity of hypnopompic HRV, measured by entropy-based indices, contributed considerably to the prediction and achieved greater importance in the proposed models than conventional HRV measures. Our findings suggest that hypnopompic HRV assists the prediction of CVD outcomes, especially the occurrence of a CVD event within two years. Full article
(This article belongs to the Special Issue Application of Information Theory and Entropy in Cardiology)
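Entropy-based HRV indices of the kind used above are typically computed on beat-to-beat (RR) interval series; a standard representative is sample entropy. The sketch below is illustrative only (the function name, parameters, and defaults are ours, not the article's feature set):

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of a 1-D series: -ln(A/B), where A and B count
    template matches of length m+1 and m within tolerance r."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    n = len(x)

    def match_count(length):
        # Build all template vectors of the given length (the same number
        # of templates for m and m+1, so the ratio A/B is comparable).
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1  # exclude the self-match
        return count

    return -np.log(match_count(m + 1) / match_count(m))
```

A regular signal (e.g., a sinusoid) yields lower sample entropy than white noise; that sensitivity to regularity is the property entropy-based HRV complexity measures exploit.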
Open Access Article
Global Geometry of Bayesian Statistics
Entropy 2020, 22(2), 240; https://doi.org/10.3390/e22020240 - 20 Feb 2020
Abstract
In the author's previous work, a non-trivial symmetry of the relative entropy in the information geometry of normal distributions was discovered. The same symmetry also appears in the symplectic/contact geometry of Hilbert modular cusps. Further, it was observed that a contact Hamiltonian flow presents a certain Bayesian inference on normal distributions. In this paper, we describe Bayesian statistics and information geometry in the language of current geometry in order to spread interest in statistics among general geometers and topologists. Then, we foliate the space of multivariate normal distributions by symplectic leaves to generalize the above result of the author. This foliation arises from the Cholesky decomposition of the covariance matrices. Full article
(This article belongs to the Special Issue Information Geometry III)
Open Access Review
Complexity Analysis of EEG, MEG, and fMRI in Mild Cognitive Impairment and Alzheimer’s Disease: A Review
Entropy 2020, 22(2), 239; https://doi.org/10.3390/e22020239 - 20 Feb 2020
Abstract
Alzheimer’s disease (AD) is an irreversible, degenerative brain disease with a high incidence. In recent years, because brain signals have complex nonlinear dynamics, there has been growing interest in studying complex changes in the time series of brain signals in patients with AD. We reviewed studies of complexity analyses of single-channel time series from electroencephalogram (EEG), magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI) in AD and determined future research directions. A systematic literature search for 2000–2019 was performed in the Web of Science and PubMed databases, resulting in 126 identified studies. Compared to healthy individuals, the signals from AD patients have less complexity and more predictable oscillations, which are found mainly in the left parietal, occipital, right frontal, and temporal regions. This reduced complexity is considered a potential biomarker reflecting the functional lesion in AD. The current review helps to reveal the patterns of dysfunction in the brains of patients with AD and to assess whether signal complexity can serve as such a biomarker. We propose further studies of the signal complexity of AD patients, including investigating the reliability of complexity algorithms and the spatial patterns of signal complexity. In conclusion, the current review helps to better understand the complexity of abnormalities in the AD brain and provides useful information for AD diagnosis. Full article
Open Access Article
Biometric Identification Method for Heart Sound Based on Multimodal Multiscale Dispersion Entropy
Entropy 2020, 22(2), 238; https://doi.org/10.3390/e22020238 - 20 Feb 2020
Abstract
In this paper, a new method of biometric characterization of heart sounds based on multimodal multiscale dispersion entropy is proposed. First, the heart sound is periodically segmented, and each single-cycle heart sound is decomposed into a group of intrinsic mode functions (IMFs) by improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN). These IMFs are then segmented into a series of frames, which are used to calculate the refined composite multiscale dispersion entropy (RCMDE) as the characteristic representation of the heart sound. In simulation experiment I, carried out on the open heart sound databases Michigan, Washington, and Littman, this feature representation was combined with a heart sound segmentation method based on logistic regression (LR) and hidden semi-Markov models (HSMM), and feature selection was performed through the Fisher ratio (FR). Finally, the Euclidean distance (ED) and the closeness principle were used for matching and identification, and the recognition accuracy was 96.08%. To improve the practical application value of this method, it was applied in experiment II to a database of 80 heart sounds constructed from the heart sounds of 40 volunteers, to examine the effect of single-cycle heart sounds with different starting positions on performance. The experimental results show that single-cycle heart sounds starting at the first heart sound (S1) yield the highest recognition rate of 97.5%. In summary, the proposed method is effective for heart sound biometric recognition. Full article
(This article belongs to the Special Issue Multiscale Entropy Approaches and Their Applications)
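Dispersion entropy, the building block of the multiscale variant used above, maps a signal into a small number of classes via the normal cumulative distribution function, counts the resulting dispersion patterns, and takes the normalized Shannon entropy of their frequencies. A minimal single-scale sketch (the function name and default parameters are illustrative, not the paper's implementation):

```python
import numpy as np
from math import erf, sqrt, log

def dispersion_entropy(x, n_classes=6, m=3, delay=1):
    """Normalized dispersion entropy of a 1-D signal (single scale)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # Map samples to classes 1..n_classes through the normal CDF.
    y = np.array([0.5 * (1 + erf((v - mu) / (sigma * sqrt(2)))) for v in x])
    z = np.clip(np.round(n_classes * y + 0.5).astype(int), 1, n_classes)
    # Count embedding patterns of length m.
    n_patterns = len(z) - (m - 1) * delay
    counts = {}
    for i in range(n_patterns):
        pattern = tuple(z[i + j * delay] for j in range(m))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n_patterns
    # Shannon entropy, normalized by the maximum ln(n_classes**m).
    return float(-np.sum(p * np.log(p)) / log(n_classes ** m))
```

White noise spreads its samples over many dispersion patterns (entropy near 1), while a smooth tone occupies few patterns (entropy well below 1), which is what makes the measure discriminative.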
Open Access Article
Thermodynamic and Transport Properties of Equilibrium Debye Plasmas
Entropy 2020, 22(2), 237; https://doi.org/10.3390/e22020237 - 20 Feb 2020
Abstract
The thermodynamic and transport properties of weakly non-ideal, high-density, partially ionized hydrogen plasma are investigated, accounting for quantum effects due to the change in the energy spectrum of atomic hydrogen when the electron–proton interaction is considered embedded in the surrounding particles. The complexity of the rigorous approach has led to the development of simplified models able to include neighbor effects on the isolated system while remaining consistent with the traditional thermodynamic approach. High-density conditions have been simulated by assuming particle interactions described by a screened Coulomb potential. Full article
(This article belongs to the Special Issue Simulation with Entropy Thermodynamics)
Open Access Article
Estimating Differential Entropy using Recursive Copula Splitting
Entropy 2020, 22(2), 236; https://doi.org/10.3390/e22020236 - 19 Feb 2020
Viewed by 148
Abstract
A method for estimating the Shannon differential entropy of multidimensional random variables using independent samples is described. The method is based on decomposing the distribution into a product of marginal distributions and joint dependency, also known as the copula. The entropy of marginals is estimated using one-dimensional methods. The entropy of the copula, which always has a compact support, is estimated recursively by splitting the data along statistically dependent dimensions. The method can be applied both for distributions with compact and non-compact supports, which is imperative when the support is not known or of a mixed type (in different dimensions). At high dimensions (larger than 20), numerical examples demonstrate that our method is not only more accurate, but also significantly more efficient than existing approaches. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
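The one-dimensional marginal entropies in such a copula decomposition can be estimated with standard spacing-based methods. As a hedged illustration (this is Vasicek's m-spacing estimator, not necessarily the one-dimensional method chosen by the authors), the sketch below can be checked against the known differential entropy of a standard Gaussian, 0.5·ln(2πe) ≈ 1.419:

```python
import numpy as np

def vasicek_entropy(samples, m=None):
    """Vasicek m-spacing estimate of 1-D differential entropy."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    if m is None:
        m = int(round(np.sqrt(n)))
    # Pad both ends so the m-spacings are defined at the boundaries.
    padded = np.concatenate([np.full(m, x[0]), x, np.full(m, x[-1])])
    spacings = padded[2 * m:] - padded[:-2 * m]  # x_{(i+m)} - x_{(i-m)}
    return float(np.mean(np.log(n / (2 * m) * spacings)))
```

For a few thousand Gaussian samples the estimate lands close to the analytic value; the copula term of the decomposition then needs the recursive splitting machinery described in the abstract.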
Open Access Article
Entropy, Information, and Symmetry; Ordered is Symmetrical, II: System of Spins in the Magnetic Field
Entropy 2020, 22(2), 235; https://doi.org/10.3390/e22020235 - 19 Feb 2020
Viewed by 134
Abstract
The second part of this paper develops the approach suggested in Entropy 2020, 22(1), 11, which relates ordering in physical systems to symmetrizing. Entropy is frequently interpreted as a quantitative measure of “chaos” or “disorder”. However, the notions of “chaos” and “disorder” are vague and subjective to a great extent. This leads to numerous misinterpretations of entropy. We propose that disorder be viewed as an absence of symmetry, and identify “ordering” with symmetrizing of a physical system; in other words, introducing elements of symmetry into an initially disordered physical system. We explore an initially disordered system of elementary magnets exposed to an external magnetic field H. Imposing symmetry restrictions diminishes the entropy of the system and decreases its temperature. The general case of a system of elementary magnets demonstrating j-fold symmetry is studied. The interrelation T_j = T/j takes place, where T and T_j are the temperatures of the non-symmetrized and j-fold-symmetrized systems of magnets, respectively. Full article
(This article belongs to the Section Statistical Physics)
Open Access Article
Diffusion Barrier Performance of AlCrTaTiZr/AlCrTaTiZr-N High-Entropy Alloy Films for Cu/Si Connect System
Entropy 2020, 22(2), 234; https://doi.org/10.3390/e22020234 - 19 Feb 2020
Viewed by 114
Abstract
In this study, high-entropy alloy films, namely AlCrTaTiZr/AlCrTaTiZr-N, were deposited on an n-type (100) silicon substrate. Then, a copper film was deposited on the high-entropy alloy films. The diffusion barrier performance of AlCrTaTiZr/AlCrTaTiZr-N for the Cu/Si connect system was investigated after thermal annealing for an hour at 600 °C, 700 °C, 800 °C, and 900 °C. Transmission electron microscopy (TEM) and atom probe tomography (APT) analysis showed that no Cu–Si intermetallic compounds were generated in the Cu/AlCrTaTiZr/AlCrTaTiZr-N/Si film stacks after annealing, even at 900 °C. The results indicate that AlCrTaTiZr/AlCrTaTiZr-N alloy films can prevent copper diffusion at 900 °C. The reason was investigated in this work: the amorphous structure of the AlCrTaTiZr layer has a lower driving force to form intermetallic compounds, and the lattice mismatch between the AlCrTaTiZr and AlCrTaTiZr-N layers increased the diffusion distance of the Cu atoms and the difficulty of Cu atom diffusion to the Si substrate. Full article
Open Access Feature Paper Article
Musical Collaboration in Rhythmic Improvisation
Entropy 2020, 22(2), 233; https://doi.org/10.3390/e22020233 - 19 Feb 2020
Viewed by 123
Abstract
Despite our intimate relationship with music in everyday life, we know little about how people create music. A particularly elusive area of study is spontaneous collaborative musical creation in the absence of rehearsals or scripts. Toward this aim, we designed an experiment in which pairs of players collaboratively created music through rhythmic improvisation. Rhythmic patterns and collaborative processes were investigated through symbolic recurrence quantification and information theory, applied to the time series of the sound created by the players. Working with real data on collaborative rhythmic improvisation, we identified features of improvised music and elucidated underlying processes of collaboration. Players preferred certain patterns over others, and their musical experience drove musical collaboration when rhythmic improvisation started. These results reveal prevailing rhythmic features in collaborative music creation and shed light on the complex dynamics of the underlying processes. Full article
(This article belongs to the Special Issue Information Theory and Symbolic Analysis: Theory and Applications)
Open Access Article
Short-Time Estimation of Fractionation in Atrial Fibrillation with Coarse-Grained Correlation Dimension for Mapping the Atrial Substrate
Entropy 2020, 22(2), 232; https://doi.org/10.3390/e22020232 - 19 Feb 2020
Viewed by 132
Abstract
Atrial fibrillation (AF) is currently the most common cardiac arrhythmia, with catheter ablation (CA) of the pulmonary veins (PV) being its first-line therapy. Ablation of complex fractionated atrial electrograms (CFAEs) outside the PVs has demonstrated improved long-term results, but their identification requires a reliable electrogram (EGM) fractionation estimator. This study proposes a technique aimed at assisting CA procedures under real-time settings. The method was tested on three groups of recordings: Group 1 consisted of 24 highly representative EGMs, eight for each AF Type. Group 2 contained the entire dataset of 119 EGMs, whereas Group 3 contained 20 pseudo-real EGMs of the special Type IV AF. The coarse-grained correlation dimension (CGCD) was computed in epochs of 1 s duration, obtaining a classification accuracy of 100% in Group 1 and 84.0–85.7% in Group 2, using 10-fold cross-validation. Receiver operating characteristic (ROC) analysis for highly fractionated EGMs showed 100% specificity and sensitivity in Group 1 and 87.5% specificity and 93.6% sensitivity in Group 2. In addition, 100% of the pseudo-real EGMs were correctly identified as Type IV AF. This method can consistently express the fractionation level of AF EGMs and provides better performance than previous works. Its ability to compute fractionation in a short time can quickly detect sudden changes of AF Type and could be used for mapping the atrial substrate, thus assisting CA procedures under real-time settings for atrial substrate modification. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
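The coarse-grained correlation dimension is rooted in the classic Grassberger–Procaccia correlation sum, whose log–log slope over a range of radii estimates the dimension. A minimal sketch of the underlying (non-coarse-grained) estimate, with parameter choices of our own rather than the study's:

```python
import numpy as np

def correlation_dimension(points, r_min, r_max, n_radii=10):
    """Estimate the correlation dimension as the slope of
    log C(r) versus log r (Grassberger-Procaccia correlation sum)."""
    points = np.asarray(points, dtype=float)
    n = len(points)
    # Pairwise Euclidean distances (upper triangle, no self-pairs).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))[np.triu_indices(n, 1)]
    radii = np.geomspace(r_min, r_max, n_radii)
    corr_sum = np.array([np.mean(dists < r) for r in radii])
    slope = np.polyfit(np.log(radii), np.log(corr_sum), 1)[0]
    return float(slope)
```

Points filling a plane should give a slope near 2; a fractionated EGM embedded in delay coordinates yields a higher effective dimension than an organized one, which is what the CGCD-based classifier exploits.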
Open Access Article
A Multiple-Input Multiple-Output Reservoir Computing System Subject to Optoelectronic Feedbacks and Mutual Coupling
Entropy 2020, 22(2), 231; https://doi.org/10.3390/e22020231 - 18 Feb 2020
Viewed by 166
Abstract
In this paper, a multiple-input multiple-output reservoir computing (RC) system is proposed, composed of multiple nonlinear nodes (Mach–Zehnder modulators) and multiple mutual-coupling loops of optoelectronic delay lines. Each input signal is added into every mutual-coupling loop to implement the simultaneous recognition of multiple route signals, which improves the signal processing speed and increases the number of routes. As an example, a four-route input and four-route output RC is realized by numerical simulations. The results show that this type of RC system can successfully recognize four-route optical packet headers of 3, 8, 16, and 32 bits, as well as four-route independent digital speech signals. When white noise is added to the signals such that the signal-to-noise ratios (SNRs) of the optical packet headers and the digital speech signals are 35 dB and 20 dB, respectively, the normalized root mean square errors (NRMSEs) of the signal recognition are all close to 0.1. The word error rates (WERs) of the optical packet header recognition are 0%, and the WER of the digital speech recognition is 1.6%. An eight-route input and eight-route output RC is also numerically simulated, implementing the recognition of eight-route 3-bit optical packet headers. The parallel processing of multiple-route signals with high recognition accuracy is achieved by the proposed system. Full article
(This article belongs to the Special Issue Entropy-Based Algorithms for Signal Processing)
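Software reservoir computing follows the same principle as the optoelectronic system above: a fixed random recurrent network provides a nonlinear temporal expansion of the input, and only a linear readout is trained. A minimal echo state network sketch on a delayed-recall task (all sizes and constants here are illustrative and unrelated to the paper's optical setup):

```python
import numpy as np

rng = np.random.default_rng(42)
n_res = 100                                 # reservoir size
u = rng.uniform(-1.0, 1.0, 1200)            # scalar input stream
target = np.roll(u, 3)                      # task: recall u delayed by 3 steps

w_in = rng.uniform(-0.5, 0.5, n_res)        # fixed random input weights
w = rng.standard_normal((n_res, n_res))
w *= 0.9 / np.max(np.abs(np.linalg.eigvals(w)))  # spectral radius 0.9

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = []
for u_t in u:
    x = np.tanh(w_in * u_t + w @ x)
    states.append(x.copy())
s = np.array(states)[200:]                  # discard warm-up transient
y = target[200:]

# Train only the linear readout (ridge regression).
w_out = np.linalg.solve(s.T @ s + 1e-6 * np.eye(n_res), s.T @ y)
nrmse = np.sqrt(np.mean((s @ w_out - y) ** 2)) / np.std(y)
```

The optoelectronic implementation replaces the tanh reservoir with modulator nonlinearities and delay-line coupling, but the trained part remains the same cheap linear readout.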
Open Access Article
High-Temperature Nano-Indentation Creep of Reduced Activity High Entropy Alloys Based on 4-5-6 Elemental Palette
Entropy 2020, 22(2), 230; https://doi.org/10.3390/e22020230 - 18 Feb 2020
Viewed by 172
Abstract
There is a strong demand for materials with inherently high creep resistance in the harsh environment of next-generation nuclear reactors. High-entropy alloys have drawn intense attention in this regard due to their excellent elevated-temperature properties and irradiation resistance. Here, the time-dependent plastic deformation behavior of two refractory high-entropy alloys was investigated, namely HfTaTiVZr and TaTiVWZr. These alloys are based on reduced-activity metals from the 4-5-6 elemental palette that would allow easy post-service recycling after use in nuclear reactors. The creep behavior was investigated using nano-indentation over the temperature range of 298 K to 573 K under static and dynamic loads up to 5 N. The creep stress exponent for HfTaTiVZr and TaTiVWZr was found to be in the range of 20–140, and the activation volume was ~16–20b³, indicating a dislocation-dominated mechanism. The stress exponent increased with increasing indentation depth, due to a higher density of dislocations and their entanglement at larger depths, and decreased with increasing temperature, due to thermally activated dislocations. Smaller creep displacement and higher activation energy for the two high-entropy alloys indicate superior creep resistance compared to refractory pure metals like tungsten. Full article
(This article belongs to the Special Issue Future Directions of High Entropy Alloys)
Open Access Article
Numerical Simulation on Convection and Thermal Radiation of Casson Fluid in an Enclosure with Entropy Generation
Entropy 2020, 22(2), 229; https://doi.org/10.3390/e22020229 - 18 Feb 2020
Viewed by 136
Abstract
The goal of the current numerical simulation is to explore the impact of aspect ratio, thermal radiation, and entropy generation on buoyancy-induced convection in a rectangular box filled with Casson fluid. The vertical boundaries of the box are maintained at different constant thermal distributions, and thermal insulation is imposed on the horizontal boundaries. The solution is obtained by a finite-volume-based iterative method. The results are explored over a range of the radiation parameter, Casson fluid parameter, aspect ratio, and Grashof number. The impact of entropy generation is also examined in detail. Thermal stratification occurs for greater values of the Casson fluid parameter in the presence of radiation. The kinetic energy grows with rising values of the Casson fluid and radiation parameters. The thermal energy transport declines with growing values of the radiation parameter and is enhanced with a rising Casson fluid parameter. Full article
(This article belongs to the Special Issue Thermal Radiation and Entropy Analysis)
Open Access Article
Non-Monogamy of Spatio-Temporal Correlations and the Black Hole Information Loss Paradox
Entropy 2020, 22(2), 228; https://doi.org/10.3390/e22020228 - 18 Feb 2020
Viewed by 170
Abstract
Pseudo-density matrices are a generalisation of quantum states and do not obey monogamy of quantum correlations. Could this be the solution to the paradox of information loss during the evaporation of a black hole? In this paper we discuss this possibility, providing a theoretical proposal to extend quantum theory with these pseudo-states to describe the statistics arising in black-hole evaporation. We also provide an experimental demonstration of this theoretical proposal, using a simulation in the optical regime that tomographically reproduces the correlations of the pseudo-density matrix describing this physical phenomenon. Full article
(This article belongs to the Special Issue Quantum Entanglement)
Open Access Article
Thermomass Theory in the Framework of GENERIC
Entropy 2020, 22(2), 227; https://doi.org/10.3390/e22020227 - 18 Feb 2020
Viewed by 123
Abstract
Thermomass theory was developed to deal with non-Fourier heat conduction phenomena involving the influence of heat inertia. However, its structure, derived from an analogy to fluid mechanics, requires further mathematical verification. In this paper, the GENERIC framework, a geometrical and mathematical structure in nonequilibrium thermodynamics, was employed to verify the thermomass theory. First, the thermomass theory was introduced briefly; then, the GENERIC framework was applied to the thermomass gas system with state variables of thermomass gas density ρ_h and thermomass momentum m_h, and the time evolution equations obtained from the GENERIC framework were compared with those of thermomass theory. It was demonstrated that the equations generated by the GENERIC theory are the same as the continuity and momentum equations in thermomass theory with proper potentials and eta-function. Thermomass theory gives a physical interpretation of the GENERIC theory in non-Fourier heat conduction phenomena. By combining these two theories, it was found that the Hamiltonian energy in the reversible process and the dissipation potential in the irreversible process can be unified into one formulation, i.e., the thermomass energy. Furthermore, via the GENERIC framework, thermomass theory can be extended to involve more state variables, such as an internal source term and a distortion matrix term. Numerical simulations investigated the influences of the convective term and distortion matrix term in the equations. It was found that the convective term changes the shape of the thermal energy distribution and enhances the spreading of thermal energy. The distortion matrix implies the elasticity and viscosity of the thermomass gas. Full article
(This article belongs to the Special Issue Entropies: Between Information Geometry and Kinetics)
Open Access Article
Leveraging Blockchain Technology for Secure Energy Trading and Least-Cost Evaluation of Decentralized Contributions to Electrification in Sub-Saharan Africa
Entropy 2020, 22(2), 226; https://doi.org/10.3390/e22020226 - 17 Feb 2020
Viewed by 210
Abstract
The International Energy Agency has projected that the total energy demand for electricity in sub-Saharan Africa (SSA) will rise by an average of 4% per year up to 2040. Meanwhile, approximately 620 million people are living without electricity in SSA. Going by the 2030 vision of the United Nations that electricity should be accessible to all, it is important that new technology and methods are provided. In comparison to other nations worldwide, the smart grid (SG) is an emerging technology in SSA. An SG is an information-technology-enhanced power grid that provides a two-way communication network between energy producers and customers. It also includes renewable energy, smart meters, and smart devices that help to manage energy demand and reduce energy generation costs. However, SG is facing inherent difficulties, such as energy theft, lack of trust, and security and privacy issues. Therefore, this paper proposes a blockchain-based decentralized energy system (BDES) to accelerate rural and urban electrification by improving service delivery while minimizing the cost of generation and addressing historical antipathy and cybersecurity risk within SSA. Additionally, energy insufficiency and fixed pricing schemes may raise concerns in SG, such as the imbalance of order. The paper also introduces a blockchain-based energy trading system, which includes price negotiation and incentive mechanisms to address the imbalance of order. Moreover, existing models for energy planning do not consider the effect of fill rate (FR) and service level (SL). A blockchain levelized cost of energy (BLCOE) is proposed as the least-cost solution that measures the impact of energy reliability on generation cost using FR and SL. Simulation results are presented to show the performance of the proposed model, and the least-cost option varies with the relative energy generation cost of centralized, decentralized, and BDES infrastructure.
Case studies of Burkina Faso, Cote d’Ivoire, Gambia, Liberia, Mali, and Senegal illustrate situations that are more suitable for BDES. For other SSA countries, BDES can cost-effectively serve a large population and region. Additionally, BLCOE reduces energy costs by approximately 95% for batteries and 75% for solar modules. The future BLCOE varies across SSA, averaging about 0.049 $/kWh, as compared to 0.15 $/kWh for an existing system in the literature. Full article
(This article belongs to the Special Issue Blockchain: Security, Challenges, and Opportunities)
Open Access Article
The Ohm Law as an Alternative for the Entropy Origin Nonlinearities in Conductivity of Dilute Colloidal Polyelectrolytes
Entropy 2020, 22(2), 225; https://doi.org/10.3390/e22020225 - 17 Feb 2020
Viewed by 150
Abstract
We discuss the peculiarities of the Ohm law in dilute polyelectrolytes containing a relatively low concentration n of multiply charged colloidal particles. It is demonstrated that, under these conditions, the effective conductivity of the polyelectrolyte is a linear function of n. This happens due to the change of the electric field in the polyelectrolyte under the effect of colloidal particle polarization. The proposed theory explains recent experimental findings and presents an alternative to the mean spherical approximation, which predicts nonlinear I–V characteristics of dilute colloidal polyelectrolytes due to entropy changes. Full article
(This article belongs to the Special Issue Simulation with Entropy Thermodynamics)

Open Access Communication
The Brevity Law as a Scaling Law, and a Possible Origin of Zipf’s Law for Word Frequencies
Entropy 2020, 22(2), 224; https://doi.org/10.3390/e22020224 - 17 Feb 2020
Abstract
An important body of quantitative linguistics is constituted by a series of statistical laws about language usage. Despite the importance of these linguistic laws, some of them are poorly formulated and, more importantly, there is no unified framework that encompasses all of them. This paper presents a new perspective for establishing connections between different statistical linguistic laws. Characterizing each word type by two random variables, length (in number of characters) and absolute frequency, we show that the corresponding bivariate joint probability distribution shows a rich and precise phenomenology, with the type-length and the type-frequency distributions as its two marginals, and the conditional distribution of frequency at fixed length providing a clear formulation of the brevity-frequency phenomenon. The type-length distribution turns out to be well fitted by a gamma distribution (much better than by the previously proposed lognormal), and the conditional frequency distributions at fixed length display power-law-decay behavior with a fixed exponent α ≈ 1.4 and a characteristic-frequency crossover that scales as an inverse power δ ≈ 2.8 of length, which implies the fulfillment of a scaling law analogous to those found in the thermodynamics of critical phenomena. As a by-product, we find a possible model-free explanation for the origin of Zipf's law, which should arise as a mixture of conditional frequency distributions governed by the crossover length-dependent frequency. Full article
(This article belongs to the Special Issue Information Theory and Language)

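The gamma fit to the type-length marginal described above can be sketched with SciPy. The data here are synthetic (drawn from a gamma distribution with known parameters so the fit can be checked); the paper, of course, fits real corpora:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic word-type lengths (in characters); stand-in for a real corpus.
lengths = rng.gamma(shape=2.5, scale=2.0, size=10_000)

# Maximum-likelihood gamma fit to the type-length marginal (location fixed at 0).
shape, loc, scale = stats.gamma.fit(lengths, floc=0)
print(f"fitted shape={shape:.2f}, scale={scale:.2f}")
```

With real data, the quality of this fit versus a lognormal one is what the comparison in the abstract refers to.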
Open Access Article
Robust Power Optimization for Downlink Cloud Radio Access Networks with Physical Layer Security
Entropy 2020, 22(2), 223; https://doi.org/10.3390/e22020223 - 17 Feb 2020
Abstract
Since a cloud radio access network (C-RAN) transmits information signals by joint transmission, multiple copies of the information signals might be eavesdropped on. Therefore, this paper studies a resource allocation algorithm for secure energy optimization in a downlink C-RAN, jointly designing the base station (BS) mode, beamforming and artificial noise (AN) given imperfect channel state information (CSI) of information receivers (IRs) and eavesdrop receivers (ERs). The resource allocation design problem is formulated as a nonlinear programming problem of power minimization under a quality-of-service (QoS) constraint for each IR, a power constraint for each BS, and a physical layer security (PLS) constraint for each ER. To solve this non-trivial problem, we first adopt a smooth ℓ0-norm approximation and propose a general iterative difference-of-convex (IDC) algorithm with provable convergence for difference-of-convex programming problems. Then, a three-stage algorithm is proposed to solve the original problem: it first applies iterative difference-of-convex programming with the semi-definite relaxation (SDR) technique to provide an approximately sparse solution, and then improves the sparsity of the solution using a deflation-based post-processing method. The effectiveness of the proposed algorithm is validated with extensive simulations for power minimization in secure downlink C-RANs. Full article

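The smooth ℓ0-norm approximation mentioned above can be illustrated with a common Gaussian-kernel smoothing; this is one standard surrogate, and the paper's exact choice may differ:

```python
import numpy as np

def smooth_l0(x, sigma):
    """Smooth surrogate for the l0 'norm': counts entries well above scale sigma.
    As sigma -> 0, the sum tends to the exact number of nonzeros in x."""
    return np.sum(1.0 - np.exp(-x ** 2 / (2.0 * sigma ** 2)))

x = np.array([0.0, 0.0, 3.0, -2.0, 0.0, 1.5])
exact = np.count_nonzero(x)  # 3 nonzero entries
for sigma in (1.0, 0.1, 0.01):
    print(sigma, round(float(smooth_l0(x, sigma)), 4))
```

Such a surrogate is differentiable, which is what makes sparsity (here, switching BSs off) tractable inside an iterative convex-programming loop.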
Open Access Article
Computing Classical-Quantum Channel Capacity Using Blahut–Arimoto Type Algorithm: A Theoretical and Numerical Analysis
Entropy 2020, 22(2), 222; https://doi.org/10.3390/e22020222 - 16 Feb 2020
Abstract
Based on Arimoto's work in 1972, we propose an iterative algorithm for computing the capacity of a discrete memoryless classical-quantum channel with a finite input alphabet and a finite-dimensional output, which we call the Blahut–Arimoto algorithm for classical-quantum channels; an input cost constraint is also considered. We show that, to reach ε accuracy, the iteration complexity of the algorithm is upper bounded by (log n log(1/ε))/ε, where n is the size of the input alphabet. In particular, when the output states { ρx }x∈X are linearly independent in complex matrix space, the algorithm has geometric convergence. We also show that the algorithm reaches an ε-accurate solution with a complexity of O(m³ log n log(1/ε)/ε), and of O(m³ log(1/ε) log(1/δ)/(ε D(p*||p^(N0)))) in the special case, where m is the output dimension, D(p*||p^(N0)) is the relative entropy of two distributions, and δ is a positive number. Numerical experiments were performed and an approximate solution for the binary two-dimensional case was analysed. Full article
(This article belongs to the collection Quantum Information)

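For orientation, this is the classical Blahut–Arimoto iteration that the paper generalizes: the standard algorithm for a discrete memoryless channel, not the classical-quantum extension itself:

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Capacity (in nats) of a DMC with transition matrix W[x, y] via Blahut-Arimoto."""
    n = W.shape[0]
    p = np.full(n, 1.0 / n)          # input distribution, uniform start
    for _ in range(iters):
        q = p @ W                     # induced output distribution
        # d[x] = D(W_x || q), the relative entropy of row x to the output marginal
        d = np.sum(W * np.log(W / q), axis=1)
        p = p * np.exp(d)             # multiplicative update toward the maximizer
        p /= p.sum()
    q = p @ W
    return float(np.sum(p[:, None] * W * np.log(W / q)))

# Binary symmetric channel, crossover 0.1: capacity = log 2 - H(0.1) nats.
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(W))
```

The classical-quantum version of the paper replaces the rows W_x by output states ρx and the relative entropy by its quantum counterpart.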
Open Access Article
On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid
Entropy 2020, 22(2), 221; https://doi.org/10.3390/e22020221 - 16 Feb 2020
Abstract
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require the probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen–Bregman divergences and derive from it the vector-skew α-Jensen–Shannon divergences. We prove that the vector-skew α-Jensen–Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family: this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms. Full article
Open Access Article
Magnetic Resonance Image Quality Assessment by Using Non-Maximum Suppression and Entropy Analysis
Entropy 2020, 22(2), 220; https://doi.org/10.3390/e22020220 - 16 Feb 2020
Abstract
An investigation of diseases using magnetic resonance (MR) imaging requires automatic image quality assessment methods able to exclude low-quality scans. Such methods can also be employed to optimize the parameters of imaging systems or to evaluate image processing algorithms. Therefore, in this paper, a novel blind image quality assessment (BIQA) method for the evaluation of MR images is introduced. It is observed that the result of filtering using non-maximum suppression (NMS) strongly depends on the perceptual quality of the input image. Hence, in the method, the image is first processed by NMS with various levels of acceptable local intensity difference. Then, the quality is efficiently expressed by the entropy of the sequence of extrema numbers obtained with the thresholded NMS. The proposed BIQA approach is compared with ten state-of-the-art techniques on a dataset containing MR images and subjective scores provided by 31 experienced radiologists. The Pearson, Spearman, and Kendall correlation coefficients and the root mean square error for the method assessing images in the dataset were 0.6741, 0.3540, 0.2428, and 0.5375, respectively. The extensive experimental evaluation of the BIQA methods reveals that the introduced measure outperforms related techniques by a large margin, as it correlates better with human scores. Full article
(This article belongs to the Special Issue Entropy in Image Analysis II)

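A rough sketch of the core quantity described above: run thresholded NMS at several acceptable-difference levels, count the surviving local extrema, and take the entropy of the resulting count sequence. This is a simplified 1-D stand-in for the paper's 2-D image filtering, with made-up threshold levels:

```python
import numpy as np

def count_extrema(signal, tau):
    """Count local maxima exceeding both neighbors by more than tau (thresholded NMS, 1-D)."""
    s = np.asarray(signal, dtype=float)
    left, mid, right = s[:-2], s[1:-1], s[2:]
    return int(np.sum((mid - left > tau) & (mid - right > tau)))

def extrema_entropy(signal, taus):
    """Shannon entropy (bits) of the normalized sequence of extrema counts."""
    counts = np.array([count_extrema(signal, t) for t in taus], dtype=float)
    if counts.sum() == 0:
        return 0.0
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 8 * np.pi, 512))
noisy = clean + 0.5 * rng.standard_normal(512)
taus = [0.05, 0.1, 0.2, 0.4]   # hypothetical acceptable-difference levels
print(extrema_entropy(clean, taus), extrema_entropy(noisy, taus))
```

The intuition is that degraded images produce many spurious extrema whose counts vary strongly across thresholds, which the entropy captures.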
Open Access Article
Experimentally Demonstrate the Spin-1 Information Entropic Inequality Based on Simulated Photonic Qutrit States
Entropy 2020, 22(2), 219; https://doi.org/10.3390/e22020219 - 15 Feb 2020
Abstract
Quantum correlations of higher-dimensional systems are an important part of quantum information theory and its applications. Quantifying the quantum correlation of high-dimensional quantum systems is crucial, but difficult. In this paper, using the second-order nonlinear optical effect and the multiphoton interference enhancement effect, we experimentally implement photonic qutrit states and demonstrate the spin-1 information entropic inequality for the first time, quantitatively characterizing quantum correlation. Our work shows that information entropy is an important way to quantify quantum correlation in quantum information processing. Full article
(This article belongs to the Special Issue Quantum Entropies and Complexity)

Open Access Article
An Information-Theoretic Measure for Balance Assessment in Comparative Clinical Studies
Entropy 2020, 22(2), 218; https://doi.org/10.3390/e22020218 - 15 Feb 2020
Abstract
Limitations of the statistics currently used to assess balance in observational samples include their insensitivity to shape discrepancies and their dependence upon sample size. The Jensen–Shannon divergence (JSD) is an alternative approach to quantifying the lack of balance among treatment groups that does not have these limitations. The JSD is an information-theoretic statistic derived from relative entropy, with three specific advantages relative to using standardized difference scores. First, it is applicable to cases in which the covariate is categorical or continuous. Second, it generalizes to studies in which there are more than two exposure or treatment groups. Third, it is decomposable, allowing for the identification of specific covariate values, treatment groups or combinations thereof that are responsible for any observed imbalance. Full article
(This article belongs to the Special Issue Applications of Information Theory to Epidemiology)

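The balance assessment described above reduces, for a categorical covariate, to the JSD between the covariate's distributions in the treatment groups. A minimal sketch with hypothetical counts (the category labels and numbers are illustrative, not from the study):

```python
import numpy as np

def jsd(p, q, base=2):
    """Jensen-Shannon divergence between two discrete distributions (0 = perfect balance)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return (0.5 * kl(p, m) + 0.5 * kl(q, m)) / np.log(base)

# Hypothetical 3-level covariate (e.g., smoking status), counts per category:
treated = [40, 35, 25]
control = [42, 33, 25]
print(jsd(treated, control))            # near 0: well balanced
print(jsd([80, 15, 5], [30, 40, 30]))   # clearly imbalanced
```

Because the JSD is an average of per-category terms, the individual terms identify which covariate values drive the imbalance, which is the decomposability advantage noted above.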
Open Access Article
Spherically Restricted Random Hyperbolic Diffusion
Entropy 2020, 22(2), 217; https://doi.org/10.3390/e22020217 - 14 Feb 2020
Abstract
This paper investigates solutions of hyperbolic diffusion equations in R³ with random initial conditions. The solutions are given as spatial-temporal random fields. Their restrictions to the unit sphere S² are studied. All assumptions are formulated in terms of the angular power spectrum or the spectral measure of the random initial conditions. Approximations to the exact solutions are given. Upper bounds for the mean-square convergence rates of the approximation fields are obtained. The smoothness properties of the exact solution and its approximation are also investigated. It is demonstrated that the Hölder-type continuity of the solution depends on the decay of the angular power spectrum. Conditions on the spectral measure of initial conditions that guarantee short- or long-range dependence of the solutions are given. Numerical studies are presented to verify the theoretical findings. Full article
(This article belongs to the Special Issue Applications of Nonlinear Diffusion Equations)

Open Access Article
Generalised Measures of Multivariate Information Content
Entropy 2020, 22(2), 216; https://doi.org/10.3390/e22020216 - 14 Feb 2020
Abstract
The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party correspond to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then derived independently by combining the algebraic structures of joint and shared information content. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
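The negativity of multivariate mutual information that makes the naive Venn picture misleading already appears for three binary variables with Z = X XOR Y. This is a standard textbook example, not one taken from the paper:

```python
import itertools, math
from collections import Counter

# Joint distribution of (X, Y, Z) with X, Y fair coins and Z = X XOR Y.
joint = Counter()
for x, y in itertools.product((0, 1), repeat=2):
    joint[(x, y, x ^ y)] += 0.25

def H(vars_idx):
    """Entropy (bits) of the marginal over the given coordinate indices."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in vars_idx)] += p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Multivariate mutual information I(X;Y;Z) via inclusion-exclusion of entropies.
I3 = (H((0,)) + H((1,)) + H((2,))
      - H((0, 1)) - H((0, 2)) - H((1, 2))
      + H((0, 1, 2)))
print(I3)  # negative for XOR, so no consistent Venn-area assignment exists
```

A Venn diagram would need a region of negative area here, which motivates the alternative measures the paper constructs.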
Open Access Article
Second Law Analysis for the Experimental Performances of a Cold Heat Exchanger of a Stirling Refrigeration Machine
Entropy 2020, 22(2), 215; https://doi.org/10.3390/e22020215 - 14 Feb 2020
Abstract
The second law of thermodynamics is applied to evaluate the influence of entropy generation on the performance of the cold heat exchanger of an experimental Stirling refrigeration machine by means of three factors: the entropy generation rate N_S, the irreversibility distribution ratio φ and the Bejan number Be|N_S, based on a dimensionless entropy ratio that we introduce. These factors are investigated as functions of the characteristic dimensions of the heat exchanger (hydraulic diameter and length), the coolant mass flow and the cold gas temperature. We demonstrate the role of these factors in the thermal and fluid-friction irreversibilities. The conclusions are derived from the behavior of the entropy generation factors concerning the heat transfer and fluid friction characteristics of a double-pipe type heat exchanger crossed by a coolant liquid (55/45 by mass ethylene glycol/water mixture) in the temperature range 240 K < T_C < 300 K. The mathematical model of entropy generation includes experimental measurements of pressures, temperatures and coolant mass flow, and the characteristic dimensions of the heat exchanger. A large characteristic length and a small hydraulic diameter generate large entropy production, especially at low mean temperature, because the high viscosity of the coolant liquid increases the fluid friction. The model and experiments showed the dominance of heat transfer over viscous friction in the cold heat exchanger, with Be|N_S → 1 and φ → 0 for mass flow rates ṁ ≥ 0.1 kg·s⁻¹. Full article
(This article belongs to the Special Issue Carnot Cycle and Heat Engine Fundamentals and Applications)

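The Bejan-number logic above can be sketched with the textbook definition, Be = thermal entropy generation over total entropy generation, together with the irreversibility distribution ratio φ. The paper's Be|N_S variant uses its own dimensionless entropy ratio, which is not reproduced here, and the values below are illustrative rather than measured:

```python
def bejan(s_gen_thermal, s_gen_friction):
    """Bejan number: fraction of total entropy generation due to heat transfer."""
    return s_gen_thermal / (s_gen_thermal + s_gen_friction)

def phi(s_gen_thermal, s_gen_friction):
    """Irreversibility distribution ratio: friction relative to heat transfer."""
    return s_gen_friction / s_gen_thermal

# Illustrative entropy generation rates (W/K), heat transfer dominating friction:
s_heat, s_fric = 0.080, 0.002
print(bejan(s_heat, s_fric), phi(s_heat, s_fric))
```

When heat transfer dominates, as reported for the cold heat exchanger, Be approaches 1 while φ approaches 0, matching the trend stated in the abstract.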
Open Access Article
Sociophysics Analysis of Multi-Group Conflicts
Entropy 2020, 22(2), 214; https://doi.org/10.3390/e22020214 - 14 Feb 2020
Abstract
We present our research on the application of statistical physics techniques to multi-group social conflicts. We identify real conflict situations whose characteristics correspond to the model. We offer realistic assumptions about conflict behaviors that are factored into model-generated scenarios. The scenarios can inform conflict research and strategies for conflict management. We discuss model applications to two- and three-group conflicts. We identify chaotic time evolution of mean attitudes and the occurrence of strange attractors. We examine the role that the range of interactions plays in the occurrence of chaotic behavior. Full article

Open Access Article
Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding
Entropy 2020, 22(2), 213; https://doi.org/10.3390/e22020213 - 13 Feb 2020
Abstract
In this paper, we develop an unsupervised generative clustering framework that combines the variational information bottleneck and the Gaussian mixture model. Specifically, in our approach, we use the variational information bottleneck method and model the latent space as a mixture of Gaussians. We derive a bound on the cost function of our model that generalizes the Evidence Lower Bound (ELBO) and provide a variational inference algorithm to compute it. In the algorithm, the coders' mappings are parametrized using neural networks, and the bound is approximated by Markov sampling and optimized with stochastic gradient descent. Numerical results on real datasets demonstrate the efficiency of our method. Full article
(This article belongs to the Special Issue Information Theory for Data Communications and Processing)
Open Access Article
On the Determination of Kappa Distribution Functions from Space Plasma Observations
Entropy 2020, 22(2), 212; https://doi.org/10.3390/e22020212 - 13 Feb 2020
Abstract
The velocities of space plasma particles often follow kappa distribution functions. The kappa index, which labels and governs these distributions, is an important parameter for understanding the plasma dynamics. Space science missions often carry plasma instruments on board that observe the plasma particles and construct their velocity distribution functions. A proper analysis of the velocity distribution functions derives the plasma bulk parameters, such as the plasma density, speed, temperature, and kappa index. Commonly, the plasma bulk density, velocity, and temperature are determined from the velocity moments of the observed distribution function. Interestingly, recent studies demonstrated the calculation of the kappa index from the speed (kinetic energy) moments of the distribution function. Such a novel calculation could be very useful in future analyses and applications. This study examines the accuracy of this method using synthetic plasma proton observations from a typical electrostatic analyzer. We analyze the modeled observations to derive the plasma bulk parameters, which we compare with the parameters used to model the observations in the first place. Through this comparison, we quantify the systematic and statistical errors in the derived moments, and we discuss their possible sources. Full article
(This article belongs to the Special Issue Theoretical Aspects of Kappa Distributions)
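The idea of recovering the kappa index from speed moments can be illustrated numerically. The isotropic kappa distribution below uses one common convention, f(v) ∝ [1 + v²/(κθ²)]^(-κ-1); the paper's exact parametrization and estimation procedure may differ. Dimensionless ratios of speed moments depend only on κ, so inverting such a ratio recovers the kappa index:

```python
import numpy as np
from scipy.integrate import quad

def speed_moment(n, kappa, theta):
    """<v^n> for an isotropic 3-D kappa distribution,
    f(v) proportional to [1 + v^2/(kappa*theta^2)]^(-kappa-1);
    the moment is finite only for n < 2*kappa - 1."""
    core = lambda v: (1.0 + v ** 2 / (kappa * theta ** 2)) ** (-(kappa + 1.0))
    num = quad(lambda v: v ** (n + 2) * core(v), 0, np.inf)[0]
    den = quad(lambda v: v ** 2 * core(v), 0, np.inf)[0]
    return num / den

kappa, theta = 4.0, 1.0
m2 = speed_moment(2, kappa, theta)
m4 = speed_moment(4, kappa, theta)
# <v^4>/<v^2>^2 is dimensionless and depends only on kappa, so measuring it
# from data determines the kappa index, as in the moment-based method analyzed.
print(m4 / m2 ** 2)
```

In practice the moments come from counts measured by the electrostatic analyzer, which is where the systematic and statistical errors examined in the study enter.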