
Table of Contents

Entropy, Volume 21, Issue 9 (September 2019) – 100 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: We investigated the quantum adiabatic pumping effect in an interferometer attached to two [...]
Open Access Article
Survey Assessment for Decision Support Using Self-Organizing Maps Profile Characterization with an Odds and Cluster Heat Map: Application to Children’s Perception of Urban School Environments
Entropy 2019, 21(9), 916; https://doi.org/10.3390/e21090916 - 19 Sep 2019
Cited by 2 | Viewed by 1182
Abstract
The interpretation of opinion and satisfaction surveys based exclusively on statistical analysis often faces difficulties due to the nature of the information and the requirements of the available statistical methods. These difficulties include the concurrence of categorical information with answers based on Likert scales with only a few levels, or the distancing of the necessary heuristic approach of the decision support system (DSS). The artificial neural network used for data analysis, called Kohonen or self-organizing maps (SOM), although rarely used for survey analysis, has been applied in many fields, facilitating the graphical representation and simple interpretation of high-dimensionality data. This clustering method, based on unsupervised learning, also allows obtaining profiles of respondents without the need to provide additional information for the creation of these clusters. In this work, we propose the identification of profiles using SOM for evaluating opinion surveys. Subsequently, non-parametric chi-square tests were conducted to test whether each answer was independent of each profile found, and in the case of statistical significance (p ≤ 0.05), the odds ratio was evaluated as an indicator of the effect size of such dependence. Finally, all results were displayed in an odds and cluster heat map so that they could be easily interpreted and used to make decisions regarding the survey results. The methodology was applied to the analysis of a survey based on forms administered to children (N = 459) about their perception of the urban environment close to their school, obtaining relevant results, facilitating interpretation, and providing support to the decision process.
(This article belongs to the Special Issue Intelligent Tools and Applications in Engineering and Mathematics)
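The cluster-wise testing step described in the abstract can be illustrated with a rough sketch (not the authors' code; all counts below are invented): a 2x2 chi-square independence test between a yes/no answer and membership in one SOM profile, with the odds ratio as the effect-size indicator.

```python
# Hedged sketch: test whether a yes/no survey answer is independent of membership
# in one SOM profile (2x2 chi-square), then use the odds ratio as an effect size.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    cells = ((a, 0, 0), (b, 0, 1), (c, 1, 0), (d, 1, 1))
    return sum((obs - rows[r] * cols[co] / n) ** 2 / (rows[r] * cols[co] / n)
               for obs, r, co in cells)

def odds_ratio(a, b, c, d):
    """Odds of a 'yes' answer inside the profile relative to outside it."""
    return (a / b) / (c / d)

# Invented counts: profile members answering yes/no vs. the rest of the sample.
chi2 = chi_square_2x2(40, 10, 120, 160)
significant = chi2 > 3.84  # p <= 0.05 threshold at 1 degree of freedom
```

With these invented counts the dependence is significant and the in-profile odds of a "yes" are several times the out-of-profile odds, which is the kind of cell that would stand out in the odds and cluster heat map.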

Open Access Article
Analytic Expressions for Radar Sea Clutter WSSUS Scattering Functions
Entropy 2019, 21(9), 915; https://doi.org/10.3390/e21090915 - 19 Sep 2019
Viewed by 664
Abstract
Bello’s stochastic linear time-varying system theory has been widely used in the wireless communications literature to characterize multipath fading channel statistics. In the context of radar backscatter, this formulation allows for statistical characterization of distributed radar targets in range and Doppler using wide-sense stationary uncorrelated scattering (WSSUS) models. WSSUS models separate the channel from the effect of the waveform and receive filter, making them an ideal formulation for waveform design problems. Of particular interest in the radar waveform design community is the ability to suppress unwanted backscatter from the Earth’s surface, known as clutter. Various methods for estimating WSSUS system functions have been studied in the literature, but to date no analytic expressions for radar surface clutter range-Doppler scattering functions exist. In this work, we derive a frequency-selective generalization of the Jakes Doppler spectrum model, which is widely used in the wireless communications literature, adapt it for use in radar problems, and show how the maximum entropy method can be used to extend this model to account for internal clutter motion. Validation of the spectral and stationarity properties of the proposed model against a subset of the Australian Ingara sea clutter database is performed, and good agreement is shown.
(This article belongs to the Special Issue Maximum Entropy and Its Application III)
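For context, the baseline model the abstract generalizes (not the paper's frequency-selective extension) is the classical Jakes Doppler power spectral density, S(f) = 1/(pi * f_d * sqrt(1 - (f/f_d)^2)) for |f| < f_d, where f_d is the maximum Doppler shift:

```python
import math

def jakes_psd(f, f_d):
    """Classical Jakes Doppler power spectral density; f_d is the maximum
    Doppler shift in Hz. Zero outside the band |f| < f_d."""
    if abs(f) >= f_d:
        return 0.0
    return 1.0 / (math.pi * f_d * math.sqrt(1.0 - (f / f_d) ** 2))
```

The density is symmetric, minimal at f = 0, and diverges (integrably) at the band edges, giving the characteristic "bathtub" spectrum.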

Open Access Article
Fractional Refined Composite Multiscale Fuzzy Entropy of International Stock Indices
Entropy 2019, 21(9), 914; https://doi.org/10.3390/e21090914 - 19 Sep 2019
Cited by 4 | Viewed by 631
Abstract
Fractional refined composite multiscale fuzzy entropy (FRCMFE), which aims to reduce the large fluctuation of the fuzzy entropy (FuzzyEn) measure and to significantly discriminate different short-term financial time series with noise, is proposed in this paper to quantify the complexity dynamics of international stock indices. To assess the FRCMFE, complexity analyses of Gaussian white noise with different signal lengths, and of the random logarithmic returns and volatility series of the international stock indices, are performed in comparison with multiscale fuzzy entropy (MFE), composite multiscale fuzzy entropy (CMFE), and refined composite multiscale fuzzy entropy (RCMFE). The empirical results show that the FRCMFE measure outperforms the traditional methods to some extent.
(This article belongs to the Special Issue The Fractional View of Complexity)
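To make the base measure concrete, here is a plain single-scale FuzzyEn sketch under common default choices (m = 2, r = 0.2, exponential membership with exponent 2; these defaults are assumptions, not taken from the paper). The paper's FRCMFE additionally applies fractional refined composite coarse-graining, which is not shown:

```python
import math

def fuzzy_entropy(x, m=2, r=0.2):
    """Single-scale FuzzyEn sketch: exponential membership applied to Chebyshev
    distances between baseline-removed template vectors of lengths m and m+1."""
    n = len(x)

    def phi(k):
        templates = []
        for i in range(n - m):  # same number of vectors for k = m and k = m + 1
            seg = x[i:i + k]
            mu = sum(seg) / k
            templates.append([v - mu for v in seg])
        total, count = 0.0, 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                d = max(abs(a - b) for a, b in zip(templates[i], templates[j]))
                total += math.exp(-(d ** 2) / r)  # fuzzy similarity of the pair
                count += 1
        return total / count

    return math.log(phi(m) / phi(m + 1))
```

A perfectly regular (constant) series yields zero entropy; noisier series yield larger values, which is the fluctuation behavior the multiscale refinements try to stabilize.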

Open Access Feature Paper Article
Multivariate Multiscale Dispersion Entropy of Biomedical Time Series
Entropy 2019, 21(9), 913; https://doi.org/10.3390/e21090913 - 19 Sep 2019
Cited by 3 | Viewed by 886
Abstract
Due to the non-linearity of numerous physiological recordings, non-linear analysis of multi-channel signals has been extensively used in biomedical engineering and neuroscience. Multivariate multiscale sample entropy (MSE–mvMSE) is a popular non-linear metric to quantify the irregularity of multi-channel time series. However, mvMSE has two main drawbacks: (1) the entropy values obtained by the original algorithm of mvMSE are either undefined or unreliable for short signals (300 sample points); and (2) the computation of mvMSE for signals with a large number of channels requires the storage of a huge number of elements. To deal with these problems and improve the stability of mvMSE, we introduce multivariate multiscale dispersion entropy (MDE–mvMDE), as an extension of our recently developed MDE, to quantify the complexity of multivariate time series. We assess mvMDE, in comparison with the state-of-the-art and most widespread multivariate approaches, namely, mvMSE and multivariate multiscale fuzzy entropy (mvMFE), on multi-channel noise signals, bivariate autoregressive processes, and three biomedical datasets. The results show that mvMDE takes into account dependencies in patterns across both the time and spatial domains. The mvMDE, mvMSE, and mvMFE methods are consistent in that they lead to similar conclusions about the underlying physiological conditions. However, the proposed mvMDE discriminates various physiological states of the biomedical recordings better than mvMSE and mvMFE. In addition, for both the short and long time series, the mvMDE-based results are noticeably more stable than the mvMSE- and mvMFE-based ones. For short multivariate time series, mvMDE, unlike mvMSE, does not result in undefined values. Furthermore, mvMDE is faster than mvMFE and mvMSE and also needs to store a considerably smaller number of elements. Due to its ability to detect different kinds of dynamics of multivariate signals, mvMDE has great potential to analyse various signals.
(This article belongs to the Special Issue Multiscale Entropy Approaches and Their Applications)
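The univariate, single-scale building block can be sketched as follows (an illustrative simplification, not the paper's mvMDE, which extends this across scales and channels); m = 2 and c = 3 classes are assumed defaults:

```python
import math
from collections import Counter

def dispersion_entropy(x, m=2, c=3):
    """Univariate dispersion entropy sketch: map samples to c classes with the
    normal CDF, count embedding patterns of length m, and return the Shannon
    entropy of the pattern distribution normalized to [0, 1]."""
    n = len(x)
    mu = sum(x) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / n)
    # class index 1..c for each sample via the normal CDF
    z = [min(c, max(1, math.ceil(c * 0.5 * (1 + math.erf((v - mu) / (sd * math.sqrt(2)))))))
         for v in x]
    patterns = Counter(tuple(z[i:i + m]) for i in range(n - m + 1))
    total = n - m + 1
    h = -sum((k / total) * math.log(k / total) for k in patterns.values())
    return h / math.log(c ** m)  # at most c**m patterns, so h is in [0, 1]
```

Because patterns are counted rather than pairwise distances compared, dispersion entropy stays defined for short series and runs much faster than sample or fuzzy entropy, which is the practical advantage the abstract emphasizes.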

Open Access Article
Evolved-Cooperative Correntropy-Based Extreme Learning Machine for Robust Prediction
Entropy 2019, 21(9), 912; https://doi.org/10.3390/e21090912 - 19 Sep 2019
Cited by 1 | Viewed by 702
Abstract
In recent years, correntropy has been widely adopted in place of the mean squared error as a powerful tool for enhancing robustness against noise and outliers by forming local similarity measurements. However, most correntropy-based models either have too simple a description of the correntropy or require too many parameters to be adjusted in advance, which is likely to cause poor performance since the correntropy fails to reflect the probability distributions of the signals. Therefore, in this paper, a novel correntropy-based extreme learning machine (ELM) called ECC-ELM is proposed to provide a more robust training strategy based on a newly developed multi-kernel correntropy whose parameters are generated using cooperative evolution. To achieve an accurate description of the correntropy, the method adopts a cooperative evolution that optimizes the bandwidths by switching delayed particle swarm optimization (SDPSO) and generates the corresponding influence coefficients that minimize the minimum integrated error (MIE) to adaptively provide the best solution. The simulated experiments and real-world applications show that cooperative evolution can achieve the optimal solution, which provides an accurate description of the probability distribution of the current error in the model. Therefore, the multi-kernel correntropy built with the optimal solution results in more robustness against noise and outliers when training the model, which increases the accuracy of the predictions compared with other methods.
(This article belongs to the Special Issue Entropy Application for Forecasting) Printed Edition available
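As background for the abstract's contrast between MSE and correntropy (a single-kernel sketch, not the paper's multi-kernel ECC-ELM formulation), the empirical correntropy with a Gaussian kernel is:

```python
import math

def correntropy(x, y, sigma=1.0):
    """Empirical Gaussian-kernel correntropy: a local similarity measure that,
    unlike MSE, saturates for large errors and so down-weights outliers."""
    return sum(math.exp(-((a - b) ** 2) / (2 * sigma ** 2))
               for a, b in zip(x, y)) / len(x)
```

Identical sequences give correntropy 1, and a single huge outlier can change the value by at most 1/len(x), whereas the same outlier can dominate an MSE-based loss; this locality is what makes correntropy-based training robust.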

Open Access Article
Design and Analysis of the Domestic Micro-Cogeneration Potential for an ORC System Adapted to a Solar Domestic Hot Water System
Entropy 2019, 21(9), 911; https://doi.org/10.3390/e21090911 - 19 Sep 2019
Cited by 2 | Viewed by 721
Abstract
This paper proposes the configuration of an Organic Rankine Cycle (ORC) coupled to a solar domestic hot water system (SDHWS) with the purpose of analyzing the cogeneration capacity of the system. A simulation of the SDHWS was conducted at different temperatures, observing its performance to determine the amounts of usable heat generated by the solar collector; thus, from an energy balance point of view, the amount of heat that may be used by the ORC could be determined. The working fluid that would be suitable for the temperatures and pressures in the system was selected. The best fluid for the given conditions of superheated vapor at 120 °C and 604 kPa and a condensation temperature of 60 °C and 115 kPa was acetone. The main parameters for the expander thermodynamic design that may be used by the ORC were obtained, with the possibility of generating 443 kWh of annual electric energy with 6.65% global efficiency of solar to electric power, or an overall efficiency of the cogeneration system of 56.35% with a solar collector of 2.84 m2.
(This article belongs to the Special Issue Thermodynamic Approaches in Modern Engineering Systems)
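As a quick consistency check on the figures quoted above (a back-of-the-envelope calculation, not taken from the paper), the annual solar irradiation implied by 443 kWh of electricity, 6.65% solar-to-electric efficiency, and a 2.84 m² collector is:

```python
annual_electric_kwh = 443.0
solar_to_electric_eff = 0.0665
collector_area_m2 = 2.84

# Implied annual irradiation on the collector plane, in kWh per m^2 per year:
# electric output = efficiency * irradiation * area, solved for irradiation.
implied_insolation = annual_electric_kwh / (solar_to_electric_eff * collector_area_m2)
```

The result is roughly 2,350 kWh/m² per year, i.e., the quoted numbers are mutually consistent for a sunny climate.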

Open Access Article
Lossless Contrast Enhancement of Color Images with Reversible Data Hiding
Entropy 2019, 21(9), 910; https://doi.org/10.3390/e21090910 - 18 Sep 2019
Cited by 1 | Viewed by 768
Abstract
Recently, lossless contrast enhancement (CE) has been proposed so that a contrast-changed image can be converted to its original version by maintaining information entropy in it. As most of the lossless CE methods are proposed for grayscale images, artifacts are probably introduced after directly applying them to color images. For instance, color distortions may be caused after CE is separately conducted in each channel of the RGB (red, green, and blue) model. To cope with this issue, a new scheme is proposed based on the HSV (hue, saturation, and value) color model. Specifically, both hue and saturation components are kept unchanged while only the value component is modified. More precisely, the ratios between the RGB components are maintained while a reversible data hiding method is applied to the value component to achieve CE effects. The experimental results clearly show CE effects obtained with the proposed scheme, while the original color images can be perfectly recovered. Several metrics including image entropy were adopted to measure the changes made in the CE procedure, while the performances were compared with those of one existing scheme. The evaluation results demonstrate that better image quality and increased information entropy can be simultaneously achieved with our proposed scheme.
(This article belongs to the Special Issue Entropy Based Data Hiding)
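The hue/saturation-preserving idea can be sketched per pixel as follows (illustrative only; the actual scheme embeds data reversibly in the value plane, and the integer rounding shown here is precisely what a reversible method must avoid):

```python
def adjust_value_channel(r, g, b, new_v):
    """Rescale an RGB pixel so its HSV value (max of R, G, B) becomes new_v
    while the R:G:B ratios, and hence hue and saturation, are preserved."""
    v = max(r, g, b)
    if v == 0:
        return (0, 0, 0)  # black pixel: no ratios to preserve
    scale = new_v / v
    return tuple(round(ch * scale) for ch in (r, g, b))
```

Because only the common scale factor changes, the hue angle and saturation computed from the RGB ratios are unchanged, so the enhancement cannot introduce the color shifts that per-channel RGB processing can.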

Open Access Article
Applying the Bayesian Stackelberg Active Deception Game for Securing Infrastructure Networks
Entropy 2019, 21(9), 909; https://doi.org/10.3390/e21090909 - 18 Sep 2019
Cited by 1 | Viewed by 650
Abstract
With new security threats cropping up every day, finding a real-time and smart protection strategy for critical infrastructure has become a big challenge. Game theory is suitable for solving this problem, for it provides a theoretical framework for analyzing the intelligent decisions of both attackers and defenders. However, existing methods are based on complete information and consider only a single type of attacker, assumptions that often do not hold in realistic situations. Furthermore, although infrastructure interconnection has been greatly improved, there is a lack of methods considering network characteristics. To overcome these limitations, we focus on the problem of infrastructure network protection under asymmetric information. We present a novel method to measure the performance of infrastructure from the network perspective. Moreover, we propose a false network construction method to simulate how the defender applies asymmetric information to defend against the attacker actively. Meanwhile, we consider multiple types of attackers and introduce the Bayesian Stackelberg game to build the model. Experiments in real infrastructure networks reveal that our approach can improve infrastructure protection performance. Our method gives a brand new way to approach the problem of infrastructure security defense.
(This article belongs to the Section Complexity)
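A minimal Bayesian Stackelberg sketch restricted to pure defender commitments (the paper's model is richer and network-aware; all payoffs and names below are invented): the defender commits first, each attacker type best-responds, and the defender maximizes its type-averaged payoff.

```python
def bayesian_stackelberg_pure(def_acts, atk_acts, type_prob, atk_payoff, def_payoff):
    """Enumerate pure defender commitments; each attacker type best-responds;
    return the commitment maximizing the defender's expected payoff over types."""
    best = None
    for d in def_acts:
        expected = 0.0
        for t, pt in type_prob.items():
            # attacker of type t observes the commitment d and best-responds
            br = max(atk_acts, key=lambda a: atk_payoff[(t, d, a)])
            expected += pt * def_payoff[(t, d, br)]
        if best is None or expected > best[1]:
            best = (d, expected)
    return best

# Invented 2-target example with a single attacker type "t":
defs, atks = ["protect1", "protect2"], ["hit1", "hit2"]
ap = {("t", "protect1", "hit1"): 0, ("t", "protect1", "hit2"): 1,
      ("t", "protect2", "hit1"): 1, ("t", "protect2", "hit2"): 0}
dp = {("t", "protect1", "hit1"): 0, ("t", "protect1", "hit2"): -1.0,
      ("t", "protect2", "hit1"): -0.5, ("t", "protect2", "hit2"): 0}
best = bayesian_stackelberg_pure(defs, atks, {"t": 1.0}, ap, dp)
```

Here the attacker always hits the unprotected target, so the defender's best commitment is to protect the more valuable one and concede the smaller loss.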

Open Access Article
Parameter Optimization Based BPNN of Atmosphere Continuous-Variable Quantum Key Distribution
Entropy 2019, 21(9), 908; https://doi.org/10.3390/e21090908 - 18 Sep 2019
Viewed by 584
Abstract
The goal of continuous-variable quantum key distribution (CVQKD) is to be widely used and adopted in diverse scenarios, so the use of atmospheric channels will play a crucial part in building global secure quantum communications. Atmospheric channel transmittance is affected by many factors and does not vary linearly, leading to great changes in the signal-to-noise ratio. It is crucial to choose the appropriate modulation variance under different turbulence intensities to acquire the optimal secret key rate. In this paper, the proposed scheme combines the four-state protocol with a back-propagation neural network (BPNN) algorithm. We apply the BPNN to CVQKD so that the modulation variance can be adjusted to an optimum value, ensuring system security and making the system performance optimal. The numerical results show that the proposed scheme can improve the secret key rate efficiently.
(This article belongs to the Special Issue Quantum Information Processing)

Open Access Feature Paper Article
Lifts of Symmetric Tensors: Fluids, Plasma, and Grad Hierarchy
Entropy 2019, 21(9), 907; https://doi.org/10.3390/e21090907 - 18 Sep 2019
Cited by 2 | Viewed by 584
Abstract
Geometrical and algebraic aspects of the Hamiltonian realizations of Euler’s fluid and Vlasov’s plasma are investigated. A purely geometric pathway (involving complete lifts and vertical representatives) is proposed, which establishes a link from particle motion to the evolution of the field variables. This pathway is free from Poisson brackets and Hamiltonian functionals. Momentum realizations (sections on T^*T^*Q) of both the compressible and incompressible Euler’s fluid and of Vlasov’s plasma are derived. Poisson mappings relating the momentum realizations with the usual field equations are constructed as duals of injective Lie algebra homomorphisms. The geometric pathway is then used to construct the evolution equations for 10-moment kinetic theory. In this way, the entire Grad hierarchy (including entropic fields) can be constructed in a purely geometric way, providing an alternative to the usual Hamiltonian approach to mechanics based on Poisson brackets.
(This article belongs to the Special Issue Entropies: Between Information Geometry and Kinetics)
Open Access Article
A Hierarchical Gamma Mixture Model-Based Method for Classification of High-Dimensional Data
Entropy 2019, 21(9), 906; https://doi.org/10.3390/e21090906 - 18 Sep 2019
Viewed by 766
Abstract
Data classification is an important research topic in the field of data mining. With the rapid development of social media sites and IoT devices, data have grown tremendously in volume and complexity, which has resulted in a lot of large and complex high-dimensional data. Classifying such high-dimensional complex data with a large number of classes has been a great challenge for current state-of-the-art methods. This paper presents a novel, hierarchical, gamma mixture model-based unsupervised method for classifying high-dimensional data with a large number of classes. In this method, we first partition the features of the dataset into feature strata by using k-means. Then, a set of subspace data sets is generated from the feature strata by using the stratified subspace sampling method. After that, the GMM Tree algorithm is used to identify the number of clusters and the initial clusters in each subspace dataset, and these initial cluster centers are passed to k-means to generate base subspace clustering results. Then, the subspace clustering results are integrated into an object cluster association (OCA) matrix by using the link-based method. The ensemble clustering result is generated from the OCA matrix by the k-means algorithm, with the number of clusters identified by the GMM Tree algorithm. After producing the ensemble clustering result, the dominant class label is assigned to each cluster after computing its purity. A new object is classified by computing the distance between it and the center of each cluster in the classifier, and it is assigned the class label of the cluster with the shortest distance. A series of experiments were conducted on twelve synthetic and eight real-world data sets with different numbers of classes, features, and objects. The experimental results have shown that the new method outperforms other state-of-the-art techniques in classifying data on most of the data sets.
(This article belongs to the Special Issue Information-Theoretical Methods in Data Mining)
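The final two steps of the pipeline, assigning each ensemble cluster its dominant (purity-maximizing) label and classifying new objects by nearest cluster center, can be sketched as follows (toy data, not the paper's implementation):

```python
import math
from collections import Counter

def label_clusters(assignments, labels):
    """Give each cluster the dominant class label among its members."""
    members = {}
    for cl, y in zip(assignments, labels):
        members.setdefault(cl, []).append(y)
    return {cl: Counter(ys).most_common(1)[0][0] for cl, ys in members.items()}

def classify(x, centers, cluster_label):
    """Assign x the class of the nearest cluster center (Euclidean distance)."""
    nearest = min(centers, key=lambda cl: math.dist(x, centers[cl]))
    return cluster_label[nearest]
```

This is why the method is "unsupervised classification": class labels enter only at the very end, to name clusters that were formed without them.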

Open Access Article
On NACK-Based rDWS Algorithm for Network Coded Broadcast
Entropy 2019, 21(9), 905; https://doi.org/10.3390/e21090905 - 17 Sep 2019
Cited by 1 | Viewed by 938
Abstract
The Drop when seen (DWS) technique, an online network coding strategy, is capable of making a broadcast transmission over erasure channels more robust. This throughput-optimal strategy reduces the expected sender queue length. One major issue with the DWS technique is its high computational complexity. In this paper, we present a randomized version of the DWS technique (rDWS), where the unique strength of the DWS, the sender’s ability to drop a packet even before its decoding at the receivers, is not compromised. The computational complexity of the algorithms is reduced with rDWS, but the encoding is not throughput optimal here, so we perform a throughput efficiency analysis of it. An exact probabilistic analysis of the innovativeness of a coefficient is found to be difficult. Hence, we carry out two individual analyses, a maximum entropy analysis and an average understanding analysis, and obtain a lower bound on the innovativeness probability of a coefficient. Based on these findings, the innovativeness probability of a coded combination is analyzed. We evaluate the performance of our proposed scheme in terms of dropping and decoding statistics through simulation. Our analysis, supported by plots, reveals some interesting facts about innovativeness and shows that the rDWS technique achieves near-optimal performance for a finite field of sufficient size.
(This article belongs to the Special Issue Information Theory and Network Coding)
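For intuition about the field-size dependence noted in the abstract (a standard random network coding bound, not the paper's exact expressions): a uniformly random coded vector of length n over GF(q) is innovative to a receiver that already holds k < n linearly independent combinations with probability 1 - q^(k-n), since the q^k vectors of the receiver's span are a vanishing fraction of the q^n possibilities.

```python
def innovative_prob(q, n, k):
    """Probability that a uniformly random length-n vector over GF(q) lies
    outside a given k-dimensional subspace (k < n), i.e., is innovative."""
    return 1.0 - q ** (k - n)
```

Even in the worst case k = n - 1 this probability is 1 - 1/q, which is why a sufficiently large finite field makes randomized coding near-optimal.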

Open Access Article
A Generic Model for Quantum Measurements
Entropy 2019, 21(9), 904; https://doi.org/10.3390/e21090904 - 17 Sep 2019
Cited by 3 | Viewed by 635
Abstract
In previous articles, we presented a derivation of Born’s rule and unitary transforms in Quantum Mechanics (QM), from a simple set of axioms built upon a physical phenomenology of quantization—physically, the structure of QM results from an interplay between the quantized number of “modalities” accessible to a quantum system and the continuum of “contexts” required to define these modalities. In the present article, we provide a unified picture of quantum measurements within our approach, and further justify the role of the system–context dichotomy and of quantum interferences. We also discuss links with stochastic quantum thermodynamics and with algebraic quantum theory.
(This article belongs to the Special Issue Quantum Information Revolution: Impact to Foundations)
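Since the article builds on a derivation of Born's rule, here is a one-line reminder of that rule in code (standard textbook QM, not the authors' derivation): the probability of a measurement outcome is the squared magnitude of the overlap amplitude.

```python
def born_probability(state, outcome):
    """Born rule: probability of finding `state` in `outcome`
    (both given as unit-norm lists of complex amplitudes)."""
    amplitude = sum(o.conjugate() * s for o, s in zip(outcome, state))
    return abs(amplitude) ** 2
```

For an equal superposition of two basis states, each basis outcome occurs with probability 1/2, while projecting the state onto itself gives probability 1.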

Open Access Article
A Formal Model for Semantic Computing Based on Generalized Probabilistic Automata
Entropy 2019, 21(9), 903; https://doi.org/10.3390/e21090903 - 17 Sep 2019
Viewed by 510
Abstract
In most previous research, “semantic computing” refers to computational implementations of semantic reasoning, which lack support from the formal theory of computation. To provide solid foundations for semantic computing, researchers have proposed a different understanding of semantic computing based on finite automata. This provides a computation-theoretic approach to semantic computing, but finite automata are not powerful enough to deal with imprecise knowledge. Therefore, in this paper, we provide foundations for semantic computing based on probabilistic automata. Even though traditional probabilistic automata can handle imprecise knowledge, they are limited by being defined over a fixed finite input alphabet, which deeply restricts their abilities. In this paper, we rebuild traditional probabilistic automata for semantic computing. Our new probabilistic automata are robust enough to handle any alphabet as input, and they perform better in many applications. We provide an application to weather forecasting, a domain for which traditional probabilistic automata are not effective due to their finite input alphabet; our new probabilistic automata overcome these limitations.
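The classical object being generalized can be stated compactly (a textbook probabilistic automaton over a fixed finite alphabet, i.e., exactly the restriction the paper lifts; the example numbers are invented):

```python
def acceptance_prob(init, trans, accept, word):
    """Probability that a probabilistic automaton accepts `word`: push the state
    distribution through one row-stochastic matrix per input symbol, then sum
    the mass on accepting states."""
    dist = list(init)
    for sym in word:
        m = trans[sym]
        dist = [sum(dist[i] * m[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return sum(p for p, acc in zip(dist, accept) if acc)

# Two states; on 'a', state 0 stays with prob 0.5 or moves to the accepting state 1.
trans = {"a": [[0.5, 0.5], [0.0, 1.0]]}
p = acceptance_prob([1.0, 0.0], trans, [False, True], "aa")
```

Note that `trans` must supply a matrix for every symbol that can occur, which is precisely why a fixed finite alphabet is baked into the classical definition.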
Open Access Article
Optimization of Big Data Scheduling in Social Networks
Entropy 2019, 21(9), 902; https://doi.org/10.3390/e21090902 - 17 Sep 2019
Cited by 2 | Viewed by 616
Abstract
In social network big data scheduling, it is easy for target data to conflict in the same data node. Of the different kinds of entropy measures, this paper focuses on the optimization of target entropy. Therefore, this paper presents an optimized method for the scheduling of big data in social networks and also takes into account each task’s amount of data communication during target data transmission to construct a big data scheduling model. Firstly, the task scheduling model is constructed to solve the problem of conflicting target data in the same data node. Next, the necessary conditions for the scheduling of tasks are analyzed. Then, the aperiodic task distribution function is calculated. Finally, tasks are scheduled based on the minimum product of the corresponding resource level, and the minimum execution time of each task is calculated. Experimental results show that our optimized scheduling model quickly optimizes the scheduling of social network data and solves the problem of strong data collision.

Open Access Article
What Is the Entropy of a Social Organization?
Entropy 2019, 21(9), 901; https://doi.org/10.3390/e21090901 - 17 Sep 2019
Cited by 2 | Viewed by 1081
Abstract
We quantify a social organization’s potentiality, that is, its ability to attain different configurations. The organization is represented as a network in which nodes correspond to individuals and (multi-)edges to their multiple interactions. Attainable configurations are treated as realizations from a network ensemble. To have the ability to encode interaction preferences, we choose the generalized hypergeometric ensemble of random graphs, which is described by a closed-form probability distribution. From this distribution we calculate Shannon entropy as a measure of potentiality. This allows us to compare different organizations as well as different stages in the development of a given organization. The feasibility of the approach is demonstrated using data from three empirical and two synthetic systems.
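The potentiality measure ultimately reduces to the Shannon entropy of the ensemble's configuration distribution; in its simplest discrete form (a sketch, with the generalized hypergeometric ensemble replaced by an arbitrary probability list):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in nats, of a discrete distribution over configurations."""
    return -sum(p * math.log(p) for p in probs if p > 0)
```

A distribution concentrated on one configuration has zero entropy (no potentiality), while a uniform distribution over many attainable configurations maximizes it.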

Open Access Article
An Entropy-Based Algorithm with Nonlocal Residual Learning for Image Compressive Sensing Recovery
Entropy 2019, 21(9), 900; https://doi.org/10.3390/e21090900 - 17 Sep 2019
Cited by 2 | Viewed by 623
Abstract
Image recovery from compressive sensing (CS) measurement data, especially noisy data, has always been challenging due to its implicit ill-posed nature; thus, seeking a domain in which a signal can exhibit a high degree of sparsity and designing an effective algorithm have drawn increasingly more attention. Among various sparsity-based models, structured or group sparsity often leads to more powerful signal reconstruction techniques. In this paper, we propose a novel entropy-based algorithm for CS recovery that enhances image sparsity by learning the group sparsity of the residual. To reduce the residual of packed groups of similar patches, the group sparsity of the residual is described by a Laplacian scale mixture (LSM) model: each singular value of the residual is modeled as a Laplacian distribution with a variable scale parameter, to exploit the benefits of high-order dependency among sparse coefficients. Due to the latent variables, the maximum a posteriori (MAP) estimate of the sparse coefficients cannot be obtained directly; thus, we design a loss function for the expectation–maximization (EM) method based on relative entropy. Within the EM iteration, the sparse coefficients can be estimated with the denoising-based approximate message passing (D-AMP) algorithm. Experimental results have shown that the proposed algorithm can significantly outperform existing CS techniques for image recovery.
(This article belongs to the Special Issue Entropy-Based Algorithms for Signal Processing)
Open AccessArticle
Improved Adaptive Successive Cancellation List Decoding of Polar Codes
Entropy 2019, 21(9), 899; https://doi.org/10.3390/e21090899 - 17 Sep 2019
Viewed by 745
Abstract
Although the adaptive successive cancellation list (AD-SCL) algorithm and the segmented-CRC adaptive successive cancellation list (SCAD-SCL) algorithm, both based on the cyclic redundancy check (CRC), can greatly reduce the computational complexity of the successive cancellation list (SCL) algorithm, they discard the previous decoding result and re-decode after increasing the list size L. When the CRC fails, these two algorithms therefore waste useful information from the previous decoding attempt. In this paper, a simplified adaptive successive cancellation list (SAD-SCL) algorithm is proposed. Before each re-decoding with an updated L, SAD-SCL uses the existing log-likelihood ratio (LLR) information to locate the range of burst error bits, and re-decoding then starts at the incorrect bit with the smallest index in this range. Moreover, when a segmented information sequence cannot be decoded correctly, SAD-SCL falls back to SC decoding for the subsequent segments, with almost the same decoding performance as the AD-SCL algorithm. Simulation results show that SAD-SCL has lower computational complexity than AD-SCL and SCAD-SCL with negligible loss of performance. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
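The adaptive strategy common to these algorithms, re-decoding with a doubled list size L until a candidate passes the CRC, can be sketched as follows. Here `candidates_at` is a hypothetical stub standing in for an SCL polar decoder that returns the L most likely payloads, and `zlib.crc32` stands in for the code's actual CRC:

```python
import zlib

def crc_ok(payload: bytes, crc: int) -> bool:
    return zlib.crc32(payload) == crc

def adaptive_list_decode(candidates_at, crc, l_max=32):
    """Double the list size L until some candidate passes the CRC.

    `candidates_at(L)` is a stand-in for an SCL polar decoder returning
    the L most likely payloads in order of decreasing likelihood.
    """
    L = 1
    while L <= l_max:
        for cand in candidates_at(L):
            if crc_ok(cand, crc):
                return cand, L
        L *= 2  # AD-SCL style: discard the result and re-decode with a larger list
    return None, L

# Toy usage: the true word only enters the candidate list once L >= 4.
truth = b"polar"
ranked = [b"wrong1", b"wrong2", b"wrong3", truth]
word, L = adaptive_list_decode(lambda n: ranked[:n], zlib.crc32(truth))
print(word, L)  # b'polar' 4
```

The SAD-SCL improvement described above amounts to reusing the LLRs from the failed pass to restart decoding at the first suspect bit instead of from scratch.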
Open AccessArticle
Bayesian-Maximum-Entropy Reweighting of IDP Ensembles Based on NMR Chemical Shifts
Entropy 2019, 21(9), 898; https://doi.org/10.3390/e21090898 - 17 Sep 2019
Cited by 4 | Viewed by 1138
Abstract
Bayesian and Maximum Entropy approaches allow for a statistically sound and systematic fitting of experimental and computational data. Unfortunately, assessing the relative confidence in these two types of data remains difficult as several steps add unknown error. Here we propose the use of a validation-set method to determine the balance, and thus the amount of fitting. We apply the method to synthetic NMR chemical shift data of an intrinsically disordered protein. We show that the method gives consistent results even when other methods to assess the amount of fitting cannot be applied. Finally, we also describe how the errors in the chemical shift predictor can lead to an incorrect fitting and how using secondary chemical shifts could alleviate this problem. Full article
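The reweighting step at the core of such Bayesian/maximum-entropy schemes can be illustrated in miniature: find the minimally perturbed (maximum-entropy) weights w_i ∝ exp(λx_i) whose ensemble average matches an experimental target. This one-observable sketch with a hypothetical chemical-shift series omits the error models and the validation-set machinery the paper is actually about:

```python
import math

def maxent_reweight(obs, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy reweighting of an ensemble: weights w_i ∝ exp(lambda*x_i)
    chosen so the weighted mean of the computed observable matches `target`.
    The multiplier lambda is found by bisection (the mean is monotone in it)."""
    def weighted_mean(lam):
        w = [math.exp(lam * x) for x in obs]
        z = sum(w)
        return sum(wi * xi for wi, xi in zip(w, obs)) / z

    assert min(obs) < target < max(obs), "target must be bracketed by the ensemble"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if weighted_mean(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in obs]
    z = sum(w)
    return [wi / z for wi in w]

# Toy per-frame chemical shifts from a simulated IDP ensemble; shift the
# ensemble average to match a hypothetical "measured" value of 4.9.
shifts = [4.2, 4.6, 5.0, 5.4]
w = maxent_reweight(shifts, 4.9)
print(sum(wi * xi for wi, xi in zip(w, shifts)))  # ≈ 4.9
```

In the paper the balance between fitting and overfitting, i.e., how hard to push toward the target, is what the validation-set method is used to decide.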
Open AccessArticle
Evidential Decision Tree Based on Belief Entropy
Entropy 2019, 21(9), 897; https://doi.org/10.3390/e21090897 - 16 Sep 2019
Cited by 31 | Viewed by 1070
Abstract
The decision tree is widely applied in many areas, such as classification and recognition. Traditional information entropy and Pearson's correlation coefficient are often used as splitting criteria to find the best splitting attribute. However, these measures cannot handle uncertainty, since they capture neither the relation between attributes nor the degree of disorder within them. Deng entropy, by contrast, can measure the degree of uncertainty of a basic belief assignment (BBA) in uncertain problems. In this paper, Deng entropy is used as the splitting criterion to construct an evidential decision tree for fuzzy dataset classification. Compared with the traditional combination rules used to combine BBAs, the evidential decision tree can be applied to classification directly, which efficiently reduces the complexity of the algorithm. In addition, experiments are conducted on the Iris dataset to build an evidential decision tree that achieves more accurate classification. Full article
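Deng entropy itself is straightforward to compute. A sketch for a BBA given as masses on focal elements, using the standard definition in which each focal element A is scaled by (2^|A| − 1):

```python
import math

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment, given as a dict mapping
    focal elements (frozensets) to masses that sum to 1:
        E_d = -sum_A m(A) * log2( m(A) / (2**|A| - 1) )."""
    assert abs(sum(bba.values()) - 1.0) < 1e-9
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in bba.items() if m > 0)

# All mass on a singleton: a certain outcome, entropy 0 (same as Shannon).
print(deng_entropy({frozenset({"a"}): 1.0}))       # 0.0
# All mass on a two-element set: entropy log2(3) ≈ 1.585, reflecting the
# extra uncertainty about which element of the set is meant.
print(deng_entropy({frozenset({"a", "b"}): 1.0}))  # ≈ 1.585
```

In a tree-building loop, this value would replace Shannon entropy in the information-gain computation when the attribute values are themselves BBAs.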
Open AccessEditorial
Entropy in Dynamic Systems
Entropy 2019, 21(9), 896; https://doi.org/10.3390/e21090896 - 16 Sep 2019
Viewed by 564
Abstract
In order to measure and quantify the complex behavior of real-world systems, either novel mathematical approaches or modifications of classical ones are required to precisely predict, monitor and control complicated chaotic and stochastic processes [...] Full article
(This article belongs to the Special Issue Entropy in Dynamic Systems) Printed Edition available
Open AccessFeature PaperArticle
Universality and Exact Finite-Size Corrections for Spanning Trees on Cobweb and Fan Networks
Entropy 2019, 21(9), 895; https://doi.org/10.3390/e21090895 - 15 Sep 2019
Viewed by 652
Abstract
The concept of universality is a cornerstone of theories of critical phenomena. It is very well understood in most systems, especially in the thermodynamic limit. Finite-size systems present additional challenges. Even in low dimensions, universality of the edge and corner contributions to free energies and response functions is less investigated and less well understood. In particular, the question arises of how universality is maintained in corrections to scaling in systems of the same universality class but with very different corner geometries. Two-dimensional geometries deliver the simplest such examples that can be constructed with and without corners. To investigate how the presence and absence of corners manifest universality, we analyze the spanning tree generating function on two different finite systems, namely the cobweb and fan networks. The corner free energies of these configurations have stimulated significant interest precisely because of expectations regarding their universal properties, and we address how this can be delivered given that the finite-size cobweb has no corners while the fan has four. To answer this, we appeal to the Ivashkevich–Izmailian–Hu approach, which unifies the generating functions of distinct networks in terms of a single partition function with twisted boundary conditions. This unified approach shows that the contributions to the individual corner free energies of the fan network sum to zero, so that it precisely matches that of the cobweb. It therefore also matches conformal theory (in which the central charge is found to be c = −2) and finite-size scaling predictions. Correspondence in each case with results established by alternative means for both networks verifies the soundness of the Ivashkevich–Izmailian–Hu algorithm.
Its broad range of usefulness is demonstrated by its application to hitherto unsolved problems—namely the exact asymptotic expansions of the logarithms of the generating functions and the conformal partition functions for fan and cobweb geometries. We also investigate strip geometries, again confirming the predictions of conformal field theory. Thus, the resolution of a universality puzzle demonstrates the power of the algorithm and opens up new applications in the future. Full article
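The spanning tree generating functions studied here reduce, at uniform edge weights, to spanning tree counts, which for any finite network follow from Kirchhoff's matrix-tree theorem. A small sketch of that counting step (not the Ivashkevich–Izmailian–Hu machinery, which is what delivers the exact asymptotic expansions):

```python
import numpy as np

def spanning_tree_count(adj):
    """Count spanning trees via Kirchhoff's matrix-tree theorem:
    any cofactor of the graph Laplacian L = D - A equals the number
    of spanning trees."""
    A = np.array(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    minor = L[1:, 1:]  # delete row 0 and column 0
    return round(np.linalg.det(minor))

# Complete graph K4: Cayley's formula gives 4^(4-2) = 16 spanning trees.
K4 = [[0, 1, 1, 1],
      [1, 0, 1, 1],
      [1, 1, 0, 1],
      [1, 1, 1, 0]]
print(spanning_tree_count(K4))  # 16
```

For cobweb and fan networks the same determinant exists but the interest lies in its large-size expansion, where the bulk, surface, and corner terms discussed above appear.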
Open AccessArticle
Application of a Speedy Modified Entropy Method in Assessing the Complexity of Baroreflex Sensitivity for Age-Controlled Healthy and Diabetic Subjects
Entropy 2019, 21(9), 894; https://doi.org/10.3390/e21090894 - 14 Sep 2019
Cited by 5 | Viewed by 739
Abstract
The percussion entropy index (PEIoriginal) was recently introduced to assess the complexity of baroreflex sensitivity. This study investigated the ability of a speedy modified PEI (PEINEW) to distinguish among age-controlled subjects with or without diabetes. This was carried out using simultaneous photoplethysmography (PPG) pulse amplitude series and R wave-to-R wave interval (RRI) series acquired from healthy subjects (Group 1, number = 42), subjects diagnosed with type 2 diabetes mellitus and satisfactory blood sugar control (Group 2, number = 38), and type 2 diabetic patients with poor blood sugar control (Group 3, number = 35). Results from PEIoriginal and multiscale cross-approximate entropy (MCAE) on the same datasets are also reported for comparison. The results show that the optimal prolongation between the amplitude series and the RRI series could be delayed by one to three heartbeat cycles for Group 2 and by one to four heartbeat cycles for Group 3, whereas Group 1 subjects only showed prolongation of one heartbeat cycle. This study not only demonstrates the sensitivity of PEINEW and PEIoriginal in differentiating between Groups 2 and 3 compared with MCAE, highlighting the feasibility of percussion entropy applications in autonomic nervous function assessment, but also shows that PEINEW considerably reduces the required computational time. Full article
Open AccessArticle
Monitoring Autonomic and Central Nervous System Activity by Permutation Entropy during Short Sojourn in Antarctica
Entropy 2019, 21(9), 893; https://doi.org/10.3390/e21090893 - 14 Sep 2019
Cited by 1 | Viewed by 719
Abstract
The aim of this study was to monitor acute response patterns of autonomic and central nervous system activity during an encounter with Antarctica by synchronously recording heart rate variability (HRV) and electroencephalography (EEG). At three different time-points during the two-week sea journey, EEG and HRV were recorded from nine male scientists who participated in "The First Turkish Antarctic Research Expedition". The recordings were performed in a relaxed state with the eyes open, with the eyes closed, and during a space quantity perception test. The EEG was recorded with the wireless 14-channel Emotiv EPOC device, and the HRV with a Polar S810i heart rate monitor. The HRV data were analyzed by time- and frequency-domain parameters as well as by ordinal pattern statistics; for the EEG data, spectral band power in the conventional frequency bands and permutation entropy values were calculated. For HRV, neither the conventional nor the permutation entropy calculations produced significant differences across the journey time-points, but only permutation entropy was able to differentiate between the testing conditions: during the cognitive test, permutation entropy values increased significantly, whereas the conventional HRV parameters showed no significant differences. In the EEG analysis, the ordinal pattern statistics revealed significant transitions in the course of the sea voyage as permutation entropy values decreased, whereas spectral band power analysis could not detect any significant difference. Permutation entropy was further able to differentiate between the three testing conditions as well as between brain regions. In the conventional spectral band power analysis, alpha band power could separate the three testing conditions and the brain regions, while beta band power could only do so for the brain regions. This superiority of permutation entropy in discerning subtle differences in the autonomic and central nervous system's responses to an overwhelming subjective experience renders it suitable as an analysis tool for biomonitoring in extreme environments. Full article
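Permutation entropy, the ordinal-pattern statistic this study relies on, is simple to compute: count the ordinal patterns of short windows of the series and take the Shannon entropy of their distribution (Bandt–Pompe). A sketch with the common normalization by log2(order!):

```python
import math
import random
from collections import Counter

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a series: Shannon entropy of the
    distribution of ordinal patterns of length `order`."""
    n = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: x[i + k * delay]))
        for i in range(n)
    )
    h = -sum((c / n) * math.log2(c / n) for c in patterns.values())
    return h / math.log2(math.factorial(order)) if normalize else h

# A monotone series has a single ordinal pattern: entropy 0.
print(permutation_entropy(list(range(100))))  # 0.0
# An irregular series visits many patterns: normalized entropy near 1.
random.seed(0)
print(permutation_entropy([random.random() for _ in range(5000)]))
```

Applied to RRI or per-channel EEG series, rises and falls in this quantity are the transitions the study reports.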
Open AccessArticle
Functional Linear and Nonlinear Brain–Heart Interplay during Emotional Video Elicitation: A Maximum Information Coefficient Study
Entropy 2019, 21(9), 892; https://doi.org/10.3390/e21090892 - 14 Sep 2019
Cited by 3 | Viewed by 721
Abstract
The brain and heart continuously interact through anatomical and biochemical connections. Although several brain regions are known to be involved in autonomic control, the functional brain–heart interplay (BHI) during emotional processing is not yet fully characterized. To this aim, we investigate BHI during emotional elicitation in healthy subjects. The functional linear and nonlinear couplings are quantified using the maximum information coefficient calculated between time-varying electroencephalography (EEG) power spectra within the canonical bands (δ, θ, α, β, and γ) and time-varying low-frequency and high-frequency powers from heartbeat dynamics. Experimental data were gathered from 30 healthy volunteers whose emotions were elicited through pleasant and unpleasant high-arousing videos. Results demonstrate that functional BHI increases during the videos with respect to a resting state through EEG oscillations not including the γ band (>30 Hz). Functional linear coupling seems associated with high-arousing positive elicitation, with preferred EEG oscillations in the θ band ([4, 8) Hz), especially over the left-temporal and parietal cortices. Differential functional nonlinear coupling between emotional valence levels seems to occur mainly through EEG oscillations in the δ, θ, and α bands and sympathovagal dynamics, as well as through δ, α, and β oscillations and parasympathetic activity, mainly over the right hemisphere. Functional BHI through δ and α oscillations over the prefrontal region seems primarily nonlinear. This study provides novel insights into synchronous heartbeat and cortical dynamics during emotional video elicitation, and suggests that a nonlinear analysis is needed to fully characterize functional BHI. Full article
(This article belongs to the Special Issue Information Dynamics in Brain and Physiological Networks)
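The maximum information coefficient is, roughly, the maximum over grid resolutions of normalized mutual information on a grid. The sketch below is a deliberately simplified proxy restricted to k-by-k equipartition (quantile) grids; the real MIC statistic also optimizes over non-uniform grids and unequal resolution pairs:

```python
import math
from collections import Counter

def _quantile_bins(v, k):
    """Assign each value to one of k near-equal-count (quantile) bins."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    bins = [0] * len(v)
    for rank, i in enumerate(order):
        bins[i] = min(rank * k // len(v), k - 1)
    return bins

def mic_proxy(x, y, resolutions=(2, 3, 4, 5)):
    """Simplified MIC-like score: max over k-by-k equipartition grids of
    I(X;Y) / log2(k). Scores near 1 for deterministic (even nonlinear)
    relations, near 0 for independent variables."""
    n = len(x)
    best = 0.0
    for k in resolutions:
        bx, by = _quantile_bins(x, k), _quantile_bins(y, k)
        joint = Counter(zip(bx, by))
        px, py = Counter(bx), Counter(by)
        mi = sum((c / n) * math.log2(c * n / (px[a] * py[b]))
                 for (a, b), c in joint.items())
        best = max(best, mi / math.log2(k))
    return best

# A nonlinear but deterministic relation scores near 1.
xs = [i / 200 for i in range(200)]
print(mic_proxy(xs, [v * v for v in xs]))  # ≈ 1.0
```

In the study the inputs would be a time-varying EEG band power series and a time-varying HRV power series, and linear versus nonlinear coupling is disentangled with additional analysis beyond this score.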
Open AccessArticle
Service-Oriented Model Encapsulation and Selection Method for Complex System Simulation Based on Cloud Architecture
Entropy 2019, 21(9), 891; https://doi.org/10.3390/e21090891 - 14 Sep 2019
Viewed by 759
Abstract
With the rise of cloud computing architectures, the development of service-oriented simulation models has gradually become a prominent topic in the field of complex system simulation. To support the distributed sharing of simulation models with large computational requirements, and to select the optimal model service for constructing complex system simulation applications, this paper proposes a service-oriented model encapsulation and selection method. The method encapsulates models as shared simulation services, supports the distributed scheduling of model services across the network, and provides a semantic search framework that lets users search for models according to model correlation. An optimization selection algorithm based on quality of service (QoS) is proposed, which lets users customize the weights of the QoS indices and obtain an ordered candidate model set by weighted comparison. The experimental results showed that the parallel operation of service models can effectively improve the execution efficiency of complex system simulation applications, with performance increased by 19.76% compared with that of the scatter distribution strategy. The QoS-weighted model selection method based on semantic search can support the effective search and selection of simulation models in the cloud environment according to the user's preferences. Full article
(This article belongs to the Special Issue Computation in Complex Networks)
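The QoS-weighted selection step can be sketched directly: score each candidate service by a user-weighted sum of its normalized, larger-is-better QoS indices and sort. The index names and values here are hypothetical, not the paper's:

```python
def rank_services(candidates, weights):
    """Rank candidate model services by a user-weighted QoS score.
    Each candidate is (name, qos) where qos maps index names to values
    already normalized to [0, 1] with larger-is-better."""
    def score(qos):
        return sum(weights[k] * qos[k] for k in weights)
    return sorted(candidates, key=lambda c: score(c[1]), reverse=True)

# Hypothetical candidates with made-up QoS indices.
cands = [
    ("model-A", {"reliability": 0.9, "speed": 0.4, "cost": 0.8}),
    ("model-B", {"reliability": 0.7, "speed": 0.9, "cost": 0.6}),
]
# A user who weights speed heavily prefers model-B.
best = rank_services(cands, {"reliability": 0.2, "speed": 0.6, "cost": 0.2})[0][0]
print(best)  # model-B
```

Real QoS indices (latency, cost, availability) usually need normalization and direction-flipping (smaller-is-better metrics inverted) before such a weighted sum is meaningful.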
Open AccessArticle
Thermodynamics Beyond Molecules: Statistical Thermodynamics of Probability Distributions
Entropy 2019, 21(9), 890; https://doi.org/10.3390/e21090890 - 13 Sep 2019
Viewed by 977
Abstract
Statistical thermodynamics has a universal appeal that extends beyond molecular systems, and yet, as its tools are transplanted to fields outside physics, the fundamental question of what thermodynamics is has remained unanswered. We answer this question here. Generalized statistical thermodynamics is a variational calculus of probability distributions. It is independent of physical hypotheses but provides the means to incorporate our knowledge, assumptions, and physical models about the stochastic process that gives rise to the probability in question. We derive the familiar calculus of thermodynamics via a probabilistic argument that makes no reference to physics. At the heart of the theory is a space of distributions and a special functional that assigns probabilities to this space. The maximization of this functional generates the mathematical network of thermodynamic relationships. We obtain statistical mechanics as a special case and make contact with information theory and Bayesian inference. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
Open AccessArticle
Frequency Dependence of the Entanglement Entropy Production in a System of Coupled Driven Nonlinear Oscillators
Entropy 2019, 21(9), 889; https://doi.org/10.3390/e21090889 - 13 Sep 2019
Viewed by 509
Abstract
Driven nonlinear systems have attracted great interest owing to their applications in quantum technologies such as quantum information, where entanglement is a vital resource that can be measured by entropy in bipartite systems. In this paper, we investigate the impact of the driving frequency on entanglement in a bipartite system of two coupled driven nonlinear oscillators. We find numerically that the time evolution of the entanglement entropy between the subsystems depends significantly on the driving frequency: the dependence of the entropy production on the driving frequency exhibits a pronounced peak, meaning that the entanglement between the subsystems can be greatly increased by tuning the driving frequency. Further analysis shows that the enhancement of the entropy production by the driving frequency is closely related to the energy levels involved in the quantum evolution, as confirmed by results for the quantum spectrum and the dispersion of the wave function in phase space. Our work gives a convenient way to enhance the entanglement in driven nonlinear systems and sheds light on the role of driven nonlinear systems in quantum information technologies. Full article
(This article belongs to the Special Issue The Ubiquity of Entropy)
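For a bipartite pure state, the entanglement entropy used here as the entanglement measure is the von Neumann entropy of either reduced state, computable from the Schmidt (singular) values of the reshaped state vector. A minimal two-qubit sketch, not the driven-oscillator system itself:

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entanglement entropy (in bits) of a bipartite pure
    state vector: Shannon entropy of the squared Schmidt coefficients,
    obtained from the SVD of the state reshaped to dim_a x dim_b."""
    s = np.linalg.svd(np.reshape(psi, (dim_a, dim_b)), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]  # drop numerically zero Schmidt weights
    return float(-np.sum(p * np.log2(p)))

# Product state |00>: no entanglement.
print(entanglement_entropy(np.array([1.0, 0, 0, 0]), 2, 2))  # 0.0
# Bell state (|00> + |11>)/sqrt(2): maximal entanglement, 1 bit.
bell = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)
print(entanglement_entropy(bell, 2, 2))  # 1.0
```

For the coupled oscillators the same formula applies, with `dim_a`, `dim_b` the truncated Hilbert-space dimensions of the two subsystems and `psi` the numerically evolved state.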
Open AccessArticle
On the Security of a Latin-Bit Cube-Based Image Chaotic Encryption Algorithm
Entropy 2019, 21(9), 888; https://doi.org/10.3390/e21090888 - 12 Sep 2019
Cited by 2 | Viewed by 727
Abstract
In this paper, a security analysis of an image chaotic encryption algorithm based on Latin cubes and bit cubes is given. The analyzed algorithm adopts a three-stage first-scrambling, diffusion, second-scrambling encryption scheme. First, a finite field is constructed using chaotic sequences. Then, the Latin cubes are generated from finite field operations and used for image chaotic encryption. In addition, according to the statistical characteristics of the diffusion image in the diffusion stage, the algorithm also uses different Latin cube combinations to scramble the diffusion image a second time. However, the generation of the Latin cubes is independent of the plain image, and in the diffusion stage, when any single bit in the plain image changes, the corresponding bits in the cipher image change with obvious regularity. Thus, the equivalent secret keys can be obtained by a chosen-plaintext attack. Theoretical analysis and experimental results indicate that at most 2.5 × w × h/3 + 6 plain images are needed to crack a cipher image with w × h resolution. The size of the equivalent keys deciphered by the method proposed in this paper is much smaller than with other general cryptanalysis methods for similar encryption schemes. Full article
(This article belongs to the Special Issue Entropy in Image Analysis II) Printed Edition available
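The structural flaw being exploited, key material independent of the plain image, is the same one that breaks a plain XOR stream cipher, where a single chosen all-zero plaintext reveals the equivalent key. A deliberately simplified sketch (the paper's actual attack targets a far more involved scheme and needs on the order of the quoted number of chosen images, not one):

```python
def encrypt(plain: bytes, key: bytes) -> bytes:
    """A (deliberately weak) cipher whose keystream does not depend on
    the plaintext -- the class of flaw exploited by the paper."""
    return bytes(p ^ k for p, k in zip(plain, key))

def chosen_plaintext_attack(oracle, length):
    """Encrypting an all-zero image makes the oracle output the
    equivalent key stream directly, since 0 XOR k = k."""
    return oracle(bytes(length))

key = bytes([0x3C, 0xA5, 0x77, 0x01])
recovered = chosen_plaintext_attack(lambda m: encrypt(m, key), len(key))
print(recovered == key)  # True

# With the equivalent key, any intercepted cipher image decrypts directly.
cipher = encrypt(b"\x10\x20\x30\x40", key)
print(encrypt(cipher, recovered) == b"\x10\x20\x30\x40")  # True
```

Plaintext-dependent diffusion is precisely the countermeasure that defeats this class of attack, which is why its absence in the analyzed algorithm is fatal.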
Open AccessArticle
Towards Quantum-Secured Permissioned Blockchain: Signature, Consensus, and Logic
Entropy 2019, 21(9), 887; https://doi.org/10.3390/e21090887 - 12 Sep 2019
Cited by 4 | Viewed by 1256
Abstract
While blockchain technology is universally considered a significant technology for the near future, some of its pillars are under threat from another thriving technology: quantum computing. In this paper, we propose important safeguard measures against this threat by developing a framework for a quantum-secured, permissioned blockchain called Logicontract (LC). LC adopts a digital signature scheme based on quantum key distribution (QKD) mechanisms and a vote-based consensus algorithm to achieve consensus on the blockchain. The main contributions of this paper are: (1) an unconditionally secure signature scheme for LC, which makes it immune to attacks by quantum computers; (2) a scalable consensus protocol used by LC; (3) a logic-based scripting language for the creation of smart contracts on LC; and (4) a quantum-resistant lottery protocol which illustrates the power and usage of LC. Full article
(This article belongs to the Special Issue Blockchain: Security, Challenges, and Opportunities)
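The vote-based consensus idea can be illustrated with a toy commit rule: a block is accepted once strictly more than 2/3 of the validators vote for it, the classical threshold tolerating f < n/3 faulty validators. This is only a schematic stand-in for LC's actual protocol, whose specifics (and QKD-based authentication of the votes) are in the paper:

```python
def vote_consensus(votes, n_validators):
    """Toy vote-based block acceptance: commit once strictly more than
    2/3 of the n validators have voted for the block. Duplicate votes
    from the same validator are counted once."""
    return len(set(votes)) * 3 > 2 * n_validators

# 7 of 9 distinct validators approve: committed.
print(vote_consensus([f"v{i}" for i in range(7)], 9))  # True
# 5 of 9: below the threshold, not committed.
print(vote_consensus([f"v{i}" for i in range(5)], 9))  # False
```

In a quantum-secured setting, each vote would additionally carry an unconditionally secure signature so that faulty validators cannot forge approvals.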