Special Issue "The 20th Anniversary of Entropy - Recent Advances in Entropy and Information-Theoretic Concepts and Their Applications"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (15 December 2018).

Special Issue Editor

Dr. Raúl Alcaraz
Guest Editor
Research Group in Electronic, Biomedical and Telecommunication Engineering, Universidad de Castilla-La Mancha, Campus Universitario s/n, 16071, Cuenca, Spain
Interests: entropy; complexity; information theory; information geometry; nonlinear dynamics; computational mathematics and statistics in medicine; biomedical time series analysis; cardiac signal processing

Special Issue Information

Dear Colleagues,

Entropy marks its 20th anniversary in 2018. Our journal has a long history of publishing reflective and enlightening works on the development of entropy and information-theoretic concepts, as well as their application in a broad variety of areas, including physics, engineering, economics, chemistry and biology, among others. In fact, 3018 papers have been published in these two decades; among them, 444 articles have been cited 10 times or more. Moreover, in recent years the journal website has attracted more than 160,000 views per month. Without the help of our readers, authors, anonymous peer reviewers and editors, we would never have achieved this milestone. Thus, thank you very much!

To celebrate this anniversary, we are launching this Special Issue, entitled “Recent Advances in Entropy and Information-Theoretic Concepts and Their Applications”. The idea is to collect a set of high-quality articles that highlight the most recent advances in the field of entropy and information theory, present novel and provocative applications and address the most relevant challenges for the next years.

We look forward to many more years of publishing high-quality science and to serving our readership with intellectually stimulating work.

Prof. Dr. Raúl Alcaraz Martínez
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (26 papers)


Research


Open Access Article
Entropy of Simulated Liquids Using Multiscale Cell Correlation
Entropy 2019, 21(8), 750; https://doi.org/10.3390/e21080750 - 31 Jul 2019
Abstract
Accurately calculating the entropy of liquids is an important goal, given that many processes take place in the liquid phase. Of almost equal importance is understanding the values obtained. However, there are few methods that can calculate the entropy of such systems, and fewer still that make sense of the values obtained. We present our multiscale cell correlation (MCC) method to calculate the entropy of liquids from molecular dynamics simulations. The method uses forces and torques at the molecule and united-atom levels and probability distributions of molecular coordinations and conformations. The main differences with previous work are the consistent treatment of the mean-field cell approximation to the appropriate degrees of freedom, the separation of the force and torque covariance matrices, and the inclusion of conformation correlation for molecules with multiple dihedrals. MCC is applied to a broader set of 56 important industrial liquids modeled using the Generalized AMBER Force Field (GAFF) and Optimized Potentials for Liquid Simulations (OPLS) force fields with 1.14*CM1A charges. Unsigned errors versus experimental entropies are 8.7 J K⁻¹ mol⁻¹ for GAFF and 9.8 J K⁻¹ mol⁻¹ for OPLS. This is significantly better than the 2-Phase Thermodynamics method for the subset of molecules in common, which is the only other method that has been applied to such systems. MCC makes clear why the entropy has the value it does by providing a decomposition in terms of translational and rotational vibrational entropy and topographical entropy at the molecular and united-atom levels. Full article

Open Access Article
Information Theory and an Entropic Approach to an Analysis of Fiscal Inequality
Entropy 2019, 21(7), 643; https://doi.org/10.3390/e21070643 - 28 Jun 2019
Cited by 1
Abstract
In his influential study, Theil (1967) developed the notion of entropy on the basis of information theory. He then advocated the use of entropy-based measures for the analysis of income inequality. In this paper, the first of its kind, we apply Theil’s notion of entropy to public finances in multi-tiered governments, in particular to the measurement of fiscal decentralisation, which is currently gauged very crudely as the ratio of local government revenue to total revenue. It is the claim of this paper that such an approach to measuring fiscal decentralisation completely ignores important distributional aspects of fiscal arrangements. Findings from this paper indicate that studies measuring various aspects of fiscal activities—such as fiscal decentralisation—should carefully take into account the dispersion of revenue (and expenditure) across regions. On that basis, the entropic approach developed in this paper is able to accommodate these dispersions across subnational governments. As an illustration, in the case of Vietnam the true degree of fiscal decentralisation is effectively lower than estimates from simpler measurements suggest, owing to the substantial dispersion of revenue and expenditure across the country’s 63 provinces. Full article
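As a concrete illustration of the Theil index the abstract builds on, here is a minimal sketch; the revenue figures are invented for illustration and are not from the paper:

```python
import math

def theil_index(values):
    """Theil's T inequality index: (1/n) * sum((x/mu) * ln(x/mu))."""
    n = len(values)
    mu = sum(values) / n
    return sum((x / mu) * math.log(x / mu) for x in values if x > 0) / n

# Perfect equality gives zero; concentration of revenue raises the index.
equal = theil_index([10, 10, 10, 10])   # -> 0.0
skewed = theil_index([1, 1, 1, 37])     # well above zero
```

Applied to provincial revenue shares, a larger index signals greater dispersion, which is the distributional aspect the paper argues simple decentralisation ratios miss.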

Open Access Article
The Arbitrarily Varying Relay Channel
Entropy 2019, 21(5), 516; https://doi.org/10.3390/e21050516 - 22 May 2019
Cited by 1
Abstract
We study the arbitrarily varying relay channel, which models communication with relaying in the presence of an active adversary. We establish the cutset bound and partial decode-forward bound on the random code capacity. We further determine the random code capacity for special cases. Then, we consider conditions under which the deterministic code capacity is determined as well. In addition, we consider the arbitrarily varying Gaussian relay channel with sender frequency division under input and state constraints. We determine the random code capacity, and establish lower and upper bounds on the deterministic code capacity. Furthermore, we show that as opposed to previous relay models, the primitive relay channel has a different behavior compared to the non-primitive relay channel in the arbitrarily varying scenario. Full article

Open Access Article
A Novel Uncertainty Management Approach for Air Combat Situation Assessment Based on Improved Belief Entropy
Entropy 2019, 21(5), 495; https://doi.org/10.3390/e21050495 - 14 May 2019
Abstract
Uncertain information exists in each procedure of an air combat situation assessment. To address this issue, this paper proposes an improved method to address the uncertain information fusion of air combat situation assessment in the Dempster–Shafer evidence theory (DST) framework. A better fusion result regarding the prediction of military intention can be helpful for decision-making in an air combat situation. To obtain a more accurate fusion result of situation assessment, an improved belief entropy (IBE) is applied to preprocess the uncertainty of situation assessment information. Data fusion of assessment information after preprocessing will be based on the classical Dempster’s rule of combination. The illustrative example result validates the rationality and the effectiveness of the proposed method. Full article
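The paper's improved belief entropy (IBE) refines entropy measures for Dempster–Shafer mass functions; as a point of reference, here is a sketch of Deng entropy, a common baseline belief entropy. The basic probability assignment below is made up, and the IBE itself modifies this measure in ways the abstract does not detail:

```python
import math

def deng_entropy(masses):
    """Deng entropy of a basic probability assignment in Dempster-Shafer theory.
    masses: dict mapping frozenset focal elements to their mass values.
    E_d(m) = -sum over A of m(A) * log2( m(A) / (2**|A| - 1) )."""
    total = 0.0
    for focal, m in masses.items():
        if m > 0:
            total -= m * math.log2(m / (2 ** len(focal) - 1))
    return total

# Hypothetical assignment over two hypotheses: mass left on the ambiguous
# set {a, b} contributes extra uncertainty via the 2**|A| - 1 term.
bpa = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.2}
```

When all mass sits on singletons, Deng entropy reduces to Shannon entropy, which is a useful sanity check for any implementation.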

Open Access Article
Bounded Rational Decision-Making from Elementary Computations That Reduce Uncertainty
Entropy 2019, 21(4), 375; https://doi.org/10.3390/e21040375 - 06 Apr 2019
Abstract
In its most basic form, decision-making can be viewed as a computational process that progressively eliminates alternatives, thereby reducing uncertainty. Such processes are generally costly, meaning that the amount of uncertainty that can be reduced is limited by the amount of available computational resources. Here, we introduce the notion of elementary computation based on a fundamental principle for probability transfers that reduce uncertainty. Elementary computations can be considered as the inverse of Pigou–Dalton transfers applied to probability distributions, closely related to the concepts of majorization, T-transforms, and generalized entropies that induce a preorder on the space of probability distributions. Consequently, we can define resource cost functions that are order-preserving and therefore monotonic with respect to the uncertainty reduction. This leads to a comprehensive notion of decision-making processes with limited resources. Along the way, we prove several new results on majorization theory, as well as on entropy and divergence measures. Full article
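The inverse Pigou–Dalton transfer at the heart of the paper's "elementary computation" is easy to state concretely. A minimal sketch; the distribution and transfer size below are arbitrary choices of ours:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def inverse_pigou_dalton(p, i, j, eps):
    """Elementary computation: move eps of probability mass from the smaller
    entry p[j] to the larger entry p[i], sharpening the distribution."""
    assert p[i] >= p[j] and 0 < eps <= p[j]
    q = list(p)
    q[i] += eps
    q[j] -= eps
    return q

p = [0.4, 0.3, 0.3]
q = inverse_pigou_dalton(p, 0, 1, 0.1)   # roughly [0.5, 0.2, 0.3]
```

Each such transfer moves the distribution up the majorization preorder, so any Schur-concave uncertainty measure, Shannon entropy included, can only decrease.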

Open Access Article
Using the Data-Compression Method for Studying Hunting Behavior in Small Mammals
Entropy 2019, 21(4), 368; https://doi.org/10.3390/e21040368 - 04 Apr 2019
Abstract
Using the data-compression method we revealed a similarity between hunting behaviors of the common shrew, which is insectivorous, and several rodent species with different types of diet. Seven rodent species studied displayed succinct, highly predictable hunting stereotypes, in which it was easy for the data compressor to find regularities. The generalist Norway rat, with its changeable manipulation of prey and less predictable transitions between stereotype elements, significantly differs from other species. The levels of complexities of hunting stereotypes in young and adult rats are similar, and both groups had no prior experience with the prey, so one can assume that it is not learning, but rather the specificity of the organization of the stereotype that is responsible for the nature of the hunting behavior in rats. We speculate that rodents possess different types of hunting behaviors, one of which is based on a succinct insectivorous standard, and another type, perhaps characteristic of generalists, which is less ordered and is characterized by poorly predictable transitions between elements. We suggest that the data-compression method may well be more broadly applicable to behavioral analysis. Full article
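The data-compression idea is straightforward to reproduce with any off-the-shelf compressor. A sketch using zlib, with synthetic letter-coded behaviour sequences standing in for the real ethograms used in the paper:

```python
import random
import zlib

def compression_complexity(sequence):
    """Compressed-to-raw length ratio of a letter-coded behaviour sequence;
    lower values mean the data compressor found more regularity."""
    data = sequence.encode("ascii")
    return len(zlib.compress(data, 9)) / len(data)

random.seed(0)
# Synthetic stand-ins for ethograms: one letter per stereotype element.
stereotyped = "ABC" * 100                                       # rigid routine
variable = "".join(random.choice("ABC") for _ in range(300))    # erratic transitions
```

A stereotyped routine with predictable transitions compresses far better than an erratic one over the same elements, which is the contrast the paper draws between the shrew-like specialists and the generalist Norway rat.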

Open Access Article
CDCS: Cluster-Based Distributed Compressed Sensing to Facilitate QoS Routing in Cognitive Video Sensor Networks
Entropy 2019, 21(4), 345; https://doi.org/10.3390/e21040345 - 28 Mar 2019
Abstract
Compressed sensing based in-network compression methods which minimize data redundancy are critical to cognitive video sensor networks. However, most existing methods require a large number of sensors for each measurement, resulting in significant performance degradation in energy efficiency and quality-of-service satisfaction. In this paper, a cluster-based distributed compressed sensing scheme working together with a quality-of-service aware routing framework is proposed to deliver visual information in cognitive video sensor networks efficiently. First, the correlation among adjacent video sensors determines the member nodes that participate in a cluster. On this basis, a sequential compressed sensing approach is applied to determine whether enough measurements are obtained to limit the reconstruction error between decoded signals and original signals under a specified reconstruction threshold. The goal is to maximize the removal of unnecessary traffic without sacrificing video quality. Lastly, the compressed data is transmitted via a distributed spectrum-aware quality-of-service routing scheme, with an objective of minimizing energy consumption subject to delay and reliability constraints. Simulation results demonstrate that the proposed approach can achieve energy-efficient data delivery and reconstruction accuracy of visual information compared with existing quality-of-service routing schemes. Full article

Open Access Article
Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding
Entropy 2019, 21(3), 243; https://doi.org/10.3390/e21030243 - 04 Mar 2019
Abstract
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback–Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may prove useful in applying information theory to many practical and theoretical problems. Full article

Open Access Article
Identification of Hypsarrhythmia in Children with Microcephaly Infected by Zika Virus
Entropy 2019, 21(3), 232; https://doi.org/10.3390/e21030232 - 28 Feb 2019
Abstract
Hypsarrhythmia is an electroencephalographic pattern specific to some epileptic syndromes that affect children under one year of age. The identification of this pattern, in some cases, causes disagreements between experts, which is worrisome since an inaccurate diagnosis can bring complications to the infant. Despite the difficulties in visually identifying hypsarrhythmia, options of computerized assistance are scarce. Aiming to collaborate with the recognition of this electropathological pattern, we propose in this paper a mathematical index that can help electroencephalography experts to identify hypsarrhythmia. We performed hypothesis tests that indicated significant differences in the groups under analysis, where the p-values were found to be extremely small. Full article

Open Access Article
Unification of Epistemic and Ontic Concepts of Information, Probability, and Entropy, Using Cognizers-System Model
Entropy 2019, 21(2), 216; https://doi.org/10.3390/e21020216 - 24 Feb 2019
Abstract
Information and probability are common words used in scientific investigations. However, information and probability both involve epistemic (subjective) and ontic (objective) interpretations under the same terms, which causes controversy within the concept of entropy in physics and biology. There is another issue regarding the circularity between information (or data) and reality: The observation of reality produces phenomena (or events), whereas the reality is confirmed (or constituted) by phenomena. The ordinary concept of information presupposes reality as a source of information, whereas another type of information (known as it-from-bit) constitutes the reality from data (bits). In this paper, a monistic model, called the cognizers-system model (CS model), is employed to resolve these issues. In the CS model, observations (epistemic) and physical changes (ontic) are both unified as “cognition”, meaning a related state change. Information and probability, epistemic and ontic, are formalized and analyzed systematically using a common theoretical framework of the CS model or a related model. Based on the results, a perspective for resolving controversial issues of entropy originating from information and probability is presented. Full article

Open Access Article
An Information Theory-Based Approach to Assessing Spatial Patterns in Complex Systems
Entropy 2019, 21(2), 182; https://doi.org/10.3390/e21020182 - 15 Feb 2019
Abstract
Given the intensity and frequency of environmental change, the linked and cross-scale nature of social-ecological systems, and the proliferation of big data, methods that can help synthesize complex system behavior over a geographical area are of great value. Fisher information evaluates order in data and has been established as a robust and effective tool for capturing changes in system dynamics, including the detection of regimes and regime shifts. The methods developed to compute Fisher information can accommodate multivariate data of various types and requires no a priori decisions about system drivers, making it a unique and powerful tool. However, the approach has primarily been used to evaluate temporal patterns. In its sole application to spatial data, Fisher information successfully detected regimes in terrestrial and aquatic systems over transects. Although the selection of adjacently positioned sampling stations provided a natural means of ordering the data, such an approach limits the types of questions that can be answered in a spatial context. Here, we expand the approach to develop a method for more fully capturing spatial dynamics. The results reflect changes in the index that correspond with geographical patterns and demonstrate the utility of the method in uncovering hidden spatial trends in complex systems. Full article

Open Access Article
Incentive Contract Design for the Water-Rail-Road Intermodal Transportation with Travel Time Uncertainty: A Stackelberg Game Approach
Entropy 2019, 21(2), 161; https://doi.org/10.3390/e21020161 - 09 Feb 2019
Cited by 1
Abstract
In the management of intermodal transportation, incentive contract design problem has significant impacts on the benefit of a multimodal transport operator (MTO). In this paper, we analyze a typical water-rail-road (WRR) intermodal transportation that is composed of three serial transportation stages: water, rail and road. In particular, the entire transportation process is planned, organized, and funded by an MTO that outsources the transportation task at each stage to independent carriers (subcontracts). Due to the variability of transportation conditions, the travel time of each transportation stage depending on the respective carrier’s effort level is unknown (asymmetric information) and characterized as an uncertain variable via the experts’ estimations. Considering the decentralized decision-making process, we interpret the incentive contract design problem for the WRR intermodal transportation as a Stackelberg game in which the risk-neutral MTO serves as the leader and the risk-averse carriers serve as the followers. Within the framework of uncertainty theory, we formulate an uncertain bi-level programming model for the incentive contract design problem under expectation and entropy decision criteria. Subsequently, we provide the analytical results of the proposed model and analyze the optimal time-based incentive contracts by developing a hybrid solution method which combines a decomposition approach and an iterative algorithm. Finally, we give a simulation example to investigate the impact of asymmetric information on the optimal time-based incentive contracts and to identify the value of information for WRR intermodal transportation. Full article

Open Access Article
Sensorless Pose Determination Using Randomized Action Sequences
Entropy 2019, 21(2), 154; https://doi.org/10.3390/e21020154 - 06 Feb 2019
Abstract
This paper is a study of 2D manipulation without sensing and planning, by exploring the effects of unplanned randomized action sequences on 2D object pose uncertainty. Our approach follows the work of Erdmann and Mason’s sensorless reorienting of an object into a completely determined pose, regardless of its initial pose. While Erdmann and Mason proposed a method using Newtonian mechanics, this paper shows that under some circumstances, a long enough sequence of random actions will also converge toward a determined final pose of the object. This is verified through several simulation and real robot experiments where randomized action sequences are shown to reduce entropy of the object pose distribution. The effects of varying object shapes, action sequences, and surface friction are also explored. Full article

Open Access Article
A Neighborhood Rough Sets-Based Attribute Reduction Method Using Lebesgue and Entropy Measures
Entropy 2019, 21(2), 138; https://doi.org/10.3390/e21020138 - 01 Feb 2019
Cited by 4
Abstract
For continuous numerical data sets, neighborhood rough sets-based attribute reduction is an important step for improving classification performance. However, most of the traditional reduction algorithms can only handle finite sets, and yield low accuracy and high cardinality. In this paper, a novel attribute reduction method using Lebesgue and entropy measures in neighborhood rough sets is proposed, which has the ability of dealing with continuous numerical data whilst maintaining the original classification information. First, the Fisher score method is employed to eliminate irrelevant attributes and thereby significantly reduce computation complexity for high-dimensional data sets. Then, Lebesgue measure is introduced into neighborhood rough sets to investigate uncertainty measure. In order to analyze the uncertainty and noise of neighborhood decision systems well, based on Lebesgue and entropy measures, some neighborhood entropy-based uncertainty measures are presented, and by combining the algebra view with the information view in neighborhood rough sets, a neighborhood roughness joint entropy is developed in neighborhood decision systems. Moreover, some of their properties are derived and the relationships among them are established, which helps to understand the essence of knowledge and the uncertainty of neighborhood decision systems. Finally, a heuristic attribute reduction algorithm is designed to improve the classification performance of large-scale complex data. The experimental results on an illustrative example and several public data sets show that the proposed method is very effective in selecting the most relevant attributes with high classification accuracy. Full article

Open Access Article
K-th Nearest Neighbor (KNN) Entropy Estimates of Complexity and Integration from Ongoing and Stimulus-Evoked Electroencephalographic (EEG) Recordings of the Human Brain
Entropy 2019, 21(1), 61; https://doi.org/10.3390/e21010061 - 13 Jan 2019
Abstract
Information-theoretic measures for quantifying multivariate statistical dependence have proven useful for the study of the unity and diversity of the human brain. Two such measures–integration, I(X), and interaction complexity, CI(X)–have been previously applied to electroencephalographic (EEG) signals recorded during ongoing wakeful brain states. Here, I(X) and CI(X) were computed for empirical and simulated visually-elicited alpha-range (8–13 Hz) EEG signals. Integration and complexity of evoked (stimulus-locked) and induced (non-stimulus-locked) EEG responses were assessed using nonparametric k-th nearest neighbor (KNN) entropy estimation, which is robust to the nonstationarity of stimulus-elicited EEG signals. KNN-based I(X) and CI(X) were also computed for the alpha-range EEG of ongoing wakeful brain states. I(X) and CI(X) patterns differentiated between induced and evoked EEG signals and replicated previous wakeful EEG findings obtained using Gaussian-based entropy estimators. Absolute levels of I(X) and CI(X) were related to absolute levels of alpha-range EEG power and phase synchronization, but stimulus-related changes in the information-theoretic and other EEG properties were independent. These findings support the hypothesis that visual perception and ongoing wakeful mental states emerge from complex, dynamical interaction among segregated and integrated brain networks operating near an optimal balance between order and disorder. Full article
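The k-th nearest neighbor entropy estimator the authors rely on has a compact one-dimensional form. A brute-force sketch of the Kozachenko–Leonenko estimator; the sample data are synthetic, and production code would use a spatial index rather than the O(n²) scan:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    """Digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j."""
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, n))

def knn_entropy_1d(samples, k=3):
    """Kozachenko-Leonenko k-th nearest neighbour estimate of differential
    entropy (in nats) for one-dimensional data, via a brute-force scan."""
    xs = sorted(samples)
    n = len(xs)
    total = 0.0
    for i, x in enumerate(xs):
        dists = sorted(abs(x - y) for j, y in enumerate(xs) if j != i)
        eps = dists[k - 1]            # distance to the k-th nearest neighbour
        total += math.log(2.0 * eps)  # a 1-D ball of radius eps has length 2*eps
    return digamma_int(n) - digamma_int(k) + total / n

random.seed(1)
u = [random.random() for _ in range(500)]   # Uniform(0, 1): true entropy 0 nats
wide = [2.0 * x for x in u]                 # Uniform(0, 2): true entropy ln 2 nats
```

Doubling the scale shifts the estimate by exactly ln 2, a useful sanity check; for multichannel EEG one would apply the same construction with the volume of a d-dimensional ball.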

Open Access Article
AD or Non-AD: A Deep Learning Approach to Detect Advertisements from Magazines
Entropy 2018, 20(12), 982; https://doi.org/10.3390/e20120982 - 17 Dec 2018
Cited by 2
Abstract
The processing and analyzing of multimedia data has become a popular research topic due to the evolution of deep learning. Deep learning has played an important role in addressing many challenging problems, such as computer vision, image recognition, and image detection, which can be useful in many real-world applications. In this study, we analyzed visual features of images to detect advertising images from scanned images of various magazines. The aim is to identify key features of advertising images and to apply them to real-world application. The proposed work will eventually help improve marketing strategies, which requires the classification of advertising images from magazines. We employed convolutional neural networks to classify scanned images as either advertisements or non-advertisements (i.e., articles). The results show that the proposed approach outperforms other classifiers and the related work in terms of accuracy. Full article
Open Access Article
An Informational Test for Random Finite Strings
Entropy 2018, 20(12), 934; https://doi.org/10.3390/e20120934 - 06 Dec 2018
Cited by 1
Abstract
In this paper, by extending some results of informational genomics, we present a new randomness test based on the empirical entropy of strings and on properties of the repeatability and unrepeatability of substrings of certain lengths. We give the theoretical motivations of our method and some experimental results of its application to a wide class of strings: decimal representations of real numbers, roulette outcomes, logistic maps, linear congruential generators, quantum measurements, natural language texts, and genomes. Notably, the evaluation of randomness resulting from our tests does not distinguish among the different sources of randomness (natural or pseudo-random). Full article
Open Access Article
Entropy and Mutability for the q-State Clock Model in Small Systems
Entropy 2018, 20(12), 933; https://doi.org/10.3390/e20120933 - 06 Dec 2018
Abstract
In this paper, we revisit the q-state clock model for small systems. We present results for the thermodynamics of the q-state clock model for values from q = 2 to q = 20 for small square lattices of L × L, with L ranging from L = 3 to L = 64, with free boundary conditions. Energy, specific heat, entropy, and magnetization were measured. We found that the Berezinskii–Kosterlitz–Thouless (BKT)-like transition appears for q > 5, regardless of lattice size, while this transition at q = 5 is lost for L < 10; for q ≤ 4, the BKT transition is never present. We present the phase diagram in terms of q, which shows the transition from the ferromagnetic (FM) to the paramagnetic (PM) phase at the critical temperature T1 for small systems; for larger systems, the transition changes so that it is from the FM to the BKT phase, while a second phase transition between the BKT and the PM phases occurs at T2. We also show that the magnetic phases are well characterized by the two-dimensional (2D) distribution of the magnetization values. We took this opportunity to carry out an information-theory analysis of the time series obtained from Monte Carlo simulations. In particular, we calculated the phenomenological mutability and diversity functions. Diversity characterizes the phase transitions, but the phases are less detectable as q increases. Free boundary conditions were used to better mimic the reality of small systems (far from any thermodynamic limit). The role of size is discussed. Full article
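The Monte Carlo simulations behind such results typically use Metropolis updates; a minimal sketch of one sweep for the q-state clock model with free boundary conditions follows (with J = kB = 1; the structure and names are illustrative, not the authors' code):

```python
import math
import random

def metropolis_sweep(spins, q, T, L):
    """One Metropolis sweep of the q-state clock model on an L x L lattice
    with free boundary conditions. spins[i][j] is an integer in {0..q-1};
    the pair energy is -cos(2*pi*(s_i - s_j)/q) over nearest neighbors."""
    def local_energy(i, j, s):
        e = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L and 0 <= nj < L:   # free boundaries: skip missing neighbors
                e -= math.cos(2 * math.pi * (s - spins[ni][nj]) / q)
        return e
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        new = random.randrange(q)
        dE = local_energy(i, j, new) - local_energy(i, j, spins[i][j])
        # Metropolis acceptance rule
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] = new
```

Time series of energy or magnetization collected over many such sweeps are what the mutability and diversity functions would then be computed from.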
Open Access Article
A New Entropy-Based Atrial Fibrillation Detection Method for Scanning Wearable ECG Recordings
Entropy 2018, 20(12), 904; https://doi.org/10.3390/e20120904 - 26 Nov 2018
Cited by 3
Abstract
Entropy-based atrial fibrillation (AF) detectors have been applied for short-term electrocardiogram (ECG) analysis. However, existing methods suffer from several limitations. To enhance the performance of entropy-based AF detectors, we have developed a new entropy measure, named EntropyAF, which includes the following improvements: (1) use of a ranged function rather than the Chebyshev function to define vector distance, (2) use of a fuzzy function to determine vector similarity, (3) replacement of the probability estimation with density estimation for entropy calculation, (4) use of a flexible distance threshold parameter, and (5) adjustment of the entropy results for the heart-rate effect. EntropyAF was trained on the MIT-BIH Atrial Fibrillation (AF) database and tested on clinical wearable long-term AF recordings. Three previous entropy-based AF detectors were used for comparison: sample entropy (SampEn), fuzzy measure entropy (FuzzyMEn), and coefficient of sample entropy (COSEn). For classifying AF and non-AF rhythms in the MIT-BIH AF database, EntropyAF achieved the highest area under the receiver operating characteristic curve (AUC), 98.15% when using a 30-beat time window, compared with an AUC of 91.86% for COSEn. SampEn and FuzzyMEn resulted in much lower AUCs of 74.68% and 79.24%, respectively. For classifying AF and non-AF rhythms in the clinical wearable AF database, EntropyAF also generated the largest values of Youden index (77.94%), sensitivity (92.77%), specificity (85.17%), accuracy (87.10%), positive predictivity (68.09%), and negative predictivity (97.18%). COSEn had the second-best accuracy of 78.63%, followed by accuracies of 65.08% for FuzzyMEn and 59.91% for SampEn. The proposed EntropyAF also generated the highest classification accuracy when using a 12-beat time window, and a time-cost analysis verified its efficiency.
This study showed that EntropyAF has better discrimination ability for identifying AF, indicating that it would be useful for practical clinical wearable AF scanning. Full article
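For reference, the baseline sample entropy (SampEn) that EntropyAF is compared against can be sketched as follows (a textbook implementation with the common r = 0.2·SD default; this is the comparison baseline, not the proposed EntropyAF itself):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) of a 1-D series: the negative log of the conditional
    probability that templates matching for m points (Chebyshev distance
    within r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()          # common default tolerance
    def match_count(mm):
        # All overlapping templates of length mm
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        # Count ordered pairs i != j within tolerance r
        return (d <= r).sum() - len(templ)
    b = match_count(m)
    a = match_count(m + 1)
    return -np.log(a / b)
```

Irregular rhythms such as AF produce fewer repeated templates at length m + 1 relative to length m, so their SampEn is higher than that of regular rhythms; the limitations listed in the abstract (hard distance threshold, Chebyshev distance, heart-rate dependence) are exactly the pieces EntropyAF modifies.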
Open Access Article
The Reconciliation of Multiple Conflicting Estimates: Entropy-Based and Axiomatic Approaches
Entropy 2018, 20(11), 815; https://doi.org/10.3390/e20110815 - 23 Oct 2018
Abstract
When working with economic accounts it may occur that multiple estimates of a single datum exist, with different degrees of uncertainty or data quality. This paper addresses the problem of defining a method that can reconcile conflicting estimates, given best guess and uncertainty values. We proceeded from first principles, using two different routes. First, under an entropy-based approach, the data reconciliation problem is addressed as a particular case of a wider data balancing problem, and an alternative setting is found in which the multiple estimates are replaced by a single one. Afterwards, under an axiomatic approach, a set of properties is defined, which characterizes the ideal data reconciliation method. Under both approaches, the conclusion is that the formula for the reconciliation of best guesses is a weighted arithmetic average, with the inverse of uncertainties as weights, and that the formula for the reconciliation of uncertainties is a harmonic average. Full article
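The paper's conclusion has a direct computational form: best guesses are combined by an arithmetic mean weighted by inverse uncertainties, and uncertainties by a harmonic mean. A minimal sketch (the function name and the use of the plain harmonic mean are my reading of the abstract, not the paper's notation):

```python
def reconcile(estimates):
    """Reconcile a list of (best_guess, uncertainty) pairs.
    Best guess: arithmetic mean weighted by 1/uncertainty.
    Uncertainty: harmonic mean of the individual uncertainties."""
    weights = [1.0 / u for _, u in estimates]
    total = sum(weights)
    best = sum(w * g for w, (g, _) in zip(weights, estimates)) / total
    uncertainty = len(estimates) / sum(1.0 / u for _, u in estimates)
    return best, uncertainty
```

Note the behavior this implies: estimates with small uncertainty dominate the reconciled best guess, and the harmonic mean of uncertainties is pulled toward the smallest (most confident) input.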
Open Access Article
Information Geometric Approach on Most Informative Boolean Function Conjecture
Entropy 2018, 20(9), 688; https://doi.org/10.3390/e20090688 - 10 Sep 2018
Cited by 1
Abstract
Let X^n be a memoryless uniform Bernoulli source and Y^n be its output through a binary symmetric channel. Courtade and Kumar conjectured that the Boolean function f : {0, 1}^n → {0, 1} that maximizes the mutual information I(f(X^n); Y^n) is a dictator function, i.e., f(x^n) = x_i for some i. We propose a clustering problem that is equivalent to the above problem, emphasizing an information-geometry aspect of the equivalent problem. Moreover, we define a normalized geometric mean of measures and show interesting properties of it. We also show that the conjecture is true when the arithmetic and geometric means coincide in a specific set of measures. Full article
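For small n, the conjectured optimality of dictator functions can be checked by brute force; the sketch below computes I(f(X^n); Y^n) exactly for any Boolean f (an illustration of the problem setup only, not the paper's information-geometric method):

```python
from itertools import product
import math

def mutual_information(f, n, p):
    """Exact I(f(X^n); Y^n) in bits for a uniform i.i.d. source X^n
    passed through a BSC with crossover probability p."""
    joint = {}   # joint distribution over (f(x), y)
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for y in product((0, 1), repeat=n):
            flips = sum(a != b for a, b in zip(x, y))
            pr = (0.5 ** n) * (p ** flips) * ((1 - p) ** (n - flips))
            joint[(fx, y)] = joint.get((fx, y), 0.0) + pr
    pf, py = {}, {}
    for (fx, y), pr in joint.items():
        pf[fx] = pf.get(fx, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return sum(pr * math.log2(pr / (pf[fx] * py[y]))
               for (fx, y), pr in joint.items() if pr > 0)
```

For a dictator function, I(f(X^n); Y^n) = I(X_i; Y_i) = 1 − h(p), where h is the binary entropy function, since the remaining channel outputs carry no information about X_i.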
Open Access Article
Fuzzy and Sample Entropies as Predictors of Patient Survival Using Short Ventricular Fibrillation Recordings during Out-of-Hospital Cardiac Arrest
Entropy 2018, 20(8), 591; https://doi.org/10.3390/e20080591 - 09 Aug 2018
Cited by 6
Abstract
Optimal defibrillation timing guided by ventricular fibrillation (VF) waveform analysis would contribute to improved survival of out-of-hospital cardiac arrest (OHCA) patients by minimizing myocardial damage caused by futile defibrillation shocks and minimizing interruptions to cardiopulmonary resuscitation. Recently, fuzzy entropy (FuzzyEn) tailored to jointly measure VF amplitude and regularity has been shown to be an efficient defibrillation success predictor. In this study, 734 shocks from 296 OHCA patients (50 survivors) were analyzed, and the embedding dimension (m) and matching tolerance (r) for FuzzyEn and sample entropy (SampEn) were adjusted to predict defibrillation success and patient survival. Entropies were significantly larger in successful shocks and in survivors, and when compared to the available methods, FuzzyEn presented the best prediction results, marginally outperforming SampEn. The sensitivity and specificity of FuzzyEn were 83.3% and 76.7% when predicting defibrillation success, and 83.7% and 73.5% for patient survival. Sensitivities and specificities were two points above those of the best available methods, and the prediction accuracy was kept even for VF intervals as short as 2 s. These results suggest that FuzzyEn and SampEn may be promising tools for optimizing the defibrillation time and predicting patient survival in OHCA patients presenting VF. Full article
Open Access Article
Approach to Evaluating Accounting Informatization Based on Entropy in Intuitionistic Fuzzy Environment
Entropy 2018, 20(6), 476; https://doi.org/10.3390/e20060476 - 20 Jun 2018
Cited by 4
Abstract
Accounting informatization is an important part of enterprise informatization; it affects the operational efficiency of accounting and finance. With a comprehensive evaluation of accounting informatization from multiple aspects, we can find the strengths and weaknesses of corporate accounting informatization, which can then be improved precisely. In this paper, an approach for evaluating accounting informatization is proposed. Firstly, the evaluation index system is constructed from the aspects of strategic position, infrastructure construction, implementation of accounting informatization, informatization guarantee, and application efficiency. Considering the complexity and ambiguity of the index, experts are required to give linguistic ratings, which are then converted into intuitionistic fuzzy numbers. Then, an entropy and cross-entropy method based on intuitionistic fuzzy sets is proposed to derive the weights of the experts so as to reduce the error caused by personal bias. By combining the index weights and the weighted ratings, the evaluation results are obtained. Finally, a case of accounting informatization evaluation is given to illustrate the feasibility of the proposed approach. Full article
Open Access Article
Numerical Analysis of Consensus Measures within Groups
Entropy 2018, 20(6), 408; https://doi.org/10.3390/e20060408 - 25 May 2018
Abstract
Measuring the consensus for a group of ordinal-type responses is of practical importance in decision making. Many consensus measures appear in the literature, but they sometimes provide inconsistent results. Therefore, it is crucial to compare these consensus measures and analyze their relationships. In this study, we targeted five consensus measures: Φe (from entropy), Φ1 (from absolute deviation), Φ2 (from variance), Φ3 (from skewness), and Φmv (from conditional probability). We generated 316,251 probability distributions and analyzed the relationships among their consensus values. Our results showed that Φ1, Φe, Φ2, and Φ3 tended to provide consistent results, and the ordering Φ1 ≥ Φe ≥ Φ2 ≥ Φ3 held with high probability. Although Φmv had a positive correlation with Φ1, Φe, Φ2, and Φ3, it had a much lower tolerance for even a small proportion of extreme opposite opinions than the other four measures did. Full article
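An entropy-based consensus measure of this kind is commonly normalized so that full agreement gives 1 and maximal spread gives 0; a minimal sketch of one such Φe-style measure (the exact normalization in the paper may differ):

```python
import math

def consensus_entropy(p):
    """Entropy-based consensus for a probability distribution p over
    K ordered response categories: 1 - H(p) / log2(K).
    Returns 1 when everyone picks the same category, 0 for a
    uniform spread of opinions."""
    h = -sum(q * math.log2(q) for q in p if q > 0)
    return 1.0 - h / math.log2(len(p))
```

Note that an entropy-based measure ignores the ordering of categories (a 50/50 split between adjacent categories and between the two extremes score the same), which is one reason measures built from deviation, variance, or conditional probability can disagree with it.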
Review


Open Access Review
Applications of Information Theory in Solar and Space Physics
Entropy 2019, 21(2), 140; https://doi.org/10.3390/e21020140 - 01 Feb 2019
Cited by 1
Abstract
Characterizing and modeling processes at the Sun and in the space plasma of our solar system are difficult because the underlying physics is often complex, nonlinear, and not well understood. The drivers of a system are often nonlinearly correlated with one another, which makes it a challenge to understand the relative effects of each driver. However, entropy-based information theory can be a valuable tool for determining the information flow among various parameters, identifying causalities, untangling the drivers, and providing observational constraints that can help guide the development of theories and physics-based models. We review two examples of the application of information-theoretic tools to the Sun and the near-Earth space environment. In the first example, the solar wind drivers of radiation belt electrons are investigated using mutual information (MI), conditional mutual information (CMI), and transfer entropy (TE). As previously reported, radiation belt electron flux (Je) is anticorrelated with solar wind density (nsw) with a lag of 1 day. However, this lag time and anticorrelation can be attributed mainly to the correlation of Je(t + 2 days) with solar wind velocity (Vsw)(t) and the anticorrelation of nsw(t + 1 day) with Vsw(t). Analyses of solar wind driving of the magnetosphere need to consider the large lag times, up to 3 days, in the (Vsw, nsw) anticorrelation. Using CMI to remove the effects of Vsw, the response of Je to nsw is 30% smaller and has a lag time <24 h, suggesting that the loss mechanism due to nsw or solar wind dynamic pressure has to start operating in <24 h. Nonstationarity in the system dynamics is investigated using windowed TE. The triangle distribution in Je(t + 2 days) vs. Vsw(t) can be better understood with TE.
In the second example, the previously identified causal parameters of the solar cycle in the Babcock–Leighton type model such as the solar polar field, meridional flow, polar faculae (proxy for polar field), and flux emergence are investigated using TE. The transfer of information from the polar field to the sunspot number (SSN) peaks at lag times of 3–4 years. Both the flux emergence and the meridional flow contribute to the polar field, but at different time scales. The polar fields from at least the last 3 cycles contain information about SSN. Full article
Open Access Review
A Review of Physical Layer Security Techniques for Internet of Things: Challenges and Solutions
Entropy 2018, 20(10), 730; https://doi.org/10.3390/e20100730 - 23 Sep 2018
Cited by 6
Abstract
With the uninterrupted revolution of communications technologies and the great-leap-forward development of emerging applications, the ubiquitous deployment of the Internet of Things (IoT) is imperative to accommodate constantly growing user demands and market scales. Communication security is critically important for the operation of IoT. Among communication security provisioning techniques, physical layer security (PLS), which can provide unbreakable, provable, and quantifiable secrecy from an information-theoretic point of view, has drawn considerable attention from both academia and industry. However, the unique features of IoT, such as low cost, wide-range coverage, massive connection, and diversified services, impose great challenges for PLS protocol design in IoT. In this article, we present a comprehensive review of PLS techniques for IoT applications. The basic principle of PLS is first briefly introduced, followed by a survey of existing PLS techniques. Afterwards, the characteristics of IoT are identified, based on which the challenges faced by PLS protocol design are summarized. Then, three newly proposed PLS solutions are highlighted, which match the features of IoT well and are expected to be applied in the near future. Finally, we conclude the paper and point out some further research directions. Full article