
Table of Contents

Entropy, Volume 12, Issue 1 (January 2010) – 11 articles, Pages 1-160

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Open Access Article
The Quantum-Classical Transition as an Information Flow
Entropy 2010, 12(1), 148-160; https://doi.org/10.3390/e12010148 - 26 Jan 2010
Cited by 5 | Viewed by 6879
Abstract
We investigate the classical limit of the semiclassical evolution with reference to a well-known model of the interaction between matter and a given field. This is done by recourse to a special statistical quantifier called the “symbolic transfer entropy”. We find that the quantum-classical transition is thereby described as a sign-reversal of the dominant direction of the information flow between the classical and quantal variables.
(This article belongs to the Special Issue Information and Entropy)
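The article's matter-field model is not reproduced here, but the quantifier it relies on is easy to illustrate. In a minimal sketch of symbolic transfer entropy, each series is encoded into ordinal (permutation) symbols, and T_{Y→X} measures how much Y's symbols improve prediction of X's next symbol beyond X's own past. The coupled pair below, in which y drives x with a one-step lag, is an illustrative toy, not the article's system.

```python
from collections import Counter
from itertools import permutations

import numpy as np

def symbolize(x, m=3):
    """Encode a series as ordinal-pattern (permutation) symbols of order m."""
    pats = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([pats[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def transfer_entropy(sym_src, sym_dst):
    """T_{src->dst} in bits: information sym_src[t] adds about sym_dst[t+1]
    beyond what sym_dst[t] already provides (empirical plug-in estimate)."""
    n = len(sym_dst) - 1
    triples = Counter(zip(sym_dst[1:], sym_dst[:-1], sym_src[:-1]))
    pairs_ds = Counter(zip(sym_dst[:-1], sym_src[:-1]))
    pairs_dd = Counter(zip(sym_dst[1:], sym_dst[:-1]))
    singles = Counter(sym_dst[:-1])
    te = 0.0
    for (d1, d0, s0), c in triples.items():
        te += (c / n) * np.log2(c * singles[d0] /
                                (pairs_ds[(d0, s0)] * pairs_dd[(d1, d0)]))
    return te

# toy coupled pair: y drives x with a one-step lag (not the article's model)
rng = np.random.default_rng(0)
y = rng.normal(size=2000)
x = np.zeros(2000)
for t in range(1999):
    x[t + 1] = 0.8 * y[t] + 0.2 * rng.normal()
sx, sy = symbolize(x), symbolize(y)
t_yx, t_xy = transfer_entropy(sy, sx), transfer_entropy(sx, sy)
```

In the article's terms, the transition shows up as the sign of the difference T_{Y→X} − T_{X→Y} flipping as the relevant parameter is varied.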

Open Access Article
On the Entropy Based Associative Memory Model with Higher-Order Correlations
Entropy 2010, 12(1), 136-147; https://doi.org/10.3390/e12010136 - 22 Jan 2010
Viewed by 5198
Abstract
In this paper, an entropy-based associative memory model is proposed and applied to memory retrieval with an orthogonal learning model, for comparison with the conventional model based on a quadratic Lyapunov functional minimized during the retrieval process. In the present approach, the updating dynamics are constructed on the basis of an entropy minimization strategy that reduces asymptotically to the conventional dynamics as a special case when higher-order correlations are ignored. By introducing the entropy functional, one can incorporate higher-order correlation effects between neurons in a self-contained manner, without the heuristic coupling coefficients required in the conventional approach. In fact, we show that such higher-order coupling tensors are uniquely determined within the entropy-based framework. Numerical results show that the proposed approach realizes a much larger memory capacity than the quadratic Lyapunov functional approach, e.g., the associatron.
(This article belongs to the Special Issue Entropy in Model Reduction)
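The paper's entropy-based dynamics are not reproduced here; the sketch below shows only the conventional baseline it is compared against: a Hopfield-type network (the associatron family) with Hebbian couplings, whose asynchronous updates descend the quadratic Lyapunov functional E = −½ sᵀWs. All sizes and patterns are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 64, 4                               # neurons, stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N))      # random bipolar patterns
W = (xi.T @ xi).astype(float) / N          # Hebbian couplings
np.fill_diagonal(W, 0.0)

def retrieve(s, sweeps=5):
    """Asynchronous sign updates; each flip lowers E = -0.5 * s @ W @ s."""
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

probe = xi[0].copy()                       # pattern 0 with ~10% of bits flipped
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
overlap = retrieve(probe) @ xi[0] / N      # 1.0 means perfect recall
```

At this low loading (4 patterns in 64 neurons) the corrupted probe is recovered essentially perfectly, which is the regime in which the paper's capacity comparison is made.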

Open Access Article
Arguments for the Integration of the Non-Zero-Sum Logic of Complex Animal Communication with Information Theory
Entropy 2010, 12(1), 127-135; https://doi.org/10.3390/e12010127 - 21 Jan 2010
Cited by 2 | Viewed by 5696
Abstract
The outstanding levels of knowledge attained today in research on animal communication, and the new technologies available to study visual, vocal and chemical signalling, allow an ever-increasing use of information theory as a sophisticated tool to improve our knowledge of the complexity of animal communication. Some considerations on how information theory and intraspecific communication can be linked are presented here. Specifically, information theory may help us to explore interindividual variation under different environmental constraints and social scenarios, as well as the communicative features of social vs. solitary species.
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open Access Article
From Maximum Entropy to Maximum Entropy Production: A New Approach
Entropy 2010, 12(1), 107-126; https://doi.org/10.3390/e12010107 - 18 Jan 2010
Cited by 12 | Viewed by 6729
Abstract
Evidence from climate science suggests that a principle of maximum thermodynamic entropy production can be used to make predictions about some physical systems. I discuss the general form of this principle and an inherent problem with it, currently unsolved by theoretical approaches: how to determine which system it should be applied to. I suggest a new way to derive the principle from statistical mechanics, and present a tentative solution to the system boundary problem. I discuss the need for experimental validation of the principle, and its impact on the way we see the relationship between thermodynamics and kinetics.
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)

Open Access Review
Maximum Entropy Approaches to Living Neural Networks
Entropy 2010, 12(1), 89-106; https://doi.org/10.3390/e12010089 - 13 Jan 2010
Cited by 21 | Viewed by 7862
Abstract
Understanding how ensembles of neurons collectively interact will be a key step in developing a mechanistic theory of cognitive processes. Recent progress in multineuron recording and analysis techniques has generated tremendous excitement over the physiology of living neural networks. One of the key developments driving this interest is a new class of models based on the principle of maximum entropy. Maximum entropy models have been reported to account for spatial correlation structure in ensembles of neurons recorded from several different types of data. Importantly, these models require only information about the firing rates of individual neurons and their pairwise correlations. If this approach is generally applicable, it would drastically simplify the problem of understanding how neural networks behave. Given the interest in this method, several groups now have worked to extend maximum entropy models to account for temporal correlations. Here, we review how maximum entropy models have been applied to neuronal ensemble data to account for spatial and temporal correlations. We also discuss criticisms of the maximum entropy approach that argue that it is not generally applicable to larger ensembles of neurons. We conclude that future maximum entropy models will need to address three issues: temporal correlations, higher-order correlations, and larger ensemble sizes. Finally, we provide a brief list of topics for future research.
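The class of models this review discusses is concrete enough to sketch. For a small ensemble, the pairwise maximum entropy (Ising-type) model P(s) ∝ exp(Σᵢ hᵢsᵢ + Σᵢ<ⱼ Jᵢⱼsᵢsⱼ) can be fitted exactly by enumerating all states and matching firing rates and pairwise moments. The "data" below are a synthetic distribution, not a recording; the fitting loop is a generic sketch, not any particular group's pipeline.

```python
import itertools

import numpy as np

def model_dist(h, J, n):
    """Exact P(s) ∝ exp(h·s + ½ sᵀJs) over all 2^n binary states (small n only)."""
    states = np.array(list(itertools.product([0, 1], repeat=n)), float)
    logp = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(logp - logp.max())
    return states, p / p.sum()

def fit_pairwise_maxent(mean_s, corr_ss, n, iters=20000, lr=0.2):
    """Gradient ascent on the concave log-likelihood; at the optimum the model
    reproduces the observed rates and pairwise moments exactly."""
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(iters):
        states, p = model_dist(h, J, n)
        h += lr * (mean_s - p @ states)
        J += lr * (corr_ss - np.einsum('k,ki,kj->ij', p, states, states))
        np.fill_diagonal(J, 0.0)               # diagonal is absorbed by h
    return h, J

# synthetic "ensemble": any strictly positive distribution over 2^n states
n = 5
rng = np.random.default_rng(3)
states = np.array(list(itertools.product([0, 1], repeat=n)), float)
p_data = rng.dirichlet(np.ones(2 ** n))
mean_s = p_data @ states                                     # firing rates
corr_ss = np.einsum('k,ki,kj->ij', p_data, states, states)   # pairwise moments
h, J = fit_pairwise_maxent(mean_s, corr_ss, n)
st, p_model = model_dist(h, J, n)
rate_err = np.max(np.abs(p_model @ st - mean_s))
c_model = np.einsum('k,ki,kj->ij', p_model, st, st)
corr_err = np.max(np.abs((c_model - corr_ss)[~np.eye(n, dtype=bool)]))
```

Exact enumeration is only feasible for small n; the scaling criticisms the review discusses arise precisely because real ensembles require approximate fitting.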

Open Access Article
A Dynamic Model of Information and Entropy
Entropy 2010, 12(1), 80-88; https://doi.org/10.3390/e12010080 - 07 Jan 2010
Cited by 2 | Viewed by 5047
Abstract
We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electro-magnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system.
(This article belongs to the Special Issue Information and Entropy)

Open Access Article
Redundancy in Systems Which Entertain a Model of Themselves: Interaction Information and the Self-Organization of Anticipation
Entropy 2010, 12(1), 63-79; https://doi.org/10.3390/e12010063 - 06 Jan 2010
Cited by 23 | Viewed by 8527
Abstract
Mutual information among three or more dimensions (μ* = –Q) has been considered as interaction information. However, Krippendorff [1,2] has shown that this measure cannot be interpreted as a unique property of the interactions and has proposed an alternative measure of interaction information based on iterative approximation of maximum entropies. Q can then be considered as a measure of the difference between interaction information and redundancy generated in a model entertained by an observer. I argue that this provides us with a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the underlying information processing. The second-order system communicates meaning hyper-incursively; an observation instantiates this meaning-processing within the information processing. The net results may add to or reduce the prevailing uncertainty. The model is tested empirically for the case where textual organization can be expected to contain intellectual organization in terms of distributions of title words, author names, and cited references.
(This article belongs to the Special Issue Information and Entropy)
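A small worked example of the quantity under discussion may help. Co-information among three variables, μ* = H(X)+H(Y)+H(Z) − H(XY) − H(XZ) − H(YZ) + H(XYZ), can be negative, which is one root of the interpretive problem Krippendorff raises; the XOR distribution is the standard case. This is a generic illustration, not the paper's bibliometric computation.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a (flattened) probability array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def co_information(pxyz):
    """mu* = H(X)+H(Y)+H(Z) - H(XY) - H(XZ) - H(YZ) + H(XYZ); may be negative."""
    hx = entropy(pxyz.sum(axis=(1, 2)))
    hy = entropy(pxyz.sum(axis=(0, 2)))
    hz = entropy(pxyz.sum(axis=(0, 1)))
    hxy = entropy(pxyz.sum(axis=2).ravel())
    hxz = entropy(pxyz.sum(axis=1).ravel())
    hyz = entropy(pxyz.sum(axis=0).ravel())
    return hx + hy + hz - hxy - hxz - hyz + entropy(pxyz.ravel())

# Z = X xor Y with uniform independent bits: pairwise independent, jointly coupled
p_xor = np.zeros((2, 2, 2))
for a in (0, 1):
    for b in (0, 1):
        p_xor[a, b, a ^ b] = 0.25
mu_xor = co_information(p_xor)      # -1 bit: synergy, not shared information

p_ind = np.full((2, 2, 2), 0.125)   # three independent uniform bits
mu_ind = co_information(p_ind)      # 0 bits
```

The XOR case gives μ* = −1 bit even though every pair of variables is independent, so the sign alone cannot distinguish interaction information from model-generated redundancy.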

Open Access Article
Imprecise Shannon’s Entropy and Multi Attribute Decision Making
Entropy 2010, 12(1), 53-62; https://doi.org/10.3390/e12010053 - 05 Jan 2010
Cited by 127 | Viewed by 8312
Abstract
Finding the appropriate weight for each criterion is one of the main points in Multi Attribute Decision Making (MADM) problems. Shannon’s entropy method is one of the various methods for finding weights discussed in the literature. However, in many real-life problems, the data for the decision-making process cannot be measured precisely, and other types of data may arise, for instance, interval data and fuzzy data. The goal of this paper is to extend the Shannon entropy method to imprecise data, especially the interval and fuzzy data cases.
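The crisp (precise-data) version of Shannon's entropy weighting, which the paper extends to interval and fuzzy data, is a short computation: normalize each criterion column, compute its entropy, and weight criteria in proportion to their diversification 1 − e_j. The decision matrix below is illustrative.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for a crisp m-by-n decision matrix (rows: alternatives,
    columns: criteria); more dispersed criteria receive larger weights."""
    m, _ = X.shape
    P = X / X.sum(axis=0)                       # normalize each criterion column
    with np.errstate(divide='ignore', invalid='ignore'):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)          # entropy e_j in [0, 1]
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()

# four alternatives scored on three criteria (hypothetical scores)
X = np.array([[1., 9., 9.],
              [5., 7., 8.],
              [9., 6., 8.],
              [3., 7., 8.]])
w = entropy_weights(X)
```

The first criterion, whose scores are the most dispersed, receives the largest weight; the nearly uniform third criterion receives the smallest. The interval and fuzzy extensions replace the point scores with set-valued entries.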
Open Access Review
Data Compression Concepts and Algorithms and Their Applications to Bioinformatics
Entropy 2010, 12(1), 34-52; https://doi.org/10.3390/e12010034 - 29 Dec 2009
Cited by 30 | Viewed by 6593
Abstract
Data compression at its base is concerned with how information is organized in data. Understanding this organization can lead to efficient ways of representing the information and hence data compression. In this paper we review the ways in which ideas and approaches fundamental to the theory and practice of data compression have been used in the area of bioinformatics. We look at how basic theoretical ideas from data compression, such as the notions of entropy, mutual information, and complexity have been used for analyzing biological sequences in order to discover hidden patterns, infer phylogenetic relationships between organisms and study viral populations. Finally, we look at how inferred grammars for biological sequences have been used to uncover structure in biological sequences.
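As a minimal illustration of the entropy estimates this kind of sequence analysis builds on, the empirical Shannon entropy of overlapping k-mers distinguishes a maximally mixed sequence from a repetitive one. The sequences below are toy strings, not biological data.

```python
import math
from collections import Counter

def kmer_entropy(seq, k=1):
    """Empirical Shannon entropy (bits per k-mer) over overlapping k-mers."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

mixed = "ACGT" * 25     # every base equally frequent: maximal 1-mer entropy (2 bits)
repeat = "A" * 100      # a single repeated base: zero entropy
```

Low entropy signals compressible structure; comparing entropies at increasing k is one of the basic ways hidden patterns in a sequence are detected.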

Open Access Article
Estimation of Seismic Wavelets Based on the Multivariate Scale Mixture of Gaussians Model
Entropy 2010, 12(1), 14-33; https://doi.org/10.3390/e12010014 - 28 Dec 2009
Cited by 11 | Viewed by 5615
Abstract
This paper proposes a new method for estimating seismic wavelets. Suppose a seismic wavelet can be modeled by a formula with three free parameters (scale, frequency and phase); we can then transform the estimation of the wavelet into determining these three parameters. The phase of the wavelet is estimated by constant-phase rotation of the seismic signal, while the other two parameters are obtained by the higher-order statistics (HOS) (fourth-order cumulant) matching method. In order to derive the HOS estimator, the multivariate scale mixture of Gaussians (MSMG) model is applied to formulate the multivariate joint probability density function (PDF) of the seismic signal. In this way, we can represent the HOS as a polynomial function of second-order statistics, improving anti-noise performance and accuracy. In addition, the proposed method works well for short time series.
(This article belongs to the Special Issue Maximum Entropy)
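The HOS matching itself is beyond a short sketch, but the parametrization and the constant-phase rotation step are easy to illustrate. Assuming a Ricker wavelet as the three-parameter formula (the abstract does not specify its model, so this is a stand-in), a constant-phase rotation can be applied through the FFT-based analytic signal:

```python
import numpy as np

def ricker(t, f):
    """Zero-phase Ricker wavelet with peak frequency f (Hz); an assumed model,
    not necessarily the article's wavelet formula."""
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def phase_rotate(w, phi):
    """Constant-phase rotation by phi radians via the FFT analytic signal."""
    n = len(w)
    spec = np.fft.fft(w)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)            # w + i * Hilbert(w)
    return np.real(analytic * np.exp(1j * phi))

t = np.arange(-0.1, 0.1, 0.001)                 # 0.2 s window, 1 ms sampling
w0 = ricker(t, 30.0)                            # 30 Hz zero-phase wavelet
w90 = phase_rotate(w0, np.pi / 2)               # 90-degree rotated version
```

Scanning phi and picking the rotation that best restores a chosen criterion is the "constant-phase rotation" estimation step; scale and frequency are what the cumulant matching then determines.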

Open Access Article
Lorenz Curves, Size Classification, and Dimensions of Bubble Size Distributions
Entropy 2010, 12(1), 1-13; https://doi.org/10.3390/e12010001 - 25 Dec 2009
Cited by 3 | Viewed by 5628
Abstract
Lorenz curves of bubble size distributions and their Gini coefficients characterize demixing processes. Through a systematic size classification, bubble size histograms are generated and investigated with respect to their statistical entropy. It turns out that the temporal development of the entropy is preserved even though characteristics of the histograms, such as the number of size classes and the modality, are markedly reduced. Examination of the Rényi dimensions shows that the bubble size distributions are multifractal and provides information about underlying structures such as self-similarity.
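A minimal sketch of the Lorenz-curve machinery (generic, not tied to the article's bubble data): sort the sizes, form the cumulative share curve, and take the Gini coefficient as one minus twice the area under it.

```python
import numpy as np

def gini(sizes):
    """Gini coefficient from the discrete Lorenz curve of a size sample."""
    x = np.sort(np.asarray(sizes, dtype=float))
    n = len(x)
    lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))
    # trapezoidal area under the Lorenz curve on the grid 0, 1/n, ..., 1
    area = ((lorenz[:-1] + lorenz[1:]) / 2.0).sum() / n
    return 1.0 - 2.0 * area

uniform_bubbles = gini([1.0, 1.0, 1.0, 1.0])   # equal sizes: G = 0
skewed_bubbles = gini([1.0, 2.0, 3.0, 4.0])    # unequal sizes: G > 0
```

G = 0 for perfectly equal bubble sizes and approaches (n−1)/n when one bubble carries all the volume, so the coefficient tracks the demixing process over time.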
