Table of Contents

Entropy, Volume 11, Issue 4 (December 2009), Pages 529-1147

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
Open Access Article: Entropy-Based Wavelet De-noising Method for Time Series Analysis
Entropy 2009, 11(4), 1123-1147; https://doi.org/10.3390/e11041123
Received: 9 October 2009 / Accepted: 11 December 2009 / Published: 22 December 2009
Cited by 32 | PDF Full-text (657 KB) | HTML Full-text | XML Full-text
Abstract
The presence of noise strongly affects the real features of observed time series, so noise reduction is a necessary and significant task in many practical applications. Traditional de-noising methods often cannot meet practical needs because of their inherent shortcomings. In the present paper, a set of key but difficult wavelet de-noising problems is first discussed, and then a new entropy-based wavelet de-noising method is proposed by applying information entropy theories to the wavelet de-noising process: the principle of maximum entropy (POME) is used to describe the random character of the noise, and wavelet energy entropy is used to describe the degree of complexity of the main series in the original data. Analyses of several different synthetic series and of typical observed time series verify the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the proposed method is more effective and more widely applicable. Furthermore, because information entropy theories are used to characterize the clearly different behaviours of the noise and of the main series before de-noising is carried out, the analysis has a more reliable physical basis, and the results of the proposed method are more reasonable and globally optimal. In addition, the analysis procedure of the proposed method is simple and easy to implement, which makes it more applicable and useful in the applied sciences and in practical engineering work. Full article
(This article belongs to the Special Issue Maximum Entropy)
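As general background (not part of the article above), a conventional wavelet threshold de-noising pass can be written in a few lines of Python with the PyWavelets package; the sketch below uses the standard median-based noise estimate and universal threshold, whereas the paper replaces such ad hoc choices with entropy-based criteria, so the wavelet name, decomposition level and thresholding rule here are purely illustrative assumptions.

import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4):
    """Conventional soft-threshold wavelet de-noising (illustrative only)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Estimate the noise level from the finest detail coefficients (MAD rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Universal threshold; the paper instead derives its choices from entropy measures.
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)[: len(x)]

# Usage: de-noise a synthetic series of the kind analysed in the paper.
t = np.linspace(0.0, 1.0, 1024)
noisy = np.sin(2.0 * np.pi * 5.0 * t) + 0.3 * np.random.randn(t.size)
clean = wavelet_denoise(noisy)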
Open Access Comment: Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
Entropy 2009, 11(4), 1121-1122; https://doi.org/10.3390/e11041121
Received: 11 December 2009 / Accepted: 18 December 2009 / Published: 22 December 2009
PDF Full-text (100 KB) | HTML Full-text | XML Full-text
Abstract
The volume of the body enclosed by the n-dimensional Lamé curve defined by \(\sum_{i=1}^{n} x_i^{b} = E\) is computed. Full article
(This article belongs to the Special Issue Information and Entropy)
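For readers who want the standard closed form behind such volume computations, the classical Dirichlet-integral result for the positive-orthant body bounded by a Lamé surface is quoted below as background; it assumes \(x_i \ge 0\) (the full body over all sign combinations carries an extra factor \(2^n\)) and is not a restatement of the comment's own calculation.

\[
\operatorname{Vol}\Bigl\{\, x_i \ge 0 \;:\; \sum_{i=1}^{n} x_i^{\,b} \le E \Bigr\}
  \;=\; \frac{\Gamma\!\bigl(1+\tfrac{1}{b}\bigr)^{n}}{\Gamma\!\bigl(1+\tfrac{n}{b}\bigr)}\, E^{\,n/b}.
\]

Setting b = 1 recovers the simplex volume E^n / n!, and b = 2, n = 2, E = 1 gives π/4, the area of a quarter disc.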
Open Access Review: Processing Information in Quantum Decision Theory
Entropy 2009, 11(4), 1073-1120; https://doi.org/10.3390/e11041073
Received: 28 October 2009 / Accepted: 10 December 2009 / Published: 14 December 2009
Cited by 35 | PDF Full-text (328 KB) | HTML Full-text | XML Full-text
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has recently been advanced as a novel variant of decision making based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the frame of quantum decision theory, takes into account both the available objective information and subjective contextual effects. This quantum approach avoids any paradox typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Article: Explaining Change in Language: A Cybersemiotic Perspective
Entropy 2009, 11(4), 1055-1072; https://doi.org/10.3390/e11041055
Received: 14 October 2009 / Accepted: 2 December 2009 / Published: 11 December 2009
Cited by 3 | PDF Full-text (484 KB) | HTML Full-text | XML Full-text
Abstract
One of the greatest conundrums in semiotics and linguistics is explaining why change occurs in communication systems. The descriptive apparatus of how change occurs has been developed in great detail since at least the nineteenth century, but a viable explanatory framework of why it occurs in the first place still seems to be clouded in vagueness. So far, only the so-called Principle of Least Effort has come forward to provide a suggestive psychobiological framework for understanding change in communication codes such as language. Extensive work in using this model has shown many fascinating things about language structure and how it evolves. However, the many findings need an integrative framework for shedding light on any generalities implicit in them. This paper argues that a new approach to the study of codes, called cybersemiotics, can be used to great advantage for assessing theoretical frameworks and notions such as the Principle of Least Effort. Amalgamating cybernetic and biosemiotic notions, this new science provides analysts with valuable insights on the raison d’être of phenomena such as linguistic change. Full article
Open Access Article: Modeling Electric Discharges with Entropy Production Rate Principles
Entropy 2009, 11(4), 1042-1054; https://doi.org/10.3390/e11041042
Received: 29 October 2009 / Accepted: 1 December 2009 / Published: 8 December 2009
Cited by 14 | PDF Full-text (210 KB) | HTML Full-text | XML Full-text
Abstract
Under which circumstances are variational principles based on entropy production rate useful tools for modeling steady states of electric (gas) discharge systems far from equilibrium? It is first shown how various approaches, such as Steenbeck’s minimum voltage and Prigogine’s minimum entropy production rate principles, are related to the maximum entropy production rate principle (MEPP). Secondly, three typical examples are discussed, which provide some insight into the structure of the models that are candidates for MEPP application. Thirdly, it is argued that MEPP, although not an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly global entropy production. Finally, it is conjectured that a further reason for the success of MEPP in certain far-from-equilibrium systems might lie in a hidden linearity of the underlying kinetic equation(s). Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open Access Article: On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
Entropy 2009, 11(4), 1025-1041; https://doi.org/10.3390/e11041025
Received: 13 October 2009 / Accepted: 27 November 2009 / Published: 7 December 2009
PDF Full-text (425 KB) | HTML Full-text | XML Full-text
Abstract
Systems do not elect thermodynamic pathways on their own. They operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author (Reference [1]) focused on the information expressed in thermodynamic pathways. Examined here is how spectral entropy is a by-product of information that depends intricately on the pathway structure. The spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the connection between spectral entropy and the properties that distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and in heat → work conversions is also discussed. Full article
(This article belongs to the Special Issue Information and Entropy)
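For orientation only, the sketch below computes spectral entropy in its common signal-processing sense, i.e. the Shannon entropy of a normalized power spectrum; the paper builds its spectral entropy from thermodynamic pathway information following Reference [1], so the FFT-based definition, the normalization and the test signals here are illustrative assumptions rather than the paper's construction.

import numpy as np

def spectral_entropy(x, normalize=True):
    """Shannon entropy (bits) of the normalized power spectrum of a real signal."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd / psd.sum()          # treat the spectrum as a probability distribution
    psd = psd[psd > 0]             # drop empty bins to avoid log(0)
    h = -np.sum(psd * np.log2(psd))
    if normalize:
        h /= np.log2(psd.size)     # scale so that a flat spectrum gives 1
    return h

# A pure tone concentrates spectral weight (low entropy); white noise spreads it (near 1).
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
print(spectral_entropy(np.sin(2.0 * np.pi * 50.0 * t)))
print(spectral_entropy(np.random.randn(t.size)))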
Open Access Article: Best Probability Density Function for Random Sampled Data
Entropy 2009, 11(4), 1001-1024; https://doi.org/10.3390/e11041001
Received: 9 October 2009 / Accepted: 2 December 2009 / Published: 4 December 2009
Cited by 3 | PDF Full-text (1356 KB) | HTML Full-text | XML Full-text
Abstract
The maximum entropy method is a theoretically sound approach to construct an analytical form for the probability density function (pdf) given a sample of random events. In practice, numerical methods employed to determine the appropriate Lagrange multipliers associated with a set of moments are generally unstable in the presence of noise due to limited sampling. A robust method is presented that always returns the best pdf, where the tradeoff in smoothing a highly varying function due to noise can be controlled. An unconventional adaptive simulated annealing technique, called funnel diffusion, determines expansion coefficients for Chebyshev polynomials in the exponential function. Full article
(This article belongs to the Special Issue Maximum Entropy)
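The funnel-diffusion annealing described above is not reproduced here; as a rough sketch of the same exponential-Chebyshev functional form, the code below fits p(x) ∝ exp(Σ_k c_k T_k(x)) to a sample by penalized maximum likelihood with a generic optimizer. The polynomial order, the rescaling of the data to [-1, 1], the ridge penalty and the use of BFGS are all assumptions made for illustration, not the authors' procedure.

import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import minimize

def fit_exp_chebyshev_pdf(samples, order=6, ridge=1e-3, grid_size=512):
    """Fit p(x) proportional to exp(sum_k c_k T_k(x)) by penalized maximum likelihood."""
    lo, hi = samples.min(), samples.max()
    to_unit = lambda x: 2.0 * (x - lo) / (hi - lo) - 1.0   # map data onto [-1, 1]
    u = to_unit(samples)
    grid = np.linspace(-1.0, 1.0, grid_size)

    def neg_log_likelihood(c):
        # Numerical normalization constant via the trapezoidal rule on the grid.
        log_z = np.log(np.trapz(np.exp(C.chebval(grid, c)), grid))
        return -(C.chebval(u, c).mean() - log_z) + ridge * np.sum(c[1:] ** 2)

    c = minimize(neg_log_likelihood, np.zeros(order + 1), method="BFGS").x
    z = np.trapz(np.exp(C.chebval(grid, c)), grid)
    # Density on the original scale; 2 / (hi - lo) is the Jacobian of the rescaling.
    return lambda x: np.exp(C.chebval(to_unit(np.asarray(x)), c)) / z * 2.0 / (hi - lo)

# Usage: reconstruct a smooth density from a limited, noisy sample.
sample = np.random.randn(400)
pdf = fit_exp_chebyshev_pdf(sample)
print(pdf(0.0), pdf(1.0))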
Open Access Communication: Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
Entropy 2009, 11(4), 993-1000; https://doi.org/10.3390/e11040993
Received: 15 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
Cited by 7 | PDF Full-text (178 KB) | HTML Full-text | XML Full-text
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence. A qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What will be argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (aka submergence or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Review: Fisher Information and Semiclassical Treatments
Entropy 2009, 11(4), 972-992; https://doi.org/10.3390/e11040972
Received: 30 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
Cited by 12 | PDF Full-text (266 KB) | HTML Full-text | XML Full-text
Abstract
We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open Access Article: Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
Entropy 2009, 11(4), 959-971; https://doi.org/10.3390/e11040959
Received: 3 November 2009 / Accepted: 30 November 2009 / Published: 2 December 2009
Cited by 5 | PDF Full-text (187 KB) | HTML Full-text | XML Full-text
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models. The parallelism with all these systems is established. Also, as a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed. Full article
(This article belongs to the Special Issue Information and Entropy)
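As a quick numerical illustration of the type of result described (a sketch, not the paper's geometrical derivation), the code below samples points uniformly on the simplex Σ x_i = E, which is the linear version of such an additive constraint, and compares the marginal distribution of one generic agent with the exponential (Gamma with shape 1) law expected for large n; the values of n, E and the sample size are arbitrary choices.

import numpy as np

# Uniform sampling on the simplex x_1 + ... + x_n = E with x_i >= 0:
# a flat Dirichlet(1, ..., 1) draw scaled by E is uniform on that set.
rng = np.random.default_rng(0)
n, E, n_samples = 200, 200.0, 50_000
points = rng.dirichlet(np.ones(n), size=n_samples) * E
agent = points[:, 0]                              # one generic agent over the ensemble

# For large n the marginal tends to an exponential with mean E/n (Gamma, shape 1).
mean_theory = E / n
hist, edges = np.histogram(agent, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
exponential = np.exp(-centers / mean_theory) / mean_theory
print(np.max(np.abs(hist - exponential)))         # small value indicates close agreement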
Open Access Article: Calculation of Entropy for a Sinusoid with Beta-Distributed Phase
Entropy 2009, 11(4), 949-958; https://doi.org/10.3390/e11040949
Received: 17 September 2009 / Accepted: 13 November 2009 / Published: 2 December 2009
PDF Full-text (256 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, an analytical expression is developed for the differential entropy of a sinusoid with a Beta-distributed phase angle. This particular signal model is prevalent in optical communications; however, an expression for the associated differential entropy does not currently exist. The expression we derive is approximate, as it relies on a series expansion for one of the key terms needed in the derivation. However, we are able to show that the approximation is accurate (error ≤ 5%) for a wide variety of Beta parameter choices. Full article
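The analytical series expression derived in the paper is not reproduced here; as a simple numerical point of comparison, the sketch below estimates the differential entropy of x = sin(θ), with θ drawn from a Beta distribution rescaled to [0, 2π), by Monte Carlo sampling and a histogram estimator. The mapping of the Beta phase onto [0, 2π), the unit amplitude and the bin count are illustrative assumptions, and the histogram estimator is only a crude check.

import numpy as np

def sinusoid_beta_phase_entropy(a, b, n_samples=200_000, bins=400, seed=0):
    """Crude histogram estimate of the differential entropy (nats) of sin(theta),
    where theta is 2*pi times a Beta(a, b) random variable."""
    rng = np.random.default_rng(seed)
    theta = 2.0 * np.pi * rng.beta(a, b, size=n_samples)
    x = np.sin(theta)
    density, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    # h(X) ~ -sum_i p_i * log(p_i) * width_i for a piecewise-constant density estimate.
    return -np.sum(density[mask] * np.log(density[mask]) * widths[mask])

print(sinusoid_beta_phase_entropy(2.0, 5.0))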
Open Access Article: A Story and a Recommendation about the Principle of Maximum Entropy Production
Entropy 2009, 11(4), 945-948; https://doi.org/10.3390/e11040945
Received: 23 October 2009 / Accepted: 26 November 2009 / Published: 30 November 2009
Cited by 9 | PDF Full-text (70 KB) | HTML Full-text | XML Full-text
Abstract
The principle of maximum entropy production (MEP) is the subject of considerable academic study, but has yet to become remarkable for its practical applications. A tale is told of an instance in which a spin-off from consideration of an MEP-constrained climate model at least led to re-consideration of the very practical issue of water-vapour feedback in climate change. Further, and on a more-or-less unrelated matter, a recommendation is made for further research on whether there might exist a general "rule" whereby, for certain classes of complex non-linear systems, a state of maximum entropy production is equivalent to a state of minimum entropy. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open Access Article: Maximum Entropy Production as an Inference Algorithm that Translates Physical Assumptions into Macroscopic Predictions: Don’t Shoot the Messenger
Entropy 2009, 11(4), 931-944; https://doi.org/10.3390/e11040931
Received: 23 October 2009 / Accepted: 23 November 2009 / Published: 27 November 2009
Cited by 49 | PDF Full-text (268 KB) | HTML Full-text | XML Full-text
Abstract
Is Maximum Entropy Production (MEP) a physical principle? In this paper I tentatively suggest it is not, on the basis that MEP is equivalent to Jaynes’ Maximum Entropy (MaxEnt) inference algorithm that passively translates physical assumptions into macroscopic predictions, as applied to non-equilibrium systems. MaxEnt itself has no physical content; disagreement between MaxEnt predictions and experiment falsifies the physical assumptions, not MaxEnt. While it remains to be shown rigorously that MEP is indeed equivalent to MaxEnt for systems arbitrarily far from equilibrium, work in progress tentatively supports this conclusion. In terms of its role within non-equilibrium statistical mechanics, MEP might then be better understood as Messenger of Essential Physics. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open Access Article: A Weighted Generalized Maximum Entropy Estimator with a Data-Driven Weight
Entropy 2009, 11(4), 917-930; https://doi.org/10.3390/e11040917
Received: 24 September 2009 / Accepted: 16 November 2009 / Published: 26 November 2009
Cited by 8 | PDF Full-text (189 KB) | HTML Full-text | XML Full-text
Abstract
The method of Generalized Maximum Entropy (GME), proposed in Golan, Judge and Miller (1996), is an information-theoretic approach that is robust to the multicollinearity problem. It uses an objective function that is the sum of the entropies of the coefficient distributions and the disturbance distributions. This method can be generalized to the weighted GME (W-GME), where different weights are assigned to the two entropies in the objective function. We propose a data-driven method to select the weights in the entropy objective function, using least squares cross-validation to derive the optimal weights. Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to and often outperforms the conventional GME estimator, which places equal weights on the entropies of the coefficient and disturbance distributions. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open Access Article: On the Structural Non-identifiability of Flexible Branched Polymers
Entropy 2009, 11(4), 907-916; https://doi.org/10.3390/e11040907
Received: 21 October 2009 / Accepted: 18 November 2009 / Published: 20 November 2009
Cited by 7 | PDF Full-text (235 KB) | HTML Full-text | XML Full-text
Abstract
The dynamics and statics of flexible polymer chains are based on their conformational entropy, so the properties of isolated polymer chains with any branching can potentially be characterized by Gaussian chain models. According to the graph-theoretical approach, the dynamics and statics of Gaussian chains can be expressed as a set of eigenvalues of their Laplacian matrix. As such, the existence of Laplacian cospectral trees allows the structural non-identifiability of any branched flexible polymer. Full article
(This article belongs to the Special Issue Entropies of Polymers)
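To make the cospectrality argument concrete, the sketch below brute-force searches small trees for families that share a Laplacian spectrum, using the networkx library; any Gaussian-chain property determined solely by the Laplacian eigenvalues cannot distinguish members of such a family. The order range, the eigenvalue rounding tolerance and the reliance on a floating-point spectrum as a grouping key are pragmatic assumptions, and the smallest cospectral pair may lie beyond the range searched, in which case the script simply reports none.

from collections import defaultdict

import networkx as nx
import numpy as np

def spectrum_key(tree, decimals=8):
    """Sorted Laplacian eigenvalues, rounded so they can serve as a dictionary key."""
    return tuple(np.round(np.sort(nx.laplacian_spectrum(tree)), decimals))

# Group all non-isomorphic trees of each order by Laplacian spectrum; any group with
# more than one member is a cospectral (hence structurally non-identifiable) family.
for order in range(6, 13):
    groups = defaultdict(list)
    for tree in nx.nonisomorphic_trees(order):
        groups[spectrum_key(tree)].append(tree)
    mates = [g for g in groups.values() if len(g) > 1]
    print(f"{order} vertices: {len(mates)} Laplacian-cospectral group(s)")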