Entropy, Volume 11, Issue 4 (December 2009) – 38 articles, Pages 529–1147

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF format, click its "PDF Full-text" link and open the file with the free Adobe Reader.

Research


219 KiB  
Article
The Quantum Noise of Ferromagnetic π-Bloch Domain Walls
by Peter R. Crompton
Entropy 2009, 11(4), 548-559; https://doi.org/10.3390/e11040548 - 28 Sep 2009
Cited by 1 | Viewed by 8322
Abstract
We quantify the probability per unit Euclidean-time of reversing the magnetization of a π-Bloch vector, which describes the Ferromagnetic Domain Walls of a Ferromagnetic Nanowire at finite temperatures. Our approach, based on Langer’s Theory, treats the double sine-Gordon model that defines the π-Bloch vectors via a procedure of nonperturbative renormalization, and uses importance sampling methods to minimise the free energy of the system and identify the saddlepoint solution corresponding to the reversal probability. We find that whilst the general solution for the free energy minima cannot be expressed in closed form, we can obtain a closed expression for the saddlepoint by maximizing the entanglement entropy of the system as a polynomial ring. We use this approach to quantify the geometric and non-geometric contributions to the entanglement entropy of the Ferromagnetic Nanowire, defined between entangled Ferromagnetic Domain Walls, and evaluate the Euclidean-time dependence of the domain wall width and the angular momentum transfer at the domain walls, which has recently been proposed as a mechanism for Quantum Memory Storage. Full article

1393 KiB  
Article
An Entropy-Like Estimator for Robust Parameter Identification
by Giovanni Indiveri
Entropy 2009, 11(4), 560-585; https://doi.org/10.3390/e11040560 - 12 Oct 2009
Cited by 19 | Viewed by 7705
Abstract
This paper describes the basic ideas behind a novel prediction error parameter identification algorithm exhibiting high robustness with respect to outlying data. Given the low sensitivity to outliers, these can be more easily identified by analysing the residuals of the fit. The devised cost function is inspired by the definition of entropy, although the method in itself does not exploit the stochastic meaning of entropy in its usual sense. After describing the most common alternative approaches for robust identification, the novel method is presented together with numerical examples for validation. Full article

175 KiB  
Article
Landauer’s Principle and Divergenceless Dynamical Systems
by Claudia Zander, Angel Ricardo Plastino, Angelo Plastino, Montserrat Casas and Sergio Curilef
Entropy 2009, 11(4), 586-597; https://doi.org/10.3390/e11040586 - 13 Oct 2009
Cited by 5 | Viewed by 10265
Abstract
Landauer’s principle is one of the pillars of the physics of information. It constitutes one of the foundations behind the idea that “information is physical”. Landauer’s principle establishes the smallest amount of energy that has to be dissipated when one bit of information is erased from a computing device. Here we explore an extended Landauer-like principle valid for general dynamical systems (not necessarily Hamiltonian) governed by divergenceless phase space flows. Full article
(This article belongs to the Special Issue Information and Entropy)
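The quantitative content of Landauer's principle is the bound E ≥ k_B T ln 2 per erased bit; a minimal numerical sketch (the function name and temperature are illustrative, not from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(T):
    """Minimum energy (J) that must be dissipated to erase one bit at temperature T (K)."""
    return k_B * T * math.log(2)

# At room temperature the bound is roughly 2.9e-21 J per bit.
print(landauer_bound(300.0))
```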
165 KiB  
Article
An Application of Entropy in Survey Scale
by Özlem Ege Oruç, Emel Kuruoğlu and Özgül Vupa
Entropy 2009, 11(4), 598-605; https://doi.org/10.3390/e11040598 - 14 Oct 2009
Cited by 2 | Viewed by 8304
Abstract
This study demonstrates an application of information-theoretic entropy in the field of survey scales. Based on a computer anxiety scale, we find that the desired information may be achieved with fewer questions. In particular, one question is insufficient and two questions are necessary for a survey subscale. Full article
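The entropy in play is Shannon's, applied to the response distribution of a survey item; a hedged sketch with invented Likert counts (not the authors' data):

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of a response-frequency distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical answer counts for a 5-point Likert item.
print(shannon_entropy([10, 20, 40, 20, 10]))  # about 2.12 bits
```

A flat distribution maximizes this measure, so items whose responses spread evenly carry the most information per question.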

626 KiB  
Article
Economies Evolve by Energy Dispersal
by Arto Annila and Stanley Salthe
Entropy 2009, 11(4), 606-633; https://doi.org/10.3390/e11040606 - 21 Oct 2009
Cited by 57 | Viewed by 21091 | Correction
Abstract
Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics. The universal law, when formulated locally as an equation of motion, reveals that a growing economy develops functional machinery and organizes hierarchically in such a way as to tend to equalize energy density differences within the economy and with respect to the surroundings it is open to. Diverse economic activities result in flows of energy that will preferentially channel along the most steeply descending paths, leveling a non-Euclidean free energy landscape. This principle of “maximal energy dispersal”, equivalent to the maximal rate of entropy production, gives rise to economic laws and regularities. The law of diminishing returns follows from the diminishing free energy, while the relation between supply and demand displays a quest for a balance among interdependent energy densities. Economic evolution is dissipative motion where the driving forces and energy flows are inseparable from each other. When there are multiple degrees of freedom, economic growth and decline are inherently impossible to forecast in detail. Namely, trajectories of an evolving economy are non-integrable, i.e., unpredictable in detail, because a decision by a player will also affect the future decisions of other players. We propose that decision making is ultimately about choosing, from various actions, those that would most effectively reduce subjectively perceived energy gradients. Full article

143 KiB  
Article
A Lower-Bound for the Maximin Redundancy in Pattern Coding
by Aurélien Garivier
Entropy 2009, 11(4), 634-642; https://doi.org/10.3390/e11040634 - 22 Oct 2009
Cited by 7 | Viewed by 8043
Abstract
We show that the maximin average redundancy in pattern coding is eventually larger than 1.84 (n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not fill the gap between known lower and upper bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns, while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands. Full article
(This article belongs to the Special Issue Information and Entropy)
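The pattern transformation defined in the abstract is simple to make concrete; a minimal sketch:

```python
def pattern(s):
    """Replace each symbol by the index (1-based) of its first occurrence."""
    first_index = {}
    out = []
    for ch in s:
        if ch not in first_index:
            first_index[ch] = len(first_index) + 1
        out.append(first_index[ch])
    return out

print(pattern("abracadabra"))  # [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```

Note that the pattern discards the identity of the symbols, which is what makes coding it feasible even over an infinite alphabet.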

2634 KiB  
Article
The Influence of Shape on Parallel Self-Assembly
by Shuhei Miyashita, Zoltán Nagy, Bradley J. Nelson and Rolf Pfeifer
Entropy 2009, 11(4), 643-666; https://doi.org/10.3390/e11040643 - 23 Oct 2009
Cited by 19 | Viewed by 9231
Abstract
Self-assembly is a key phenomenon whereby vast numbers of individual components passively interact and form organized structures, as can be seen, for example, in the morphogenesis of a virus. Generally speaking, the process can be viewed as a spatial placement of attractive and repulsive components. In this paper, we report on an investigation of how morphology, i.e., the shape of components, affects a self-assembly process. The experiments were conducted with three differently shaped floating tiles equipped with magnets in an agitated water tank. We propose a novel measure involving clustering coefficients, which quantifies the degree of parallelism of the assembly process. The results showed that the assembly processes were affected by the aggregation sequence in their early stages, where shape induces different behaviors and thus results in variations in aggregation speeds. Full article

423 KiB  
Article
Configurational Entropy in Chiral Solutions—Negative Entropy of Solvent Envelopes
by Meir Shinitzky
Entropy 2009, 11(4), 667-674; https://doi.org/10.3390/e11040667 - 29 Oct 2009
Cited by 4 | Viewed by 9207
Abstract
A homogeneous solution of a chiral substance possesses an overall asymmetry, which is expressed by a specific rotation of linearly polarized light. Such a solution, despite being at complete equilibrium, stores configurational entropy in the form of negative entropy which can be nullified by mixing with a solution of the opposite enantiomer. This abundant, yet quite specific, case of inherent negative entropy resides predominantly in the chiral configuration of the solvent envelopes surrounding the chiral centers. Heat release, amounting to several cal/mol, associated with the annulment of negative entropy in aqueous solutions of D- and L-amino acids, was recently documented by Shinitzky et al. [1]. This heat corresponds almost exclusively to TΔS stored in the solvent envelope upon adoption of a chiral configuration. Simple fundamental expressions which combine configurational entropy and information capacity in chiral solutions have been developed and were found to comply well with the observed heat release upon intermolecular racemization. Full article
(This article belongs to the Special Issue Configurational Entropy)

164 KiB  
Article
The Maximum Entropy Rate Description of a Thermodynamic System in a Stationary Non-Equilibrium State
by Marco Favretti
Entropy 2009, 11(4), 675-687; https://doi.org/10.3390/e11040675 - 29 Oct 2009
Cited by 6 | Viewed by 7571
Abstract
In this paper we present a simple model to describe a rather general system in a stationary non-equilibrium state, which is an open system traversed by a stationary flux. The probabilistic description is provided by a non-homogeneous Markov chain, which is not assumed on the basis of a model of the microscopic interactions but rather derived from the knowledge of the macroscopic fluxes traversing the system through a maximum entropy rate principle. Full article
(This article belongs to the Special Issue Maximum Entropy)
233 KiB  
Article
A Law of Word Meaning in Dolphin Whistle Types
by Ramon Ferrer-i-Cancho and Brenda McCowan
Entropy 2009, 11(4), 688-701; https://doi.org/10.3390/e11040688 - 30 Oct 2009
Cited by 27 | Viewed by 10758
Abstract
We show that dolphin whistle types tend to be used in specific behavioral contexts, which is consistent with the hypothesis that dolphin whistles have some sort of “meaning”. Moreover, in some cases, it can be shown that the behavioral context in which a whistle tends to occur or not occur is shared by different individuals, which is consistent with the hypothesis that dolphins are communicating through whistles. Furthermore, we show that the number of behavioral contexts significantly associated with a certain whistle type tends to grow with the frequency of the whistle type, a pattern that is reminiscent of a law of word meanings stating, as a tendency, that the higher the frequency of a word, the higher its number of meanings. Our findings indicate that the presence of Zipf's law in dolphin whistle types cannot be explained in enough detail by a simplistic die-rolling experiment. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
216 KiB  
Article
Determination of the Real Loss of Power for a Condensing and a Backpressure Turbine by Means of Second Law Analysis
by Henrik Holmberg, Pekka Ruohonen and Pekka Ahtila
Entropy 2009, 11(4), 702-712; https://doi.org/10.3390/e11040702 - 30 Oct 2009
Cited by 29 | Viewed by 12810
Abstract
All real processes generate entropy and the power/exergy loss is usually determined by means of the Gouy-Stodola law. If the system only exchanges heat at the environmental temperature, the Gouy-Stodola law gives the correct loss of power. However, most industrial processes exchange heat at higher or lower temperatures than the actual environmental temperature. When calculating the real loss of power in these cases, the Gouy-Stodola law does not give the correct loss if the actual environmental temperature is used. The first aim of this paper is to show through simple steam turbine examples that the previous statement is true. The second aim of the paper is to define the effective temperature to calculate the real power loss of the system with the Gouy-Stodola law, and to apply it to turbine examples. Example calculations also show that the correct power loss can be defined if the effective temperature is used instead of the real environmental temperature. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
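The Gouy-Stodola law itself is a one-liner: lost power equals a reference temperature times the entropy generation rate. A sketch of the paper's point, with invented numbers (T0, T_eff and S_gen are illustrative, not the paper's turbine data):

```python
def lost_power(T_ref, S_gen):
    """Gouy-Stodola law: lost power (W) = reference temperature (K) * entropy generation rate (W/K)."""
    return T_ref * S_gen

S_gen = 500.0   # hypothetical entropy generation rate of a turbine, W/K
T0 = 298.0      # actual environmental temperature, K
T_eff = 330.0   # hypothetical effective temperature of the heat exchange, K

# The two temperature choices give different losses; the paper argues an
# effective temperature yields the real loss when heat is not exchanged at T0.
print(lost_power(T0, S_gen), lost_power(T_eff, S_gen))
```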

194 KiB  
Article
Transport of Heat and Charge in Electromagnetic Metrology Based on Nonequilibrium Statistical Mechanics
by James Baker-Jarvis and Jack Surek
Entropy 2009, 11(4), 748-765; https://doi.org/10.3390/e11040748 - 03 Nov 2009
Cited by 1 | Viewed by 8795
Abstract
Current research is probing transport on ever smaller scales. Modeling of the electromagnetic interaction with nanoparticles or small collections of dipoles and its associated energy transport and nonequilibrium characteristics requires a detailed understanding of transport properties. The goal of this paper is to use a nonequilibrium statistical-mechanical method to obtain exact time-correlation functions, fluctuation-dissipation theorems (FD), heat and charge transport, and associated transport expressions under electromagnetic driving. We extend the time-symmetric Robertson statistical-mechanical theory to study the exact time evolution of relevant variables and the entropy rate in the electromagnetic interaction with materials. In this exact statistical-mechanical theory, a generalized canonical density is used to define an entropy in terms of a set of relevant variables and associated Lagrange multipliers. Then the entropy production rate is defined through the relevant variables. The influence of the nonrelevant variables enters the equations through the projection-like operator and thereby influences the entropy. We present applications to the response functions for the electrical and thermal conductivity, specific heat, generalized temperature, Boltzmann’s constant, and noise. The analysis can be performed either classically or quantum-mechanically, and there are only a few modifications in transferring between the approaches. As an application we study the energy, generalized temperature, and charge transport equations that are valid in nonequilibrium and relate them to heat flow and temperature relations in equilibrium states. Full article
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
222 KiB  
Article
What is Fair Pay for Executives? An Information Theoretic Analysis of Wage Distributions
by Venkat Venkatasubramanian
Entropy 2009, 11(4), 766-781; https://doi.org/10.3390/e11040766 - 03 Nov 2009
Cited by 11 | Viewed by 20618
Abstract
The high pay packages of U.S. CEOs have raised serious concerns about what would constitute fair pay. Since present economic models do not adequately address this fundamental question, we propose a new theory based on statistical mechanics and information theory. We use the principle of maximum entropy to show that the maximally fair pay distribution is lognormal under ideal conditions. This prediction is in agreement with observed data for the bottom 90%–95% of the working population. The theory estimates that the top 35 U.S. CEOs were overpaid by about 129 times their ideal salaries in 2008. We also provide an insight into entropy as a measure of fairness, which is maximized at equilibrium, in an economic system. Full article
(This article belongs to the Special Issue Maximum Entropy)
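The lognormal claim can be illustrated numerically: fixing the mean and variance of log-pay, maximum entropy gives a lognormal pay distribution, so sampled log-salaries should reproduce those moments. A sketch with invented parameters (not the paper's data):

```python
import math
import random

random.seed(0)
mu, sigma = math.log(50000), 0.6          # hypothetical log-pay location and scale
salaries = [random.lognormvariate(mu, sigma) for _ in range(100000)]

logs = [math.log(s) for s in salaries]
mean = sum(logs) / len(logs)
var = sum((x - mean) ** 2 for x in logs) / len(logs)

# The sample moments of log-pay recover (mu, sigma^2), consistent with lognormality.
print(round(mean, 2), round(var, 2))
```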
241 KiB  
Article
The Languages of Neurons: An Analysis of Coding Mechanisms by Which Neurons Communicate, Learn and Store Information
by Morris H. Baslow
Entropy 2009, 11(4), 782-797; https://doi.org/10.3390/e11040782 - 04 Nov 2009
Cited by 14 | Viewed by 11983
Abstract
In this paper evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections. While information processing in the brain is highly complex, each neuron uses a simple mechanism for transmitting information. This is in the form of temporal electrophysiological action potentials or spikes (S) operating on a millisecond timescale that, along with pauses (P) between spikes constitute a two letter “alphabet” that generates meaningful frequency-encoded signals or neuronal S/P “words” in a primary language. However, when a word from an afferent neuron enters the dendritic-synaptic-dendritic field between two neurons, it is translated into a new frequency-encoded word with the same meaning, but in a different spike-pause language, that is delivered to and understood by the efferent neuron. It is suggested that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function in that it allows for variations in meaning to occur. Thus, structural or biochemical changes in dendrites or synapses can produce novel words in the second language that have changed meanings, allowing for a specific signaling experience, either external or internal, to modify the meaning of an original word (learning), and store the learned information of that experience (memory) in the form of an altered dendritic-synaptic-dendritic field. Full article

219 KiB  
Article
Exergy as a Useful Variable for Quickly Assessing the Theoretical Maximum Power of Salinity Gradient Energy Systems
by Raynald Labrecque
Entropy 2009, 11(4), 798-806; https://doi.org/10.3390/e11040798 - 05 Nov 2009
Cited by 8 | Viewed by 9151
Abstract
It is known that mechanical work, and in turn electricity, can be produced from a difference in the chemical potential that may result from a salinity gradient. Such a gradient may be found, for instance, in an estuary where a stream of soft water is flooding into a sink of salty water which we may find in an ocean, gulf or salt lake. Various technological approaches are proposed for the production of energy from a salinity gradient between a stream of soft water and a source of salty water. Before considering the implementation of a typical technology, it is of utmost importance to be able to compare various technological approaches, on the same basis, using the appropriate variables and mathematical formulations. In this context, exergy balance can become a very useful tool for an easy and quick evaluation of the maximum thermodynamic work that can be produced from energy systems. In this short paper, we briefly introduce the use of exergy for enabling us to easily and quickly assess the theoretical maximum power or ideal reversible work we may expect from typical salinity gradient energy systems. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
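As a rough order-of-magnitude companion to the abstract (my own van 't Hoff osmotic-pressure approximation, not the paper's exergy balance): the ideal work from mixing one cubic metre of fresh water into a large salty reservoir is about the reservoir's osmotic pressure.

```python
R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_work(c_salt, T, van_t_hoff_factor=2):
    """Approximate maximum work (J) per m^3 of fresh water mixed into a large
    reservoir of salt concentration c_salt (mol/m^3) at temperature T (K),
    taken as the van 't Hoff osmotic pressure of the salty side."""
    return van_t_hoff_factor * c_salt * R * T

w = ideal_mixing_work(600.0, 298.0)  # ~0.6 M NaCl, roughly seawater
print(w / 3.6e6)  # roughly 0.8 kWh per m^3 of fresh water
```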

1282 KiB  
Article
Statistical Ensemble Theory of Gompertz Growth Model
by Takuya Yamano
Entropy 2009, 11(4), 807-819; https://doi.org/10.3390/e11040807 - 05 Nov 2009
Cited by 8 | Viewed by 8583
Abstract
An ensemble formulation for the Gompertz growth function within the framework of statistical mechanics is presented, where the two growth parameters are assumed to be statistically distributed. The growth can be viewed as a self-referential process, which enables us to use the Bose-Einstein statistics picture. The analytical entropy expression pertaining to the law can be obtained in terms of the growth velocity distribution as well as the Gompertz function itself for the whole process. Full article
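For concreteness, the Gompertz growth function underlying the ensemble formulation, in its standard form (parameter names are generic, not the paper's notation):

```python
import math

def gompertz(t, K, b, c):
    """Gompertz growth curve N(t) = K * exp(-b * exp(-c*t)): starts near
    K*exp(-b) and saturates at the carrying capacity K."""
    return K * math.exp(-b * math.exp(-c * t))

print(gompertz(0.0, 100.0, 5.0, 0.5))   # initial size, 100*exp(-5)
print(gompertz(50.0, 100.0, 5.0, 0.5))  # essentially at the carrying capacity
```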

294 KiB  
Article
Using Exergy to Understand and Improve the Efficiency of Electrical Power Technologies
by Marc A. Rosen and Cornelia Aida Bulucea
Entropy 2009, 11(4), 820-835; https://doi.org/10.3390/e11040820 - 06 Nov 2009
Cited by 90 | Viewed by 12548
Abstract
The benefits of using exergy to understand the efficiencies of electrical power technologies and to assist improvements are demonstrated. Although exergy applications in power systems and electrical technology are uncommon, exergy nevertheless clearly identifies potential reductions in thermodynamic losses and efficiency improvements. Various devices are considered, ranging from simple electrical devices to generation systems for electrical power and for multiple products including electricity, and on to electrically driven devices. The insights provided by exergy are shown to be more useful than those provided by energy, which are sometimes misleading. Exergy is concluded to have a significant role in assessing and improving the efficiencies of electrical power technologies and systems, and provides a useful tool for engineers and scientists as well as decision and policy makers. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)

210 KiB  
Article
Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains
by Erik Van der Straeten
Entropy 2009, 11(4), 867-887; https://doi.org/10.3390/e11040867 - 17 Nov 2009
Cited by 9 | Viewed by 11052
Abstract
In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model. Full article
(This article belongs to the Special Issue Maximum Entropy)
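Reversibility here means detailed balance, pi_i P_ij = pi_j P_ji, with respect to the stationary distribution; a small self-contained check (the two-state chain below is invented for illustration):

```python
def is_reversible(pi, P, tol=1e-12):
    """Check detailed balance pi[i]*P[i][j] == pi[j]*P[j][i] for all i, j."""
    n = len(pi)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) <= tol
               for i in range(n) for j in range(n))

pi = [0.25, 0.75]                 # stationary distribution
P = [[0.4, 0.6], [0.2, 0.8]]      # transition matrix with pi * P == pi
print(is_reversible(pi, P))       # True: 0.25*0.6 == 0.75*0.2
```

Maximum entropy estimation as in the paper would pick, among all transition matrices satisfying given constraints, one consistent with this detailed-balance condition.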

235 KiB  
Article
On the Structural Non-identifiability of Flexible Branched Polymers
by Koh-hei Nitta
Entropy 2009, 11(4), 907-916; https://doi.org/10.3390/e11040907 - 20 Nov 2009
Cited by 9 | Viewed by 10733
Abstract
The dynamics and statics of flexible polymer chains are based on their conformational entropy, so the properties of isolated polymer chains with any branching can potentially be characterized by Gaussian chain models. According to the graph-theoretical approach, the dynamics and statics of Gaussian chains can be expressed as a set of eigenvalues of their Laplacian matrix. As such, the existence of Laplacian cospectral trees implies the structural non-identifiability of some branched flexible polymers. Full article
(This article belongs to the Special Issue Entropies of Polymers)

189 KiB  
Article
A Weighted Generalized Maximum Entropy Estimator with a Data-driven Weight
by Ximing Wu
Entropy 2009, 11(4), 917-930; https://doi.org/10.3390/e11040917 - 26 Nov 2009
Cited by 26 | Viewed by 8173
Abstract
The method of Generalized Maximum Entropy (GME), proposed in Golan, Judge and Miller (1996), is an information-theoretic approach that is robust to the multicollinearity problem. It uses an objective function that is the sum of the entropies for coefficient distributions and disturbance distributions. This method can be generalized to the weighted GME (W-GME), where different weights are assigned to the two entropies in the objective function. We propose a data-driven method to select the weights in the entropy objective function, using least squares cross-validation to derive the optimal weights. Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to and often outperforms the conventional GME estimator, which places equal weights on the entropies of coefficient and disturbance distributions. Full article
(This article belongs to the Special Issue Maximum Entropy)
268 KiB  
Article
Maximum Entropy Production as an Inference Algorithm that Translates Physical Assumptions into Macroscopic Predictions: Don’t Shoot the Messenger
by Roderick C. Dewar
Entropy 2009, 11(4), 931-944; https://doi.org/10.3390/e11040931 - 27 Nov 2009
Cited by 83 | Viewed by 13481
Abstract
Is Maximum Entropy Production (MEP) a physical principle? In this paper I tentatively suggest it is not, on the basis that MEP is equivalent to Jaynes’ Maximum Entropy (MaxEnt) inference algorithm that passively translates physical assumptions into macroscopic predictions, as applied to non-equilibrium systems. MaxEnt itself has no physical content; disagreement between MaxEnt predictions and experiment falsifies the physical assumptions, not MaxEnt. While it remains to be shown rigorously that MEP is indeed equivalent to MaxEnt for systems arbitrarily far from equilibrium, work in progress tentatively supports this conclusion. In terms of its role within non-equilibrium statistical mechanics, MEP might then be better understood as Messenger of Essential Physics. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)

70 KiB  
Article
A Story and a Recommendation about the Principle of Maximum Entropy Production
by Garth W. Paltridge
Entropy 2009, 11(4), 945-948; https://doi.org/10.3390/e11040945 - 30 Nov 2009
Cited by 15 | Viewed by 7954
Abstract
The principle of maximum entropy production (MEP) is the subject of considerable academic study, but has yet to become remarkable for its practical applications. A tale is told of an instance in which a spin-off from consideration of an MEP-constrained climate model at least led to re-consideration of the very practical issue of water-vapour feedback in climate change. Further, and on a more-or-less unrelated matter, a recommendation is made for further research on whether there might exist a general "rule" whereby, for certain classes of complex non-linear systems, a state of maximum entropy production is equivalent to a state of minimum entropy. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
256 KiB  
Article
Calculation of Entropy for a Sinusoid with Beta-Distributed Phase
by Joseph V. Michalowicz, Colin C. Olson, Frank Bucholtz and Jonathan M. Nichols
Entropy 2009, 11(4), 949-958; https://doi.org/10.3390/e11040949 - 02 Dec 2009
Viewed by 7331
Abstract
In this paper, an analytical expression is developed for the differential entropy of a sinusoid with a Beta-distributed phase angle. This particular signal model is prevalent in optical communications, however an expression for the associated differential entropy does not currently exist. The expression we derive is approximate as it relies on a series expansion for one of the key terms needed in the derivation. However, we are able to show that the approximation is accurate (error ≤ 5%) for a wide variety of Beta parameter choices. Full article
187 KiB  
Article
Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
by Ricardo López-Ruiz, Jaime Sañudo and Xavier Calbet
Entropy 2009, 11(4), 959-971; https://doi.org/10.3390/e11040959 - 02 Dec 2009
Cited by 6 | Viewed by 9217
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models, and the parallelism with all these systems is established. As a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed. Full article
(This article belongs to the Special Issue Information and Entropy)
178 KiB  
Communication
Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
by Bernard Testa
Entropy 2009, 11(4), 993-1000; https://doi.org/10.3390/e11040993 - 03 Dec 2009
Cited by 9 | Viewed by 6362
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence. A qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What will be argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (aka submergence or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities. Full article
(This article belongs to the Special Issue Information and Entropy)
1356 KiB  
Article
Best Probability Density Function for Random Sampled Data
by Donald J. Jacobs
Entropy 2009, 11(4), 1001-1024; https://doi.org/10.3390/e11041001 - 04 Dec 2009
Cited by 7 | Viewed by 10777
Abstract
The maximum entropy method is a theoretically sound approach to construct an analytical form for the probability density function (pdf) given a sample of random events. In practice, numerical methods employed to determine the appropriate Lagrange multipliers associated with a set of moments are generally unstable in the presence of noise due to limited sampling. A robust method is presented that always returns the best pdf, where tradeoff in smoothing a highly varying function due to noise can be controlled. An unconventional adaptive simulated annealing technique, called funnel diffusion, determines expansion coefficients for Chebyshev polynomials in the exponential function. Full article
(This article belongs to the Special Issue Maximum Entropy)
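The functional form at the heart of this approach, a Chebyshev series inside an exponential, is easy to sketch. The coefficients below are arbitrary placeholders for illustration, not the output of the paper's funnel-diffusion annealing:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def maxent_pdf(coeffs, n_grid=2001):
    """Evaluate p(x) proportional to exp(sum_k c_k T_k(x)) on [-1, 1], normalized numerically."""
    grid = np.linspace(-1.0, 1.0, n_grid)
    log_p = C.chebval(grid, coeffs)      # Chebyshev expansion in the exponent
    log_p -= log_p.max()                 # guard against overflow
    p = np.exp(log_p)
    p /= p.sum() * (grid[1] - grid[0])   # Riemann-sum normalization to unit area
    return grid, p

grid, p = maxent_pdf([0.0, 0.3, -1.2, 0.1])  # illustrative coefficients
```

The exponential form guarantees positivity everywhere, one reason the maximum entropy construction is robust when fitted to noisy samples.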
425 KiB  
Article
On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
by Daniel J. Graham
Entropy 2009, 11(4), 1025-1041; https://doi.org/10.3390/e11041025 - 07 Dec 2009
Cited by 1 | Viewed by 9496
Abstract
Systems do not elect thermodynamic pathways on their own. They operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author (Reference [1]) focused on the information expressed in thermodynamic pathways. Examined here is how spectral entropy is a by-product of information that depends intricately on the pathway structure. The spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the contact between spectral entropy and the properties which distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and heat → work conversions is also discussed. Full article
(This article belongs to the Special Issue Information and Entropy)
210 KiB  
Article
Modeling Electric Discharges with Entropy Production Rate Principles
by Thomas Christen
Entropy 2009, 11(4), 1042-1054; https://doi.org/10.3390/e11041042 - 08 Dec 2009
Cited by 17 | Viewed by 11448
Abstract
Under which circumstances are variational principles based on entropy production rate useful tools for modeling steady states of electric (gas) discharge systems far from equilibrium? It is first shown how various approaches, such as Steenbeck's minimum voltage principle and Prigogine's minimum entropy production rate principle, are related to the maximum entropy production rate principle (MEPP). Secondly, three typical examples are discussed, which provide a certain insight into the structure of the models that are candidates for MEPP application. It is then thirdly argued that MEPP, although not being an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly global entropy production. Finally, it is conjectured that a further reason for the success of MEPP in certain far-from-equilibrium systems might be a hidden linearity of the underlying kinetic equation(s). Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
484 KiB  
Article
Explaining Change in Language: A Cybersemiotic Perspective
by Marcel Danesi
Entropy 2009, 11(4), 1055-1072; https://doi.org/10.3390/e11041055 - 11 Dec 2009
Cited by 7 | Viewed by 8698
Abstract
One of the greatest conundrums in semiotics and linguistics is explaining why change occurs in communication systems. The descriptive apparatus of how change occurs has been developed in great detail since at least the nineteenth century, but a viable explanatory framework of why it occurs in the first place still seems to be clouded in vagueness. So far, only the so-called Principle of Least Effort has come forward to provide a suggestive psychobiological framework for understanding change in communication codes such as language. Extensive work in using this model has shown many fascinating things about language structure and how it evolves. However, the many findings need an integrative framework for shedding light on any generalities implicit in them. This paper argues that a new approach to the study of codes, called cybersemiotics, can be used to great advantage for assessing theoretical frameworks and notions such as the Principle of Least Effort. Amalgamating cybernetic and biosemiotic notions, this new science provides analysts with valuable insights on the raison d’être of phenomena such as linguistic change. Full article
657 KiB  
Article
Entropy-Based Wavelet De-noising Method for Time Series Analysis
by Yan-Fang Sang, Dong Wang, Ji-Chun Wu, Qing-Ping Zhu and Ling Wang
Entropy 2009, 11(4), 1123-1147; https://doi.org/10.3390/e11041123 - 22 Dec 2009
Cited by 59 | Viewed by 12273
Abstract
The existence of noise has great influence on the real features of observed time series, so noise reduction in time series data is a necessary and significant task in many practical applications. When traditional de-noising methods are used, the results often cannot meet practical needs due to their inherent shortcomings. In the present paper, a set of key but difficult wavelet de-noising problems are first discussed; then, by applying information entropy theories to the wavelet de-noising process, i.e., using the principle of maximum entropy (POME) to describe the random character of the noise and using wavelet energy entropy to describe the degree of complexity of the main series in the original data, a new entropy-based wavelet de-noising method is proposed. Analysis results for several different synthetic series and for typical observed time series data have verified the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the new method is more effective and more universal. Furthermore, because information entropy theories are used to describe the clearly different characteristics of the noise and of the main series in the observed data before de-noising, the analysis process has a more reliable physical basis, and the results are more reasonable and globally optimal. Besides, the analysis process of the new method is simple and easy to implement, so it should be more applicable and useful in the applied sciences and in practical engineering work. Full article
(This article belongs to the Special Issue Maximum Entropy)
Review

Jump to: Research, Other

332 KiB  
Review
Optimal Thermodynamics—New Upperbounds
by Michel Feidt
Entropy 2009, 11(4), 529-547; https://doi.org/10.3390/e11040529 - 28 Sep 2009
Cited by 73 | Viewed by 11460
Abstract
This paper reviews how ideas have evolved in this field from the pioneering work of CARNOT right up to the present. The coupling of thermostatics with thermokinetics (heat and mass transfers) and entropy or exergy analysis is illustrated through study of thermomechanical engines such as the Carnot heat engine, and internal combustion engines. The benefits and importance of stagnation temperature and irreversibility parameters are underlined. The main situations of constrained (or unconstrained) optimization are defined, discussed and illustrated. The result of this study is a new branch of thermodynamics: Finite Dimensions Optimal Thermodynamics (FDOT). Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
367 KiB  
Review
The Maximum Entropy Formalism and the Prediction of Liquid Spray Drop-Size Distribution
by Christophe Dumouchel
Entropy 2009, 11(4), 713-747; https://doi.org/10.3390/e11040713 - 02 Nov 2009
Cited by 34 | Viewed by 9305
Abstract
The efficiency of any application involving a liquid spray is known to be highly dependent on the spray characteristics, and mainly on the drop-diameter distribution. There is therefore a crucial need for models allowing the prediction of this distribution. However, atomization processes are only partially understood and so far a universal model is not available. For almost thirty years, models based on the Maximum Entropy Formalism have been proposed to fulfill this task. This paper presents a review of these models emphasizing their similarities and differences, and discusses expectations of the use of this formalism to model the spray drop-size distribution. Full article
(This article belongs to the Special Issue Maximum Entropy)
1264 KiB  
Review
The Use of Ideas of Information Theory for Studying “Language” and Intelligence in Ants
by Boris Ryabko and Zhanna Reznikova
Entropy 2009, 11(4), 836-853; https://doi.org/10.3390/e11040836 - 10 Nov 2009
Cited by 24 | Viewed by 12257
Abstract
In this review we integrate results of a long-term experimental study on ant “language” and intelligence which were fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon’s equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p for rational communication systems. This approach enabled us to obtain the following important results on ants’ communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The obtained results show that information theory is not only an excellent mathematical theory, but that many of its results may be considered laws of Nature. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
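Shannon's relation between message frequency and optimal code length, central to the study above, can be illustrated numerically; the message names and frequencies below are hypothetical, chosen only for illustration:

```python
import math

def code_length(p):
    """Shannon's relation: a message of frequency p gets length l = -log2(p) bits."""
    return -math.log2(p)

# hypothetical ant "messages" with assumed frequencies
freqs = {"turn left": 0.5, "turn right": 0.25, "go straight": 0.25}
lengths = {msg: code_length(p) for msg, p in freqs.items()}

# in a rational communication system, the mean code length equals the entropy
mean_length = sum(p * lengths[msg] for msg, p in freqs.items())
entropy = -sum(p * math.log2(p) for p in freqs.values())
```

Frequently used messages receive short codes, which is the compression effect the authors probe experimentally.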
149 KiB  
Review
Use of Maximum Entropy Modeling in Wildlife Research
by Roger A. Baldwin
Entropy 2009, 11(4), 854-866; https://doi.org/10.3390/e11040854 - 16 Nov 2009
Cited by 543 | Viewed by 25853
Abstract
Maximum entropy (Maxent) modeling has great potential for identifying distributions and habitat selection of wildlife given its reliance on only presence locations. Recent studies indicate Maxent is relatively insensitive to spatial errors associated with location data, requires few locations to construct useful models, and performs better than other presence-only modeling approaches. Further advances are needed to better define model thresholds, to test model significance, and to address model selection. Additionally, development of modeling approaches is needed when using repeated sampling of known individuals to assess habitat selection. These advancements would strengthen the utility of Maxent for wildlife research and management. Full article
(This article belongs to the Special Issue Maximum Entropy)
137 KiB  
Review
The Variety of Information Transfer in Animal Sonic Communication: Review from a Physics Perspective
by Neville H. Fletcher
Entropy 2009, 11(4), 888-906; https://doi.org/10.3390/e11040888 - 18 Nov 2009
Cited by 5 | Viewed by 8704
Abstract
For many anatomical and physical reasons animals of different genera use widely different communication strategies. While some are chemical or visual, the most common involve sound or vibration and these signals can carry a large amount of information over long distances. The acoustic signal varies greatly from one genus to another depending upon animal size, anatomy, physiology, and habitat, as also does the way in which information is encoded in the signal, but some general principles can be elucidated showing the possibilities and limitations for information transfer. Cases discussed range from insects through song birds to humans. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
266 KiB  
Review
Fisher Information and Semiclassical Treatments
by Flavia Pennini, Gustavo Ferri and Angelo Plastino
Entropy 2009, 11(4), 972-992; https://doi.org/10.3390/e11040972 - 03 Dec 2009
Cited by 13 | Viewed by 8488
Abstract
We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well. Full article
(This article belongs to the Special Issue Maximum Entropy)
328 KiB  
Review
Processing Information in Quantum Decision Theory
by Vyacheslav I. Yukalov and Didier Sornette
Entropy 2009, 11(4), 1073-1120; https://doi.org/10.3390/e11041073 - 14 Dec 2009
Cited by 63 | Viewed by 12821
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the frame of the quantum decision theory, takes into account both the available objective information as well as subjective contextual effects. This quantum approach avoids any paradox typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms. Full article
(This article belongs to the Special Issue Information and Entropy)

Other

Jump to: Research, Review

100 KiB  
Comment
Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
by Raúl Toral
Entropy 2009, 11(4), 1121-1122; https://doi.org/10.3390/e11041121 - 22 Dec 2009
Cited by 1 | Viewed by 7822
Abstract
The volume of the body enclosed by the n-dimensional Lamé curve defined by ∑_{i=1}^{n} |x_i|^b = E is computed. Full article
(This article belongs to the Special Issue Information and Entropy)
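The closed-form volume behind this comment follows from a classical Dirichlet integral; the sketch below (function names are ours, and it restricts to the positive orthant for simplicity) checks the formula against a Monte Carlo estimate:

```python
import math
import random

def lame_volume_exact(n, b, E=1.0):
    """Volume of {x_i >= 0 : sum_i x_i**b <= E}, via the Dirichlet integral."""
    return E ** (n / b) * math.gamma(1 + 1 / b) ** n / math.gamma(1 + n / b)

def lame_volume_mc(n, b, E=1.0, samples=200_000, seed=1):
    """Monte Carlo check: fraction of a bounding cube that falls inside the body."""
    rng = random.Random(seed)
    side = E ** (1 / b)  # the body fits inside [0, side]^n
    hits = sum(
        sum(rng.uniform(0, side) ** b for _ in range(n)) <= E
        for _ in range(samples)
    )
    return hits / samples * side ** n
```

For b = 2 the body is an octant of a ball (n = 3 gives volume π/6), so the general formula reduces to familiar sphere volumes, a quick sanity check on the result.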