Table of Contents

Entropy, Volume 11, Issue 3 (September 2009), Pages 326-528

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Open AccessArticle Imaging Velocimetry Measurements for Entropy Production in a Rotational Magnetic Stirring Tank and Parallel Channel Flow
Entropy 2009, 11(3), 334-350; doi:10.3390/e11030334
Received: 3 July 2009 / Accepted: 19 July 2009 / Published: 23 July 2009
Cited by 4 | PDF Full-text (667 KB) | HTML Full-text | XML Full-text
Abstract
An experimental design is presented for an optical method of measuring spatial variations of flow irreversibilities in laminar viscous fluid motion. Pulsed laser measurements of fluid velocity with PIV (Particle Image Velocimetry) are post-processed to determine the local flow irreversibilities. The experimental technique yields whole-field measurements of instantaneous entropy production with a non-intrusive, optical method. Unlike point-wise methods that give measured velocities at single points in space, the PIV method is used to measure spatial velocity gradients over the entire problem domain. When combined with local temperatures and thermal irreversibilities, these velocity gradients can be used to find local losses of energy availability and exergy destruction. This article focuses on the frictional portion of entropy production, which leads to irreversible dissipation of mechanical energy to internal energy through friction. Such effects are significant in various technological applications, ranging from power turbines to internal duct flows and turbomachinery. Specific problems of a rotational stirring tank and channel flow are examined in this paper. By tracking the local flow irreversibilities, designers can focus on problem areas of highest entropy production to make local component modifications, thereby improving the overall energy efficiency of the system. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
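As an illustrative aside (not from the article): for a 2-D incompressible field, the frictional entropy production rate per unit volume is s = (μ/T)[2(∂u/∂x)² + 2(∂v/∂y)² + (∂u/∂y + ∂v/∂x)²], and a PIV-style post-processing step can evaluate it with finite differences. The following Python sketch uses our own function names and a plane-Couette test case, not the authors' code or data:

```python
import numpy as np

def frictional_entropy_production(u, v, dx, dy, mu, T):
    """Local frictional entropy production rate (W m^-3 K^-1) of a 2-D
    incompressible velocity field, via the viscous dissipation function."""
    du_dy, du_dx = np.gradient(u, dy, dx)   # arrays indexed [y, x]
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    phi = 2.0 * du_dx**2 + 2.0 * dv_dy**2 + (du_dy + dv_dx)**2
    return mu * phi / T

# Plane Couette flow u = U*y/H: the only nonzero gradient is du/dy = U/H,
# so the production field should be uniform and equal to mu*(U/H)^2 / T.
H, U, mu, T = 0.01, 0.1, 1.0e-3, 300.0
x = np.linspace(0.0, 0.05, 80)
y = np.linspace(0.0, H, 50)
X, Y = np.meshgrid(x, y)                    # shapes (50, 80): rows vary in y
u = U * Y / H
v = np.zeros_like(u)
s = frictional_entropy_production(u, v, x[1] - x[0], y[1] - y[0], mu, T)
assert np.allclose(s, mu * (U / H)**2 / T)
```

Because the Couette profile is linear, the finite-difference gradients are exact, which makes it a convenient check on the post-processing step.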
Open AccessArticle Decimative Multiplication of Entropy Arrays, with Application to Influenza
Entropy 2009, 11(3), 351-359; doi:10.3390/e11030351
Received: 17 May 2009 / Accepted: 29 July 2009 / Published: 31 July 2009
Cited by 4 | PDF Full-text (486 KB) | HTML Full-text | XML Full-text | Correction
Abstract
The use of the digital signal processing procedure of decimation is introduced as a tool to detect patterns of information entropy distribution and is applied to information entropy in influenza A segment 7. Decimation was able to reveal patterns of entropy accumulation in archival and emerging segment 7 sequences that were not apparent in the complete, undecimated data. The low entropy accumulation along the first 25% of segment 7, revealed by the three frames of decimation, may be a sign of regulation at both protein and RNA levels to conserve important viral functions. Low segment 7 entropy values from the 2009 H1N1 swine flu pandemic suggests either that: (1) the viruses causing the current outbreak have convergently evolved to their low entropy state or (2) more likely, not enough time has yet passed for the entropy to accumulate. Because of its dependence upon the periodicity of the codon, the decimative procedure should be generalizable to any biological system. Full article
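The decimation step itself is simple to sketch: with a decimation factor of three, a per-position entropy array splits into three codon-frame subsequences whose cumulative sums reveal frame-dependent entropy accumulation. A minimal Python illustration with synthetic data (not the influenza segment 7 sequences used in the paper):

```python
import numpy as np

def positional_entropy(column_counts):
    """Shannon entropy (bits) of one alignment column, given symbol counts."""
    p = np.asarray(column_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

def decimate(entropy, factor=3):
    """Split a positional entropy array into `factor` phase-shifted
    subsequences (codon frames when factor == 3)."""
    return [entropy[offset::factor] for offset in range(factor)]

# A uniform column of four nucleotides carries exactly 2 bits.
assert positional_entropy([25, 25, 25, 25]) == 2.0

# Synthetic example: entropy concentrated in the third codon position
# (the wobble position), as often seen in protein-coding sequences.
entropy = np.array([0.0, 0.25, 1.5] * 10)   # 30 positions, period 3
frames = decimate(entropy, 3)
accumulation = [float(f.sum()) for f in frames]
# The three decimated frames cleanly separate the codon positions.
assert accumulation == [0.0, 2.5, 15.0]
```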
Open AccessArticle Properties of the Statistical Complexity Functional and Partially Deterministic HMMs
Entropy 2009, 11(3), 385-401; doi:10.3390/e110300385
Received: 31 March 2009 / Accepted: 5 August 2009 / Published: 11 August 2009
Cited by 8 | PDF Full-text (279 KB) | HTML Full-text | XML Full-text
Abstract
Statistical complexity is a measure of complexity of discrete-time stationary stochastic processes, which has many applications. We investigate its more abstract properties as a non-linear function of the space of processes and show its close relation to the Knight’s prediction process. We prove lower semi-continuity, concavity, and a formula for the ergodic decomposition of statistical complexity. On the way, we show that the discrete version of the prediction process has a continuous Markov transition. We also prove that, given the past output of a partially deterministic hidden Markov model (HMM), the uncertainty of the internal state is constant over time and knowledge of the internal state gives no additional information on the future output. Using this fact, we show that the causal state distribution is the unique stationary representation on prediction space that may have finite entropy. Full article
Open AccessArticle Continuous-Discrete Path Integral Filtering
Entropy 2009, 11(3), 402-430; doi:10.3390/e110300402
Received: 19 February 2009 / Accepted: 6 August 2009 / Published: 17 August 2009
Cited by 14 | PDF Full-text (1207 KB) | HTML Full-text | XML Full-text
Abstract
A summary of the relationship between the Langevin equation, Fokker-Planck-Kolmogorov forward equation (FPKfe) and the Feynman path integral descriptions of stochastic processes relevant for the solution of the continuous-discrete filtering problem is provided in this paper. The practical utility of the path integral formula is demonstrated via some nontrivial examples. Specifically, it is shown that the simplest approximation of the path integral formula for the fundamental solution of the FPKfe can be applied to solve nonlinear continuous-discrete filtering problems quite accurately. The Dirac-Feynman path integral filtering algorithm is quite simple, and is suitable for real-time implementation. Full article
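The "simplest approximation" mentioned in the abstract amounts, for a scalar SDE dx = f(x)dt + σdW, to a Gaussian transition density centred on the Euler step x' + f(x')Δt. A toy Python sketch of a grid-based continuous-discrete filter built on that approximation (our own construction, with assumed drift, noise, and measurement parameters):

```python
import numpy as np

def transition_matrix(grid, drift, sigma, dt):
    """One-step approximation of the FPKfe fundamental solution on a 1-D
    grid: a Gaussian centred on the deterministic Euler step x' + f(x')dt."""
    x = grid[:, None]          # destination points
    xp = grid[None, :]         # source points
    mean = xp + drift(xp) * dt
    K = np.exp(-(x - mean) ** 2 / (2.0 * sigma ** 2 * dt))
    return K / K.sum(axis=0, keepdims=True)   # normalise each source column

def filter_step(prior, K, grid, z, r):
    """Predict with the transition matrix, then Bayes-update with a
    Gaussian measurement z = x + N(0, r^2)."""
    predicted = K @ prior
    likelihood = np.exp(-(grid - z) ** 2 / (2.0 * r ** 2))
    posterior = predicted * likelihood
    return posterior / posterior.sum()

# Toy run: mean-reverting drift f(x) = -x, noisy observations near 1.0.
grid = np.linspace(-4.0, 4.0, 401)
K = transition_matrix(grid, lambda x: -x, sigma=0.5, dt=0.1)
p = np.ones_like(grid) / grid.size            # flat initial density
for z in [1.1, 0.9, 1.0, 1.05]:
    p = filter_step(p, K, grid, z, r=0.3)
estimate = float(grid @ p)                    # posterior mean
assert abs(estimate - 1.0) < 0.3
```

The posterior mean settles between the measurements and the mean-reverting pull of the drift, as a Kalman filter would predict for this nearly linear setup.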
Open AccessArticle Quantification of Information in a One-Way Plant-to-Animal Communication System
Entropy 2009, 11(3), 431-442; doi:10.3390/e110300431
Received: 15 July 2009 / Revised: 18 August 2009 / Accepted: 20 August 2009 / Published: 21 August 2009
Cited by 7 | PDF Full-text (121 KB) | HTML Full-text | XML Full-text
Abstract
In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily locate its preferred prey—one of two types of parasitic herbivores feeding on the cotton plants. By signaling which herbivore is feeding on them, the cotton plants preferentially attract the wasps to those individual plants. We interpret the emission of nine chemicals by the plants, with individual signal differences depending on the herbivore type, as constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances emitted in response to the two herbivore types to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc., of the transmitted message). We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
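The entropic measures named in the abstract are straightforward to compute from a joint probability table. A Python sketch with a hypothetical two-signal table (the real nine-signal frequencies are in DeMoraes et al.):

```python
import numpy as np

def entropies(joint):
    """Marginal, joint, and mutual entropies (bits) of a joint table
    P(sender_state, receiver_response), plus the equivocation H(X|Y)."""
    joint = np.asarray(joint, dtype=float)
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    h = lambda p: float(-(p[p > 0] * np.log2(p[p > 0])).sum())
    hx, hy, hxy = h(px), h(py), h(joint.ravel())
    mutual = hx + hy - hxy
    equivocation = hxy - hy      # residual uncertainty about the sender
    return hx, hy, hxy, mutual, equivocation

# Hypothetical 2x2 table: two herbivore types (rows) by two wasp
# responses (columns); chosen only to illustrate the calculation.
joint = [[0.40, 0.10],
         [0.10, 0.40]]
hx, hy, hxy, mi, eq = entropies(joint)
assert abs(hx - 1.0) < 1e-9          # senders equiprobable: 1 bit
assert mi > 0.0                      # the channel transmits information
assert abs(mi - (hx - eq)) < 1e-9    # I(X;Y) = H(X) - H(X|Y)
```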
Open AccessArticle Thermoeconomic Optimum Operation Conditions of a Solar-driven Heat Engine Model
Entropy 2009, 11(3), 443-453; doi:10.3390/e11030443
Received: 25 June 2009 / Accepted: 8 July 2009 / Published: 25 August 2009
Cited by 15 | PDF Full-text (212 KB) | HTML Full-text | XML Full-text
Abstract
In the present paper, the thermoeconomic optimization of an endoreversible solar-driven heat engine has been carried out by using finite-time/finite-size thermodynamic theory. In the considered heat engine model, the heat transfer from the hot reservoir to the working fluid is assumed to be of the radiation type, and the heat transfer to the cold reservoir is assumed to be of the conduction type. In this work, the optimum performance and two design parameters have been investigated under three objective functions: the power output per unit total cost, the efficient power per unit total cost, and the ecological function per unit total cost. The effects of the technical and economical parameters on the thermoeconomic performance have also been discussed under the aforementioned three criteria of performance. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
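As background to this style of analysis: for the classic Curzon-Ahlborn endoreversible engine (conduction on both sides, a simpler cousin of the radiative/conductive model optimized here), maximum power occurs at efficiency 1 - sqrt(Tc/Th). A Python sketch of the numerical optimization (our own illustration, not the paper's model or objective functions):

```python
import numpy as np

def ca_efficiency_at_max_power(th, tc, k=1.0, n=100_000):
    """Grid-search the power maximum of a Curzon-Ahlborn endoreversible
    engine with equal conductances k on both sides, over the hot-side
    working-fluid temperature t1."""
    t1 = np.linspace(th / 2 + 1e-6, th - 1e-6, n)
    # Endoreversibility (q_h/t1 = q_c/t2) with q_h = k(th-t1), q_c = k(t2-tc)
    # fixes the cold-side fluid temperature: t2 = tc / (2 - th/t1).
    t2 = tc / (2.0 - th / t1)
    qh = k * (th - t1)
    eta = 1.0 - t2 / t1
    power = qh * eta
    return float(eta[int(np.argmax(power))])

th, tc = 1200.0, 300.0
eta_star = ca_efficiency_at_max_power(th, tc)
# Known closed form: eta* = 1 - sqrt(tc/th) = 0.5 for this temperature pair.
assert abs(eta_star - (1.0 - np.sqrt(tc / th))) < 1e-3
```

Replacing the hot-side conduction law with a radiation law, and dividing the objective by a cost function, turns this toy into the kind of optimization the paper carries out.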
Open AccessArticle Emergence of Animals from Heat Engines – Part 1. Before the Snowball Earths
Entropy 2009, 11(3), 463-512; doi:10.3390/e11030463
Received: 20 August 2009 / Accepted: 15 September 2009 / Published: 18 September 2009
Cited by 1 | PDF Full-text (1242 KB) | HTML Full-text | XML Full-text
Abstract
The origin of life has previously been modeled by biological heat engines driven by thermal cycling, caused by suspension in convecting water. Here more complex heat engines are invoked to explain the origin of animals in the thermal gradient above a submarine hydrothermal vent. Thermal cycling by a filamentous protein ‘thermotether’ was the result of a temperature-gradient-induced relaxation oscillation not impeded by the low Reynolds number of a small scale. During evolution a ‘flagellar proton pump’ emerged that resembled Feynman’s ratchet and that turned into today’s bacterial flagellar motor. An emerged ‘flagellar computer’ functioning as a Turing machine implemented chemotaxis. Full article
(This article belongs to the Special Issue Nonequilibrium Thermodynamics)

Open AccessArticle Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation
Entropy 2009, 11(3), 513-528; doi:10.3390/e11030513
Received: 1 September 2009 / Accepted: 14 September 2009 / Published: 24 September 2009
Cited by 2 | PDF Full-text (387 KB) | HTML Full-text | XML Full-text
Abstract
By a “covering” we mean a Gaussian mixture model fit to observed data. Approximations of the Bayes factor can be used to judge model fit to the data within a given Gaussian mixture model. Between families of Gaussian mixture models, we propose the Rényi quadratic entropy as an excellent and tractable model comparison framework. We exemplify this using the segmentation of an MRI image volume, based (1) on a direct Gaussian mixture model applied to the marginal distribution function, and (2) on a Gaussian model fit through k-means applied to the 4D multivalued image volume furnished by the wavelet transform. Visual preference for one model over another is not immediate. The Rényi quadratic entropy allows us to show clearly that one of these modelings is superior to the other. Full article
(This article belongs to the Special Issue Information and Entropy)
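The Rényi quadratic entropy H2 = -log ∫ p(x)² dx is tractable for Gaussian mixtures because the integral has a closed form: ∫ p² = Σᵢⱼ wᵢwⱼ N(μᵢ - μⱼ; 0, σᵢ² + σⱼ²). A one-dimensional Python sketch of this computation (our own illustration, not the paper's code):

```python
import numpy as np

def renyi_quadratic_entropy(weights, means, variances):
    """H2 = -log integral p(x)^2 dx for a 1-D Gaussian mixture, using the
    closed form: integral p^2 = sum_ij w_i w_j N(mu_i - mu_j; 0, v_i + v_j)."""
    w = np.asarray(weights, dtype=float)
    m = np.asarray(means, dtype=float)
    v = np.asarray(variances, dtype=float)
    dm = m[:, None] - m[None, :]
    sv = v[:, None] + v[None, :]
    cross = np.exp(-dm**2 / (2.0 * sv)) / np.sqrt(2.0 * np.pi * sv)
    return float(-np.log((w[:, None] * w[None, :] * cross).sum()))

# Single Gaussian: H2 = 0.5 * log(4*pi*sigma^2) exactly.
h2_single = renyi_quadratic_entropy([1.0], [0.0], [1.0])
assert abs(h2_single - 0.5 * np.log(4.0 * np.pi)) < 1e-9

# Spreading the same mass over two separated components raises H2.
h2_mix = renyi_quadratic_entropy([0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])
assert h2_mix > h2_single
```

Comparing two candidate coverings then reduces to comparing two such numbers, with no integration over the data space required.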

Review

Open AccessReview Thermodynamics of the System of Distinguishable Particles
Entropy 2009, 11(3), 326-333; doi:10.3390/e11030326
Received: 16 April 2009 / Accepted: 25 June 2009 / Published: 29 June 2009
Cited by 9 | PDF Full-text (120 KB)
Abstract
The issue of the thermodynamics of a system of distinguishable particles is discussed in this paper. In constructing the statistical mechanics of distinguishable particles from the definition of Boltzmann entropy, it is found that the entropy is not extensive. The inextensivity leads to the so-called Gibbs paradox, in which the mixing entropy of two identical classical gases increases. A large body of literature, approaching the problem from different points of view, has been created to resolve the paradox. In this paper, starting from the Boltzmann entropy, we present the thermodynamics of the system of distinguishable particles. A straightforward way to obtain the corrected Boltzmann counting is shown. The corrected Boltzmann counting factor can be justified in classical statistical mechanics. Full article
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
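The role of the corrected Boltzmann counting factor 1/N! can be illustrated with a short computation: joining two identical gas samples should produce no mixing entropy, and this holds (up to an O(log N) Stirling residue) only with the correction. A Python sketch of the configurational part (the momentum terms are identical in both counting conventions and cancel from the comparison):

```python
from math import lgamma, log

def entropy_uncorrected(n, volume):
    """Configurational Boltzmann entropy S/k of n distinguishable
    particles in a box of the given volume."""
    return n * log(volume)

def entropy_corrected(n, volume):
    """Same, with the corrected Boltzmann counting factor 1/n!."""
    return n * log(volume) - lgamma(n + 1)

n, v = 10_000, 1.0
# Join two identical samples: (n, v) + (n, v) -> (2n, 2v).
mix_unc = entropy_uncorrected(2 * n, 2 * v) - 2 * entropy_uncorrected(n, v)
mix_cor = entropy_corrected(2 * n, 2 * v) - 2 * entropy_corrected(n, v)
# Without 1/n! the "mixing entropy" is the spurious extensive term 2n*ln(2);
# with it, only an O(log n) residue remains, vanishing per particle.
assert abs(mix_unc - 2 * n * log(2)) < 1e-6
assert abs(mix_cor) < log(n)
```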
Open AccessReview Entropic Forces in Geophysical Fluid Dynamics
Entropy 2009, 11(3), 360-383; doi:10.3390/e11030360
Received: 2 June 2009 / Accepted: 1 August 2009 / Published: 7 August 2009
Cited by 7 | PDF Full-text (846 KB) | HTML Full-text | XML Full-text
Abstract
Theories and numerical models of atmospheres and oceans are based on classical mechanics with added parameterizations to represent subgrid variability. Reformulated in terms of derivatives of information entropy with respect to large scale configurations, we find systematic forces very different from those usually assumed. Two examples are given. We see that entropic forcing by ocean eddies systematically drives, rather than retards, large scale circulation. Additionally we find that small scale turbulence systematically drives up gradient (“un-mixing”) fluxes. Such results confront usual understanding and modeling practice. Full article
(This article belongs to the Special Issue Concepts of Entropy)

Other

Open AccessCorrection Thompson, W.A. et al. Decimative Multiplication of Entropy Arrays, with Application to Influenza. Entropy, 2009, 11, 351-359
Entropy 2009, 11(3), 384; doi:10.3390/e110300384
Received: 6 August 2009 / Published: 7 August 2009
Cited by 1 | PDF Full-text (16 KB) | HTML Full-text | XML Full-text
Abstract The sentence in the sixth line from the end of paragraph two on page 355 should read: “The second synonymous mutation was another G=>A transition at position 600 that converted the CAG codon to CAA, without change of encoded amino acid.” Full article
Open AccessLetter Gibbs’ Paradox in the Light of Newton’s Notion of State
Entropy 2009, 11(3), 454-456; doi:10.3390/e11030454
Received: 10 July 2009 / Accepted: 3 September 2009 / Published: 7 September 2009
Cited by 4 | PDF Full-text (48 KB) | HTML Full-text | XML Full-text
Abstract In this letter, it is argued that the correct counting of microstates is obtained from the very beginning when using Newtonian rather than Laplacian state functions, because the former are intrinsically permutation invariant. Full article
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
Open AccessCorrection Minardi, E. Thermodynamics of High Temperature Plasmas. Entropy, 2009, 11, 124-221
Entropy 2009, 11(3), 457-462; doi:10.3390/e11030457
Received: 4 September 2009 / Accepted: 14 September 2009 / Published: 14 September 2009
PDF Full-text (52 KB) | HTML Full-text | XML Full-text
Abstract I discovered some typographical errors in the paper "Thermodynamics of High Temperature Plasmas" [1] and some points that need clarification. These defects are remedied in this correction. Full article

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18