Table of Contents

Entropy, Volume 15, Issue 1 (January 2013), Pages 1-415

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF form, click the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Jump to: Review, Other

Open Access Article Quantitative Analysis of Dynamic Behaviours of Rural Areas at Provincial Level Using Public Data of Gross Domestic Product
Entropy 2013, 15(1), 10-31; doi:10.3390/e15010010
Received: 7 November 2012 / Revised: 4 December 2012 / Accepted: 7 December 2012 / Published: 20 December 2012
Cited by 7 | PDF Full-text (8814 KB) | HTML Full-text | XML Full-text
Abstract
A spatial approach that incorporates three economic components and one environmental factor has been developed to evaluate the dynamic behaviours of the rural areas at a provincial level. An artificial fish swarm algorithm with variable population size (AFSAVP) is proposed for the spatial problem. A functional region affecting index θ is employed as a fitness function for the AFSAVP driven optimisation, in which a gross domestic product (GDP) based method is utilised to estimate the CO2 emission of all provinces. A simulation for the administrative provinces of China has been implemented, and the results have shown that the modelling method based on GDP data can assess the spatial dynamic behaviours and can be taken as an operational tool for the policy planners. Full article
(This article belongs to the Special Issue Entropy and Urban Sprawl)
Open Access Article Function Based Fault Detection for Uncertain Multivariate Nonlinear Non-Gaussian Stochastic Systems Using Entropy Optimization Principle
Entropy 2013, 15(1), 32-52; doi:10.3390/e15010032
Received: 26 June 2012 / Revised: 16 December 2012 / Accepted: 18 December 2012 / Published: 21 December 2012
Cited by 1 | PDF Full-text (1113 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, the fault detection in uncertain multivariate nonlinear non-Gaussian stochastic systems is further investigated. Entropy is introduced to characterize the stochastic behavior of the detection errors, and the entropy optimization principle is established for the fault detection problem. The principle is to maximize the entropies of the stochastic detection errors in the presence of faults and to minimize the entropies of the detection errors in the presence of disturbances. In order to calculate the entropies, the formulations of the joint probability density functions (JPDFs) of the stochastic errors are presented in terms of the known JPDFs of both the disturbances and the faults. By using the novel performance indexes and the formulations for the entropies of the detection errors, new fault detection design methods are provided for the considered multivariate nonlinear non-Gaussian plants. Finally, a simulation example is given to illustrate the efficiency of the proposed fault detection algorithm. Full article
Open Access Article Ordered Regions within a Nonlinear Time Series Solution of a Lorenz Form of the Townsend Equations for a Boundary-Layer Flow
Entropy 2013, 15(1), 53-79; doi:10.3390/e15010053
Received: 21 August 2012 / Revised: 12 December 2012 / Accepted: 20 December 2012 / Published: 24 December 2012
Cited by 5 | PDF Full-text (1612 KB) | HTML Full-text | XML Full-text
Abstract
A modified form of the Townsend equations for the fluctuating velocity wave vectors is applied to a laminar three-dimensional boundary-layer flow. These equations are cast into a Lorenz-type system of equations. The initial system of Lorenz equations yields the generation of masked output signals containing internal ordered regions. The self-synchronizing property of the Lorenz system of equations is then exploited by considering the initial Lorenz system as a transmitter system providing chaotic masked information signals to a series of identical Lorenz receiver systems. The output signal from each successive receiver system indicates the growing recovery of ordered regions in the chaotic output signal. Finally, the three-dimensional graph of the output velocity wave vector signal from the fourth receiver system and the spectral entropy rates for the output axial velocity wave vector indicate the presence of ordered regions which are characterized as axially-directed spiral vortices. Full article
Open Access Article Numerical Study of Entropy Generation in a Flowing Nanofluid Used in Micro- and Minichannels
Entropy 2013, 15(1), 144-155; doi:10.3390/e15010144
Received: 13 November 2012 / Revised: 4 December 2012 / Accepted: 7 December 2012 / Published: 7 January 2013
Cited by 19 | PDF Full-text (614 KB) | HTML Full-text | XML Full-text
Abstract
This article mainly concerns theoretical research on entropy generation influences due to heat transfer and flow in nanofluid suspensions. A conventional nanofluid of alumina-water (Al2O3-H2O) was considered as the fluid model. Due to the sensitivity of entropy to duct diameter, mini- and microchannels with diameters of 3 mm and 0.05 mm were considered, and a laminar flow regime was assumed. The conductivity and viscosity of two different nanofluid models were examined with the help of theoretical and experimentally determined parameter values. It was shown that order-of-magnitude analysis can be used for estimating the entropy generation characteristics of nanofluids in mini- and microchannels. It was found that using highly viscous alumina-water nanofluid under a laminar flow regime in microchannels was not desirable. Thus, there is a need for the development of low-viscosity alumina-water (Al2O3-H2O) nanofluids for use in microchannels under laminar flow conditions. On the other hand, Al2O3-H2O nanofluid was a superior coolant under a laminar flow regime in minichannels. The presented results also indicate that flow friction and thermal irreversibility are, respectively, more significant at lower and higher tube diameters. Full article
Open Access Article The Thermal Entropy Density of Spacetime
Entropy 2013, 15(1), 156-161; doi:10.3390/e15010156
Received: 28 November 2012 / Accepted: 25 December 2012 / Published: 8 January 2013
Cited by 2 | PDF Full-text (147 KB)
Abstract
Introducing the notion of thermal entropy density via the first law of thermodynamics and assuming the Einstein equation as an equation of thermal state, we obtain the thermal entropy density of any arbitrary spacetime without assuming a temperature or a horizon. The results confirm that there is a profound connection between gravity and thermodynamics. Full article
(This article belongs to the Special Issue Modified Gravity: From Black Holes Entropy to Current Cosmology)
Open Access Article Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
Entropy 2013, 15(1), 177-197; doi:10.3390/e15010177
Received: 16 November 2012 / Accepted: 31 December 2012 / Published: 10 January 2013
Cited by 3 | PDF Full-text (493 KB) | HTML Full-text | XML Full-text
Abstract
We present a new interpretation of a local framework for information dynamics, including the transfer entropy, by defining a moving frame of reference for the observer of dynamics in lattice systems. This formulation is inspired by the idea of investigating "relativistic" effects on observing the dynamics of information - in particular, we investigate a Galilean transformation of the lattice system data. In applying this interpretation to elementary cellular automata, we demonstrate that using a moving frame of reference certainly alters the observed spatiotemporal measurements of information dynamics, yet still returns meaningful results in this context. We find that, as expected, an observer will report coherent spatiotemporal structures that are moving in their frame as information transfer, and structures that are stationary in their frame as information storage. Crucially, the extent to which the shifted frame of reference alters the results depends on whether the shift of frame retains, adds or removes relevant information regarding the source-destination interaction. Full article
(This article belongs to the Special Issue Transfer Entropy)
Open Access Article Compensated Transfer Entropy as a Tool for Reliably Estimating Information Transfer in Physiological Time Series
Entropy 2013, 15(1), 198-219; doi:10.3390/e15010198
Received: 29 October 2012 / Revised: 21 December 2012 / Accepted: 5 January 2013 / Published: 11 January 2013
Cited by 21 | PDF Full-text (314 KB) | HTML Full-text | XML Full-text
Abstract
We present a framework for the estimation of transfer entropy (TE) under the conditions typical of physiological system analysis, featuring short multivariate time series and the presence of instantaneous causality (IC). The framework is based on recognizing that TE can be interpreted as the difference between two conditional entropy (CE) terms, and builds on an efficient CE estimator that compensates for the bias occurring for high dimensional conditioning vectors and follows a sequential embedding procedure whereby the conditioning vectors are formed progressively according to a criterion for CE minimization. The issue of IC is faced by accounting for zero-lag interactions according to two alternative empirical strategies: if IC is deemed physiologically meaningful, zero-lag effects are assimilated to lagged effects to make them causally relevant; if not, zero-lag effects are incorporated in both CE terms to obtain a compensation. The resulting compensated TE (cTE) estimator is tested on simulated time series, showing that its utilization improves sensitivity (from 61% to 96%) and specificity (from 5/6 to 0/6 false positives) in the detection of information transfer when instantaneous effects are causally meaningful and non-meaningful, respectively. Then, it is evaluated on examples of cardiovascular and neurological time series, supporting the feasibility of the proposed framework for the investigation of physiological mechanisms. Full article
(This article belongs to the Special Issue Transfer Entropy)
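The identity this abstract builds on, TE as the difference of two conditional entropy (CE) terms, can be illustrated with a minimal plug-in estimator for discrete series. This is a sketch under simplifying assumptions (history length 1, naive counting), not the compensated estimator proposed in the paper; the series `src` and `dst` are illustrative:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a list of hashable symbols."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def transfer_entropy(src, dst):
    """TE(src -> dst) with history length 1, written as a difference of
    two conditional entropies:
      TE = H(dst_t | dst_{t-1}) - H(dst_t | dst_{t-1}, src_{t-1})
    where each conditional entropy is a difference of joint entropies."""
    d_now, d_past, s_past = dst[1:], dst[:-1], src[:-1]
    h_restricted = entropy(list(zip(d_now, d_past))) - entropy(d_past)
    h_full = (entropy(list(zip(d_now, d_past, s_past)))
              - entropy(list(zip(d_past, s_past))))
    return h_restricted - h_full

# dst copies src with a one-step lag, so src's past resolves dst's future
src = [0, 1, 1, 0, 1, 0, 0, 1] * 50
dst = [0] + src[:-1]
print(transfer_entropy(src, dst))  # large: src drives dst
print(transfer_entropy(dst, src))  # smaller: little extra information
```

The whole estimate reduces to four calls to one plug-in entropy routine; the paper's contribution is precisely the bias compensation that such a naive estimator lacks for short, high-dimensional data.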
Open Access Article Effects of Convective Heating on Entropy Generation Rate in a Channel with Permeable Walls
Entropy 2013, 15(1), 220-233; doi:10.3390/e15010220
Received: 2 December 2012 / Revised: 31 December 2012 / Accepted: 5 January 2013 / Published: 11 January 2013
Cited by 13 | PDF Full-text (609 KB) | HTML Full-text | XML Full-text
Abstract
This study deals with the combined effects of convective heating and suction/injection on the entropy generation rate in a steady flow of an incompressible viscous fluid through a channel with permeable walls. The model equations for momentum and energy balance are solved numerically using shooting quadrature. Both the velocity and temperature profiles are obtained and utilized to compute the entropy generation number. The effects of the key parameters on the fluid velocity, temperature, entropy generation rate and Bejan number are depicted graphically and analyzed in detail. Full article
Open Access Article Using Exergy to Correlate Energy Research Investments and Efficiencies: Concept and Case Studies
Entropy 2013, 15(1), 262-286; doi:10.3390/e15010262
Received: 30 November 2012 / Revised: 28 December 2012 / Accepted: 5 January 2013 / Published: 16 January 2013
Cited by 5 | PDF Full-text (934 KB) | HTML Full-text | XML Full-text
Abstract
The use of exergy to correlate energy-utilization efficiencies and energy research investments is described. Specifically, energy and exergy losses are compared with energy research and development expenditures, demonstrating that the latter correlates with energy losses, even though it would be more sensible to allocate energy research and development funding in line with exergy losses, as they represent the actual deviation of efficiency from the ideal. The methodology is outlined and illustrated with two case studies. The case studies consider the province of Ontario, Canada and the United States. The investigation utilizes data on the energy utilization in a country or region, including flows of energy and exergy through the main sectors of the economy. The results are expected to be of use to government and public authorities that administer research and development funding and resources and should help improve the effectiveness of such investments. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
Open Access Article Covariance-Based Measurement Selection Criterion for Gaussian-Based Algorithms
Entropy 2013, 15(1), 287-310; doi:10.3390/e15010287
Received: 15 October 2012 / Revised: 16 December 2012 / Accepted: 8 January 2013 / Published: 17 January 2013
Cited by 1 | PDF Full-text (3021 KB) | HTML Full-text | XML Full-text
Abstract
Process modeling by means of Gaussian-based algorithms often suffers from redundant information which usually increases the estimation computational complexity without significantly improving the estimation performance. In this article, a non-arbitrary measurement selection criterion for Gaussian-based algorithms is proposed. The measurement selection criterion is based on the determination of the most significant measurement from both an estimation convergence perspective and the covariance matrix associated with the measurement. The selection criterion is independent from the nature of the measured variable. This criterion is used in conjunction with three Gaussian-based algorithms: the EIF (Extended Information Filter), the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). Nevertheless, the measurement selection criterion shown herein can also be applied to other Gaussian-based algorithms. Although this work is focused on environment modeling, the results shown herein can be applied to other Gaussian-based algorithm implementations. Mathematical descriptions and implementation results that validate the proposal are also included in this work. Full article
Open Access Article Information and Metabolism in Bacterial Chemotaxis
Entropy 2013, 15(1), 311-326; doi:10.3390/e15010311
Received: 14 October 2012 / Revised: 13 November 2012 / Accepted: 26 December 2012 / Published: 18 January 2013
Cited by 6 | PDF Full-text (278 KB) | HTML Full-text | XML Full-text
Abstract
One of the most important issues in theoretical biology is to understand how control mechanisms are deployed by organisms to maintain their homeostasis and ensure their survival. A crucial issue is how organisms deal with environmental information in a way that ensures appropriate exchanges with the environment — even in the most basic of life forms (namely, bacteria). In this paper, I present an information theoretic formulation of how Escherichia coli responds to environmental information during chemotaxis and, more generally, a cybernetic model of the relationship between information and biophysical (metabolic) dynamics. Full article
Open Access Article Asymptotic Behavior of the Maximum Entropy Routing in Computer Networks
Entropy 2013, 15(1), 361-371; doi:10.3390/e15010361
Received: 26 November 2012 / Revised: 16 January 2013 / Accepted: 17 January 2013 / Published: 21 January 2013
Cited by 9 | PDF Full-text (191 KB) | HTML Full-text | XML Full-text
Abstract
The maximum entropy method has been successfully used for underdetermined systems. The network design problem, with its routing and topology subproblems, is an underdetermined system and a good candidate for the maximum entropy method. Wireless ad-hoc networks with rapidly changing topology and link quality, where the speed of recalculation is of crucial importance, have recently been successfully investigated with the maximum entropy method. In this paper we prove a theorem that establishes asymptotic properties of the maximum entropy routing solution. This result, besides being theoretically interesting, can be used to guide the initial approximation for iterative optimization algorithms and to speed up their convergence. Full article
(This article belongs to the Special Issue Information Theory Applied to Communications and Networking)
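The premise of this abstract, that the maximum entropy method resolves underdetermined systems, can be illustrated with the classic dice example rather than the paper's routing problem: a single mean constraint leaves five degrees of freedom, and maximum entropy selects the exponentially tilted distribution, with the Lagrange multiplier found here by bisection. A sketch only, not the paper's algorithm:

```python
from math import exp

def maxent_die(target_mean, faces=6, tol=1e-10):
    """Maximum entropy distribution over die faces 1..faces subject to a
    fixed mean: an underdetermined problem (one constraint, faces-1 free
    probabilities) whose maxent solution is p_k proportional to
    exp(lam * k). The multiplier lam is found by bisection, since the
    mean of the tilted distribution is monotone increasing in lam."""
    def mean(lam):
        w = [exp(lam * k) for k in range(1, faces + 1)]
        z = sum(w)
        return sum(k * wk for k, wk in zip(range(1, faces + 1), w)) / z
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(lam * k) for k in range(1, faces + 1)]
    z = sum(w)
    return [wk / z for wk in w]

print(maxent_die(3.5))  # uniform: the natural mean needs no tilt
print(maxent_die(4.5))  # tilted toward the high faces
```

The theorem in the paper concerns asymptotics of the maxent routing solution; the role of a good starting point for such an iterative search is what the bisection initial bracket plays in this toy version.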
Open Access Article Expanding the Algorithmic Information Theory Frame for Applications to Earth Observation
Entropy 2013, 15(1), 407-415; doi:10.3390/e15010407
Received: 28 November 2012 / Revised: 20 December 2012 / Accepted: 14 January 2013 / Published: 22 January 2013
Cited by 2 | PDF Full-text (480 KB) | HTML Full-text | XML Full-text
Abstract
Recent years have witnessed an increased interest in compression-based methods and their applications to remote sensing, as these take a data-driven, parameter-free approach and can thus be successfully employed in several applications, especially in image information mining. This paper expands the algorithmic information theory frame on which these methods are based. On the one hand, algorithms originally defined in the pattern matching domain are reformulated, allowing a better understanding of the available compression-based tools for remote sensing applications. On the other hand, the use of existing compression algorithms is proposed to store satellite images with added semantic value. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)

Review

Jump to: Research, Other

Open Access Review Machine Learning with Squared-Loss Mutual Information
Entropy 2013, 15(1), 80-112; doi:10.3390/e15010080
Received: 29 October 2012 / Revised: 7 December 2012 / Accepted: 21 December 2012 / Published: 27 December 2012
Cited by 10 | PDF Full-text (350 KB) | HTML Full-text | XML Full-text
Abstract
Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference. Full article
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
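The Pearson-divergence definition of SMI quoted in this abstract can be evaluated directly when the variables are discrete. The sketch below is a naive plug-in version for intuition only; the review itself concerns direct density-ratio estimation, which avoids estimating the densities separately:

```python
from collections import Counter

def smi(xs, ys):
    """Squared-loss mutual information for discrete samples: the Pearson
    divergence between the joint distribution and the product of the
    marginals,
      SMI = (1/2) * sum_{x,y} p(x) p(y) * (p(x,y) / (p(x) p(y)) - 1)^2
    estimated here with plain empirical frequencies."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    total = 0.0
    for x, cx in px.items():
        for y, cy in py.items():
            pprod = (cx / n) * (cy / n)
            ratio = (pxy.get((x, y), 0) / n) / pprod  # density ratio
            total += pprod * (ratio - 1.0) ** 2
    return 0.5 * total

xs = [0, 0, 1, 1] * 100
ys_dep = xs[:]            # fully dependent: an exact copy
ys_ind = [0, 1] * 200     # independent of xs by construction
print(smi(xs, ys_dep))    # 0.5 for a balanced binary copy
print(smi(xs, ys_ind))    # 0.0 under exact independence
```

For a balanced binary copy the plug-in SMI is exactly 0.5, and it vanishes for independent inputs, matching the definition's behavior at the two extremes.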
Open Access Review The Relation between Granger Causality and Directed Information Theory: A Review
Entropy 2013, 15(1), 113-143; doi:10.3390/e15010113
Received: 14 November 2012 / Revised: 19 December 2012 / Accepted: 19 December 2012 / Published: 28 December 2012
Cited by 16 | PDF Full-text (579 KB) | HTML Full-text | XML Full-text
Abstract
This report reviews the conceptual and theoretical links between Granger causality and directed information theory. We begin with a short historical tour of Granger causality, concentrating on its closeness to information theory. The definitions of Granger causality based on prediction are recalled, and the importance of the observation set is discussed. We present the definitions based on conditional independence. The notion of instantaneous coupling is included in the definitions. The concept of Granger causality graphs is discussed. We present directed information theory from the perspective of studies of causal influences between stochastic processes. Causal conditioning appears to be the cornerstone for the relation between information theory and Granger causality. In the bivariate case, the fundamental measure is the directed information, which decomposes as the sum of the transfer entropies and a term quantifying instantaneous coupling. We show the decomposition of the mutual information into the sums of the transfer entropies and the instantaneous coupling measure, a relation known for the linear Gaussian case. We study the multivariate case, showing that the useful decomposition is blurred by instantaneous coupling. The links are further developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks as hypothesis testing. Full article
(This article belongs to the Special Issue Transfer Entropy)
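The prediction-based definition of Granger causality recalled in this abstract compares how well a target series is predicted from its own past alone versus its own past plus the source's past. A minimal linear least-squares sketch (lag 1, illustrative variable names; in the linear Gaussian case this log-variance ratio is proportional to the transfer entropy discussed in the review):

```python
import numpy as np

def gc(x, y):
    """Granger causality x -> y at lag 1: log of the ratio of residual
    variances between the restricted model y[t] ~ y[t-1] and the full
    model y[t] ~ y[t-1], x[t-1]. Positive values mean x's past improves
    the prediction of y beyond y's own past."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    def resid_var(cols):
        A = np.column_stack(cols + [np.ones_like(yt)])  # regressors + intercept
        coef, *_ = np.linalg.lstsq(A, yt, rcond=None)
        r = yt - A @ coef
        return float(np.mean(r ** 2))
    return float(np.log(resid_var([yp]) / resid_var([yp, xp])))

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(gc(x, y))  # clearly positive: x drives y
print(gc(y, x))  # near zero: no feedback from y to x
```

Because the full model nests the restricted one, the in-sample ratio is at least 1, so the measure is nonnegative; hypothesis testing, as the review discusses, is what separates genuinely positive values from overfitting noise.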
Open Access Review Conformal Gravity: Dark Matter and Dark Energy
Entropy 2013, 15(1), 162-176; doi:10.3390/e15010162
Received: 28 August 2012 / Revised: 17 December 2012 / Accepted: 7 January 2013 / Published: 9 January 2013
Cited by 7 | PDF Full-text (216 KB) | HTML Full-text | XML Full-text
Abstract
This short review examines recent progress in understanding dark matter, dark energy, and galactic halos using theory that departs minimally from standard particle physics and cosmology. Strict conformal symmetry (local Weyl scaling covariance), postulated for all elementary massless fields, retains standard fermion and gauge boson theory but modifies Einstein–Hilbert general relativity and the Higgs scalar field model, with no new physical fields. Subgalactic phenomenology is retained. Without invoking dark matter, conformal gravity and a conformal Higgs model fit empirical data on galactic rotational velocities, galactic halos, and Hubble expansion including dark energy. Full article
(This article belongs to the Special Issue Modified Gravity: From Black Holes Entropy to Current Cosmology)
Open Access Review The Liang-Kleeman Information Flow: Theory and Applications
Entropy 2013, 15(1), 327-360; doi:10.3390/e15010327
Received: 17 October 2012 / Revised: 22 November 2012 / Accepted: 28 December 2012 / Published: 18 January 2013
Cited by 9 | PDF Full-text (996 KB) | HTML Full-text | XML Full-text
Abstract
Information flow, or information transfer as it may be referred to, is a fundamental notion in general physics which has wide applications in scientific disciplines. Recently, a rigorous formalism has been established with respect to both deterministic and stochastic systems, with flow measures explicitly obtained. These measures possess some important properties, among which is flow or transfer asymmetry. The formalism has been validated and put to application with a variety of benchmark systems, such as the baker transformation, Hénon map, truncated Burgers-Hopf system, Langevin equation, etc. In the chaotic Burgers-Hopf system, all the transfers, save for one, are essentially zero, indicating that the processes underlying a dynamical phenomenon, albeit complex, could be simple. (Truth is simple.) In the Langevin equation case, it is found that there could be no information flowing from one certain time series to another series, though the two are highly correlated. Information flow/transfer provides a potential measure of the cause–effect relation between dynamical events, a relation usually hidden behind the correlation in a traditional sense. Full article
(This article belongs to the Special Issue Transfer Entropy)
Open Access Review Is Encephalopathy a Mechanism to Renew Sulfate in Autism?
Entropy 2013, 15(1), 372-406; doi:10.3390/e15010372
Received: 8 October 2012 / Revised: 14 January 2013 / Accepted: 15 January 2013 / Published: 22 January 2013
Cited by 6 | PDF Full-text (777 KB) | HTML Full-text | XML Full-text
Abstract
This paper makes two claims: (1) autism can be characterized as a chronic low-grade encephalopathy, associated with excess exposure to nitric oxide, ammonia and glutamate in the central nervous system, which leads to hippocampal pathologies and resulting cognitive impairment, and (2) encephalitis is provoked by a systemic deficiency in sulfate, but associated seizures and fever support sulfate restoration. We argue that impaired synthesis of cholesterol sulfate in the skin and red blood cells, catalyzed by sunlight and nitric oxide synthase enzymes, creates a state of colloidal instability in the blood manifested as a low zeta potential and increased interfacial stress. Encephalitis, while life-threatening, can result in partial renewal of sulfate supply, promoting neuronal survival. Research is cited showing how taurine may not only help protect neurons from hypochlorite exposure, but also provide a source for sulfate renewal. Several environmental factors can synergistically promote the encephalopathy of autism, including the herbicide glyphosate, aluminum, mercury, lead, nutritional deficiencies in thiamine and zinc, and yeast overgrowth due to excess dietary sugar. Given these facts, dietary and lifestyle changes, including increased sulfur ingestion, organic whole foods, increased sun exposure, and avoidance of toxins such as aluminum, mercury, and lead, may help to alleviate symptoms or, in some instances, to prevent autism altogether. Full article
(This article belongs to the Special Issue Biosemiotic Entropy: Disorder, Disease, and Mortality)

Other

Jump to: Research, Review

Open Access Concept Paper Urban Ecosystem Health Assessment and Its Application in Management: A Multi-Scale Perspective
Entropy 2013, 15(1), 1-9; doi:10.3390/e15010001
Received: 12 November 2012 / Revised: 6 December 2012 / Accepted: 16 December 2012 / Published: 20 December 2012
Cited by 3 | PDF Full-text (408 KB) | HTML Full-text | XML Full-text
Abstract
Urban ecosystem health assessments can be applied extensively in urban management to evaluate the status quo of the urban ecosystem, identify limiting factors and key problems, optimize schemes and guide ecological regulation. Given the multi-layer roles of urban ecosystems, urban ecosystem health should be assessed at different scales, with each assessment providing a specific reference for urban management from its own viewpoint. Therefore, a novel framework of multi-scale urban ecosystem health assessment is established at the global, national, regional and local scales. The framework is demonstrated with a case study in Guangzhou City, China, where urban ecosystem health assessment is conducted in the order of global, national, regional, and local scales, proceeding from macro to micro and from rough to detailed analysis. Compared with the traditional urban ecosystem health assessment focused on the local scale, the new multi-scale framework can generate a more comprehensive understanding of urban ecosystem health, a more accurate orientation of urban development, and more feasible regulation and management programs. Full article
(This article belongs to the Special Issue Entropy and Urban Sprawl)
Open AccessConcept Paper Biosemiotic Entropy of the Genome: Mutations and Epigenetic Imbalances Resulting in Cancer
Entropy 2013, 15(1), 234-261; doi:10.3390/e15010234
Received: 1 November 2012 / Revised: 30 December 2012 / Accepted: 11 January 2013 / Published: 16 January 2013
Cited by 5 | PDF Full-text (6400 KB) | HTML Full-text | XML Full-text
Abstract
Biosemiotic entropy involves the deterioration of biological sign systems. The genome is a coded sign system that is connected to phenotypic outputs through the interpretive functions of the tRNA/ribosome machinery. This symbolic sign system (semiosis) at the core of all biology has [...] Read more.
Biosemiotic entropy involves the deterioration of biological sign systems. The genome is a coded sign system that is connected to phenotypic outputs through the interpretive functions of the tRNA/ribosome machinery. This symbolic sign system (semiosis) at the core of all biology has been termed “biosemiosis”. Layers of biosemiosis and cellular information management are analogous in varying degrees to the semiotics of computer programming, spoken, and written human languages. Biosemiotic entropy — an error or deviation from a healthy state — results from errors in copying functional information (mutations) and errors in the appropriate context or quantity of gene expression (epigenetic imbalance). The concept of biosemiotic entropy is a deeply embedded assumption in the study of cancer biology. Cells have a homeostatic, preprogrammed, ideal or healthy state that is rooted in genomics, strictly orchestrated by epigenetic regulation, and maintained by DNA repair mechanisms. Cancer is a prime illustration of biosemiotic entropy, in which the corrosion of genetic information via substitutions, deletions, insertions, fusions, and aberrant regulation results in malignant phenotypes. However, little attention has been given to explicitly outlining the paradigm of biosemiotic entropy in the context of cancer. Herein we distill semiotic theory (from the familiar and well understood spheres of human language and computer code) to draw analogies useful for understanding the operation of biological semiosis at the genetic level. We propose that the myriad checkpoints, error-correcting mechanisms, and immunities are all systems whose primary role is to defend against the constant pressure of biosemiotic entropy, which malignancy must shut down in order to reach advanced stages.
In lieu of the narrower tumor suppressor/oncogene model, characterizing oncogenesis within the biosemiotic framework of sign, index, or object entropy may allow for more effective explanatory hypotheses for cancer diagnosis, with consequences for improved profiling and better therapeutic outcomes. Full article
(This article belongs to the Special Issue Biosemiotic Entropy: Disorder, Disease, and Mortality)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18