Special Issue "Entropy in Genetics and Computational Biology"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (31 March 2010)
Prof. Dr. Warren Ewens
324 Leidy Laboratories, Department of Biology, University of Pennsylvania, Philadelphia, PA 19104, USA
The concept of entropy arose in classical theoretical physics as a measure of the randomness, or disorder, of a physical system. The second law of thermodynamics states that the entropy of a closed system increases with time: if a closed vessel initially contains hot air at one end and cold air at the other, then as time progresses the hot and cold air become increasingly mixed, and this implies an increase in the entropy, or disorder, of the system. This is in effect a statistical law and in principle describes the most likely behaviour of the system. The huge number of atoms of air in the vessel implies, however, that this most likely behaviour is almost certain to arise, so that what is in principle a stochastic process can in practice be regarded as a deterministic one.
In the biological world random events arise constantly, but here they are far more important than in the physical context just described. As just one example, the random transmission of genes from parent to offspring implies that the study of evolution as a genetic process must allow for this randomness. Thus this study involves quite complex mathematical stochastic processes, and developments in the theory of these processes have often been motivated by biological questions. Similarly, advances in statistical theory have often, perhaps mainly, arisen in biological and medical contexts. The analysis of medical data requires statistical methods to allow for the randomness inherent in the sampling process involved in obtaining these data. Thus entropy concepts, through statistics and stochastic process theory, pervade both medicine and biology.
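To make the role of randomness in gene transmission concrete, the following minimal sketch (not part of the issue itself) simulates a standard Wright-Fisher model of two alleles and tracks the Shannon entropy of the allele-frequency distribution across generations; the population size, generation count, starting frequency, and function names are illustrative choices only.

```python
# Minimal sketch: Wright-Fisher random transmission of two alleles, with the
# Shannon entropy of the allele-frequency distribution tracked per generation.
# Parameter values are illustrative, not taken from any paper in this issue.
import math
import random

def shannon_entropy(p):
    """Entropy (in bits) of a two-allele frequency distribution (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def wright_fisher(pop_size=100, generations=50, p0=0.5, seed=1):
    """Each generation is a binomial draw of 2N gene copies from the
    parental allele frequency, i.e. purely random transmission."""
    random.seed(seed)
    p = p0
    trajectory = [(0, p, shannon_entropy(p))]
    for gen in range(1, generations + 1):
        copies = 2 * pop_size
        count = sum(1 for _ in range(copies) if random.random() < p)
        p = count / copies
        trajectory.append((gen, p, shannon_entropy(p)))
    return trajectory

if __name__ == "__main__":
    for gen, p, h in wright_fisher()[::10]:
        print(f"generation {gen:3d}  allele frequency {p:.3f}  entropy {h:.3f} bits")
```

Repeated runs with different seeds show allele frequencies drifting, and often fixing, purely by chance, which is the kind of stochastic behaviour the theory discussed in this issue must accommodate.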
Prof. Dr. Warren Ewens
All manuscripts should be submitted to email@example.com with a copy to the Guest Editor. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this Open Access journal is 1000 CHF per accepted paper.
Keywords: stochastic processes
Entropy 2010, 12(5), 1071-1101; doi:10.3390/e12051071
Received: 2 March 2010; in revised form: 10 April 2010 / Accepted: 28 April 2010 / Published: 4 May 2010
Entropy 2010, 12(5), 1102-1124; doi:10.3390/e12051102
Received: 21 February 2010; Accepted: 28 April 2010 / Published: 5 May 2010
Review: Entropy and Information Approaches to Genetic Diversity and its Expression: Genomic Geography
Entropy 2010, 12(7), 1765-1798; doi:10.3390/e12071765
Received: 1 April 2010; in revised form: 20 June 2010 / Accepted: 28 June 2010 / Published: 15 July 2010
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Type of Paper: Article
Title: Information in Biology: Metaphor or Model as Negative-Entropy? Anti-Entropy and Phenotypic Complexity along Evolution
Author: Giuseppe Longo1,2
Affiliations: 1 CNRS and Dépt. d'Informatique, École Normale Supérieure, Paris, France
2 CREA, École Polytechnique, Palaiseau, France
Abstract: The major observables in Physics are largely, if not exclusively, based on or derived from energy (conservation properties as symmetries, geodetic principles as least action principles...). Biology forced us to think in the novel terms of “organization” and, even, of inherited organization; an organization whose “complexity” grows along Evolution and embryogenesis, against energy degradation in Physics (entropy production, also in non-isolated systems). Can we borrow for the analysis of life phenomena any relevant principle or precise result from Information Theory or the understanding of Information as Negentropy? A critique of the abuse of Information in Biology will be hinted at.
Some recent work will be introduced on a quantification of “biological (phenotypic) organization” by an observable proper to Biology, anti-entropy. The idea will be derived by conceptual dualities with respect to Quantum Physics, where the operatorial approach by Schrödinger to his famous equation will (“dually”) guide us towards an equational modelling of Gould’s analysis of “phenotypic complexity” along Darwin’s Evolution and, if time allows, to some applications to embryogenesis.
References: Longo, G.; Tendero, P.-E. The differential method and the causal incompleteness of Programming Theory in Molecular Biology. Foundations of Science 2007, 12, 337-366.
Longo, G. From exact sciences to life phenomena: following Schrödinger and Turing on Programs, Life and Causality. In From Type Theory to Morphological Complexity: A Colloquium in Honor of Giuseppe Longo’s 60th Birthday, special issue of Information and Computation 2009, 207, 545-558.
Bailly, F.; Longo, G. Biological organization and anti-entropy. J. Biological Systems 2009, 17, 63-96.
Type of Paper: Review
Title: Entropy-based Approaches to Ecological and Genetic Diversity
Authors: William Sherwin, Roddy Dewar, et al.
Affiliation: School of BEES, UNSW Sydney, NSW 2052, Australia; E-Mail: firstname.lastname@example.org
Abstract: Shannon’s entropy-based diversity or information index is the standard measure for ecological communities. However, there is continuing emphasis on a related diversity measure, called heterozygosity (for genes) or Simpson’s index (for communities), which suffers from serious non-independence between hierarchical levels of organisation. Shannon’s index and mutual information are immune to these problems, and can now be theoretically predicted for given conditions of population size, dispersal and mutation. The entropy-based measures excel in their ability to express diversity intuitively, to convert to dispersal measures, and to be incorporated into statistical testing. We now need non-neutral (selection) theory at the ecological and genetic levels. There is increasing uptake of entropy-based methods by user-friendly platforms for genetic analysis.
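As a minimal sketch (not the authors’ code) of the two measures contrasted in this abstract, the following example computes Shannon’s entropy-based diversity index and Simpson’s index (equivalently, expected heterozygosity for allele frequencies) for two hypothetical frequency distributions; the distributions and function names are illustrative assumptions only.

```python
# Minimal sketch contrasting Shannon's diversity index with Simpson's index
# (expected heterozygosity) for a set of category frequencies (species or alleles).
# The example frequencies are illustrative, not data from any paper in this issue.
import math

def shannon_index(freqs):
    """Shannon diversity H' = -sum(p_i * ln p_i) over nonzero frequencies."""
    return -sum(p * math.log(p) for p in freqs if p > 0)

def simpson_index(freqs):
    """Simpson's index (heterozygosity) 1 - sum(p_i^2): the probability that
    two random draws belong to different categories."""
    return 1 - sum(p * p for p in freqs)

if __name__ == "__main__":
    even = [0.25, 0.25, 0.25, 0.25]    # maximally even distribution
    skewed = [0.70, 0.10, 0.10, 0.10]  # one dominant category
    for name, freqs in (("even", even), ("skewed", skewed)):
        print(f"{name:6s}  Shannon H' = {shannon_index(freqs):.3f}  "
              f"Simpson/heterozygosity = {simpson_index(freqs):.3f}")
```

Running the sketch shows both indices dropping as the distribution becomes more skewed, though on different scales, which is part of why the two families of measures behave differently across hierarchical levels of organisation.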
Last update: 5 May 2010