Special Issue "Information and Entropy in Biological Systems"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (20 December 2015)

Special Issue Editors

Guest Editor
Prof. Dr. John Baez

Department of Mathematics, University of California, Riverside, CA 92521, USA
Interests: information theory, network theory, and mathematical physics
Guest Editor
Prof. Dr. John Harte

Energy and Resources Group, 310 Barrows Hall, University of California, Berkeley, CA 94720, USA
Interests: ecology, information theory, global change
Guest Editor
Dr. Marc Harper

Covariant Consulting, IL, USA
Interests: mathematical biology, bioinformatics, information theory, educational technology

Special Issue Information

Dear Colleagues,

Please visit http://www.nimbios.org/workshops/WS_entropy for a detailed description of this special issue.

Prof. Dr. John Baez
Dr. Marc Harper
Prof. Dr. John Harte
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (8 papers)


Research


Open Access Article: Stationary Stability for Evolutionary Dynamics in Finite Populations
Entropy 2016, 18(9), 316; doi:10.3390/e18090316
Received: 3 April 2016 / Revised: 24 July 2016 / Accepted: 16 August 2016 / Published: 25 August 2016
Abstract
We demonstrate a vast expansion of the theory of evolutionary stability to finite populations with mutation, connecting the theory of the stationary distribution of the Moran process with the Lyapunov theory of evolutionary stability. We define the notion of stationary stability for the Moran process with mutation and generalizations, as well as a generalized notion of evolutionary stability that includes mutation, called an incentive stable state (ISS) candidate. For sufficiently large populations, extrema of the stationary distribution are ISS candidates, and we give a family of Lyapunov quantities that are locally minimized at the stationary extrema and at ISS candidates. In various examples, including the Moran and Wright–Fisher processes, we show that the local maxima of the stationary distribution capture the traditionally defined evolutionarily stable states. The classical stability theory of the replicator dynamic is recovered in the large population limit. Finally, we include descriptions of possible extensions to populations of variable size and populations evolving on graphs.
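To make the setting concrete, here is a minimal Python sketch (not the paper's exact incentive formulation) that builds a two-type Moran process with mutation, computes its stationary distribution using the reversibility of birth-death chains, and reports the local maxima; the population size, mutation rate, and payoffs are illustrative choices.

import numpy as np

N, mu = 100, 0.01
a, b, c, d = 1.0, 2.0, 2.0, 1.0   # hawk-dove-like game with an interior equilibrium at 1/2

def fitnesses(i):
    p = i / N
    fA = a * p + b * (1 - p)      # mean-field payoffs (illustrative, not the paper's exact incentives)
    fB = c * p + d * (1 - p)
    return fA, fB

def up(i):                        # P(i -> i+1)
    fA, fB = fitnesses(i)
    tot = i * fA + (N - i) * fB
    born_A = (i * fA * (1 - mu) + (N - i) * fB * mu) / tot
    return born_A * (N - i) / N

def down(i):                      # P(i -> i-1)
    fA, fB = fitnesses(i)
    tot = i * fA + (N - i) * fB
    born_B = ((N - i) * fB * (1 - mu) + i * fA * mu) / tot
    return born_B * i / N

# Birth-death chains are reversible, so the stationary distribution
# follows from the ratio of up/down transition probabilities.
pi = np.ones(N + 1)
for i in range(N):
    pi[i + 1] = pi[i] * up(i) / down(i + 1)
pi /= pi.sum()

maxima = [i for i in range(1, N) if pi[i] >= pi[i - 1] and pi[i] >= pi[i + 1]]
print("local maxima of the stationary distribution at i =", maxima)

With these payoffs the corresponding replicator dynamic has an interior rest point at frequency 1/2, so for N = 100 and mu = 0.01 the stationary distribution should peak near i = 50, which is the kind of correspondence the abstract describes.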

Open Access Article: A Cost/Speed/Reliability Tradeoff to Erasing
Entropy 2016, 18(5), 165; doi:10.3390/e18050165
Received: 21 December 2015 / Revised: 21 March 2016 / Accepted: 22 April 2016 / Published: 28 April 2016
Abstract
We present a Kullback–Leibler (KL) control treatment of the fundamental problem of erasing a bit. We introduce notions of reliability of information storage via a reliability timescale $\tau_r$, and speed of erasing via an erasing timescale $\tau_e$. Our problem formulation captures the tradeoff between speed, reliability, and the KL cost required to erase a bit. We show that rapid erasing of a reliable bit costs at least $\log 2 - \log\left(1 - e^{-\tau_e/\tau_r}\right) > \log 2$, which goes to $\frac{1}{2}\log\frac{2\tau_r}{\tau_e}$ when $\tau_r \gg \tau_e$.
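As a quick sanity check on the bound quoted above, the following Python snippet (with illustrative timescales) evaluates log 2 - log(1 - exp(-tau_e/tau_r)) for a few ratios tau_r/tau_e and confirms that it always exceeds log 2 and grows as erasure becomes fast relative to the reliability timescale.

import math

def erasure_cost_bound(tau_e, tau_r):
    # lower bound on the KL cost of erasing, as quoted in the abstract above
    return math.log(2) - math.log(1 - math.exp(-tau_e / tau_r))

for ratio in (10, 100, 1000):          # illustrative values of tau_r / tau_e
    print(ratio, round(erasure_cost_bound(1.0, float(ratio)), 3))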

Open Access Article: Open Markov Processes: A Compositional Perspective on Non-Equilibrium Steady States in Biology
Entropy 2016, 18(4), 140; doi:10.3390/e18040140
Received: 5 January 2016 / Revised: 16 February 2016 / Accepted: 6 April 2016 / Published: 15 April 2016
Abstract
In recent work, Baez, Fong and the author introduced a framework for describing Markov processes equipped with a detailed balanced equilibrium as open systems of a certain type. These “open Markov processes” serve as the building blocks for more complicated processes. In this paper, we describe the potential application of this framework to the modeling of biological systems as open systems maintained away from equilibrium. We show that non-equilibrium steady states emerge in open systems of this type, even when the rates of the underlying process admit a detailed balanced equilibrium. We show that these non-equilibrium steady states minimize a quadratic form which we call “dissipation”. In some circumstances, the dissipation is approximately equal to the rate of change of relative entropy plus a correction term. On the other hand, Prigogine’s principle of minimum entropy production generally fails for non-equilibrium steady states. We use a simple model of membrane transport to illustrate these concepts.
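The following Python sketch illustrates the basic phenomenon in a generic way (it is not the compositional framework of the paper): a three-state continuous-time Markov chain modelling transport through a membrane channel, with the two boundary states clamped to externally imposed probabilities. Solving the balance equations for the interior state yields a steady state that carries a nonzero probability current, i.e. a non-equilibrium steady state. All rates and clamped values are made up.

import numpy as np

states = ["outside", "channel", "inside"]          # toy membrane-transport model
Q = np.array([[-1.0,  1.0,  0.0],                  # Q[i, j] = rate i -> j (i != j),
              [ 0.5, -1.5,  1.0],                  # rows sum to zero
              [ 0.0,  2.0, -2.0]])

boundary = {0: 0.6, 2: 0.1}                        # externally clamped probabilities
interior = [i for i in range(len(states)) if i not in boundary]

# Steady state: for each interior state j, sum_i p_i * Q[i, j] = 0.
# Split the balance equations into clamped and interior contributions
# and solve the small linear system for the interior probabilities.
A = Q[np.ix_(interior, interior)].T
b = -sum(p * Q[i, interior] for i, p in boundary.items())
p_interior = np.linalg.solve(A, b)

for idx, p in zip(interior, p_interior):
    print(states[idx], p)

For these numbers the resulting steady state supports a constant net probability current from "outside" to "inside", which is exactly what distinguishes it from a detailed balanced equilibrium.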
Open Access Article: The Free Energy Requirements of Biological Organisms; Implications for Evolution
Entropy 2016, 18(4), 138; doi:10.3390/e18040138
Received: 8 February 2016 / Revised: 24 March 2016 / Accepted: 8 April 2016 / Published: 13 April 2016
Abstract
Recent advances in nonequilibrium statistical physics have provided unprecedented insight into the thermodynamics of dynamic processes. The author recently used these advances to extend Landauer’s semi-formal reasoning concerning the thermodynamics of bit erasure and to derive the minimal free energy required to implement an arbitrary computation. Here, I extend this analysis, deriving the minimal free energy required by an organism to run a given (stochastic) map π from its sensor inputs to its actuator outputs. I use this result to calculate the input-output map π of an organism that optimally trades off the free energy needed to run π with the phenotypic fitness that results from implementing π. I end with a general discussion of the limits imposed on the rate of the terrestrial biosphere’s information processing by the flux of sunlight on the Earth.
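For readers who want to see the kind of accounting involved, here is a hedged Python sketch of the standard generalized Landauer bound that this line of work builds on (it is not Wolpert's full sensor/actuator analysis): in a thermodynamically ideal implementation, the minimal free energy needed to map an input distribution to an output distribution is k_B T ln 2 times the drop in Shannon entropy measured in bits. The distributions below are illustrative.

import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # temperature, K

def entropy_bits(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

p_in  = [0.5, 0.5]      # uniformly random bit before erasure
p_out = [1.0, 0.0]      # bit reset to 0

min_free_energy = k_B * T * math.log(2) * (entropy_bits(p_in) - entropy_bits(p_out))
print(f"minimal free energy to erase one bit at 300 K: {min_free_energy:.2e} J")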
Open Access Article: Maximizing Diversity in Biology and Beyond
Entropy 2016, 18(3), 88; doi:10.3390/e18030088
Received: 19 December 2015 / Revised: 22 February 2016 / Accepted: 24 February 2016 / Published: 9 March 2016
Abstract
Entropy, under a variety of names, has long been used as a measure of diversity in ecology, as well as in genetics, economics and other fields. There is a spectrum of viewpoints on diversity, indexed by a real parameter q giving greater or lesser importance to rare species. Leinster and Cobbold (2012) proposed a one-parameter family of diversity measures taking into account both this variation and the varying similarities between species. Because of this latter feature, diversity is not maximized by the uniform distribution on species. So it is natural to ask: which distributions maximize diversity, and what is its maximum value? In principle, both answers depend on q, but our main theorem is that neither does. Thus, there is a single distribution that maximizes diversity from all viewpoints simultaneously, and any list of species has an unambiguous maximum diversity value. Furthermore, the maximizing distribution(s) can be computed in finite time, and any distribution maximizing diversity from some particular viewpoint q > 0 actually maximizes diversity for all q. Although we phrase our results in ecological terms, they apply very widely, with applications in graph theory and metric geometry.
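For concreteness, here is a short Python sketch of the similarity-sensitive diversity measures the abstract refers to, using what I understand to be the Leinster–Cobbold definition; the similarity matrix Z and the two candidate distributions are illustrative.

import numpy as np

def diversity(p, Z, q):
    # Similarity-sensitive diversity of order q (Leinster-Cobbold, as I recall it):
    #   D_q^Z(p) = ( sum_i p_i * (Z p)_i^(q-1) )^(1/(1-q))   for q != 1,
    # with the q -> 1 limit exp( -sum_i p_i * ln (Z p)_i ).
    p, Z = np.asarray(p, float), np.asarray(Z, float)
    Zp = Z @ p
    mask = p > 0
    if np.isclose(q, 1.0):
        return np.exp(-np.sum(p[mask] * np.log(Zp[mask])))
    return np.sum(p[mask] * Zp[mask] ** (q - 1)) ** (1.0 / (1.0 - q))

# Three species; species 1 and 2 are 90% similar, species 3 is distinct.
Z = [[1.0, 0.9, 0.0],
     [0.9, 1.0, 0.0],
     [0.0, 0.0, 1.0]]

uniform = [1/3, 1/3, 1/3]
skewed  = [0.25, 0.25, 0.5]   # down-weights the two nearly identical species
for q in (0.5, 1.0, 2.0):
    print(q, round(diversity(uniform, Z, q), 3), round(diversity(skewed, Z, q), 3))

For this Z, the skewed distribution scores at least as high as the uniform one at every q shown, illustrating why the maximizer need not be uniform once similarity between species is taken into account.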

Open Access Article: Using Expectation Maximization and Resource Overlap Techniques to Classify Species According to Their Niche Similarities in Mutualistic Networks
Entropy 2015, 17(11), 7680-7697; doi:10.3390/e17117680
Received: 27 August 2015 / Revised: 12 October 2015 / Accepted: 9 November 2015 / Published: 12 November 2015
Abstract
Mutualistic networks in nature are widespread and play a key role in generating the diversity of life on Earth. They constitute an interdisciplinary field where physicists, biologists and computer scientists work together. Plant-pollinator mutualisms in particular form complex networks of interdependence between often hundreds of species. Understanding the architecture of these networks is of paramount importance for assessing the robustness of the corresponding communities to global change and management strategies. Progress on this problem is currently limited, mainly due to the lack of methodological tools to deal with the intrinsic complexity of mutualisms, as well as the scarcity and incompleteness of available empirical data. One way to uncover the structure underlying complex networks is to employ information-theoretic statistical inference methods, such as the expectation maximization (EM) algorithm. In particular, such an approach can be used to cluster the nodes of a network based on the similarity of their node neighborhoods. Here, we show how to connect network theory with the classical ecological niche theory for mutualistic plant-pollinator webs by using the EM algorithm. We apply EM to classify the nodes of an extensive collection of mutualistic plant-pollinator networks according to their connection similarity. We find that EM largely recovers the same clustering of the species as an alternative recently proposed method based on resource overlap, where one considers each party as a consumable resource for the other party (plants providing food to animals, while animals assist the reproduction of plants). Furthermore, using the EM algorithm, we can obtain a sequence of successively refined classifications that enables us to identify the fine structure of the ecological network and to better understand the niche distribution of both plants and animals. This is an example of how information-theoretic methods help to systematize and unify work in ecology.
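As a toy illustration of clustering by node-neighborhood similarity with EM, the Python sketch below groups the "plant" rows of a made-up plant-pollinator incidence matrix into two clusters. It is a generic Bernoulli-mixture EM, not the authors' exact model.

import numpy as np

rng = np.random.default_rng(0)
# rows = plants, columns = pollinators; two visually obvious groups
A = np.array([[1, 1, 1, 0, 0, 0],
              [1, 1, 0, 0, 0, 0],
              [1, 1, 1, 1, 0, 0],
              [0, 0, 0, 1, 1, 1],
              [0, 0, 1, 1, 1, 1],
              [0, 0, 0, 0, 1, 1]], dtype=float)

K, eps = 2, 1e-6
pi = np.full(K, 1.0 / K)                              # mixture weights
theta = rng.uniform(0.3, 0.7, size=(K, A.shape[1]))   # per-cluster visit probabilities

for _ in range(200):
    # E-step: responsibility of each cluster for each plant (in log space)
    log_lik = (A @ np.log(theta + eps).T
               + (1 - A) @ np.log(1 - theta + eps).T
               + np.log(pi + eps))
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update mixture weights and visit probabilities
    Nk = resp.sum(axis=0)
    pi = Nk / A.shape[0]
    theta = (resp.T @ A) / Nk[:, None]

print("cluster assignments:", resp.argmax(axis=1))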

Review


Open Access Review: Relative Entropy in Biological Systems
Entropy 2016, 18(2), 46; doi:10.3390/e18020046
Received: 9 December 2015 / Revised: 18 January 2016 / Accepted: 21 January 2016 / Published: 2 February 2016
Abstract
In this paper, we review various information-theoretic characterizations of the approach to equilibrium in biological systems. The replicator equation, evolutionary game theory, Markov processes and chemical reaction networks all describe the dynamics of a population or probability distribution. Under suitable assumptions, the distribution will approach an equilibrium with the passage of time. Relative entropy (that is, the Kullback–Leibler divergence, or various generalizations of it) provides a quantitative measure of how far from equilibrium the system is. We explain various theorems that give conditions under which relative entropy is nonincreasing. In biochemical applications, these results can be seen as versions of the Second Law of Thermodynamics, stating that free energy can never increase with the passage of time. In ecological applications, they make precise the notion that a population gains information from its environment as it approaches equilibrium.
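A quick numerical illustration of this monotonicity, under assumptions of my own choosing (an arbitrary irreducible rate matrix and initial distribution), is the following Python snippet: it evolves the master equation and checks at each step that the relative entropy to the equilibrium distribution never increases.

import numpy as np

Q = np.array([[-2.0,  1.0,  1.0],      # Q[i, j] = rate i -> j, rows sum to zero
              [ 2.0, -3.0,  1.0],
              [ 1.0,  2.0, -3.0]])

# Equilibrium: left null vector of Q (p_eq Q = 0), normalized to sum to 1.
w, v = np.linalg.eig(Q.T)
p_eq = np.real(v[:, np.argmin(np.abs(w))])
p_eq /= p_eq.sum()

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

p0 = np.array([0.9, 0.05, 0.05])
p, dt, last = p0.copy(), 0.01, np.inf
for step in range(500):
    d = kl(p, p_eq)
    assert d <= last + 1e-12, "relative entropy increased!"
    last = d
    p = p + dt * (p @ Q)                # Euler step of the master equation dp/dt = p Q
print("D(p || p_eq) fell from", round(kl(p0, p_eq), 4), "to", round(last, 4))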

Other


Open Access Correction: Wolpert, D.H. The Free Energy Requirements of Biological Organisms; Implications for Evolution. Entropy 2016, 18, 138
Entropy 2016, 18(6), 219; doi:10.3390/e18060219
Received: 26 May 2016 / Accepted: 30 May 2016 / Published: 2 June 2016
Abstract
The following corrections should be made to the published paper [1]: [...]

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18