Special Issue "Information Flow and Entropy Production in Biomolecular Networks"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (2 December 2019).

Special Issue Editors

Prof. Dr. Edda Klipp
Guest Editor
Theoretical Biophysics, Humboldt-Universität zu Berlin, 10099 Berlin, Germany
Tel. +49 30 2093 9040
Interests: Mathematical modelling of dynamic biological phenomena
Prof. Dr. Ovidiu Radulescu
Guest Editor
Dynamique des Interactions Membranaires Normales et Pathologiques, UMR CNRS 5235, Université de Montpellier, 34095 Montpellier, France
Interests: systems biology, mathematical biology, dynamical systems, stochastic processes, condensed matter physics

Special Issue Information

Dear Colleagues,

Information theory was introduced in 1948 by Shannon as a mathematical formalization of communication. Biologists realized that many biological processes can be described in terms of information flow and rapidly adopted Shannon's information theory. For instance, the central dogma of molecular biology can be formulated as information flow from genes to proteins. In biological signaling, information comes from the environment, is encoded by cell surface receptors, and is transmitted via biomolecular networks to transcription factors and effector genes.

In order to process information, cells use biochemical processes that are subject to constraints. Energetic and thermodynamic constraints are well known: in order to write or delete information, energy has to be dissipated. A theoretical limit for the energy per processed bit of information was calculated by Szilard and Landauer, but biological devices function far above this limit, which suggests that other constraints may exist. A different approach is the algorithmic theory of information, introduced in the 1960s by Kolmogorov and Chaitin, which deals with the relation between complexity and information. In contrast to Shannon's theory, the algorithmic theory of information has been less widely used in biology. It deserves more attention, especially in relation to the complexity that emerges in cellular decision making during embryonic development, or in adaptation to environmental changes.
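The Szilard–Landauer bound mentioned above can be made concrete with a short calculation (an illustrative sketch, not part of the call itself; the comparison to ATP hydrolysis is an order-of-magnitude estimate):

```python
import math

# Landauer's bound: erasing one bit of information dissipates at least
# k_B * T * ln(2) of energy.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy (in joules) dissipated per erased bit at temperature T."""
    return k_B * temperature_kelvin * math.log(2)

# At physiological temperature (~310 K) the bound is roughly 3e-21 J per bit.
# A single ATP hydrolysis releases on the order of 5e-20 J, so even one ATP
# per bit already puts a biochemical device well above the Landauer limit.
print(landauer_limit(310))
```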

As an alternative to the concept of information, entropy production and the related energy dissipation are also central to phenomenological thermodynamics. It is of great interest to study them for open systems, as pioneered by von Bertalanffy and Prigogine, and to find optimality principles for biological processes that allow cells to operate with limited loss of energy. Applications range widely, from understanding the organization of living systems to the planning of biotechnological processes.

The main topics of this Special Issue include, but are not limited to, the following:

  • The coding, recognition, and prediction of the environment;
  • Channel information capacity and network bottlenecks;
  • Optimal information flow in biomolecular networks;
  • Redundancy and error correction;
  • Energetic, thermodynamic, and other constraints in information processing;
  • Informational aspects of cell- and tissue-level decision making and adaptation;
  • Optimality principles for entropy production;
  • Emerging complexity in biomolecular networks.

Prof. Dr. Edda Klipp
Prof. Dr. Ovidiu Radulescu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • systems biology
  • biotechnology
  • information and developmental biology
  • information and biomolecular networks
  • information and signaling networks
  • information and metabolic networks
  • channel information and biomolecular networks
  • coding of environment and biomolecular networks
  • information processing and biomolecular networks
  • optimality of entropy production
  • recognition of environment and biomolecular networks
  • prediction of environment and biomolecular networks
  • emerging biological complexity
  • error-correction and biomolecular networks
  • information and morphogenesis
  • information and biological adaptation

Published Papers (2 papers)


Research

Open Access Article
Dissipation in Non-Steady State Regulatory Circuits
Entropy 2019, 21(12), 1212; https://doi.org/10.3390/e21121212 - 10 Dec 2019
Abstract
In order to respond to environmental signals, cells often use small molecular circuits to transmit information about their surroundings. Recently, motivated by specific examples in signaling and gene regulation, a body of work has focused on the properties of circuits that function out of equilibrium and dissipate energy. We briefly review the probabilistic measures of information and dissipation and use simple models to discuss and illustrate trade-offs between information and dissipation in biological circuits. We find that circuits with non-steady state initial conditions can transmit more information at small readout delays than steady state circuits. The dissipative cost of this additional information proves marginal compared to the steady state dissipation. Feedback does not significantly increase the transmitted information for out of steady state circuits but does decrease dissipative costs. Lastly, we discuss the case of bursty gene regulatory circuits that, even in the fast switching limit, function out of equilibrium.
(This article belongs to the Special Issue Information Flow and Entropy Production in Biomolecular Networks)
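The "transmitted information" discussed in this abstract is the mutual information between a signal and its readout. A minimal, self-contained sketch (assumptions mine, not code from the article) for a binary signal X and binary readout Y:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits for a joint distribution given as {(x, y): probability}."""
    # Marginal distributions p(x) and p(y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A noisy readout that copies a uniform binary signal 90% of the time
# transmits about 0.53 of the 1 bit carried by the signal.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(joint))
```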

Open Access Article
Fitness Gain of Individually Sensed Information by Cells
Entropy 2019, 21(10), 1002; https://doi.org/10.3390/e21101002 - 13 Oct 2019
Abstract
Mutual information and its causal variant, directed information, have been widely used to quantitatively characterize the performance of biological sensing and information transduction. However, once coupled with selection in response to decision-making, the sensing signal could have more or less evolutionary value than its mutual or directed information. In this work, we show that an individually sensed signal always has a better fitness value, on average, than its mutual or directed information. The fitness gain, which satisfies fluctuation relations (FRs), is attributed to the selection of organisms in a population that obtain a better sensing signal by chance. A new quantity, similar to the coarse-grained entropy production in information thermodynamics, is introduced to quantify the total fitness gain from individual sensing, which also satisfies FRs. Using this quantity, the optimizing fitness gain of individual sensing is shown to be related to fidelity allocations for individual environmental histories.
(This article belongs to the Special Issue Information Flow and Entropy Production in Biomolecular Networks)
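The link between sensed information and fitness can be illustrated with the classic Kelly bet-hedging picture, in which the gain in long-term log-growth rate from a sensed signal Y about the environment X equals the mutual information I(X;Y). This is a hedged sketch of that textbook result, not the model used in the article:

```python
import math

def kelly_growth(p_env, p_signal_given_env):
    """Log-growth rates of a proportional-bettor population without and with sensing.

    Growth rates are reported up to a payoff constant common to both strategies,
    so only their difference is meaningful.
    """
    # Joint distribution p(x, y) of environment x and sensed signal y.
    joint = {(x, y): p_env[x] * p_signal_given_env[x][y]
             for x in range(len(p_env))
             for y in range(len(p_signal_given_env[0]))}
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # Without sensing, the optimal allocation is q(x) = p(x).
    g_blind = sum(p * math.log(p) for p in p_env if p > 0)
    # With sensing, allocate q(x|y) = p(x|y) after observing each signal y.
    g_sensed = sum(p * math.log(p / p_y[y]) for (x, y), p in joint.items() if p > 0)
    return g_blind, g_sensed

p_env = [0.5, 0.5]
noise = [[0.9, 0.1], [0.1, 0.9]]   # signal matches the environment 90% of the time
g_blind, g_sensed = kelly_growth(p_env, noise)
# The growth-rate gain g_sensed - g_blind equals I(X;Y) in nats (~0.368 here).
print(g_sensed - g_blind)
```

The article's point is that selection acting on individually sensed signals can push the realized fitness gain beyond this population-averaged information value.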
