Information Flow and Entropy Production in Biomolecular Networks

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (2 December 2019) | Viewed by 24872

Special Issue Editors

Prof. Dr. Edda Klipp
Institute of Pathology, Charité – Universitätsmedizin Berlin, 10099 Berlin, Germany
Interests: mathematical modelling of dynamic biological phenomena

Prof. Dr. Ovidiu Radulescu
Laboratory of Pathogens and Host Immunity, UMR CNRS 5294, University of Montpellier, 34095 Montpellier, France
Interests: systems biology; mathematical biology; dynamical systems; stochastic processes; condensed matter physics

Special Issue Information

Dear Colleagues,

Information theory was introduced in 1948 by Shannon as a mathematical formalization of communication. Biologists realized that many biological processes can be described in terms of information flow and rapidly adopted Shannon's information theory. For instance, the central dogma of molecular biology can be formulated as information flow from genes to proteins. In biological signaling, information comes from the environment, is encoded by cell surface receptors, and is transmitted via biomolecular networks to transcription factors and effector genes.

In order to process information, cells use biochemical processes that are subject to constraints. Physical constraints, both energetic and thermodynamic, are well known: in order to write or delete information, energy has to be dissipated. A theoretical limit for the energy per processed bit of information was calculated by Szilard and Landauer, but biological devices function far above this limit, which suggests that other constraints may exist. A different approach to information, algorithmic information theory, was introduced in the 1960s by Kolmogorov and Chaitin. This theory deals with the relation between complexity and information. In contrast to Shannon's theory, algorithmic information theory has been less utilized in biology. It deserves more attention, especially in relation to emerging complexity in cellular decision making during embryonic development, or in adaptation to environmental changes.
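
As a quick sanity check on the numbers involved, the sketch below evaluates the Szilard-Landauer limit k_B T ln 2 at an assumed physiological temperature and compares it with the energy scale of ATP hydrolysis.

```python
# A worked check of the Szilard-Landauer limit mentioned above: erasing one
# bit costs at least k_B * T * ln 2; the temperature is an assumed value.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # roughly physiological temperature, K (assumed)

bound = k_B * T * math.log(2)  # minimum dissipated energy per erased bit, J
print(f"Landauer bound at {T:.0f} K: {bound:.3e} J per bit")
# ~3e-21 J per bit, while ATP hydrolysis releases roughly 5e-20 J per
# molecule, so biological devices indeed operate well above this limit.
```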

As an alternative to the concept of information, entropy production and the related energy dissipation are also central to phenomenological thermodynamics. It is of great interest to study them for open systems, as pioneered by von Bertalanffy and Prigogine, and to find optimality principles for biological processes that allow cells to operate with minimal loss of energy. Applications range from understanding the organization of living systems to the planning of biotechnological processes.

The main topics of this Special Issue include, but are not limited to, the following:

• The coding, recognition, and prediction of the environment;
• Channel information capacity and network bottlenecks;
• Optimal information flow in biomolecular networks;
• Redundancy and error-correction;
• Energetic, thermodynamic, and other constraints in information processing;
• Informational aspects of cell- and tissue-level decision making and adaptation;
• Optimality principles for entropy production;
• Emerging complexity in biomolecular networks.

Prof. Dr. Edda Klipp
Prof. Dr. Ovidiu Radulescu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • systems biology
  • biotechnology
  • information and developmental biology
  • information and biomolecular networks
  • information and signaling networks
  • information and metabolic networks
  • channel information and biomolecular networks
  • coding of environment and biomolecular networks
  • information processing and biomolecular networks
  • optimality of entropy production
  • recognition of environment and biomolecular networks
  • prediction of environment and biomolecular networks
  • emerging biological complexity
  • error-correction and biomolecular networks
  • information and morphogenesis
  • information and biological adaptation

Published Papers (7 papers)


Research


18 pages, 1318 KiB  
Article
Binary Expression Enhances Reliability of Messaging in Gene Networks
by Leonardo R. Gama, Guilherme Giovanini, Gábor Balázsi and Alexandre F. Ramos
Entropy 2020, 22(4), 479; https://doi.org/10.3390/e22040479 - 22 Apr 2020
Cited by 2 | Viewed by 2810
Abstract
The promoter state of a gene and its expression levels are modulated by the amounts of transcription factors interacting with its regulatory regions. Hence, one may interpret a gene network as a communicating system in which the state of the promoter of a gene (the source) is communicated by the amounts of transcription factors that it expresses (the message) to modulate the state of the promoter and expression levels of another gene (the receptor). The reliability of the gene network dynamics can be quantified by Shannon’s entropy of the message and the mutual information between the message and the promoter state. Here we consider a stochastic model for a binary gene and use its exact steady-state solutions to calculate the entropy and mutual information. We show that a slow-switching promoter with long and equally lasting ON and OFF states maximizes the mutual information and reduces entropy. This regime corresponds to binary gene expression, generating a high-variance message governed by a bimodal probability distribution with peaks of the same height. Our results indicate that Shannon’s theory can be a powerful framework for understanding how bursty gene expression can be reconciled with the striking spatio-temporal precision exhibited in pattern formation of developing organisms. Full article
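
To make the abstract's communication-system picture concrete, here is a minimal sketch, assuming a two-state source and an assumed read-out fidelity rather than the authors' exact stochastic model, that computes the entropy of the message and its mutual information with the promoter state.

```python
# A rough illustration, not the authors' exact stochastic model: the
# promoter state (ON/OFF) is the source, a binarized expression level is
# the message; the fidelity parameter below is an assumed value.
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) in bits from a joint table, I = H(X) + H(Y) - H(X,Y)."""
    return (entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
            - entropy(joint.ravel()))

fidelity = 0.95  # P(high | ON) = P(low | OFF), an assumed channel quality
for p_on in (0.5, 0.9):  # promoter ON occupancy
    joint = np.array([
        [p_on * fidelity, p_on * (1 - fidelity)],              # ON row
        [(1 - p_on) * (1 - fidelity), (1 - p_on) * fidelity],  # OFF row
    ])
    print(f"P(ON)={p_on}: H(message)={entropy(joint.sum(axis=0)):.3f} bits, "
          f"I(promoter;message)={mutual_information(joint):.3f} bits")
# Equal ON/OFF occupancy (a bimodal message with peaks of the same height)
# yields the larger mutual information here, consistent with the abstract.
```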

17 pages, 963 KiB  
Article
Can Transfer Entropy Infer Information Flow in Neuronal Circuits for Cognitive Processing?
by Ali Tehrani-Saleh and Christoph Adami
Entropy 2020, 22(4), 385; https://doi.org/10.3390/e22040385 - 28 Mar 2020
Cited by 7 | Viewed by 4068
Abstract
How cognitive neural systems process information is largely unknown, in part because of how difficult it is to accurately follow the flow of information from sensors via neurons to actuators. Measuring the flow of information is different from measuring correlations between firing neurons, for which several measures are available, foremost among them the Shannon information, which is an undirected measure. Several information-theoretic notions of “directed information” have been used to successfully detect the flow of information in some systems, in particular in the neuroscience community. However, recent work has shown that directed information measures such as transfer entropy can sometimes inadequately estimate information flow, or even fail to identify manifest directed influences, especially if neurons contribute in a cryptographic manner to influence the effector neuron. Because it is unclear how often such cryptic influences emerge in cognitive systems, the usefulness of transfer entropy measures to reconstruct information flow is unknown. Here, we test how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization). Besides counting the frequency of problematic logic gates, we also test whether transfer entropy applied to an activity time-series recorded from behaving digital brains can infer information flow, compared to a ground-truth model of direct influence constructed from connectivity and circuit logic. Our results suggest that transfer entropy will sometimes fail to infer directed information when it exists, and sometimes suggest a causal connection when there is none. However, the extent of incorrect inference strongly depends on the cognitive task considered. These results emphasize the importance of understanding the fundamental logic processes that contribute to information flow in cognitive processing, and quantifying their relevance in any given nervous system. Full article
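
For readers unfamiliar with the measure under test, the following is a generic plug-in estimator of transfer entropy with history length one, applied to a toy pair of coupled binary series; it is a sketch under stated assumptions, not the authors' analysis pipeline.

```python
# A generic plug-in estimator sketch of transfer entropy (history length 1);
# the toy coupling below (y copies x with a delay) is an assumed example.
# TE(X->Y) = sum over (y', y, x) of p(y', y, x) * log2[ p(y'|y,x) / p(y'|y) ]
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy X -> Y in bits for two binary time series."""
    n = len(x) - 1
    counts3 = Counter(zip(y[1:], y[:-1], x[:-1]))  # (future, past, source)
    p_yx = Counter(zip(y[:-1], x[:-1]))
    p_yy = Counter(zip(y[1:], y[:-1]))
    p_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in counts3.items():
        cond_full = c / p_yx[(y0, x0)]         # p(y1 | y0, x0)
        cond_marg = p_yy[(y1, y0)] / p_y[y0]   # p(y1 | y0)
        te += (c / n) * np.log2(cond_full / cond_marg)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)  # y copies x with a one-step delay
y[0] = 0
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")  # close to 1 bit
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")  # close to 0 bits
```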

19 pages, 6349 KiB  
Article
Entropy as a Robustness Marker in Genetic Regulatory Networks
by Mustapha Rachdi, Jules Waku, Hana Hazgui and Jacques Demongeot
Entropy 2020, 22(3), 260; https://doi.org/10.3390/e22030260 - 25 Feb 2020
Cited by 8 | Viewed by 2810
Abstract
Genetic regulatory networks have evolved by complexifying their control systems with numerous effectors (inhibitors and activators). That is, for example, the case for the double inhibition by microRNAs and circular RNAs, which introduces a ubiquitous double-brake control reducing, in general, the number of attractors of complex genetic networks (e.g., by destroying positive regulation circuits); here, the complexity indices considered are the number of nodes, their connectivity, the number of strongly connected components, and the size of their interaction graph. The stability and robustness of the networks correspond to their ability to recover the same asymptotic trajectories, and hence the same number and nature of their attractors, after dynamical and structural disturbances, respectively. The complexity of the dynamics is quantified here using the notion of attractor entropy: it describes the way the invariant measure of the dynamics is spread over the state space. The stability (robustness) is characterized by the rate at which the system returns to its equilibrium trajectories (invariant measure) after a dynamical (structural) perturbation. The mathematical relationships between the indices of complexity, stability, and robustness are presented in the case of Markov chains related to threshold Boolean random regulatory networks updated with a Hopfield-like rule. The entropy of the invariant measure of a network, as well as the Kolmogorov-Sinai entropy of the Markov transition matrix ruling its random dynamics, can be considered as complexity, stability, and robustness indices, and it is possible to exploit the links between these notions to characterize the resilience of a biological system with respect to endogenous or exogenous perturbations. The example of the genetic network controlling the kinin-kallikrein system involved in a pathology called angioedema shows the practical interest of the present approach in two cases: the normal physiological and the abnormal pathological dynamical behaviors. Full article
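
Both indices are straightforward to compute for any finite Markov chain; the sketch below does so for an assumed toy transition matrix rather than the paper's kinin-kallikrein network.

```python
# A toy Markov chain (assumed, not the paper's kinin-kallikrein network)
# illustrating the two indices used in the article: the entropy of the
# invariant measure and the Kolmogorov-Sinai entropy of the transition matrix.
import numpy as np

P = np.array([[0.9, 0.1, 0.0],   # row-stochastic transition matrix (assumed)
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7]])

# invariant measure: left eigenvector of P associated with eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

H_pi = -np.sum(pi * np.log2(pi))  # entropy of the invariant measure, bits
row_H = -np.sum(np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0)), 0.0), axis=1)
h_KS = np.sum(pi * row_H)         # Kolmogorov-Sinai entropy rate, bits/step
print(f"H(pi) = {H_pi:.3f} bits, h_KS = {h_KS:.3f} bits/step")
```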

15 pages, 2398 KiB  
Article
Chemical Reaction Networks Possess Intrinsic, Temperature-Dependent Functionality
by Stephan O. Adler and Edda Klipp
Entropy 2020, 22(1), 117; https://doi.org/10.3390/e22010117 - 18 Jan 2020
Cited by 5 | Viewed by 3552
Abstract
Temperature influences the life of many organisms in various ways. A great number of organisms live under conditions where their ability to adapt to changes in temperature can be vital and largely determines their fitness. Understanding the mechanisms and principles underlying this ability to adapt can be of great advantage, for example, to improve growth conditions for crops and increase their yield. In times of imminent, increasing climate change, this becomes even more important in order to find strategies that help crops cope with these fundamental changes. There is intense research in the field of acclimation that comprises fluctuations of various environmental conditions, but most acclimation research focuses on regulatory effects and the observation of gene expression changes within the examined organism. As thermodynamic effects are a direct consequence of temperature changes, these should necessarily be considered in this field of research but are often neglected. Additionally, compensated effects might be missed even though they are equally important for the organism, since they do not cause observable changes, but rather counteract them. In this work, using a systems biology approach, we demonstrate that even simple network motifs can exhibit temperature-dependent functional features resulting from the interplay of network structure and the distribution of activation energies over the involved reactions. The demonstrated functional features are (i) the reversal of fluxes within a linear pathway, (ii) a thermo-selective branched pathway with different flux modes and (iii) the increased flux towards carbohydrates in a minimal Calvin cycle that was designed to demonstrate temperature compensation within reaction networks. Comparing a system’s response to either temperature changes or changes in enzyme activity, we also dissect the influence of thermodynamic changes versus genetic regulation. By this, we expand the scope of thermodynamic modelling of biochemical processes by addressing further possibilities and effects, following established mathematical descriptions of biophysical properties. Full article
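
The thermo-selective branching feature follows directly from the Arrhenius law; a minimal sketch with assumed activation energies and prefactors shows how the dominant branch of a two-way pathway can switch with temperature.

```python
# A minimal sketch of the thermo-selective branching idea: two reactions
# competing for one substrate, with assumed activation energies and
# prefactors, swap dominance with temperature via the Arrhenius law.
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R * T))."""
    return A * np.exp(-Ea / (R * T))

A1, Ea1 = 1.0e6, 40e3  # branch 1: lower barrier, smaller prefactor (assumed)
A2, Ea2 = 3.0e9, 60e3  # branch 2: higher barrier, larger prefactor (assumed)

for T in (280.0, 300.0, 320.0):
    k1, k2 = arrhenius(A1, Ea1, T), arrhenius(A2, Ea2, T)
    print(f"T = {T:.0f} K: flux share of branch 1 = {k1 / (k1 + k2):.2f}")
# The share drops from ~0.64 at 280 K to ~0.38 at 320 K: the branch with the
# higher activation energy gains flux as T rises, so the network routes
# material differently at different temperatures.
```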

27 pages, 698 KiB  
Article
Dissipation in Non-Steady State Regulatory Circuits
by Paulina Szymańska-Rożek, Dario Villamaina, Jacek Miȩkisz and Aleksandra M. Walczak
Entropy 2019, 21(12), 1212; https://doi.org/10.3390/e21121212 - 10 Dec 2019
Cited by 2 | Viewed by 2473
Abstract
In order to respond to environmental signals, cells often use small molecular circuits to transmit information about their surroundings. Recently, motivated by specific examples in signaling and gene regulation, a body of work has focused on the properties of circuits that function out of equilibrium and dissipate energy. We briefly review the probabilistic measures of information and dissipation and use simple models to discuss and illustrate trade-offs between information and dissipation in biological circuits. We find that circuits with non-steady-state initial conditions can transmit more information at small readout delays than steady-state circuits. The dissipative cost of this additional information proves marginal compared to the steady-state dissipation. Feedback does not significantly increase the transmitted information for out-of-steady-state circuits but does decrease dissipative costs. Lastly, we discuss the case of bursty gene regulatory circuits that, even in the fast switching limit, function out of equilibrium. Full article
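
The kind of bookkeeping involved can be illustrated on the simplest possible circuit: a minimal sketch, with assumed switching rates, that integrates a two-state master equation and tracks the instantaneous entropy production as the circuit relaxes from a non-steady-state initial condition.

```python
# A minimal sketch (assumed rates, not the authors' circuits) of entropy
# production in a two-state master equation relaxing toward steady state.
import numpy as np

k_on, k_off = 2.0, 1.0  # switching rates, 1/time (assumed)
print(f"steady-state occupancy p1* = {k_on / (k_on + k_off):.3f}")

def entropy_production(p1):
    """Instantaneous total entropy production rate (units of k_B)."""
    flux_fwd = k_on * (1 - p1)  # probability flux 0 -> 1
    flux_bwd = k_off * p1       # probability flux 1 -> 0
    return (flux_fwd - flux_bwd) * np.log(flux_fwd / flux_bwd)

p1, dt = 0.05, 0.01  # start far from steady state; Euler time step
for step in range(301):
    if step % 100 == 0:
        print(f"t={step * dt:4.1f}: p1={p1:.3f}, sigma={entropy_production(p1):.4f}")
    p1 += (k_on * (1 - p1) - k_off * p1) * dt
# sigma decays to zero as the circuit relaxes: the transient carries the
# extra dissipation associated with non-steady-state operation.
```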

19 pages, 2137 KiB  
Article
Fitness Gain of Individually Sensed Information by Cells
by Tetsuya J. Kobayashi and Yuki Sughiyama
Entropy 2019, 21(10), 1002; https://doi.org/10.3390/e21101002 - 13 Oct 2019
Cited by 4 | Viewed by 2482
Abstract
Mutual information and its causal variant, directed information, have been widely used to quantitatively characterize the performance of biological sensing and information transduction. However, once coupled with selection in response to decision-making, the sensing signal could have more or less evolutionary value than its mutual or directed information. In this work, we show that an individually sensed signal always has a better fitness value, on average, than its mutual or directed information. The fitness gain, which satisfies fluctuation relations (FRs), is attributed to the selection of organisms in a population that obtain a better sensing signal by chance. A new quantity, similar to the coarse-grained entropy production in information thermodynamics, is introduced to quantify the total fitness gain from individual sensing, which also satisfies FRs. Using this quantity, the optimizing fitness gain of individual sensing is shown to be related to fidelity allocations for individual environmental histories. Our results are supplemented by numerical verifications of FRs, and a discussion on how this problem is linked to information encoding and decoding. Full article
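
A classic bet-hedging toy model, with assumed growth values rather than the authors' framework, makes the notion of a fitness value of sensed information concrete: it compares the optimal log-growth of a blind strategy with that of a perfectly sensing one and sets the gain against the signal's mutual information.

```python
# A classic bet-hedging toy (assumed values, not the authors' framework):
# compare the optimal log-growth of a blind strategy with perfect sensing,
# and set the resulting fitness gain against the signal's mutual information.
import numpy as np

p_env = np.array([0.7, 0.3])    # environment distribution (assumed)
growth = np.array([[2.0, 0.5],  # growth[i, j]: phenotype i in environment j
                   [0.5, 2.0]])

def log_growth(pi1):
    """Long-term log-growth rate of the fixed allocation (pi1, 1 - pi1)."""
    pop = growth.T @ np.array([pi1, 1.0 - pi1])  # growth factor in each env
    return float(np.sum(p_env * np.log(pop)))

grid = np.linspace(1e-3, 1 - 1e-3, 999)
lam_blind = max(log_growth(p) for p in grid)         # best blind bet-hedging
lam_sense = np.sum(p_env * np.log(np.diag(growth)))  # phenotype matches env
mi = -np.sum(p_env * np.log(p_env))                  # I(signal; env), exact sensor

print(f"fitness gain from sensing = {lam_sense - lam_blind:.3f} nats")
print(f"mutual information of the signal = {mi:.3f} nats")
```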

Review


19 pages, 509 KiB  
Review
Thermodynamic Limits and Optimality of Microbial Growth
by Nima P. Saadat, Tim Nies, Yvan Rousset and Oliver Ebenhöh
Entropy 2020, 22(3), 277; https://doi.org/10.3390/e22030277 - 28 Feb 2020
Cited by 18 | Viewed by 6012
Abstract
Understanding microbial growth with the use of mathematical models has a long history that dates back to the pioneering work of Jacques Monod in the 1940s. Monod’s famous growth law expressed microbial growth rate as a simple function of the limiting nutrient concentration. However, explaining growth laws from underlying principles is extremely challenging. In the second half of the 20th century, numerous experimental approaches aimed at precisely measuring heat production during microbial growth to determine the entropy balance in a growing cell and to quantify the exported entropy. This has led to the development of thermodynamic theories of microbial growth, which have generated fundamental understanding and identified the principal limitations of the growth process. Whereas these classical approaches ignored metabolic details and considered microbial metabolism as a black box, modern theories rely heavily on genomic resources to describe and model metabolism in great detail in order to explain microbial growth. Interestingly, however, thermodynamic constraints are often included in modern modeling approaches only in a rather superficial fashion, and it appears that recent modeling approaches and classical theories are rather disconnected fields. To stimulate a closer interaction between these fields, we here review various theoretical approaches that aim at describing microbial growth based on thermodynamics, and outline the resulting thermodynamic limits and optimality principles. We start with classical black box models of cellular growth and continue with recent metabolic modeling approaches that include thermodynamics, before we place these models in the context of fundamental considerations based on non-equilibrium statistical mechanics. We conclude by identifying conceptual overlaps between the fields and suggest how the various types of theories and models can be integrated. We outline how concepts from one approach may help to inform or constrain another, and we demonstrate how genome-scale models can be used to infer key black box parameters, such as the energy of formation or the degree of reduction of biomass. Such integration will allow us to understand to what extent microbes can be viewed as thermodynamic machines and how closely they operate to theoretical optima. Full article
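
Monod's growth law itself is a one-line formula; the sketch below evaluates mu(S) = mu_max * S / (K_s + S) with assumed illustrative parameters.

```python
# A one-line worked example of Monod's growth law cited in the review:
# mu(S) = mu_max * S / (K_s + S), with assumed illustrative parameters.
mu_max, K_s = 1.0, 0.5     # maximal growth rate (1/h) and half-saturation (g/L)
for S in (0.1, 0.5, 5.0):  # limiting nutrient concentration, g/L
    mu = mu_max * S / (K_s + S)
    print(f"S = {S:>4} g/L  ->  mu = {mu:.2f} 1/h")
# at S = K_s the growth rate is half-maximal; at S >> K_s it saturates.
```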
