
Entropy and Information in Biological Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (20 May 2024) | Viewed by 13250

Special Issue Editor


Prof. Dr. Richard Summers
Guest Editor
Department of Physiology & Biophysics, University of Mississippi Medical Center, 2500 North State Street, Jackson, MS 39216, USA
Interests: systems physiology and theoretical biology

Special Issue Information

Dear Colleagues,

In 1943, Erwin Schrödinger proposed that an understanding of the true nature of living systems first requires an appreciation of their ability to control entropy dynamics within their environment. Claude Shannon's subsequent development of information theory for communications was likewise linked to the concept of entropy. Living organisms utilize and exchange information as a form of biological currency as they adapt to their environmental conditions. The mechanics of information flow in open living systems have not been deeply explored in the literature and deserve attention. Describing biological systems from the information/entropy perspective could provide considerable insight into the functioning and fundamental nature of their entropy dynamics and lay a foundation for a comprehensive theoretical biology.

Prof. Dr. Richard Summers
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • information theory
  • entropy dynamics
  • biological systems
  • theoretical biology

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad-scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research


17 pages, 3290 KiB  
Article
An Information-Geometric Formulation of Pattern Separation and Evaluation of Existing Indices
by Harvey Wang, Selena Singh, Thomas Trappenberg and Abraham Nunes
Entropy 2024, 26(9), 737; https://doi.org/10.3390/e26090737 - 29 Aug 2024
Cited by 1 | Viewed by 950
Abstract
Pattern separation is a computational process by which dissimilar neural patterns are generated from similar input patterns. We present an information-geometric formulation of pattern separation, where a pattern separator is modeled as a family of statistical distributions on a manifold. Such a manifold maps an input (i.e., coordinates) to a probability distribution that generates firing patterns. Pattern separation occurs when small coordinate changes result in large distances between samples from the corresponding distributions. Under this formulation, we implement a two-neuron system whose probability law forms a three-dimensional manifold with mutually orthogonal coordinates representing the neurons’ marginal and correlational firing rates. We use this highly controlled system to examine the behavior of spike train similarity indices commonly used in pattern separation research. We find that all indices (except scaling factor) are sensitive to relative differences in marginal firing rates, but no index adequately captures differences in spike trains that result from altering the correlation in activity between the two neurons. That is, existing pattern separation metrics appear (A) sensitive to patterns that are encoded by different neurons but (B) insensitive to patterns that differ only in relative spike timing (e.g., synchrony between neurons in the ensemble).
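
The abstract's central idea, that pattern separation corresponds to small input changes producing large divergences between output spike distributions, can be illustrated with a toy model. The sketch below is our own illustration, not the authors' implementation: it treats two independent Poisson neurons (ignoring the correlational coordinate the paper's manifold includes) and measures the KL divergence between the firing-pattern distributions evoked by nearby inputs. All names and parameter values are assumptions.

```python
import numpy as np

def kl_poisson(lam_p, lam_q):
    """KL divergence between products of independent Poisson distributions:
    D(P || Q) = sum_i [lam_p_i * log(lam_p_i / lam_q_i) - lam_p_i + lam_q_i]."""
    lam_p, lam_q = np.asarray(lam_p, float), np.asarray(lam_q, float)
    return float(np.sum(lam_p * np.log(lam_p / lam_q) - lam_p + lam_q))

def separation(rates, delta):
    """Divergence between the output distributions for two nearby input
    coordinates; large values for small perturbations `delta` indicate
    strong pattern separation."""
    return kl_poisson(rates, rates + delta)

rates = np.array([5.0, 8.0])   # input coordinates: mean firing rates
print(separation(rates, np.array([0.5, -0.5])))
```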

16 pages, 673 KiB  
Article
Data-Driven Identification of Stroke through Machine Learning Applied to Complexity Metrics in Multimodal Electromyography and Kinematics
by Francesco Romano, Damiano Formenti, Daniela Cardone, Emanuele Francesco Russo, Paolo Castiglioni, Giampiero Merati, Arcangelo Merla and David Perpetuini
Entropy 2024, 26(7), 578; https://doi.org/10.3390/e26070578 - 7 Jul 2024
Cited by 1 | Viewed by 1503
Abstract
A stroke represents a significant medical condition characterized by the sudden interruption of blood flow to the brain, leading to cellular damage or death. The impact of stroke on individuals can vary from mild impairments to severe disability. Treatment for stroke often focuses on gait rehabilitation. Notably, assessing muscle activation and kinematics patterns using electromyography (EMG) and stereophotogrammetry, respectively, during walking can provide information regarding pathological gait conditions. The concurrent measurement of EMG and kinematics can help in understanding dysfunction in the contribution of specific muscles to different phases of gait. To this aim, complexity metrics (e.g., sample entropy; approximate entropy; spectral entropy) applied to EMG and kinematics have been demonstrated to be effective in identifying abnormal conditions. Moreover, the conditional entropy between EMG and kinematics can identify the relationship between gait data and muscle activation patterns. This study aims to utilize several machine learning classifiers to distinguish individuals with stroke from healthy controls based on kinematics and EMG complexity measures. The cubic support vector machine applied to EMG metrics delivered the best classification results, reaching 99.85% accuracy. This method could assist clinicians in monitoring the recovery of motor impairments for stroke patients.
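
For readers unfamiliar with the complexity metrics named in the abstract, the following is a minimal sketch of sample entropy, a standard measure of signal irregularity, as it is commonly computed. This is an illustration with common default parameters (m = 2, tolerance r = 0.2 × SD), not the authors' processing pipeline.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal: the negative log of the conditional
    probability that template sequences matching for m points also match
    for m + 1 points, within tolerance r."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * np.std(x)            # common default tolerance

    def count_matches(length):
        # Overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(500)))                 # irregular: high
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))  # regular: low
```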

58 pages, 131141 KiB  
Article
Neural Activity in Quarks Language: Lattice Field Theory for a Network of Real Neurons
by Giampiero Bardella, Simone Franchini, Liming Pan, Riccardo Balzan, Surabhi Ramawat, Emiliano Brunamonti, Pierpaolo Pani and Stefano Ferraina
Entropy 2024, 26(6), 495; https://doi.org/10.3390/e26060495 - 6 Jun 2024
Cited by 5 | Viewed by 5222
Abstract
Brain–computer interfaces have seen extraordinary surges in developments in recent years, and a significant discrepancy now exists between the abundance of available data and the limited headway made in achieving a unified theoretical framework. This discrepancy becomes particularly pronounced when examining the collective neural activity at the micro and meso scale, where a coherent formalization that adequately describes neural interactions is still lacking. Here, we introduce a mathematical framework to analyze systems of natural neurons and interpret the related empirical observations in terms of lattice field theory, an established paradigm from theoretical particle physics and statistical mechanics. Our methods are tailored to interpret data from chronic neural interfaces, especially spike rasters from measurements of single neuron activity, and generalize the maximum entropy model for neural networks so that the time evolution of the system is also taken into account. This is obtained by bridging particle physics and neuroscience, paving the way for particle physics-inspired models of the neocortex.
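
As background for the maximum entropy model the paper generalizes, here is a minimal sketch of the static pairwise (Ising-type) maximum entropy model that is often fitted to binarized spike rasters; the lattice-field-theory framework described in the abstract extends this picture to include time evolution, which the toy below omits. All parameter values are arbitrary assumptions.

```python
import numpy as np
from itertools import product

def ising_energy(s, h, J):
    """Energy of a binary firing pattern s in {-1, +1}^N under the pairwise
    maximum entropy (Ising) model:
    E(s) = -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j."""
    return -h @ s - 0.5 * s @ J @ s    # J symmetric with zero diagonal

def pattern_distribution(h, J):
    """Boltzmann distribution over all 2^N patterns (tractable for small N)."""
    states = np.array(list(product([-1, 1], repeat=len(h))))
    E = np.array([ising_energy(s, h, J) for s in states])
    p = np.exp(-E)
    return states, p / p.sum()

# Toy example: 3 neurons with weak pairwise couplings.
h = np.array([0.1, -0.2, 0.05])
J = np.array([[0.0, 0.3, 0.0],
              [0.3, 0.0, -0.1],
              [0.0, -0.1, 0.0]])
states, probs = pattern_distribution(h, J)
print(states[np.argmax(probs)], probs.max())   # most probable firing pattern
```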

19 pages, 3690 KiB  
Article
Embedded Complexity of Evolutionary Sequences
by Jonathan D. Phillips
Entropy 2024, 26(6), 458; https://doi.org/10.3390/e26060458 - 28 May 2024
Viewed by 918
Abstract
Multiple pathways and outcomes are common in evolutionary sequences for biological and other environmental systems due to nonlinear complexity, historical contingency, and disturbances. From any starting point, multiple evolutionary pathways are possible. From an endpoint or observed state, multiple possibilities exist for the sequence of events that created it. However, for any observed historical sequence—e.g., ecological or soil chronosequences, stratigraphic records, or lineages—only one historical sequence actually occurred. Here, a measure of the embedded complexity of historical sequences based on algebraic graph theory is introduced. Sequences are represented as system states S(t), such that S(t − 1) ≠ S(t) ≠ S(t + 1). Each sequence of N states contains nested subgraph sequences of length 2, 3, …, N − 1. The embedded complexity index (which can also be interpreted in terms of embedded information) compares the complexity (based on the spectral radius λ1) of the entire sequence to the cumulative complexity of the constituent subsequences. The spectral radius is closely linked to graph entropy, so the index also reflects information in the sequence. The analysis is also applied to ecological state-and-transition models (STM), which represent observed transitions, along with information on their causes or triggers. As historical sequences are lengthened (by the passage of time and additional transitions or by improved resolutions or new observations of historical changes), the overall complexity asymptotically approaches λ1 = 2, while the embedded complexity increases as N^2.6. Four case studies are presented, representing coastal benthic community shifts determined from biostratigraphy, ecological succession on glacial forelands, vegetation community changes in longleaf pine woodlands, and habitat changes in a delta.
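
The graph-spectral quantities in the abstract are easy to reproduce for a linear sequence of states. The sketch below is our own illustration under stated assumptions: a historical sequence is modeled as a path graph, λ1 is its adjacency spectral radius (which approaches 2 as N grows, as the abstract notes), and, since the abstract does not give the exact index formula, the comparison of whole-sequence complexity with the cumulative complexity of nested subsequences is rendered as a simple ratio.

```python
import numpy as np

def spectral_radius(n):
    """Largest adjacency eigenvalue of a path graph on n nodes, i.e., a
    linear historical sequence S(1) - S(2) - ... - S(n)."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return float(np.max(np.linalg.eigvalsh(A)))

def embedded_complexity(N):
    """Cumulative complexity of all nested contiguous subsequences of
    length 2 .. N-1, relative to the complexity of the whole sequence.
    NOTE: the abstract does not give the exact index formula; this ratio
    is an assumed illustration of the comparison it describes."""
    nested = sum((N - L + 1) * spectral_radius(L) for L in range(2, N))
    return nested / spectral_radius(N)

for N in (5, 10, 20, 40):
    print(N, round(spectral_radius(N), 4), round(embedded_complexity(N), 1))
```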

11 pages, 2914 KiB  
Article
Entropic Dynamics of Mutations in SARS-CoV-2 Genomic Sequences
by Marco Favretti
Entropy 2024, 26(2), 163; https://doi.org/10.3390/e26020163 - 14 Feb 2024
Viewed by 1534
Abstract
In this paper, we investigate a certain class of mutations in genomic sequences by studying the evolution of the entropy and relative entropy associated with the base frequencies of a given genomic sequence. Even if the method is, in principle, applicable to every sequence which varies randomly, the case of the SARS-CoV-2 RNA genome is particularly interesting to analyze due to the richness of the available sequence database, which contains more than a million sequences. Our model is able to track known features of the mutation dynamics like the Cytosine–Thymine bias, but also to reveal new features of the virus mutation dynamics. We show that these new findings can be studied using an approach that combines the mean field approximation of a Markov dynamics within a stochastic thermodynamics framework.
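
The basic quantities tracked in the paper, the Shannon entropy of the base frequencies and the relative entropy against a reference composition, are straightforward to compute. The following minimal sketch is our own illustration on two invented toy sequences (the second carrying artificial C-to-T substitutions, mimicking the bias discussed in the paper); it is not the authors' code or data.

```python
import numpy as np
from collections import Counter

def base_frequencies(seq):
    """Empirical frequencies of the bases A, C, G, T in a sequence."""
    counts = Counter(seq.upper())
    total = sum(counts[b] for b in "ACGT")
    return np.array([counts[b] / total for b in "ACGT"])

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i, in nats."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) of composition p from a
    reference composition q (e.g., an earlier sequence)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

reference = base_frequencies("ATGCGTACGTTAGCCGATCGATCGGTACCTAG")
mutated   = base_frequencies("ATGCGTATGTTAGTCGATTGATTGGTACTTAG")  # C -> T bias
print(shannon_entropy(mutated), relative_entropy(mutated, reference))
```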

Other


29 pages, 710 KiB  
Perspective
Information Thermodynamics: From Physics to Neuroscience
by Jan Karbowski
Entropy 2024, 26(9), 779; https://doi.org/10.3390/e26090779 - 11 Sep 2024
Cited by 1 | Viewed by 2240
Abstract
This paper provides a perspective on applying the concepts of information thermodynamics, developed recently in non-equilibrium statistical physics, to problems in theoretical neuroscience. Historically, information and energy in neuroscience have been treated separately, in contrast to physics approaches, where the relationship of entropy production with heat is a central idea. It is argued here that, in neural systems too, information and energy can be considered within the same theoretical framework. Starting from basic ideas of thermodynamics and information theory on a classic Brownian particle, it is shown how noisy neural networks can infer its probabilistic motion. The decoding of the particle motion by neurons is performed with some accuracy and at some energy cost, and both can be determined using information thermodynamics. In a similar fashion, we also discuss how neural networks in the brain can learn the particle velocity and maintain that information in the weights of plastic synapses from a physical point of view. Generally, it is shown how the framework of stochastic and information thermodynamics can be used practically to study neural inference, learning, and information storage.
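
To make the Brownian-particle setup concrete, here is a minimal, self-contained sketch, our own illustration rather than the paper's model: an overdamped particle diffuses, a noisy readout (standing in for the neural decoder) observes it, the information per observation is roughly estimated with the Gaussian-channel formula, and the Landauer bound gives the minimum energetic cost of erasing that information. All parameter values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Overdamped Brownian motion: x(t + dt) = x(t) + sqrt(2 D dt) * xi.
D, dt, steps = 1.0, 0.01, 10_000
x = np.zeros(steps)
for t in range(1, steps):
    x[t] = x[t - 1] + np.sqrt(2 * D * dt) * rng.standard_normal()

# Noisy readout standing in for the neural decoder.
sigma_noise = 0.5
y = x + sigma_noise * rng.standard_normal(steps)

# Rough information estimate from the Gaussian-channel formula
# I = 0.5 * ln(1 + SNR); the trajectory is not stationary, so this is
# only an order-of-magnitude illustration.
snr = np.var(x) / np.var(y - x)
info = 0.5 * np.log(1 + snr)
print(f"approx. information per observation: {info:.2f} nats")

# Landauer bound: erasing I nats costs at least I * kT of energy.
print(f"minimum erasure cost: {info:.2f} kT")
```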
