Special Issue "Information Theory in Computational Biology"
Deadline for manuscript submissions: 30 May 2022
Interests: information networks; complex networks; information science; machine learning; computational biology
Starting with Claude Shannon’s foundational work in 1948, the field of Information Theory, key to statistical learning and inference, has shaped a wide range of scientific disciplines. Concepts including self-information, entropy, and mutual information have guided the progress of research ranging from physics to the biological sciences. In recent decades, Information Theory has contributed to significant advances in Computational Biology and Bioinformatics across a broad range of topics.
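For readers newer to these quantities, the core definitions mentioned above can be sketched in a few lines. This is a minimal illustration, not part of the call itself; the function names and the toy joint distribution are chosen for this example only.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero-safe)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    pxy = np.asarray(pxy, dtype=float)
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy.ravel())

# Toy joint distribution: two perfectly correlated fair binary variables.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(mutual_information(pxy))  # 1.0 bit: observing X fully determines Y
```

The identity I(X;Y) = H(X) + H(Y) − H(X,Y) used here is the standard decomposition; for biological data, plug-in estimates like this one are biased at small sample sizes, which is one motivation for the estimator work solicited in this issue.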
We are pleased to invite submissions to this Special Issue of Entropy, with the theme “Information Theory in Computational Biology”. Submissions can include, but are not limited to, the following research areas: sequencing, sequence comparison, and error correction; gene expression and transcriptomics; biological networks; omics analyses; genome-wide disease-gene association mapping; and protein sequence, structure, and interaction analysis.
Topics that are particularly welcome include analyses, and the development of application tools, involving: single-cell data; multi-omics integration; biological networks; human health; high-dimensional statistical theory for biological applications; unifying definitions and interpretations of statistical interactions; adaptation of existing information-theoretic test statistics and estimators for cases involving missing, erroneous, or heterogeneous data; analyses when the distributions of the test statistics under the null and alternative hypotheses are unknown; biologically inspired information storage; and efficient analysis of very large datasets.
Submitted manuscripts should present original work, and may describe novel algorithms, methods, metrics, applications, tools, platforms, and other resources that apply Information Theory principles to advance the field of Computational Biology. We encourage the rigorous comparison of original work with existing methods. We also welcome survey papers, as well as essays reflecting on theory, controversies, and/or the state of current research involving the application of Information Theory concepts to Computational Biology, and providing informed recommendations to advance this research. (Please limit commentary/perspective articles to 3,000 words.)
Thanks for considering this publication opportunity. We look forward to your submission!
Requests for extensions to the stated manuscript submission deadline will be considered.
Dr. Alon Bartal
Dr. Kathleen M. Jagodnik
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, use the submission form on the website. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- biological networks
- computational biology
- gene expression and transcriptomics
- heterogeneous data
- human health
- information theory
- machine learning
- systems biology
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Author: Hande Kucuk-McGinty
Tentative Title: Revealing the dynamics of neural information processing with multivariate information decomposition
Authors: Newman, E.L.; Varley, T.F.; Sherill, S.P.; Timme, N.M.; Beggs, J.M.
Abstract: The varied cognitive abilities and rich adaptive behaviours enabled by the animal nervous system are often described in terms of “information processing.” This framing raises the issue of how biological neural circuits actually implement “computations,” and some of the most fundamental outstanding questions in neuroscience centre on defining and determining the mechanisms of neural information processing. Classical information theory has long been understood to be a natural framework by which neural computation can be understood, and recent advances in the field of multivariate information theory specifically offer exciting new insights into the structure of computation in complex nervous systems. In this paper, we specifically focus on the partial information decomposition (PID), which reveals multiple redundant, unique, and synergistic “modes” by which neurons integrate information from multiple upstream sources. Of these different dynamics, synergistic integration, where the output of a computation is irreducible to the individual components, represents a fundamental operation, with implications for many cognitive processes such as pattern recognition, learning, and memory updating. Recent work in a variety of model systems has revealed that synergistic dynamics are ubiquitous in neural circuitry and show reliable structure-function relationships: emerging disproportionately in neuronal rich clubs, downstream of recurrent connectivity, and during correlated processes. In this review, we synthesize the existing literature on higher-order information dynamics in neuronal networks, and describe what insights have been gained by taking an information-decomposition perspective on neural activity. Furthermore, we provide an introduction to the PID framework intended to be accessible to both novice and experienced scientists.
Finally, we briefly discuss future promising directions for information decomposition approaches to neuroscience, such as work in behaving animals, multi-target generalizations of the PID, and time-resolved local analyses.
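The synergistic mode this abstract highlights can be illustrated with the textbook XOR example: neither input alone carries any information about the output, yet the pair fully determines it. The sketch below is not the authors' implementation; it assumes the Williams–Beer I_min redundancy measure (expected minimum specific information), one of several proposed PID redundancy functions, and all names are illustrative.

```python
import numpy as np
from itertools import product

# Joint distribution p(x1, x2, y) for XOR with independent uniform inputs:
# neither input alone predicts y, so all of I(X1,X2;Y) should be synergistic.
pxxy = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(joint, axes):
    """Marginalize a dict-based joint distribution onto the given axes."""
    out = {}
    for states, pr in joint.items():
        key = tuple(states[i] for i in axes)
        out[key] = out.get(key, 0.0) + pr
    return out

def specific_info(joint, src, y):
    """Specific information I(Y=y; X_src) in bits: sum_s p(s|y) log2(p(y|s)/p(y))."""
    py = marginal(joint, (2,))[(y,)]
    p_sy = marginal(joint, (src, 2))
    total = 0.0
    for (s,), ps in marginal(joint, (src,)).items():
        pj = p_sy.get((s, y), 0.0)
        if pj > 0:
            total += (pj / py) * np.log2((pj / ps) / py)
    return total

def mi_pair(joint):
    """Total mutual information I(X1, X2; Y) in bits."""
    p_y = marginal(joint, (2,))
    p_x = marginal(joint, (0, 1))
    return sum(pr * np.log2(pr / (p_x[s[:2]] * p_y[(s[2],)]))
               for s, pr in joint.items() if pr > 0)

ys = sorted({s[2] for s in pxxy})
p_y = marginal(pxxy, (2,))
# I_min redundancy: expected minimum specific information over sources.
redundancy = sum(p_y[(y,)] * min(specific_info(pxxy, src, y) for src in (0, 1))
                 for y in ys)
mi_single = [sum(p_y[(y,)] * specific_info(pxxy, src, y) for y in ys)
             for src in (0, 1)]                 # I(X1;Y), I(X2;Y)
unique = [m - redundancy for m in mi_single]    # unique_i = I(Xi;Y) - redundancy
synergy = mi_pair(pxxy) - redundancy - sum(unique)
print(redundancy, unique, synergy)  # XOR: 0.0 [0.0, 0.0] 1.0
```

For XOR, redundancy and both unique terms vanish while synergy equals the full 1 bit, which is the sense in which the output is "irreducible to the individual components." Other redundancy measures in the PID literature would assign the same decomposition to this particular system.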
Highlights:
- We describe the use of multivariate information theory to decompose neural dynamics into unique, redundant, and synergistic components for study.
- We survey and synthesize prior successes in using this approach to study synergistic processing.