Special Issue "Information Theory in Computational Neuroscience"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 30 September 2020.

Special Issue Editors

Prof. Dr. Claudius Gros
Guest Editor
Institute for Theoretical Physics, Goethe University Frankfurt/Main, Germany
Interests: computational neuroscience; self-organized robots; information theory; complex and cognitive systems
Dr. Dimitrije Marković
Guest Editor
School of Science, Technische Universität Dresden, 01062 Dresden, Germany
Interests: computational and cognitive neuroscience; machine learning; complex systems

Special Issue Information

Dear Colleagues,

Brains empower living organisms with extraordinary information-processing capabilities. They can learn the complex spatio-temporal dependencies present in their environments, form long- and short-term memories of past experiences, and make plans regarding the future. Importantly, information processing in the brain happens over multiple spatio-temporal scales, from single synapses and synaptic terminals to dendritic trees and neuronal bodies, all the way up to neuronal networks and large brain areas.

Hence, it is not surprising that information theory has led to many exciting developments in computational neuroscience, providing tools essential for our modern understanding of the computational principles that govern the development, structure, physiology, and dynamics of the nervous system.

In this Special Issue, we aim to bring together neuronal models and neuronal plasticity mechanisms that are grounded in information-theoretic principles, modern inference, and learning algorithms. We welcome submissions that use information theory as the basis for defining generative principles of neuronal dynamics over multiple spatio-temporal scales, thereby informing our understanding of information processing in the brain.

Prof. Dr. Claudius Gros
Dr. Dimitrije Marković
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory;
  • computational neuroscience;
  • brain complexity;
  • generating functional;
  • spatio-temporal dependencies

Published Papers (3 papers)

Research

Open Access Article
A Method to Present and Analyze Ensembles of Information Sources
Entropy 2020, 22(5), 580; https://doi.org/10.3390/e22050580 - 21 May 2020
Abstract
Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to provide estimates of the information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate whether the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine whether two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine whether the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine whether the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
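
To make the surrogate comparison concrete, here is a minimal Python sketch (not the authors' released code) of the first test described above: the summed mutual information over all variable pairs is compared against a null distribution obtained by shuffling each variable in time, which destroys dependencies while preserving marginal statistics. The function names and the choice of the sum as the ensemble statistic are illustrative assumptions.

    import numpy as np

    def plugin_mutual_info(x, y):
        # Plug-in (histogram) estimate of mutual information, in bits,
        # between two discrete 1-D arrays of equal length.
        xi = np.unique(x, return_inverse=True)[1]
        yi = np.unique(y, return_inverse=True)[1]
        joint = np.zeros((xi.max() + 1, yi.max() + 1))
        np.add.at(joint, (xi, yi), 1)  # contingency table of joint counts
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

    def ensemble_vs_null(data, n_surrogates=1000, seed=0):
        # Compare the summed MI over all variable pairs to a null
        # distribution built by shuffling each variable independently,
        # which removes dependencies but preserves the marginals.
        # `data` has shape (n_variables, n_samples).
        rng = np.random.default_rng(seed)
        n = len(data)
        pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
        stat = lambda d: sum(plugin_mutual_info(d[i], d[j]) for i, j in pairs)
        observed = stat(data)
        null = np.array([stat(np.array([rng.permutation(row) for row in data]))
                         for _ in range(n_surrogates)])
        p_value = (np.sum(null >= observed) + 1) / (n_surrogates + 1)
        return observed, p_value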

Open Access Article
Limitations to Estimating Mutual Information in Large Neural Populations
Entropy 2020, 22(4), 490; https://doi.org/10.3390/e22040490 - 24 Apr 2020
Abstract
Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information-theoretic analysis when dealing with large neural populations.
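
The severity of this bias is easy to reproduce numerically. The following Python sketch (with illustrative sizes, not taken from the paper) draws population responses independently of the stimulus; because the response space is vastly larger than the number of samples, every sampled pattern is distinct and the plug-in estimate saturates at the (empirical) stimulus entropy instead of the true value of zero.

    import numpy as np

    rng = np.random.default_rng(0)
    n_stimuli, n_trials, n_neurons = 8, 200, 100

    # Uniformly drawn stimuli: stimulus entropy is close to log2(8) = 3 bits.
    stimuli = rng.integers(n_stimuli, size=n_trials)

    # Binary population responses drawn independently of the stimulus,
    # so the true mutual information is exactly 0 bits.
    responses = rng.integers(2, size=(n_trials, n_neurons))

    # With 2^100 possible patterns and only 200 samples, all sampled
    # patterns are distinct with overwhelming probability.
    codes = np.unique(responses, axis=0, return_inverse=True)[1].ravel()

    def entropy(labels):
        p = np.bincount(labels) / labels.size
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    h_s = entropy(stimuli)
    h_r = entropy(codes)
    h_rs = entropy(codes * n_stimuli + stimuli)  # joint labels (R, S)
    mi_plugin = h_s + h_r - h_rs
    print(f"H(S) = {h_s:.3f} bits, plug-in MI = {mi_plugin:.3f} bits")
    # Both values agree (~3 bits): the direct estimate equals the stimulus
    # entropy even though stimulus and response are independent.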

Open Access Article
Early Detection of Alzheimer’s Disease: Detecting Asymmetries with a Return Random Walk Link Predictor
Entropy 2020, 22(4), 465; https://doi.org/10.3390/e22040465 - 19 Apr 2020
Abstract
Alzheimer’s disease has been extensively studied using undirected graphs to represent the correlations of BOLD signals in different anatomical regions through functional magnetic resonance imaging (fMRI). However, there has been relatively little analysis of this kind of data using directed graphs, which offer the potential to capture asymmetries in the interactions between different anatomical brain regions. The detection of these asymmetries is relevant to detecting the disease at an early stage. For this reason, in this paper, we analyze data extracted from fMRI images using the net4Lap algorithm to infer a directed graph from the available BOLD signals, and then seek to determine asymmetries between the left and right hemispheres of the brain using a directed version of the Return Random Walk (RRW). Experimental evaluation of this method reveals that it leads to the identification of anatomical brain regions known from clinical studies to be implicated in the early development of Alzheimer’s disease.
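
As background, the Python sketch below illustrates the idea behind return random walks on a toy directed graph: a reciprocated edge gives a walker a high probability of returning to its origin in two steps, while an unreciprocated edge does not, so low return probabilities flag asymmetric interactions. This is a minimal two-step formulation, not the paper's exact RRW link predictor, and the hemisphere index arrays are hypothetical placeholders.

    import numpy as np

    def two_step_return_probability(W):
        # Per-node return probability of a two-step random walk on a
        # directed, weighted graph, where W[i, j] is the weight of edge
        # i -> j: r[i] = sum_j P[i, j] * P[j, i] is the probability that
        # a walker leaving node i is back at i after two steps.
        out_strength = W.sum(axis=1, keepdims=True)
        P = np.divide(W, out_strength, out=np.zeros_like(W, dtype=float),
                      where=out_strength > 0)  # row-stochastic transitions
        return np.sum(P * P.T, axis=1)

    # Toy graph: the edge 0 <-> 1 is reciprocated, while 2 -> 3 is not.
    W = np.array([[0.0, 1.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0, 0.0]])
    r = two_step_return_probability(W)
    print(r)  # [1. 1. 0. 0.]: low return probability flags asymmetric links

    # Hemispheric comparison over homologous region pairs (indices are
    # hypothetical placeholders for left/right anatomical regions):
    left, right = np.array([0, 2]), np.array([1, 3])
    print(np.abs(r[left] - r[right]))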
