Open Access Article

A Method to Present and Analyze Ensembles of Information Sources

1. Department of Psychology, Indiana University—Purdue University Indianapolis, Indianapolis, IN 46202, USA
2. Department of Neurosciences, University of New Mexico School of Medicine, Albuquerque, NM 87131, USA
3. Stark Neuroscience Research Institute, Indiana University—Purdue University Indianapolis, Indianapolis, IN 46202, USA
* Author to whom correspondence should be addressed.
Entropy 2020, 22(5), 580; https://doi.org/10.3390/e22050580
Received: 29 April 2020 / Accepted: 18 May 2020 / Published: 21 May 2020
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
Information theory is a powerful tool for analyzing complex systems. In many areas of neuroscience, it is now possible to gather data from large ensembles of neural variables (e.g., data from many neurons, genes, or voxels). The individual variables can be analyzed with information theory to provide estimates of information shared between variables (forming a network between variables), or between neural variables and other variables (e.g., behavior or sensory stimuli). However, it can be difficult to (1) evaluate whether the ensemble is significantly different from what would be expected in a purely noisy system and (2) determine whether two ensembles are different. Herein, we introduce relatively simple methods to address these problems by analyzing ensembles of information sources. We demonstrate how an ensemble built of mutual information connections can be compared to null surrogate data to determine if the ensemble is significantly different from noise. Next, we show how two ensembles can be compared using a randomization process to determine if the sources in one contain more information than the other. All code necessary to carry out these analyses and demonstrations is provided.
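The two procedures described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' released code: the plug-in histogram mutual-information estimator, the bin count, and the surrogate/permutation counts are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (in bits) from a 2D histogram of two samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def ensemble_vs_null(data, stimulus, n_null=100):
    """For each variable, compare its MI with the stimulus against a null
    ensemble built by shuffling that variable (destroying the pairing).
    Returns a list of (observed MI, surrogate p-value) per variable."""
    results = []
    for x in data:  # data: array of shape (n_variables, n_samples)
        mi = mutual_information(x, stimulus)
        null = np.array([mutual_information(rng.permutation(x), stimulus)
                         for _ in range(n_null)])
        p = (np.sum(null >= mi) + 1) / (n_null + 1)
        results.append((mi, p))
    return results

def compare_ensembles(mi_a, mi_b, n_perm=5000):
    """Randomization test: do the sources in ensemble A carry more
    information (larger mean MI) than those in B? One-sided p-value."""
    observed = np.mean(mi_a) - np.mean(mi_b)
    pooled = np.concatenate([mi_a, mi_b])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)    # reassign MI values to groups
        diff = perm[:len(mi_a)].mean() - perm[len(mi_a):].mean()
        if diff >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)
```

The shuffled surrogates preserve each variable's marginal distribution while destroying its relationship to the stimulus, so the resulting p-values reflect only the paired structure; the ensemble comparison pools the two sets of MI values and permutes the group labels, a standard randomization design.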
Keywords: information theory; information ensemble; ensemble comparison; population coding; mutual information; neural ensemble; genetic network; population study
MDPI and ACS Style

Timme, N.M.; Linsenbardt, D.; Lapish, C.C. A Method to Present and Analyze Ensembles of Information Sources. Entropy 2020, 22, 580.

