Article

Limitations to Estimating Mutual Information in Large Neural Populations

Queensland Brain Institute & School of Mathematics and Physics, The University of Queensland, St. Lucia, QLD 4072, Australia
* Author to whom correspondence should be addressed.
Entropy 2020, 22(4), 490; https://doi.org/10.3390/e22040490
Received: 2 March 2020 / Revised: 22 April 2020 / Accepted: 22 April 2020 / Published: 24 April 2020
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, for large neural populations, any finite set of samples of neural activity in response to a set of stimuli is, with high probability, mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will equal the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for information-theoretic analysis when dealing with large neural populations.
Keywords: sensory coding; information theory; entropy; sampling bias
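To illustrate the effect described in the abstract, the following is a minimal sketch (not the authors' code; the population size, stimulus set, and trial count are illustrative assumptions) of the plug-in, i.e., empirical-histogram, estimator of mutual information. Because the 10,000 sampled binary activity patterns of a 100-neuron population are almost surely mutually distinct, the estimate collapses to the stimulus entropy, even though the responses are generated independently of the stimulus and the true mutual information is zero.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 100        # large population: 2**100 possible binary activity patterns
n_stimuli = 4          # small, uniformly sampled stimulus set
n_samples = 10_000     # finite number of trials

# Responses are drawn independently of the stimulus, so the true mutual information is zero.
stimuli = rng.integers(n_stimuli, size=n_samples)
responses = rng.integers(2, size=(n_samples, n_neurons))

def plugin_entropy(labels):
    """Plug-in (maximum-likelihood) entropy estimate in bits from empirical counts."""
    _, counts = np.unique(labels, return_counts=True, axis=0)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Joint histogram over (stimulus, response) pairs: one row per trial.
joint = np.column_stack([stimuli, responses])

H_S = plugin_entropy(stimuli)
H_R = plugin_entropy(responses)
H_SR = plugin_entropy(joint)
I_plugin = H_S + H_R - H_SR    # plug-in estimate of I(S; R)

print(f"H(S)     = {H_S:.3f} bits")
print(f"I_plugin = {I_plugin:.3f} bits   (true mutual information: 0 bits)")
# With high probability every sampled response pattern is unique, so
# H(R) = H(S,R) = log2(n_samples) and the plug-in estimate equals H(S), about 2 bits.
```

Increasing the number of trials does not help in any practical regime: as long as the number of samples is small compared to the size of the response state space, the estimator remains saturated at the stimulus entropy.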
MDPI and ACS Style

Mölter, J.; Goodhill, G.J. Limitations to Estimating Mutual Information in Large Neural Populations. Entropy 2020, 22, 490. https://doi.org/10.3390/e22040490

AMA Style

Mölter J, Goodhill GJ. Limitations to Estimating Mutual Information in Large Neural Populations. Entropy. 2020; 22(4):490. https://doi.org/10.3390/e22040490

Chicago/Turabian Style

Mölter, Jan, and Geoffrey J. Goodhill. 2020. "Limitations to Estimating Mutual Information in Large Neural Populations." Entropy 22, no. 4: 490. https://doi.org/10.3390/e22040490

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
