Open Access Article

Can Transfer Entropy Infer Information Flow in Neuronal Circuits for Cognitive Processing?

by Ali Tehrani-Saleh 1,2 and Christoph Adami 2,3,4,*
1 Department of Computer Science and Engineering, Michigan State University, East Lansing, MI 48824, USA
2 BEACON Center for the Study of Evolution, Michigan State University, East Lansing, MI 48824, USA
3 Department of Microbiology & Molecular Genetics, Michigan State University, East Lansing, MI 48824, USA
4 Department of Physics & Astronomy, Michigan State University, East Lansing, MI 48824, USA
* Author to whom correspondence should be addressed.
Entropy 2020, 22(4), 385; https://doi.org/10.3390/e22040385
Received: 4 December 2019 / Revised: 11 March 2020 / Accepted: 25 March 2020 / Published: 28 March 2020
(This article belongs to the Special Issue Information Flow and Entropy Production in Biomolecular Networks)
How cognitive neural systems process information is largely unknown, in part because of how difficult it is to accurately follow the flow of information from sensors via neurons to actuators. Measuring the flow of information is different from measuring correlations between firing neurons, for which several measures are available, foremost among them the Shannon information, which is an undirected measure. Several information-theoretic notions of “directed information” have been used to successfully detect the flow of information in some systems, in particular in the neuroscience community. However, recent work has shown that directed information measures such as transfer entropy can sometimes inadequately estimate information flow, or even fail to identify manifest directed influences, especially if neurons contribute in a cryptographic manner to influence the effector neuron. Because it is unclear how often such cryptic influences emerge in cognitive systems, the usefulness of transfer entropy measures to reconstruct information flow is unknown. Here, we test how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization). Besides counting the frequency of problematic logic gates, we also test whether transfer entropy applied to an activity time-series recorded from behaving digital brains can infer information flow, compared to a ground-truth model of direct influence constructed from connectivity and circuit logic. Our results suggest that transfer entropy will sometimes fail to infer directed information when it exists, and sometimes suggest a causal connection when there is none. However, the extent of incorrect inference strongly depends on the cognitive task considered. 
These results emphasize the importance of understanding the fundamental logic processes that contribute to information flow in cognitive processing, and quantifying their relevance in any given nervous system.
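The directed measure examined in the abstract, transfer entropy, quantifies how much the past of a source series X improves prediction of the next state of a target series Y beyond Y's own past: TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log₂[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. The following is a minimal plug-in estimator for binary time series with history length 1 — an illustrative sketch only, not the estimator used in the paper (the function name and the toy series are our own):

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits, history length 1:
    TE = sum over (y1, y0, x0) of p(y1, y0, x0) *
         log2[ p(y1 | y0, x0) / p(y1 | y0) ],
    where y1 = y_{t+1}, y0 = y_t, x0 = x_t."""
    n = len(y) - 1                                     # number of transitions
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))      # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))            # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))             # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                          # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]           # p(y1 | y0, x0)
        p_cond_past = pairs_yy[(y1, y0)] / singles[y0] # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_past)
    return te

# Toy example: Y copies X with a one-step lag, so X directly drives Y.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
y = [0] + x[:-1]
print(transfer_entropy(x, y))  # large: X's past strongly predicts Y's future
print(transfer_entropy(y, x))  # smaller, but nonzero on this short sample
```

Two caveats connect this sketch to the article's point: the plug-in estimator is biased upward on short series (hence the nonzero reverse-direction value above), and a "cryptographic" gate defeats the pairwise measure entirely — if y_{t+1} = x_t XOR z_t with independent inputs X and Z, the transfer entropy from either input alone is zero even though both manifestly drive Y.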
Keywords: transfer entropy; information flow; neural processing; motion detection; sound localization
MDPI and ACS Style

Tehrani-Saleh, A.; Adami, C. Can Transfer Entropy Infer Information Flow in Neuronal Circuits for Cognitive Processing? Entropy 2020, 22, 385.

