Information Theory in Neuroscience

Edited by
March 2019
280 pages
  • ISBN 978-3-03897-664-6 (Paperback)
  • ISBN 978-3-03897-665-3 (PDF)

This book is a reprint of the Special Issue Information Theory in Neuroscience.


As the ultimate information processing device, the brain naturally lends itself to being studied with information theory. The application of information theory to neuroscience has spurred the development of principled theories of brain function, and has led to advances in the study of consciousness, as well as to the development of analytical techniques to crack the neural code—that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques for recording and manipulating neural activity precisely and on a large scale now make it possible, for the first time, to formulate precise hypotheses about how the brain encodes and transmits across areas the information used for specific functions, and to test them quantitatively.
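The quantitative approach described above typically rests on the mutual information I(S;R) between a stimulus S and a neural response R. As a minimal, hypothetical illustration (not taken from the book), the sketch below estimates I(S;R) from paired samples of a binary stimulus and discrete spike counts using plug-in (empirical) entropy estimates; the data are toy values chosen for the example.

```python
import math
from collections import Counter

def entropy(samples):
    """Plug-in (empirical) Shannon entropy in bits."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(stimuli, responses):
    """Estimate I(S;R) = H(S) + H(R) - H(S,R) from paired samples."""
    joint = list(zip(stimuli, responses))
    return entropy(stimuli) + entropy(responses) - entropy(joint)

# Toy data: a hypothetical binary stimulus and per-trial spike counts.
stimuli   = [0, 0, 0, 0, 1, 1, 1, 1]
responses = [0, 0, 1, 0, 2, 3, 2, 3]
print(round(mutual_information(stimuli, responses), 3))  # → 1.0
```

Here the spike counts fully disambiguate the stimulus, so the estimate equals H(S) = 1 bit; with overlapping response distributions it would fall below that. Plug-in estimates of this kind are biased upward for small sample sizes, which is one reason the Special Issue's contributions on estimation techniques matter in practice.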

This Special Issue presents twelve original contributions on novel approaches in neuroscience using information theory, and on the development of new information theoretic results inspired by problems in neuroscience.

License and Copyright
© 2019 by the authors; CC BY-NC-ND license
Keywords: neural network; Potts model; latching; recursion; functional connectome; graph theoretical analysis; eigenvector centrality; orderness; network eigen-entropy; information entropy production; discrete Markov chains; spike train statistics; Gibbs measures; maximum entropy principle; pulse-gating; channel capacity; neural coding; feedforward networks; neural information propagation; information theory; mutual information decomposition; synergy; redundancy; integrated information theory; integrated information; minimum information partition; submodularity; Queyranne’s algorithm; consciousness; maximum entropy; higher-order correlations; neural population coding; Ising model; brain network; complex networks; connectome; graph theory; free-energy principle; internal model hypothesis; unconscious inference; infomax principle; independent component analysis; principal component analysis; goodness; categorical perception; perceptual magnet; perceived similarity; mutual information; neural code; hippocampus; entorhinal cortex; navigation; representation; decoding; spike-time precision; discrimination; noise correlations; mismatched decoding; neuroscience

Related Books

  • Information Theory and Machine Learning (September 2022), Computer Science & Mathematics
  • Information Theory for Data Communications and Processing (January 2021), Computer Science & Mathematics
  • Information Theory and Language (August 2020), Social Sciences, Arts & Humanities
  • Applications of Information Theory to Epidemiology (April 2021), Biology & Life Sciences
  • Information Geometry (April 2019), Computer Science & Mathematics