Editorial

Information Theory in Neuroscience

by Eugenio Piasini 1,* and Stefano Panzeri 2,*
1 Computational Neuroscience Initiative and Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104, USA
2 Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068 Rovereto (TN), Italy
* Authors to whom correspondence should be addressed.
Entropy 2019, 21(1), 62; https://doi.org/10.3390/e21010062
Submission received: 26 December 2018 / Accepted: 9 January 2019 / Published: 14 January 2019
(This article belongs to the Special Issue Information Theory in Neuroscience)

Abstract

This is the Editorial article summarizing the scope and contents of the Special Issue, Information Theory in Neuroscience.

As the ultimate information-processing device, the brain naturally lends itself to study with information theory. Because of this, information theory [1] has been applied to the study of the brain systematically for many decades and has been instrumental in many advances. It has spurred the development of principled theories of brain function [2,3,4,5,6,7,8]. It has led to advances in the study of consciousness [9]. It has also led to the development of many influential techniques for the analysis of neural recordings aimed at cracking the neural code, that is, at unveiling the language that neurons use to encode and process information [10,11,12,13,14,15].
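As an illustration of the basic quantity underlying many of these analysis techniques, the sketch below computes a plug-in estimate of the Shannon mutual information between a discrete stimulus and a discretized neural response. It is a minimal toy example written for this summary, not code from any of the cited studies; the function name, the Poisson toy data, and all parameters are illustrative assumptions, and the sketch ignores the limited-sampling bias corrections that practical analyses require.

# Minimal sketch: plug-in estimate of I(S;R) = sum_{s,r} p(s,r) log2[ p(s,r) / (p(s) p(r)) ].
# Toy example only; names, data, and parameters are illustrative assumptions.
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in mutual information (in bits) between two discrete sequences."""
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    # Joint histogram -> joint probability table p(s, r).
    joint = np.zeros((s_vals.size, r_vals.size))
    np.add.at(joint, (s_idx, r_idx), 1.0)
    p_sr = joint / joint.sum()
    p_s = p_sr.sum(axis=1, keepdims=True)   # marginal p(s)
    p_r = p_sr.sum(axis=0, keepdims=True)   # marginal p(r)
    nonzero = p_sr > 0
    return float(np.sum(p_sr[nonzero] *
                        np.log2(p_sr[nonzero] / (p_s @ p_r)[nonzero])))

# Toy usage: spike counts whose rate depends on a binary stimulus.
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, size=5000)
counts = rng.poisson(lam=2 + 3 * stim)
print(f"I(S;R) ~ {mutual_information(stim, counts):.3f} bits")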
The influence of information theory on the study of neural information processing continues today in many ways. In particular, concepts from information theory are beginning to be applied to the large-scale recordings of neural activity that can be obtained with techniques such as two-photon calcium imaging, in order to understand the nature of the neural population code [16]. Advances in experimental techniques that allow precise recording and manipulation of neural activity on a large scale now make it possible, for the first time, to formulate precisely and test quantitatively hypotheses about how the brain encodes the information used for specific functions and transmits it across areas; information theory provides a formalism that plays a useful role in the design and analysis of such experiments [17].
This Special Issue presents twelve original contributions on novel information-theoretic approaches in neuroscience and on new information-theoretic results inspired by problems in neuroscience. These contributions span a wide range of topics.
Two papers build on the principle of maximum entropy [18] to develop models that detect functional interactions between neurons and to understand the potential role of those interactions in neural information processing [19,20]. Kitazono et al. [21] and Bonmati et al. [22] develop concepts relating information theory to measures of complexity and integrated information. These techniques have potential for a wide range of applications, not least the study of how consciousness emerges from the dynamics of the brain. Other work uses information theory as a tool to investigate different aspects of brain dynamics, from latching in neural networks [23], to the lifespan development of the human brain studied with functional imaging data [24], to rapid information processing possibly mediated by the synfire chains [25] that have been reported in studies of simultaneously recorded spike trains [26]. Other studies build bridges between information theory and the theory of inference [27], and between information theory and categorical perception mediated by representational similarity in neural activity [28]. One paper [29] uses the recently developed framework of partial information decomposition [30] to investigate the origins of synergy and redundancy in information representations, a topic of strong interest for understanding how neurons in the brain work together to represent information [31]. Finally, the two contributions of Samengo and colleagues examine applications of information theory to two specific problems of empirical importance in neuroscience: how to assess the relevance of specific response features in a neural code [32], and which code neurons in the temporal lobe use to encode information [33].
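To make the maximum entropy approach mentioned above more concrete, the following toy sketch fits a pairwise maximum entropy (Ising-like) model to binarized population activity by exactly enumerating all 2^N states and matching the empirical means and pairwise correlations. It is an illustrative assumption written for this summary, not code drawn from references [19,20]; the function name, learning rate, and synthetic data are hypothetical, and exact enumeration is only feasible for very small populations.

# Minimal sketch: pairwise maximum entropy model fit by exact enumeration (small N only).
# Toy example only; names, parameters, and data are illustrative assumptions.
import itertools
import numpy as np

def fit_pairwise_maxent(binary_data, lr=0.1, n_steps=2000):
    """binary_data: (n_samples, n_neurons) array of 0/1 spike patterns."""
    data = np.asarray(binary_data, dtype=float)
    n = data.shape[1]
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    # Empirical statistics the model must reproduce.
    mean_emp = data.mean(axis=0)               # <x_i>
    corr_emp = data.T @ data / data.shape[0]   # <x_i x_j>
    h = np.zeros(n)
    J = np.zeros((n, n))
    for _ in range(n_steps):
        # Model distribution p(x) proportional to exp(h.x + x^T J x / 2) over all states.
        energy = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
        p = np.exp(energy - energy.max())
        p /= p.sum()
        mean_mod = p @ states
        corr_mod = states.T @ (states * p[:, None])
        # Gradient ascent on the log-likelihood, i.e., moment matching.
        h += lr * (mean_emp - mean_mod)
        J += lr * (corr_emp - corr_mod)
        np.fill_diagonal(J, 0.0)
    return h, J

# Toy usage on synthetic, weakly correlated binary "spike" patterns.
rng = np.random.default_rng(1)
common = rng.random((10000, 1)) < 0.3
spikes = ((rng.random((10000, 4)) < 0.2) | common).astype(int)
h, J = fit_pairwise_maxent(spikes)
print("fitted fields:", np.round(h, 2))
print("fitted couplings:", np.round(J, 2))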

Author Contributions

E.P. and S.P. wrote the paper.

Acknowledgments

We are grateful to the contributing authors, to the anonymous referees, and to the Editorial Staff of Entropy for their excellent and tireless work, which made this Special Issue possible.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
2. Srinivasan, M.V.; Laughlin, S.B.; Dubs, A.; Horridge, G.A. Predictive coding: A fresh view of inhibition in the retina. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1982, 216, 427–459.
3. Atick, J.J.; Redlich, A.N. Towards a Theory of Early Visual Processing. Neural Comput. 1990, 2, 308–320.
4. Dong, D.W.; Atick, J. Temporal decorrelation: A theory of lagged and nonlagged responses in the lateral geniculate nucleus. Netw. Comput. Neural Syst. 1995, 6, 159–178.
5. Laughlin, S.B.; de Ruyter van Steveninck, R.R.; Anderson, J.C. The metabolic cost of neural information. Nat. Neurosci. 1998, 1, 36–41.
6. Hermundstad, A.M.; Briguglio, J.J.; Conte, M.M.; Victor, J.D.; Balasubramanian, V.; Tkačik, G. Variance predicts salience in central sensory processing. eLife 2014, 3, e03722.
7. Billings, G.; Piasini, E.; Lőrincz, A.; Nusser, Z.; Silver, R.A. Network Structure within the Cerebellar Input Layer Enables Lossless Sparse Encoding. Neuron 2014, 83, 960–974.
8. Chalk, M.; Marre, O.; Tkačik, G. Toward a unified theory of efficient, predictive, and sparse coding. Proc. Natl. Acad. Sci. USA 2018, 115, 186–191.
9. Tononi, G.; Sporns, O.; Edelman, G.M. A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA 1994, 91, 5033–5037.
10. Strong, S.P.; Koberle, R.; de Ruyter van Steveninck, R.R.; Bialek, W. Entropy and information in neural spike trains. Phys. Rev. Lett. 1998, 80, 197–200.
11. Borst, A.; Theunissen, F.E. Information theory and neural coding. Nat. Neurosci. 1999, 2, 947–957.
12. Schneidman, E.; Berry, M.J.; Segev, R.; Bialek, W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 2006, 440, 1007–1012.
13. Quian Quiroga, R.; Panzeri, S. Extracting information from neural populations: Information theory and decoding approaches. Nat. Rev. Neurosci. 2009, 10, 173–185.
14. Victor, J.D. Approaches to information-theoretic analysis of neural activity. Biol. Theory 2006, 1, 302–316.
15. Tkačik, G.; Marre, O.; Amodei, D.; Bialek, W.; Berry, M.J. Searching for Collective Behavior in a Large Network of Sensory Neurons. PLoS Comput. Biol. 2014, 10, e1003408.
16. Runyan, C.A.; Piasini, E.; Panzeri, S.; Harvey, C.D. Distinct timescales of population coding across cortex. Nature 2017, 548, 92–96.
17. Panzeri, S.; Harvey, C.D.; Piasini, E.; Latham, P.E.; Fellin, T. Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior. Neuron 2017, 93, 491–507.
18. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
19. Cofré, R.; Maldonado, C. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains. Entropy 2018, 20, 34.
20. Cayco-Gajic, N.A.; Zylberberg, J.; Shea-Brown, E. A Moment-Based Maximum Entropy Model for Fitting Higher-Order Interactions in Neural Data. Entropy 2018, 20, 489.
21. Kitazono, J.; Kanai, R.; Oizumi, M. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory. Entropy 2018, 20, 173.
22. Bonmati, E.; Bardera, A.; Feixas, M.; Boada, I. Novel Brain Complexity Measures Based on Information Theory. Entropy 2018, 20, 491.
23. Kang, C.J.; Naim, M.; Boboeva, V.; Treves, A. Life on the Edge: Latching Dynamics in a Potts Neural Network. Entropy 2017, 19, 468.
24. Fan, Y.; Zeng, L.L.; Shen, H.; Qin, J.; Li, F.; Hu, D. Lifespan Development of the Human Brain Revealed by Large-Scale Network Eigen-Entropy. Entropy 2017, 19, 471.
25. Abeles, M.; Bergman, H.; Margalit, E.; Vaadia, E. Spatiotemporal firing patterns in the frontal cortex of behaving monkeys. J. Neurophysiol. 1993, 70, 1629–1638.
26. Xiao, Z.; Wang, B.; Sornborger, A.T.; Tao, L. Mutual Information and Information Gating in Synfire Chains. Entropy 2018, 20, 102.
27. Isomura, T. A Measure of Information Available for Inference. Entropy 2018, 20, 512.
28. Brasselet, R.; Arleo, A. Category Structure and Categorical Perception Jointly Explained by Similarity-Based Information Theory. Entropy 2018, 20, 527.
29. Chicharro, D.; Pica, G.; Panzeri, S. The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy. Entropy 2018, 20, 169.
30. Williams, P.L.; Beer, R.D. Nonnegative Decomposition of Multivariate Information. arXiv 2010, arXiv:1004.2515.
31. Griffith, V.; Koch, C. Quantifying Synergistic Mutual Information. In Guided Self-Organization: Inception; Prokopenko, M., Ed.; Springer: Berlin, Germany, 2014; pp. 159–190.
32. Eyherabide, H.G.; Samengo, I. Assessing the Relevance of Specific Response Features in the Neural Code. Entropy 2018, 20, 879.
33. Maidana Capitán, M.B.; Kropff, E.; Samengo, I. Information-Theoretical Analysis of the Neural Code in the Rodent Temporal Lobe. Entropy 2018, 20, 571.
