Information Theory in Neuroscience