
Optimal Encoding in Stochastic Latent-Variable Models

Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK
Institute of Neuroinformatics, University of Zürich and ETH, 8057 Zürich, Switzerland
Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
Author to whom correspondence should be addressed.
Entropy 2020, 22(7), 714;
Received: 25 May 2020 / Revised: 23 June 2020 / Accepted: 23 June 2020 / Published: 28 June 2020
(This article belongs to the Special Issue Thermodynamics and Information Theory of Living Systems)
In this work, we explore encoding strategies learned by statistical models of sensory coding in noisy spiking networks. Early stages of sensory communication in neural systems can be viewed as encoding channels in the information-theoretic sense. However, neural populations face constraints not commonly considered in communications theory. Using restricted Boltzmann machines as a model of sensory encoding, we find that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli with varying information content. Mirroring the variability suppression observed in sensory systems, informative stimuli are encoded with high precision, at the cost of more variable responses to frequent, and hence less informative, stimuli. Curiously, we also find that statistical criticality in the neural population code emerges at model sizes where the input statistics are well captured. These phenomena have well-defined thermodynamic interpretations, and we discuss their connection to prevailing theories of coding and statistical criticality in neural populations.
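To make the modeling framework concrete, the following is a minimal sketch of a Bernoulli restricted Boltzmann machine trained with one-step contrastive divergence (CD-1) on toy binary "stimuli". All names, hyperparameters, and the training data here are illustrative assumptions, not the authors' actual model, data, or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM; hidden units play the role of the latent code."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        # small random weights; zero biases (a common, assumed initialization)
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_h(self, v):
        # hidden activation probabilities and a stochastic (spiking) sample
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # visible reconstruction probabilities and a stochastic sample
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        # positive phase: clamp data, sample hidden code
        ph0, h0 = self.sample_h(v0)
        # negative phase: one Gibbs step back through the model
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        # reconstruction error as a rough training signal
        return float(np.mean((v0 - pv1) ** 2))

# toy stimuli: sparse binary patterns (assumed stand-in for sensory input)
data = (rng.random((200, 16)) < 0.2).astype(float)
rbm = RBM(n_visible=16, n_hidden=8)
errors = [rbm.cd1_step(data) for _ in range(50)]
# reconstruction error typically decreases as the RBM captures input statistics
```

In the paper's setting, the quantity of interest is not reconstruction per se but how the learned stochastic hidden code trades off precision against noise-robustness as the number of hidden units grows.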
Keywords: information theory; encoding; neural networks; sensory systems
Rule, M.E.; Sorbaro, M.; Hennig, M.H. Optimal Encoding in Stochastic Latent-Variable Models. Entropy 2020, 22, 714.
