31 March 2022
Entropy Best ECR Presentation Awards at the CNS*2021 Online Workshop on Methods of Information Theory in Computational Neuroscience—Winners Announced

We are pleased to announce the two winners of the Best ECR Presentation Awards sponsored by Entropy (ISSN: 1099-4300) for the CNS*2021 Online Workshop on Methods of Information Theory in Computational Neuroscience held on 6 and 7 July 2021. Congratulations to the winners, Dr. Fleur Zeldenrust and Mr. Lucas Rudelt.

“Estimating Information Transfer In Vitro: Results from Barrel Cortex” by Fleur Zeldenrust

Understanding the relation between (sensory) stimuli and the activity of neurons (i.e., ‘the neural code’) lies at the heart of understanding the computational properties of the brain. However, quantifying the information between a stimulus and a spike train has proven to be challenging, due to the limited life span of a cell in an in vitro setup. In 2017, in this workshop, I presented a new in vitro method to measure how much information a single neuron transfers from the input it receives to its output spike train. This method has the advantage that it is fast (~10 minutes) compared with traditional methods. This decrease in recording time is obtained by generating the input with an artificial neural network that responds to a randomly appearing and disappearing ‘sensory stimulus’: the hidden state. The low entropy of this hidden state allows for a fast estimate of the transferred information. Using this method, we have now recorded over 850 trials in almost 300 inhibitory and excitatory neurons of the barrel cortex, using different pharmacological modulations (including dopamine, acetylcholine, and serotonin receptor agonists). Here, I presented the first conclusions from this large database of recordings and showed how this method can be extended to dynamic clamp, as well as the effects this has on modeled neurons.
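The core idea above, that a binary hidden state with low entropy makes the transferred information fast to estimate, can be illustrated with a toy simulation. The sketch below is not the presented method: it simply draws a slowly switching two-state Markov hidden state, generates spikes whose probability depends on that state (the rates and switching probability are invented for illustration), and computes a plug-in estimate of the mutual information between the hidden state and the binned spike train.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for two discrete sequences."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    # Joint counts over observed (x, y) pairs.
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    # Marginal probabilities.
    px = {v: np.mean(x == v) for v in np.unique(x)}
    py = {v: np.mean(y == v) for v in np.unique(y)}
    mi = 0.0
    for (xi, yi), c in joint.items():
        pxy = c / n
        mi += pxy * np.log2(pxy / (px[xi] * py[yi]))
    return mi

# Hypothetical simulation: a slowly switching binary hidden state drives spiking.
rng = np.random.default_rng(0)
T = 50_000                       # number of time bins
p_switch = 0.005                 # slow switching -> low-entropy state sequence
hidden = np.zeros(T, dtype=int)
for t in range(1, T):
    hidden[t] = hidden[t - 1] ^ int(rng.random() < p_switch)

# Spiking probability per bin depends on the hidden state (illustrative rates).
rate = np.where(hidden == 1, 0.2, 0.02)
spikes = (rng.random(T) < rate).astype(int)

print(mutual_information(hidden, spikes))  # bits per bin
```

Because the hidden state is binary and switches slowly, even a short recording samples both states well, which is what makes short in vitro protocols feasible; the actual method additionally passes the hidden state through an artificial network to produce realistic neuronal input.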

“Embedding Optimization Reveals Long-Lasting History Dependence in Neural Spiking Activity” by Lucas Rudelt

Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes statistical dependencies on spiking history, while temporal integration of information may require the maintenance of information over various timescales. To investigate these footprints, I presented an approach that quantifies history dependence within the spiking of a single neuron, using mutual information between the entire past and current spiking. Applying this approach to extracellular spike recordings, we found that both the strength and the timescale of history dependence showed striking differences between different neural systems. In conjunction with recent highly parallel spike recording techniques, this approach could yield valuable insights into how information processing is organized in the brain.
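The quantity at the heart of this abstract, the mutual information between a neuron's past spiking and its current spiking, can be made concrete with a toy estimator. The sketch below is only an illustration, not the presented approach (which optimizes the past embedding and corrects for estimator bias): it embeds the past as the binary pattern of the preceding `depth` bins and computes a plug-in mutual information with the current bin, on a simulated bursty train whose parameters are invented for illustration.

```python
import numpy as np

def history_dependence(spikes, depth):
    """Plug-in estimate (bits) of I(past window; current spike) for a
    binary spike train, embedding the past as the pattern of the
    `depth` preceding bins."""
    spikes = np.asarray(spikes, dtype=int)
    n = len(spikes) - depth
    # Joint counts over (past pattern, current spike) pairs.
    joint = {}
    for t in range(depth, len(spikes)):
        key = (tuple(spikes[t - depth:t]), spikes[t])
        joint[key] = joint.get(key, 0) + 1
    # Marginal counts.
    c_past, c_cur = {}, {}
    for (past, cur), c in joint.items():
        c_past[past] = c_past.get(past, 0) + c
        c_cur[cur] = c_cur.get(cur, 0) + c
    mi = 0.0
    for (past, cur), c in joint.items():
        # p_joint / (p_past * p_cur) = c * n / (c_past * c_cur)
        mi += (c / n) * np.log2(c * n / (c_past[past] * c_cur[cur]))
    return mi

# Hypothetical bursty train: spiking probability is raised after a spike.
rng = np.random.default_rng(1)
T = 100_000
spikes = np.zeros(T, dtype=int)
for t in range(1, T):
    p = 0.3 if spikes[t - 1] else 0.02
    spikes[t] = int(rng.random() < p)

print(history_dependence(spikes, depth=1))  # dependence at lag 1, in bits
```

Increasing `depth` lets the estimate capture longer-lasting dependence, but the number of past patterns grows exponentially, which is exactly why the presented approach optimizes the embedding rather than using a raw window as done here.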
