Open Access Article

Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

Wentao Huang 1,2,* and Kechen Zhang 2,*
1 Key Laboratory of Cognition and Intelligence and Information Science Academy of China Electronics Technology Group Corporation, Beijing 100086, China
2 Department of Biomedical Engineering, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA
* Authors to whom correspondence should be addressed.
Entropy 2019, 21(3), 243; https://doi.org/10.3390/e21030243
Received: 15 December 2018 / Revised: 11 February 2019 / Accepted: 28 February 2019 / Published: 4 March 2019

Abstract

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may facilitate the application of information theory to many practical and theoretical problems.
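
As a point of reference for the approximation formulas described above, the sketch below computes the exact Shannon mutual information I(S;R) = Σ_{s,r} p(s) p(r|s) log2[ p(r|s) / p(r) ] for a small, made-up discrete stimulus-response table. This is only an illustrative baseline, not the authors' method; the prior and conditional distributions are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's method): exact Shannon mutual information
# I(S; R) = sum_{s,r} p(s) p(r|s) log2( p(r|s) / p(r) )
# for a discrete stimulus S with 3 values and a discrete response R with 3 values.
# The distributions below are made up for demonstration purposes.

p_s = np.array([0.5, 0.3, 0.2])        # prior over stimuli, p(s)
p_r_given_s = np.array([               # conditional response distributions, p(r|s)
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.2, 0.6],
])

p_sr = p_s[:, None] * p_r_given_s      # joint distribution p(s, r)
p_r = p_sr.sum(axis=0)                 # marginal response distribution p(r)

# Sum only over nonzero joint probabilities to avoid log(0).
outer = p_s[:, None] * p_r[None, :]
mask = p_sr > 0
mi_bits = np.sum(p_sr[mask] * np.log2(p_sr[mask] / outer[mask]))
print(f"I(S;R) = {mi_bits:.4f} bits")
```

For a population of N neurons with binary responses, the response space contains 2^N states, so this exact summation becomes infeasible for large N; that is the regime where asymptotic approximations based on Fisher information or on Kullback-Leibler and Rényi divergences, such as those studied in the paper, are useful.
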
Keywords: neural population coding; mutual information; Kullback-Leibler divergence; Rényi divergence; Chernoff divergence; approximation; discrete variables

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Huang, W.; Zhang, K. Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding. Entropy 2019, 21, 243.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy, EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.