Nonparametric Estimation of Information-Based Measures of Statistical Dispersion
Lubomir Kostal * and Ondrej Pokora
Institute of Physiology, Academy of Sciences of the Czech Republic, Videnska 1083, 142 20 Prague, Czech Republic
* Author to whom correspondence should be addressed.
Received: 29 March 2012 / Revised: 20 June 2012 / Accepted: 4 July 2012 / Published: 10 July 2012
Abstract: We address the problem of nonparametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the "spread" or "variability" of the random variable from a different point of view than the ubiquitous standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied, and a complete methodology for estimating the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
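To make the entropy-based notion of dispersion concrete, the sketch below estimates a dispersion measure of the form e^{h(X)}, where h(X) is the differential entropy, from positive samples. Note the assumptions: the paper itself uses the Good–Gaskins maximum penalized likelihood density estimate, whereas this illustration substitutes a plain Gaussian kernel density estimate with Silverman's bandwidth and a resubstitution (plug-in) entropy estimate, which is simpler but more biased near the boundary at zero.

```python
import numpy as np

def kde_entropy(x, bandwidth=None):
    """Resubstitution estimate of the differential entropy h(X) from
    samples x, using a simple Gaussian kernel density estimate.
    (Stand-in for the Good-Gaskins penalized likelihood estimator
    used in the paper.)"""
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule of thumb for a Gaussian kernel
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
    # f_hat(x_i) = (1/(n*h)) * sum_j K((x_i - x_j)/h)
    diffs = (x[:, None] - x[None, :]) / bandwidth
    kernel = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    f_hat = kernel.sum(axis=1) / (n * bandwidth)
    # plug-in entropy: h(X) ~ -(1/n) * sum_i log f_hat(x_i)
    return -np.mean(np.log(f_hat))

# Exponential(1) interspike intervals: true h(X) = 1 nat,
# so the entropy-based dispersion e^{h(X)} equals e ~ 2.718
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=2000)
h_est = kde_entropy(samples)
entropy_dispersion = np.exp(h_est)  # in the units of X, like std. dev.
```

Unlike the standard deviation, e^{h(X)} reflects how evenly the probability mass is distributed, which is why the two measures can rank the same set of neuronal firing models differently.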
Keywords: statistical dispersion; entropy; Fisher information; nonparametric density estimation
Cite This Article
MDPI and ACS Style
Kostal, L.; Pokora, O. Nonparametric Estimation of Information-Based Measures of Statistical Dispersion. Entropy 2012, 14, 1221-1233.