Open Access Article
Entropy 2012, 14(7), 1221-1233; doi:10.3390/e14071221

Nonparametric Estimation of Information-Based Measures of Statistical Dispersion

Institute of Physiology, Academy of Sciences of the Czech Republic, Videnska 1083, 142 20 Prague, Czech Republic
Author to whom correspondence should be addressed.
Received: 29 March 2012 / Revised: 20 June 2012 / Accepted: 4 July 2012 / Published: 10 July 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)


We address the problem of nonparametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the “spread” or “variability” of the random variable from a different point of view than the ubiquitously used standard deviation. We apply the maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins, and we present a complete methodology for estimating the dispersion measures with a single algorithm. The approach is illustrated on three standard statistical models describing neuronal activity.
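To illustrate the plug-in idea behind such estimators, the sketch below estimates an entropy-based dispersion coefficient exp(H(X)) for a positive random variable. Note the assumptions: the paper uses the Good–Gaskins maximum penalized likelihood density estimate, whereas this sketch substitutes an ordinary Gaussian kernel density estimate, and the function name `entropy_dispersion`, the grid settings, and the Exp(1) example are all illustrative, not taken from the article.

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy_dispersion(samples, grid_size=4000):
    """Plug-in estimate of exp(H(X)) from a Gaussian KDE (illustrative only;
    the article uses the Good-Gaskins penalized-likelihood density estimate)."""
    kde = gaussian_kde(samples)
    lo = max(samples.min() - 3 * samples.std(), 0.0)  # positive support
    hi = samples.max() + 3 * samples.std()
    x = np.linspace(lo, hi, grid_size)
    dx = x[1] - x[0]
    f = kde(x)
    mask = f > 1e-12
    # differential entropy H = -integral of f*log(f), via a Riemann sum on the grid
    h = -np.sum(f[mask] * np.log(f[mask])) * dx
    return np.exp(h)

rng = np.random.default_rng(1)
data = rng.exponential(scale=1.0, size=5000)
# For Exp(1) the true differential entropy is 1 nat, so exp(H) = e; the
# KDE plug-in estimate is close but biased near the boundary at zero.
print(entropy_dispersion(data))
```

Unlike the standard deviation, exp(H) reflects how evenly the probability mass is spread, which is the sense of “dispersion” the article's measures capture.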
Keywords: statistical dispersion; entropy; Fisher information; nonparametric density estimation

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Kostal, L.; Pokora, O. Nonparametric Estimation of Information-Based Measures of Statistical Dispersion. Entropy 2012, 14, 1221-1233.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland