Insights into Entropy as a Measure of Multivariate Variability
Abstract: Entropy has been widely employed as a measure of variability in problems such as machine learning and signal processing. In this paper, we provide new insights into the behavior of entropy as a measure of multivariate variability. The relationships between multivariate entropy (joint or total marginal) and traditional measures of multivariate variability, such as total dispersion and generalized variance, are investigated. It is shown that in the jointly Gaussian case, the joint entropy (or joint entropy power) is equivalent to the generalized variance, the total marginal entropy is equivalent to the geometric mean of the marginal variances, and the total marginal entropy power is equivalent to the total dispersion. The smoothed multivariate entropy (joint or total marginal) and the kernel density estimation (KDE)-based entropy estimator (with finite samples) are also studied; under certain conditions, these are approximately equivalent to the total dispersion (or a total dispersion estimator), regardless of the data distribution.
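The Gaussian equivalences stated in the abstract can be checked numerically. The sketch below assumes the standard definitions: for X ~ N(0, Σ) in d dimensions, the joint entropy is H(X) = ½ ln((2πe)^d |Σ|), and the entropy power is N(X) = e^{2H/d}/(2πe). The covariance values are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative covariance of a 2-D Gaussian (hypothetical example values)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
d = Sigma.shape[0]

# Joint entropy of N(0, Sigma): H = 0.5 * ln((2*pi*e)^d * |Sigma|)
H_joint = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(Sigma))

# Joint entropy power N = exp(2H/d) / (2*pi*e) equals |Sigma|^(1/d),
# i.e. a power of the generalized variance |Sigma|
N_joint = np.exp(2 * H_joint / d) / (2 * np.pi * np.e)
assert np.isclose(N_joint, np.linalg.det(Sigma) ** (1 / d))

# Marginal entropies: H_i = 0.5 * ln(2*pi*e * sigma_i^2)
var = np.diag(Sigma)
H_marg = 0.5 * np.log(2 * np.pi * np.e * var)

# Total marginal entropy is a monotone function of the geometric
# mean of the marginal variances
geo_mean = np.exp(np.mean(np.log(var)))
assert np.isclose(np.exp(2 * H_marg.sum() / d) / (2 * np.pi * np.e), geo_mean)

# Sum of per-dimension entropy powers equals the total dispersion tr(Sigma)
N_marg = np.exp(2 * H_marg) / (2 * np.pi * np.e)
assert np.isclose(N_marg.sum(), np.trace(Sigma))
```

Each entropy power here reduces algebraically to a variance term (N_i = σ_i², N_joint = |Σ|^{1/d}), which is why the assertions hold exactly up to floating-point error.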
Chen, B.; Wang, J.; Zhao, H.; Principe, J.C. Insights into Entropy as a Measure of Multivariate Variability. Entropy 2016, 18, 196.