Open Access Article
Entropy 2016, 18(5), 196; doi:10.3390/e18050196

Insights into Entropy as a Measure of Multivariate Variability

1 School of Electronic and Information Engineering, Xi’an Jiaotong University, Xi’an 710049, China
2 School of Electrical Engineering, Southwest Jiaotong University, Chengdu 610031, China
3 Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA
* Author to whom correspondence should be addressed.
Academic Editor: Olimpia Lombardi
Received: 29 February 2016 / Revised: 26 April 2016 / Accepted: 16 May 2016 / Published: 20 May 2016
(This article belongs to the Special Issue Information: Meanings and Interpretations)

Abstract

Entropy has been widely employed as a measure of variability in problems such as machine learning and signal processing. In this paper, we provide new insights into the behavior of entropy as a measure of multivariate variability. The relationships between multivariate entropy (joint or total marginal) and traditional measures of multivariate variability, such as total dispersion and generalized variance, are investigated. It is shown that, in the jointly Gaussian case, the joint entropy (or entropy power) is equivalent to the generalized variance, the total marginal entropy is equivalent to the geometric mean of the marginal variances, and the total marginal entropy power is equivalent to the total dispersion. The smoothed multivariate entropy (joint or total marginal) and the kernel density estimation (KDE)-based entropy estimator (with finite samples) are also studied; under certain conditions, these become approximately equivalent to the total dispersion (or a total dispersion estimator), regardless of the data distribution.
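The Gaussian-case equivalences stated above follow from the closed form of the differential entropy of a multivariate normal, h(X) = ½ log((2πe)^d |Σ|). The sketch below (not the authors' code; variable names are illustrative) verifies them numerically for a random positive-definite covariance: the joint entropy power equals the d-th root of the generalized variance det(Σ), and the total marginal entropy power equals the total dispersion trace(Σ).

```python
import numpy as np

d = 3
rng = np.random.default_rng(0)
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)            # a positive-definite covariance

# Joint differential entropy of N(0, Sigma):
#   h(X) = 0.5 * log((2*pi*e)^d * det(Sigma))
h_joint = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(Sigma))

# Joint entropy power N(X) = exp(2*h/d) / (2*pi*e): for the Gaussian case
# this is exactly the d-th root of the generalized variance det(Sigma).
entropy_power = np.exp(2 * h_joint / d) / (2 * np.pi * np.e)
assert np.isclose(entropy_power, np.linalg.det(Sigma) ** (1 / d))

# Marginal entropies h(X_i) = 0.5 * log(2*pi*e * sigma_i^2); summing the
# marginal entropy powers exp(2*h_i) / (2*pi*e) = sigma_i^2 recovers the
# total dispersion trace(Sigma).
var = np.diag(Sigma)
h_marg = 0.5 * np.log(2 * np.pi * np.e * var)
total_marg_power = np.sum(np.exp(2 * h_marg) / (2 * np.pi * np.e))
assert np.isclose(total_marg_power, np.trace(Sigma))

# The average marginal entropy likewise maps to the geometric mean of the
# marginal variances.
geo_mean_var = np.exp(np.mean(np.log(var)))
assert np.isclose(np.exp(2 * np.mean(h_marg)) / (2 * np.pi * np.e),
                  geo_mean_var)
```

Because the assertions hold for any positive-definite Σ, the check illustrates why, for Gaussian data, entropy-based and variance-based variability measures carry the same information up to a monotone transformation.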
Keywords: entropy; smoothed entropy; multivariate variability; generalized variance; total dispersion
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Chen, B.; Wang, J.; Zhao, H.; Principe, J.C. Insights into Entropy as a Measure of Multivariate Variability. Entropy 2016, 18, 196.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland