Divergence from, and Convergence to, Uniformity of Probability Density Quantiles
Abstract

We demonstrate that questions of convergence and divergence regarding the shapes of distributions can be addressed in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has been shown earlier that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The pdQs are densities of continuous distributions with a common domain, the unit interval, which facilitates metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information conserved under the pdQ mapping, we apply the mapping repeatedly and find that further applications are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed-point theorems are established with elementary probabilistic arguments and illustrated by examples.
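The constructions in the abstract can be illustrated numerically. A minimal sketch (not from the paper; grid size, truncation of the unit interval, and the use of the standard normal as the example family are our own choices): the pdQ of a density f with quantile function Q is f(Q(u)) renormalized to integrate to 1 on (0, 1), and the KL divergence from uniformity of a density g on (0, 1) is ∫ g log g du. Iterating the mapping on the standard normal shows the divergence from uniformity shrinking, consistent with the entropy-increasing behavior described above.

```python
import numpy as np
from scipy import stats

# Interior grid on (0, 1); endpoints are excluded because the normal
# quantile function diverges there.
n = 20001
u = np.linspace(0.0, 1.0, n + 2)[1:-1]
du = u[1] - u[0]

def kl_from_uniform(g, du):
    """KL(g || Uniform(0,1)) = integral of g(u) log g(u) du, by Riemann sum."""
    return np.sum(g * np.log(g)) * du

def pdq_map(g, u, du):
    """One application of the pdQ mapping to a density g on (0, 1):
    normalize g(G^{-1}(u)), where G is the CDF of g."""
    G = np.cumsum(g) * du          # CDF of g on the grid
    Q = np.interp(u, G, u)         # quantile function by numerical inversion
    gq = np.interp(Q, u, g)        # g(Q(u))
    return gq / (np.sum(gq) * du)  # renormalize to integrate to 1

# pdQ of the standard normal: f(Q(u)), renormalized over (0, 1).
fq = stats.norm.pdf(stats.norm.ppf(u))
pdq1 = fq / (np.sum(fq) * du)

kl1 = kl_from_uniform(pdq1, du)   # analytically (ln 2)/2 - 1/4 ~ 0.0966
pdq2 = pdq_map(pdq1, u, du)       # second application of the mapping
kl2 = kl_from_uniform(pdq2, du)   # smaller: iterates flatten toward uniformity
print(kl1, kl2)
```

For the standard normal the first divergence has a closed form, (ln 2)/2 − 1/4, which the grid approximation recovers; the second application already lies strictly closer to uniformity.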
- Supplementary File 1: Supplementary (ZIP, 1 KB)
Staudte, R.G.; Xia, A. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles. Entropy 2018, 20, 317.