Open Access Article
Entropy 2018, 20(5), 317; https://doi.org/10.3390/e20050317

Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

R.G. Staudte 1,†,* and A. Xia 2,†
1 Department of Mathematics and Statistics, La Trobe University, Bundoora, VIC 3086, Australia
2 School of Mathematics and Statistics, University of Melbourne, Parkville, VIC 3010, Australia
† These authors contributed equally to this work.
* Author to whom correspondence should be addressed.
Received: 7 March 2018 / Revised: 10 April 2018 / Accepted: 19 April 2018 / Published: 25 April 2018
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)
Abstract

We demonstrate that questions of convergence and divergence regarding shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The pdQs are densities of continuous distributions sharing a common domain, the unit interval, which facilitates metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information conserved under the pdQ mapping, we apply the mapping repeatedly and find that further applications are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed-point theorems are established with elementary probabilistic arguments and illustrated by examples.
Keywords: convergence in Lr norm; fixed point theorem; Kullback–Leibler divergence; relative entropy; semi-metric; uniformity testing
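The pdQ construction and its divergence from uniformity, as described in the abstract, can be illustrated numerically. The sketch below is not from the article itself: it assumes the standard construction f*(u) ∝ f(Q(u)) on the unit interval, uses the exponential distribution as a worked starting point (f(x) = e^(−x), Q(p) = −log(1 − p), so f(Q(u)) = 1 − u and the normalized pdQ is 2(1 − u)), and the function names `pdq` and `kl_from_uniform` are illustrative choices, not the authors' code.

```python
import numpy as np

u = np.linspace(0.0, 1.0, 100_001)  # grid on the common domain [0, 1]

def trap(y, x):
    """Trapezoidal integral (written out to avoid version-specific NumPy names)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def pdq(g):
    """One application of the pdQ map to a density g sampled on the grid u."""
    # numerical CDF of g, forced to run from 0 to 1
    G = np.concatenate(([0.0], np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(u))))
    G /= G[-1]
    q = np.interp(u, G, u)             # quantile function G^{-1}(u)
    comp = np.interp(q, u, g)          # composition g(G^{-1}(u))
    return comp / trap(comp, u)        # renormalize to a density on (0, 1)

def kl_from_uniform(g):
    """Kullback-Leibler divergence of a density g on (0, 1) from Uniform(0, 1)."""
    integrand = np.where(g > 0, g * np.log(np.where(g > 0, g, 1.0)), 0.0)
    return trap(integrand, u)

# Start from the pdQ of the exponential distribution, f*(u) = 2(1 - u),
# and apply the pdQ map repeatedly.
g = 2.0 * (1.0 - u)
kls = []
for _ in range(5):
    kls.append(kl_from_uniform(g))
    g = pdq(g)
print(kls)  # strictly decreasing: repeated pdQ maps move toward uniformity
```

For this starting density the divergences can be checked in closed form: the first two are log 2 − 1/2 ≈ 0.193 and log(3/2) − 1/3 ≈ 0.072, and the sequence decreases toward 0, consistent with the abstract's observation that further applications of the pdQ map are entropy increasing.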
Figures: graphical abstract.

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Supplementary material

Staudte, R.G.; Xia, A. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles. Entropy 2018, 20, 317.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.