# Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

## Abstract


**Keywords:** $L_r$ norm; fixed point theorem; Kullback–Leibler divergence; relative entropy; semi-metric; uniformity testing

## 1. Introduction

## 2. Divergences between Probability Density Quantiles

#### 2.1. Definitions

**Definition 1.**

**Remark 1.**

#### 2.2. Divergence Map

**Definition 2.**

**Remark 2.**

#### 2.3. Uniformity Testing

## 3. Convergence of Density Shapes to Uniformity via Fixed Point Theorems

#### 3.1. Conditions for Convergence to Uniformity

**Definition 3.**

**Proposition 1.**

- (i) ${\mu}_{m+2}<\infty $,
- (ii) ${\mu}_{j}<\infty $ for all $1\le j\le m+2$,
- (iii) ${\kappa}_{j}<\infty $ and ${\kappa}_{j}=\frac{{\mu}_{j}{\mu}_{j+2}}{{\mu}_{j+1}^{2}}$ for all $1\le j\le m$.
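As a concrete numerical illustration (ours, not from the paper, under the assumption that ${\mu}_{j}$ denotes a $j$-th raw moment), take the unit exponential distribution, whose moments are ${\mu}_{j}=j!$. All moments are finite, so (i) and (ii) hold for every $m$, and the ratio in (iii) has the closed form ${\kappa}_{j}=(j+2)/(j+1)$:

```python
from math import factorial

# Illustration (ours): mu_j taken as the raw moments of the unit
# exponential distribution, mu_j = j!.  Every moment is finite, and
# kappa_j = mu_j * mu_{j+2} / mu_{j+1}^2 = (j+2)/(j+1), which is finite.

def mu(j: int) -> int:
    """j-th raw moment of the unit exponential distribution."""
    return factorial(j)

def kappa(j: int) -> float:
    """Moment ratio kappa_j = mu_j * mu_{j+2} / mu_{j+1}^2."""
    return mu(j) * mu(j + 2) / mu(j + 1) ** 2

for j in range(1, 6):
    print(j, kappa(j), (j + 2) / (j + 1))   # the two columns agree
```

The exponential is only a convenient stand-in here; any distribution with finite moments up to order $m+2$ would serve.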

**Proof of Proposition 1.**

**Proposition 2.**

**Proof of Proposition 2.**

**Theorem 1.**

- (i) ${f}^{n*}\stackrel{{L}_{2}}{\longrightarrow}1$;
- (ii) For all $r>0$, ${f}^{n*}\stackrel{{L}_{r}}{\longrightarrow}1$;
- (iii) $\frac{{\mu}_{n}{\mu}_{n+2}}{{\mu}_{n+1}^{2}}\to 1$ as $n\to \infty $.
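The limit in (iii) is easy to probe numerically. As an illustration of ours (not from the paper), take ${\mu}_{n}$ to be the raw moments of the triangular density $f(x)=2x$ on $[0,1]$, so ${\mu}_{n}=2/(n+2)$ and the ratio equals $(n+3)^2/\{(n+2)(n+4)\}=1+1/\{(n+2)(n+4)\}$, which decreases to 1:

```python
# Numerical illustration of condition (iii): mu_n * mu_{n+2} / mu_{n+1}^2 -> 1.
# Here mu_n = 2/(n+2) are the raw moments of f(x) = 2x on [0, 1], an assumed
# example chosen only because its moments are available in closed form.

def mu(n: int) -> float:
    """n-th raw moment of the density f(x) = 2x on [0, 1]."""
    return 2.0 / (n + 2)

def ratio(n: int) -> float:
    """The ratio mu_n * mu_{n+2} / mu_{n+1}^2 appearing in condition (iii)."""
    return mu(n) * mu(n + 2) / mu(n + 1) ** 2

for n in (1, 10, 100, 1000):
    print(n, ratio(n))   # decreases monotonically toward 1
```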

**Remark 3.**

**Proof of Theorem 1.**

**Theorem 2.**

- (i) for all $n\ge 0$, $\parallel {f}^{(n+1)*}\parallel \le \parallel {f}^{n*}\parallel $, with equality if and only if ${f}^{n*}\sim \mathcal{U}$;
- (ii) ${f}^{n*}\stackrel{{L}_{r}}{\longrightarrow}1$ for all $r>0$.
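The monotone decrease of the norms in (i) can be observed numerically. The sketch below is ours, under the assumption that the pdQ of a density $f$ on $[0,1]$ with quantile function $Q=F^{-1}$ is ${f}^{*}(u)=f(Q(u))/\int_0^1 f^2$ (the denominator normalises ${f}^{*}$ to a density, since $\int_0^1 f(Q(u))\,du=\int_0^1 f^2$). Starting from the exponential's pdQ $2(1-u)$ of Table 1, the $L_2$ norms of the iterates decrease toward 1, the norm of the uniform density:

```python
import numpy as np

def trap(y: np.ndarray, x: np.ndarray) -> float:
    """Trapezoidal integral of samples y over the grid x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def pdq_step(f: np.ndarray, u: np.ndarray) -> np.ndarray:
    """One application of the pdQ map on a grid (a numerical sketch).
    Assumes f samples a density on [0, 1] at the points u; returns samples of
    f*(u) = f(Q(u)) / ||f||_2^2, with Q = F^{-1}, so that f* integrates to 1."""
    F = np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(u))))
    F /= F[-1]                               # CDF of f on the grid
    Q = np.interp(u, F, u)                   # quantile function Q = F^{-1}
    return np.interp(Q, u, f) / trap(f**2, u)

u = np.linspace(0.0, 1.0, 4001)
f = 2.0 * (1.0 - u)                          # pdQ of the exponential (Table 1)
norms = []
for _ in range(4):
    norms.append(np.sqrt(trap(f**2, u)))     # L2 norm of the current iterate
    f = pdq_step(f, u)
print([round(v, 4) for v in norms])          # strictly decreasing toward 1
```

Analytically the iterates here are $(1+a_n)(1-u)^{a_n}$ with $a_{n+1}=a_n/(1+a_n)$, so $a_n\to 0$ and the shapes flatten to the uniform, matching what the grid computation shows.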

**Proof of Theorem 2.**

**Theorem 3.**

- (i*) ${f}^{n*}\stackrel{\mathbb{P}}{\longrightarrow}1$;
- (ii) For all $r>0$, ${f}^{n*}\stackrel{{L}_{r}}{\longrightarrow}1$;
- (iii) ${\mu}_{n}{\mu}_{n+2}{\mu}_{n+1}^{-2}\to 1$ as $n\to \infty $.

**Proof of Theorem 3.**

**Remark 4.**

#### 3.2. Examples of Convergence to Uniformity

**Example 1.**

**Example 2.**

**Example 3.**

**Example 4.**

**Example 5.**

**Example 6.**

## 4. Discussion

## 5. Conclusions

## Supplementary Materials

## Author Contributions

## Acknowledgments

## Conflicts of Interest


**Figure 1.** Divergence from uniformity. The locus of points $({s}_{1},{s}_{2})=(\sqrt{I(\mathcal{U}:{f}^{*})},\sqrt{I({f}^{*}:\mathcal{U})})$ is shown for various standard families. The large disks correspond to the symmetric families: uniform (purple), normal (red) and Cauchy (black). The crosses correspond to the asymmetric distributions: exponential (blue) and standard lognormal (red). More details are given in Section 2.2.

**Table 1.** Quantiles of some distributions, their pdQs and divergences. In general, we denote ${x}_{u}=Q(u)={F}^{-1}(u)$, but for the normal $F=\Phi $ with density $\varphi $, we write ${z}_{u}={\Phi}^{-1}(u)$. The Laplace quantile function is given only for $u\le 0.5$, but it is symmetric about $u=0.5$. Lognormal($\sigma $) denotes the lognormal distribution with shape parameter $\sigma $. The quantile function for the Pareto is for the Type II distribution with shape parameter $a$; the pdQ is the same for Type I and Type II Pareto models.

| Distribution | $Q(u)$ | ${f}^{*}(u)$ | $I(\mathcal{U}:{f}^{*})$ | $I({f}^{*}:\mathcal{U})$ | $J(\mathcal{U},{f}^{*})$ |
|---|---|---|---|---|---|
| Normal | ${z}_{u}$ | $2\sqrt{\pi}\,\varphi ({z}_{u})$ | 0.153 | 0.097 | 0.250 |
| Logistic | $\ln (u/(1-u))$ | $6u(1-u)$ | 0.208 | 0.125 | 0.333 |
| Laplace | $\ln (2u),\ u\le 0.5$ | $2\min \{u,1-u\}$ | 0.307 | 0.193 | 0.500 |
| ${t}_{2}$ | $\frac{2u-1}{{\{2u(1-u)\}}^{1/2}}$ | $\frac{{2}^{7}{\{u(1-u)\}}^{3/2}}{3\pi}$ | 0.391 | 0.200 | 0.591 |
| Cauchy | $\tan \{\pi (u-0.5)\}$ | $2{\sin}^{2}(\pi u)$ | 0.693 | 0.307 | 1.000 |
| Exponential | $-\ln (1-u)$ | $2(1-u)$ | 0.307 | 0.193 | 0.500 |
| Gumbel | $-\ln (-\ln (u))$ | $-4u\ln (u)$ | 0.191 | 0.116 | 0.307 |
| Lognormal ($\sigma $) | ${e}^{\sigma {z}_{u}}$ | $\frac{2\sqrt{\pi}}{{e}^{{\sigma}^{2}/4}}\,\varphi ({z}_{u})\,{e}^{-\sigma {z}_{u}}$ | $\frac{{\sigma}^{2}}{4}+\frac{1}{2}-\ln \sqrt{2}$ | – | $\frac{1}{4}+\frac{3{\sigma}^{2}}{8}$ |
| Pareto ($a$) | ${(1-u)}^{-1/a}$ | $\frac{2a+1}{a}{(1-u)}^{1+1/a}$ | $\frac{1+a}{a}-\ln (2+\frac{1}{a})$ | – | $\frac{{(1+a)}^{2}}{a(1+2a)}$ |
| Power ($b$) | ${u}^{1/b}$ | $\frac{2b-1}{b}{u}^{1-1/b}$ | $\frac{b-1}{b}-\ln (2-\frac{1}{b})$ | – | $\frac{{(b-1)}^{2}}{b(2b-1)}$ |
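The tabulated divergences can be checked by quadrature. As a sketch of ours (assuming, consistently with the table, that $I(\mathcal{U}:{f}^{*})=-\int_0^1 \ln {f}^{*}(u)\,du$ and $I({f}^{*}:\mathcal{U})=\int_0^1 {f}^{*}(u)\ln {f}^{*}(u)\,du$, with $J$ their sum), the logistic row follows from its pdQ ${f}^{*}(u)=6u(1-u)$:

```python
import numpy as np

# Midpoint-rule check of the logistic row of Table 1 (our own verification
# sketch).  For the logistic pdQ f*(u) = 6u(1-u):
#   I(U : f*) = -int_0^1 ln f*(u) du   = 2 - ln 6   ~ 0.208
#   I(f* : U) =  int_0^1 f* ln f* du   = ln 6 - 5/3 ~ 0.125
N = 1_000_000
u = (np.arange(N) + 0.5) / N       # midpoints avoid the endpoint singularity
f = 6 * u * (1 - u)                # logistic pdQ from Table 1

I_u_f = -np.mean(np.log(f))        # divergence I(U : f*)
I_f_u = np.mean(f * np.log(f))     # divergence I(f* : U)
J = I_u_f + I_f_u                  # symmetrized divergence J(U, f*)
print(round(I_u_f, 3), round(I_f_u, 3), round(J, 3))
```

The midpoint rule handles the integrable logarithmic singularities at $u=0$ and $u=1$; the computed values agree with the table entries 0.208, 0.125 and 0.333 to three decimals.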

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Staudte, R.G.; Xia, A. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles. *Entropy* **2018**, *20*, 317.
https://doi.org/10.3390/e20050317
