Bayesian Estimation of Variance-Based Information Measures and Their Application to Testing Uniformity
Abstract
1. Introduction
2. Bayesian Estimation of Varentropy and Varextropy
3. Computational Algorithms
- (i) Generate a sample from the finite-sum approximation of the Dirichlet process prior.
- (ii) Generate a sample from the finite-sum approximation of the corresponding posterior Dirichlet process.
- (iii) Compute the varentropy and the varextropy of the generated sample as specified in Lemma 2.
- (iv) Repeat steps (i)–(iii) to obtain a sample of r values of each quantity. As r increases, the average of the generated r values becomes the estimator of the varentropy and the varextropy.
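The steps above can be sketched in a short Monte Carlo routine. This is a minimal illustration, not the paper's Lemma 2: it assumes a Dirichlet process prior with base measure G = U(0, 1), approximated by a truncated stick-breaking sum, and uses a histogram plug-in for the density at the atoms; the truncation level N, the number of replications r, and all function names are illustrative choices. The posterior draw in step (ii) has the same form with updated parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sample(a, base_sampler, N, rng):
    """One approximate draw from DP(a, G): truncated stick-breaking
    weights paired with N i.i.d. atoms from the base measure G."""
    betas = rng.beta(1.0, a, size=N)
    w = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    w /= w.sum()                      # renormalize the truncated weights
    atoms = base_sampler(N)
    return w, atoms

def varentropy_varextropy(w, atoms, bins=20):
    """Plug-in varentropy Var(-log f(X)) and varextropy Var(-f(X)/2)
    of the discrete DP realization, with a histogram density on (0, 1)."""
    dens, edges = np.histogram(atoms, bins=bins, range=(0.0, 1.0),
                               weights=w, density=True)
    idx = np.clip(np.digitize(atoms, edges) - 1, 0, bins - 1)
    f = np.maximum(dens[idx], 1e-12)  # density evaluated at the atoms
    log_f = np.log(f)
    ve = np.sum(w * log_f**2) - np.sum(w * log_f)**2
    vj = 0.25 * (np.sum(w * f**2) - np.sum(w * f)**2)
    return ve, vj

# steps (i)-(iv): average r replications of the two plug-in quantities
a, N, r = 1.0, 500, 200
uniform_base = lambda n: rng.uniform(0.0, 1.0, size=n)
est = np.mean([varentropy_varextropy(*dp_sample(a, uniform_base, N, rng))
               for _ in range(r)], axis=0)
```

Averaging over r replications is what turns the per-draw plug-in values into the Bayesian point estimates described in step (iv).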
4. Testing for Uniformity
- (i) The varentropy of X equals zero if and only if f(x) = 1 for all x in (0, 1) (i.e., f is the PDF of the uniform random variable on (0, 1)).
- (ii) The varextropy of X equals zero if and only if f(x) = 1 for all x in (0, 1) (i.e., f is the PDF of the uniform random variable on (0, 1)).
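The characterization can be checked directly, assuming the standard definitions VE(X) = Var(−log f(X)) and VJ(X) = Var(−f(X)/2): for U(0, 1), f(X) ≡ 1, so both −log f(X) and −f(X)/2 are constants and both variances vanish, whereas for Exp(1), −log f(X) = X gives VE = Var(X) = 1 and VJ = (1/4)Var(e^{−X}) = 1/48. A quick Monte Carlo illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# U(0,1): f(x) = 1 on (0,1), so -log f(X) and -f(X)/2 are constant
u = rng.uniform(size=n)
ve_unif = np.var(-np.log(np.ones_like(u)))   # variance of a constant: 0
vj_unif = np.var(-0.5 * np.ones_like(u))     # variance of a constant: 0

# Exp(1): f(x) = exp(-x), so -log f(X) = X with Var(X) = 1,
# and Var(-f(X)/2) = (1/4) * Var(e^{-X}) = (1/4)(1/3 - 1/4) = 1/48
x = rng.exponential(size=n)
ve_exp = np.var(x)
vj_exp = np.var(-0.5 * np.exp(-x))
```

Both measures are therefore zero exactly at the uniform model, which is what makes them usable as discrepancy statistics for a uniformity test.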
5. Examples
5.1. Simulation Study
5.2. Real Data Examples
- 162, 200, 271, 320, 393, 508, 539, 629, 706, 778, 884, 1003, 1101, 1182, 1463, 1603, 1984, 2355, 2880.
- 156, 162, 168, 182, 186, 190, 190, 196, 202, 210, 214, 220, 226, 230, 230, 236, 236, 242, 246, 270.
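To apply a uniformity test to raw data such as the first list above (mileage-type failure data), the observations must first be mapped to (0, 1). The sketch below is illustrative only: it uses an exponential probability-integral transform with the sample mean plugged in (an assumption, not necessarily the paper's transform) and a Vasicek-style m-spacings proxy for −log f, not the Bayesian estimator of Section 2; the window width m is a conventional default.

```python
import numpy as np

def varentropy_spacings(y, m=None):
    """Spacings-type estimate of Var(-log f(Y)) for y in (0, 1):
    -log f at the i-th order statistic is proxied by the log of the
    scaled m-spacing n (y_(i+m) - y_(i-m)) / (2m)."""
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    if m is None:
        m = int(round(np.sqrt(n) + 0.5))   # conventional m ~ sqrt(n)
    lo = np.clip(np.arange(n) - m, 0, n - 1)
    hi = np.clip(np.arange(n) + m, 0, n - 1)
    spac = np.maximum(y[hi] - y[lo], 1e-12)
    log_f_hat = -np.log(n * spac / (2 * m))
    return np.var(log_f_hat)

# First dataset above, mapped to (0, 1) via a fitted Exp(mean) CDF
# (the exponential transform is an illustrative assumption)
data = np.array([162, 200, 271, 320, 393, 508, 539, 629, 706, 778, 884,
                 1003, 1101, 1182, 1463, 1603, 1984, 2355, 2880], float)
y = 1.0 - np.exp(-data / data.mean())
stat = varentropy_spacings(y)
```

A small value of the statistic is consistent with the transformed data being uniform on (0, 1); in the paper the decision is made by comparing against the simulated thresholds in the tables below.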
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Distribution | n | m | Est. (RMSE) | Est. (RMSE) | Est. (RMSE)
---|---|---|---|---|---
U(0, 1) | 20 | 4 | 0.2055 (0.2248) | 0.1749 (0.218) | 0.1099 (0.1465)
 | 50 | 7 | 0.1266 (0.1334) | 0.1059 (0.1214) | 0.0641 (0.0769)
 | 100 | 10 | 0.0919 (0.0954) | 0.0783 (0.0873) | 0.0469 (0.0536)
Exp(1) | 20 | 4 | 0.7446 (0.4699) | 0.6663 (0.5105) | 0.6586 (0.5258)
 | 50 | 7 | 0.9025 (0.3148) | 0.7929 (0.3454) | 0.8344 (0.3396)
 | 100 | 10 | 0.9671 (0.2315) | 0.8641 (0.2533) | 0.9203 (0.2417)
 | 20 | 4 | 0.2646 (0.2621) | 0.1221 (0.3885) | 0.1620 (0.3574)
 | 50 | 7 | 0.3051 (0.2245) | 0.1449 (0.3633) | 0.2351 (0.2864)
 | 100 | 10 | 0.3666 (0.1677) | 0.2136 (0.2971) | 0.319 (0.2076)
Distribution | n | m | Est. (RMSE) | Est. (RMSE) | Est. (RMSE)
---|---|---|---|---|---
U(0, 1) | 20 | 4 | 0.0828 (0.1159) | 0.1698 (0.3129) | 0.0502 (0.0875)
 | 50 | 7 | 0.0404 (0.0460) | 0.0642 (0.0943) | 0.0217 (0.0291)
 | 100 | 10 | 0.0262 (0.0283) | 0.0352 (0.0460) | 0.0135 (0.0166)
Exp(1) | 20 | 4 | 0.0589 (0.1451) | 0.1419 (0.5516) | 0.0505 (0.1466)
 | 50 | 7 | 0.0370 (0.0291) | 0.0621 (0.0696) | 0.0311 (0.0238)
 | 100 | 10 | 0.0337 (0.0197) | 0.0514 (0.0435) | 0.0294 (0.0161)
 | 20 | 4 | 0.0063 (0.0065) | 0.0062 (0.0136) | 0.0043 (0.0047)
 | 50 | 7 | 0.0053 (0.0034) | 0.0033 (0.0022) | 0.0041 (0.0026)
 | 100 | 10 | 0.0069 (0.0044) | 0.0037 (0.0018) | 0.0043 (0.0021)
G | a | |
---|---|---|---
 | 0.01 | 0.8758 | 0.0663
 | 5 | 2.4917 | 0.0255
 | 0.01 | 0.8877 | 0.0667
 | 5 | 3.2328 | 0.0073
 | 0.01 | 0.8932 | 0.0681
 | 5 | 7.7335 | 0.0215
 | 0.001 | 0.8587 | 0.0657
 | 5 | 1.6113 | 0.0725
 | 0.01 | 0.8699 | 0.0668
 | 5 | 0.7525 | 0.0776
n | Est. of varentropy | Est. of varextropy | Threshold of varentropy | Threshold of varextropy | p-Value
---|---|---|---|---|---
20 | 0.1573 | 0.0423 | 0.2461 | 0.0955 | 0.2953
 | 0.2494 | 0.0839 | | | 0.0442
 | 0.1322 | 0.0337 | | | 0.7472
 | 0.1553 | 0.0421 | | | 0.4646
 | 0.2934 | 0.1163 | | | 0.1469
 | 0.3082 | 0.1687 | | | 0.1958
 | 0.5543 | 0.6276 | | | 0.0745
 | 0.1529 | 0.0484 | | | 0.6307
 | 0.2604 | 0.1169 | | | 0.1300
 | 1.1438 | 2.6605 | | | 0.0000
 | 0.1944 | 0.3892 | | | 0.0000
50 | 0.2378 | 0.1044 | 0.1469 | 0.0475 | 0.0073
 | 0.6105 | 0.7342 | | | 0.0000
 | 0.1176 | 0.0391 | | | 0.4119
 | 0.2564 | 0.1421 | | | 0.0857
 | 0.7699 | 1.6625 | | | 0.0048
 | 0.2449 | 0.1046 | | | 0.0430
 | 0.54717 | 0.4817 | | | 0.0045
 | 0.0876 | 0.0232 | | | 0.3349
 | 0.3819 | 0.2553 | | | 0.0013
 | 1.0189 | 1.6964 | | | 0.0000
 | 0.1361 | 0.2849 | | | 0.0000
100 | 0.2073 | 0.0873 | 0.1042 | 0.03107 | 0.0019
 | 0.5906 | 0.8231 | | | 0.0000
 | 0.0958 | 0.0329 | | | 0.6042
 | 0.2602 | 0.1665 | | | 0.0764
 | 0.9135 | 3.0551 | | | 0.0011
 | 0.2877 | 0.1323 | | | 0.0080
 | 0.7074 | 0.8427 | | | 0.0001
 | 0.0755 | 0.0193 | | | 0.2672
 | 0.3507 | 0.1926 | | | 0.0000
 | 1.3669 | 4.1632 | | | 0.0000
 | 0.1006 | 0.1951 | | | 0.0000
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Al-Labadi, L.; Hamlili, M.; Ly, A. Bayesian Estimation of Variance-Based Information Measures and Their Application to Testing Uniformity. Axioms 2023, 12, 887. https://doi.org/10.3390/axioms12090887