Different Classical and Bayesian Methods of Estimation of the Power Log-Logistic Distribution with Applications
Abstract
1. Introduction
2. Model Description and Statistical Properties
- If and then the PLL distribution reduces to a two-parameter Fisk distribution with parameters
- ∼
- if Y∼ then where is a constant.
- ∼
- Again, ∼
2.1. Log-Convexity
- is log-convex → is log-convex,
- is closed under convolution.
- is closed under scaling transformations.
- the density is either unimodal or monotone.
- Hazard rate function properties
- Case 1. If then, from Equation (11), the numerator is negative; therefore, the hazard function is strictly decreasing.
- Case 2. If then, for small x, the numerator , which implies that the hazard increases. After , the numerator , i.e., the hazard decreases. Therefore, the hazard is upside-down (unimodal).
- Case 3. The hazard function is increasing for suitable parameter ranges when is sufficiently large.
2.2. Moments
2.3. Incomplete Moments
2.4. Order Statistics
- If and in Equation (14), we obtain the k-th moment of the order statistics for the Fisk distribution with parameters and .
- If in Equation (14), we obtain the k-th moment of the order statistics for the log-logistic distribution.
3. Method of Estimation
3.1. Maximum Likelihood Estimation
- (a)
- When the model is correctly specified.
- (b)
- Moderate to large sample sizes.
- (c)
- Data are complete (no censoring/truncation complications).
- (d)
- Interest lies in asymptotic efficiency and standard inferential tools (confidence intervals, likelihood ratio tests). Moreover, MLE provides asymptotically efficient and consistent estimates under regularity conditions and serves as a reference for comparing alternative estimators. A random sample of size n selected from the PLL distribution with density function is denoted = . The likelihood of the observed data is given by the expression below. The log-likelihood of the PLL distribution is given as
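As a concrete illustration of the numerical optimization this section describes, the sketch below maximizes a PLL-type log-likelihood with `scipy.optimize.minimize`. The parameterization F(x) = [(x/lam)^b / (1 + (x/lam)^b)]^a is an assumed exponentiated log-logistic form adopted only for this sketch, and the names `a`, `b`, `lam` are illustrative, not the paper's notation.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed (illustrative) PLL density: F(x) = [ (x/lam)^b / (1+(x/lam)^b) ]^a,
# so log f(x) = log(a*b/lam) + (a*b-1)*log(x/lam) - (a+1)*log(1+(x/lam)^b).
def pll_logpdf(x, a, b, lam):
    t = x / lam
    return (np.log(a) + np.log(b) - np.log(lam)
            + (a * b - 1.0) * np.log(t)
            - (a + 1.0) * np.log1p(t ** b))

def nll(theta, x):
    # Log-reparameterisation keeps all three parameters strictly positive.
    a, b, lam = np.exp(theta)
    return -np.sum(pll_logpdf(x, a, b, lam))

# Simulate by inversion: F^{-1}(u) = lam * (u^{1/a} / (1 - u^{1/a}))^{1/b}.
rng = np.random.default_rng(1)
a0, b0, lam0 = 1.5, 2.0, 1.0
v = rng.uniform(size=2000) ** (1.0 / a0)
sample = lam0 * (v / (1.0 - v)) ** (1.0 / b0)

res = minimize(nll, np.zeros(3), args=(sample,), method="Nelder-Mead")
a_hat, b_hat, lam_hat = np.exp(res.x)
```

In practice the fitted negative log-likelihood should be at least as small as the value at the data-generating parameters, which gives a simple sanity check on convergence.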
3.2. Maximum Product Spacing Distance Estimation
- (a)
- Small sample sizes.
- (b)
- Models with heavy tails or extreme-value behavior.
- (c)
- Situations where MLE fails to converge or produces boundary estimates.
- (d)
- Distributions with flat or multimodal likelihoods. Most importantly, MPSE often exhibits better numerical stability and can outperform MLE in small samples or non-regular cases. The estimators produced by the MPSE approach are asymptotically normal, consistent, and as efficient as the MLEs. The invariance property of the MPSEs was examined by [22], who demonstrated that it matches that of the MLEs. Let us define the spacings for the PLL distribution as follows, where and .
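The MPSE maximizes the mean log-spacing between consecutive CDF values at the order statistics. A minimal sketch follows, using an assumed exponentiated log-logistic CDF F(x) = [(x/lam)^b / (1 + (x/lam)^b)]^a as a stand-in for the paper's PLL parameterization:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed PLL CDF for this sketch (illustrative parameterisation).
def pll_cdf(x, a, b, lam):
    t = (x / lam) ** b
    return (t / (1.0 + t)) ** a

def mps_objective(theta, x_sorted):
    """Negative mean log-spacing; the MPS estimator minimises this."""
    a, b, lam = np.exp(theta)
    F = pll_cdf(x_sorted, a, b, lam)
    # Spacings D_1..D_{n+1} between consecutive CDF values, with F(x_0)=0
    # and F(x_{n+1})=1; clip guards against numerically zero spacings.
    D = np.diff(np.concatenate(([0.0], F, [1.0])))
    D = np.clip(D, 1e-300, None)
    return -np.mean(np.log(D))

# Synthetic sorted sample from the assumed model (a=1.5, b=2, lam=1).
rng = np.random.default_rng(2)
v = rng.uniform(size=500) ** (1.0 / 1.5)
x = np.sort((v / (1.0 - v)) ** 0.5)

res = minimize(mps_objective, np.zeros(3), args=(x,), method="Nelder-Mead")
a_hat, b_hat, lam_hat = np.exp(res.x)
```

The clipping step is exactly why MPSE tends to stay numerically stable where the likelihood degenerates: tied or extreme observations shrink a spacing toward zero without sending the whole objective to infinity at interior parameter values.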
3.3. Weighted Least Square Estimation
- (a)
- When variance of errors differs across quantiles.
- (b)
- When tail fitting is particularly important.
- (c)
- Moderate sample sizes where efficiency improvements over LSE are desired.
- (d)
- It provides improved efficiency over LSE and better performance in the tails by incorporating variance-based weights. The WLSEs of the parameters , and are obtained by minimizing the function below with respect to , and , where , and are given in Equations (20)–(22).
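A sketch of this weighted minimization follows. The weights (n+1)^2 (n+2) / (i (n - i + 1)) are the standard inverse-variance weights for the order statistics of the uniform distribution; whether the paper uses exactly this choice is an assumption here, as is the stand-in CDF F(x) = [(x/lam)^b / (1 + (x/lam)^b)]^a.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed PLL CDF for this sketch (illustrative parameterisation).
def pll_cdf(x, a, b, lam):
    t = (x / lam) ** b
    return (t / (1.0 + t)) ** a

def wls_objective(theta, x_sorted):
    """Weighted least-squares distance between the fitted CDF at the order
    statistics and their plotting positions i/(n+1)."""
    a, b, lam = np.exp(theta)
    n = x_sorted.size
    i = np.arange(1, n + 1)
    p = i / (n + 1.0)                                      # E[F(X_(i))]
    w = (n + 1.0) ** 2 * (n + 2.0) / (i * (n - i + 1.0))   # 1 / Var[F(X_(i))]
    return np.sum(w * (pll_cdf(x_sorted, a, b, lam) - p) ** 2)

# Synthetic sorted sample from the assumed model (a=1.5, b=2, lam=1).
rng = np.random.default_rng(3)
v = rng.uniform(size=500) ** (1.0 / 1.5)
x = np.sort((v / (1.0 - v)) ** 0.5)

res = minimize(wls_objective, np.zeros(3), args=(x,), method="Nelder-Mead")
a_hat, b_hat, lam_hat = np.exp(res.x)
```

Because the weights grow large for the smallest and largest order statistics, the fit is pulled toward matching the tails, which is the motivation stated in item (b) above.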
3.4. Approximate Confidence Intervals
3.5. Bootstrap Confidence Intervals
4. Bayesian Estimation
MCMC Method
- Details of the MCMC procedure
- Step 1: Choose initial values as , , and .
- Step 2: Generate , , and from Normal distribution as , , and .
- Step 3: Compute , , and .
- Step 4: Generate , , and independently from the Uniform(0,1) distribution.
- Step 5: Set ==, and =
- Step 6: Set .
- Step 7: Repeat Steps 2–6 N times to generate , , and .
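The steps above amount to a component-wise random-walk Metropolis sampler. The sketch below implements that scheme against a stand-in log-posterior (independent Gamma(2,1) kernels chosen purely for illustration, since the PLL posterior is not reproduced here); the proposal standard deviation, chain length, and burn-in are likewise illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in target: independent Gamma(2,1) components, log density up to a
# constant: log(theta_j) - theta_j on theta_j > 0 (an assumption for the demo).
def log_post(theta):
    if np.any(theta <= 0):
        return -np.inf
    return np.sum(np.log(theta) - theta)

def mh_sampler(n_iter=8000, sd=0.5, init=(1.0, 1.0, 1.0)):
    cur = np.array(init, float)
    chain = np.empty((n_iter, 3))
    for k in range(n_iter):
        for j in range(3):
            prop = cur.copy()
            prop[j] = rng.normal(cur[j], sd)            # Step 2: normal proposal
            ratio = np.exp(min(0.0, log_post(prop) - log_post(cur)))  # Step 3
            if rng.uniform() < ratio:                   # Steps 4-5: accept/reject
                cur = prop
        chain[k] = cur                                  # Step 6: record draw
    return chain                                        # Step 7: N draws

chain = mh_sampler()
post_mean = chain[2000:].mean(axis=0)   # posterior means after burn-in
```

For the Gamma(2,1) stand-in the component posterior means should be close to 2, which provides a quick correctness check before swapping in the actual PLL posterior.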
5. Simulation Study
- From Table 2, we clearly observe that the ABs and MSEs decrease when the sample size increases.
- Based on ABs and MSEs, in general, the Bayes estimates give better results than the classical point estimates. Among the classical estimates, MPSE performs better than the other classical estimates.
- From Table 3, it can be noticed that the AWs decrease as n increases.
- Based on AWs and CPs, the HPD credible intervals perform better than the other interval estimates. Among the confidence intervals, Boot-t outperforms Boot-p and the ACIs.
- In particular, for the Bayesian inference, the following observations are made (based on the simulation study):
- 1.
- The choice of priors has a significant impact on both bias and computational time.
- 2.
- As the sample size increases, bias and MSE decrease while the number of required proposals increases as expected.
- (a)
- Choice 2 (C2):
- (b)
- Choice 3 (C3):
- (c)
- Choice 4 (C4):
Comment on the Convergence of MCMC Procedure
6. Real-Life Data Application
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
- On the uniqueness and existence of MLEs
- From Equation (17), and . This implies that is a decreasing function with a unique root.
- From Equation (18), and . This implies that is a decreasing function with a unique root.
- From Equation (19), and . This implies that is a decreasing function with a unique root.
References
- Johnson, N.L.; Kotz, S.; Balakrishnan, N. Continuous Univariate Distributions; John Wiley and Sons: Albany, NY, USA, 1995; Volume 2.
- De Santana, T.V.F.; Ortega, E.M.; Cordeiro, G.M.; Silva, G.O. The Kumaraswamy-log-logistic distribution. J. Stat. Theory Appl. 2012, 11, 265–291.
- Gui, W. Marshall-Olkin extended log-logistic distribution and its application in minification processes. Appl. Math. Sci. 2013, 7, 3947–3961.
- Tahir, M.H.; Mansoor, M.; Zubair, M.; Hamedani, G. McDonald log-logistic distribution with an application to breast cancer data. J. Stat. Theory Appl. 2014, 13, 65–82.
- Bennett, S. Log-logistic regression models for survival data. J. R. Stat. Soc. Ser. C Appl. Stat. 1983, 32, 165–171.
- Zheng, X.; Chiang, J.Y.; Tsai, T.R.; Wang, S. Estimating the failure rate of the log-logistic distribution by smooth adaptive and bias-correction methods. Comput. Ind. Eng. 2021, 156, 107188.
- Granzotto, D.C.T.; Louzada, F. The transmuted log-logistic distribution: Modeling, inference, and an application to a polled tabapua race time up to first calving data. Commun. Stat.-Theory Methods 2015, 44, 3387–3402.
- Lemonte, A.J. The beta log-logistic distribution. Braz. J. Probab. Stat. 2014, 28, 313–332.
- Aldahlan, M.A. Alpha power transformed log-logistic distribution with application to breaking stress data. Adv. Math. Phys. 2020, 2020, 1–9.
- Guure, C.B. Inference on the loglogistic model with right censored data. Austin. Biom. Biostat. 2015, 2, 1015.
- Adepoju, A.A.; Elbarkawy, M.A.; Ishaq, A.I.; Singh, N.S.S.; Daud, H.; Suleiman, A.A.; Almetwally, E.M.; Elgarhy, M. A Flexible Extension of the Log-Logistic Distribution with Application to Cancer Data. Int. J. 2025, 14, 627.
- Alfaer, N.M.; Gemeay, A.M.; Aljohani, H.M.; Afify, A.Z. The extended log-logistic distribution: Inference and actuarial applications. Mathematics 2021, 9, 1386.
- Al-Shomrani, A.A.; Shawky, A.I.; Arif, O.H.; Aslam, M. Log-logistic distribution for survival data analysis using MCMC. Springer Plus 2016, 5, 1–16.
- Afify, A.Z.; Hussein, E.A.; Alnssyan, B.; Mahran, H.A. The extended log-logistic distribution: Properties, inference, and applications in medicine and geology. J. Stat. Appl. Probab. 2023, 12, 1155–1580.
- Yang, G.; Liu, D.; Wang, J.; Xie, M.G. Meta-analysis framework for exact inferences with application to the analysis of rare events. Biometrics 2016, 72, 1378–1386.
- Fan, Z.; Liu, D.; Chen, Y.; Zhang, N. Something out of nothing? The influence of double-zero studies in meta-analysis of adverse events in clinical trials. Stat. Biosci. 2024, 1–19.
- Ashkar, F.; Mahdi, S. Fitting the log-logistic distribution by generalized moments. J. Hydrol. 2006, 328, 694–703.
- Galton, F. Enquiries into Human Faculty and Its Development; Macmillan: London, UK, 1883.
- Moors, J.J.A. A quantile alternative for Kurtosis. J. R. Stat. Soc. Ser. D 1988, 37, 25–32.
- Arnold, B.C.; Balakrishnan, N.; Nagaraja, H.N. A First Course in Order Statistics; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2008.
- Cheng, R.; Amin, N. Maximum Product of Spacings Estimation with Application to the Lognormal Distribution; Mathematical Report 79-1; Department of Mathematics, UWIST: Cardiff, Wales, 1979.
- Anatolyev, S.; Kosenok, G. An alternative to maximum likelihood based on spacings. Econom. Theory 2005, 21, 472–476.
- Kundu, D.; Pradhan, B. Bayesian inference and life testing plans for generalized exponential distribution. Sci. China Ser. A Math. 2009, 52, 1373–1388.
- Dutta, S.; Kayal, S. Estimation and prediction for Burr type III distribution based on unified progressive hybrid censoring scheme. J. Appl. Stat. 2024, 51, 1–33.
- Giannone, D.; Lenza, M.; Primiceri, G.E. Prior selection for vector autoregressions. Rev. Econ. Stat. 2015, 97, 436–451.
- Gelman, A.; Rubin, D.B. [Practical Markov chain Monte Carlo]: Rejoinder: Replication without contrition. Stat. Sci. 1992, 7, 503–511.
- Dutta, S.; Dey, S.; Kayal, S. Bayesian survival analysis of logistic exponential distribution for adaptive progressive Type-II censored data. Comput. Stat. 2024, 39, 2109–2155.
- Mäkeläinen, T.; Schmidt, K.; Styan, G.P. On the existence and uniqueness of the maximum likelihood estimate of a vector-valued parameter in fixed-size samples. Ann. Stat. 1981, 9, 758–767.









| | | | Mean | Variance | Skewness | Kurtosis |
|---|---|---|---|---|---|---|
| 0.5 | 1.5 | 3.5 | 1.161420 | 0.098500 | 2.125250 | 10.361600 |
| | | 2 | 1.260910 | 0.116110 | 2.125080 | 10.361570 |
| | | 2.5 | 1.343920 | 0.131900 | 2.125020 | 10.361720 |
| | | 3 | 1.415790 | 0.146380 | 2.125050 | 10.361610 |
| | | 3.5 | 1.479539 | 0.159865 | 2.124856 | 10.361731 |
| | | 4.5 | 1.589683 | 0.184554 | 2.124805 | 10.361928 |
| | | 5.5 | 1.683490 | 0.206978 | 2.124704 | 10.361964 |
| | | 6 | 1.725867 | 0.217528 | 2.124943 | 10.361529 |
| | | 10 | 1.997068 | 0.291261 | 2.125026 | 10.361725 |
| 1 | 1.5 | 4.5 | 1.188500 | 0.285220 | 6.176897 | 29.556462 |
| | | 2 | 1.266960 | 0.324120 | 6.176608 | 29.556169 |
| | | 2.5 | 1.331370 | 0.357910 | 6.176768 | 29.556373 |
| | | 3 | 1.386420 | 0.388130 | 6.176549 | 29.556093 |
| | | 3.5 | 1.434742 | 0.415653 | 6.176398 | 29.556052 |
| | | 4.5 | 1.517149 | 0.464771 | 6.176614 | 29.556172 |
| | | 5.5 | 1.586335 | 0.508127 | 6.176443 | 29.555964 |
| | | 6 | 1.617307 | 0.528161 | 6.176690 | 29.556364 |
| | | 10 | 1.811724 | 0.662777 | 6.176467 | 29.556176 |
| 1.2 | 1.5 | 5.5 | 1.165660 | 0.262170 | 8.869790 | 52.534350 |
| | | 2 | 1.228250 | 0.291090 | 8.869680 | 52.534350 |
| | | 2.5 | 1.279110 | 0.315690 | 8.869720 | 52.534360 |
| | | 3 | 1.322220 | 0.337330 | 8.869710 | 52.534230 |
| | | 3.5 | 1.359808 | 0.356786 | 8.869724 | 52.534315 |
| | | 4.5 | 1.423384 | 0.390927 | 8.869724 | 52.534350 |
| | | 5.5 | 1.476276 | 0.420520 | 8.869874 | 52.534521 |
| | | 6 | 1.499817 | 0.434030 | 8.869891 | 52.534626 |
| | | 10 | 1.645790 | 0.522637 | 8.869831 | 52.534575 |
| 1.4 | 1.5 | 6 | 1.172090 | 0.313660 | 9.650770 | 62.019420 |
| | | 2 | 1.229660 | 0.345230 | 9.650490 | 62.018780 |
| | | 2.5 | 1.276260 | 0.371890 | 9.650700 | 62.019290 |
| | | 3 | 1.315630 | 0.395190 | 9.650550 | 62.018820 |
| | | 3.5 | 1.349876 | 0.416032 | 9.650528 | 62.018825 |
| | | 4.5 | 1.407617 | 0.452386 | 9.650626 | 62.019082 |
| | | 5.5 | 1.455491 | 0.483681 | 9.650621 | 62.019040 |
| | | 6 | 1.476752 | 0.497916 | 9.650737 | 62.019315 |
| | | 10 | 1.607987 | 0.590342 | 9.650496 | 62.018961 |
| 2 | 1.5 | 10 | 1.113193 | 0.193720 | 12.294730 | 113.01472 |
| | | 2 | 1.145682 | 0.205195 | 12.294950 | 113.01528 |
| | | 2.5 | 1.171535 | 0.214559 | 12.294890 | 113.01514 |
| | | 3 | 1.193090 | 0.222528 | 12.294960 | 113.01530 |
| | | 3.5 | 1.211624 | 0.229496 | 12.294897 | 113.01525 |
| | | 4.5 | 1.242460 | 0.241325 | 12.294695 | 113.01448 |
| | | 5.5 | 1.267644 | 0.251209 | 12.294799 | 113.01479 |
| | | 6 | 1.278723 | 0.255616 | 12.294738 | 113.01455 |
| | | 10 | 1.345740 | 0.283114 | 12.294952 | 113.01538 |
| Parameters | n | MLE | | | MPSE | | | WLSE | | | Bayes | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| (1.5, 0.5, 0.5) | 20 | 0.1861 | 0.0887 | 0.0802 | 0.1725 | 0.0844 | 0.0783 | 0.1920 | 0.0905 | 0.0823 | 0.1051 | 0.0535 | 0.0472 |
| | | (0.0544) | (0.0143) | (0.0113) | (0.0515) | (0.0128) | (0.0102) | (0.0565) | (0.0158) | (0.0122) | (0.0283) | (0.0082) | (0.0061) |
| | 30 | 0.1493 | 0.0676 | 0.0638 | 0.1386 | 0.0634 | 0.0607 | 0.1541 | 0.0691 | 0.0657 | 0.0982 | 0.0427 | 0.0409 |
| | | (0.0349) | (0.0088) | (0.0072) | (0.0302) | (0.0075) | (0.0066) | (0.0370) | (0.0095) | (0.0079) | (0.0201) | (0.0059) | (0.0048) |
| | 50 | 0.1185 | 0.0552 | 0.0501 | 0.1092 | 0.0523 | 0.0485 | 0.1243 | 0.0591 | 0.0536 | 0.0753 | 0.0352 | 0.0329 |
| | | (0.0228) | (0.0057) | (0.0046) | (0.0203) | (0.0051) | (0.0042) | (0.0241) | (0.0060) | (0.0048) | (0.0156) | (0.0032) | (0.0028) |
| | 100 | 0.0856 | 0.0392 | 0.0364 | 0.0822 | 0.0369 | 0.0341 | 0.0894 | 0.0411 | 0.0395 | 0.0510 | 0.0225 | 0.0218 |
| | | (0.0179) | (0.0037) | (0.0033) | (0.0165) | (0.0035) | (0.0031) | (0.0195) | (0.0042) | (0.0040) | (0.0103) | (0.0025) | (0.0024) |
| | 200 | 0.0592 | 0.0319 | 0.0308 | 0.0575 | 0.0310 | 0.0301 | 0.0627 | 0.0330 | 0.0319 | 0.0352 | 0.0175 | 0.0168 |
| | | (0.0093) | (0.0020) | (0.0019) | (0.0089) | (0.0019) | (0.0018) | (0.0101) | (0.0023) | (0.0021) | (0.0068) | (0.0012) | (0.0011) |
| (1.2, 0.75, 0.5) | 20 | 0.1581 | 0.0581 | 0.0393 | 0.1532 | 0.0565 | 0.0378 | 0.1627 | 0.0615 | 0.0408 | 0.0963 | 0.0395 | 0.0227 |
| | | (0.0358) | (0.0054) | (0.0032) | (0.0334) | (0.0050) | (0.0030) | (0.0362) | (0.0058) | (0.0036) | (0.0261) | (0.0036) | (0.0020) |
| | 30 | 0.1534 | 0.0565 | 0.0378 | 0.1503 | 0.0543 | 0.0365 | 0.1593 | 0.0597 | 0.0396 | 0.0942 | 0.0378 | 0.0220 |
| | | (0.0342) | (0.0051) | (0.0030) | (0.0322) | (0.0048) | (0.0029) | (0.0350) | (0.0056) | (0.0035) | (0.0245) | (0.0034) | (0.0019) |
| | 50 | 0.1309 | 0.0510 | 0.0329 | 0.1459 | 0.0503 | 0.0320 | 0.1367 | 0.0568 | 0.0380 | 0.0905 | 0.0349 | 0.0207 |
| | | (0.0310) | (0.0046) | (0.0027) | (0.0303) | (0.0045) | (0.0026) | (0.0322) | (0.0054) | (0.0033) | (0.0219) | (0.0030) | (0.0018) |
| | 100 | 0.1267 | 0.0491 | 0.0296 | 0.1314 | 0.0495 | 0.0290 | 0.1324 | 0.0525 | 0.0331 | 0.0864 | 0.0318 | 0.0198 |
| | | (0.0305) | (0.0045) | (0.0026) | (0.0298) | (0.0044) | (0.0025) | (0.0314) | (0.0054) | (0.0031) | (0.0204) | (0.0027) | (0.0017) |
| | 200 | 0.1198 | 0.0455 | 0.0272 | 0.1163 | 0.0443 | 0.0266 | 0.1224 | 0.0480 | 0.0294 | 0.0796 | 0.0284 | 0.0171 |
| | | (0.0296) | (0.0041) | (0.0024) | (0.0275) | (0.0042) | (0.0024) | (0.0302) | (0.0050) | (0.0029) | (0.0183) | (0.0025) | (0.0016) |
| (1.5, 0.5, 1.25) | 20 | 0.5132 | 0.2927 | 0.1257 | 0.4639 | 0.2836 | 0.1198 | 0.5427 | 0.2981 | 0.1295 | 0.2681 | 0.1575 | 0.0793 |
| | | (0.3565) | (0.1838) | (0.0312) | (0.3150) | (0.1796) | (0.0301) | (0.3781) | (0.1892) | (0.0351) | (0.1759) | (0.0965) | (0.0210) |
| | 30 | 0.4934 | 0.2850 | 0.1209 | 0.4425 | 0.2791 | 0.1133 | 0.5384 | 0.2937 | 0.1240 | 0.2473 | 0.1498 | 0.0757 |
| | | (0.2961) | (0.1562) | (0.0285) | (0.2773) | (0.1510) | (0.0268) | (0.3259) | (0.1772) | (0.0324) | (0.1494) | (0.0826) | (0.0165) |
| | 50 | 0.4382 | 0.2696 | 0.1134 | 0.4151 | 0.2682 | 0.1074 | 0.4909 | 0.2783 | 0.1178 | 0.2130 | 0.1285 | 0.0696 |
| | | (0.2608) | (0.1431) | (0.0246) | (0.2581) | (0.1403) | (0.0239) | (0.2934) | (0.1581) | (0.0295) | (0.1358) | (0.0785) | (0.0151) |
| | 100 | 0.3969 | 0.2387 | 0.1028 | 0.3843 | 0.2329 | 0.1012 | 0.4458 | 0.2620 | 0.1064 | 0.1965 | 0.1143 | 0.0627 |
| | | (0.2010) | (0.1268) | (0.0210) | (0.1935) | (0.1215) | (0.0202) | (0.2459) | (0.1318) | (0.0233) | (0.1072) | (0.0645) | (0.0133) |
| | 200 | 0.3162 | 0.1864 | 0.0897 | 0.2968 | 0.1767 | 0.0835 | 0.3512 | 0.2034 | 0.0939 | 0.1214 | 0.0859 | 0.0416 |
| | | (0.1254) | (0.0961) | (0.0162) | (0.1189) | (0.0924) | (0.0150) | (0.1352) | (0.1023) | (0.0184) | (0.0759) | (0.0461) | (0.0095) |
| Parameters | n | ACI | | | Boot-p | | | Boot-t | | | HPD | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| (1.5, 0.5, 0.5) | 20 | 0.4159 | 0.2141 | 0.2035 | 0.3682 | 0.1995 | 0.1967 | 0.3451 | 0.1927 | 0.1893 | 0.1750 | 0.1054 | 0.1015 |
| | | (0.9026) | (0.8957) | (0.9063) | (0.9078) | (0.9052) | (0.9148) | (0.9102) | (0.9076) | (0.9175) | (0.9385) | (0.9330) | (0.9372) |
| | 30 | 0.4086 | 0.2115 | 0.2010 | 0.3599 | 0.1941 | 0.1923 | 0.3389 | 0.1894 | 0.1872 | 0.1706 | 0.1031 | 0.0989 |
| | | (0.9069) | (0.9028) | (0.9105) | (0.9110) | (0.9089) | (0.9173) | (0.9135) | (0.9097) | (0.9199) | (0.9404) | (0.9385) | (0.9396) |
| | 50 | 0.3753 | 0.1956 | 0.1934 | 0.3267 | 0.1835 | 0.1810 | 0.3236 | 0.1821 | 0.1803 | 0.1583 | 0.0969 | 0.0925 |
| | | (0.9113) | (0.9057) | (0.9149) | (0.9178) | (0.9144) | (0.9204) | (0.9196) | (0.9131) | (0.9217) | (0.9423) | (0.9417) | (0.9429) |
| | 100 | 0.2981 | 0.1358 | 0.1315 | 0.2869 | 0.1327 | 0.1302 | 0.2831 | 0.1305 | 0.1296 | 0.0984 | 0.0783 | 0.0754 |
| | | (0.9235) | (0.9108) | (0.9212) | (0.9256) | (0.9180) | (0.9237) | (0.9272) | (0.9204) | (0.9255) | (0.9441) | (0.9435) | (0.9447) |
| | 200 | 0.2769 | 0.1358 | 0.1315 | 0.2869 | 0.1327 | 0.1302 | 0.2831 | 0.1305 | 0.1296 | 0.0984 | 0.0783 | 0.0754 |
| | | (0.9274) | (0.9159) | (0.9288) | (0.9297) | (0.9213) | (0.9261) | (0.9308) | (0.9245) | (0.9281) | (0.9469) | (0.9460) | (0.9461) |
| (1.2, 0.75, 0.5) | 20 | 0.3357 | 0.1852 | 0.1804 | 0.3309 | 0.1831 | 0.1786 | 0.3282 | 0.1812 | 0.1770 | 0.1689 | 0.1010 | 0.0972 |
| | | (0.9105) | (0.9056) | (0.9124) | (0.9172) | (0.9105) | (0.9163) | (0.9204) | (0.9138) | (0.9195) | (0.9402) | (0.9367) | (0.9395) |
| | 30 | 0.3154 | 0.1796 | 0.1763 | 0.3123 | 0.1785 | 0.1750 | 0.3081 | 0.1763 | 0.1738 | 0.1554 | 0.0986 | 0.0953 |
| | | (0.9165) | (0.9094) | (0.9156) | (0.9202) | (0.9135) | (0.9198) | (0.9230) | (0.9172) | (0.9236) | (0.9419) | (0.9398) | (0.9416) |
| | 50 | 0.2969 | 0.1695 | 0.1637 | 0.2892 | 0.1624 | 0.1595 | 0.2853 | 0.1605 | 0.1582 | 0.1461 | 0.0946 | 0.0921 |
| | | (0.9280) | (0.9161) | (0.9243) | (0.9324) | (0.9208) | (0.9275) | (0.9350) | (0.9265) | (0.9304) | (0.9443) | (0.9439) | (0.9458) |
| | 100 | 0.2697 | 0.1585 | 0.1550 | 0.2636 | 0.1556 | 0.1534 | 0.2583 | 0.1529 | 0.1510 | 0.1249 | 0.0894 | 0.0875 |
| | | (0.9324) | (0.9208) | (0.9292) | (0.9355) | (0.9261) | (0.9310) | (0.9375) | (0.9314) | (0.9339) | (0.9468) | (0.9459) | (0.9476) |
| | 200 | 0.2387 | 0.1431 | 0.1412 | 0.2343 | 0.1382 | 0.1367 | 0.2312 | 0.1358 | 0.1326 | 0.1093 | 0.0827 | 0.0803 |
| | | (0.9387) | (0.9281) | (0.9353) | (0.9403) | (0.9336) | (0.9389) | (0.9423) | (0.9357) | (0.9402) | (0.9489) | (0.9477) | (0.9490) |
| (1.5, 0.5, 1.25) | 20 | 0.5635 | 0.2419 | 0.2275 | 0.5240 | 0.2370 | 0.2204 | 0.5034 | 0.2328 | 0.2168 | 0.2752 | 0.1284 | 0.1237 |
| | | (0.9082) | (0.9143) | (0.9168) | (0.9115) | (0.9196) | (0.9205) | (0.9157) | (0.9231) | (0.9249) | (0.9362) | (0.9385) | (0.9403) |
| | 30 | 0.5025 | 0.2293 | 0.2124 | 0.4864 | 0.2231 | 0.2096 | 0.4750 | 0.2189 | 0.2027 | 0.2346 | 0.1161 | 0.1133 |
| | | (0.9165) | (0.9220) | (0.9252) | (0.9208) | (0.9264) | (0.9283) | (0.9251) | (0.9289) | (0.9306) | (0.9421) | (0.9439) | (0.9444) |
| | 50 | 0.4126 | 0.2008 | 0.1985 | 0.4035 | 0.1996 | 0.1939 | 0.3948 | 0.1925 | 0.1899 | 0.2010 | 0.0989 | 0.0964 |
| | | (0.9275) | (0.9296) | (0.9323) | (0.9313) | (0.9327) | (0.9344) | (0.9340) | (0.9352) | (0.9368) | (0.9443) | (0.9452) | (0.9468) |
| | 100 | 0.3650 | 0.1795 | 0.1742 | 0.3461 | 0.1735 | 0.1692 | 0.3386 | 0.1704 | 0.1658 | 0.1752 | 0.0891 | 0.0865 |
| | | (0.9302) | (0.9321) | (0.9344) | (0.9352) | (0.9364) | (0.9368) | (0.9381) | (0.9392) | (0.9396) | (0.9467) | (0.9475) | (0.9472) |
| | 200 | 0.3209 | 0.1632 | 0.1596 | 0.3055 | 0.1586 | 0.1545 | 0.3020 | 0.1527 | 0.1495 | 0.1461 | 0.0820 | 0.0782 |
| | | (0.9351) | (0.9382) | (0.9396) | (0.9388) | (0.9405) | (0.9421) | (0.9418) | (0.9422) | (0.9430) | (0.9475) | (0.9482) | (0.9493) |
| Sample Size | | | | | | |
|---|---|---|---|---|---|---|
| | P.M. | 95% HPD | P.M. | 95% HPD | P.M. | 95% HPD |
| 20 | 1.234 | (1.054, 3.436) | 0.423 | (0.273, 1.781) | 0.398 | (0.277, 1.638) |
| 30 | 1.412 | (1.078, 3.409) | 0.431 | (0.277, 1.546) | 0.403 | (0.283, 1.537) |
| 50 | 1.424 | (1.112, 3.232) | 0.435 | (0.279, 1.541) | 0.412 | (0.289, 1.531) |
| 100 | 1.428 | (1.117, 2.789) | 0.438 | (0.281, 1.538) | 0.422 | (0.291, 1.473) |
| 200 | 1.453 | (1.242, 2.421) | 0.447 | (0.282, 1.532) | 0.439 | (0.292, 1.470) |
| Sample Size | | | | | | |
|---|---|---|---|---|---|---|
| | P.M. | 95% HPD | P.M. | 95% HPD | P.M. | 95% HPD |
| 20 | 1.226 | (1.064, 3.463) | 0.418 | (0.274, 1.692) | 0.386 | (0.273, 1.664) |
| 30 | 1.417 | (1.073, 3.412) | 0.427 | (0.278, 1.564) | 0.398 | (0.268, 1.637) |
| 50 | 1.422 | (1.114, 3.238) | 0.430 | (0.281, 1.516) | 0.413 | (0.292, 1.584) |
| 100 | 1.425 | (1.116, 2.987) | 0.432 | (0.283, 1.502) | 0.426 | (0.293, 1.488) |
| 200 | 1.448 | (1.224, 2.412) | 0.442 | (0.278, 1.496) | 0.435 | (0.295, 1.468) |
| Sample Size | | | | | | |
|---|---|---|---|---|---|---|
| | P.M. | 95% HPD | P.M. | 95% HPD | P.M. | 95% HPD |
| 20 | 1.238 | (1.112, 3.263) | 0.413 | (0.267, 1.689) | 0.381 | (0.271, 1.643) |
| 30 | 1.421 | (1.071, 3.210) | 0.429 | (0.268, 1.564) | 0.401 | (0.286, 1.573) |
| 50 | 1.428 | (1.121, 3.132) | 0.434 | (0.272, 1.546) | 0.407 | (0.283, 1.513) |
| 100 | 1.436 | (1.217, 2.789) | 0.432 | (0.274, 1.528) | 0.418 | (0.290, 1.479) |
| 200 | 1.452 | (1.246, 2.321) | 0.436 | (0.283, 1.517) | 0.429 | (0.293, 1.468) |
| Chain Number | | | |
|---|---|---|---|
| 1 | 1.07 | 0.83 | 1.08 |
| 2 | 1.05 | 0.91 | 1.04 |
| 3 | 1.01 | 1.03 | 0.93 |
| 4 | 0.81 | 1.01 | 0.89 |
| 5 | 0.83 | 0.95 | 0.92 |
| 6 | 0.87 | 0.94 | 0.86 |
| 7 | 0.76 | 0.83 | 0.92 |
| 8 | 0.82 | 0.85 | 0.85 |
| 9 | 0.97 | 0.88 | 0.87 |
| 10 | 0.96 | 0.95 | 0.92 |
| Chain Number | | | |
|---|---|---|---|
| 1 | 1.13 | 1.12 | 0.81 |
| 2 | 1.05 | 1.07 | 0.84 |
| 3 | 1.01 | 1.05 | 0.83 |
| 4 | 0.96 | 1.02 | 0.77 |
| 5 | 1.01 | 0.96 | 0.87 |
| 6 | 0.97 | 1.01 | 1.06 |
| 7 | 0.89 | 0.93 | 0.98 |
| 8 | 0.91 | 0.95 | 0.84 |
| 9 | 0.92 | 0.82 | 0.86 |
| 10 | 0.96 | 0.83 | 0.89 |
| Chain Number | | | |
|---|---|---|---|
| 1 | 1.07 | 0.83 | 1.08 |
| 2 | 1.05 | 0.91 | 1.04 |
| 3 | 1.01 | 1.03 | 0.93 |
| 4 | 0.81 | 1.01 | 0.89 |
| 5 | 0.83 | 0.95 | 0.92 |
| 6 | 0.87 | 0.94 | 0.86 |
| 7 | 0.76 | 0.83 | 0.92 |
| 8 | 0.82 | 0.85 | 0.85 |
| 9 | 0.97 | 0.88 | 0.87 |
| 10 | 0.96 | 0.95 | 0.90 |
| Chain Number | | | |
|---|---|---|---|
| 1 | 1.11 | 1.07 | 1.08 |
| 2 | 1.12 | 1.09 | 1.10 |
| 3 | 1.01 | 0.95 | 1.02 |
| 4 | 0.78 | 0.89 | 0.84 |
| 5 | 1.02 | 1.02 | 0.84 |
| 6 | 0.84 | 0.91 | 0.86 |
| 7 | 0.77 | 0.81 | 0.92 |
| 8 | 0.85 | 0.82 | 0.79 |
| 9 | 0.90 | 0.89 | 0.97 |
| 10 | 0.92 | 0.93 | 0.94 |
The data set consists of the following observations: 5, 11, 21, 31, 46, 75, 98, 122, 145, 65, 196, 224, 245, 293, 321, 330, 350, 420.
| Method | Estimate | SE | Estimate | SE | Estimate | SE |
|---|---|---|---|---|---|---|
| MLE | 3.1750 | 0.2846 | 0.1746 | 0.0537 | 0.2443 | 0.0981 |
| MPSE | 3.2019 | 0.1951 | 0.1653 | 0.0466 | 0.2205 | 0.0827 |
| WLSE | 3.1425 | 0.3301 | 0.1883 | 0.0624 | 0.2162 | 0.1028 |
| Bayes | 3.2455 | 0.1304 | 0.1563 | 0.0369 | 0.1751 | 0.0526 |
| Method | Interval | Length | Interval | Length | Interval | Length |
|---|---|---|---|---|---|---|
| ACI | (2.6172, 3.7328) | 1.1156 | (0.0693, 0.2799) | 0.2106 | (0.0520, 0.4365) | 0.3845 |
| Boot-p | (2.7051, 3.5963) | 0.8912 | (0.0762, 0.2548) | 0.1786 | (0.0721, 0.4019) | 0.3298 |
| Boot-t | (2.7529, 3.5082) | 0.7553 | (0.0844, 0.2427) | 0.6013 | (0.0952, 0.3865) | 0.2913 |
| Bayes | (3.0249, 3.5217) | 0.4968 | (0.1208, 0.1961) | 0.0753 | (0.1127, 0.2659) | 0.1532 |
| Distributions | AIC | BIC | p-Value | K-S Distance |
|---|---|---|---|---|
| PLL | 313.5353 | 306.1403 | 0.8531 | 0.0873 |
| LL | 512.9789 | 495.5547 | 0.4678 | 0.5829 |
| L | 3015.691 | 3015.472 | 0.0000 | 0.99186 |
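The fit statistics in the table above (AIC, BIC, and the Kolmogorov-Smirnov distance with its p-value) can be computed as in the sketch below. The data, the log-logistic CDF, and the plugged-in parameter values here are all illustrative stand-ins, not the paper's data set or fitted estimates.

```python
import numpy as np
from scipy.stats import kstest

# Illustrative log-logistic CDF: F(x) = (x/lam)^b / (1 + (x/lam)^b).
def loglogistic_cdf(x, b, lam):
    t = (x / lam) ** b
    return t / (1.0 + t)

# Synthetic data from the same model (b=2, lam=1), via inverse-CDF sampling.
rng = np.random.default_rng(5)
u = rng.uniform(size=100)
data = (u / (1.0 - u)) ** 0.5

b_hat, lam_hat = 2.0, 1.0      # stand-ins for fitted MLEs
# Log-likelihood under the log-logistic pdf f(x) = (b/lam)(x/lam)^{b-1}/(1+(x/lam)^b)^2.
loglik = np.sum(np.log(b_hat / lam_hat)
                + (b_hat - 1.0) * np.log(data / lam_hat)
                - 2.0 * np.log1p((data / lam_hat) ** b_hat))

k = 2                                       # number of fitted parameters
aic = 2 * k - 2 * loglik
bic = k * np.log(data.size) - 2 * loglik
ks = kstest(data, lambda x: loglogistic_cdf(x, b_hat, lam_hat))
```

`ks.statistic` is the K-S distance reported in the table and `ks.pvalue` the corresponding p-value; for the paper's comparison the PLL CDF and its estimates would be substituted in.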
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Ghosh, I.; Kumar, D.; Dutta, S. Different Classical and Bayesian Methods of Estimation of the Power Log-Logistic Distribution with Applications. Axioms 2026, 15, 285. https://doi.org/10.3390/axioms15040285

