A New Exponential-Type Model Under Unified Progressive Hybrid Censoring: Computational Inference and Its Applications
Abstract
1. Introduction
| Algorithm 1 The MCMC iterative procedure for the model parameters, RF, and HRF |
- Various inference frameworks for the NOT-Exp distribution using diverse datasets generated from the UT2-PH plan are proposed.
- Two asymptotic confidence interval approaches, including normal-based and log-normal-based, are systematically developed and compared for both parameters and reliability time functions, addressing positivity and finite-sample limitations.
- A comprehensive Bayesian estimation procedure is proposed using independent gamma priors and Metropolis–Hastings MCMC algorithms, allowing for the estimation of parameters, reliability function (RF), and HR function (HRF) along with credible and HPD intervals.
- An extensive simulation study evaluates estimator accuracy based on different metrics of precision under varying sample sizes, censoring designs, and prior specifications, providing practical guidance on optimal censoring designs.
- The study demonstrates the superior stability and efficiency of Bayesian estimators—particularly HPD intervals—over classical counterparts in heavily censored scenarios.
- Airborne variation and bank waiting time datasets represent two challenging real-world lifetime settings marked by heterogeneity, skewness, non-constant hazard behavior, and realistic censoring, making them ideal benchmarks for advanced reliability modeling.
- Empirical analyses clearly demonstrate the superiority of the NOT-Exp model over twelve competitive lifetime distributions—alpha-power exponential, Weibull exponential, Nadarajah and Haghighi, generalized exponential, and Weibull distributions, among others—consistently yielding better fit and inferential performance across both environmental and banking applications.
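The Metropolis–Hastings machinery referenced above can be illustrated with a minimal random-walk sketch. Everything here is a stand-in, not the paper's implementation: a plain exponential likelihood replaces the NOT-Exp one, and `log_post`, the Gamma(a, b) hyperparameters, and the proposal step size are illustrative choices only.

```python
import math
import random

def log_post(theta, data, a=2.0, b=1.0):
    """Hypothetical log-posterior: exponential likelihood plus a Gamma(a, b)
    prior. A plain exponential likelihood stands in for the NOT-Exp one,
    since only the MH mechanics are being illustrated."""
    if theta <= 0.0:
        return -math.inf
    loglik = len(data) * math.log(theta) - theta * sum(data)
    logprior = (a - 1.0) * math.log(theta) - b * theta
    return loglik + logprior

def metropolis_hastings(data, n_iter=5000, step=0.2, seed=1):
    """Random-walk Metropolis-Hastings chain for one positive parameter."""
    rng = random.Random(seed)
    theta, chain = 1.0, []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)  # symmetric proposal
        # accept with probability min(1, posterior ratio); U ~ (0, 1]
        if math.log(1.0 - rng.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop
        chain.append(theta)
    return chain
```

Point estimates are then posterior means of the retained draws after burn-in, and the credible/HPD bounds discussed later come from the same set of draws.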
2. The NOT-Exp Model
3. Likelihood Inference
3.1. Maximum Likelihood Estimators
3.2. Asymptotic Interval Bounds
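The two asymptotic constructions can be sketched as follows, assuming the MLE `est` and its standard error `se` (from the inverse observed Fisher information) have already been computed. The truncation-at-zero behavior of the normal-based interval and the guaranteed positivity of the log-normal-based one are visible directly.

```python
import math

Z975 = 1.959963984540054  # 97.5% standard-normal quantile (95% level)

def aci_norm(est, se):
    """ACI-NA: symmetric normal-approximation interval; the lower bound
    is truncated at zero for positive parameters."""
    return max(est - Z975 * se, 0.0), est + Z975 * se

def aci_lognorm(est, se):
    """ACI-NL: normal interval built for log(theta) via the delta method,
    then mapped back, so both bounds stay strictly positive."""
    factor = math.exp(Z975 * se / est)
    return est / factor, est * factor
```

For example, with est = 0.5 and se = 0.4 the ACI-NA lower bound collapses to zero, while ACI-NL remains strictly positive; this is why the log-normal form is often preferred for parameters near the boundary.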
4. Bayesian Estimation
5. Monte Carlo Comparisons
5.1. Simulation Framework
| Algorithm 2 Simulate UT2-PH censored datasets from NOT-Exp |
- Set-1 [Prior-A]: and ;
- Set-1 [Prior-B]: and ,
- Set-2 [Prior-A]: and ;
- Set-2 [Prior-B]: and .
- (i) Informative (Prior-A): , , and ;
- (ii) Overdispersed: , , and ;
- (iii) Weakly informative: , , and ;
- (iv) Improper: .
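As a rough sketch of the sampling step, the progressive Type-II core that UT2-PH plans build on can be simulated with the standard Balakrishnan–Sandhu uniform transformation. The NOT-Exp quantile function is replaced here by a hypothetical exponential stand-in (`rate` is illustrative), so only the censoring mechanics, not the paper's Algorithm 2, are shown.

```python
import math
import random

def prog_type2_uniform(m, scheme, rng):
    """Progressively Type-II censored uniform order statistics
    (Balakrishnan-Sandhu transformation)."""
    v = []
    for i in range(1, m + 1):
        # exponent: i + R_m + ... + R_{m-i+1}
        v.append((1.0 - rng.random()) ** (1.0 / (i + sum(scheme[m - i:]))))
    # U_(i) = 1 - V_m * V_{m-1} * ... * V_{m-i+1}, increasing in i
    return [1.0 - math.prod(v[m - i:]) for i in range(1, m + 1)]

def simulate_censored_sample(n, m, scheme, rate=0.5, seed=7):
    """Map the uniform sample through an inverse CDF. A hypothetical
    exponential quantile stands in for the NOT-Exp one here."""
    assert n == m + sum(scheme), "scheme must remove n - m units"
    rng = random.Random(seed)
    return [-math.log(1.0 - u) / rate for u in prog_type2_uniform(m, scheme, rng)]
```

Swapping in the NOT-Exp inverse CDF (and adding the unified hybrid stopping rules) turns this skeleton into a full UT2-PH generator.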
5.2. Simulation Results and Interpretations
- Across all simulated scenarios, the proposed estimation procedures demonstrate stable performance. This is reflected in consistently low values of RMSE, MRAB, and AIL, alongside CPs that remain close to the nominal 95% level.
- Increasing n or () generally improves estimation accuracy. Similarly, reducing the number of censored observations contributes to more precise estimates. Additionally, longer censoring times () tend to enhance the reliability and precision of estimates for all coefficients.
- Under Prior-B, the Bayesian estimates outperform those based on Prior-A. Overall, both Bayesian approaches produce more efficient and less biased estimates compared to their frequentist counterparts. This behavior can be explained by the fact that the hyperparameters of Prior-B provide prior distributions that are more compatible with the true parameter values used in the simulation design, leading to posterior distributions with reduced variability and improved estimation precision.
- Comparisons of the 95% interval methods reveal that:
  – Intervals constructed via the HPD (for either prior) generally provide superior coverage and shorter interval lengths than those obtained with the BCI approach. Both types of credible intervals tend to outperform the asymptotic intervals (ACI-NA and ACI-NL) in terms of reliability and precision.
  – For and , the ACI-NL results are slightly narrower and more accurate than the corresponding ACI-NA intervals.
  – For , , and , the ACI-NA approach shows marginally better performance than ACI-NL.
  – The CPs for both Bayesian (BCI/HPD) and classical (ACI-NA/ACI-NL) intervals generally achieve or exceed the nominal 95% level across most scenarios.
  – The HPD intervals consistently outperform both ACI variants in terms of AIL and CP behavior. This result is expected because HPD intervals are constructed directly from the posterior distribution and therefore fully incorporate the uncertainty arising from both the observed data and the prior information.
  – In contrast, both ACIs rely on asymptotic normal and log-normal approximations to the likelihood function, which may be less accurate for moderate sample sizes or in the presence of censoring. Consequently, HPD intervals tend to produce more efficient interval estimates in censored lifetime models.
- When the true values in NOT-Exp are increased, all point and interval estimates of the model parameters, RF, and HRF remain satisfactory, achieving the lowest RMSE, MRAB, and AIL results and the highest CP values.
- Evaluating the performance of the proposed censoring designs listed in Table 1:
  – The right schemes and provide the most efficient estimates for and .
  – The left schemes and provide the most efficient estimates for and .
  – The middle schemes and provide the most efficient estimates for .
- Overall, once the UT2-PH dataset is generated, the Bayesian point and credible interval methodologies demonstrate clear advantages over other methods. We finally recommend the MCMC framework, which delivers more efficient, stable, and reliable inference for the NOT-Exp parameters and associated reliability functions.
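The BCI and HPD constructions compared above differ only in how they summarize the same posterior draws: equal tail areas versus the shortest window. A minimal sketch (the empirical HPD method of Chen and Shao):

```python
def equal_tail_interval(draws, level=0.95):
    """BCI: percentile (equal-tailed) credible interval from posterior draws."""
    s = sorted(draws)
    n = len(s)
    return s[int((1.0 - level) / 2.0 * n)], s[min(n - 1, int((1.0 + level) / 2.0 * n))]

def hpd_interval(draws, level=0.95):
    """HPD: shortest window containing `level` of the sorted draws."""
    s = sorted(draws)
    n = len(s)
    k = max(1, int(level * n))  # number of draws inside the window
    # slide the window across the sorted draws and keep the narrowest one
    width, i = min((s[j + k - 1] - s[j], j) for j in range(n - k + 1))
    return s[i], s[i + k - 1]
```

For a skewed posterior, the HPD window slides toward the mode, which is why its AIL is typically smaller than the equal-tailed BCI's, consistent with the simulation findings above.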
6. Real-World Applications
- Toxicological Application: Variations in airborne exposure and their influence on urinary metabolite concentrations are a central focus in toxicology and environmental health research. Airborne chemicals, once inhaled, undergo absorption, distribution, metabolism, and excretion, leading to the formation of measurable metabolites in biological matrices such as urine. Understanding how fluctuations in airborne exposure translate into changes in urinary metabolite concentrations is crucial for accurately assessing human exposure, validating toxicokinetic and physiologically based pharmacokinetic (PBPK) models, and identifying dose–response relationships (see Valavanidis et al. [15] for more details). This application (hereafter, App.1) investigates the impact of varying airborne exposure on urinary metabolite concentrations for thirty individual human subjects exposed to acetone under controlled conditions. Airborne exposure was expressed in mg/m3 (milligrams per cubic meter of air), while urinary metabolite concentrations were expressed in mg/g creatinine. This dataset was first provided by Kumagai and Matsunaga [16] and reanalyzed by Peter et al. [17].
- Banking Management Application: Banking systems are designed to manage customer flow and provide financial services efficiently, but they often involve queues that lead to long waiting times before customers receive service. Bank waiting times refer to the duration customers spend from arrival until being attended by a teller or service agent, and they represent a fundamental performance indicator for evaluating service efficiency, improving customer satisfaction, and optimizing the allocation of human and operational resources (see Cowdrey et al. [18]). This application (hereafter, App.2) examines one hundred recorded waiting times (in minutes) representing the duration customers spend in a bank before being served. This dataset was presented by Ghitany et al. [19] and later reanalyzed by Alsubie [20].
| Airborne Variations | ||||||||||
| 1.5 | 1.7 | 2.1 | 2.2 | 2.4 | 2.5 | 2.6 | 3.8 | 3.8 | 4.2 | 4.3 |
| 5.6 | 6.0 | 7.0 | 7.5 | 9.3 | 9.9 | 10.2 | 10.6 | 12.3 | 12.9 | 13.7 |
| 14.1 | 17.8 | 27.6 | 31.0 | 42.0 | 45.6 | 51.9 | 91.3 | 131.8 | ||
| Bank Waiting Times | ||||||||||
| 0.8 | 0.8 | 1.3 | 1.5 | 1.8 | 1.9 | 1.9 | 2.1 | 2.6 | 2.7 | 2.9 |
| 3.1 | 3.2 | 3.3 | 3.5 | 3.6 | 4.0 | 4.1 | 4.2 | 4.2 | 4.3 | 4.3 |
| 4.4 | 4.4 | 4.6 | 4.7 | 4.7 | 4.8 | 4.9 | 4.9 | 5.0 | 5.3 | 5.5 |
| 5.7 | 5.7 | 6.1 | 6.2 | 6.2 | 6.2 | 6.3 | 6.7 | 6.9 | 7.1 | 7.1 |
| 7.1 | 7.1 | 7.4 | 7.6 | 7.7 | 8.0 | 8.2 | 8.6 | 8.6 | 8.6 | 8.8 |
| 8.8 | 8.9 | 8.9 | 9.5 | 9.6 | 9.7 | 9.8 | 10.7 | 10.9 | 11.0 | 11.0 |
| 11.1 | 11.2 | 11.2 | 11.5 | 11.9 | 12.4 | 12.5 | 12.9 | 13.0 | 13.1 | 13.3 |
| 13.6 | 13.7 | 13.9 | 14.1 | 15.4 | 15.4 | 17.3 | 17.3 | 18.1 | 18.2 | |
| 18.4 | 18.9 | 19.0 | 19.9 | 20.6 | 21.3 | 21.4 | 21.9 | 23.0 | 27.0 | 31.6 |
| 33.1 | 38.5 | |||||||||
| Author(s) | Symbol | Model |
|---|---|---|
| Bagdonavicius and Nikulin [22] | PGW | Power Generalized Weibull |
| Peng and Yan [23] | NEW | New Extended Weibull |
| Mahdavi and Kundu [24] | APE | Alpha-Power Exponential |
| Pinho et al. [25] | HEE | Harris-Extended Exponential |
| Alotaibi et al. [26] | EP | Exponentiated-Pham |
| Mudholkar and Srivastava [27] | EW | Exponentiated Weibull |
| Oguntunde et al. [28] | WE | Weibull Exponential |
| Nadarajah and Haghighi [29] | NH | Nadarajah–Haghighi |
| Gupta and Kundu [30] | GE | Generalized Exponential |
| Birnbaum and Saunders [31] | BS | Birnbaum–Saunders |
| Weibull [32] | W | Weibull |
| Johnson et al. [33] | G | Gamma |
| Model | Est. | St.E | Est. | St.E | Est. | St.E | −ℓ | AIC | CAIC | BIC | HQIC | A* | W* | KS | p-Value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| App.1 | |||||||||||||||
| NOT-Exp | 0.0121 | 0.0145 | 8.2571 | 8.3908 | 0.0162 | 0.0110 | 115.96 | 237.92 | 238.81 | 242.23 | 239.33 | 0.2152 | 0.0287 | 0.0935 | 0.9493 |
| PGW | 0.0030 | 0.0059 | 13.635 | 7.5739 | 0.0255 | 0.0155 | 116.50 | 239.09 | 240.00 | 243.46 | 240.52 | 0.2366 | 0.0313 | 0.0985 | 0.9244 |
| NEW | 4.0576 | 1.5384 | 0.4488 | 0.1520 | 0.4545 | 0.2743 | 116.90 | 239.86 | 240.76 | 244.20 | 241.27 | 0.2160 | 0.0294 | 0.0947 | 0.9451 |
| APE | - | - | 0.0440 | 0.0842 | 0.0242 | 0.0133 | 119.47 | 242.94 | 243.37 | 245.81 | 243.88 | 0.6876 | 0.1039 | 0.1203 | 0.7609 |
| HEE | 0.0136 | 0.0561 | 1.1475 | 4.5699 | 15.253 | 60.316 | 119.98 | 245.95 | 246.84 | 250.26 | 247.36 | 0.6748 | 0.1002 | 0.1255 | 0.7129 |
| EP | 0.1031 | 0.0313 | 3.8999 | 0.9714 | 62.604 | 67.090 | 117.08 | 240.16 | 241.05 | 244.46 | 241.56 | 0.3342 | 0.0446 | 0.0939 | 0.9483 |
| EW | 8.9994 | 41.509 | 21.260 | 42.862 | 0.2828 | 0.1620 | 117.02 | 240.05 | 240.94 | 244.35 | 241.45 | 0.3213 | 0.0425 | 0.0957 | 0.9373 |
| WE | 7.4202 | 3.9030 | 0.7653 | 0.0991 | 0.0040 | 0.0020 | 121.55 | 249.09 | 249.98 | 253.39 | 250.49 | 1.0355 | 0.1661 | 0.1717 | 0.3205 |
| NH | - | - | 0.5319 | 0.1268 | 0.1808 | 0.0958 | 119.51 | 243.02 | 243.45 | 245.89 | 243.96 | 0.6289 | 0.0938 | 0.1273 | 0.6968 |
| GE | - | - | 0.8110 | 0.1901 | 0.0454 | 0.0117 | 121.87 | 247.75 | 248.18 | 250.62 | 248.68 | 1.1182 | 0.1815 | 0.1970 | 0.1801 |
| BS | - | - | 1.3370 | 0.1684 | 10.301 | 1.9825 | 116.68 | 239.26 | 239.69 | 242.35 | 240.20 | 0.3533 | 0.0504 | 0.1488 | 0.4990 |
| W | - | - | 0.8202 | 0.1063 | 0.0995 | 0.0403 | 120.98 | 245.96 | 246.39 | 248.82 | 246.89 | 0.9432 | 0.1499 | 0.1603 | 0.4031 |
| G | - | - | 0.8012 | 0.1755 | 0.0422 | 0.0125 | 121.74 | 247.49 | 247.91 | 250.35 | 248.42 | 1.0969 | 0.1777 | 0.1904 | 0.2112 |
| App.2 | |||||||||||||||
| NOT-Exp | 2.6162 | 6.9501 | 2.5701 | 1.0643 | 0.1497 | 0.0270 | 317.00 | 640.00 | 640.25 | 647.81 | 643.16 | 0.1270 | 0.0170 | 0.0368 | 0.9993 |
| PGW | 0.0322 | 0.0127 | 2.0169 | 0.3881 | 0.4571 | 0.1626 | 317.07 | 640.14 | 640.39 | 647.96 | 643.30 | 0.1275 | 0.0174 | 0.0379 | 0.9988 |
| NEW | 1.7880 | 1.1060 | 1.2248 | 0.1726 | 0.0662 | 0.0361 | 317.02 | 640.04 | 640.29 | 647.86 | 643.21 | 0.1761 | 0.0238 | 0.0465 | 0.9820 |
| APE | - | - | 20.982 | 13.9661 | 0.1829 | 0.0197 | 319.04 | 642.07 | 642.20 | 647.98 | 644.18 | 0.4175 | 0.0666 | 0.0526 | 0.9452 |
| HEE | 6.2972 | 3.3510 | 0.1368 | 0.0177 | 10.969 | 8.1686 | 317.52 | 641.05 | 641.30 | 648.86 | 644.21 | 0.1361 | 0.0191 | 0.0415 | 0.9953 |
| EP | 0.3405 | 0.1025 | 1.8353 | 0.4145 | 7.6234 | 5.0187 | 317.15 | 640.30 | 640.55 | 648.12 | 643.19 | 0.1272 | 0.0173 | 0.0376 | 0.9987 |
| EW | 0.1905 | 0.1065 | 2.6747 | 1.6430 | 0.9057 | 0.2570 | 317.03 | 640.07 | 640.32 | 647.88 | 643.23 | 0.1271 | 0.0175 | 0.0384 | 0.9984 |
| WE | 18.634 | 14.842 | 1.3636 | 0.1070 | 0.0100 | 0.0047 | 319.50 | 645.00 | 645.25 | 652.82 | 648.16 | 0.5010 | 0.0796 | 0.0629 | 0.8233 |
| NH | - | - | 3.3367 | 1.8424 | 0.0212 | 0.0139 | 323.45 | 650.90 | 651.02 | 656.11 | 653.01 | 0.6969 | 0.1113 | 0.1076 | 0.1976 |
| GE | - | - | 2.1834 | 0.3343 | 0.1592 | 0.0175 | 317.10 | 642.02 | 642.14 | 647.93 | 644.14 | 0.1428 | 0.0207 | 0.0402 | 0.9969 |
| BS | - | - | 0.8462 | 0.0597 | 7.2078 | 0.5562 | 320.33 | 644.66 | 644.78 | 649.87 | 646.77 | 0.5987 | 0.0808 | 0.0801 | 0.5426 |
| W | - | - | 1.4581 | 0.1089 | 0.0305 | 0.0095 | 318.73 | 641.46 | 641.59 | 647.97 | 643.57 | 0.3960 | 0.0629 | 0.0576 | 0.8941 |
| G | - | - | 2.0091 | 0.2639 | 0.2034 | 0.0303 | 317.30 | 642.43 | 642.56 | 647.94 | 644.55 | 0.1823 | 0.0276 | 0.0425 | 0.9935 |
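The fit criteria tabulated above follow the standard definitions sketched below (with k fitted parameters and sample size n); as a check, plugging in the NOT-Exp row of App.2 (−ℓ = 317.00, k = 3, n = 100) reproduces the tabulated AIC, CAIC, BIC, and HQIC values up to rounding.

```python
import math

def info_criteria(neg_loglik, k, n):
    """Model-selection criteria from the minimized negative log-likelihood,
    with k fitted parameters and sample size n. Lower values are better."""
    aic = 2.0 * k + 2.0 * neg_loglik
    caic = aic + 2.0 * k * (k + 1) / (n - k - 1)  # small-sample correction
    bic = k * math.log(n) + 2.0 * neg_loglik
    hqic = 2.0 * k * math.log(math.log(n)) + 2.0 * neg_loglik
    return aic, caic, bic, hqic
```

Because all criteria penalize the same −ℓ with different complexity terms, a model that wins on every column, as NOT-Exp does here, gives a robust model-selection verdict.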
7. Optimal Plans
8. Conclusions, Recommendations, and Future Research
8.1. Recommendations
- Based on the theoretical findings and empirical evidence, the following recommendations can be made:
- The NOT-Exp distribution is strongly recommended for lifetime data exhibiting complex hazard shapes that cannot be adequately captured by classical exponential-type models.
- The UT2-PH censoring scheme should be favored in practical life-testing experiments where multiple termination criteria and progressive withdrawals are present, as it provides flexibility without sacrificing inferential tractability.
- Bayesian MCMC-based inference is recommended in moderate-to-small samples or heavily censored settings, where classical asymptotic approximations may be less reliable.
- Practitioners in toxicology, environmental monitoring, and service operations should consider reliability-based modeling to better quantify persistence, risk, and system performance.
8.2. Future Directions
- First, the proposed framework may be extended to other members of the odd-family or exponential-generated distributions under UT2-PH or adaptive censoring schemes, for example, odd Weibull, odd gamma, odd log-logistic, and odd Lomax, among others.
- Second, incorporating covariates through accelerated failure time or proportional hazard-type regressions based on the NOT-Exp baseline would significantly broaden its applicability.
- Third, future studies may explore optimal censoring design problems that aim to minimize estimator variance or experimental costs under the NOT-Exp model, for example via cost-minimization criteria or meta-heuristic search algorithms.
- Fourth, multivariate and dependent lifetime extensions—such as shared frailty or copula-based NOT-Exp models—represent an important avenue for analyzing correlated failure data. Finally, machine learning–assisted Bayesian computation and approximate inference techniques could be integrated to further enhance computational efficiency in large-scale or high-dimensional reliability studies.
- In summary, this study provides a unified, flexible, and practically relevant contribution to censored lifetime modeling, offering both theoretical advancement and applied insight. The proposed NOT-Exp using the UT2-PH-based framework lays a solid foundation for future developments in reliability theory and its multidisciplinary applications.
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Numerical Results
| Design | MLE | Bayes | MLE | Bayes | |||||||||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Prior → | A | B | A | B | |||||||||||||||
| Set-1→ | |||||||||||||||||||
| (0.2, 0.5) | (0.5, 0.8) | ||||||||||||||||||
| D[11] | 1.2772 | 2.9741 | 2.7879 | 0.8350 | 0.1551 | 0.1396 | 0.7884 | 0.1259 | 0.1258 | 1.1454 | 2.8863 | 1.8690 | 0.8335 | 0.1511 | 0.1390 | 0.7889 | 0.1259 | 0.1254 | |
| D[12] | 1.2571 | 3.1494 | 2.9283 | 0.8332 | 0.1583 | 0.1417 | 0.7888 | 0.1276 | 0.1271 | 1.2350 | 3.0336 | 2.1479 | 0.8339 | 0.1523 | 0.1408 | 0.7863 | 0.1271 | 0.1268 | |
| D[13] | 1.3629 | 2.8863 | 2.6221 | 0.8343 | 0.1532 | 0.1366 | 0.7886 | 0.1254 | 0.1243 | 1.2112 | 2.8060 | 1.8140 | 0.8360 | 0.1503 | 0.1359 | 0.7883 | 0.1254 | 0.1227 | |
| D[14] | 1.2405 | 2.6044 | 2.3147 | 0.8180 | 0.1354 | 0.1237 | 0.8080 | 0.1002 | 0.0456 | 0.9691 | 2.5686 | 1.6931 | 1.8373 | 0.1331 | 0.1222 | 1.7913 | 0.0977 | 0.0452 | |
| D[15] | 1.5520 | 2.6620 | 2.4707 | 0.8122 | 0.1355 | 0.1239 | 0.8073 | 0.1019 | 0.0464 | 1.3029 | 2.5786 | 1.8082 | 1.8383 | 0.1348 | 0.1223 | 1.7905 | 0.1003 | 0.0457 | |
| D[16] | 0.9711 | 2.5008 | 2.2170 | 0.8188 | 0.1323 | 0.1230 | 0.8077 | 0.1001 | 0.0452 | 0.9522 | 2.3814 | 1.6913 | 1.8378 | 0.1303 | 0.1170 | 1.7914 | 0.0971 | 0.0450 | |
| D[21] | 1.0739 | 2.3809 | 1.7634 | 0.8120 | 0.1215 | 0.0989 | 0.8117 | 0.0753 | 0.0452 | 1.1446 | 2.2534 | 1.0750 | 0.8138 | 0.1097 | 0.0785 | 0.8089 | 0.0475 | 0.0448 | |
| D[22] | 0.9184 | 2.3061 | 1.6389 | 0.8155 | 0.1200 | 0.0979 | 0.8074 | 0.0732 | 0.0451 | 0.8986 | 2.1739 | 0.9861 | 0.8120 | 0.1053 | 0.0743 | 0.8077 | 0.0461 | 0.0446 | |
| D[23] | 0.8890 | 2.2981 | 1.2159 | 0.8155 | 0.1192 | 0.0970 | 0.8059 | 0.0715 | 0.0450 | 1.1411 | 2.1159 | 0.8967 | 0.8144 | 0.1047 | 0.0726 | 0.8062 | 0.0455 | 0.0443 | |
| D[24] | 0.6148 | 1.7106 | 0.7023 | 0.6971 | 0.1070 | 0.0675 | 0.7746 | 0.0453 | 0.0435 | 0.9433 | 1.5749 | 0.5484 | 0.7009 | 0.0695 | 0.0675 | 0.7769 | 0.0441 | 0.0434 | |
| D[25] | 1.0067 | 2.1143 | 0.7796 | 0.7027 | 0.1082 | 0.0677 | 0.7767 | 0.0458 | 0.0449 | 1.0498 | 1.9588 | 0.6354 | 0.6980 | 0.0718 | 0.0677 | 0.7771 | 0.0450 | 0.0441 | |
| D[26] | 0.9674 | 1.5622 | 0.5616 | 0.6975 | 0.1004 | 0.0664 | 0.7762 | 0.0449 | 0.0429 | 0.9562 | 1.4622 | 0.5195 | 0.6973 | 0.0685 | 0.0664 | 0.7734 | 0.0440 | 0.0427 | |
| (0.5, 1.0) | (1.5, 2.5) | ||||||||||||||||||
| D[31] | 0.9744 | 1.2093 | 0.4562 | 0.6945 | 0.0695 | 0.0410 | 0.7785 | 0.0376 | 0.0334 | 1.0893 | 0.7367 | 0.3029 | 0.7008 | 0.0439 | 0.0407 | 0.7752 | 0.0369 | 0.0332 | |
| D[32] | 1.1437 | 1.5507 | 0.5290 | 0.6976 | 0.0696 | 0.0417 | 0.7777 | 0.0381 | 0.0339 | 1.0045 | 0.9488 | 0.4641 | 0.6948 | 0.0439 | 0.0414 | 0.7744 | 0.0371 | 0.0343 | |
| D[33] | 0.9658 | 1.1479 | 0.3660 | 0.6937 | 0.0685 | 0.0407 | 0.7755 | 0.0376 | 0.0312 | 0.9286 | 0.3870 | 0.2812 | 0.6929 | 0.0437 | 0.0394 | 0.7775 | 0.0367 | 0.0311 | |
| D[34] | 0.7548 | 0.8885 | 0.2984 | 0.8531 | 0.0406 | 0.0374 | 0.6781 | 0.0301 | 0.0299 | 1.1275 | 0.3670 | 0.2253 | 0.8531 | 0.0381 | 0.0302 | 0.6781 | 0.0300 | 0.0255 | |
| D[35] | 1.0928 | 0.7685 | 0.1869 | 0.8521 | 0.0397 | 0.0371 | 0.6797 | 0.0289 | 0.0282 | 0.8719 | 0.2771 | 0.1577 | 0.8521 | 0.0375 | 0.0291 | 0.6797 | 0.0287 | 0.0235 | |
| D[36] | 0.8703 | 0.6207 | 0.1403 | 0.8532 | 0.0394 | 0.0368 | 0.6791 | 0.0281 | 0.0278 | 0.9624 | 0.2692 | 0.1319 | 0.8532 | 0.0368 | 0.0283 | 0.6791 | 0.0279 | 0.0225 | |
| Set-2→ | |||||||||||||||||||
| D[11] | 1.9381 | 3.3185 | 2.1412 | 1.8373 | 0.1312 | 0.1239 | 1.7913 | 0.0701 | 0.0524 | 1.9430 | 3.2365 | 1.7890 | 1.8398 | 0.1290 | 0.1235 | 1.7902 | 0.0676 | 0.0512 | |
| D[12] | 2.9228 | 3.4079 | 2.7127 | 1.8383 | 0.1338 | 0.1251 | 1.7905 | 0.0704 | 0.0548 | 2.3526 | 3.3670 | 1.9491 | 1.8374 | 0.1306 | 0.1243 | 1.7909 | 0.0698 | 0.0544 | |
| D[13] | 1.9045 | 3.2133 | 1.8166 | 1.8378 | 0.1304 | 0.1237 | 1.7914 | 0.0697 | 0.0514 | 1.8402 | 3.1954 | 1.7398 | 1.8382 | 0.1286 | 0.1228 | 1.7902 | 0.0659 | 0.0504 | |
| D[14] | 2.1887 | 3.0398 | 1.6327 | 1.8131 | 0.1110 | 0.1002 | 1.8133 | 0.0599 | 0.0421 | 1.9271 | 2.8420 | 1.3867 | 1.8136 | 0.1101 | 0.0977 | 1.8123 | 0.0537 | 0.0402 | |
| D[15] | 2.1614 | 3.0870 | 1.6752 | 1.8140 | 0.1113 | 0.1100 | 1.8128 | 0.0654 | 0.0393 | 2.2061 | 2.9438 | 1.6721 | 1.8120 | 0.1104 | 0.0989 | 1.8120 | 0.0540 | 0.0374 | |
| D[16] | 1.3417 | 2.6599 | 1.5576 | 1.8136 | 0.1101 | 0.0979 | 1.8121 | 0.0536 | 0.0401 | 1.8342 | 2.4696 | 1.3096 | 1.8136 | 0.1097 | 0.0928 | 1.8121 | 0.0536 | 0.0387 | |
| D[21] | 1.2409 | 1.9893 | 1.5492 | 1.8121 | 0.1097 | 0.0544 | 1.8113 | 0.0388 | 0.0370 | 2.2976 | 1.7951 | 1.2159 | 1.8127 | 0.1093 | 0.0414 | 1.8140 | 0.0371 | 0.0348 | |
| D[22] | 1.9805 | 1.7474 | 1.5245 | 1.8138 | 0.1097 | 0.0530 | 1.8132 | 0.0370 | 0.0326 | 2.4415 | 1.6295 | 1.1003 | 1.8130 | 0.1091 | 0.0383 | 1.8132 | 0.0369 | 0.0311 | |
| D[23] | 2.3902 | 1.6472 | 1.5097 | 1.8138 | 0.1089 | 0.0517 | 1.8125 | 0.0347 | 0.0318 | 1.3902 | 1.5738 | 1.0767 | 1.8138 | 0.1089 | 0.0382 | 1.8125 | 0.0347 | 0.0302 | |
| D[24] | 1.9208 | 1.5056 | 1.4640 | 1.7091 | 0.0739 | 0.0513 | 1.7911 | 0.0328 | 0.0290 | 1.9713 | 1.4874 | 0.7723 | 1.7104 | 0.0734 | 0.0380 | 1.7916 | 0.0325 | 0.0276 | |
| D[25] | 2.3758 | 1.5374 | 1.4908 | 1.7101 | 0.0753 | 0.0514 | 1.7916 | 0.0335 | 0.0308 | 2.1656 | 1.5240 | 1.0580 | 1.7094 | 0.0735 | 0.0381 | 1.7916 | 0.0327 | 0.0291 | |
| D[26] | 1.9608 | 1.4900 | 1.4258 | 1.7091 | 0.0724 | 0.0494 | 1.7911 | 0.0319 | 0.0276 | 1.9986 | 1.4790 | 0.7711 | 1.7091 | 0.0720 | 0.0379 | 1.7911 | 0.0318 | 0.0268 | |
| D[31] | 1.9036 | 1.4184 | 1.3298 | 1.7094 | 0.0614 | 0.0311 | 1.7920 | 0.0285 | 0.0261 | 1.7226 | 1.4011 | 0.7008 | 1.7102 | 0.0591 | 0.0306 | 1.7911 | 0.0278 | 0.0247 | |
| D[32] | 2.2524 | 1.4794 | 1.3453 | 1.7076 | 0.0653 | 0.0328 | 1.7898 | 0.0295 | 0.0270 | 2.1264 | 1.4588 | 0.7454 | 1.7086 | 0.0605 | 0.0311 | 1.7898 | 0.0295 | 0.0256 | |
| D[33] | 1.8883 | 1.4011 | 1.1096 | 1.7094 | 0.0583 | 0.0306 | 1.7911 | 0.0279 | 0.0246 | 1.8883 | 1.3703 | 0.5757 | 1.7094 | 0.0551 | 0.0301 | 1.7911 | 0.0263 | 0.0235 | |
| D[34] | 1.8804 | 1.3121 | 0.9870 | 1.8582 | 0.0557 | 0.0299 | 1.6747 | 0.0231 | 0.0189 | 1.8676 | 1.2416 | 0.5116 | 1.8576 | 0.0510 | 0.0283 | 1.6737 | 0.0215 | 0.0171 | |
| D[35] | 2.0725 | 1.3659 | 1.0793 | 1.8564 | 0.0579 | 0.0305 | 1.6743 | 0.0257 | 0.0234 | 2.0128 | 1.3121 | 0.5338 | 1.8562 | 0.0531 | 0.0299 | 1.6733 | 0.0227 | 0.0227 | |
| D[36] | 1.9583 | 1.2299 | 0.8029 | 1.8596 | 0.0527 | 0.0296 | 1.6747 | 0.0210 | 0.0171 | 1.9686 | 0.8806 | 0.4935 | 1.8579 | 0.0505 | 0.0279 | 1.6745 | 0.0200 | 0.0161 | |
| Design | MLE | Bayes | MLE | Bayes | |||||||||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Prior → | A | B | A | B | |||||||||||||||
| Set-1→ | |||||||||||||||||||
| (0.2, 0.5) | (0.5, 0.8) | ||||||||||||||||||
| D[11] | 0.6541 | 1.1524 | 0.4277 | 0.5068 | 0.2232 | 0.1416 | 0.4397 | 0.1183 | 0.0864 | 0.6395 | 1.0684 | 0.3913 | 0.5076 | 0.2232 | 0.1406 | 0.4405 | 0.1183 | 0.0847 | |
| D[12] | 0.6341 | 0.8521 | 0.4029 | 0.5076 | 0.2175 | 0.1406 | 0.4386 | 0.1154 | 0.0841 | 0.6191 | 0.8177 | 0.3650 | 0.5068 | 0.2175 | 0.1398 | 0.4396 | 0.1154 | 0.0838 | |
| D[13] | 0.6346 | 1.2807 | 0.4408 | 0.5091 | 0.2344 | 0.1487 | 0.4353 | 0.1242 | 0.0881 | 0.6208 | 1.2098 | 0.4066 | 0.5101 | 0.2344 | 0.1483 | 0.4350 | 0.1242 | 0.0879 | |
| D[14] | 0.5804 | 0.7982 | 0.3088 | 0.4672 | 0.1637 | 0.1398 | 0.4612 | 0.0895 | 0.0831 | 0.8357 | 0.7925 | 0.2322 | 0.4589 | 0.1625 | 0.1398 | 0.4361 | 0.0889 | 0.0822 | |
| D[15] | 0.5512 | 0.5746 | 0.2432 | 0.4760 | 0.1601 | 0.1394 | 0.4648 | 0.0881 | 0.0819 | 0.8448 | 0.5375 | 0.1871 | 0.5078 | 0.1600 | 0.1379 | 0.4277 | 0.0879 | 0.0818 | |
| D[16] | 0.5777 | 0.7860 | 0.2705 | 0.4731 | 0.1608 | 0.1398 | 0.4636 | 0.0884 | 0.0822 | 0.7715 | 0.5886 | 0.2190 | 0.4494 | 0.1603 | 0.1394 | 0.4420 | 0.0879 | 0.0819 | |
| D[21] | 0.5466 | 0.4431 | 0.1596 | 0.4803 | 0.1574 | 0.0877 | 0.4650 | 0.0824 | 0.0798 | 0.5443 | 0.4030 | 0.1591 | 0.4798 | 0.1557 | 0.0867 | 0.4619 | 0.0818 | 0.0504 | |
| D[22] | 0.5423 | 0.4128 | 0.1573 | 0.4856 | 0.1546 | 0.0863 | 0.4645 | 0.0809 | 0.0753 | 0.5432 | 0.3865 | 0.1558 | 0.4850 | 0.1458 | 0.0851 | 0.4643 | 0.0779 | 0.0498 | |
| D[23] | 0.5512 | 0.5419 | 0.1602 | 0.4666 | 0.1596 | 0.0891 | 0.4575 | 0.0870 | 0.0819 | 0.5387 | 0.4242 | 0.1597 | 0.4664 | 0.1566 | 0.0879 | 0.4561 | 0.0867 | 0.0530 | |
| D[24] | 0.5510 | 0.3865 | 0.1509 | 0.5796 | 0.0823 | 0.0753 | 0.5261 | 0.0720 | 0.0533 | 0.5424 | 0.3283 | 0.1442 | 0.5804 | 0.0762 | 0.0722 | 0.5258 | 0.0591 | 0.0496 | |
| D[25] | 0.5442 | 0.2380 | 0.1375 | 0.5800 | 0.0813 | 0.0711 | 0.5266 | 0.0697 | 0.0508 | 0.5395 | 0.2217 | 0.1369 | 0.5801 | 0.0753 | 0.0704 | 0.5263 | 0.0585 | 0.0491 | |
| D[26] | 0.5480 | 0.3033 | 0.1476 | 0.5798 | 0.0823 | 0.0716 | 0.5256 | 0.0700 | 0.0510 | 0.5443 | 0.2531 | 0.1415 | 0.5787 | 0.0754 | 0.0708 | 0.5270 | 0.0586 | 0.0496 | |
| D[31] | 0.5319 | 0.1888 | 0.1248 | 0.5812 | 0.0750 | 0.0699 | 0.5314 | 0.0590 | 0.0483 | 0.5290 | 0.1810 | 0.1187 | 0.5819 | 0.0701 | 0.0690 | 0.5310 | 0.0497 | 0.0459 | |
| D[32] | 0.5215 | 0.1596 | 0.1068 | 0.5802 | 0.0692 | 0.0634 | 0.5247 | 0.0495 | 0.0453 | 0.5248 | 0.1582 | 0.1067 | 0.5773 | 0.0667 | 0.0586 | 0.5254 | 0.0485 | 0.0419 | |
| D[33] | 0.5372 | 0.1849 | 0.1202 | 0.5787 | 0.0699 | 0.0677 | 0.5243 | 0.0549 | 0.0478 | 0.5339 | 0.1734 | 0.1176 | 0.5798 | 0.0692 | 0.0588 | 0.5221 | 0.0493 | 0.0459 | |
| D[34] | 0.5060 | 0.1519 | 0.1015 | 0.3828 | 0.0691 | 0.0489 | 0.4324 | 0.0459 | 0.0396 | 0.5060 | 0.1512 | 0.1014 | 0.3828 | 0.0599 | 0.0480 | 0.4324 | 0.0441 | 0.0386 | |
| D[35] | 0.5080 | 0.1463 | 0.0992 | 0.3913 | 0.0678 | 0.0446 | 0.4324 | 0.0431 | 0.0327 | 0.5085 | 0.1462 | 0.0992 | 0.3913 | 0.0500 | 0.0432 | 0.4324 | 0.0380 | 0.0326 | |
| D[36] | 0.5086 | 0.1502 | 0.1009 | 0.3884 | 0.0684 | 0.0459 | 0.4320 | 0.0436 | 0.0365 | 0.5080 | 0.1491 | 0.1000 | 0.3884 | 0.0553 | 0.0451 | 0.4320 | 0.0386 | 0.0364 | |
| Set-2→ | |||||||||||||||||||
| (0.5, 1.0) | (1.5, 2.5) | ||||||||||||||||||
| D[11] | 2.1264 | 0.5035 | 0.3145 | 1.5554 | 0.1372 | 0.0851 | 1.4420 | 0.0821 | 0.0480 | 1.7356 | 0.4824 | 0.3013 | 1.5578 | 0.1369 | 0.0823 | 1.4375 | 0.0819 | 0.0451 | |
| D[12] | 1.9914 | 0.4491 | 0.3001 | 1.5565 | 0.1357 | 0.0851 | 1.4428 | 0.0809 | 0.0480 | 1.7106 | 0.4241 | 0.2861 | 1.5568 | 0.1341 | 0.0814 | 1.4375 | 0.0805 | 0.0446 | |
| D[13] | 2.2097 | 0.7059 | 0.3931 | 1.5565 | 0.1387 | 0.0870 | 1.4361 | 0.0831 | 0.0492 | 1.6838 | 0.5936 | 0.3271 | 1.5556 | 0.1386 | 0.0864 | 1.4358 | 0.0831 | 0.0489 | |
| D[14] | 1.5716 | 0.3996 | 0.1942 | 1.4795 | 0.0898 | 0.0535 | 1.4821 | 0.0504 | 0.0264 | 1.2768 | 0.3217 | 0.1657 | 1.4792 | 0.0877 | 0.0519 | 1.4811 | 0.0503 | 0.0263 | |
| D[15] | 1.6161 | 0.3589 | 0.1490 | 1.4815 | 0.0835 | 0.0497 | 1.4828 | 0.0484 | 0.0261 | 1.7703 | 0.2843 | 0.1401 | 1.4815 | 0.0824 | 0.0496 | 1.4828 | 0.0484 | 0.0260 | |
| D[16] | 1.9645 | 0.3685 | 0.1702 | 1.4799 | 0.0846 | 0.0503 | 1.4825 | 0.0499 | 0.0263 | 1.8031 | 0.3156 | 0.1491 | 1.4815 | 0.0824 | 0.0499 | 1.4827 | 0.0484 | 0.0261 | |
| D[21] | 1.6255 | 0.2441 | 0.0940 | 1.4804 | 0.0824 | 0.0467 | 1.4816 | 0.0435 | 0.0253 | 1.4671 | 0.2277 | 0.0921 | 1.4807 | 0.0823 | 0.0451 | 1.4816 | 0.0431 | 0.0252 | |
| D[22] | 1.8118 | 0.2346 | 0.0920 | 1.4823 | 0.0815 | 0.0451 | 1.4849 | 0.0432 | 0.0253 | 1.6156 | 0.2268 | 0.0902 | 1.4823 | 0.0813 | 0.0451 | 1.4849 | 0.0431 | 0.0251 | |
| D[23] | 1.8957 | 0.2845 | 0.0948 | 1.4756 | 0.0828 | 0.0468 | 1.4773 | 0.0439 | 0.0255 | 1.8088 | 0.2291 | 0.0945 | 1.4749 | 0.0823 | 0.0463 | 1.4783 | 0.0432 | 0.0253 | |
| D[24] | 1.7786 | 0.2326 | 0.0882 | 1.5695 | 0.0803 | 0.0431 | 1.5088 | 0.0394 | 0.0248 | 1.4047 | 0.2205 | 0.0882 | 1.5701 | 0.0795 | 0.0430 | 1.5088 | 0.0386 | 0.0247 | |
| D[25] | 1.8883 | 0.2251 | 0.0777 | 1.5676 | 0.0682 | 0.0417 | 1.5094 | 0.0381 | 0.0246 | 2.2427 | 0.1980 | 0.0777 | 1.5676 | 0.0675 | 0.0417 | 1.5094 | 0.0380 | 0.0224 | |
| D[26] | 1.8827 | 0.2292 | 0.0825 | 1.5676 | 0.0790 | 0.0428 | 1.5094 | 0.0388 | 0.0247 | 1.6123 | 0.2128 | 0.0801 | 1.5702 | 0.0777 | 0.0428 | 1.5088 | 0.0386 | 0.0246 | |
| D[31] | 1.4258 | 0.1428 | 0.0491 | 1.5726 | 0.0398 | 0.0397 | 1.5104 | 0.0209 | 0.0202 | 1.6808 | 0.1190 | 0.0468 | 1.5748 | 0.0398 | 0.0394 | 1.5101 | 0.0208 | 0.0200 | |
| D[32] | 1.7281 | 0.1277 | 0.0466 | 1.5726 | 0.0398 | 0.0393 | 1.5101 | 0.0208 | 0.0200 | 1.7281 | 0.1171 | 0.0444 | 1.5726 | 0.0398 | 0.0387 | 1.5101 | 0.0208 | 0.0200 | |
| D[33] | 1.4398 | 0.1577 | 0.0495 | 1.5802 | 0.0420 | 0.0405 | 1.5130 | 0.0212 | 0.0209 | 2.0235 | 0.1577 | 0.0491 | 1.5779 | 0.0418 | 0.0405 | 1.5130 | 0.0211 | 0.0211 | |
| D[34] | 1.7333 | 0.0907 | 0.0393 | 1.3758 | 0.0388 | 0.0297 | 1.4695 | 0.0208 | 0.0200 | 1.6871 | 0.0903 | 0.0393 | 1.3759 | 0.0386 | 0.0289 | 1.4694 | 0.0208 | 0.0181 | |
| D[35] | 1.7334 | 0.0794 | 0.0392 | 1.3784 | 0.0386 | 0.0295 | 1.4701 | 0.0207 | 0.0198 | 1.7209 | 0.0756 | 0.0392 | 1.3798 | 0.0386 | 0.0270 | 1.4701 | 0.0207 | 0.0167 | |
| D[36] | 1.7472 | 0.0833 | 0.0393 | 1.3771 | 0.0387 | 0.0295 | 1.4699 | 0.0208 | 0.0198 | 1.6962 | 0.0796 | 0.0392 | 1.3773 | 0.0386 | 0.0273 | 1.4695 | 0.0208 | 0.0175 | |
| Design | MLE | Bayes | MLE | Bayes | |||||||||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Prior → | A | B | A | B | |||||||||||||||
| Set-1→ | |||||||||||||||||||
| (0.2, 0.5) | (0.5, 0.8) | ||||||||||||||||||
| D[11] | 0.3262 | 1.6181 | 0.8669 | 0.0986 | 0.5548 | 0.4847 | 0.1443 | 0.1855 | 0.1000 | 0.2841 | 1.5711 | 0.5851 | 0.1022 | 0.5244 | 0.4368 | 0.1482 | 0.1176 | 0.0923 | |
| D[12] | 0.2888 | 1.7609 | 0.9168 | 0.0709 | 0.6413 | 0.4973 | 0.1253 | 0.1875 | 0.1034 | 0.2599 | 1.7289 | 0.6946 | 0.0816 | 0.5283 | 0.4795 | 0.1312 | 0.1459 | 0.0989 | |
| D[13] | 0.3306 | 1.5795 | 0.7683 | 0.1007 | 0.5373 | 0.4462 | 0.1431 | 0.1836 | 0.0988 | 0.3543 | 1.5211 | 0.5449 | 0.0926 | 0.5119 | 0.4022 | 0.1304 | 0.1157 | 0.0920 | |
| D[14] | 0.2449 | 1.5230 | 0.5226 | 0.2107 | 0.4222 | 0.2137 | 0.1108 | 0.1236 | 0.0840 | 1.1867 | 1.3552 | 0.4351 | 1.0389 | 0.3867 | 0.2049 | 1.1479 | 0.0928 | 0.0520 | |
| D[15] | 0.4035 | 1.5781 | 0.5288 | 0.2238 | 0.4460 | 0.2196 | 0.1216 | 0.1352 | 0.0912 | 1.1759 | 1.3606 | 0.4714 | 1.0423 | 0.3916 | 0.2141 | 1.1573 | 0.0933 | 0.0528 | |
| D[16] | 0.2036 | 1.3694 | 0.4388 | 0.2026 | 0.3688 | 0.2081 | 0.1005 | 0.1142 | 0.0784 | 1.3169 | 1.1399 | 0.4161 | 1.0407 | 0.3156 | 0.2038 | 1.1484 | 0.0884 | 0.0516 | |
| D[21] | 0.2802 | 1.3118 | 0.3980 | 0.2195 | 0.3665 | 0.2067 | 0.1156 | 0.0824 | 0.0518 | 0.2262 | 1.1371 | 0.3922 | 0.2145 | 0.3135 | 0.2035 | 0.1126 | 0.0778 | 0.0507 | |
| D[22] | 0.2219 | 1.3032 | 0.3937 | 0.2089 | 0.3295 | 0.2035 | 0.1108 | 0.0794 | 0.0507 | 0.2426 | 1.1242 | 0.3500 | 0.2146 | 0.3060 | 0.1988 | 0.1196 | 0.0751 | 0.0506 | |
| D[23] | 0.3510 | 1.2100 | 0.3270 | 0.1971 | 0.3142 | 0.2030 | 0.1031 | 0.0786 | 0.0503 | 0.3924 | 1.0978 | 0.3240 | 0.1985 | 0.2918 | 0.1841 | 0.1041 | 0.0710 | 0.0500 | |
| D[24] | 0.2248 | 1.2026 | 0.3135 | 0.2012 | 0.2691 | 0.1438 | 0.1893 | 0.0708 | 0.0498 | 0.2115 | 1.0554 | 0.3090 | 0.1971 | 0.1988 | 0.1335 | 0.1843 | 0.0682 | 0.0498 | |
| D[25] | 0.3114 | 1.2093 | 0.3171 | 0.1928 | 0.2735 | 0.1520 | 0.1779 | 0.0710 | 0.0499 | 0.2334 | 1.0930 | 0.3156 | 0.1999 | 0.2029 | 0.1452 | 0.1879 | 0.0708 | 0.0498 | |
| D[26] | 0.2255 | 1.1941 | 0.3048 | 0.2009 | 0.2543 | 0.1428 | 0.1881 | 0.0665 | 0.0466 | 0.2194 | 1.0196 | 0.2918 | 0.1983 | 0.1841 | 0.1329 | 0.1854 | 0.0665 | 0.0466 | |
| D[31] | 0.2109 | 1.1803 | 0.2878 | 0.2047 | 0.1471 | 0.1318 | 0.1848 | 0.0348 | 0.0323 | 0.2441 | 0.9191 | 0.2484 | 0.1952 | 0.1401 | 0.1285 | 0.1747 | 0.0339 | 0.0319 | |
| D[32] | 0.2752 | 1.1824 | 0.3021 | 0.2014 | 0.1492 | 0.1425 | 0.1879 | 0.0355 | 0.0353 | 0.1868 | 1.0117 | 0.2509 | 0.2032 | 0.1474 | 0.1309 | 0.1891 | 0.0354 | 0.0340 | |
| D[33] | 0.2315 | 1.1656 | 0.2768 | 0.2074 | 0.1378 | 0.1314 | 0.1889 | 0.0338 | 0.0319 | 0.2352 | 0.9150 | 0.2407 | 0.2076 | 0.1366 | 0.1194 | 0.1936 | 0.0337 | 0.0319 | |
| D[34] | 0.2544 | 1.0910 | 0.2534 | 0.1658 | 0.1375 | 0.1300 | 0.1417 | 0.0337 | 0.0317 | 0.2544 | 0.6060 | 0.2261 | 0.1658 | 0.1330 | 0.0518 | 0.1417 | 0.0329 | 0.0316 | |
| D[35] | 0.2787 | 1.0886 | 0.2511 | 0.1613 | 0.1375 | 0.1298 | 0.1373 | 0.0337 | 0.0316 | 0.2771 | 0.5759 | 0.2147 | 0.1613 | 0.1330 | 0.0514 | 0.1373 | 0.0329 | 0.0313 | |
| D[36] | 0.2626 | 0.9196 | 0.2149 | 0.1625 | 0.1319 | 0.1239 | 0.1369 | 0.0329 | 0.0302 | 0.2183 | 0.5143 | 0.2128 | 0.1625 | 0.1301 | 0.0438 | 0.1369 | 0.0327 | 0.0291 | |
| Set-2→ | |||||||||||||||||||
| (0.5, 1.0) | (1.5, 2.5) | ||||||||||||||||||
| D[11] | 1.1867 | 1.1055 | 0.7854 | 1.0389 | 0.1855 | 0.1438 | 1.1479 | 0.1128 | 0.0876 | 1.0904 | 0.9168 | 0.6060 | 1.0445 | 0.1827 | 0.1419 | 1.1582 | 0.1113 | 0.0864 | |
| D[12] | 1.1759 | 1.1454 | 0.7924 | 1.0423 | 0.1881 | 0.1456 | 1.1573 | 0.1133 | 0.0878 | 0.9971 | 0.9173 | 0.6287 | 1.0385 | 0.1875 | 0.1452 | 1.1554 | 0.1115 | 0.0867 | |
| D[13] | 1.3169 | 1.0630 | 0.7589 | 1.0407 | 0.1836 | 0.1425 | 1.1484 | 0.1115 | 0.0867 | 1.0374 | 0.8669 | 0.5759 | 1.0430 | 0.1815 | 0.1407 | 1.1582 | 0.1109 | 0.0863 | |
| D[14] | 1.1898 | 0.7874 | 0.5258 | 1.2214 | 0.1102 | 0.0855 | 1.0946 | 0.0517 | 0.0346 | 1.1148 | 0.7799 | 0.5195 | 1.2254 | 0.1102 | 0.0855 | 1.0964 | 0.0515 | 0.0345 | |
| D[15] | 1.1936 | 0.9173 | 0.6287 | 1.2227 | 0.1109 | 0.0864 | 1.0949 | 0.0531 | 0.0353 | 1.1092 | 0.8377 | 0.5682 | 1.2293 | 0.1109 | 0.0861 | 1.1002 | 0.0531 | 0.0353 | |
| D[16] | 1.1839 | 0.7684 | 0.5254 | 1.2254 | 0.1068 | 0.0831 | 1.0973 | 0.0515 | 0.0345 | 1.1768 | 0.7683 | 0.5143 | 1.2254 | 0.1060 | 0.0824 | 1.0973 | 0.0510 | 0.0342 | |
| D[21] | 1.1491 | 0.7651 | 0.4897 | 1.2286 | 0.0784 | 0.0518 | 1.1011 | 0.0510 | 0.0345 | 1.1926 | 0.7131 | 0.4834 | 1.2249 | 0.0708 | 0.0515 | 1.0967 | 0.0454 | 0.0342 | |
| D[22] | 1.1408 | 0.6983 | 0.4834 | 1.2233 | 0.0778 | 0.0514 | 1.0960 | 0.0509 | 0.0343 | 1.2554 | 0.6983 | 0.4770 | 1.2225 | 0.0675 | 0.0510 | 1.0960 | 0.0432 | 0.0341 | |
| D[23] | 1.1927 | 0.6969 | 0.4791 | 1.2240 | 0.0682 | 0.0509 | 1.0964 | 0.0438 | 0.0341 | 1.1927 | 0.6923 | 0.4615 | 1.2240 | 0.0675 | 0.0509 | 1.0964 | 0.0432 | 0.0341 | |
| D[24] | 1.0633 | 0.6304 | 0.4116 | 1.2045 | 0.0574 | 0.0393 | 1.1825 | 0.0375 | 0.0262 | 1.1515 | 0.4871 | 0.3319 | 1.2016 | 0.0547 | 0.0388 | 1.1793 | 0.0363 | 0.0243 | |
| D[25] | 1.0721 | 0.6438 | 0.4439 | 1.2025 | 0.0574 | 0.0400 | 1.1793 | 0.0393 | 0.0269 | 1.1484 | 0.6368 | 0.4180 | 1.2031 | 0.0569 | 0.0399 | 1.1793 | 0.0390 | 0.0269 | |
| D[26] | 1.0893 | 0.5558 | 0.3803 | 1.2045 | 0.0560 | 0.0383 | 1.1825 | 0.0373 | 0.0253 | 1.1248 | 0.4759 | 0.3225 | 1.2045 | 0.0544 | 0.0376 | 1.1825 | 0.0357 | 0.0239 | |
| D[31] | 1.0550 | 0.5363 | 0.3644 | 1.2044 | 0.0397 | 0.0327 | 1.1830 | 0.0274 | 0.0221 | 1.1672 | 0.4246 | 0.2947 | 1.2020 | 0.0396 | 0.0327 | 1.1825 | 0.0273 | 0.0221 | |
| D[32] | 1.0891 | 0.5408 | 0.3719 | 1.2018 | 0.0397 | 0.0327 | 1.1792 | 0.0274 | 0.0221 | 1.1561 | 0.4423 | 0.3005 | 1.2012 | 0.0397 | 0.0327 | 1.1792 | 0.0274 | 0.0221 | |
| D[33] | 1.0309 | 0.5326 | 0.3611 | 1.2044 | 0.0396 | 0.0327 | 1.1825 | 0.0273 | 0.0221 | 1.0309 | 0.4213 | 0.2892 | 1.2044 | 0.0383 | 0.0326 | 1.1825 | 0.0267 | 0.0220 | |
| D[34] | 1.1617 | 0.4376 | 0.3057 | 1.2276 | 0.0383 | 0.0325 | 1.1549 | 0.0267 | 0.0219 | 1.2272 | 0.3671 | 0.2435 | 1.2306 | 0.0383 | 0.0325 | 1.1580 | 0.0267 | 0.0218 | |
| D[35] | 1.1754 | 0.4844 | 0.3193 | 1.2316 | 0.0383 | 0.0327 | 1.1564 | 0.0267 | 0.0221 | 1.2257 | 0.4151 | 0.2888 | 1.2315 | 0.0383 | 0.0326 | 1.1584 | 0.0267 | 0.0219 | |
| D[36] | 1.2236 | 0.4333 | 0.2849 | 1.2268 | 0.0383 | 0.0324 | 1.1550 | 0.0267 | 0.0218 | 1.2349 | 0.3467 | 0.2251 | 1.2291 | 0.0382 | 0.0324 | 1.1556 | 0.0266 | 0.0218 | |
| Design | MLE | | | Bayes (Prior A) | | | Bayes (Prior B) | | | MLE | | | Bayes (Prior A) | | | Bayes (Prior B) | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1→ | |||||||||||||||||||
| (0.2, 0.5) | (0.5, 0.8) | ||||||||||||||||||
| D[11] | 0.7923 | 0.8863 | 0.7707 | 0.8553 | 0.6936 | 0.6839 | 0.7769 | 0.4639 | 0.4548 | 0.7923 | 0.8821 | 0.7707 | 0.8613 | 0.6910 | 0.6717 | 0.7875 | 0.4548 | 0.3969 | |
| D[12] | 0.7877 | 1.2768 | 1.0989 | 0.8820 | 0.7757 | 0.7621 | 0.7886 | 0.5976 | 0.5253 | 0.7879 | 1.1320 | 0.9731 | 0.8723 | 0.7636 | 0.7502 | 0.7831 | 0.5439 | 0.4765 | |
| D[13] | 0.7876 | 0.9727 | 0.8339 | 0.8564 | 0.7372 | 0.7156 | 0.7769 | 0.5038 | 0.4885 | 0.7880 | 0.9027 | 0.7774 | 0.8541 | 0.7251 | 0.7043 | 0.7749 | 0.4903 | 0.4037 | |
| D[14] | 0.7903 | 0.7540 | 0.6689 | 0.7759 | 0.6049 | 0.5301 | 0.8305 | 0.4514 | 0.3945 | 0.9392 | 0.7540 | 0.6689 | 0.9500 | 0.6022 | 0.5223 | 0.9290 | 0.4264 | 0.3784 | |
| D[15] | 0.7866 | 0.7786 | 0.6890 | 0.7693 | 0.6399 | 0.5331 | 0.8217 | 0.4519 | 0.4281 | 0.9380 | 0.7786 | 0.6890 | 0.9500 | 0.6230 | 0.5325 | 0.9289 | 0.4407 | 0.3787 | |
| D[16] | 0.7838 | 0.8821 | 0.7637 | 0.7579 | 0.6792 | 0.5809 | 0.8120 | 0.4529 | 0.4422 | 0.9356 | 0.8690 | 0.7494 | 0.9499 | 0.6781 | 0.5769 | 0.9274 | 0.4521 | 0.3930 | |
| D[21] | 0.7868 | 0.5352 | 0.5070 | 0.7854 | 0.4824 | 0.4564 | 0.8281 | 0.4225 | 0.3653 | 0.7867 | 0.5308 | 0.4969 | 0.7841 | 0.4815 | 0.4459 | 0.8272 | 0.3797 | 0.3607 | |
| D[22] | 0.7789 | 0.5840 | 0.5322 | 0.7596 | 0.5185 | 0.4717 | 0.8143 | 0.4299 | 0.3930 | 0.7812 | 0.5802 | 0.5222 | 0.7624 | 0.5115 | 0.4608 | 0.8153 | 0.4225 | 0.3735 | |
| D[23] | 0.7832 | 0.5393 | 0.5090 | 0.7753 | 0.4879 | 0.4675 | 0.8225 | 0.4264 | 0.3763 | 0.7843 | 0.5381 | 0.4999 | 0.7717 | 0.4878 | 0.4484 | 0.8140 | 0.3844 | 0.3689 | |
| D[24] | 0.7913 | 0.4974 | 0.4401 | 0.8296 | 0.4309 | 0.4164 | 0.8114 | 0.3689 | 0.3520 | 0.7908 | 0.4764 | 0.4334 | 0.8308 | 0.4184 | 0.4127 | 0.8123 | 0.3675 | 0.3489 | |
| D[25] | 0.7902 | 0.5126 | 0.4779 | 0.8293 | 0.4418 | 0.4324 | 0.8101 | 0.3759 | 0.3551 | 0.7906 | 0.4894 | 0.4482 | 0.8310 | 0.4358 | 0.4188 | 0.8135 | 0.3713 | 0.3490 | |
| D[26] | 0.7895 | 0.5219 | 0.4912 | 0.8336 | 0.4603 | 0.4454 | 0.8164 | 0.4199 | 0.3626 | 0.7902 | 0.5167 | 0.4794 | 0.8303 | 0.4462 | 0.4388 | 0.8111 | 0.3790 | 0.3530 | |
| D[31] | 0.7903 | 0.4148 | 0.4015 | 0.8263 | 0.3737 | 0.3593 | 0.8097 | 0.3488 | 0.3213 | 0.7899 | 0.4056 | 0.3915 | 0.8246 | 0.3730 | 0.3499 | 0.8078 | 0.3421 | 0.3187 | |
| D[32] | 0.7862 | 0.4627 | 0.4365 | 0.8300 | 0.4085 | 0.4054 | 0.8145 | 0.3653 | 0.3503 | 0.7868 | 0.4538 | 0.4123 | 0.8292 | 0.4074 | 0.3891 | 0.8133 | 0.3592 | 0.3460 | |
| D[33] | 0.7871 | 0.4312 | 0.4125 | 0.8269 | 0.4017 | 0.3769 | 0.8117 | 0.3557 | 0.3498 | 0.7872 | 0.4223 | 0.4112 | 0.8325 | 0.3852 | 0.3570 | 0.8154 | 0.3536 | 0.3434 | |
| D[34] | 0.7892 | 0.3753 | 0.3687 | 0.7320 | 0.3637 | 0.3290 | 0.7705 | 0.1900 | 0.0556 | 0.7892 | 0.3700 | 0.3667 | 0.7320 | 0.3374 | 0.3273 | 0.7705 | 0.1897 | 0.0505 | |
| D[35] | 0.7892 | 0.3903 | 0.3727 | 0.7300 | 0.3654 | 0.3399 | 0.7700 | 0.1913 | 0.0561 | 0.7893 | 0.3784 | 0.3708 | 0.7300 | 0.3422 | 0.3366 | 0.7700 | 0.1908 | 0.0509 | |
| D[36] | 0.7881 | 0.4075 | 0.3910 | 0.7217 | 0.3717 | 0.3436 | 0.7669 | 0.1916 | 0.0671 | 0.7881 | 0.4018 | 0.3844 | 0.7217 | 0.3571 | 0.3372 | 0.7669 | 0.1910 | 0.0605 | |
| Set-2→ | |||||||||||||||||||
| (0.5, 1.0) | (1.5, 2.5) | ||||||||||||||||||
| D[11] | 0.9392 | 0.3953 | 0.3436 | 0.9500 | 0.2050 | 0.1959 | 0.9290 | 0.0894 | 0.0784 | 0.9377 | 0.3915 | 0.3422 | 0.9499 | 0.2048 | 0.1959 | 0.9275 | 0.0884 | 0.0775 | |
| D[12] | 0.9356 | 0.4203 | 0.3631 | 0.9499 | 0.2155 | 0.2055 | 0.9274 | 0.0905 | 0.0791 | 0.9345 | 0.4164 | 0.3571 | 0.9501 | 0.2153 | 0.2054 | 0.9275 | 0.0894 | 0.0784 | |
| D[13] | 0.9380 | 0.4174 | 0.3591 | 0.9500 | 0.2112 | 0.2017 | 0.9289 | 0.0899 | 0.0786 | 0.9370 | 0.4091 | 0.3528 | 0.9499 | 0.2086 | 0.1992 | 0.9275 | 0.0893 | 0.0782 | |
| D[14] | 0.9344 | 0.3788 | 0.3332 | 0.9285 | 0.1900 | 0.1897 | 0.9382 | 0.0841 | 0.0739 | 0.9344 | 0.3779 | 0.3324 | 0.9285 | 0.1897 | 0.1894 | 0.9382 | 0.0803 | 0.0709 | |
| D[15] | 0.9338 | 0.3849 | 0.3356 | 0.9286 | 0.1913 | 0.1899 | 0.9384 | 0.0877 | 0.0770 | 0.9339 | 0.3803 | 0.3345 | 0.9285 | 0.1908 | 0.1896 | 0.9382 | 0.0844 | 0.0740 | |
| D[16] | 0.9311 | 0.3856 | 0.3386 | 0.9285 | 0.1921 | 0.1914 | 0.9383 | 0.0884 | 0.0775 | 0.9306 | 0.3844 | 0.3374 | 0.9279 | 0.1916 | 0.1910 | 0.9378 | 0.0877 | 0.0770 | |
| D[21] | 0.9334 | 0.3016 | 0.2662 | 0.9287 | 0.0757 | 0.0701 | 0.9385 | 0.0673 | 0.0595 | 0.9334 | 0.2995 | 0.2636 | 0.9287 | 0.0747 | 0.0701 | 0.9385 | 0.0671 | 0.0523 | |
| D[22] | 0.9298 | 0.3436 | 0.2889 | 0.9275 | 0.0835 | 0.0795 | 0.9373 | 0.0703 | 0.0605 | 0.9299 | 0.3358 | 0.2821 | 0.9277 | 0.0819 | 0.0778 | 0.9378 | 0.0682 | 0.0600 | |
| D[23] | 0.9325 | 0.3111 | 0.2730 | 0.9286 | 0.0786 | 0.0742 | 0.9382 | 0.0696 | 0.0600 | 0.9330 | 0.3087 | 0.2730 | 0.9286 | 0.0770 | 0.0701 | 0.9382 | 0.0671 | 0.0528 | |
| D[24] | 0.9358 | 0.2960 | 0.2556 | 0.9384 | 0.0743 | 0.0661 | 0.9347 | 0.0651 | 0.0523 | 0.9360 | 0.2951 | 0.2556 | 0.9384 | 0.0713 | 0.0651 | 0.9347 | 0.0584 | 0.0521 | |
| D[25] | 0.9360 | 0.2989 | 0.2614 | 0.9384 | 0.0746 | 0.0689 | 0.9347 | 0.0651 | 0.0523 | 0.9360 | 0.2960 | 0.2610 | 0.9389 | 0.0734 | 0.0665 | 0.9349 | 0.0594 | 0.0521 | |
| D[26] | 0.9349 | 0.2993 | 0.2634 | 0.9388 | 0.0747 | 0.0701 | 0.9349 | 0.0668 | 0.0528 | 0.9351 | 0.2977 | 0.2612 | 0.9388 | 0.0747 | 0.0688 | 0.9349 | 0.0665 | 0.0521 | |
| D[31] | 0.9340 | 0.2408 | 0.2083 | 0.9390 | 0.0712 | 0.0584 | 0.9348 | 0.0534 | 0.0521 | 0.9340 | 0.2381 | 0.2082 | 0.9390 | 0.0709 | 0.0578 | 0.9348 | 0.0534 | 0.0505 | |
| D[32] | 0.9321 | 0.2591 | 0.2261 | 0.9399 | 0.0735 | 0.0594 | 0.9354 | 0.0580 | 0.0521 | 0.9323 | 0.2580 | 0.2257 | 0.9398 | 0.0712 | 0.0585 | 0.9354 | 0.0573 | 0.0521 | |
| D[33] | 0.9339 | 0.2441 | 0.2130 | 0.9390 | 0.0712 | 0.0584 | 0.9348 | 0.0559 | 0.0521 | 0.9341 | 0.2408 | 0.2130 | 0.9394 | 0.0710 | 0.0580 | 0.9348 | 0.0534 | 0.0509 | |
| D[34] | 0.9316 | 0.2278 | 0.1941 | 0.9145 | 0.0700 | 0.0578 | 0.9300 | 0.0531 | 0.0472 | 0.9315 | 0.2259 | 0.1928 | 0.9145 | 0.0700 | 0.0556 | 0.9299 | 0.0530 | 0.0469 | |
| D[35] | 0.9314 | 0.2296 | 0.1961 | 0.9142 | 0.0706 | 0.0580 | 0.9299 | 0.0534 | 0.0485 | 0.9314 | 0.2269 | 0.1942 | 0.9140 | 0.0700 | 0.0561 | 0.9296 | 0.0533 | 0.0470 | |
| D[36] | 0.9304 | 0.2365 | 0.2077 | 0.9136 | 0.0706 | 0.0580 | 0.9298 | 0.0534 | 0.0490 | 0.9304 | 0.2354 | 0.2028 | 0.9136 | 0.0706 | 0.0578 | 0.9296 | 0.0534 | 0.0474 | |
| Design | MLE | | | Bayes (Prior A) | | | Bayes (Prior B) | | | MLE | | | Bayes (Prior A) | | | Bayes (Prior B) | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1→ | |||||||||||||||||||
| (0.2, 0.5) | (0.5, 0.8) | ||||||||||||||||||
| D[11] | 1.4126 | 0.5899 | 0.5670 | 0.8554 | 0.4437 | 0.3953 | 1.2361 | 0.3604 | 0.2376 | 1.4134 | 0.5787 | 0.5481 | 0.8178 | 0.4222 | 0.3882 | 1.1747 | 0.3024 | 0.2053 | |
| D[12] | 1.4542 | 0.8396 | 0.6775 | 0.7015 | 0.6253 | 0.5494 | 1.1667 | 0.4241 | 0.2690 | 1.4349 | 0.7429 | 0.6313 | 0.7574 | 0.5506 | 0.4923 | 1.1946 | 0.3772 | 0.2526 | |
| D[13] | 1.4857 | 0.6574 | 0.6189 | 0.8473 | 0.5178 | 0.4313 | 1.2389 | 0.3807 | 0.2571 | 1.4709 | 0.6436 | 0.6014 | 0.8625 | 0.4442 | 0.4030 | 1.2531 | 0.3531 | 0.2073 | |
| D[14] | 1.3230 | 0.4700 | 0.3290 | 1.3272 | 0.2784 | 0.2504 | 0.9304 | 0.2228 | 0.1499 | 0.9566 | 0.4643 | 0.2881 | 0.7351 | 0.2771 | 0.2441 | 0.9796 | 0.2003 | 0.1445 | |
| D[15] | 1.4002 | 0.4817 | 0.3554 | 1.3666 | 0.3018 | 0.2523 | 0.9844 | 0.2378 | 0.1911 | 0.9925 | 0.4686 | 0.3302 | 0.7346 | 0.2875 | 0.2457 | 0.9801 | 0.2275 | 0.1453 | |
| D[16] | 1.4238 | 0.5408 | 0.3812 | 1.4334 | 0.3475 | 0.2532 | 1.0424 | 0.2450 | 0.2047 | 1.0354 | 0.4775 | 0.3491 | 0.7365 | 0.2892 | 0.2463 | 0.9977 | 0.2288 | 0.1491 | |
| D[21] | 1.3815 | 0.4453 | 0.2739 | 1.2833 | 0.2335 | 0.2153 | 0.9454 | 0.1591 | 0.1340 | 1.3846 | 0.3968 | 0.2606 | 1.2907 | 0.2206 | 0.1785 | 0.9511 | 0.1345 | 0.1178 | |
| D[22] | 1.4452 | 0.4641 | 0.3257 | 1.4196 | 0.2731 | 0.2213 | 1.0205 | 0.1845 | 0.1378 | 1.4014 | 0.4537 | 0.2774 | 1.3999 | 0.2339 | 0.1912 | 1.0109 | 0.1416 | 0.1263 | |
| D[23] | 1.4042 | 0.4506 | 0.2928 | 1.3420 | 0.2630 | 0.2181 | 0.9818 | 0.1600 | 0.1357 | 1.3921 | 0.4471 | 0.2700 | 1.3667 | 0.2207 | 0.1821 | 1.0311 | 0.1396 | 0.1200 | |
| D[24] | 1.3223 | 0.3122 | 0.1979 | 1.1440 | 0.1888 | 0.1764 | 1.1772 | 0.1203 | 0.1161 | 1.3205 | 0.2968 | 0.1942 | 1.1348 | 0.1865 | 0.1706 | 1.1696 | 0.1163 | 0.1059 | |
| D[25] | 1.3397 | 0.3304 | 0.2062 | 1.1456 | 0.2030 | 0.1793 | 1.1849 | 0.1310 | 0.1178 | 1.3296 | 0.2978 | 0.2048 | 1.1309 | 0.1903 | 0.1760 | 1.1624 | 0.1256 | 0.1159 | |
| D[26] | 1.3551 | 0.3309 | 0.2102 | 1.1122 | 0.2062 | 0.1927 | 1.1401 | 0.1340 | 0.1306 | 1.3270 | 0.3034 | 0.2067 | 1.1391 | 0.1970 | 0.1781 | 1.1781 | 0.1307 | 0.1175 | |
| D[31] | 1.3060 | 0.2657 | 0.1823 | 1.1697 | 0.1706 | 0.1584 | 1.1851 | 0.1159 | 0.1086 | 1.3142 | 0.2540 | 0.1823 | 1.1778 | 0.1655 | 0.1543 | 1.2000 | 0.1145 | 0.1022 | |
| D[32] | 1.3591 | 0.3091 | 0.1909 | 1.1427 | 0.1877 | 0.1736 | 1.1642 | 0.1176 | 0.1143 | 1.3402 | 0.2905 | 0.1905 | 1.1498 | 0.1861 | 0.1671 | 1.1716 | 0.1148 | 0.1057 | |
| D[33] | 1.3598 | 0.2861 | 0.1880 | 1.1626 | 0.1849 | 0.1727 | 1.1695 | 0.1161 | 0.1140 | 1.3547 | 0.2803 | 0.1868 | 1.1215 | 0.1813 | 0.1663 | 1.1383 | 0.1146 | 0.1040 | |
| D[34] | 1.2840 | 0.2015 | 0.1671 | 1.3869 | 0.1598 | 0.1249 | 1.2437 | 0.1008 | 0.0570 | 1.2822 | 0.1997 | 0.1652 | 1.3869 | 0.1274 | 0.1022 | 1.2437 | 0.0957 | 0.0479 | |
| D[35] | 1.2871 | 0.2061 | 0.1795 | 1.3906 | 0.1663 | 0.1269 | 1.2464 | 0.1070 | 0.0571 | 1.2844 | 0.2024 | 0.1762 | 1.3906 | 0.1289 | 0.1112 | 1.2464 | 0.1057 | 0.0489 | |
| D[36] | 1.2863 | 0.2068 | 0.1813 | 1.4262 | 0.1671 | 0.1288 | 1.2682 | 0.1138 | 0.0622 | 1.2862 | 0.2066 | 0.1795 | 1.4262 | 0.1290 | 0.1145 | 1.2682 | 0.1059 | 0.0537 | |
| Set-2→ | |||||||||||||||||||
| (0.5, 1.0) | (1.5, 2.5) | ||||||||||||||||||
| D[11] | 0.9566 | 0.4568 | 0.3673 | 0.7351 | 0.2504 | 0.2441 | 0.9796 | 0.1192 | 0.1012 | 0.9714 | 0.4506 | 0.3604 | 0.7368 | 0.2497 | 0.2435 | 0.9970 | 0.1176 | 0.0996 | |
| D[12] | 1.0354 | 0.5520 | 0.4317 | 0.7365 | 0.2540 | 0.2469 | 0.9977 | 0.1208 | 0.1019 | 1.0331 | 0.5408 | 0.4241 | 0.7343 | 0.2532 | 0.2463 | 0.9960 | 0.1192 | 0.1013 | |
| D[13] | 0.9925 | 0.4932 | 0.3876 | 0.7346 | 0.2523 | 0.2457 | 0.9801 | 0.1199 | 0.1013 | 0.9944 | 0.4817 | 0.3807 | 0.7369 | 0.2497 | 0.2436 | 0.9970 | 0.1189 | 0.1005 | |
| D[14] | 0.9677 | 0.3878 | 0.3138 | 1.0065 | 0.2000 | 0.1868 | 0.8721 | 0.1122 | 0.0949 | 0.9685 | 0.3850 | 0.3126 | 1.0065 | 0.1991 | 0.1857 | 0.8721 | 0.1073 | 0.0910 | |
| D[15] | 0.9836 | 0.4061 | 0.3285 | 1.0043 | 0.2062 | 0.1923 | 0.8694 | 0.1173 | 0.0994 | 0.9898 | 0.3985 | 0.3243 | 1.0065 | 0.2027 | 0.1889 | 0.8712 | 0.1132 | 0.0957 | |
| D[16] | 1.0118 | 0.4526 | 0.3612 | 1.0061 | 0.2105 | 0.1961 | 0.8703 | 0.1176 | 0.0996 | 1.0228 | 0.4497 | 0.3548 | 1.0138 | 0.2103 | 0.1959 | 0.8769 | 0.1173 | 0.0994 | |
| D[21] | 0.9884 | 0.3309 | 0.2671 | 1.0039 | 0.0821 | 0.0693 | 0.8684 | 0.0664 | 0.0592 | 0.9884 | 0.3275 | 0.2641 | 1.0039 | 0.0821 | 0.0690 | 0.8684 | 0.0656 | 0.0584 | |
| D[22] | 1.0513 | 0.3822 | 0.2987 | 1.0177 | 0.0881 | 0.0756 | 0.8823 | 0.0723 | 0.0600 | 1.0482 | 0.3816 | 0.2982 | 1.0144 | 0.0862 | 0.0741 | 0.8759 | 0.0699 | 0.0600 | |
| D[23] | 1.0174 | 0.3323 | 0.2786 | 1.0055 | 0.0856 | 0.0715 | 0.8718 | 0.0691 | 0.0592 | 1.0085 | 0.3279 | 0.2780 | 1.0045 | 0.0835 | 0.0703 | 0.8718 | 0.0671 | 0.0592 | |
| D[24] | 0.9859 | 0.3069 | 0.2493 | 0.9064 | 0.0816 | 0.0680 | 0.9329 | 0.0651 | 0.0584 | 0.9804 | 0.3046 | 0.2483 | 0.9064 | 0.0813 | 0.0671 | 0.9329 | 0.0624 | 0.0584 | |
| D[25] | 0.9910 | 0.3080 | 0.2630 | 0.9064 | 0.0820 | 0.0680 | 0.9329 | 0.0656 | 0.0584 | 0.9800 | 0.3080 | 0.2624 | 0.9003 | 0.0816 | 0.0680 | 0.9302 | 0.0651 | 0.0584 | |
| D[26] | 1.0049 | 0.3103 | 0.2664 | 0.9021 | 0.0821 | 0.0691 | 0.9302 | 0.0656 | 0.0584 | 0.9929 | 0.3093 | 0.2640 | 0.9022 | 0.0818 | 0.0688 | 0.9302 | 0.0656 | 0.0584 | |
| D[31] | 0.9981 | 0.2540 | 0.2116 | 0.9006 | 0.0748 | 0.0670 | 0.9321 | 0.0624 | 0.0533 | 0.9981 | 0.2507 | 0.2116 | 0.9006 | 0.0741 | 0.0670 | 0.9321 | 0.0620 | 0.0489 | |
| D[32] | 1.0376 | 0.3064 | 0.2401 | 0.8899 | 0.0813 | 0.0671 | 0.9255 | 0.0649 | 0.0539 | 1.0260 | 0.3007 | 0.2401 | 0.8917 | 0.0813 | 0.0671 | 0.9255 | 0.0622 | 0.0537 | |
| D[33] | 1.0092 | 0.2550 | 0.2158 | 0.9006 | 0.0785 | 0.0671 | 0.9320 | 0.0624 | 0.0539 | 0.9972 | 0.2549 | 0.2122 | 0.8956 | 0.0741 | 0.0670 | 0.9321 | 0.0621 | 0.0489 | |
| D[34] | 0.9749 | 0.2199 | 0.1820 | 1.1356 | 0.0695 | 0.0615 | 0.9789 | 0.0608 | 0.0484 | 0.9731 | 0.2161 | 0.1783 | 1.1366 | 0.0695 | 0.0608 | 0.9796 | 0.0570 | 0.0478 | |
| D[35] | 0.9843 | 0.2249 | 0.1871 | 1.1387 | 0.0733 | 0.0623 | 0.9790 | 0.0608 | 0.0494 | 0.9737 | 0.2193 | 0.1809 | 1.1419 | 0.0695 | 0.0619 | 0.9831 | 0.0571 | 0.0482 | |
| D[36] | 0.9959 | 0.2507 | 0.2070 | 1.1456 | 0.0741 | 0.0623 | 0.9813 | 0.0620 | 0.0499 | 0.9858 | 0.2420 | 0.2010 | 1.1454 | 0.0734 | 0.0622 | 0.9838 | 0.0620 | 0.0483 | |
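The point-estimation tables above report, for each censoring design, a triple of summaries per estimator over the Monte Carlo replicates. As an illustration of how such summaries are conventionally obtained — assuming the triples are the average estimate (AE), root mean squared error (RMSE), and mean relative absolute bias (MRAB), since the metric labels are not repeated in this excerpt; the function name below is hypothetical — a minimal sketch:

```python
import numpy as np

def point_summaries(estimates, true_value):
    """Summarize Monte Carlo estimates of a single parameter.

    estimates : array of parameter estimates, one per simulated sample
    true_value: parameter value used to generate the data

    Returns (AE, RMSE, MRAB): average estimate, root mean squared
    error, and mean relative absolute bias.
    """
    est = np.asarray(estimates, dtype=float)
    ae = est.mean()                                        # average estimate
    rmse = np.sqrt(np.mean((est - true_value) ** 2))       # root mean squared error
    mrab = np.mean(np.abs(est - true_value) / true_value)  # mean relative absolute bias
    return ae, rmse, mrab

# Toy usage: synthetic "estimates" scattered around a true value of 0.5
rng = np.random.default_rng(1)
ae, rmse, mrab = point_summaries(0.5 + 0.1 * rng.standard_normal(1000), 0.5)
```

Lower RMSE and MRAB indicate a more precise estimator, which is the sense in which the Bayes columns dominate the MLE columns in these tables.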
| Design | ACI[Norm] | | ACI[Log-Norm] | | BCI (Prior A) | | BCI (Prior B) | | HPD (Prior A) | | HPD (Prior B) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1 | ||||||||||||
| D[11] | 1.083 | 0.934 | 0.396 | 0.972 | 0.312 | 0.977 | 0.278 | 0.979 | 0.242 | 0.981 | 0.218 | 0.982 |
| D[12] | 1.166 | 0.929 | 0.480 | 0.968 | 0.313 | 0.977 | 0.280 | 0.979 | 0.242 | 0.981 | 0.221 | 0.982 |
| D[13] | 0.943 | 0.942 | 0.331 | 0.976 | 0.310 | 0.977 | 0.273 | 0.979 | 0.242 | 0.981 | 0.217 | 0.982 |
| D[14] | 0.786 | 0.950 | 0.321 | 0.976 | 0.273 | 0.979 | 0.227 | 0.982 | 0.218 | 0.982 | 0.146 | 0.986 |
| D[15] | 0.883 | 0.945 | 0.329 | 0.976 | 0.276 | 0.979 | 0.227 | 0.982 | 0.218 | 0.982 | 0.147 | 0.986 |
| D[16] | 0.673 | 0.957 | 0.291 | 0.978 | 0.267 | 0.980 | 0.222 | 0.982 | 0.218 | 0.982 | 0.145 | 0.986 |
| D[21] | 0.637 | 0.959 | 0.271 | 0.979 | 0.228 | 0.982 | 0.220 | 0.982 | 0.147 | 0.986 | 0.143 | 0.987 |
| D[22] | 0.633 | 0.959 | 0.265 | 0.980 | 0.226 | 0.982 | 0.217 | 0.982 | 0.146 | 0.986 | 0.142 | 0.987 |
| D[23] | 0.548 | 0.964 | 0.253 | 0.980 | 0.223 | 0.982 | 0.216 | 0.982 | 0.143 | 0.987 | 0.140 | 0.987 |
| D[24] | 0.485 | 0.967 | 0.246 | 0.981 | 0.215 | 0.983 | 0.155 | 0.986 | 0.142 | 0.987 | 0.129 | 0.987 |
| D[25] | 0.500 | 0.966 | 0.246 | 0.981 | 0.216 | 0.982 | 0.156 | 0.986 | 0.143 | 0.987 | 0.131 | 0.987 |
| D[26] | 0.468 | 0.968 | 0.242 | 0.981 | 0.209 | 0.983 | 0.154 | 0.986 | 0.142 | 0.987 | 0.129 | 0.987 |
| D[31] | 0.363 | 0.974 | 0.188 | 0.984 | 0.155 | 0.986 | 0.129 | 0.987 | 0.121 | 0.988 | 0.111 | 0.988 |
| D[32] | 0.454 | 0.969 | 0.225 | 0.982 | 0.156 | 0.986 | 0.131 | 0.987 | 0.121 | 0.988 | 0.115 | 0.988 |
| D[33] | 0.283 | 0.979 | 0.175 | 0.985 | 0.154 | 0.986 | 0.129 | 0.987 | 0.120 | 0.988 | 0.110 | 0.988 |
| D[34] | 0.276 | 0.979 | 0.154 | 0.986 | 0.122 | 0.988 | 0.120 | 0.988 | 0.111 | 0.988 | 0.100 | 0.989 |
| D[35] | 0.270 | 0.979 | 0.142 | 0.987 | 0.121 | 0.988 | 0.119 | 0.988 | 0.111 | 0.988 | 0.099 | 0.989 |
| D[36] | 0.185 | 0.984 | 0.136 | 0.987 | 0.121 | 0.988 | 0.118 | 0.988 | 0.110 | 0.988 | 0.099 | 0.989 |
| D[11] | 0.946 | 0.941 | 0.369 | 0.974 | 0.304 | 0.977 | 0.272 | 0.979 | 0.241 | 0.981 | 0.216 | 0.982 |
| D[12] | 1.041 | 0.936 | 0.390 | 0.973 | 0.310 | 0.977 | 0.278 | 0.979 | 0.242 | 0.981 | 0.220 | 0.982 |
| D[13] | 0.899 | 0.944 | 0.311 | 0.977 | 0.282 | 0.979 | 0.272 | 0.979 | 0.224 | 0.982 | 0.216 | 0.982 |
| D[14] | 0.610 | 0.960 | 0.273 | 0.979 | 0.270 | 0.979 | 0.222 | 0.982 | 0.213 | 0.983 | 0.143 | 0.987 |
| D[15] | 0.826 | 0.948 | 0.277 | 0.979 | 0.273 | 0.979 | 0.225 | 0.982 | 0.215 | 0.982 | 0.143 | 0.987 |
| D[16] | 0.591 | 0.961 | 0.269 | 0.979 | 0.266 | 0.980 | 0.220 | 0.982 | 0.213 | 0.983 | 0.142 | 0.987 |
| D[21] | 0.496 | 0.967 | 0.263 | 0.980 | 0.221 | 0.982 | 0.213 | 0.983 | 0.144 | 0.986 | 0.141 | 0.987 |
| D[22] | 0.474 | 0.968 | 0.253 | 0.980 | 0.221 | 0.982 | 0.211 | 0.983 | 0.143 | 0.987 | 0.140 | 0.987 |
| D[23] | 0.454 | 0.969 | 0.244 | 0.981 | 0.216 | 0.982 | 0.211 | 0.983 | 0.143 | 0.987 | 0.140 | 0.987 |
| D[24] | 0.348 | 0.975 | 0.219 | 0.982 | 0.205 | 0.983 | 0.155 | 0.986 | 0.139 | 0.987 | 0.127 | 0.987 |
| D[25] | 0.412 | 0.971 | 0.223 | 0.982 | 0.215 | 0.982 | 0.155 | 0.986 | 0.139 | 0.987 | 0.129 | 0.987 |
| D[26] | 0.336 | 0.976 | 0.219 | 0.982 | 0.184 | 0.984 | 0.154 | 0.986 | 0.139 | 0.987 | 0.127 | 0.987 |
| D[31] | 0.229 | 0.982 | 0.158 | 0.986 | 0.155 | 0.986 | 0.127 | 0.987 | 0.119 | 0.988 | 0.101 | 0.989 |
| D[32] | 0.239 | 0.981 | 0.167 | 0.985 | 0.155 | 0.986 | 0.129 | 0.987 | 0.119 | 0.988 | 0.105 | 0.989 |
| D[33] | 0.205 | 0.983 | 0.157 | 0.986 | 0.154 | 0.986 | 0.127 | 0.987 | 0.118 | 0.988 | 0.101 | 0.989 |
| D[34] | 0.168 | 0.985 | 0.150 | 0.986 | 0.121 | 0.988 | 0.119 | 0.988 | 0.111 | 0.988 | 0.100 | 0.989 |
| D[35] | 0.155 | 0.986 | 0.137 | 0.987 | 0.121 | 0.988 | 0.118 | 0.988 | 0.110 | 0.988 | 0.099 | 0.989 |
| D[36] | 0.149 | 0.986 | 0.123 | 0.988 | 0.120 | 0.988 | 0.118 | 0.988 | 0.108 | 0.989 | 0.099 | 0.989 |
| Set-2 | ||||||||||||
| D[11] | 0.946 | 0.940 | 0.400 | 0.973 | 0.296 | 0.979 | 0.271 | 0.980 | 0.242 | 0.982 | 0.220 | 0.983 |
| D[12] | 1.094 | 0.931 | 0.424 | 0.971 | 0.311 | 0.978 | 0.278 | 0.980 | 0.241 | 0.982 | 0.220 | 0.983 |
| D[13] | 0.899 | 0.943 | 0.390 | 0.973 | 0.288 | 0.979 | 0.271 | 0.980 | 0.242 | 0.982 | 0.216 | 0.984 |
| D[14] | 0.527 | 0.965 | 0.263 | 0.981 | 0.208 | 0.984 | 0.204 | 0.984 | 0.142 | 0.988 | 0.139 | 0.988 |
| D[15] | 0.864 | 0.945 | 0.301 | 0.978 | 0.208 | 0.984 | 0.204 | 0.984 | 0.143 | 0.988 | 0.140 | 0.988 |
| D[16] | 0.454 | 0.969 | 0.207 | 0.984 | 0.204 | 0.984 | 0.197 | 0.985 | 0.142 | 0.988 | 0.139 | 0.988 |
| D[21] | 0.452 | 0.969 | 0.207 | 0.984 | 0.204 | 0.984 | 0.192 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[22] | 0.323 | 0.977 | 0.207 | 0.984 | 0.204 | 0.984 | 0.188 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[23] | 0.308 | 0.978 | 0.207 | 0.984 | 0.204 | 0.984 | 0.187 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[24] | 0.283 | 0.980 | 0.185 | 0.985 | 0.160 | 0.987 | 0.156 | 0.987 | 0.132 | 0.989 | 0.131 | 0.989 |
| D[25] | 0.301 | 0.978 | 0.186 | 0.985 | 0.161 | 0.987 | 0.158 | 0.987 | 0.134 | 0.988 | 0.131 | 0.989 |
| D[26] | 0.269 | 0.980 | 0.181 | 0.986 | 0.160 | 0.987 | 0.155 | 0.987 | 0.132 | 0.989 | 0.130 | 0.989 |
| D[31] | 0.245 | 0.982 | 0.170 | 0.986 | 0.122 | 0.989 | 0.119 | 0.989 | 0.113 | 0.990 | 0.102 | 0.990 |
| D[32] | 0.246 | 0.982 | 0.181 | 0.986 | 0.122 | 0.989 | 0.120 | 0.989 | 0.113 | 0.990 | 0.102 | 0.990 |
| D[33] | 0.237 | 0.982 | 0.166 | 0.987 | 0.122 | 0.989 | 0.119 | 0.989 | 0.113 | 0.990 | 0.101 | 0.990 |
| D[34] | 0.160 | 0.987 | 0.127 | 0.989 | 0.120 | 0.989 | 0.118 | 0.989 | 0.106 | 0.990 | 0.094 | 0.991 |
| D[35] | 0.187 | 0.985 | 0.156 | 0.987 | 0.121 | 0.989 | 0.119 | 0.989 | 0.112 | 0.990 | 0.100 | 0.990 |
| D[36] | 0.151 | 0.987 | 0.119 | 0.989 | 0.117 | 0.989 | 0.115 | 0.990 | 0.105 | 0.990 | 0.094 | 0.991 |
| D[11] | 0.915 | 0.942 | 0.312 | 0.978 | 0.278 | 0.980 | 0.270 | 0.980 | 0.235 | 0.982 | 0.215 | 0.984 |
| D[12] | 0.965 | 0.939 | 0.380 | 0.974 | 0.303 | 0.978 | 0.276 | 0.980 | 0.241 | 0.982 | 0.219 | 0.983 |
| D[13] | 0.841 | 0.946 | 0.311 | 0.978 | 0.273 | 0.980 | 0.269 | 0.980 | 0.220 | 0.983 | 0.215 | 0.984 |
| D[14] | 0.527 | 0.965 | 0.247 | 0.982 | 0.207 | 0.984 | 0.204 | 0.984 | 0.142 | 0.988 | 0.139 | 0.988 |
| D[15] | 0.780 | 0.950 | 0.265 | 0.981 | 0.208 | 0.984 | 0.204 | 0.984 | 0.143 | 0.988 | 0.139 | 0.988 |
| D[16] | 0.452 | 0.969 | 0.207 | 0.984 | 0.204 | 0.984 | 0.197 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[21] | 0.345 | 0.976 | 0.207 | 0.984 | 0.204 | 0.984 | 0.190 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[22] | 0.322 | 0.977 | 0.207 | 0.984 | 0.204 | 0.984 | 0.186 | 0.985 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[23] | 0.304 | 0.978 | 0.206 | 0.984 | 0.203 | 0.984 | 0.182 | 0.986 | 0.141 | 0.988 | 0.138 | 0.988 |
| D[24] | 0.273 | 0.980 | 0.177 | 0.986 | 0.160 | 0.987 | 0.156 | 0.987 | 0.131 | 0.989 | 0.130 | 0.989 |
| D[25] | 0.292 | 0.979 | 0.181 | 0.986 | 0.160 | 0.987 | 0.156 | 0.987 | 0.132 | 0.989 | 0.131 | 0.989 |
| D[26] | 0.258 | 0.981 | 0.175 | 0.986 | 0.159 | 0.987 | 0.154 | 0.987 | 0.131 | 0.989 | 0.130 | 0.989 |
| D[31] | 0.218 | 0.983 | 0.169 | 0.986 | 0.122 | 0.989 | 0.119 | 0.989 | 0.113 | 0.990 | 0.101 | 0.990 |
| D[32] | 0.237 | 0.982 | 0.171 | 0.986 | 0.122 | 0.989 | 0.119 | 0.989 | 0.113 | 0.990 | 0.102 | 0.990 |
| D[33] | 0.203 | 0.984 | 0.165 | 0.987 | 0.121 | 0.989 | 0.119 | 0.989 | 0.112 | 0.990 | 0.100 | 0.990 |
| D[34] | 0.157 | 0.987 | 0.122 | 0.989 | 0.120 | 0.989 | 0.115 | 0.990 | 0.105 | 0.990 | 0.094 | 0.991 |
| D[35] | 0.181 | 0.986 | 0.153 | 0.987 | 0.121 | 0.989 | 0.119 | 0.989 | 0.111 | 0.990 | 0.097 | 0.991 |
| D[36] | 0.142 | 0.988 | 0.119 | 0.989 | 0.117 | 0.989 | 0.111 | 0.990 | 0.102 | 0.990 | 0.091 | 0.991 |
| Design | ACI[Norm] | | ACI[Log-Norm] | | BCI (Prior A) | | BCI (Prior B) | | HPD (Prior A) | | HPD (Prior B) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1 | ||||||||||||
| D[11] | 1.190 | 0.927 | 0.854 | 0.944 | 0.214 | 0.977 | 0.209 | 0.977 | 0.207 | 0.977 | 0.185 | 0.978 |
| D[12] | 1.015 | 0.936 | 0.845 | 0.945 | 0.213 | 0.977 | 0.208 | 0.977 | 0.204 | 0.977 | 0.184 | 0.978 |
| D[13] | 1.377 | 0.918 | 1.147 | 0.930 | 0.215 | 0.977 | 0.211 | 0.977 | 0.209 | 0.977 | 0.188 | 0.978 |
| D[14] | 0.981 | 0.938 | 0.753 | 0.950 | 0.212 | 0.977 | 0.183 | 0.978 | 0.167 | 0.979 | 0.159 | 0.980 |
| D[15] | 0.967 | 0.939 | 0.601 | 0.957 | 0.205 | 0.977 | 0.180 | 0.979 | 0.163 | 0.979 | 0.156 | 0.980 |
| D[16] | 0.969 | 0.939 | 0.745 | 0.950 | 0.209 | 0.977 | 0.183 | 0.978 | 0.165 | 0.979 | 0.156 | 0.980 |
| D[21] | 0.737 | 0.950 | 0.562 | 0.959 | 0.168 | 0.979 | 0.163 | 0.979 | 0.158 | 0.980 | 0.150 | 0.980 |
| D[22] | 0.604 | 0.957 | 0.526 | 0.961 | 0.166 | 0.979 | 0.160 | 0.980 | 0.156 | 0.980 | 0.148 | 0.980 |
| D[23] | 0.796 | 0.947 | 0.583 | 0.958 | 0.171 | 0.979 | 0.163 | 0.979 | 0.158 | 0.980 | 0.151 | 0.980 |
| D[24] | 0.510 | 0.962 | 0.481 | 0.963 | 0.165 | 0.979 | 0.159 | 0.980 | 0.155 | 0.980 | 0.141 | 0.981 |
| D[25] | 0.479 | 0.963 | 0.462 | 0.964 | 0.162 | 0.979 | 0.154 | 0.980 | 0.149 | 0.980 | 0.137 | 0.981 |
| D[26] | 0.499 | 0.962 | 0.472 | 0.964 | 0.165 | 0.979 | 0.159 | 0.980 | 0.152 | 0.980 | 0.137 | 0.981 |
| D[31] | 0.466 | 0.964 | 0.447 | 0.965 | 0.159 | 0.980 | 0.141 | 0.981 | 0.125 | 0.981 | 0.122 | 0.982 |
| D[32] | 0.423 | 0.966 | 0.382 | 0.968 | 0.156 | 0.980 | 0.134 | 0.981 | 0.123 | 0.981 | 0.121 | 0.982 |
| D[33] | 0.458 | 0.964 | 0.426 | 0.966 | 0.159 | 0.980 | 0.136 | 0.981 | 0.124 | 0.981 | 0.121 | 0.982 |
| D[34] | 0.393 | 0.968 | 0.359 | 0.970 | 0.127 | 0.981 | 0.126 | 0.981 | 0.123 | 0.981 | 0.120 | 0.982 |
| D[35] | 0.334 | 0.971 | 0.325 | 0.971 | 0.122 | 0.981 | 0.119 | 0.982 | 0.119 | 0.982 | 0.093 | 0.983 |
| D[36] | 0.353 | 0.970 | 0.327 | 0.971 | 0.124 | 0.981 | 0.124 | 0.981 | 0.122 | 0.981 | 0.120 | 0.982 |
| D[11] | 1.009 | 0.937 | 0.803 | 0.947 | 0.213 | 0.977 | 0.207 | 0.977 | 0.206 | 0.977 | 0.173 | 0.979 |
| D[12] | 0.937 | 0.940 | 0.693 | 0.953 | 0.212 | 0.977 | 0.205 | 0.977 | 0.201 | 0.978 | 0.172 | 0.979 |
| D[13] | 1.218 | 0.926 | 0.970 | 0.939 | 0.214 | 0.977 | 0.210 | 0.977 | 0.208 | 0.977 | 0.175 | 0.979 |
| D[14] | 0.865 | 0.944 | 0.594 | 0.958 | 0.211 | 0.977 | 0.175 | 0.979 | 0.165 | 0.979 | 0.151 | 0.980 |
| D[15] | 0.796 | 0.947 | 0.569 | 0.959 | 0.203 | 0.977 | 0.171 | 0.979 | 0.161 | 0.980 | 0.149 | 0.980 |
| D[16] | 0.821 | 0.946 | 0.581 | 0.958 | 0.209 | 0.977 | 0.173 | 0.979 | 0.163 | 0.979 | 0.149 | 0.980 |
| D[21] | 0.629 | 0.956 | 0.533 | 0.961 | 0.166 | 0.979 | 0.160 | 0.980 | 0.153 | 0.980 | 0.146 | 0.980 |
| D[22] | 0.553 | 0.960 | 0.502 | 0.962 | 0.166 | 0.979 | 0.160 | 0.980 | 0.153 | 0.980 | 0.145 | 0.980 |
| D[23] | 0.666 | 0.954 | 0.553 | 0.960 | 0.171 | 0.979 | 0.160 | 0.980 | 0.155 | 0.980 | 0.147 | 0.980 |
| D[24] | 0.481 | 0.963 | 0.479 | 0.963 | 0.164 | 0.979 | 0.157 | 0.980 | 0.148 | 0.980 | 0.140 | 0.981 |
| D[25] | 0.466 | 0.964 | 0.445 | 0.965 | 0.160 | 0.980 | 0.153 | 0.980 | 0.142 | 0.980 | 0.137 | 0.981 |
| D[26] | 0.479 | 0.963 | 0.464 | 0.964 | 0.162 | 0.979 | 0.157 | 0.980 | 0.144 | 0.980 | 0.137 | 0.981 |
| D[31] | 0.461 | 0.964 | 0.435 | 0.966 | 0.157 | 0.980 | 0.140 | 0.981 | 0.125 | 0.981 | 0.121 | 0.982 |
| D[32] | 0.393 | 0.968 | 0.382 | 0.968 | 0.154 | 0.980 | 0.133 | 0.981 | 0.121 | 0.982 | 0.120 | 0.982 |
| D[33] | 0.433 | 0.966 | 0.414 | 0.967 | 0.157 | 0.980 | 0.135 | 0.981 | 0.122 | 0.981 | 0.121 | 0.982 |
| D[34] | 0.361 | 0.969 | 0.350 | 0.970 | 0.127 | 0.981 | 0.126 | 0.981 | 0.120 | 0.982 | 0.119 | 0.982 |
| D[35] | 0.327 | 0.971 | 0.319 | 0.972 | 0.121 | 0.982 | 0.119 | 0.982 | 0.117 | 0.982 | 0.092 | 0.983 |
| D[36] | 0.345 | 0.970 | 0.320 | 0.971 | 0.124 | 0.981 | 0.123 | 0.981 | 0.120 | 0.982 | 0.117 | 0.982 |
| Set-2 | ||||||||||||
| D[11] | 1.021 | 0.929 | 0.323 | 0.971 | 0.213 | 0.977 | 0.207 | 0.978 | 0.191 | 0.979 | 0.183 | 0.979 |
| D[12] | 0.899 | 0.936 | 0.321 | 0.971 | 0.212 | 0.977 | 0.207 | 0.978 | 0.189 | 0.979 | 0.183 | 0.979 |
| D[13] | 1.139 | 0.922 | 0.387 | 0.967 | 0.214 | 0.977 | 0.210 | 0.978 | 0.191 | 0.979 | 0.185 | 0.979 |
| D[14] | 0.885 | 0.937 | 0.313 | 0.971 | 0.208 | 0.978 | 0.192 | 0.979 | 0.149 | 0.981 | 0.146 | 0.981 |
| D[15] | 0.648 | 0.951 | 0.294 | 0.973 | 0.205 | 0.978 | 0.190 | 0.979 | 0.147 | 0.981 | 0.144 | 0.982 |
| D[16] | 0.833 | 0.940 | 0.295 | 0.973 | 0.207 | 0.978 | 0.192 | 0.979 | 0.149 | 0.981 | 0.146 | 0.981 |
| D[21] | 0.639 | 0.952 | 0.247 | 0.975 | 0.165 | 0.980 | 0.159 | 0.981 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[22] | 0.482 | 0.961 | 0.236 | 0.976 | 0.163 | 0.980 | 0.157 | 0.981 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[23] | 0.647 | 0.951 | 0.265 | 0.974 | 0.168 | 0.980 | 0.162 | 0.980 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[24] | 0.409 | 0.966 | 0.231 | 0.976 | 0.163 | 0.980 | 0.156 | 0.981 | 0.138 | 0.982 | 0.137 | 0.982 |
| D[25] | 0.360 | 0.969 | 0.203 | 0.978 | 0.162 | 0.981 | 0.153 | 0.981 | 0.138 | 0.982 | 0.134 | 0.982 |
| D[26] | 0.382 | 0.967 | 0.205 | 0.978 | 0.163 | 0.980 | 0.156 | 0.981 | 0.138 | 0.982 | 0.136 | 0.982 |
| D[31] | 0.322 | 0.971 | 0.193 | 0.979 | 0.129 | 0.982 | 0.128 | 0.983 | 0.126 | 0.983 | 0.095 | 0.984 |
| D[32] | 0.271 | 0.974 | 0.191 | 0.979 | 0.129 | 0.982 | 0.127 | 0.983 | 0.126 | 0.983 | 0.094 | 0.985 |
| D[33] | 0.339 | 0.970 | 0.194 | 0.979 | 0.129 | 0.982 | 0.128 | 0.983 | 0.127 | 0.983 | 0.098 | 0.984 |
| D[34] | 0.265 | 0.974 | 0.188 | 0.979 | 0.127 | 0.983 | 0.127 | 0.983 | 0.125 | 0.983 | 0.091 | 0.985 |
| D[35] | 0.245 | 0.976 | 0.151 | 0.981 | 0.112 | 0.984 | 0.108 | 0.984 | 0.087 | 0.985 | 0.087 | 0.985 |
| D[36] | 0.259 | 0.975 | 0.165 | 0.980 | 0.115 | 0.983 | 0.111 | 0.984 | 0.090 | 0.985 | 0.090 | 0.985 |
| D[11] | 0.967 | 0.932 | 0.318 | 0.971 | 0.210 | 0.978 | 0.202 | 0.978 | 0.187 | 0.979 | 0.183 | 0.979 |
| D[12] | 0.887 | 0.937 | 0.305 | 0.972 | 0.208 | 0.978 | 0.201 | 0.978 | 0.185 | 0.979 | 0.180 | 0.979 |
| D[13] | 1.024 | 0.929 | 0.323 | 0.971 | 0.212 | 0.978 | 0.204 | 0.978 | 0.188 | 0.979 | 0.183 | 0.979 |
| D[14] | 0.852 | 0.939 | 0.294 | 0.973 | 0.204 | 0.978 | 0.191 | 0.979 | 0.149 | 0.981 | 0.146 | 0.981 |
| D[15] | 0.629 | 0.953 | 0.291 | 0.973 | 0.202 | 0.978 | 0.189 | 0.979 | 0.147 | 0.981 | 0.144 | 0.982 |
| D[16] | 0.788 | 0.943 | 0.291 | 0.973 | 0.202 | 0.978 | 0.191 | 0.979 | 0.149 | 0.981 | 0.145 | 0.981 |
| D[21] | 0.604 | 0.954 | 0.233 | 0.976 | 0.163 | 0.980 | 0.157 | 0.981 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[22] | 0.413 | 0.965 | 0.225 | 0.977 | 0.162 | 0.980 | 0.157 | 0.981 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[23] | 0.610 | 0.954 | 0.259 | 0.975 | 0.166 | 0.980 | 0.162 | 0.980 | 0.140 | 0.982 | 0.139 | 0.982 |
| D[24] | 0.376 | 0.968 | 0.219 | 0.977 | 0.162 | 0.981 | 0.155 | 0.981 | 0.138 | 0.982 | 0.137 | 0.982 |
| D[25] | 0.339 | 0.970 | 0.194 | 0.979 | 0.160 | 0.981 | 0.153 | 0.981 | 0.138 | 0.982 | 0.133 | 0.982 |
| D[26] | 0.371 | 0.968 | 0.200 | 0.978 | 0.160 | 0.981 | 0.153 | 0.981 | 0.138 | 0.982 | 0.135 | 0.982 |
| D[31] | 0.285 | 0.973 | 0.192 | 0.979 | 0.129 | 0.982 | 0.128 | 0.983 | 0.126 | 0.983 | 0.094 | 0.985 |
| D[32] | 0.270 | 0.974 | 0.188 | 0.979 | 0.128 | 0.983 | 0.127 | 0.983 | 0.125 | 0.983 | 0.094 | 0.985 |
| D[33] | 0.337 | 0.970 | 0.193 | 0.979 | 0.129 | 0.982 | 0.128 | 0.983 | 0.126 | 0.983 | 0.097 | 0.984 |
| D[34] | 0.262 | 0.975 | 0.168 | 0.980 | 0.127 | 0.983 | 0.126 | 0.983 | 0.125 | 0.983 | 0.091 | 0.985 |
| D[35] | 0.155 | 0.981 | 0.129 | 0.982 | 0.111 | 0.984 | 0.107 | 0.984 | 0.087 | 0.985 | 0.087 | 0.985 |
| D[36] | 0.250 | 0.975 | 0.155 | 0.981 | 0.114 | 0.983 | 0.110 | 0.984 | 0.090 | 0.985 | 0.090 | 0.985 |
| Design | ACI[Norm]: AIL | CP | ACI[Log-Norm]: AIL | CP | BCI (Prior A): AIL | CP | BCI (Prior B): AIL | CP | HPD (Prior A): AIL | CP | HPD (Prior B): AIL | CP |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1 | ||||||||||||
| D[11] | 0.720 | 0.941 | 0.307 | 0.970 | 0.269 | 0.973 | 0.243 | 0.975 | 0.193 | 0.978 | 0.184 | 0.979 |
| D[12] | 0.815 | 0.934 | 0.315 | 0.969 | 0.277 | 0.972 | 0.245 | 0.974 | 0.211 | 0.977 | 0.196 | 0.978 |
| D[13] | 0.676 | 0.944 | 0.303 | 0.970 | 0.265 | 0.973 | 0.233 | 0.975 | 0.188 | 0.978 | 0.183 | 0.979 |
| D[14] | 0.366 | 0.966 | 0.200 | 0.978 | 0.195 | 0.978 | 0.184 | 0.979 | 0.176 | 0.979 | 0.138 | 0.982 |
| D[15] | 0.430 | 0.961 | 0.212 | 0.977 | 0.201 | 0.977 | 0.187 | 0.978 | 0.183 | 0.979 | 0.140 | 0.982 |
| D[16] | 0.347 | 0.967 | 0.191 | 0.978 | 0.187 | 0.979 | 0.178 | 0.979 | 0.172 | 0.980 | 0.138 | 0.982 |
| D[21] | 0.345 | 0.967 | 0.190 | 0.978 | 0.178 | 0.979 | 0.154 | 0.981 | 0.138 | 0.982 | 0.137 | 0.982 |
| D[22] | 0.319 | 0.969 | 0.182 | 0.979 | 0.173 | 0.980 | 0.153 | 0.981 | 0.135 | 0.982 | 0.135 | 0.982 |
| D[23] | 0.315 | 0.969 | 0.182 | 0.979 | 0.164 | 0.980 | 0.152 | 0.981 | 0.132 | 0.982 | 0.132 | 0.982 |
| D[24] | 0.279 | 0.972 | 0.124 | 0.983 | 0.122 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[25] | 0.304 | 0.970 | 0.129 | 0.983 | 0.124 | 0.983 | 0.117 | 0.983 | 0.116 | 0.984 | 0.114 | 0.984 |
| D[26] | 0.223 | 0.976 | 0.123 | 0.983 | 0.121 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[31] | 0.201 | 0.978 | 0.123 | 0.983 | 0.121 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[32] | 0.218 | 0.976 | 0.123 | 0.983 | 0.121 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[33] | 0.188 | 0.978 | 0.122 | 0.983 | 0.120 | 0.983 | 0.117 | 0.983 | 0.114 | 0.984 | 0.113 | 0.984 |
| D[34] | 0.149 | 0.981 | 0.122 | 0.983 | 0.120 | 0.983 | 0.106 | 0.984 | 0.103 | 0.984 | 0.098 | 0.985 |
| D[35] | 0.122 | 0.983 | 0.118 | 0.983 | 0.116 | 0.984 | 0.105 | 0.984 | 0.101 | 0.985 | 0.096 | 0.985 |
| D[36] | 0.118 | 0.983 | 0.116 | 0.984 | 0.106 | 0.984 | 0.105 | 0.984 | 0.101 | 0.985 | 0.095 | 0.985 |
| D[11] | 0.453 | 0.960 | 0.301 | 0.970 | 0.263 | 0.973 | 0.194 | 0.978 | 0.189 | 0.978 | 0.179 | 0.979 |
| D[12] | 0.565 | 0.952 | 0.312 | 0.970 | 0.274 | 0.972 | 0.214 | 0.977 | 0.207 | 0.977 | 0.192 | 0.978 |
| D[13] | 0.393 | 0.964 | 0.301 | 0.970 | 0.262 | 0.973 | 0.192 | 0.978 | 0.187 | 0.979 | 0.179 | 0.979 |
| D[14] | 0.360 | 0.966 | 0.195 | 0.978 | 0.190 | 0.978 | 0.180 | 0.979 | 0.142 | 0.982 | 0.138 | 0.982 |
| D[15] | 0.382 | 0.965 | 0.207 | 0.977 | 0.192 | 0.978 | 0.187 | 0.979 | 0.146 | 0.981 | 0.140 | 0.982 |
| D[16] | 0.342 | 0.968 | 0.190 | 0.978 | 0.181 | 0.979 | 0.173 | 0.980 | 0.141 | 0.982 | 0.138 | 0.982 |
| D[21] | 0.302 | 0.970 | 0.183 | 0.979 | 0.173 | 0.980 | 0.154 | 0.981 | 0.138 | 0.982 | 0.137 | 0.982 |
| D[22] | 0.285 | 0.972 | 0.177 | 0.979 | 0.166 | 0.980 | 0.153 | 0.981 | 0.135 | 0.982 | 0.135 | 0.982 |
| D[23] | 0.280 | 0.972 | 0.176 | 0.979 | 0.163 | 0.980 | 0.151 | 0.981 | 0.132 | 0.982 | 0.132 | 0.982 |
| D[24] | 0.221 | 0.976 | 0.123 | 0.983 | 0.121 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[25] | 0.277 | 0.972 | 0.126 | 0.983 | 0.121 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[26] | 0.216 | 0.976 | 0.122 | 0.983 | 0.120 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[31] | 0.176 | 0.979 | 0.121 | 0.983 | 0.118 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[32] | 0.201 | 0.978 | 0.122 | 0.983 | 0.120 | 0.983 | 0.117 | 0.983 | 0.115 | 0.984 | 0.113 | 0.984 |
| D[33] | 0.156 | 0.981 | 0.120 | 0.983 | 0.117 | 0.983 | 0.117 | 0.983 | 0.114 | 0.984 | 0.112 | 0.984 |
| D[34] | 0.122 | 0.983 | 0.120 | 0.983 | 0.117 | 0.983 | 0.106 | 0.984 | 0.102 | 0.985 | 0.097 | 0.985 |
| D[35] | 0.120 | 0.983 | 0.117 | 0.984 | 0.111 | 0.984 | 0.105 | 0.984 | 0.101 | 0.985 | 0.096 | 0.985 |
| D[36] | 0.118 | 0.983 | 0.114 | 0.984 | 0.105 | 0.984 | 0.105 | 0.984 | 0.101 | 0.985 | 0.095 | 0.985 |
| Set-2 | ||||||||||||
| D[11] | 0.720 | 0.942 | 0.307 | 0.971 | 0.243 | 0.976 | 0.213 | 0.978 | 0.198 | 0.979 | 0.187 | 0.980 |
| D[12] | 0.799 | 0.936 | 0.312 | 0.971 | 0.245 | 0.976 | 0.225 | 0.977 | 0.208 | 0.978 | 0.188 | 0.980 |
| D[13] | 0.631 | 0.948 | 0.303 | 0.971 | 0.233 | 0.977 | 0.208 | 0.978 | 0.190 | 0.980 | 0.172 | 0.981 |
| D[14] | 0.337 | 0.969 | 0.176 | 0.981 | 0.172 | 0.981 | 0.157 | 0.982 | 0.141 | 0.983 | 0.108 | 0.986 |
| D[15] | 0.467 | 0.960 | 0.177 | 0.981 | 0.173 | 0.981 | 0.159 | 0.982 | 0.143 | 0.983 | 0.108 | 0.985 |
| D[16] | 0.314 | 0.971 | 0.175 | 0.981 | 0.171 | 0.981 | 0.156 | 0.982 | 0.138 | 0.983 | 0.105 | 0.986 |
| D[21] | 0.309 | 0.971 | 0.173 | 0.981 | 0.170 | 0.981 | 0.141 | 0.983 | 0.108 | 0.985 | 0.104 | 0.986 |
| D[22] | 0.295 | 0.972 | 0.173 | 0.981 | 0.169 | 0.981 | 0.139 | 0.983 | 0.108 | 0.986 | 0.104 | 0.986 |
| D[23] | 0.271 | 0.974 | 0.173 | 0.981 | 0.169 | 0.981 | 0.136 | 0.983 | 0.105 | 0.986 | 0.103 | 0.986 |
| D[24] | 0.232 | 0.977 | 0.123 | 0.984 | 0.118 | 0.985 | 0.114 | 0.985 | 0.103 | 0.986 | 0.101 | 0.986 |
| D[25] | 0.249 | 0.975 | 0.123 | 0.984 | 0.118 | 0.985 | 0.115 | 0.985 | 0.105 | 0.986 | 0.102 | 0.986 |
| D[26] | 0.218 | 0.978 | 0.123 | 0.984 | 0.118 | 0.985 | 0.113 | 0.985 | 0.100 | 0.986 | 0.099 | 0.986 |
| D[31] | 0.217 | 0.978 | 0.122 | 0.984 | 0.117 | 0.985 | 0.112 | 0.985 | 0.098 | 0.986 | 0.097 | 0.986 |
| D[32] | 0.217 | 0.978 | 0.122 | 0.984 | 0.118 | 0.985 | 0.112 | 0.985 | 0.099 | 0.986 | 0.098 | 0.986 |
| D[33] | 0.202 | 0.979 | 0.122 | 0.985 | 0.117 | 0.985 | 0.111 | 0.985 | 0.098 | 0.986 | 0.097 | 0.986 |
| D[34] | 0.169 | 0.981 | 0.113 | 0.985 | 0.112 | 0.985 | 0.109 | 0.985 | 0.095 | 0.986 | 0.093 | 0.987 |
| D[35] | 0.174 | 0.981 | 0.113 | 0.985 | 0.112 | 0.985 | 0.111 | 0.985 | 0.096 | 0.986 | 0.094 | 0.987 |
| D[36] | 0.167 | 0.981 | 0.112 | 0.985 | 0.111 | 0.985 | 0.095 | 0.986 | 0.094 | 0.986 | 0.088 | 0.987 |
| D[11] | 0.453 | 0.961 | 0.269 | 0.974 | 0.231 | 0.977 | 0.207 | 0.978 | 0.189 | 0.980 | 0.171 | 0.981 |
| D[12] | 0.467 | 0.960 | 0.274 | 0.974 | 0.241 | 0.976 | 0.221 | 0.977 | 0.196 | 0.979 | 0.177 | 0.981 |
| D[13] | 0.393 | 0.965 | 0.265 | 0.974 | 0.231 | 0.977 | 0.198 | 0.979 | 0.189 | 0.980 | 0.171 | 0.981 |
| D[14] | 0.314 | 0.971 | 0.175 | 0.981 | 0.171 | 0.981 | 0.155 | 0.982 | 0.141 | 0.983 | 0.107 | 0.986 |
| D[15] | 0.360 | 0.967 | 0.177 | 0.981 | 0.172 | 0.981 | 0.156 | 0.982 | 0.142 | 0.983 | 0.108 | 0.986 |
| D[16] | 0.309 | 0.971 | 0.173 | 0.981 | 0.169 | 0.981 | 0.155 | 0.982 | 0.138 | 0.983 | 0.105 | 0.986 |
| D[21] | 0.309 | 0.971 | 0.173 | 0.981 | 0.169 | 0.981 | 0.141 | 0.983 | 0.108 | 0.986 | 0.104 | 0.986 |
| D[22] | 0.293 | 0.972 | 0.173 | 0.981 | 0.169 | 0.981 | 0.136 | 0.984 | 0.107 | 0.986 | 0.102 | 0.986 |
| D[23] | 0.246 | 0.976 | 0.172 | 0.981 | 0.168 | 0.981 | 0.135 | 0.984 | 0.105 | 0.986 | 0.101 | 0.986 |
| D[24] | 0.196 | 0.979 | 0.122 | 0.984 | 0.118 | 0.985 | 0.114 | 0.985 | 0.101 | 0.986 | 0.100 | 0.986 |
| D[25] | 0.236 | 0.976 | 0.122 | 0.984 | 0.118 | 0.985 | 0.115 | 0.985 | 0.103 | 0.986 | 0.100 | 0.986 |
| D[26] | 0.187 | 0.980 | 0.122 | 0.984 | 0.117 | 0.985 | 0.112 | 0.985 | 0.099 | 0.986 | 0.098 | 0.986 |
| D[31] | 0.166 | 0.981 | 0.122 | 0.985 | 0.117 | 0.985 | 0.111 | 0.985 | 0.097 | 0.986 | 0.097 | 0.986 |
| D[32] | 0.169 | 0.981 | 0.122 | 0.984 | 0.117 | 0.985 | 0.111 | 0.985 | 0.098 | 0.986 | 0.097 | 0.986 |
| D[33] | 0.163 | 0.982 | 0.122 | 0.985 | 0.117 | 0.985 | 0.110 | 0.985 | 0.097 | 0.986 | 0.095 | 0.986 |
| D[34] | 0.140 | 0.983 | 0.112 | 0.985 | 0.111 | 0.985 | 0.108 | 0.986 | 0.094 | 0.986 | 0.092 | 0.987 |
| D[35] | 0.161 | 0.982 | 0.112 | 0.985 | 0.112 | 0.985 | 0.108 | 0.985 | 0.095 | 0.986 | 0.093 | 0.987 |
| D[36] | 0.132 | 0.984 | 0.111 | 0.985 | 0.111 | 0.985 | 0.094 | 0.986 | 0.094 | 0.987 | 0.087 | 0.987 |
| Design | ACI[Norm]: AIL | CP | ACI[Log-Norm]: AIL | CP | BCI (Prior A): AIL | CP | BCI (Prior B): AIL | CP | HPD (Prior A): AIL | CP | HPD (Prior B): AIL | CP |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1 | ||||||||||||
| D[11] | 0.251 | 0.949 | 0.235 | 0.953 | 0.148 | 0.971 | 0.140 | 0.972 | 0.137 | 0.973 | 0.123 | 0.976 |
| D[12] | 0.297 | 0.940 | 0.279 | 0.944 | 0.206 | 0.959 | 0.177 | 0.965 | 0.175 | 0.965 | 0.152 | 0.970 |
| D[13] | 0.274 | 0.945 | 0.247 | 0.950 | 0.168 | 0.966 | 0.148 | 0.970 | 0.142 | 0.972 | 0.131 | 0.974 |
| D[14] | 0.237 | 0.952 | 0.198 | 0.960 | 0.135 | 0.973 | 0.132 | 0.974 | 0.116 | 0.977 | 0.114 | 0.977 |
| D[15] | 0.240 | 0.952 | 0.201 | 0.960 | 0.135 | 0.973 | 0.133 | 0.974 | 0.118 | 0.977 | 0.116 | 0.977 |
| D[16] | 0.246 | 0.950 | 0.230 | 0.954 | 0.138 | 0.973 | 0.134 | 0.973 | 0.121 | 0.976 | 0.121 | 0.976 |
| D[21] | 0.200 | 0.960 | 0.182 | 0.964 | 0.115 | 0.977 | 0.110 | 0.978 | 0.097 | 0.981 | 0.090 | 0.982 |
| D[22] | 0.236 | 0.952 | 0.188 | 0.962 | 0.116 | 0.977 | 0.113 | 0.978 | 0.106 | 0.979 | 0.097 | 0.981 |
| D[23] | 0.215 | 0.957 | 0.184 | 0.963 | 0.116 | 0.977 | 0.113 | 0.978 | 0.102 | 0.980 | 0.091 | 0.982 |
| D[24] | 0.184 | 0.963 | 0.153 | 0.970 | 0.110 | 0.978 | 0.091 | 0.982 | 0.090 | 0.983 | 0.079 | 0.985 |
| D[25] | 0.184 | 0.963 | 0.158 | 0.969 | 0.111 | 0.978 | 0.096 | 0.981 | 0.090 | 0.982 | 0.081 | 0.984 |
| D[26] | 0.188 | 0.962 | 0.176 | 0.965 | 0.113 | 0.978 | 0.096 | 0.981 | 0.090 | 0.982 | 0.081 | 0.984 |
| D[31] | 0.154 | 0.969 | 0.145 | 0.971 | 0.090 | 0.983 | 0.088 | 0.983 | 0.080 | 0.985 | 0.071 | 0.986 |
| D[32] | 0.177 | 0.965 | 0.146 | 0.971 | 0.090 | 0.982 | 0.090 | 0.983 | 0.082 | 0.984 | 0.078 | 0.985 |
| D[33] | 0.158 | 0.969 | 0.146 | 0.971 | 0.090 | 0.982 | 0.089 | 0.983 | 0.080 | 0.984 | 0.078 | 0.985 |
| D[34] | 0.145 | 0.971 | 0.129 | 0.975 | 0.087 | 0.983 | 0.072 | 0.986 | 0.028 | 0.995 | 0.014 | 0.998 |
| D[35] | 0.145 | 0.971 | 0.129 | 0.974 | 0.088 | 0.983 | 0.077 | 0.985 | 0.028 | 0.995 | 0.015 | 0.998 |
| D[36] | 0.146 | 0.971 | 0.141 | 0.972 | 0.089 | 0.983 | 0.079 | 0.985 | 0.028 | 0.995 | 0.015 | 0.998 |
| D[11] | 0.249 | 0.950 | 0.234 | 0.953 | 0.143 | 0.972 | 0.137 | 0.973 | 0.134 | 0.973 | 0.122 | 0.976 |
| D[12] | 0.281 | 0.943 | 0.277 | 0.944 | 0.199 | 0.960 | 0.176 | 0.965 | 0.171 | 0.966 | 0.151 | 0.970 |
| D[13] | 0.272 | 0.945 | 0.246 | 0.950 | 0.149 | 0.970 | 0.145 | 0.971 | 0.137 | 0.973 | 0.126 | 0.975 |
| D[14] | 0.236 | 0.952 | 0.197 | 0.960 | 0.135 | 0.973 | 0.128 | 0.975 | 0.116 | 0.977 | 0.114 | 0.978 |
| D[15] | 0.239 | 0.952 | 0.201 | 0.960 | 0.135 | 0.973 | 0.132 | 0.974 | 0.116 | 0.977 | 0.115 | 0.977 |
| D[16] | 0.245 | 0.951 | 0.229 | 0.954 | 0.138 | 0.973 | 0.133 | 0.974 | 0.121 | 0.976 | 0.121 | 0.976 |
| D[21] | 0.199 | 0.960 | 0.181 | 0.964 | 0.113 | 0.978 | 0.110 | 0.978 | 0.096 | 0.981 | 0.090 | 0.982 |
| D[22] | 0.235 | 0.953 | 0.188 | 0.962 | 0.115 | 0.977 | 0.112 | 0.978 | 0.106 | 0.979 | 0.097 | 0.981 |
| D[23] | 0.213 | 0.957 | 0.183 | 0.963 | 0.115 | 0.977 | 0.112 | 0.978 | 0.101 | 0.980 | 0.091 | 0.982 |
| D[24] | 0.183 | 0.963 | 0.153 | 0.970 | 0.107 | 0.979 | 0.091 | 0.982 | 0.089 | 0.983 | 0.079 | 0.985 |
| D[25] | 0.184 | 0.963 | 0.158 | 0.969 | 0.108 | 0.979 | 0.095 | 0.981 | 0.090 | 0.983 | 0.080 | 0.984 |
| D[26] | 0.188 | 0.962 | 0.175 | 0.965 | 0.112 | 0.978 | 0.096 | 0.981 | 0.090 | 0.982 | 0.080 | 0.984 |
| D[31] | 0.154 | 0.969 | 0.145 | 0.971 | 0.089 | 0.983 | 0.088 | 0.983 | 0.079 | 0.985 | 0.071 | 0.986 |
| D[32] | 0.177 | 0.965 | 0.145 | 0.971 | 0.090 | 0.982 | 0.089 | 0.983 | 0.082 | 0.984 | 0.078 | 0.985 |
| D[33] | 0.158 | 0.969 | 0.145 | 0.971 | 0.090 | 0.983 | 0.089 | 0.983 | 0.080 | 0.985 | 0.078 | 0.985 |
| D[34] | 0.145 | 0.971 | 0.128 | 0.975 | 0.087 | 0.983 | 0.072 | 0.986 | 0.026 | 0.996 | 0.014 | 0.998 |
| D[35] | 0.145 | 0.971 | 0.129 | 0.974 | 0.088 | 0.983 | 0.077 | 0.985 | 0.026 | 0.996 | 0.015 | 0.998 |
| D[36] | 0.146 | 0.971 | 0.141 | 0.972 | 0.089 | 0.983 | 0.079 | 0.985 | 0.027 | 0.995 | 0.015 | 0.998 |
| Set-2 | ||||||||||||
| D[11] | 0.136 | 0.951 | 0.136 | 0.951 | 0.035 | 0.990 | 0.034 | 0.990 | 0.022 | 0.995 | 0.021 | 0.995 |
| D[12] | 0.149 | 0.946 | 0.148 | 0.946 | 0.037 | 0.989 | 0.036 | 0.990 | 0.023 | 0.995 | 0.021 | 0.995 |
| D[13] | 0.144 | 0.948 | 0.141 | 0.949 | 0.036 | 0.990 | 0.035 | 0.990 | 0.023 | 0.995 | 0.021 | 0.995 |
| D[14] | 0.130 | 0.953 | 0.129 | 0.954 | 0.028 | 0.993 | 0.026 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[15] | 0.130 | 0.953 | 0.129 | 0.954 | 0.028 | 0.993 | 0.026 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[16] | 0.134 | 0.952 | 0.134 | 0.952 | 0.029 | 0.993 | 0.027 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[21] | 0.110 | 0.961 | 0.109 | 0.961 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[22] | 0.123 | 0.956 | 0.122 | 0.956 | 0.020 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[23] | 0.112 | 0.960 | 0.112 | 0.960 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[24] | 0.106 | 0.963 | 0.105 | 0.963 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[25] | 0.106 | 0.963 | 0.106 | 0.963 | 0.019 | 0.996 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[26] | 0.109 | 0.961 | 0.109 | 0.961 | 0.019 | 0.996 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[31] | 0.088 | 0.969 | 0.088 | 0.970 | 0.018 | 0.997 | 0.017 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[32] | 0.095 | 0.967 | 0.094 | 0.967 | 0.018 | 0.997 | 0.018 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[33] | 0.094 | 0.967 | 0.094 | 0.967 | 0.018 | 0.997 | 0.018 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[34] | 0.086 | 0.970 | 0.086 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.014 | 0.998 |
| D[35] | 0.087 | 0.970 | 0.086 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.014 | 0.998 |
| D[36] | 0.088 | 0.970 | 0.088 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.015 | 0.998 |
| D[11] | 0.136 | 0.951 | 0.136 | 0.951 | 0.035 | 0.990 | 0.034 | 0.990 | 0.022 | 0.995 | 0.021 | 0.995 |
| D[12] | 0.149 | 0.946 | 0.148 | 0.946 | 0.037 | 0.989 | 0.036 | 0.990 | 0.022 | 0.995 | 0.021 | 0.995 |
| D[13] | 0.144 | 0.948 | 0.141 | 0.949 | 0.036 | 0.990 | 0.035 | 0.990 | 0.022 | 0.995 | 0.021 | 0.995 |
| D[14] | 0.130 | 0.953 | 0.128 | 0.954 | 0.028 | 0.993 | 0.026 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[15] | 0.130 | 0.953 | 0.129 | 0.954 | 0.028 | 0.993 | 0.026 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[16] | 0.134 | 0.952 | 0.134 | 0.952 | 0.028 | 0.993 | 0.027 | 0.993 | 0.020 | 0.996 | 0.020 | 0.996 |
| D[21] | 0.110 | 0.961 | 0.109 | 0.961 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[22] | 0.123 | 0.956 | 0.122 | 0.957 | 0.020 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[23] | 0.112 | 0.960 | 0.111 | 0.961 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 | 0.019 | 0.996 |
| D[24] | 0.106 | 0.963 | 0.105 | 0.963 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[25] | 0.106 | 0.963 | 0.106 | 0.963 | 0.019 | 0.996 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[26] | 0.109 | 0.961 | 0.109 | 0.961 | 0.019 | 0.996 | 0.019 | 0.996 | 0.018 | 0.996 | 0.018 | 0.997 |
| D[31] | 0.088 | 0.969 | 0.088 | 0.970 | 0.018 | 0.997 | 0.017 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[32] | 0.095 | 0.967 | 0.094 | 0.967 | 0.018 | 0.997 | 0.018 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[33] | 0.094 | 0.967 | 0.094 | 0.967 | 0.018 | 0.997 | 0.018 | 0.997 | 0.017 | 0.997 | 0.016 | 0.997 |
| D[34] | 0.086 | 0.970 | 0.086 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.014 | 0.998 |
| D[35] | 0.087 | 0.970 | 0.086 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.014 | 0.998 |
| D[36] | 0.088 | 0.970 | 0.088 | 0.970 | 0.017 | 0.997 | 0.016 | 0.997 | 0.015 | 0.998 | 0.015 | 0.998 |
| Design | ACI[Norm]: AIL | CP | ACI[Log-Norm]: AIL | CP | BCI (Prior A): AIL | CP | BCI (Prior B): AIL | CP | HPD (Prior A): AIL | CP | HPD (Prior B): AIL | CP |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Set-1 | ||||||||||||
| D[11] | 2.145 | 0.920 | 1.960 | 0.926 | 1.336 | 0.944 | 1.175 | 0.949 | 1.093 | 0.951 | 0.963 | 0.955 |
| D[12] | 2.415 | 0.912 | 2.216 | 0.918 | 1.860 | 0.929 | 1.627 | 0.935 | 1.502 | 0.939 | 1.324 | 0.944 |
| D[13] | 2.232 | 0.917 | 2.133 | 0.920 | 1.541 | 0.938 | 1.287 | 0.946 | 1.199 | 0.948 | 1.046 | 0.953 |
| D[14] | 1.968 | 0.925 | 1.552 | 0.938 | 0.723 | 0.962 | 0.682 | 0.964 | 0.631 | 0.965 | 0.625 | 0.965 |
| D[15] | 2.024 | 0.924 | 1.615 | 0.936 | 0.730 | 0.962 | 0.703 | 0.963 | 0.635 | 0.965 | 0.630 | 0.965 |
| D[16] | 2.040 | 0.923 | 1.794 | 0.930 | 0.760 | 0.961 | 0.750 | 0.962 | 0.650 | 0.965 | 0.641 | 0.965 |
| D[21] | 1.418 | 0.942 | 1.325 | 0.944 | 0.646 | 0.965 | 0.539 | 0.968 | 0.534 | 0.968 | 0.513 | 0.969 |
| D[22] | 1.656 | 0.935 | 1.528 | 0.938 | 0.690 | 0.963 | 0.580 | 0.967 | 0.555 | 0.967 | 0.540 | 0.968 |
| D[23] | 1.585 | 0.937 | 1.500 | 0.939 | 0.654 | 0.964 | 0.549 | 0.968 | 0.544 | 0.968 | 0.524 | 0.968 |
| D[24] | 1.190 | 0.948 | 1.125 | 0.950 | 0.539 | 0.968 | 0.516 | 0.969 | 0.499 | 0.969 | 0.490 | 0.969 |
| D[25] | 1.303 | 0.945 | 1.244 | 0.947 | 0.549 | 0.968 | 0.518 | 0.968 | 0.509 | 0.969 | 0.501 | 0.969 |
| D[26] | 1.353 | 0.944 | 1.296 | 0.945 | 0.555 | 0.967 | 0.529 | 0.968 | 0.515 | 0.969 | 0.510 | 0.969 |
| D[31] | 1.053 | 0.953 | 0.896 | 0.957 | 0.510 | 0.969 | 0.505 | 0.969 | 0.485 | 0.969 | 0.473 | 0.970 |
| D[32] | 1.142 | 0.950 | 1.097 | 0.951 | 0.525 | 0.968 | 0.513 | 0.969 | 0.496 | 0.969 | 0.485 | 0.969 |
| D[33] | 1.101 | 0.951 | 1.007 | 0.954 | 0.513 | 0.969 | 0.505 | 0.969 | 0.492 | 0.969 | 0.479 | 0.970 |
| D[34] | 0.779 | 0.961 | 0.754 | 0.961 | 0.496 | 0.969 | 0.488 | 0.969 | 0.400 | 0.972 | 0.170 | 0.979 |
| D[35] | 0.853 | 0.958 | 0.765 | 0.961 | 0.498 | 0.969 | 0.491 | 0.969 | 0.405 | 0.972 | 0.187 | 0.978 |
| D[36] | 0.918 | 0.957 | 0.827 | 0.959 | 0.504 | 0.969 | 0.494 | 0.969 | 0.408 | 0.972 | 0.188 | 0.978 |
| D[11] | 2.066 | 0.922 | 1.819 | 0.930 | 1.276 | 0.946 | 1.151 | 0.950 | 1.046 | 0.953 | 0.945 | 0.956 |
| D[12] | 2.378 | 0.913 | 2.193 | 0.919 | 1.713 | 0.933 | 1.504 | 0.939 | 1.379 | 0.943 | 1.208 | 0.948 |
| D[13] | 2.166 | 0.919 | 1.873 | 0.928 | 1.336 | 0.944 | 1.263 | 0.946 | 1.094 | 0.951 | 0.983 | 0.955 |
| D[14] | 1.771 | 0.931 | 1.519 | 0.939 | 0.722 | 0.962 | 0.679 | 0.964 | 0.631 | 0.965 | 0.625 | 0.965 |
| D[15] | 1.838 | 0.929 | 1.589 | 0.937 | 0.724 | 0.962 | 0.688 | 0.963 | 0.635 | 0.965 | 0.630 | 0.965 |
| D[16] | 1.846 | 0.929 | 1.711 | 0.933 | 0.755 | 0.961 | 0.741 | 0.962 | 0.650 | 0.965 | 0.641 | 0.965 |
| D[21] | 1.349 | 0.944 | 1.269 | 0.946 | 0.643 | 0.965 | 0.534 | 0.968 | 0.533 | 0.968 | 0.512 | 0.969 |
| D[22] | 1.614 | 0.936 | 1.515 | 0.939 | 0.689 | 0.963 | 0.578 | 0.967 | 0.550 | 0.968 | 0.539 | 0.968 |
| D[23] | 1.583 | 0.937 | 1.499 | 0.939 | 0.650 | 0.965 | 0.547 | 0.968 | 0.543 | 0.968 | 0.520 | 0.968 |
| D[24] | 1.172 | 0.949 | 1.113 | 0.951 | 0.533 | 0.968 | 0.513 | 0.969 | 0.492 | 0.969 | 0.484 | 0.969 |
| D[25] | 1.269 | 0.946 | 1.222 | 0.948 | 0.544 | 0.968 | 0.515 | 0.969 | 0.508 | 0.969 | 0.491 | 0.969 |
| D[26] | 1.318 | 0.945 | 1.266 | 0.946 | 0.550 | 0.968 | 0.528 | 0.968 | 0.511 | 0.969 | 0.504 | 0.969 |
| D[31] | 0.918 | 0.957 | 0.896 | 0.957 | 0.509 | 0.969 | 0.495 | 0.969 | 0.478 | 0.970 | 0.472 | 0.970 |
| D[32] | 1.109 | 0.951 | 1.044 | 0.953 | 0.517 | 0.969 | 0.513 | 0.969 | 0.487 | 0.969 | 0.483 | 0.970 |
| D[33] | 1.021 | 0.954 | 0.969 | 0.955 | 0.509 | 0.969 | 0.495 | 0.969 | 0.480 | 0.970 | 0.478 | 0.970 |
| D[34] | 0.766 | 0.961 | 0.742 | 0.962 | 0.488 | 0.969 | 0.479 | 0.970 | 0.364 | 0.973 | 0.164 | 0.979 |
| D[35] | 0.831 | 0.959 | 0.753 | 0.961 | 0.491 | 0.969 | 0.489 | 0.969 | 0.368 | 0.973 | 0.179 | 0.979 |
| D[36] | 0.837 | 0.959 | 0.812 | 0.960 | 0.495 | 0.969 | 0.492 | 0.969 | 0.370 | 0.973 | 0.180 | 0.979 |
| Set-2 | ||||||||||||
| D[11] | 2.130 | 0.919 | 1.595 | 0.936 | 0.400 | 0.974 | 0.364 | 0.975 | 0.240 | 0.979 | 0.236 | 0.980 |
| D[12] | 2.225 | 0.916 | 1.942 | 0.925 | 0.413 | 0.974 | 0.375 | 0.975 | 0.245 | 0.979 | 0.240 | 0.979 |
| D[13] | 2.140 | 0.919 | 1.731 | 0.932 | 0.405 | 0.974 | 0.368 | 0.975 | 0.243 | 0.979 | 0.239 | 0.979 |
| D[14] | 1.753 | 0.931 | 1.501 | 0.939 | 0.347 | 0.976 | 0.340 | 0.976 | 0.234 | 0.980 | 0.231 | 0.980 |
| D[15] | 1.893 | 0.927 | 1.551 | 0.938 | 0.360 | 0.976 | 0.352 | 0.976 | 0.238 | 0.979 | 0.234 | 0.980 |
| D[16] | 2.085 | 0.920 | 1.593 | 0.936 | 0.370 | 0.975 | 0.361 | 0.976 | 0.239 | 0.979 | 0.236 | 0.980 |
| D[21] | 1.354 | 0.944 | 1.261 | 0.947 | 0.226 | 0.980 | 0.222 | 0.980 | 0.217 | 0.980 | 0.217 | 0.980 |
| D[22] | 1.649 | 0.934 | 1.447 | 0.941 | 0.231 | 0.980 | 0.229 | 0.980 | 0.219 | 0.980 | 0.219 | 0.980 |
| D[23] | 1.389 | 0.943 | 1.288 | 0.946 | 0.229 | 0.980 | 0.227 | 0.980 | 0.219 | 0.980 | 0.218 | 0.980 |
| D[24] | 1.247 | 0.947 | 1.168 | 0.950 | 0.219 | 0.980 | 0.216 | 0.980 | 0.213 | 0.980 | 0.212 | 0.980 |
| D[25] | 1.250 | 0.947 | 1.180 | 0.949 | 0.220 | 0.980 | 0.217 | 0.980 | 0.216 | 0.980 | 0.214 | 0.980 |
| D[26] | 1.293 | 0.946 | 1.202 | 0.949 | 0.220 | 0.980 | 0.217 | 0.980 | 0.217 | 0.980 | 0.216 | 0.980 |
| D[31] | 1.002 | 0.955 | 0.963 | 0.956 | 0.215 | 0.980 | 0.212 | 0.980 | 0.183 | 0.981 | 0.180 | 0.981 |
| D[32] | 1.233 | 0.948 | 1.155 | 0.950 | 0.218 | 0.980 | 0.215 | 0.980 | 0.186 | 0.981 | 0.183 | 0.981 |
| D[33] | 1.033 | 0.954 | 0.987 | 0.956 | 0.218 | 0.980 | 0.215 | 0.980 | 0.183 | 0.981 | 0.180 | 0.981 |
| D[34] | 0.905 | 0.958 | 0.873 | 0.959 | 0.178 | 0.981 | 0.175 | 0.981 | 0.171 | 0.982 | 0.166 | 0.982 |
| D[35] | 0.914 | 0.958 | 0.882 | 0.959 | 0.187 | 0.981 | 0.179 | 0.981 | 0.176 | 0.981 | 0.173 | 0.982 |
| D[36] | 0.984 | 0.956 | 0.944 | 0.957 | 0.188 | 0.981 | 0.180 | 0.981 | 0.180 | 0.981 | 0.173 | 0.982 |
| D[11] | 1.846 | 0.928 | 1.589 | 0.936 | 0.398 | 0.974 | 0.361 | 0.976 | 0.239 | 0.979 | 0.236 | 0.980 |
| D[12] | 2.149 | 0.918 | 1.873 | 0.927 | 0.408 | 0.974 | 0.370 | 0.975 | 0.241 | 0.979 | 0.237 | 0.979 |
| D[13] | 2.029 | 0.922 | 1.708 | 0.932 | 0.398 | 0.974 | 0.362 | 0.975 | 0.240 | 0.979 | 0.236 | 0.980 |
| D[14] | 1.680 | 0.933 | 1.462 | 0.940 | 0.347 | 0.976 | 0.340 | 0.976 | 0.232 | 0.980 | 0.229 | 0.980 |
| D[15] | 1.691 | 0.933 | 1.519 | 0.939 | 0.355 | 0.976 | 0.348 | 0.976 | 0.235 | 0.980 | 0.232 | 0.980 |
| D[16] | 1.771 | 0.930 | 1.525 | 0.938 | 0.369 | 0.975 | 0.360 | 0.976 | 0.236 | 0.979 | 0.234 | 0.980 |
| D[21] | 1.321 | 0.945 | 1.230 | 0.948 | 0.224 | 0.980 | 0.221 | 0.980 | 0.217 | 0.980 | 0.217 | 0.980 |
| D[22] | 1.646 | 0.934 | 1.444 | 0.941 | 0.229 | 0.980 | 0.228 | 0.980 | 0.219 | 0.980 | 0.219 | 0.980 |
| D[23] | 1.330 | 0.945 | 1.239 | 0.947 | 0.228 | 0.980 | 0.225 | 0.980 | 0.219 | 0.980 | 0.218 | 0.980 |
| D[24] | 1.205 | 0.949 | 1.132 | 0.951 | 0.219 | 0.980 | 0.216 | 0.980 | 0.213 | 0.980 | 0.212 | 0.980 |
| D[25] | 1.222 | 0.948 | 1.145 | 0.950 | 0.219 | 0.980 | 0.216 | 0.980 | 0.216 | 0.980 | 0.214 | 0.980 |
| D[26] | 1.293 | 0.946 | 1.202 | 0.949 | 0.220 | 0.980 | 0.217 | 0.980 | 0.217 | 0.980 | 0.216 | 0.980 |
| D[31] | 0.984 | 0.956 | 0.944 | 0.957 | 0.215 | 0.980 | 0.212 | 0.980 | 0.180 | 0.981 | 0.176 | 0.981 |
| D[32] | 1.180 | 0.949 | 1.120 | 0.951 | 0.218 | 0.980 | 0.215 | 0.980 | 0.186 | 0.981 | 0.182 | 0.981 |
| D[33] | 0.988 | 0.955 | 0.947 | 0.957 | 0.218 | 0.980 | 0.215 | 0.980 | 0.183 | 0.981 | 0.180 | 0.981 |
| D[34] | 0.885 | 0.959 | 0.855 | 0.960 | 0.178 | 0.981 | 0.175 | 0.981 | 0.170 | 0.982 | 0.158 | 0.982 |
| D[35] | 0.903 | 0.958 | 0.871 | 0.959 | 0.180 | 0.981 | 0.178 | 0.981 | 0.175 | 0.981 | 0.166 | 0.982 |
| D[36] | 0.976 | 0.956 | 0.939 | 0.957 | 0.183 | 0.981 | 0.180 | 0.981 | 0.176 | 0.981 | 0.166 | 0.982 |
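Interval-performance tables such as those above are typically summarized by the average interval length (AIL) and the empirical coverage probability (CP) over Monte Carlo replications. The following is a minimal sketch, not the paper's own simulation code: the toy study uses a normal-approximation 95% interval for an exponential mean rather than the NOT-Exp estimators, and the names `ail_cp`, `true_mu` are illustrative.

```python
import numpy as np

def ail_cp(lower, upper, true_value):
    """Average interval length and empirical coverage probability
    across Monte Carlo replications of an interval estimator."""
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    ail = float(np.mean(upper - lower))
    cp = float(np.mean((lower <= true_value) & (true_value <= upper)))
    return ail, cp

# Toy replication study (illustrative only): 2000 normal-approximation
# 95% intervals for the mean of an Exp(scale = 2) sample of size 40.
rng = np.random.default_rng(2025)
true_mu, n_rep, n = 2.0, 2000, 40
x = rng.exponential(true_mu, size=(n_rep, n))
mu_hat = x.mean(axis=1)
se = x.std(axis=1, ddof=1) / np.sqrt(n)
ail, cp = ail_cp(mu_hat - 1.96 * se, mu_hat + 1.96 * se, true_mu)
```

As in the tables, a good design drives the AIL down while keeping the CP near the nominal level.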
References
- Sapkota, L.P.; Bam, N.; Kumar, V. A new exponential family of distributions with applications to engineering and medical data. Sci. Rep. 2025, 15, 33649. [Google Scholar] [CrossRef] [PubMed]
- Górny, J.; Cramer, E. Modularization of hybrid censoring schemes and its application to unified progressive hybrid censoring. Metrika 2018, 81, 173–210. [Google Scholar]
- Lone, S.A.; Panahi, H.; Anwar, S.; Shahab, S. Estimations and optimal censoring schemes for the unified progressive hybrid gamma-mixed Rayleigh distribution. Electron. Res. Arch. 2023, 31, 4729–4752. [Google Scholar] [CrossRef]
- Anwar, S.; Lone, S.A.; Khan, A.; Almutlak, S. Stress-strength reliability estimation for the inverted exponentiated Rayleigh distribution under unified progressive hybrid censoring with application. Electron. Res. Arch. 2023, 31, 4011–4033. [Google Scholar] [CrossRef]
- Bayoud, H.A.; Almathkour, F.B.; Raqab, M.Z. Advanced inference techniques for two-parameter Topp-Leone models under unified progressively hybrid censoring. Commun. Stat. Simul. Comput. 2024, 55, 1384–1407. [Google Scholar]
- Dutta, S.; Kayal, S. Estimation and prediction for Burr type III distribution based on unified progressive hybrid censoring scheme. J. Appl. Stat. 2024, 51, 1–33. [Google Scholar]
- Mohammed, H.S.; Abo-Kasem, O.E.; Elshahhat, A. Optimum Progressive Data Analysis and Bayesian Inference for Unified Progressive Hybrid INH Censoring with Applications to Diamonds and Gold. Axioms 2025, 14, 559. [Google Scholar] [CrossRef]
- Prakash, A.; Maurya, A.; Maurya, R.K. Statistical and Machine Learning Approach for Progressive Stress Accelerated Life Testing Under Unified Progressive Hybrid Censoring. Qual. Reliab. Eng. Int. 2025, 41, 2463–2480. [Google Scholar] [CrossRef]
- Henningsen, A.; Toomet, O. maxLik: A package for maximum likelihood estimation in R. Comput. Stat. 2011, 26, 443–458. [Google Scholar]
- Lawless, J.F. Statistical Models and Methods for Lifetime Data, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2003. [Google Scholar]
- Greene, W.H. Econometric Analysis, 7th ed.; Pearson Prentice-Hall: Upper Saddle River, NJ, USA, 2012. [Google Scholar]
- Meeker, W.Q.; Escobar, L.A. Statistical Methods for Reliability Data; John Wiley & Sons: New York, NY, USA, 1998. [Google Scholar]
- Kundu, D. Bayesian inference and life testing plan for the Weibull distribution in presence of progressive censoring. Technometrics 2008, 50, 144–154. [Google Scholar] [CrossRef]
- Plummer, M.; Best, N.; Cowles, K.; Vines, K. CODA: Convergence diagnosis and output analysis for MCMC. R News 2006, 6, 7–11. [Google Scholar]
- Valavanidis, A.; Fiotakis, K.; Vlachogianni, T. Airborne particulate matter and human health: Toxicological assessment and importance of size and composition of particles for oxidative damage and carcinogenic mechanisms. J. Environ. Sci. Health Part C 2008, 26, 339–362. [Google Scholar] [CrossRef] [PubMed]
- Kumagai, S.; Matsunaga, I. Physiologically based pharmacokinetic model for acetone. Occup. Environ. Med. 1995, 52, 344–352. [Google Scholar] [CrossRef]
- Peter, P.O.; Oluyede, B.; Bindele, H.F.; Ndwapi, N.; Mabikwa, O. The gamma odd Burr III-G family of distributions: Model, properties and applications. Rev. Colomb. Estadística 2021, 44, 331–368. [Google Scholar]
- Cowdrey, K.W.; de Lange, J.; Malekian, R.; Wanneburg, J.; Jose, A.C. Applying queueing theory for the optimization of a banking model. J. Internet Technol. 2018, 19, 381–389. [Google Scholar]
- Ghitany, M.E.; Atieh, B.; Nadarajah, S. Lindley distribution and its application. Math. Comput. Simul. 2008, 78, 493–506. [Google Scholar] [CrossRef]
- Alsubie, A. Properties and applications of the modified Kies–Lomax distribution with estimation methods. J. Math. 2021, 2021, 1944864. [Google Scholar] [CrossRef]
- Marinho, P.R.D.; Silva, R.B.; Bourguignon, M.; Cordeiro, G.M.; Nadarajah, S. AdequacyModel: An R package for probability distributions and general purpose optimization. PLoS ONE 2019, 14, e0221487. [Google Scholar] [CrossRef]
- Bagdonavicius, V.; Nikulin, M. Accelerated Life Models: Modeling and Statistical Analysis; Chapman and Hall/CRC: Boca Raton, FL, USA, 2001. [Google Scholar]
- Peng, X.; Yan, Z. Estimation and application for a new extended Weibull distribution. Reliab. Eng. Syst. Saf. 2014, 121, 34–42. [Google Scholar] [CrossRef]
- Mahdavi, A.; Kundu, D. A new method for generating distributions with an application to exponential distribution. Commun. Stat. Theory Methods 2017, 46, 6543–6557. [Google Scholar] [CrossRef]
- Pinho, L.G.B.; Cordeiro, G.M.; Nobre, J.S. The Harris extended exponential distribution. Commun. Stat. Theory Methods 2015, 44, 3486–3502. [Google Scholar] [CrossRef]
- Alotaibi, R.; Nassar, M.; Elshahhat, A. A new extended Pham distribution for modelling cancer data. J. Radiat. Res. Appl. Sci. 2024, 17, 100961. [Google Scholar] [CrossRef]
- Mudholkar, G.S.; Srivastava, D.K. Exponentiated Weibull family for analyzing bathtub failure-rate data. IEEE Trans. Reliab. 1993, 42, 299–302. [Google Scholar] [CrossRef]
- Oguntunde, P.E.; Balogun, O.S.; Okagbue, H.I.; Bishop, S.A. The Weibull-exponential distribution: Its properties and applications. J. Appl. Sci. 2015, 15, 1305–1311. [Google Scholar] [CrossRef]
- Nadarajah, S.; Haghighi, F. An extension of the exponential distribution. Statistics 2011, 45, 543–558. [Google Scholar] [CrossRef]
- Gupta, R.D.; Kundu, D. Generalized exponential distribution: Different method of estimations. J. Stat. Comput. Simul. 2001, 69, 315–337. [Google Scholar] [CrossRef]
- Birnbaum, Z.W.; Saunders, S.C. A probabilistic interpretation of Miner’s rule. SIAM J. Appl. Math. 1968, 16, 637–652. [Google Scholar] [CrossRef]
- Weibull, W. A statistical distribution function of wide applicability. J. Appl. Mech. 1951, 18, 293–297. [Google Scholar] [CrossRef]
- Johnson, N.; Kotz, S.; Balakrishnan, N. Continuous Univariate Distributions, 2nd ed.; John Wiley & Sons: New York, NY, USA, 1994. [Google Scholar]
- Ng, H.K.T.; Chan, C.S.; Balakrishnan, N. Optimal progressive censoring plans for the Weibull distribution. Technometrics 2004, 46, 470–481. [Google Scholar] [CrossRef]
- Pradhan, B.; Kundu, D. On progressively censored generalized exponential distribution. Test 2009, 18, 497–515. [Google Scholar] [CrossRef]
- Sen, A.; Kannan, N.; Kundu, D. Bayesian planning and inference of a progressively censored sample from linear hazard rate distribution. Comput. Stat. Data Anal. 2013, 62, 108–121. [Google Scholar] [CrossRef]
- Sen, T.; Bhattacharya, R.; Tripathi, Y.M.; Pradhan, B. Inference and optimum life testing plans based on Type-II progressive hybrid censored generalized exponential data. Commun. Stat. Simul. Comput. 2020, 49, 3254–3282. [Google Scholar] [CrossRef]
- Lin, C.T.; Chen, Y.C.; Yeh, T.C.; Ng, H.K.T. Statistical inference and optimum life-testing plans with joint progressively type-II censoring scheme. Qual. Technol. Quant. Manag. 2023, 20, 279–306. [Google Scholar] [CrossRef]
- Nassar, M.; Elshahhat, A. Estimation procedures and optimal censoring schemes for an improved adaptive progressively type-II censored Weibull distribution. J. Appl. Stat. 2024, 51, 1664–1688. [Google Scholar] [CrossRef]

| Design | Scheme | Design | Scheme |
|---|---|---|---|
| D[11] |  | D[14] |  |
| D[12] |  | D[15] |  |
| D[13] |  | D[16] |  |
| D[21] |  | D[24] |  |
| D[22] |  | D[25] |  |
| D[23] |  | D[26] |  |
| D[31] |  | D[34] |  |
| D[32] |  | D[35] |  |
| D[33] |  | D[36] |  |
| App. | Mean | Mode | Q1 | Median | Q3 | St.D | Skew. |
|---|---|---|---|---|---|---|---|
| 1 | 19.01 | 3.80 | 3.80 | 9.30 | 15.95 | 28.55 | 2.65 |
| 2 | 9.88 | 7.10 | 4.68 | 8.10 | 13.03 | 7.237 | 1.47 |
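The summary measures reported above can be recomputed directly from the raw observations. A minimal sketch follows; the mode of continuous data is usually read off a fitted density and is omitted, and the uncorrected moment-based skewness used here may differ slightly from the authors' convention.

```python
import numpy as np

def describe(x):
    """Sample mean, quartiles, standard deviation (ddof = 1), and
    moment-based skewness for a numeric sample."""
    x = np.asarray(x, dtype=float)
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    c = x - x.mean()
    skew = np.mean(c**3) / np.mean(c**2) ** 1.5
    return {"mean": x.mean(), "Q1": q1, "median": med,
            "Q3": q3, "st.d": x.std(ddof=1), "skew": skew}
```

A positive skewness, as in both applications, indicates the right-tailed behavior that motivates a flexible lifetime model.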
| Sample | S | Data | ||||
|---|---|---|---|---|---|---|
| App.1 | ||||||
| (,) | 43(19) | 45(19) | 0 | 51.9 | 1.5, 2.1, 2.2, 2.5, 2.6, 3.8, 3.8, 4.3, 6, 7, 9.3, 10.2, 10.6, 12.3, 14.1, 17.8, 27.6, 42, 51.9 | |
| (,,) | 46(15) | 50(15) | 4 | 46 | 1.5, 1.7, 2.1, 2.2, 2.4, 2.5, 2.6, 3.8, 6, 7.5, 10.6, 12.3, 14.1, 31, 45.6 | |
| (,,) | 32(10) | 35(10) | 15 | 32 | 1.5, 2.6, 3.8, 4.2, 5.6, 7.5, 9.3, 12.9, 17.8, 31 | |
| (,) | 3(7) | 5(8) | 23 | 5 | 1.5, 1.7, 2.1, 2.2, 2.4, 2.5, 2.6, 3.8 | |
| App.2 | ||||||
| (,) | 28(40) | 30(40) | 0 | 27 | 0.8, 1.3, 1.9, 2.9, 3.2, 3.3, 4, 4.1, 4.2, 4.7, 4.9, 5, 5.5, 5.7, 6.2, 6.9, 7.1, 7.4, 7.6, 7.7, 8, 8.6, 8.8, 8.9, 9.5, 10.7, 11.1, 11.5, 12.4, 12.5, 13.3, 14.1, 15.4, 18.1, 18.4, 18.9, 19.9, 21.4, 23, 27 | |
| (,,) | 25(30) | 27(30) | 10 | 25 | 0.8, 0.8, 1.3, 1.5, 1.8, 1.9, 1.9, 2.1, 2.6, 2.7, 2.9, 3.1, 3.2, 3.3, 3.5, 4.1, 4.4, 4.6, 4.9, 5.5, 6.2, 8.6, 9.5, 12.4, 12.5, 13.7, 17.3, 18.2, 20.6, 23 | |
| (,,) | 8(11) | 22(20) | 50 | 21.4 | 0.8, 1.5, 2.7, 3.2, 4.2, 4.3, 4.7, 5.3, 6.1, 6.3, 7.6, 8.9, 9.7, 11, 12.4, 12.9, 13.3, 15.4, 19.9, 21.4 | |
| (,) | 2(7) | 4(15) | 85 | 4 | 0.8, 0.8, 1.3, 1.5, 1.8, 1.9, 1.9, 2.1, 2.6, 2.7, 2.9, 3.1, 3.2, 3.3, 3.5 | |
| Sample | Par. | MLE Est. | MLE St.E | MCMC Est. | MCMC St.E | ACI[Norm] Low. | ACI[Norm] Upp. | ACI[Norm] IW | BCI Low. | BCI Upp. | BCI IW | ACI[Log-Norm] Low. | ACI[Log-Norm] Upp. | ACI[Log-Norm] IW | HPD Low. | HPD Upp. | HPD IW |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | 0.0016 | 0.0013 | 0.0016 | 0.0002 | 0.0010 | 0.0042 | 0.0052 | 0.0011 | 0.0023 | 0.0012 | 0.0003 | 0.0080 | 0.0077 | 0.0011 | 0.0022 | 0.0012 |
| | | 325.27 | 5.9316 | 325.27 | 0.0058 | 313.64 | 336.89 | 23.252 | 325.25 | 325.29 | 0.0394 | 313.85 | 337.10 | 23.257 | 325.25 | 325.29 | 0.0392 |
| | | 164.83 | 8.6903 | 164.83 | 0.0006 | 147.80 | 181.86 | 34.065 | 164.83 | 164.83 | 0.0040 | 148.65 | 182.77 | 34.126 | 164.83 | 164.83 | 0.0040 |
| | | 3.1379 | 0.6418 | 3.1379 | 0.0058 | 1.8799 | 4.3958 | 2.5159 | 3.1182 | 3.1579 | 0.0397 | 2.1015 | 4.6854 | 2.5838 | 3.1182 | 3.1579 | 0.0397 |
| | | 0.0434 | 0.0319 | 0.0434 | 0.0006 | 0.0091 | 0.1058 | 0.1250 | 0.0414 | 0.0453 | 0.0039 | 0.0103 | 0.1832 | 0.1729 | 0.0413 | 0.0452 | 0.0039 |
| | | 0.1711 | 0.0272 | 0.1707 | 0.0049 | 0.1177 | 0.2245 | 0.1068 | 0.1543 | 0.1873 | 0.0330 | 0.1252 | 0.2337 | 0.1085 | 0.1543 | 0.1873 | 0.0330 |
| | | 0.9410 | 0.0323 | 0.9412 | 0.0176 | 0.8776 | 0.9944 | 0.1267 | 0.8674 | 0.9836 | 0.1163 | 0.8797 | 1.0065 | 0.1268 | 0.8811 | 0.9892 | 0.1081 |
| | | 0.9426 | 0.0200 | 0.9427 | 0.0039 | 0.9034 | 0.9819 | 0.0785 | 0.9288 | 0.9553 | 0.9979 | 0.9042 | 0.9827 | 0.0785 | 0.9918 | 0.9771 | 0.9342 |
| | | 0.7001 | 0.0359 | 0.0886 | 0.0187 | 0.6298 | 0.7705 | 0.1407 | 0.0353 | 0.1602 | 0.1250 | 0.6332 | 0.7741 | 0.1409 | 0.0310 | 0.1529 | 0.1219 |
| | | 0.0487 | 0.0116 | 0.9173 | 0.0013 | 0.0259 | 0.0715 | 0.0456 | 0.8650 | 0.8579 | 0.8506 | 0.0305 | 0.0777 | 0.0473 | 0.8128 | 0.7971 | 0.7891 |
| | | 0.0006 | 0.0002 | 0.0006 | 0.0000 | 0.0003 | 0.0009 | 0.0006 | 0.0005 | 0.0008 | 0.0003 | 0.0004 | 0.0010 | 0.0007 | 0.0005 | 0.0008 | 0.0003 |
| | | 52.061 | 8.3900 | 52.061 | 0.0058 | 35.617 | 68.505 | 32.888 | 52.042 | 52.081 | 0.0392 | 37.961 | 71.399 | 33.438 | 52.041 | 52.080 | 0.0391 |
| | | 266.57 | 0.0002 | 266.57 | 0.0006 | 266.57 | 266.57 | 0.0010 | 266.57 | 266.57 | 0.0039 | 266.57 | 266.57 | 0.0010 | 266.57 | 266.57 | 0.0039 |
| | | 1.3621 | 0.2793 | 1.3619 | 0.0058 | 0.8147 | 1.9095 | 1.0947 | 1.3426 | 1.3816 | 0.0390 | 0.9114 | 2.0358 | 1.1244 | 1.3426 | 1.3815 | 0.0390 |
| | | 0.0440 | 0.0002 | 0.0440 | 0.0001 | 0.0435 | 0.0444 | 0.0009 | 0.0438 | 0.0442 | 0.0004 | 0.0435 | 0.0444 | 0.0009 | 0.0438 | 0.0442 | 0.0004 |
| | | 0.0667 | 0.0184 | 0.0664 | 0.0041 | 0.0307 | 0.1027 | 0.0720 | 0.0528 | 0.0805 | 0.0277 | 0.0389 | 0.1145 | 0.0756 | 0.0526 | 0.0802 | 0.0276 |
| | | 0.8393 | 0.0744 | 0.8311 | 0.0239 | 0.6935 | 0.9852 | 0.2917 | 0.7384 | 0.8991 | 0.1607 | 0.7054 | 0.9986 | 0.2932 | 0.7490 | 0.9061 | 0.1571 |
| | | 0.9002 | 0.0263 | 0.9006 | 0.0075 | 0.8487 | 0.9517 | 0.1030 | 0.8745 | 0.9252 | 0.0507 | 0.8501 | 0.9532 | 0.1031 | 0.8738 | 0.9245 | 0.0507 |
| | | 0.1822 | 0.0546 | 0.1865 | 0.0168 | 0.0753 | 0.2891 | 0.2138 | 0.1338 | 0.2475 | 0.1136 | 0.1013 | 0.3276 | 0.2263 | 0.1322 | 0.2453 | 0.1131 |
| | | 0.0453 | 0.0086 | 0.0451 | 0.0035 | 0.0284 | 0.0622 | 0.0338 | 0.0337 | 0.0573 | 0.0236 | 0.0312 | 0.0658 | 0.0346 | 0.0339 | 0.0575 | 0.0235 |
| | | 0.0055 | 0.0098 | 0.0055 | 0.0001 | 0.0038 | 0.0248 | 0.0386 | 0.0053 | 0.0057 | 0.0004 | 0.0002 | 0.1830 | 0.1828 | 0.0053 | 0.0057 | 0.0004 |
| | | 160.16 | 8.4237 | 160.16 | 0.0058 | 143.65 | 176.67 | 33.020 | 160.14 | 160.17 | 0.0393 | 144.47 | 177.55 | 33.079 | 160.14 | 160.18 | 0.0392 |
| | | 37.847 | 15.718 | 37.847 | 0.0006 | 7.0399 | 68.655 | 61.615 | 37.845 | 37.849 | 0.0039 | 16.770 | 85.417 | 68.648 | 37.845 | 37.849 | 0.0039 |
| | | 2.7551 | 0.7683 | 2.7550 | 0.0057 | 1.2493 | 4.2610 | 3.0116 | 2.7358 | 2.7747 | 0.0390 | 1.5951 | 4.7589 | 3.1638 | 2.7356 | 2.7746 | 0.0389 |
| | | 0.0321 | 0.0508 | 0.0321 | 0.0001 | 0.0067 | 0.1316 | 0.1990 | 0.0319 | 0.0323 | 0.0004 | 0.0015 | 0.7106 | 0.7091 | 0.0319 | 0.0323 | 0.0004 |
| | | 0.1788 | 0.0410 | 0.1784 | 0.0054 | 0.0983 | 0.2592 | 0.1608 | 0.1603 | 0.1968 | 0.0366 | 0.1140 | 0.2803 | 0.1663 | 0.1597 | 0.1962 | 0.0365 |
| | | 0.9503 | 0.0399 | 0.9501 | 0.0015 | 0.8721 | 0.9985 | 0.1564 | 0.9447 | 0.9551 | 0.0104 | 0.8752 | 1.0318 | 0.1566 | 0.9448 | 0.9552 | 0.0104 |
| | | 0.9104 | 0.0364 | 0.9105 | 0.0056 | 0.8391 | 0.9817 | 0.1427 | 0.8909 | 0.9290 | 0.0381 | 0.8418 | 0.9846 | 0.1428 | 0.8915 | 0.9296 | 0.0381 |
| | | 0.0780 | 0.0450 | 0.0781 | 0.0017 | 0.0010 | 0.1662 | 0.1764 | 0.0723 | 0.0842 | 0.0119 | 0.0251 | 0.2417 | 0.2166 | 0.0723 | 0.0841 | 0.0119 |
| | | 0.0681 | 0.0196 | 0.0680 | 0.0041 | 0.0297 | 0.1066 | 0.0769 | 0.0546 | 0.0823 | 0.0277 | 0.0387 | 0.1198 | 0.0810 | 0.0542 | 0.0818 | 0.0276 |
| | | 0.0567 | 0.0617 | 0.0567 | 0.0001 | 0.0003 | 0.1778 | 0.2421 | 0.0566 | 0.0569 | 0.0004 | 0.0067 | 0.4788 | 0.4721 | 0.0566 | 0.0569 | 0.0004 |
| | | 206.90 | 11.864 | 206.90 | 0.0058 | 183.65 | 230.15 | 46.506 | 206.88 | 206.92 | 0.0391 | 184.91 | 231.51 | 46.604 | 206.88 | 206.92 | 0.0391 |
| | | 233.46 | 24.902 | 233.46 | 0.0006 | 184.65 | 282.26 | 97.614 | 233.46 | 233.46 | 0.0039 | 189.41 | 287.74 | 98.327 | 233.46 | 233.46 | 0.0039 |
| | | 6.8015 | 3.3175 | 6.8013 | 0.0057 | 0.2993 | 13.304 | 13.004 | 6.7821 | 6.8211 | 0.0391 | 2.6147 | 17.692 | 15.078 | 6.7818 | 6.8208 | 0.0390 |
| | | 1.1753 | 0.4381 | 1.1753 | 0.0001 | 0.3167 | 2.0339 | 1.7172 | 1.1751 | 1.1755 | 0.0004 | 0.5661 | 2.4402 | 1.8741 | 1.1751 | 1.1755 | 0.0004 |
| | | 1.1349 | 0.2609 | 1.1349 | 0.0058 | 0.6235 | 1.6463 | 1.0228 | 1.1153 | 1.1545 | 0.0392 | 0.7232 | 1.7810 | 1.0578 | 1.1155 | 1.1546 | 0.0391 |
| | | 0.7321 | 0.1177 | 0.7321 | 0.0003 | 0.5015 | 0.9627 | 0.4613 | 0.7309 | 0.7333 | 0.0023 | 0.5343 | 1.0032 | 0.4690 | 0.7309 | 0.7332 | 0.0023 |
| | | 0.2044 | 0.0840 | 0.2045 | 0.0032 | 0.0397 | 0.3691 | 0.3293 | 0.1938 | 0.2156 | 0.0218 | 0.0913 | 0.4575 | 0.3661 | 0.1935 | 0.2153 | 0.0217 |
| | | 0.6244 | 0.2425 | 0.6244 | 0.0005 | 0.1491 | 1.0997 | 0.9506 | 0.6227 | 0.6261 | 0.0033 | 0.2916 | 1.3368 | 1.0451 | 0.6227 | 0.6261 | 0.0033 |
| | | 1.0274 | 0.2708 | 1.0274 | 0.0071 | 0.4967 | 1.5581 | 1.0614 | 1.0034 | 1.0514 | 0.0480 | 0.6130 | 1.7222 | 1.1092 | 1.0041 | 1.0519 | 0.0479 |
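The ACI[Norm] and ACI[Log-Norm] bounds in the table above follow directly from each (Est., St.E) pair: the normal-based interval is Est ± z·St.E, while the log-normal-based interval, exp(ln Est ± z·St.E/Est), keeps the lower bound positive. A minimal Python sketch (assuming the conventional 95% level) reproduces the row with Est. 0.1711 and St.E 0.0272:

```python
import math

Z975 = 1.959964  # standard-normal 97.5% quantile (95% two-sided level)

def normal_aci(est, se, z=Z975):
    """Normal-based asymptotic CI: est -/+ z*se (lower bound may dip below zero)."""
    return est - z * se, est + z * se

def log_normal_aci(est, se, z=Z975):
    """Log-normal-based asymptotic CI: est*exp(-/+ z*se/est), always positive."""
    h = math.exp(z * se / est)
    return est / h, est * h

# RF row of the first sample in the table: Est. 0.1711, St.E 0.0272
lo_n, up_n = normal_aci(0.1711, 0.0272)      # ≈ (0.1178, 0.2244), table: 0.1177–0.2245
lo_l, up_l = log_normal_aci(0.1711, 0.0272)  # ≈ (0.1253, 0.2337), table: 0.1252–0.2337
```

The log-normal form is the reason the ACI[Log-Norm] columns never fall below zero, whereas some ACI[Norm] lower bounds sit at or near zero for small estimates.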
| Sample | | | | | | |
|---|---|---|---|---|---|---|
| 0.3 | 0.6 | 0.9 | | | | |
| App.1 | | | | | | |
| 1.03 × 10^7 | 7.55 × 10^1 | 7.49 × 10^−9 | 6.50 × 10^−1 | 2.91 × 10^0 | 7.40 × 10^1 | |
| 3.21 × 10^8 | 1.38 × 10^−7 | 1.06 × 10^−23 | 5.03 × 10^−1 | 2.06 × 10^0 | 1.27 × 10^1 | |
| 4.63 × 10^5 | 2.47 × 10^2 | 1.42 × 10^−6 | 1.48 × 10^0 | 8.00 × 10^0 | 3.70 × 10^2 | |
| 2.53 × 10^3 | 6.20 × 10^2 | 4.79 × 10^−2 | 6.78 × 10^−2 | 1.31 × 10^−1 | 5.73 × 10^−1 | |
| App.2 | | | | | | |
| 4.11 × 10^3 | 7.05 × 10^1 | 3.53 × 10^−3 | 3.83 × 10^−1 | 1.14 × 10^0 | 5.33 × 10^0 | |
| 1.01 × 10^4 | 3.56 × 10^1 | 5.48 × 10^−4 | 1.40 × 10^0 | 8.82 × 10^0 | 6.96 × 10^1 | |
| 1.67 × 10^3 | 7.16 × 10^1 | 2.49 × 10^−2 | 6.58 × 10^−1 | 2.01 × 10^0 | 9.71 × 10^0 | |
| 5.26 × 10^1 | 1.52 × 10^2 | 2.96 × 10^1 | 4.45 × 10^−2 | 8.06 × 10^−2 | 2.85 × 10^−1 | |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Alotaibi, R.; Elshahhat, A. A New Exponential-Type Model Under Unified Progressive Hybrid Censoring: Computational Inference and Its Applications. Mathematics 2026, 14, 1182. https://doi.org/10.3390/math14071182

