Search Results (644)

Search Parameters:
Keywords = Monte Carlo Markov chain

16 pages, 882 KiB  
Article
MatBYIB: A MATLAB-Based Toolkit for Parameter Estimation of Eccentric Gravitational Waves from EMRIs
by Genliang Li, Shujie Zhao, Huaike Guo, Jingyu Su and Zhenheng Lin
Universe 2025, 11(8), 259; https://doi.org/10.3390/universe11080259 - 6 Aug 2025
Abstract
Accurate parameter estimation is essential for gravitational wave data analysis. In extreme mass-ratio inspiral (EMRI) binary systems, orbital eccentricity is a critical parameter, yet current gravitational-wave parameter-estimation software often neglects its direct estimation. To fill this gap, we have developed MatBYIB, a MATLAB-based software package (Version 1.0) for gravitational-wave parameter estimation with arbitrary eccentricity. MatBYIB employs the Analytical Kludge waveform as a computationally efficient signal generator and computes parameter uncertainties via the Fisher information matrix and Markov chain Monte Carlo (MCMC). For Bayesian inference, we implement the Metropolis–Hastings algorithm to derive posterior distributions. To guarantee convergence, the Gelman–Rubin criterion (the potential scale reduction factor R̂) is used to determine sampling adequacy, with MatBYIB dynamically increasing the sample size until R̂ < 1.05 for all parameters. Our results demonstrate strong agreement between predictions based on the Fisher information matrix and full MCMC sampling. The program is user-friendly and allows gravitational-wave parameters with arbitrary eccentricity to be estimated on standard personal computers.
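A minimal sketch of the convergence loop described above: several Metropolis–Hastings chains are grown until the Gelman–Rubin potential scale reduction factor satisfies R̂ < 1.05. The one-dimensional toy target, chain starts, and step size are illustrative assumptions, not the MatBYIB code.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    return -0.5 * theta**2          # toy target: standard normal (assumption)

def mh_chain(n, start, step=0.8):
    """Random-walk Metropolis-Hastings chain of length n."""
    chain = np.empty(n)
    theta, lp = start, log_post(start)
    for i in range(n):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for an (m, n) array of chains."""
    m, n = chains.shape
    W = chains.var(axis=1, ddof=1).mean()        # mean within-chain variance
    B = n * chains.mean(axis=1).var(ddof=1)      # between-chain variance
    var_plus = (n - 1) / n * W + B / n
    return np.sqrt(var_plus / W)

n = 500
while True:
    chains = np.stack([mh_chain(n, s) for s in (-3.0, -1.0, 1.0, 3.0)])
    r_hat = gelman_rubin(chains[:, n // 2:])     # discard first half as burn-in
    print(f"n = {n:6d}   R-hat = {r_hat:.4f}")
    if r_hat < 1.05:
        break
    n *= 2                                       # grow the sample size and retry
```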

23 pages, 3124 KiB  
Article
Bee Swarm Metropolis–Hastings Sampling for Bayesian Inference in the Ginzburg–Landau Equation
by Shucan Xia and Lipu Zhang
Algorithms 2025, 18(8), 476; https://doi.org/10.3390/a18080476 - 2 Aug 2025
Abstract
To improve the sampling efficiency of Markov chain Monte Carlo in complex parameter spaces, this paper proposes BeeSwarm-MH, an adaptive sampling method that integrates a swarm intelligence mechanism. The method combines global exploration by scout bees with local exploitation by worker bees, employing multi-stage perturbation intensities and adaptive step-size tuning to enable efficient posterior sampling. Focusing on Bayesian inference for parameter estimation in soliton solutions of the two-dimensional complex Ginzburg–Landau equation, we design a dedicated inference framework to systematically compare BeeSwarm-MH with the classical Metropolis–Hastings algorithm. Experimental results demonstrate that BeeSwarm-MH achieves comparable estimation accuracy while significantly reducing the number of iterations and total computation time required for convergence. Moreover, it exhibits superior global search capability and adaptivity, offering a practical approach for efficient Bayesian inference in complex physical models.
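A loose, hypothetical sketch of the scout/worker idea (not the authors' BeeSwarm-MH): a mixture proposal where occasional scout moves take large steps for global exploration, worker moves take small steps for local exploitation, and the worker step size is crudely adapted toward a target acceptance rate. The bimodal toy target and all tuning constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(x):
    # toy bimodal target (assumption): equal mixture of N(-3,1) and N(3,1)
    return np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)

def bee_mh(n_iter, scout_prob=0.1, step=0.5, target_acc=0.3):
    x, lp = 0.0, log_post(0.0)
    samples, accepted = [], 0
    for i in range(1, n_iter + 1):
        # "scout" move: large jump; "worker" move: small tuned step
        scale = 5.0 if rng.random() < scout_prob else step
        prop = x + scale * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
        # crude Robbins-Monro adaptation of the worker step size
        step *= np.exp((accepted / i - target_acc) / np.sqrt(i))
        samples.append(x)
    return np.array(samples)

draws = bee_mh(20_000)
print(draws.mean(), draws.std())   # both modes should be visited
```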

26 pages, 657 KiB  
Article
Bayesian Inference for Copula-Linked Bivariate Generalized Exponential Distributions: A Comparative Approach
by Carlos A. dos Santos, Saralees Nadarajah, Fernando A. Moala, Hassan S. Bakouch and Shuhrah Alghamdi
Axioms 2025, 14(8), 574; https://doi.org/10.3390/axioms14080574 - 25 Jul 2025
Abstract
This paper addresses the limitations of existing bivariate generalized exponential (GE) distributions for modeling lifetime data, which often exhibit rigid dependence structures or non-GE marginals. To overcome these limitations, we introduce four new bivariate GE distributions based on the Farlie–Gumbel–Morgenstern, Gumbel–Barnett, Clayton, and Frank copulas, which allow for more flexible modeling of various dependence structures. We employ a Bayesian framework with Markov Chain Monte Carlo (MCMC) methods for parameter estimation. A simulation study is conducted to evaluate the performance of the proposed models, which are then applied to a real-world dataset of electrical treeing failures. The results from the data application demonstrate that the copula-based models, particularly the one derived from the Frank copula, provide a superior fit compared to existing bivariate GE models. This work provides a flexible and robust framework for modeling dependent lifetime data.
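A minimal sketch, under assumed parameter values, of how a Frank-copula-linked bivariate GE pair can be simulated: draw (u, v) from the Frank copula by conditional inversion, then apply inverse GE CDFs, where the GE CDF is F(x) = (1 − exp(−λx))^α.

```python
import numpy as np

rng = np.random.default_rng(2)

def frank_pair(theta, size):
    """Sample (u, v) from the Frank copula via conditional inversion."""
    u = rng.random(size)
    w = rng.random(size)
    g = lambda z: np.expm1(-theta * z)            # e^{-theta z} - 1
    v = -np.log1p(w * g(1.0) / (np.exp(-theta * u) - w * g(u))) / theta
    return u, v

def ge_inverse_cdf(p, alpha, lam):
    """Inverse of the GE CDF F(x) = (1 - exp(-lam*x))**alpha."""
    return -np.log1p(-p ** (1.0 / alpha)) / lam

theta, n = 5.0, 10_000                            # assumed copula dependence
u, v = frank_pair(theta, n)
x = ge_inverse_cdf(u, alpha=2.0, lam=1.0)         # assumed GE marginal 1
y = ge_inverse_cdf(v, alpha=1.5, lam=0.5)         # assumed GE marginal 2
print(np.corrcoef(x, y)[0, 1])                    # induced dependence
```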

16 pages, 666 KiB  
Article
Bayesian Analysis of the Maxwell Distribution Under Progressively Type-II Random Censoring
by Rajni Goel, Mahmoud M. Abdelwahab and Mustafa M. Hasaballah
Axioms 2025, 14(8), 573; https://doi.org/10.3390/axioms14080573 - 25 Jul 2025
Abstract
Accurate modeling of product lifetimes is vital in reliability analysis and engineering to ensure quality and maintain competitiveness. This paper proposes the progressively randomly censored Maxwell distribution, which incorporates both progressive Type-II and random censoring within the Maxwell distribution framework. The model allows for the planned removal of surviving units at specific stages of an experiment, accounting for both deliberate and random censoring events. It is assumed that survival and censoring times each follow a Maxwell distribution, though with distinct parameters. Both frequentist and Bayesian approaches are employed to estimate the model parameters. In the frequentist approach, maximum likelihood estimators and their corresponding confidence intervals are derived. In the Bayesian approach, Bayes estimators are obtained using an inverse gamma prior and evaluated through a Markov Chain Monte Carlo (MCMC) method under the squared error loss function (SELF). A Monte Carlo simulation study evaluates the performance of the proposed estimators. The practical relevance of the methodology is demonstrated using a real data set.

28 pages, 835 KiB  
Article
Progressive First-Failure Censoring in Reliability Analysis: Inference for a New Weibull–Pareto Distribution
by Rashad M. EL-Sagheer and Mahmoud M. Ramadan
Mathematics 2025, 13(15), 2377; https://doi.org/10.3390/math13152377 - 24 Jul 2025
Abstract
This paper explores statistical techniques for estimating unknown lifetime parameters using data from a progressive first-failure censoring scheme. The failure times are modeled with a new Weibull–Pareto distribution. Maximum likelihood estimators are derived for the model parameters, as well as for the survival and hazard rate functions, although these estimators do not have explicit closed-form solutions. The Newton–Raphson algorithm is employed for the numerical computation of these estimates. Confidence intervals for the parameters are approximated based on the asymptotic normality of the maximum likelihood estimators. The Fisher information matrix is calculated using the missing information principle, and the delta technique is applied to approximate confidence intervals for the survival and hazard rate functions. Bayesian estimators are developed under squared error, linear exponential, and general entropy loss functions, assuming independent gamma priors. Markov chain Monte Carlo sampling is used to obtain Bayesian point estimates and the highest posterior density credible intervals for the parameters and reliability measures. Finally, the proposed methods are demonstrated through the analysis of a real dataset.
(This article belongs to the Section D1: Probability and Statistics)
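Since the new Weibull–Pareto density is not reproduced here, the following sketch uses a censored exponential model as a stand-in to illustrate the same estimation pattern the abstract describes: Newton–Raphson on the score for the MLE, the observed information for its variance, and the delta method for an approximate confidence interval on the survival function.

```python
import numpy as np

rng = np.random.default_rng(4)
t = rng.exponential(scale=2.0, size=200)        # simulated lifetimes (assumption)
c = rng.exponential(scale=3.0, size=200)        # simulated censoring times
obs = np.minimum(t, c)                          # observed times
delta = (t <= c).astype(float)                  # 1 = failure, 0 = censored
d, T = delta.sum(), obs.sum()

lam = 0.1                                       # crude starting value
for _ in range(50):                             # Newton-Raphson iterations
    score = d / lam - T                         # d logL / d lambda
    hess = -d / lam**2                          # d2 logL / d lambda2
    step = score / hess
    lam -= step
    if abs(step) < 1e-10:
        break

var_lam = lam**2 / d                            # inverse observed information
t0 = 1.0
S = np.exp(-lam * t0)                           # survival at t0
var_S = (t0 * S) ** 2 * var_lam                 # delta method: (dS/dlam)^2 * Var
lo, hi = S - 1.96 * np.sqrt(var_S), S + 1.96 * np.sqrt(var_S)
print(f"lambda = {lam:.4f}, S({t0}) = {S:.4f}, 95% CI ({lo:.4f}, {hi:.4f})")
```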

35 pages, 11039 KiB  
Article
Optimum Progressive Data Analysis and Bayesian Inference for Unified Progressive Hybrid INH Censoring with Applications to Diamonds and Gold
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(8), 559; https://doi.org/10.3390/axioms14080559 - 23 Jul 2025
Abstract
A novel unified progressive hybrid censoring scheme is introduced that combines progressive and hybrid censoring plans, allowing flexible test termination either after a prespecified number of failures or at a fixed time. This work develops both frequentist and Bayesian inferential procedures for estimating the parameters, reliability, and hazard rates of the inverted Nadarajah–Haghighi lifespan model when a sample is produced from such a censoring plan. Maximum likelihood estimators are obtained through the Newton–Raphson iterative technique. The delta method, based on the Fisher information matrix, is utilized to build asymptotic confidence intervals for each unknown quantity. In the Bayesian methodology, Markov chain Monte Carlo techniques with independent gamma priors are implemented to generate posterior summaries and credible intervals, addressing computational intractability through the Metropolis–Hastings algorithm. Extensive Monte Carlo simulations compare the efficiency and utility of frequentist and Bayesian estimates across multiple censoring designs, highlighting the superiority of Bayesian inference using informative prior information. Two real-world applications, drawn from gold and diamond durability studies, demonstrate the adaptability of the proposed estimators to the analysis of rare events in precious-materials science. Finally, by applying four different optimality criteria to multiple competing plans, the progressive censoring strategies that yield the best performance are identified.
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)

11 pages, 961 KiB  
Article
Viscous Cosmology in f(Q,Lm) Gravity: Insights from CC, BAO, and GRB Data
by Dheeraj Singh Rana, Sai Swagat Mishra, Aaqid Bhat and Pradyumn Kumar Sahoo
Universe 2025, 11(8), 242; https://doi.org/10.3390/universe11080242 - 23 Jul 2025
Abstract
In this article, we investigate the influence of viscosity on the evolution of the cosmos within the framework of the newly proposed f(Q, Lm) gravity. We consider a linear functional form f(Q, Lm) = αQ + βLm with a bulk viscous coefficient ζ = ζ₀ + ζ₁H for our analysis and obtain exact solutions to the field equations associated with a flat FLRW metric. In addition, we utilize Cosmic Chronometer (CC), CC + BAO, CC + BAO + GRB, and GRB data samples to determine the constrained values of the independent parameters in the derived exact solution. The likelihood function and the Markov Chain Monte Carlo (MCMC) sampling technique are combined to yield the posterior probability using Bayesian statistical methods. Furthermore, by comparing our results with the standard cosmological model, we find that the considered model supports the acceleration of the universe at late times.

13 pages, 793 KiB  
Communication
Gamma-Ray Bursts Calibrated by Using Artificial Neural Networks from the Pantheon+ Sample
by Zhen Huang, Xin Luo, Bin Zhang, Jianchao Feng, Puxun Wu, Yu Liu and Nan Liang
Universe 2025, 11(8), 241; https://doi.org/10.3390/universe11080241 - 23 Jul 2025
Abstract
In this paper, we calibrate the luminosity relation of gamma-ray bursts (GRBs) by employing artificial neural networks (ANNs) to analyze the Pantheon+ sample of type Ia supernovae (SNe Ia) in a manner independent of cosmological assumptions. The A219 GRB dataset is used to calibrate the Amati relation (Ep–Eiso) at low redshift within the ANN framework, facilitating the construction of the Hubble diagram at higher redshifts. Cosmological models are constrained with GRBs at high redshift and the latest observational Hubble data (OHD) via the Markov chain Monte Carlo numerical approach. For the Chevallier–Polarski–Linder (CPL) model within a flat universe, we obtain Ωm = 0.321 (+0.078/−0.069), h = 0.654 (+0.053/−0.071), w0 = −1.02 (+0.67/−0.50), and wa = −0.98 (+0.58/−0.58) at the 1σ confidence level, which indicates a preference for dark energy with potential redshift evolution (wa ≠ 0). These findings using ANNs align closely with those derived from GRBs calibrated using Gaussian processes (GPs).
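For reference, the two standard relations behind this analysis, the Amati correlation used for the calibration and the CPL equation of state, are commonly written as follows (a and b are the calibrated intercept and slope; the normalization of Ep is a convention that varies between papers):

```latex
% Amati relation (calibrated intercept a, slope b) and the
% Chevallier-Polarski-Linder equation of state.
\log_{10} E_{\mathrm{iso}} = a + b \, \log_{10} E_{\mathrm{p}},
\qquad
w(z) = w_0 + w_a \, \frac{z}{1+z}.
```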

25 pages, 10024 KiB  
Article
Forecasting with a Bivariate Hysteretic Time Series Model Incorporating Asymmetric Volatility and Dynamic Correlations
by Hong Thi Than
Entropy 2025, 27(7), 771; https://doi.org/10.3390/e27070771 - 21 Jul 2025
Abstract
This study explores asymmetric volatility structures within multivariate hysteretic autoregressive (MHAR) models that incorporate conditional correlations, aiming to flexibly capture the dynamic behavior of global financial assets. The proposed framework integrates regime switching and time-varying delays governed by a hysteresis variable, enabling the model to account for both asymmetric volatility and evolving correlation patterns over time. We adopt a fully Bayesian inference approach using adaptive Markov chain Monte Carlo (MCMC) techniques, allowing for the joint estimation of model parameters, Value-at-Risk (VaR), and Marginal Expected Shortfall (MES). The accuracy of VaR forecasts is assessed through two standard backtesting procedures. Our empirical analysis involves both simulated data and real-world financial datasets to evaluate the model's effectiveness in capturing downside risk dynamics. We demonstrate the application of the proposed method on three pairs of daily log returns involving the S&P500, Bank of America (BAC), Intercontinental Exchange (ICE), and Goldman Sachs (GS), present the results obtained, and compare them against the original model framework.
(This article belongs to the Section Information Theory, Probability and Statistics)
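For the two risk measures being forecast, a toy illustration (simulated Gaussian returns, not the paper's MHAR model): Value-at-Risk as a lower quantile of returns, and Marginal Expected Shortfall as an asset's expected return on days when the market falls below its VaR threshold.

```python
import numpy as np

rng = np.random.default_rng(7)
z = rng.standard_normal((2, 100_000))
market = 0.01 * z[0]                            # toy market returns
asset = 0.012 * (0.6 * z[0] + 0.8 * z[1])       # asset correlated with the market

alpha = 0.05
var_m = np.quantile(market, alpha)              # 5% VaR, quoted as a return
                                                # (the loss convention flips the sign)
mes = asset[market <= var_m].mean()             # expected asset return in market tail
print(f"market VaR(5%) = {var_m:.4f}, asset MES = {mes:.4f}")
```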

23 pages, 752 KiB  
Article
On Joint Progressively Censored Gumbel Type-II Distributions: (Non-) Bayesian Estimation with an Application to Physical Data
by Mustafa M. Hasaballah, Mahmoud E. Bakr, Oluwafemi Samson Balogun and Arwa M. Alshangiti
Axioms 2025, 14(7), 544; https://doi.org/10.3390/axioms14070544 - 20 Jul 2025
Abstract
This paper presents a comprehensive statistical analysis of the Gumbel Type-II distribution based on joint progressive Type-II censoring. It derives the maximum likelihood estimators for the distribution parameters and constructs their asymptotic confidence intervals. It investigates Bayesian estimation using non-informative and informative priors under the squared error loss function and the LINEX loss function, applying Markov Chain Monte Carlo methods. A detailed simulation study evaluates the estimators' performance in terms of average estimates, mean squared errors, and average confidence interval lengths. Results show that Bayesian estimators can outperform maximum likelihood estimators, especially with informative priors. A real data example demonstrates the practical use of the proposed methods. The analysis confirms that the Gumbel Type-II distribution with joint progressive censoring provides a flexible and effective model for lifetime data, enabling more accurate reliability assessment and risk analysis in engineering and survival studies.
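Once MCMC draws are available, the two Bayes point estimates named above follow directly: the posterior mean under squared error loss, and −(1/c) ln E[exp(−cθ)] under LINEX loss with shape parameter c. The gamma draws below are a stand-in for an actual posterior sample.

```python
import numpy as np

rng = np.random.default_rng(3)
draws = rng.gamma(shape=4.0, scale=0.5, size=50_000)    # stand-in posterior draws

theta_self = draws.mean()                               # squared error loss
c = 1.5                                                 # assumed LINEX shape
theta_linex = -np.log(np.mean(np.exp(-c * draws))) / c  # LINEX loss

print(f"SELF: {theta_self:.4f}   LINEX(c={c}): {theta_linex:.4f}")
```

For c > 0 the LINEX estimate sits below the posterior mean, reflecting the asymmetric penalty on overestimation.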

19 pages, 1760 KiB  
Article
A Multilevel Spatial Framework for E-Scooter Collision Risk Assessment in Urban Texas
by Nassim Sohaee, Arian Azadjoo Tabari and Rod Sardari
Safety 2025, 11(3), 67; https://doi.org/10.3390/safety11030067 - 17 Jul 2025
Abstract
As shared micromobility grows quickly in metropolitan settings, e-scooter safety issues have become more urgent. This paper uses a Bayesian hierarchical model applied to census block groups in several Texas metropolitan areas to construct a spatial risk assessment methodology for e-scooter crashes. Based on crash statistics from 2018 to 2024, we develop a severity-weighted crash risk index and combine it with variables related to land use, transportation, demographics, economics, and other factors. The model comprises a geographically structured random effect based on a Conditional Autoregressive (CAR) model, which accounts for residual spatial clustering beyond what the covariates capture. It also includes fixed effects for covariates such as car ownership and nightlife density, as well as regional random intercepts to account for city-level heterogeneity. Markov Chain Monte Carlo is used for model fitting; evaluation reveals robust spatial calibration and predictive ability. Several key predictors are statistically significant: a higher share of working-age residents is positively associated with crash frequency (incidence rate ratio (IRR) ≈ 1.55 per +10% population aged 18–64), as is a greater proportion of car-free households (IRR ≈ 1.20). In the built environment, entertainment-related employment density is strongly linked to elevated risk (IRR ≈ 1.37), and high intersection density similarly increases crash risk (IRR ≈ 1.32). In contrast, higher residential housing density has a protective effect (IRR ≈ 0.78), correlating with fewer crashes. Additionally, a sensitivity study shows that the risk index responds to policy scenarios, such as reducing car ownership or increasing employment density, and is sensitive to varying crash intensity weights. Results show notable collision hotspots near entertainment venues and central areas, as well as increased baseline risk in car-oriented urban environments. The results provide practical information for targeted initiatives to lower e-scooter collision risk and for safety planning.
(This article belongs to the Special Issue Road Traffic Risk Assessment: Control and Prevention of Collisions)

36 pages, 1465 KiB  
Article
USV-Affine Models Without Derivatives: A Bayesian Time-Series Approach
by Malefane Molibeli and Gary van Vuuren
J. Risk Financial Manag. 2025, 18(7), 395; https://doi.org/10.3390/jrfm18070395 - 17 Jul 2025
Abstract
We investigate affine term structure models (ATSMs) with unspanned stochastic volatility (USV), aiming to test their ability to generate accurate cross-sectional behavior and time-series dynamics of bond yields. Comparing restricted models with their USV counterparts, we test whether they produce both reasonable estimates of the short-rate variance and a good cross-sectional fit. In principle, risk-neutral dynamics in ATSMs should be estimated jointly from time series and options data; due to the scarcity of derivative data in emerging markets, we estimate the models using only time series of bond yields. A Bayesian estimation approach combining Markov Chain Monte Carlo (MCMC) and the Kalman filter is employed to recover the model parameters and filter out latent state variables. We further incorporate macroeconomic indicators and GARCH-based volatility as external validation of the filtered latent volatility process. The A1(4) USV model performs better both in and out of sample, although the tension between time-series and cross-sectional fit remains unresolved. Our findings suggest that even without derivative instruments, it is possible to identify and interpret risk-neutral dynamics and volatility risk using observable time-series data.
(This article belongs to the Section Financial Markets)
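A univariate sketch of the filtering building block: a local-level Kalman filter returning the log-likelihood that an MCMC sampler over the model parameters would evaluate at each step. The state-space form and parameter values are assumptions; the paper's affine model is multivariate.

```python
import numpy as np

def kalman_loglik(y, a, q, r, m0=0.0, p0=1.0):
    """Log-likelihood of x_t = a*x_{t-1} + w (var q), y_t = x_t + v (var r)."""
    m, p, ll = m0, p0, 0.0
    for obs in y:
        m, p = a * m, a * a * p + q               # predict
        s = p + r                                 # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * s) + (obs - m) ** 2 / s)
        k = p / s                                 # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p     # update
    return ll

rng = np.random.default_rng(6)
x, ys = 0.0, []
for _ in range(300):                              # simulate a latent AR(1) + noise
    x = 0.9 * x + rng.normal(scale=0.3)
    ys.append(x + rng.normal(scale=0.5))
print(kalman_loglik(np.array(ys), a=0.9, q=0.09, r=0.25))
```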

30 pages, 3032 KiB  
Article
A Bayesian Additive Regression Trees Framework for Individualized Causal Effect Estimation
by Lulu He, Lixia Cao, Tonghui Wang, Zhenqi Cao and Xin Shi
Mathematics 2025, 13(13), 2195; https://doi.org/10.3390/math13132195 - 4 Jul 2025
Abstract
In causal inference research, accurate estimation of individualized treatment effects (ITEs) is at the core of effective intervention. This paper proposes a dual-structure ITE-estimation model based on Bayesian Additive Regression Trees (BART), which constructs independent BART sub-models for the treatment and control groups, estimates ITEs using the potential-outcome framework, and enhances posterior stability and estimation reliability through Markov Chain Monte Carlo (MCMC) sampling. Based on psychological stress questionnaire data from graduate students, the study first integrates BART with the Shapley value method to identify employment pressure as a key driving factor and reveals substantial heterogeneity in ITEs across subgroups. The study then constructs an ITE model using the dual-structure BART framework (BART-ITE), with employment pressure defined as the treatment variable. Experimental results show that the model performs well in terms of credible-interval width and ranking ability, demonstrating superior heterogeneity detection and individual-level sorting. External validation using both the bootstrap method and matching-based pseudo-ITE estimation confirms the robustness of the proposed model. Compared with mainstream meta-learning methods such as S-Learner, X-Learner, and Bayesian Causal Forest, the dual-structure BART-ITE model achieves a favorable balance between root mean square error and bias. In summary, it offers clear advantages in capturing ITE heterogeneity and enhancing estimation reliability and individualized decision-making.
(This article belongs to the Special Issue Bayesian Learning and Its Advanced Applications)
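A bare-bones sketch of the dual-model structure described above (often called a T-learner), with scikit-learn gradient boosting standing in for the BART sub-models: fit one model per treatment arm and difference the predicted potential outcomes. The simulated data and heterogeneous effect are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(5)
n, p = 2_000, 5
X = rng.normal(size=(n, p))
treat = rng.integers(0, 2, size=n).astype(bool)       # randomized treatment
true_ite = 1.0 + 0.5 * X[:, 0]                        # heterogeneous effect
y = X[:, 1] + treat * true_ite + rng.normal(scale=0.5, size=n)

m1 = GradientBoostingRegressor().fit(X[treat], y[treat])     # treated arm
m0 = GradientBoostingRegressor().fit(X[~treat], y[~treat])   # control arm
ite_hat = m1.predict(X) - m0.predict(X)                      # per-unit effect

print("RMSE vs. true ITE:", np.sqrt(np.mean((ite_hat - true_ite) ** 2)))
```

BART would additionally yield posterior draws of both surfaces, so the same difference gives credible intervals per unit rather than a point estimate.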

17 pages, 572 KiB  
Article
Statistical Analysis Under a Random Censoring Scheme with Applications
by Mustafa M. Hasaballah and Mahmoud M. Abdelwahab
Symmetry 2025, 17(7), 1048; https://doi.org/10.3390/sym17071048 - 3 Jul 2025
Cited by 1
Abstract
The Gumbel Type-II distribution is a widely recognized and frequently utilized lifetime distribution, playing a crucial role in reliability engineering. This paper focuses on statistical inference for the Gumbel Type-II distribution under a random censoring scheme. From a frequentist perspective, point estimates of the unknown parameters are derived using maximum likelihood estimation, and confidence intervals are constructed based on the Fisher information matrix. From a Bayesian perspective, Bayes estimates of the parameters are obtained using the Markov Chain Monte Carlo method, and the average lengths of credible intervals are calculated. The Bayesian inference is performed under both the squared error loss function and the general entropy loss function. Additionally, a numerical simulation is conducted to evaluate the performance of the proposed methods. To demonstrate their practical applicability, a real-world example is provided, illustrating the application and development of these inference techniques. In conclusion, the Bayesian method appears to outperform the other approaches, although each method offers unique advantages.

30 pages, 16041 KiB  
Article
Estimation of Inverted Weibull Competing Risks Model Using Improved Adaptive Progressive Type-II Censoring Plan with Application to Radiobiology Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Symmetry 2025, 17(7), 1044; https://doi.org/10.3390/sym17071044 - 2 Jul 2025
Abstract
This study focuses on estimating the unknown parameters and the reliability function of the inverted Weibull distribution, using an improved adaptive progressive Type-II censoring scheme under a competing risks model. Both classical and Bayesian estimation approaches are explored to offer a thorough analysis. Under the classical approach, maximum likelihood estimators are obtained for the unknown parameters and the reliability function. Approximate confidence intervals are also constructed to assess the uncertainty in the estimates. From a Bayesian standpoint, symmetric Bayes estimates and highest posterior density credible intervals are computed using Markov Chain Monte Carlo sampling, assuming a symmetric squared error loss function. An extensive simulation study is carried out to assess how well the proposed methods perform under different experimental conditions, showing promising accuracy. To demonstrate the practical use of these methods, a real dataset is analyzed, consisting of the survival times of male mice aged 35 to 42 days after being exposed to 300 roentgens of X-ray radiation. The analysis demonstrated that the inverted Weibull distribution is well-suited for modeling the given dataset. Furthermore, the Bayesian estimation method, considering both point estimates and interval estimates, was found to be more effective than the classical approach in estimating the model parameters as well as the reliability function.
