Search Results (87)

Search Parameters:
Keywords = random censoring

18 pages, 366 KiB  
Article
Nonparametric Transformation Models for Double-Censored Data with Crossed Survival Curves: A Bayesian Approach
by Ping Xu, Ruichen Ni, Shouzheng Chen, Zhihua Ma and Chong Zhong
Mathematics 2025, 13(15), 2461; https://doi.org/10.3390/math13152461 - 30 Jul 2025
Abstract
Double-censored data are frequently encountered in pharmacological and epidemiological studies, where the failure time can only be observed within a certain range and is otherwise either left- or right-censored. In this paper, we present a Bayesian approach for analyzing double-censored survival data with crossed survival curves. We introduce a novel pseudo-quantile I-splines prior to model monotone transformations under both random and fixed censoring schemes. Additionally, we incorporate categorical heteroscedasticity using the dependent Dirichlet process (DDP), enabling the estimation of crossed survival curves. Comprehensive simulations further validate the robustness and accuracy of the method, particularly under the fixed censoring scheme, where traditional approaches may not be applicable. In the randomized AIDS clinical trial, incorporating the categorical heteroscedasticity yields a new finding: the effect of baseline log RNA levels is significant. The proposed framework provides a flexible and reliable tool for survival analysis, offering an alternative to parametric and semiparametric models.
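As a toy illustration of the double-censoring mechanism this abstract describes, the sketch below generates latent failure times and records them exactly only inside an observation window; the exponential lifetime model and the window endpoints are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
failure = rng.exponential(scale=2.0, size=n)   # latent failure times (toy model)
left, right = 0.5, 5.0                         # hypothetical observation window

# A failure time is observed exactly only inside [left, right];
# otherwise it is left- or right-censored at the window boundary.
obs = np.clip(failure, left, right)
status = np.where(failure < left, -1, np.where(failure > right, 1, 0))
```

Each record is then the pair (obs, status), with status −1/0/+1 marking left-censored, exact, and right-censored observations.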

13 pages, 600 KiB  
Article
Frequentist and Bayesian Estimation Under Progressive Type-II Random Censoring for a Two-Parameter Exponential Distribution
by Rajni Goel, Mahmoud M. Abdelwahab and Tejaswar Kamble
Symmetry 2025, 17(8), 1205; https://doi.org/10.3390/sym17081205 - 29 Jul 2025
Abstract
In medical research, random censoring often occurs due to unforeseen subject withdrawals, whereas progressive censoring is intentionally applied to minimize time and resource requirements during experimentation. This work focuses on estimating the parameters of a two-parameter exponential distribution under a progressive Type-II random censoring scheme, which integrates both censoring strategies. The use of symmetric properties in the failure and censoring time models, arising from a shared location parameter, facilitates a balanced and robust inferential framework. This symmetry ensures interpretational clarity and enhances the tractability of both frequentist and Bayesian methods. Maximum likelihood estimators (MLEs) are obtained, along with asymptotic confidence intervals. A Bayesian approach is also introduced, utilizing inverse gamma priors, and Gibbs sampling is implemented to derive Bayesian estimates. The effectiveness of the proposed methodologies is assessed through extensive Monte Carlo simulations and demonstrated using a real dataset.
(This article belongs to the Section Mathematics)
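The scheme above combines random withdrawals with planned removals. A minimal sketch of the simpler ingredient — maximum likelihood under plain random right-censoring for a one-parameter exponential, not the paper's two-parameter progressive scheme — looks like this; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
lam = 0.8                                   # true failure rate (assumed, for the demo)
t = rng.exponential(1 / lam, size=n)        # failure times
c = rng.exponential(2.0, size=n)            # random censoring times (withdrawals)

y = np.minimum(t, c)                        # observed time
d = (t <= c).astype(int)                    # 1 = failure observed, 0 = censored

# MLE of the exponential rate under random right-censoring:
# lambda_hat = (number of observed failures) / (total time on test)
lam_hat = d.sum() / y.sum()
```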

16 pages, 666 KiB  
Article
Bayesian Analysis of the Maxwell Distribution Under Progressively Type-II Random Censoring
by Rajni Goel, Mahmoud M. Abdelwahab and Mustafa M. Hasaballah
Axioms 2025, 14(8), 573; https://doi.org/10.3390/axioms14080573 - 25 Jul 2025
Abstract
Accurate modeling of product lifetimes is vital in reliability analysis and engineering to ensure quality and maintain competitiveness. This paper proposes the progressively randomly censored Maxwell distribution, which incorporates both progressive Type-II and random censoring within the Maxwell distribution framework. The model allows for the planned removal of surviving units at specific stages of an experiment, accounting for both deliberate and random censoring events. It is assumed that survival and censoring times each follow a Maxwell distribution, though with distinct parameters. Both frequentist and Bayesian approaches are employed to estimate the model parameters. In the frequentist approach, maximum likelihood estimators and their corresponding confidence intervals are derived. In the Bayesian approach, Bayes estimators are obtained using an inverse gamma prior and evaluated through a Markov Chain Monte Carlo (MCMC) method under the squared error loss function (SELF). A Monte Carlo simulation study evaluates the performance of the proposed estimators. The practical relevance of the methodology is demonstrated using a real data set.
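For intuition, the complete-sample MLE of the Maxwell scale parameter (not the progressively censored estimator developed in the paper) follows from E[X²] = 3σ²; a sketch under assumed illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma = 20_000, 1.5                # sample size and true scale (illustrative)

# A Maxwell(sigma) variate is the Euclidean norm of a 3-D N(0, sigma^2 I) vector
x = np.linalg.norm(rng.normal(0.0, sigma, size=(n, 3)), axis=1)

# Complete-sample MLE of sigma: since E[X^2] = 3 sigma^2,
# sigma_hat^2 = sum(x^2) / (3 n)
sigma_hat = np.sqrt((x ** 2).sum() / (3 * n))
```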

20 pages, 437 KiB  
Article
A Copula-Driven CNN-LSTM Framework for Estimating Heterogeneous Treatment Effects in Multivariate Outcomes
by Jong-Min Kim
Mathematics 2025, 13(15), 2384; https://doi.org/10.3390/math13152384 - 24 Jul 2025
Abstract
Estimating heterogeneous treatment effects (HTEs) across multiple correlated outcomes poses significant challenges due to complex dependency structures and diverse data types. In this study, we propose a novel deep learning framework integrating empirical copula transformations with a CNN-LSTM (Convolutional Neural Networks and Long Short-Term Memory networks) architecture to capture nonlinear dependencies and temporal dynamics in multivariate treatment effect estimation. The empirical copula transformation, a rank-based nonparametric approach, preprocesses input covariates to better represent the underlying joint distributions before modeling. We compare this method with a baseline CNN-LSTM model lacking copula preprocessing and a nonparametric tree-based approach, the Causal Forest, grounded in generalized random forests for HTE estimation. Our framework accommodates continuous, count, and censored survival outcomes simultaneously through a multitask learning setup with customized loss functions, including Cox partial likelihood for survival data. We evaluate model performance under varying treatment perturbation rates via extensive simulation studies, demonstrating that the Empirical Copula CNN-LSTM achieves superior accuracy and robustness in average treatment effect (ATE) and conditional average treatment effect (CATE) estimation. These results highlight the potential of copula-based deep learning models for causal inference in complex multivariate settings, offering valuable insights for personalized treatment strategies.
(This article belongs to the Special Issue Current Developments in Theoretical and Applied Statistics)
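The empirical copula transformation mentioned above is a rank-based preprocessing step that maps each covariate to pseudo-observations rank/(n + 1) in (0, 1). A minimal sketch with illustrative data, not the paper's pipeline:

```python
import numpy as np

def empirical_copula_transform(X):
    """Map each column of X to pseudo-observations rank / (n + 1) in (0, 1)."""
    n = X.shape[0]
    ranks = X.argsort(axis=0).argsort(axis=0) + 1   # 1-based ranks per column
    return ranks / (n + 1)

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4)) ** 3                  # skewed toy covariates
U = empirical_copula_transform(X)
```

Because the transform is monotone in each column, it preserves the ordering of every covariate while removing marginal skew.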

17 pages, 572 KiB  
Article
Statistical Analysis Under a Random Censoring Scheme with Applications
by Mustafa M. Hasaballah and Mahmoud M. Abdelwahab
Symmetry 2025, 17(7), 1048; https://doi.org/10.3390/sym17071048 - 3 Jul 2025
Cited by 1
Abstract
The Gumbel Type-II distribution is a widely recognized and frequently utilized lifetime distribution, playing a crucial role in reliability engineering. This paper focuses on the statistical inference of the Gumbel Type-II distribution under a random censoring scheme. From a frequentist perspective, point estimates for the unknown parameters are derived using the maximum likelihood estimation method, and confidence intervals are constructed based on the Fisher information matrix. From a Bayesian perspective, Bayes estimates of the parameters are obtained using the Markov Chain Monte Carlo method, and the average lengths of credible intervals are calculated. The Bayesian inference is performed under both the squared error loss function and the general entropy loss function. Additionally, a numerical simulation is conducted to evaluate the performance of the proposed methods. To demonstrate their practical applicability, a real-world example is provided, illustrating the application and development of these inference techniques. In conclusion, the Bayesian method appears to outperform the other approaches, although each method offers unique advantages.
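A Gumbel Type-II sample can be drawn by inverse-CDF sampling from F(x) = exp(−β x^(−α)) on x > 0; the sketch below checks the empirical CDF against the closed form, with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, beta = 2.0, 1.5          # illustrative Gumbel Type-II parameters
n = 100_000

# Inverse-CDF sampling: F(x) = exp(-beta * x**(-alpha)), so
# x = (beta / (-log(u)))**(1 / alpha) for u ~ Uniform(0, 1)
u = rng.uniform(size=n)
x = (beta / (-np.log(u))) ** (1 / alpha)

# The empirical CDF at a test point should match the closed form
x0 = 2.0
ecdf = (x <= x0).mean()
cdf = np.exp(-beta * x0 ** (-alpha))
```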

14 pages, 698 KiB  
Article
Inferring the Timing of Antiretroviral Therapy by Zero-Inflated Random Change Point Models Using Longitudinal Data Subject to Left-Censoring
by Hongbin Zhang, McKaylee Robertson, Sarah L. Braunstein, David B. Hanna, Uriel R. Felsen, Levi Waldron and Denis Nash
Algorithms 2025, 18(6), 346; https://doi.org/10.3390/a18060346 - 5 Jun 2025
Abstract
We propose a new random change point model that utilizes routinely recorded individual-level HIV viral load data to estimate the timing of antiretroviral therapy (ART) initiation in people living with HIV. The change point distribution is assumed to follow a zero-inflated exponential distribution for the longitudinal data, which is also subject to left-censoring, and the underlying data-generating mechanism is a nonlinear mixed-effects model. We extend the Stochastic EM (StEM) algorithm by combining a Gibbs sampler with Metropolis–Hastings sampling. We apply the method to real HIV data to infer the timing of ART initiation since diagnosis. Additionally, we conduct simulation studies to assess the performance of the proposed method.

26 pages, 12878 KiB  
Article
Reliability Estimation for the Inverse Chen Distribution Under Adaptive Progressive Censoring with Binomial Removals: A Framework for Asymmetric Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Symmetry 2025, 17(6), 812; https://doi.org/10.3390/sym17060812 - 23 May 2025
Abstract
Traditional reliability methods using fixed removal plans often overlook withdrawal randomness, leading to biased estimates for asymmetric data. This study advances classical and Bayesian frameworks for the inverse Chen distribution, which is suited to modeling asymmetric data, under adaptive progressively Type-II censoring with binomial removals. Here, removals after each failure follow a dynamic binomial process, yielding a more realistic framework for reliability studies. Maximum likelihood estimates are computed numerically, with confidence intervals derived asymptotically. Bayesian approaches employ gamma priors, symmetric squared error loss, and posterior sampling for estimates and credible intervals. A simulation study validates the methods, while two asymmetric real-world applications demonstrate practicality: (1) analyzing diamond sizes from South-West Africa, capturing skewed geological distributions, and (2) modeling failure times of airborne communication transceivers, vital for aviation safety. The flexibility of the inverse Chen distribution in handling asymmetric data addresses the limitations of symmetric assumptions, offering precise reliability tools for complex scenarios. This integration of adaptive censoring and asymmetric distributions advances reliability analysis, providing robust solutions where traditional approaches falter.
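Binomial removals in a progressive Type-II design can be sketched as follows; the removal probability and test sizes are illustrative, and the cap on removals (keeping enough units alive for the remaining failures) is one common convention, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, p = 30, 10, 0.3        # units on test, failures to observe, removal prob.

removals = []
remaining = n
for i in range(m):
    remaining -= 1                           # the i-th failure occurs
    if i < m - 1:
        r = rng.binomial(remaining, p)       # binomial random withdrawals
        # never withdraw so many that fewer than the m - 1 - i still-needed
        # failures could be observed
        r = min(r, remaining - (m - 1 - i))
    else:
        r = remaining                        # final stage: remove all survivors
    removals.append(int(r))
    remaining -= r
```

By construction the m failures plus the removals account for every unit on test.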

30 pages, 515 KiB  
Article
Parameter Estimation of the Lomax Lifetime Distribution Based on Middle-Censored Data: Methodology, Applications, and Comparative Analysis
by Peiyao Ren, Wenhao Gui and Shan Liang
Axioms 2025, 14(5), 330; https://doi.org/10.3390/axioms14050330 - 26 Apr 2025
Abstract
The Lomax distribution has important applications in survival analysis, reliability engineering, insurance, finance, and other fields. Middle-censoring is an important censoring scheme in which data are censored within random intervals. This paper studies parameter estimation for the Lomax distribution based on middle-censored data. The expectation–maximization algorithm is employed to compute the maximum likelihood estimates of the two unknown parameters of the Lomax distribution. After imputing the censored observations using the midpoint approach, the parameter estimates are obtained by two computational methods: the Newton–Raphson iteration method and the fixed-point method. Moreover, calculation methods for the asymptotic confidence intervals of the two parameters are provided, with confidence interval coverage serving as one of the criteria for evaluating estimation performance. For Bayesian estimation, the shape parameter is estimated using a gamma prior distribution, and the Gibbs sampling method is employed for the solution. Finally, both simulated and real data are used to compare the accuracy of the various estimation methods.
(This article belongs to the Special Issue Computational Statistics and Its Applications)
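The midpoint approach replaces each middle-censored interval by its midpoint before estimation; a minimal sketch with an illustrative Lomax lifetime model and hypothetical censoring intervals:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000
t = rng.pareto(2.5, size=n)              # Lomax(alpha=2.5, lambda=1) lifetimes

# Middle-censoring: each unit carries a random interval; if the lifetime
# falls inside it, only the interval (l, r) is observed, not t itself
l = rng.exponential(1.0, size=n)
r = l + rng.exponential(0.5, size=n)
censored = (t > l) & (t < r)

# Midpoint approach: replace interval-censored lifetimes by (l + r) / 2
t_imputed = np.where(censored, (l + r) / 2, t)
```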

24 pages, 1017 KiB  
Article
Parametric Estimation and Analysis of Lifetime Models with Competing Risks Under Middle-Censored Data
by Shan Liang and Wenhao Gui
Appl. Sci. 2025, 15(8), 4288; https://doi.org/10.3390/app15084288 - 13 Apr 2025
Cited by 1
Abstract
Middle-censoring is a general censoring mechanism: the exact lifetimes are observed only for a portion of the units, while for the others only the random interval within which the failure occurs is known. In this study, we focus on statistical inference for middle-censored data with competing risks. The latent failure times are assumed to be independent and to follow Burr-XII distributions with distinct parameters. To begin with, we derive the maximum likelihood estimators for the unknown parameters, proving their existence and uniqueness. Additionally, asymptotic confidence intervals are constructed using the observed Fisher information matrix. Furthermore, Bayesian estimates under the squared error loss function and the corresponding highest posterior density intervals are obtained through the Gibbs sampling method. A simulation study is carried out to assess the performance of all proposed estimators. Lastly, an analysis of a practical dataset is provided to demonstrate the inferential processes developed.

30 pages, 5133 KiB  
Article
Adipocytokine Protein Expression from Visceral Fat Differs Significantly Based on Diet, Sex, and Age in C3H/HeJ Mice Fed Long-Term, High-Fat Diets, ± Ammonium-Hydroxide-Supplemented Dietary Protein
by Caleb Boren, Benjamin Barr, Noshin Mubtasim and Lauren Gollahon
Curr. Issues Mol. Biol. 2025, 47(4), 218; https://doi.org/10.3390/cimb47040218 - 23 Mar 2025
Abstract
(1) Background: Overconsumption of processed meats, fats, and carbohydrates drives the obesity epidemic in the USA. Associated with this epidemic are increases in metabolic diseases, such as type 2 diabetes, cardiovascular disease, and cancer. In this study, protein levels of adipocytokines isolated from visceral fat in mice fed high-fat diets with proteins modified through ammonium supplementation were analyzed to determine changes that occur as a result of dietary protein source and its modification based on age or sex. (2) Methods: Male and female C3H/HeJ mice were randomized into six customized diets—Group 1: CCN = Control Chow (CC) + Ammonium Hydroxide Enhancement (AHE); Group 2: CC = Control Chow; Group 3: HFBN = High Fat (HF) AHE Dietary Beef; Group 4: HFB = HF Beef; Group 5: HFCN = HF AHE Dietary Casein; Group 6: HFC = HF Dietary Casein. Mice were censored at six-month intervals, and visceral fat was collected for analysis. This study highlights sex- and age-related changes in cellular adipocytokine protein expression from 12 to 18 months. (3) Results: When compared to dietary casein, dietary-beef-fed mice showed increased expression of adiponectin, leptin, and MCP-1. In dietary casein protein diets, high fat content was correlated with the expression of pro-inflammatory adipocytokines leptin, MCP-1, resistin, VEGF-A, and TIMP-1. Sex-related differences were observed in adiponectin, leptin, and MCP-1 expression levels. AHE of dietary protein decreased the expression of adiponectin, leptin, MCP-1, and TIMP-1. Age-related changes in expression were observed in leptin, MCP-1, and VEGF-A. (4) Conclusions: Our results indicate that the source of dietary protein plays a critical role in determining adipocytokine expression in WAT. Furthermore, this study shows that in addition to dietary protein type (beef or casein), AHE and fat content also impact the relative expression of both pro-inflammatory and anti-inflammatory adipocytokines based on sex over time, with leptin and MCP-1 identified as the most frequently affected.

28 pages, 1067 KiB  
Article
Inference Based on Progressive-Stress Accelerated Life-Testing for Extended Distribution via the Marshall-Olkin Family Under Progressive Type-II Censoring with Optimality Techniques
by Ehab M. Almetwally, Osama M. Khaled and Haroon M. Barakat
Axioms 2025, 14(4), 244; https://doi.org/10.3390/axioms14040244 - 23 Mar 2025
Abstract
This paper explores a progressive-stress accelerated life test under progressive type-II censoring with binomial random removal. It assumes a cumulative exposure model in which the lifetimes of test units follow a Marshall–Olkin length-biased exponential distribution. The study derives maximum likelihood and Bayes estimates of the model parameters and constructs Bayes estimates of the unknown parameters under various loss functions. In addition, this study provides approximate, credible, and bootstrap confidence intervals for the estimators. Moreover, it evaluates three optimal test methods to determine the most effective censoring approach based on various optimality criteria. A real-life dataset is analyzed to demonstrate the proposed procedures, and simulation studies are used to compare two different designs of the progressive-stress test.
(This article belongs to the Special Issue Stochastic Modeling and Optimization Techniques)

17 pages, 1088 KiB  
Article
Bayesian Estimation of the Stress–Strength Parameter for Bivariate Normal Distribution Under an Updated Type-II Hybrid Censoring
by Yu-Jau Lin, Yuhlong Lio and Tzong-Ru Tsai
Mathematics 2025, 13(5), 792; https://doi.org/10.3390/math13050792 - 27 Feb 2025
Abstract
To save time and cost in parameter inference, the type-II hybrid censoring scheme has been broadly applied to collect one-component samples. In the current study, one of the essential parameters for comparing two distributions, the stress–strength probability δ=Pr(X<Y), is investigated under a newly proposed type-II hybrid censoring scheme that generates a type-II hybrid censored two-component sample from the bivariate normal distribution. The difficulties in extending a one-component type-II hybrid censored sample to a two-component one are retaining useful information from both components and establishing the corresponding likelihood function. To overcome these two difficulties, the proposed scheme works as follows. The observed values of the first component, X, of the data pairs (X,Y) are recorded up to a random time τ=max{Xr:n,T}, where Xr:n is the rth order statistic among n items, r<n are two pre-specified positive integers, and T is a pre-determined experimental time. The observed value of the other component, Y, is recorded only if it is the counterpart of X and is also observed before time τ; otherwise, only whether it occurred by τ is recorded. Under the proposed scheme, the likelihood function of the new bivariate censored data is derived; it includes double improper integrals so that all cases in which a component is unobserved are covered without loss of information. A Markov chain Monte Carlo (MCMC) method is applied to find the Bayesian estimates of the bivariate distribution parameters and the stress–strength probability δ. An extensive simulation study demonstrates the performance of the developed methods. Finally, the proposed methodologies are applied to a type-II hybrid censored sample generated from a bivariate normal distribution.
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)
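For the bivariate normal, δ = Pr(X < Y) has a closed form, since X − Y is normal with mean μX − μY and variance σX² + σY² − 2 Cov(X, Y). The sketch below checks a Monte Carlo estimate against it under illustrative parameters, using complete samples and ignoring the paper's censoring scheme:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)
mu = np.array([1.0, 1.5])                        # (mu_X, mu_Y), illustrative
cov = np.array([[1.0, 0.3], [0.3, 0.8]])         # illustrative covariance matrix

# Monte Carlo estimate of delta = Pr(X < Y)
xy = rng.multivariate_normal(mu, cov, size=200_000)
delta_mc = (xy[:, 0] < xy[:, 1]).mean()

# Closed form: X - Y ~ N(mu_X - mu_Y, sx^2 + sy^2 - 2 cov(X, Y)),
# so delta = Phi((mu_Y - mu_X) / sd(X - Y))
z = (mu[1] - mu[0]) / sqrt(cov[0, 0] + cov[1, 1] - 2 * cov[0, 1])
delta_exact = 0.5 * (1 + erf(z / sqrt(2)))
```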

20 pages, 2869 KiB  
Article
Censoring Sensitivity Analysis for Benchmarking Survival Machine Learning Methods
by János Báskay, Tamás Mezei, Péter Banczerowski, Anna Horváth, Tamás Joó and Péter Pollner
Sci 2025, 7(1), 18; https://doi.org/10.3390/sci7010018 - 13 Feb 2025
Cited by 1
Abstract
(1) Background: Survival analysis models in clinical research must effectively handle censored data, where complete survival times are unknown for some subjects. While established methodologies exist for validating standard machine learning models, current benchmarking approaches rarely assess model robustness under varying censoring conditions. This limitation creates uncertainty about model reliability in real-world applications where censoring patterns may differ from training data. We address this gap by introducing a systematic benchmarking methodology focused on censoring sensitivity. (2) Methods: We developed a benchmarking framework that assesses survival models through controlled modification of censoring conditions. Five models were evaluated: Cox proportional hazards, survival tree, random survival forest, gradient-boosted survival analysis, and mixture density networks. The framework systematically reduced observation periods and increased censoring rates while measuring performance through multiple metrics following Bayesian hyperparameter optimization. (3) Results: Model performance showed greater sensitivity to increased censoring rates than to reduced observation periods. Non-linear models, especially mixture density networks, exhibited higher vulnerability to data quality degradation. Statistical comparisons became increasingly challenging with higher censoring rates due to widened confidence intervals. (4) Conclusions: Our methodology provides a new standard for evaluating survival analysis models, revealing the critical impact of censoring on model performance. These findings offer practical guidance for model selection and development in clinical applications, emphasizing the importance of robust censoring handling strategies.
(This article belongs to the Section Computer Sciences, Mathematics and AI)
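Reducing the observation period, as in the benchmarking framework above, amounts to administrative censoring at a horizon τ; a minimal sketch, where the exponential lifetimes and the two horizons are illustrative assumptions:

```python
import numpy as np

def truncate_followup(time, event, tau):
    """Administratively censor a survival dataset at horizon tau."""
    t_new = np.minimum(time, tau)          # follow-up stops at tau
    e_new = event & (time <= tau)          # events after tau become censored
    return t_new, e_new

rng = np.random.default_rng(8)
time = rng.exponential(3.0, size=1_000)
event = np.ones(1_000, dtype=bool)         # fully observed to start

t5, e5 = truncate_followup(time, event, tau=5.0)
t2, e2 = truncate_followup(time, event, tau=2.0)
```

Shrinking τ monotonically raises the censoring rate, which is exactly the knob the benchmark turns.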

24 pages, 534 KiB  
Article
Inference for Two-Parameter Birnbaum–Saunders Distribution Based on Type-II Censored Data with Application to the Fatigue Life of Aluminum Coupon Cuts
by Omar M. Bdair
Mathematics 2025, 13(4), 590; https://doi.org/10.3390/math13040590 - 11 Feb 2025
Cited by 1
Abstract
This study addresses the problem of parameter estimation and prediction for type-II censored data from the two-parameter Birnbaum–Saunders (BS) distribution. The BS distribution is commonly used in reliability analysis, particularly in modeling fatigue life. Accurate estimation and prediction are crucial in many fields where censored data frequently appear, such as materials science, medical studies, and industrial applications. This paper presents both frequentist and Bayesian approaches to estimate the shape and scale parameters of the BS distribution, along with the prediction of unobserved failure times. Random data are generated from the BS distribution under type-II censoring, where a pre-specified number of failures (m) is observed. The generated data are used to compute the maximum likelihood estimates and Bayesian inferences and to evaluate their performance. The Bayesian method employs Markov Chain Monte Carlo (MCMC) sampling for point predictions and credible intervals. We apply the methods both to datasets generated under type-II censoring and to real-world data on the fatigue life of 6061-T6 aluminum coupons. Although the results show that the two methods yield similar parameter estimates, the Bayesian approach offers more flexible and reliable prediction intervals. Extensive R code is used to explain the practical application of these methods. Our findings confirm the advantages of Bayesian inference in handling censored data, especially when prior information is available for estimation. This work not only supports the theoretical understanding of the BS distribution under type-II censoring but also provides practical tools for analyzing real data in reliability and survival studies. Future research will discuss extensions of these methods to the multi-sample progressive censoring model with larger datasets and the integration of degradation models commonly encountered in industrial applications.
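Type-II censored Birnbaum–Saunders data can be generated from the standard stochastic representation T = β(αZ/2 + √((αZ/2)² + 1))² with Z ~ N(0, 1); a sketch with illustrative parameters (the paper's estimation steps are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(9)
alpha, beta = 0.5, 2.0                  # shape and scale (illustrative)
n, m = 50, 35                           # units on test, observed failures

# Standard stochastic representation of a Birnbaum-Saunders variate:
# T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2 with Z ~ N(0, 1)
z = rng.standard_normal(n)
w = alpha * z / 2
t = beta * (w + np.sqrt(w ** 2 + 1)) ** 2

# Type-II censoring: the experiment stops at the m-th failure
t_sorted = np.sort(t)
observed = t_sorted[:m]                 # exact failure times
censor_time = t_sorted[m - 1]           # remaining n - m units censored here
```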

17 pages, 316 KiB  
Article
On the Distribution of the Random Sum and Linear Combination of Independent Exponentiated Exponential Random Variables
by Abd El-Raheem M. Abd El-Raheem and Mona Hosny
Symmetry 2025, 17(2), 200; https://doi.org/10.3390/sym17020200 - 27 Jan 2025
Abstract
The exponentiated exponential distribution has received great attention from many statisticians due to its popularity, its many applications, and the fact that it is an efficient alternative to such well-known distributions as the Weibull and gamma distributions. Many statisticians have studied the mathematical properties of this distribution and estimated its parameters under different censoring schemes. However, the distribution of the random sum, the distribution of a linear combination, and the value of the reliability index R = P(X2 < X1) in the case of unequal scale parameters appear not to have been available for this distribution. Therefore, in this article, we present saddlepoint approximations to the distribution of the random sum, the distribution of a linear combination, and the value of the reliability index R = P(X2 < X1) for exponentiated exponential variates. These saddlepoint approximations are computationally appealing, and numerical studies confirm their accuracy. In addition to its accuracy, the saddlepoint approximation method saves considerable time compared to simulation, providing an outstanding balance between precision and computational efficiency.
