Search Results (216)

Search Parameters:
Keywords = HASTE

41 pages, 1517 KB  
Article
The Half-Logistic Generalized Power Lindley Distribution: Theory and Applications
by Ayşe Metin Karakaş and Fatma Bulut
Symmetry 2025, 17(11), 1936; https://doi.org/10.3390/sym17111936 - 12 Nov 2025
Abstract
In this paper, the half-logistic generalized power Lindley distribution, a new two-parameter lifetime model for positive and heavy-tailed data, is proposed and studied. Several mathematical properties are derived, including closed-form expressions for the density, distribution, survival, and hazard functions and the Lambert W quantile function, as well as series expansions for moments, skewness, kurtosis, and Rényi entropy. Parameter estimation is performed using maximum likelihood and Bayesian methods, where Bayesian estimation is implemented via the Metropolis–Hastings algorithm. A Monte Carlo simulation study is conducted to evaluate the estimators’ performance, showing decreasing bias and mean squared error with larger samples. Finally, three real-world datasets are analyzed to demonstrate that the proposed distribution provides a superior fit compared to Lindley-type competitors and the Weibull distribution, based on likelihood values, information criteria, and empirical diagnostics.
(This article belongs to the Section Mathematics)
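
A recurring computational ingredient across these results is Bayesian estimation via a random-walk Metropolis–Hastings sampler. As a reader's aid, here is a minimal generic sketch in Python; the log-posterior callable, the proposal scale, and the example target are illustrative placeholders, not this paper's implementation.

```python
import numpy as np

def random_walk_mh(log_post, theta0, step=0.1, n_iter=10_000, seed=0):
    """Generic random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_post -- callable returning the log posterior density up to a constant
    theta0   -- initial parameter vector
    step     -- standard deviation of the Gaussian proposal
    """
    rng = np.random.default_rng(seed)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp = log_post(theta)
    samples = np.empty((n_iter, theta.size))
    accepted = 0
    for t in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        # Symmetric proposal, so accept with probability min(1, pi(prop)/pi(theta))
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
            accepted += 1
        samples[t] = theta
    return samples, accepted / n_iter

# Example: sample a standard normal target
draws, acc_rate = random_walk_mh(lambda th: -0.5 * np.sum(th**2), [0.0], step=1.0)
```

Tuning `step` toward an acceptance rate of roughly 0.2 to 0.4 is the usual rule of thumb for random-walk proposals.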

27 pages, 1721 KB  
Article
Handling Multi-Source Uncertainty in Accelerated Degradation Through a Wiener-Based Robust Modeling Scheme
by Hua Tu, Xiuli Wang and Yang Li
Sensors 2025, 25(21), 6654; https://doi.org/10.3390/s25216654 - 31 Oct 2025
Abstract
Uncertainty from heterogeneous degradation paths, limited experimental samples, and exogenous perturbations often complicates accelerated lifetime modeling and prediction. To confront these intertwined challenges, a Wiener-process-based robust framework is developed. The proposed approach incorporates random-effect structures to capture unit-to-unit variability, adopts interval-based inference to reflect sampling limitations, and employs a hybrid estimator, combining a Huber-type loss with a Metropolis–Hastings algorithm, to suppress the influence of external disturbances. In addition, a quantitative stress–parameter linkage is established under the accelerated-factor principle, supporting consistent transfer from accelerated testing to normal conditions. Validation on contact stress relaxation data of connectors confirms that this methodology achieves more stable parameter inference and improves the reliability of lifetime predictions.
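
For orientation, a Wiener degradation path with a unit-specific random drift can be simulated in a few lines. The linear time scale, the Gaussian random effect on the drift, and all parameter values below are illustrative assumptions, not the calibrated model from this paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_wiener_paths(n_units=5, t=np.linspace(0.0, 100.0, 201),
                          mu0=0.05, sigma_mu=0.01, sigma_b=0.2):
    """Simulate X_i(t) = mu_i * t + sigma_b * B(t), with mu_i ~ N(mu0, sigma_mu^2)
    capturing unit-to-unit variability (a random-effect structure)."""
    dt = np.diff(t, prepend=0.0)
    paths = np.empty((n_units, t.size))
    for i in range(n_units):
        mu_i = rng.normal(mu0, sigma_mu)  # unit-specific drift
        increments = mu_i * dt + sigma_b * np.sqrt(dt) * rng.standard_normal(t.size)
        paths[i] = np.cumsum(increments)  # Brownian-motion degradation path
    return paths

paths = simulate_wiener_paths()
```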

9 pages, 2943 KB  
Article
Improve Intermetal Dielectric Process for HTRB Stability in Power GaN High Electron Mobility Transistor (HEMT) by unbiased-Highly Accelerated Stress Testing (uHAST)
by Yu-Ting Chuang, Niall Tumilty and Tian-Li Wu
Micromachines 2025, 16(11), 1233; https://doi.org/10.3390/mi16111233 - 30 Oct 2025
Abstract
This study investigates a severe high-temperature reverse bias (HTRB) failure observed in GaN HEMTs, with devices failing in under 24 h. We conducted an in-depth analysis of the electrical and physical failure mechanisms, revealing that unbiased highly accelerated stress testing (uHAST) can effectively induce dielectric delamination. The electrical and physical characteristics of devices post-delamination demonstrated a strong correlation between delamination at the nitride–polyimide interface and an increase in off-state drain leakage current (IDSS). Our findings led to the removal of a suspected process step involving the use of the reactive chemical N-methyl-2-pyrrolidone (NMP) before and after polyimide deposition. This critical process change yielded a significant improvement in reliability: while the initial failure rate was 25% at 24 h, three lots of 260 parts subsequently survived 1000 h of HTRB stress with no failures. In conclusion, uHAST is a valuable reliability testing tool for assessing package and film adhesion, leveraging high pressure and moisture to quickly identify and troubleshoot delamination-related reliability issues.

30 pages, 1387 KB  
Article
Asymptotic Analysis of the Bias–Variance Trade-Off in Subsampling Metropolis–Hastings
by Shuang Liu
Mathematics 2025, 13(21), 3395; https://doi.org/10.3390/math13213395 - 24 Oct 2025
Abstract
Markov chain Monte Carlo (MCMC) methods are fundamental to Bayesian inference but are often computationally prohibitive for large datasets, as the full likelihood must be evaluated at each iteration. Subsampling-based approximate Metropolis–Hastings (MH) algorithms offer a popular alternative, trading a manageable bias for a significant reduction in per-iteration cost. While this bias–variance trade-off is empirically understood, a formal theoretical framework for its optimization has been lacking. Our work establishes such a framework by bounding the mean squared error (MSE) as a function of the subsample size m, the data size n, and the number of epochs E. This analysis reveals two optimal asymptotic scaling laws: the optimal subsample size is m = O(E^{1/2}), leading to a minimal MSE that scales as O(E^{-1/2}). Furthermore, leveraging the large-sample asymptotic properties of the posterior, we show that when augmented with a control variate, the approximate MH algorithm can be asymptotically more efficient than the standard MH method under ideal conditions. Experimentally, we first validate the two optimal asymptotic scaling laws. We then use Bayesian logistic regression and Softmax classification models to highlight a key difference in convergence behavior: the exact algorithm starts with a high MSE that gradually decreases as the number of epochs increases. In contrast, the approximate algorithm with a practical control variate maintains a consistently low MSE that is largely insensitive to the number of epochs.
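
To make the per-iteration saving concrete, the sketch below estimates the log-likelihood ratio from a size-m subsample, optionally with a linear control variate whose full-data sum is known in closed form. The function names and the first-order control-variate form are assumptions for illustration, not the paper's exact estimator.

```python
import numpy as np

def estimated_loglik_ratio(loglik_i, theta_new, theta, data, m, rng, grads_ref=None):
    """Estimate sum_i [l_i(theta_new) - l_i(theta)] from a subsample of size m.

    loglik_i(theta, x) -- per-observation log-likelihood
    grads_ref          -- optional (n, d) array of per-observation gradients at a
                          fixed reference point (e.g., the MAP), the control variate
    """
    n = len(data)
    idx = rng.choice(n, size=m, replace=False)
    diffs = np.array([loglik_i(theta_new, data[i]) - loglik_i(theta, data[i])
                      for i in idx])
    if grads_ref is None:
        return (n / m) * diffs.sum()
    # Subtract a cheap linear surrogate of each difference, then add back its
    # exact full-data sum; the estimator stays unbiased with lower variance.
    delta = np.asarray(theta_new) - np.asarray(theta)
    surrogate = grads_ref[idx] @ delta
    return (n / m) * (diffs - surrogate).sum() + grads_ref.sum(axis=0) @ delta
```

The estimate then replaces the exact log-likelihood ratio in the usual MH acceptance test, which is exactly where the bias analyzed in the paper enters.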

41 pages, 762 KB  
Article
MCMC Methods: From Theory to Distributed Hamiltonian Monte Carlo over PySpark
by Christos Karras, Leonidas Theodorakopoulos, Aristeidis Karras, George A. Krimpas, Charalampos-Panagiotis Bakalis and Alexandra Theodoropoulou
Algorithms 2025, 18(10), 661; https://doi.org/10.3390/a18100661 - 17 Oct 2025
Abstract
The Hamiltonian Monte Carlo (HMC) method is effective for Bayesian inference but suffers from synchronization overhead in distributed settings. We propose two variants: a distributed HMC (DHMC) baseline with synchronized, globally exact gradient evaluations, and a communication-avoiding leapfrog HMC (CALF-HMC) method that interleaves local surrogate micro-steps with a single global Metropolis–Hastings correction per trajectory. Implemented on Apache Spark/PySpark and evaluated on a large synthetic logistic regression problem (N = 10^7, d = 100, workers J ∈ {4, 8, 16, 32}), DHMC attained an average acceptance rate of 0.986, a mean ESS of 1200, and a wall-clock time of 64.1 s per evaluation run, yielding 18.7 ESS/s; CALF-HMC achieved an acceptance rate of 0.942, a mean ESS of 5.1, and 14.8 s, i.e., ≈0.34 ESS/s under the tested surrogate configuration. While DHMC delivered higher ESS/s due to robust mixing under conservative integration, CALF-HMC reduced the per-trajectory runtime and exhibited more favorable scaling as inter-worker latency increased. The study contributes (i) a systems-oriented communication cost model for distributed HMC, (ii) an exact, communication-avoiding leapfrog variant, and (iii) practical guidance for ESS/s-optimized tuning on clusters.
(This article belongs to the Special Issue Numerical Optimization and Algorithms: 4th Edition)
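
The leapfrog integrator at the heart of HMC (and of the CALF variant's micro-steps) is compact enough to state directly. This generic single-machine version assumes a `grad_log_post` callable and an identity mass matrix, and it omits the distributed gradient aggregation that the paper is actually about.

```python
import numpy as np

def leapfrog(theta, p, grad_log_post, eps=0.01, n_steps=20):
    """Leapfrog integration of Hamiltonian dynamics (identity mass matrix)."""
    theta, p = np.copy(theta), np.copy(p)
    p += 0.5 * eps * grad_log_post(theta)   # initial half step in momentum
    for _ in range(n_steps - 1):
        theta += eps * p                    # full step in position
        p += eps * grad_log_post(theta)     # full step in momentum
    theta += eps * p
    p += 0.5 * eps * grad_log_post(theta)   # final half step in momentum
    return theta, p  # fed into a Metropolis-Hastings accept/reject step
```

In the distributed setting, each `grad_log_post` call requires an aggregation across workers; the premise of CALF-HMC is to replace most of those calls with cheap local surrogates and apply a single exact correction per trajectory.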

26 pages, 11786 KB  
Article
Quantification of Multi-Source Road Emissions in an Urban Environment Using Inverse Methods
by Panagiotis Gkirmpas, George Tsegas, Giannis Ioannidis, Paul Tremper, Till Riedel, Eleftherios Chourdakis, Christos Vlachokostas and Nicolas Moussiopoulos
Atmosphere 2025, 16(10), 1184; https://doi.org/10.3390/atmos16101184 - 14 Oct 2025
Abstract
The spatial quantification of multiple sources within the urban environment is crucial for understanding urban air quality and implementing measures to mitigate air pollution levels. At the same time, emissions from road traffic contribute significantly to these concentrations. However, uncertainties arise when assessing the contribution of multiple sources affecting a single receptor. This study aims to evaluate an inverse dispersion modelling methodology that combines Computational Fluid Dynamics (CFD) simulations with the Metropolis–Hastings Markov Chain Monte Carlo (MCMC) algorithm to quantify multiple traffic emissions at the street scale. This approach relies solely on observational data and prior information on each source's emission rate range and is tested within the Augsburg city centre. To address the absence of extensive measurement data of a real pollutant correlated with traffic emissions, a synthetic observational dataset of a theoretical pollutant, treated as a passive scalar, was generated from the forward dispersion model, with added Gaussian noise. Furthermore, a sensitivity analysis explores the influence of sensor configuration and prior information on the accuracy of the emission estimates. The results indicate that, when the potential emission rate range is narrow, high-quality predictions can be achieved (ratio between true and estimated release rates Δq ≤ 2) even with networks using data from only 10 sensors. In contrast, expanding the allowable emission range leads to reduced accuracy (2 ≤ Δq ≤ 6), particularly in networks with fewer than 50 sensors. Further research is recommended to assess the methodology's performance using real-world measurements.
(This article belongs to the Section Air Quality)
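
Conceptually, once the forward CFD runs are done, each candidate source contributes to each sensor through a precomputed source-receptor relationship, and the inverse problem reduces to sampling emission rates under a Gaussian likelihood. The sketch below assumes a linear source-receptor matrix `A` and uniform prior bounds; both stand in for the paper's CFD coupling and are not its actual implementation.

```python
import numpy as np

def mh_emission_rates(A, y, sigma, lo, hi, n_iter=20_000, step=0.05, seed=0):
    """Random-walk MH over source emission rates q, with y ~ N(A q, sigma^2 I).

    A      -- (n_sensors, n_sources) source-receptor matrix
    lo, hi -- per-source prior bounds (arrays), a uniform prior on the rates
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    q = 0.5 * (lo + hi)  # start mid-range

    def log_post(q):
        if np.any(q < lo) or np.any(q > hi):  # outside the uniform prior support
            return -np.inf
        r = y - A @ q
        return -0.5 * np.sum(r**2) / sigma**2

    lp = log_post(q)
    out = np.empty((n_iter, q.size))
    for t in range(n_iter):
        prop = q + step * (hi - lo) * rng.standard_normal(q.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            q, lp = prop, lp_prop
        out[t] = q
    return out
```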

24 pages, 518 KB  
Article
Bayesian Inference on Stress–Strength Reliability with Geometric Distributions
by Mohammed K. Shakhatreh
Symmetry 2025, 17(10), 1723; https://doi.org/10.3390/sym17101723 - 13 Oct 2025
Abstract
This paper investigates the estimation of the stress–strength reliability parameter ρ = P(X ≤ Y), where the stress (X) and strength (Y) are independently modeled by geometric distributions. Objective Bayesian approaches are employed by developing Jeffreys, reference, and probability-matching priors for ρ, and their effects on the resulting Bayes estimates are examined. Posterior inference is carried out using the random-walk Metropolis–Hastings algorithm. The performance of the proposed Bayesian estimators is assessed through extensive Monte Carlo simulations based on average estimates, root mean squared errors, and frequentist coverage probabilities of the highest posterior density credible intervals. Furthermore, the applicability of the methodology is demonstrated using two real data sets.
(This article belongs to the Section Mathematics)
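
Under the common convention that X ~ Geometric(p1) and Y ~ Geometric(p2) independently count failures on {0, 1, 2, ...}, the reliability parameter has the closed form ρ = P(X ≤ Y) = p1 / (1 − (1 − p1)(1 − p2)); the paper's exact parameterization may differ. A quick Monte Carlo check of that identity:

```python
import numpy as np

def rho_closed_form(p1, p2):
    """P(X <= Y) for independent geometrics supported on {0, 1, 2, ...}."""
    return p1 / (1.0 - (1.0 - p1) * (1.0 - p2))

rng = np.random.default_rng(0)
p1, p2 = 0.3, 0.2
# numpy's geometric lives on {1, 2, ...}; the unit shift cancels in the comparison
x = rng.geometric(p1, size=1_000_000)
y = rng.geometric(p2, size=1_000_000)
print(rho_closed_form(p1, p2), np.mean(x <= y))  # ~0.6818 for both
```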

31 pages, 521 KB  
Article
Bayesian Analysis of Nonlinear Quantile Structural Equation Model with Possible Non-Ignorable Missingness
by Lu Zhang and Mulati Tuerde
Mathematics 2025, 13(19), 3094; https://doi.org/10.3390/math13193094 - 26 Sep 2025
Abstract
This paper develops a nonlinear quantile structural equation model via the Bayesian approach, aiming to more accurately analyze the relationships between latent variables, with special attention paid to the issue of non-ignorable missing data. The model not only incorporates quantile regression to examine the relationships between latent variables at different quantile levels but also features a specially designed mechanism for handling missing data. The non-ignorable missingness mechanism is specified through a logistic regression model, and a combined method of Gibbs sampling and Metropolis–Hastings sampling is adopted for missing-value imputation, while simultaneously estimating the unknown parameters, latent variables, and parameters of the missing-data model. To verify the effectiveness of the proposed method, simulation studies are conducted under different sample sizes and missing rates. The results indicate that the developed method performs well in handling complex data structures and missing data. Furthermore, this paper demonstrates the practical application value of the nonlinear quantile structural equation model through a case study on the growth of listed companies, providing researchers in related fields with a new analytical tool.
(This article belongs to the Special Issue Research on Dynamical Systems and Differential Equations, 2nd Edition)
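
The sampler described here, Gibbs sweeps with a Metropolis–Hastings sub-step for the non-conjugate missingness block, follows the standard Metropolis-within-Gibbs pattern. The block names and update callables below are placeholders for the paper's full conditionals, so this is a structural skeleton only.

```python
import numpy as np

def metropolis_within_gibbs(init, draw_latent, draw_params, log_cond_missing,
                            n_iter=5_000, step=0.1, seed=0):
    """Gibbs sampler skeleton with one MH sub-step.

    draw_latent, draw_params -- callables sampling tractable full conditionals
    log_cond_missing         -- log full conditional of the missingness block phi
    """
    rng = np.random.default_rng(seed)
    state = dict(init)
    chain = []
    for _ in range(n_iter):
        state["latent"] = draw_latent(state, rng)   # Gibbs block
        state["params"] = draw_params(state, rng)   # Gibbs block
        phi = np.atleast_1d(state["phi"])           # MH sub-step for missingness
        prop = phi + step * rng.standard_normal(phi.size)
        if np.log(rng.uniform()) < (log_cond_missing(prop, state)
                                    - log_cond_missing(phi, state)):
            state["phi"] = prop
        chain.append({k: np.copy(v) for k, v in state.items()})
    return chain
```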

14 pages, 1009 KB  
Article
A Bayesian ARMA Probability Density Estimator
by Jeffrey D. Hart
Entropy 2025, 27(10), 1001; https://doi.org/10.3390/e27101001 - 26 Sep 2025
Abstract
A Bayesian approach for constructing ARMA probability density estimators is proposed. Such estimators are ratios of trigonometric polynomials and have a number of advantages over Fourier series estimators, including parsimony and greater efficiency under common conditions. The Bayesian approach is carried out via MCMC, the output of which can be used to obtain probability intervals for unknown parameters and the underlying density. Finite sample efficiency and methods for choosing the estimator's smoothing parameter are considered in a simulation study, and the ideas are illustrated with data on a wine attribute.
(This article belongs to the Section Signal and Data Analysis)

10 pages, 790 KB  
Proceeding Paper
A Comparison of MCMC Algorithms for an Inverse Squeeze Flow Problem
by Aricia Rinkens, Rodrigo L. S. Silva, Clemens V. Verhoosel, Nick O. Jaensson and Erik Quaeghebeur
Phys. Sci. Forum 2025, 12(1), 4; https://doi.org/10.3390/psf2025012004 - 22 Sep 2025
Abstract
Using Bayesian inference to calibrate constitutive model parameters has recently seen a rise in interest. The Markov chain Monte Carlo (MCMC) algorithm is one of the most commonly used methods to sample from the posterior. However, the choice of which MCMC algorithm to apply is typically pragmatic and based on considerations such as software availability and experience. We compare three commonly used MCMC algorithms: Metropolis–Hastings (MH), the Affine Invariant Stretch Move (AISM), and the No-U-Turn sampler (NUTS). For the comparison, we use the Kullback–Leibler (KL) divergence as a convergence criterion, which measures the statistical distance between the sampled and the ‘true’ posterior. We apply the Bayesian framework to a Newtonian squeeze flow problem, for which an analytical model exists. Furthermore, we have collected experimental data using a tailored setup. The ground truth for the posterior is obtained by evaluating it on a uniform reference grid. We conclude that, for the same number of samples, NUTS results in the lowest KL divergence, followed by the AISM sampler and, last, the MH sampler.
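
The convergence criterion, a KL divergence between the sampled and grid-evaluated posteriors, can be estimated by histogramming the chain on the reference grid. This 1-D sketch assumes `p_ref` holds the normalized reference probability mass per bin; it is a simplification of the multivariate setting in the paper.

```python
import numpy as np

def kl_from_samples(samples, grid_edges, p_ref, eps=1e-12):
    """KL(p_hat || p_ref), where p_hat is a histogram of MCMC samples.

    grid_edges -- bin edges of the reference grid (len(p_ref) + 1 values)
    p_ref      -- reference probability mass per bin, summing to 1
    """
    counts, _ = np.histogram(samples, bins=grid_edges)
    p_hat = counts / counts.sum()
    mask = p_hat > 0  # bins the chain never visited contribute zero
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / (p_ref[mask] + eps))))
```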

31 pages, 12350 KB  
Article
Statistical Evaluation of Beta-Binomial Probability Law for Removal in Progressive First-Failure Censoring and Its Applications to Three Cancer Cases
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Mathematics 2025, 13(18), 3028; https://doi.org/10.3390/math13183028 - 19 Sep 2025
Abstract
Progressive first-failure censoring is a flexible and cost-efficient strategy that captures real-world testing scenarios where only the first failure is observed at each stage while randomly removing remaining units, making it ideal for biomedical and reliability studies. By applying the α-power transformation to the exponential baseline, the proposed model introduces an additional flexibility parameter that enriches the family of lifetime distributions, enabling it to better capture varying failure rates and diverse hazard rate behaviors commonly observed in biomedical data, thus extending the classical exponential model. This study develops a novel computational framework for analyzing an α-powered exponential model under beta-binomial random removals within the proposed censoring test. To address the inherent complexity of the likelihood function arising from simultaneous random removals and progressive censoring, we derive closed-form expressions for the likelihood, survival, and hazard functions and propose efficient estimation strategies based on both maximum likelihood and Bayesian inference. For the Bayesian approach, gamma and beta priors are adopted, and a tailored Metropolis–Hastings algorithm is implemented to approximate posterior distributions under symmetric and asymmetric loss functions. To evaluate the empirical performance of the proposed estimators, extensive Monte Carlo simulations are conducted, examining bias, mean squared error, and credible interval coverage under varying censoring levels and removal probabilities. Furthermore, the practical utility of the model is illustrated through three oncological datasets, including multiple myeloma, lung cancer, and breast cancer patients, demonstrating superior goodness of fit and predictive reliability compared to traditional models. The results show that the proposed lifespan model, under the beta-binomial probability law and within the examined censoring mechanism, offers a flexible and computationally tractable framework for reliability and biomedical survival analysis, providing new insights into censored data structures with random withdrawals.
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
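
The removal mechanism itself is simple to state: after each observed first failure, the number of additionally withdrawn groups is a beta-binomial draw over the groups still on test, i.e., a binomial draw whose success probability is itself Beta(a, b)-distributed. The sketch below generates such removal counts; the shape parameters and group counts are illustrative, not taken from the paper.

```python
import numpy as np

def beta_binomial_removals(n_groups, n_failures, a=2.0, b=3.0, seed=0):
    """Draw removal counts R_j under beta-binomial progressive removals."""
    rng = np.random.default_rng(seed)
    remaining = n_groups
    removals = []
    for j in range(n_failures):
        remaining -= 1                  # the group containing the j-th failure
        if j == n_failures - 1:
            r = remaining               # final stage removes everything left
        else:
            p = rng.beta(a, b)          # random removal probability
            r = rng.binomial(remaining, p)
        removals.append(r)
        remaining -= r
    return removals

print(beta_binomial_removals(n_groups=30, n_failures=8))
```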

29 pages, 19296 KB  
Article
Inference for the Chris–Jerry Lifetime Distribution Under Improved Adaptive Progressive Type-II Censoring for Physics and Engineering Data Modelling
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(9), 702; https://doi.org/10.3390/axioms14090702 - 17 Sep 2025
Abstract
This paper presents a comprehensive reliability analysis framework for the Chris–Jerry (CJ) lifetime distribution under an improved adaptive progressive Type-II censoring plan. The CJ model, recently introduced to capture skewed lifetime behaviors, is studied under a modified censoring structure designed to provide greater flexibility in terminating life-testing experiments. We derive maximum likelihood estimators for the CJ parameters and key reliability measures, including the reliability and hazard rate functions, and construct approximate confidence intervals using the observed Fisher information matrix and the delta method. To address the intractability of the likelihood function, Bayesian estimators are obtained under independent gamma priors and a squared-error loss function. Because the posterior distributions are not available in closed form, we apply the Metropolis–Hastings algorithm to generate Bayesian estimates and two types of credible intervals. A comprehensive simulation study evaluates the performance of the proposed estimation techniques under various censoring scenarios. The framework is further validated through two real-world datasets: one involving rainfall measurements and another concerning mechanical failure times. In both cases, the CJ model combined with the proposed censoring strategy demonstrates superior fit and reliability inference compared to competing models. These findings highlight the value of the CJ distribution, together with advanced censoring methods, for modeling lifetime data in physics and engineering applications.

34 pages, 31211 KB  
Article
Statistical Evaluation of Alpha-Powering Exponential Generalized Progressive Hybrid Censoring and Its Modeling for Medical and Engineering Sciences with Optimization Plans
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Symmetry 2025, 17(9), 1473; https://doi.org/10.3390/sym17091473 - 6 Sep 2025
Abstract
This study explores advanced methods for analyzing the two-parameter alpha-power exponential (APE) distribution using data from a novel generalized progressive hybrid censoring scheme. The APE model is inherently asymmetric, exhibiting positive skewness across all valid parameter values due to its right-skewed exponential base, with the alpha-power transformation amplifying or dampening this skewness depending on the power parameter. The proposed censoring design offers new insights into modeling lifetime data that exhibit non-monotonic hazard behaviors. It enhances testing efficiency by simultaneously imposing fixed-time constraints and ensuring a minimum number of failures, thereby improving inference quality over traditional censoring methods. We derive maximum likelihood and Bayesian estimates for the APE distribution parameters and key reliability measures, such as the reliability and hazard rate functions. Bayesian analysis is performed using independent gamma priors under a symmetric squared error loss, implemented via the Metropolis–Hastings algorithm. Interval estimation is addressed using two normality-based asymptotic confidence intervals and two credible intervals obtained through a simulated Markov Chain Monte Carlo procedure. Monte Carlo simulations across various censoring scenarios demonstrate the stable and superior precision of the proposed methods. Optimal censoring patterns are identified based on the observed Fisher information and its inverse. Two real-world case studies—breast cancer remission times and global oil reserve data—illustrate the practical utility of the APE model within the proposed censoring framework. These applications underscore the model’s capability to effectively analyze diverse reliability phenomena, bridging theoretical innovation with empirical relevance in lifetime data analysis.
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
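
The α-power transformation that generates the APE family has a simple closed form: for a baseline CDF F, F_APE(x) = (α^{F(x)} − 1)/(α − 1) for α > 0, α ≠ 1. The sketch below applies it to an exponential baseline and draws samples by inverse transform; the parameter values are arbitrary.

```python
import numpy as np

def ape_cdf(x, alpha, lam):
    """Alpha-power exponential CDF: (alpha**F(x) - 1)/(alpha - 1), F exponential."""
    F = 1.0 - np.exp(-lam * np.asarray(x, float))
    return (alpha**F - 1.0) / (alpha - 1.0)

def ape_ppf(u, alpha, lam):
    """Inverse CDF (quantile function), enabling inverse-transform sampling."""
    F = np.log1p(u * (alpha - 1.0)) / np.log(alpha)
    return -np.log1p(-F) / lam

rng = np.random.default_rng(0)
alpha, lam = 5.0, 0.5
x = ape_ppf(rng.uniform(size=100_000), alpha, lam)  # APE(alpha, lam) draws
assert abs(ape_cdf(np.median(x), alpha, lam) - 0.5) < 0.01  # sanity check
```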

54 pages, 22294 KB  
Article
Research on Risk Evolution Probability of Urban Lifeline Natech Events Based on MdC-MCMC
by Shifeng Li and Yu Shang
Sustainability 2025, 17(17), 7664; https://doi.org/10.3390/su17177664 - 25 Aug 2025
Abstract
Urban lifeline Natech events are coupled systems composed of multiple risks and entities with complex dynamic transmission chains. Predicting risk evolution probabilities is the core task in the safety management of urban lifeline Natech events. First, the risk evolution mechanism is analyzed: urban lifeline Natech events exhibit spatial evolution characteristics, so the parallel and synergistic effects of risk evolution are dissected in the spatial dimension. Next, based on fitting marginal probability distribution functions for natural hazard and urban lifeline risk evolution, a Multi-dimensional Copula (MdC) function for the joint probability distribution of urban lifeline Natech event risk evolution is constructed. Building upon the MdC function, a Markov Chain Monte Carlo (MCMC) model for predicting risk evolution probabilities of urban lifeline Natech events is developed using the Metropolis–Hastings (M-H) algorithm and Gibbs sampling. Finally, taking the 2021 Zhengzhou ‘7·20’ catastrophic rainstorm as a case study, joint probability distribution functions for risk evolution under rainfall-wind speed scenarios are fitted for the traffic, electric power, communication, water supply, and drainage systems (including different risk transmission chains). Numerical simulations of the joint probability distributions for risk evolution are conducted, and visualizations of the joint probability predictions are generated.

33 pages, 6324 KB  
Article
The Inverted Hjorth Distribution and Its Applications in Environmental and Pharmaceutical Sciences
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Symmetry 2025, 17(8), 1327; https://doi.org/10.3390/sym17081327 - 14 Aug 2025
Cited by 1
Abstract
This study introduces an inverted version of the three-parameter Hjorth lifespan model, characterized by one scale parameter and two shape parameters, referred to as the inverted Hjorth (IH) distribution. This asymmetric distribution can fit various positively skewed datasets more accurately than several existing models in the literature, as it can accommodate data exhibiting an inverted (upside-down) bathtub-shaped hazard rate. We derive key properties of the model, including quantiles, moments, reliability measures, stress–strength reliability, and order statistics. Point estimation of the IH model parameters is performed using maximum likelihood and Bayesian approaches. Moreover, for interval estimation, two types of asymptotic confidence intervals and two types of Bayesian credible intervals are obtained using the same estimation methodologies. As an extension to a complete sampling plan, Type-II censoring is employed to examine the impact of data incompleteness on IH parameter estimation. Monte Carlo simulation results indicate that Bayesian point and credible estimates outperform those obtained via classical estimation methods across several precision metrics, including mean squared error, average absolute bias, average interval length, and coverage probability. To further assess its performance, two real datasets are analyzed: one from the environmental domain (minimum monthly water flows of the Piracicaba River) and another from the pharmacological domain (plasma indomethacin concentrations). The superiority and flexibility of the inverted Hjorth model are evaluated and compared with several competing models. The results confirm that the IH distribution provides a better fit than several existing lifetime models—such as the inverted Gompertz, inverted log-logistic, inverted Lomax, and inverted Nadarajah–Haghighi distributions—making it a valuable tool for reliability and survival data analysis.
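
The inversion step is generic: if T has density f_T and X = 1/T, then f_X(x) = f_T(1/x)/x^2 for x > 0. The sketch below applies this change of variables to a placeholder baseline; `base_pdf` stands in for the Hjorth density, which is not reproduced here.

```python
import numpy as np

def inverted_pdf(x, base_pdf):
    """Density of X = 1/T by change of variables: f_X(x) = f_T(1/x) / x**2."""
    x = np.asarray(x, dtype=float)
    return base_pdf(1.0 / x) / x**2

# Example with an exponential(1) baseline: the inverted density integrates to ~1
grid = np.linspace(1e-3, 1e3, 400_000)
vals = inverted_pdf(grid, lambda t: np.exp(-t))
print(np.trapz(vals, grid))  # ~0.999, approaching 1 as the grid widens
```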
