Search Results (284)

Search Parameters:
Keywords = Markov Chain Monte Carlo simulations

26 pages, 657 KiB  
Article
Bayesian Inference for Copula-Linked Bivariate Generalized Exponential Distributions: A Comparative Approach
by Carlos A. dos Santos, Saralees Nadarajah, Fernando A. Moala, Hassan S. Bakouch and Shuhrah Alghamdi
Axioms 2025, 14(8), 574; https://doi.org/10.3390/axioms14080574 - 25 Jul 2025
Viewed by 162
Abstract
This paper addresses the limitations of existing bivariate generalized exponential (GE) distributions for modeling lifetime data, which often exhibit rigid dependence structures or non-GE marginals. To overcome these limitations, we introduce four new bivariate GE distributions based on the Farlie–Gumbel–Morgenstern, Gumbel–Barnett, Clayton, and Frank copulas, which allow for more flexible modeling of various dependence structures. We employ a Bayesian framework with Markov Chain Monte Carlo (MCMC) methods for parameter estimation. A simulation study is conducted to evaluate the performance of the proposed models, which are then applied to a real-world dataset of electrical treeing failures. The results from the data application demonstrate that the copula-based models, particularly the one derived from the Frank copula, provide a superior fit compared to existing bivariate GE models. This work provides a flexible and robust framework for modeling dependent lifetime data.
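
As a concrete illustration of the copula construction described in this abstract, here is a minimal Python sketch that draws dependent pairs from a Frank copula by conditional inversion and maps them through generalized exponential (GE) marginals; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def ge_quantile(u, alpha, lam):
    """Inverse CDF of the generalized exponential distribution,
    F(x) = (1 - exp(-lam * x)) ** alpha."""
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def frank_bivariate(n, theta, rng):
    """Draw n pairs (u, v) from a Frank copula via conditional inversion."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    num = w * (np.exp(-theta) - 1.0)
    den = w + (1.0 - w) * np.exp(-theta * u)
    v = -np.log1p(num / den) / theta
    return u, v

rng = np.random.default_rng(1)
u, v = frank_bivariate(10_000, theta=5.0, rng=rng)
x = ge_quantile(u, alpha=2.0, lam=1.0)   # GE marginal for component 1
y = ge_quantile(v, alpha=1.5, lam=0.5)   # GE marginal for component 2
print(np.corrcoef(x, y)[0, 1])           # positive dependence for theta > 0
```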

16 pages, 666 KiB  
Article
Bayesian Analysis of the Maxwell Distribution Under Progressively Type-II Random Censoring
by Rajni Goel, Mahmoud M. Abdelwahab and Mustafa M. Hasaballah
Axioms 2025, 14(8), 573; https://doi.org/10.3390/axioms14080573 - 25 Jul 2025
Viewed by 166
Abstract
Accurate modeling of product lifetimes is vital in reliability analysis and engineering to ensure quality and maintain competitiveness. This paper proposes the progressively randomly censored Maxwell distribution, which incorporates both progressive Type-II and random censoring within the Maxwell distribution framework. The model allows for the planned removal of surviving units at specific stages of an experiment, accounting for both deliberate and random censoring events. It is assumed that survival and censoring times each follow a Maxwell distribution, though with distinct parameters. Both frequentist and Bayesian approaches are employed to estimate the model parameters. In the frequentist approach, maximum likelihood estimators and their corresponding confidence intervals are derived. In the Bayesian approach, Bayes estimators are obtained using an inverse gamma prior and evaluated through a Markov Chain Monte Carlo (MCMC) method under the squared error loss function (SELF). A Monte Carlo simulation study evaluates the performance of the proposed estimators. The practical relevance of the methodology is demonstrated using a real data set.
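
The Bayesian machinery named here can be sketched compactly. Below is a toy random-walk Metropolis sampler for the Maxwell scale parameter under an inverse gamma prior, reporting the posterior mean (the Bayes estimate under SELF); it uses a complete synthetic sample and ignores the paper's progressive random censoring for brevity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = stats.maxwell.rvs(scale=2.0, size=50, random_state=rng)  # synthetic lifetimes

def log_posterior(a, x, prior_shape=2.0, prior_scale=3.0):
    if a <= 0:
        return -np.inf
    loglik = stats.maxwell.logpdf(x, scale=a).sum()
    logprior = stats.invgamma.logpdf(a, prior_shape, scale=prior_scale)
    return loglik + logprior

draws, a = [], 1.0
for _ in range(20_000):
    prop = a + 0.2 * rng.standard_normal()          # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop, data) - log_posterior(a, data):
        a = prop
    draws.append(a)

post = np.array(draws[5_000:])                      # discard burn-in
print("SELF (posterior mean) estimate:", post.mean())
```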

35 pages, 11039 KiB  
Article
Optimum Progressive Data Analysis and Bayesian Inference for Unified Progressive Hybrid INH Censoring with Applications to Diamonds and Gold
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(8), 559; https://doi.org/10.3390/axioms14080559 - 23 Jul 2025
Viewed by 165
Abstract
A novel unified progressive hybrid censoring is introduced to combine both progressive and hybrid censoring plans to allow flexible test termination either after a prespecified number of failures or at a fixed time. This work develops both frequentist and Bayesian inferential procedures for estimating the parameters, reliability, and hazard rates of the inverted Nadarajah–Haghighi lifespan model when a sample is produced from such a censoring plan. Maximum likelihood estimators are obtained through the Newton–Raphson iterative technique. The delta method, based on the Fisher information matrix, is utilized to build the asymptotic confidence intervals for each unknown quantity. In the Bayesian methodology, Markov chain Monte Carlo techniques with independent gamma priors are implemented to generate posterior summaries and credible intervals, addressing computational intractability through the Metropolis–Hastings algorithm. Extensive Monte Carlo simulations compare the efficiency and utility of frequentist and Bayesian estimates across multiple censoring designs, highlighting the superiority of Bayesian inference using informative prior information. Two real-world applications utilizing rare minerals from gold and diamond durability studies are examined to demonstrate the adaptability of the proposed estimators to the analysis of rare events in precious materials science. Applying four different optimality criteria to multiple competing plans, we analyze which progressive censoring strategies perform best. The proposed censoring framework is effectively applied to real-world datasets involving diamonds and gold, demonstrating its practical utility in modeling the reliability and failure behavior of rare and high-value minerals.
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)
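
The delta-method interval construction mentioned in this abstract is generic: given the MLE and the inverse Fisher information, the variance of a smooth function such as the reliability R(t) is approximated by a gradient-sandwich formula. Below is a sketch with a hypothetical Weibull-type survival function standing in for the inverted Nadarajah–Haghighi model; all parameter values are illustrative.

```python
import numpy as np

def delta_ci(reliability, theta_hat, cov, t, eps=1e-5):
    """Asymptotic 95% CI for R(t) = reliability(t, theta) via the delta method.
    `cov` is the inverse Fisher (observed) information at the MLE."""
    theta_hat = np.asarray(theta_hat, float)
    grad = np.zeros_like(theta_hat)
    for j in range(theta_hat.size):            # numerical gradient of R(t; .)
        step = np.zeros_like(theta_hat)
        step[j] = eps
        grad[j] = (reliability(t, theta_hat + step)
                   - reliability(t, theta_hat - step)) / (2 * eps)
    se = np.sqrt(grad @ cov @ grad)
    z = 1.959963984540054                      # 97.5% standard-normal quantile
    r = reliability(t, theta_hat)
    return r - z * se, r + z * se

# Hypothetical lifetime model: Weibull-type survival, theta = (shape, scale).
R = lambda t, th: np.exp(-(t / th[1]) ** th[0])
print(delta_ci(R, theta_hat=[1.4, 2.0], cov=np.diag([0.01, 0.02]), t=1.0))
```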

25 pages, 10024 KiB  
Article
Forecasting with a Bivariate Hysteretic Time Series Model Incorporating Asymmetric Volatility and Dynamic Correlations
by Hong Thi Than
Entropy 2025, 27(7), 771; https://doi.org/10.3390/e27070771 - 21 Jul 2025
Viewed by 226
Abstract
This study explores asymmetric volatility structures within multivariate hysteretic autoregressive (MHAR) models that incorporate conditional correlations, aiming to flexibly capture the dynamic behavior of global financial assets. The proposed framework integrates regime switching and time-varying delays governed by a hysteresis variable, enabling the model to account for both asymmetric volatility and evolving correlation patterns over time. We adopt a fully Bayesian inference approach using adaptive Markov chain Monte Carlo (MCMC) techniques, allowing for the joint estimation of model parameters, Value-at-Risk (VaR), and Marginal Expected Shortfall (MES). The accuracy of VaR forecasts is assessed through two standard backtesting procedures. Our empirical analysis involves both simulated data and real-world financial datasets to evaluate the model’s effectiveness in capturing downside risk dynamics. We demonstrate the application of the proposed method on three pairs of daily log returns involving the S&P500, Bank of America (BAC), Intercontinental Exchange (ICE), and Goldman Sachs (GS), present the results obtained, and compare them against the original model framework.
(This article belongs to the Section Information Theory, Probability and Statistics)
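
The abstract does not name the two backtesting procedures; Kupiec's proportion-of-failures test is one common choice for assessing VaR coverage and is sketched below (it assumes the violation count is strictly between 0 and n).

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, n, p):
    """Kupiec proportion-of-failures LR test for VaR coverage.
    violations: days the loss exceeded VaR; p: target level (e.g. 0.01).
    Requires 0 < violations < n."""
    x = violations
    pi_hat = x / n
    # Binomial log-likelihoods under H0 (coverage p) and the observed rate.
    ll0 = (n - x) * np.log(1 - p) + x * np.log(p)
    ll1 = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
    lr = -2 * (ll0 - ll1)
    return lr, chi2.sf(lr, df=1)   # small p-value => reject correct coverage

print(kupiec_pof(violations=18, n=1000, p=0.01))
```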

23 pages, 752 KiB  
Article
On Joint Progressively Censored Gumbel Type-II Distributions: (Non-) Bayesian Estimation with an Application to Physical Data
by Mustafa M. Hasaballah, Mahmoud E. Bakr, Oluwafemi Samson Balogun and Arwa M. Alshangiti
Axioms 2025, 14(7), 544; https://doi.org/10.3390/axioms14070544 - 20 Jul 2025
Viewed by 204
Abstract
This paper presents a comprehensive statistical analysis of the Gumbel Type-II distribution based on joint progressive Type-II censoring. It derives the maximum likelihood estimators for the distribution parameters and constructs their asymptotic confidence intervals. It investigates Bayesian estimation using non-informative and informative priors under the squared error loss function and the LINEX loss function, applying Markov Chain Monte Carlo methods. A detailed simulation study evaluates the estimators’ performance in terms of average estimates, mean squared errors, and average confidence interval lengths. Results show that Bayesian estimators can outperform maximum likelihood estimators, especially with informative priors. A real data example demonstrates the practical use of the proposed methods. The analysis confirms that the Gumbel Type-II distribution with joint progressive censoring provides a flexible and effective model for lifetime data, enabling more accurate reliability assessment and risk analysis in engineering and survival studies.
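
Once an MCMC chain for a parameter is available, the two loss functions named here yield closed-form point estimates from the draws: the posterior mean under squared error loss, and -(1/c) log E[exp(-c theta)] under LINEX loss with asymmetry parameter c. A small sketch with a stand-in chain:

```python
import numpy as np

def bayes_estimates(draws, c=1.0):
    """Point estimates from MCMC draws of a scalar parameter.
    Squared error loss -> posterior mean.
    LINEX loss         -> -(1/c) * log E[exp(-c * theta)], c != 0."""
    draws = np.asarray(draws, float)
    self_est = draws.mean()
    # Log-sum-exp trick for numerical stability.
    m = (-c * draws).max()
    linex_est = -(m + np.log(np.exp(-c * draws - m).mean())) / c
    return self_est, linex_est

posterior_draws = np.random.default_rng(0).gamma(4.0, 0.5, size=10_000)  # stand-in chain
print(bayes_estimates(posterior_draws, c=2.0))
```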

17 pages, 572 KiB  
Article
Statistical Analysis Under a Random Censoring Scheme with Applications
by Mustafa M. Hasaballah and Mahmoud M. Abdelwahab
Symmetry 2025, 17(7), 1048; https://doi.org/10.3390/sym17071048 - 3 Jul 2025
Cited by 1 | Viewed by 257
Abstract
The Gumbel Type-II distribution is a widely recognized and frequently utilized lifetime distribution, playing a crucial role in reliability engineering. This paper focuses on the statistical inference of the Gumbel Type-II distribution under a random censoring scheme. From a frequentist perspective, point estimates for the unknown parameters are derived using the maximum likelihood estimation method, and confidence intervals are constructed based on the Fisher information matrix. From a Bayesian perspective, Bayes estimates of the parameters are obtained using the Markov Chain Monte Carlo method, and the average lengths of credible intervals are calculated. The Bayesian inference is performed under both the squared error loss function and the general entropy loss function. Additionally, a numerical simulation is conducted to evaluate the performance of the proposed methods. To demonstrate their practical applicability, a real-world example is provided, illustrating the application and development of these inference techniques. In conclusion, the Bayesian method appears to outperform other approaches, although each method offers unique advantages.
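
For intuition, a likelihood under random censoring combines density terms for observed failures with survival terms for censored units. Below is a sketch for one common Gumbel Type-II parameterization, F(x) = exp(-alpha * x^(-beta)); under non-informative censoring, the censoring-time factors drop out of inference on (alpha, beta), so they are omitted here.

```python
import numpy as np

def gumbel2_logpdf(x, alpha, beta):
    # One common Gumbel Type-II form: F(x) = exp(-alpha * x**(-beta)), x > 0.
    return np.log(alpha * beta) - (beta + 1) * np.log(x) - alpha * x ** (-beta)

def gumbel2_logsf(x, alpha, beta):
    # log S(x) = log(1 - F(x)).
    return np.log1p(-np.exp(-alpha * x ** (-beta)))

def censored_loglik(t, delta, alpha, beta):
    """t: observed times min(X, C); delta: 1 = failure observed, 0 = censored."""
    d = delta.astype(bool)
    return (gumbel2_logpdf(t[d], alpha, beta).sum()
            + gumbel2_logsf(t[~d], alpha, beta).sum())

t = np.array([0.8, 1.2, 2.5, 3.1])
delta = np.array([1, 0, 1, 0])
print(censored_loglik(t, delta, alpha=1.5, beta=1.2))
```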

30 pages, 16041 KiB  
Article
Estimation of Inverted Weibull Competing Risks Model Using Improved Adaptive Progressive Type-II Censoring Plan with Application to Radiobiology Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Symmetry 2025, 17(7), 1044; https://doi.org/10.3390/sym17071044 - 2 Jul 2025
Viewed by 333
Abstract
This study focuses on estimating the unknown parameters and the reliability function of the inverted Weibull distribution, using an improved adaptive progressive Type-II censoring scheme under a competing risks model. Both classical and Bayesian estimation approaches are explored to offer a thorough analysis. Under the classical approach, maximum likelihood estimators are obtained for the unknown parameters and the reliability function. Approximate confidence intervals are also constructed to assess the uncertainty in the estimates. From a Bayesian standpoint, symmetric Bayes estimates and highest posterior density credible intervals are computed using Markov Chain Monte Carlo sampling, assuming a symmetric squared error loss function. An extensive simulation study is carried out to assess how well the proposed methods perform under different experimental conditions, showing promising accuracy. To demonstrate the practical use of these methods, a real dataset is analyzed, consisting of the survival times of male mice aged 35 to 42 days after being exposed to 300 roentgens of X-ray radiation. The analysis demonstrated that the inverted Weibull distribution is well-suited for modeling the given dataset. Furthermore, the Bayesian estimation method, considering both point estimates and interval estimates, was found to be more effective than the classical approach in estimating the model parameters as well as the reliability function.
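
The highest posterior density (HPD) intervals reported here can be approximated from MCMC output as the shortest window covering the target probability mass, which is valid for unimodal posteriors. A minimal sketch:

```python
import numpy as np

def hpd_interval(draws, mass=0.95):
    """Shortest interval containing `mass` of the posterior draws
    (valid for unimodal posteriors)."""
    d = np.sort(np.asarray(draws))
    n = len(d)
    k = int(np.ceil(mass * n))                 # window size in draws
    widths = d[k - 1:] - d[: n - k + 1]        # width of every k-draw window
    i = np.argmin(widths)
    return d[i], d[i + k - 1]

draws = np.random.default_rng(3).gamma(3.0, 1.0, size=20_000)  # stand-in chain
print(hpd_interval(draws))
```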

24 pages, 2253 KiB  
Article
Modeling Spatial Data with Heteroscedasticity Using PLVCSAR Model: A Bayesian Quantile Regression Approach
by Rongshang Chen and Zhiyong Chen
Entropy 2025, 27(7), 715; https://doi.org/10.3390/e27070715 - 1 Jul 2025
Viewed by 317
Abstract
Spatial data not only enables smart cities to visualize, analyze, and interpret data related to location and space, but also helps departments make more informed decisions. We apply Bayesian quantile regression (BQR) to the partially linear varying coefficient spatial autoregressive (PLVCSAR) model for spatial data to improve predictive performance. The model captures both linear and nonlinear covariate effects at different quantiles. Approximating the nonparametric functions with free-knot splines, we develop a Bayesian sampling approach implemented via Markov chain Monte Carlo (MCMC) and design an efficient Metropolis–Hastings-within-Gibbs algorithm to explore the joint posterior distribution. Computational efficiency is achieved through a modified reversible-jump MCMC algorithm incorporating adaptive movement steps to accelerate chain convergence. The simulation results demonstrate that our estimator is robust to alternative spatial weight matrices and outperforms both quantile regression (QR) and instrumental variable quantile regression (IVQR) in finite samples at different quantiles. The effectiveness of the proposed model and estimation method is demonstrated using real data on Boston median house prices.
(This article belongs to the Special Issue Bayesian Hierarchical Models with Applications)
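
Bayesian quantile regression typically replaces the Gaussian likelihood with an asymmetric Laplace working likelihood, whose kernel is the quantile check function rho_tau(u) = u * (tau - 1{u < 0}). A compact sketch of that ingredient follows; the paper's full spline-based PLVCSAR machinery is far larger than a few lines.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile check function rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def al_loglik(y, yhat, tau, sigma=1.0):
    """Asymmetric-Laplace working log-likelihood commonly used in
    Bayesian quantile regression."""
    n = len(y)
    const = n * np.log(tau * (1 - tau) / sigma)
    return const - check_loss(y - yhat, tau).sum() / sigma

y = np.array([1.0, 2.0, 0.5])
yhat = np.array([0.8, 2.2, 0.7])
print(al_loglik(y, yhat, tau=0.5))
```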

22 pages, 327 KiB  
Article
Bayesian Analysis of the Doubly Truncated Zubair-Weibull Distribution: Parameter Estimation, Reliability, Hazard Rate and Prediction
by Zakiah I. Kalantan, Mai A. Hegazy, Abeer A. EL-Helbawy, Hebatalla H. Mohammad, Doaa S. A. Soliman, Gannat R. AL-Dayian and Mervat K. Abd Elaal
Axioms 2025, 14(7), 502; https://doi.org/10.3390/axioms14070502 - 26 Jun 2025
Viewed by 241
Abstract
This paper discusses the Bayesian estimation of the unknown parameters, reliability and hazard rate functions of the doubly truncated Zubair-Weibull distribution. Informative gamma priors for the parameters are used to obtain the posterior distributions. Under the squared-error and linear-exponential loss functions, the Bayes estimators are derived. Credible intervals for the parameters, reliability and hazard rate functions are obtained. Bayesian prediction (point and interval) for a future observation is considered under the two-sample prediction scheme. A simulation study is performed using a Markov Chain Monte Carlo algorithm for different sample sizes to assess the performance of the estimators. Two real datasets are applied to show the flexibility and applicability of the distribution.
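
Double truncation itself is mechanically simple: the truncated density is f(x) / (F(b) - F(a)) on [a, b], so one can sample by inverting the CDF on [F(a), F(b)]. A sketch using a plain Weibull as a placeholder, since the Zubair-Weibull distribution is not available in scipy:

```python
import numpy as np
from scipy import stats

def doubly_truncated_rvs(dist, a, b, size, rng):
    """Sample from `dist` truncated to [a, b] by inverting the CDF on
    [F(a), F(b)]. Works for any scipy.stats frozen distribution."""
    fa, fb = dist.cdf(a), dist.cdf(b)
    u = rng.uniform(fa, fb, size=size)
    return dist.ppf(u)

rng = np.random.default_rng(11)
base = stats.weibull_min(c=1.8, scale=2.0)   # placeholder for the Zubair-Weibull
x = doubly_truncated_rvs(base, a=0.5, b=4.0, size=10_000, rng=rng)
print(x.min(), x.max())                      # all draws fall inside [0.5, 4.0]
```
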
29 pages, 6050 KiB  
Article
Multidimensional Comprehensive Evaluation Method for Sonar Detection Efficiency Based on Dynamic Spatiotemporal Interactions
by Shizhe Wang, Weiyi Chen, Zongji Li, Xu Chen and Yanbing Su
J. Mar. Sci. Eng. 2025, 13(7), 1206; https://doi.org/10.3390/jmse13071206 - 21 Jun 2025
Viewed by 239
Abstract
The detection efficiency evaluation of sonars is crucial for optimizing task planning and resource scheduling. The existing static evaluation methods based on single indicators face significant challenges. First, static modeling has difficulty coping with complex scenes where the relative situation changes in real time in the task process. Second, a single evaluation dimension cannot characterize the data distribution characteristics of efficiency indicators. In this paper, we propose a multidimensional detection efficiency evaluation method for sonar search paths based on dynamic spatiotemporal interactions. We develop a dynamic multidimensional evaluation framework. It consists of three parts, namely, spatiotemporal discrete modeling, situational dynamic deduction, and probability-based statistical analysis. This framework can achieve dynamic quantitative expression of the sonar detection efficiency. Specifically, by accurately characterizing the spatiotemporal interaction process between the sonars and targets, we overcome the bottleneck in entire-path detection efficiency evaluation. We introduce a Markov chain model to guide the Monte Carlo sampling; it helps to specify the uncertain situations by constructing a high-fidelity target motion trajectory database. To simulate the actual sensor working state, we add observation error to the sensor, which significantly improves the authenticity of the target’s trajectories. For each discrete time point, the minimum mean square error is used to estimate the sonar detection probability and cumulative detection probability. Based on the above models, we construct the multidimensional sonar detection efficiency evaluation indicator system by implementing a confidence analysis, effective detection rate calculation, and a data volatility quantification analysis. We conducted relevant simulation studies by setting the source level parameter of the target based on the sonar equation. In the simulation, we took two actual sonar search paths as examples, conducted an efficiency evaluation based on multidimensional evaluation indicators, and compared the evaluation results corresponding to the two paths. The simulation results show that in the passive and active working modes of sonar, for the detection probability, the box length of path 2 is reduced by 0∼0.2 and 0∼0.5, respectively, compared to path 1 during the time period from T = 11 to T = 15. For the cumulative detection probability, during the time period from T = 15 to T = 20, the box length of path 2 decreased by 0∼0.1 and 0∼0.2, respectively, compared to path 1, and the variance decreased by 0∼0.02 and 0∼0.03, respectively, compared to path 1. The numerical simulation results show that the data distribution corresponding to path 2 is more concentrated and stable, and its search ability is better than path 1, which reflects the advantages of the proposed multidimensional evaluation method.
(This article belongs to the Section Ocean Engineering)
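
Cumulative detection probability in frameworks like this one typically composes per-step detection probabilities along a trajectory as P_cum(T) = 1 - prod_{t<=T}(1 - p_t), averaged over the Monte Carlo trajectory database. A sketch with stand-in probabilities (in the paper these would come from the sonar equation and the Markov-chain trajectory model):

```python
import numpy as np

def cumulative_detection(p_steps):
    """p_steps: (n_trajectories, n_times) per-step detection probabilities.
    Per trajectory, P_cum(T) = 1 - prod_{t<=T} (1 - p_t); then average
    over the Monte Carlo trajectory database."""
    miss = np.cumprod(1.0 - p_steps, axis=1)   # probability of no detection so far
    return (1.0 - miss).mean(axis=0)

rng = np.random.default_rng(5)
p = rng.uniform(0.0, 0.3, size=(1000, 20))     # stand-in per-step probabilities
print(cumulative_detection(p)[-1])             # estimate at the final time step
```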

34 pages, 18712 KiB  
Article
Statistical Computation of Hjorth Competing Risks Using Binomial Removals in Adaptive Progressive Type II Censoring
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Mathematics 2025, 13(12), 2010; https://doi.org/10.3390/math13122010 - 18 Jun 2025
Viewed by 245
Abstract
In complex reliability applications, it is common for the failure of an individual or an item to be attributed to multiple causes known as competing risks. This paper explores the estimation of the Hjorth competing risks model based on an adaptive progressive Type II censoring scheme via a binomial removal mechanism. For parameter and reliability metric estimation, both frequentist and Bayesian methodologies are developed. Maximum likelihood estimates for the Hjorth parameters are computed numerically due to their intricate form, while the binomial removal parameter is derived explicitly. Confidence intervals are constructed using asymptotic approximations. Within the Bayesian paradigm, gamma priors are assigned to the Hjorth parameters and a beta prior for the binomial parameter, facilitating posterior analysis. Markov Chain Monte Carlo techniques yield Bayesian estimates and credible intervals for parameters and reliability measures. The performance of the proposed methods is compared using Monte Carlo simulations. Finally, to illustrate the practical applicability of the proposed methodology, two real-world competing risk data sets are analyzed: one representing the breaking strength of jute fibers and the other representing the failure modes of electrical appliances.
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)
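
The binomial removal mechanism draws each stage's removal count from the units still on test, with the final stage removing everything that remains. A minimal sketch of that standard scheme:

```python
import numpy as np

def binomial_removals(n, m, p, rng):
    """Removal counts R_1..R_m for progressive Type-II censoring with
    binomial removals: R_i ~ Bin(n - m - sum(R_1..R_{i-1}), p), and the
    final removal clears all remaining units."""
    R, left = [], n - m          # `left` = units available for removal
    for _ in range(m - 1):
        r = rng.binomial(left, p)
        R.append(r)
        left -= r
    R.append(left)               # R_m: remove everything still on test
    return R

print(binomial_removals(n=30, m=10, p=0.3, rng=np.random.default_rng(2)))
```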

28 pages, 11942 KiB  
Article
Reliability Analysis of Improved Type-II Adaptive Progressively Inverse XLindley Censored Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Axioms 2025, 14(6), 437; https://doi.org/10.3390/axioms14060437 - 2 Jun 2025
Viewed by 357
Abstract
This study offers a newly improved Type-II adaptive progressive censoring scheme with data sampled from an inverse XLindley (IXL) distribution for more efficient and adaptive reliability assessments. Through this sampling mechanism, we evaluate the parameters of the IXL distribution, as well as its reliability and hazard rate features. To handle flexible and time-constrained testing frameworks in high-reliability environments, we derive maximum likelihood estimators alongside Bayesian estimates obtained via Markov chain Monte Carlo techniques under gamma priors, which effectively capture prior knowledge. Two asymptotic interval estimates are constructed through the normal approximation of the classical estimates and of the log-transformed classical estimates. From the Markovian chains, two credible interval estimates are also constructed. A robust simulation study is carried out to compare the classical and Bayesian point estimation methods, along with the four interval estimation methods. This study’s practical usefulness is demonstrated by its analysis of a real-world dataset. The results reveal that both conventional and Bayesian inferential methods function accurately, with the Bayesian outcomes surpassing those of the conventional method.
(This article belongs to the Special Issue Computational Statistics and Its Applications, 2nd Edition)
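
The two asymptotic interval patterns described here are, presumably, the usual normal approximation (NA) and its log-transformed variant (NL), the latter keeping the interval for a positive parameter positive. A sketch of both:

```python
import numpy as np

def acis(theta_hat, se, z=1.959963984540054):
    """Two asymptotic 95% intervals for a positive parameter:
    NA: theta_hat +/- z*se (may dip below zero);
    NL: log-transformed, theta_hat * exp(+/- z*se/theta_hat) (stays positive)."""
    na = (theta_hat - z * se, theta_hat + z * se)
    nl = (theta_hat * np.exp(-z * se / theta_hat),
          theta_hat * np.exp(z * se / theta_hat))
    return na, nl

print(acis(theta_hat=0.6, se=0.35))   # NA lower bound is negative; NL is not
```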

27 pages, 993 KiB  
Article
Statistical Inference of Inverse Weibull Distribution Under Joint Progressive Censoring Scheme
by Jinchen Xiang, Yuanqi Wang and Wenhao Gui
Symmetry 2025, 17(6), 829; https://doi.org/10.3390/sym17060829 - 26 May 2025
Viewed by 355
Abstract
In recent years, there has been an increasing interest in the application of progressive censoring as a means to reduce both cost and experiment duration. In the absence of explanatory variables, the present study employs a statistical inference approach for the inverse Weibull distribution, using a progressive type II censoring strategy with two independent samples. The article expounds on the maximum likelihood estimation method, utilizing the Fisher information matrix to derive approximate confidence intervals. Moreover, interval estimates are computed by the bootstrap method. We explore the application of Bayesian methods for estimating model parameters under both the squared error and LINEX loss functions. The Bayesian estimates and corresponding credible intervals are calculated via Markov chain Monte Carlo (MCMC). Finally, comprehensive simulation studies and real data analysis are carried out to validate the precision of the proposed estimation methods.
(This article belongs to the Section Mathematics)
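
The bootstrap interval estimation mentioned here is commonly the percentile bootstrap: resample the data with replacement, re-estimate, and take empirical quantiles of the bootstrap estimates. A sketch (the abstract does not specify which bootstrap variant is used):

```python
import numpy as np

def percentile_bootstrap_ci(data, estimator, level=0.95, n_boot=2000, seed=0):
    """Percentile bootstrap CI for any statistic of the data."""
    rng = np.random.default_rng(seed)
    n = len(data)
    boot = np.array([estimator(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])          # resample and re-estimate
    lo, hi = np.quantile(boot, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

x = np.random.default_rng(4).weibull(1.5, size=80)     # stand-in lifetimes
print(percentile_bootstrap_ci(x, np.median))
```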

27 pages, 466 KiB  
Article
An Analysis of Vectorised Automatic Differentiation for Statistical Applications
by Chun Fung Kwok, Dan Zhu and Liana Jacobi
Stats 2025, 8(2), 40; https://doi.org/10.3390/stats8020040 - 19 May 2025
Viewed by 378
Abstract
Automatic differentiation (AD) is a general method for computing exact derivatives in complex sensitivity analyses and optimisation tasks, particularly when closed-form solutions are unavailable and traditional analytical or numerical methods fall short. This paper introduces a vectorised formulation of AD grounded in matrix calculus. It aligns naturally with the matrix-oriented style prevalent in statistics, supports convenient implementations, and takes advantage of sparse matrix representation and other high-level optimisation techniques that are not available in the scalar counterpart. Our formulation is well-suited to high-dimensional statistical applications, where finite differences (FD) scale poorly because computations must be repeated for each input dimension, incurring significant overhead. It is also advantageous in simulation-intensive settings, such as Markov Chain Monte Carlo (MCMC)-based inference, where FD requires repeated sampling and multiple function evaluations, while AD computes exact derivatives in a single pass, substantially reducing computational cost. Numerical studies are presented to demonstrate the efficacy and speed of the proposed AD method compared with FD schemes.
(This article belongs to the Section Computational Statistics)
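
The AD-versus-FD cost contrast is easy to demonstrate. The sketch below uses JAX rather than the authors' matrix-calculus formulation, purely to show that reverse-mode AD returns the whole gradient in one pass while FD needs one extra function evaluation per coordinate:

```python
import jax
import jax.numpy as jnp

def f(x):                       # an example scalar objective
    return jnp.sum(jnp.sin(x) ** 2)

x = jnp.linspace(0.1, 1.0, 5)

# AD: the full gradient in one reverse-mode pass, exact to float precision.
g_ad = jax.grad(f)(x)

# FD: one extra evaluation per coordinate, so cost grows with dimension.
eps = 1e-6
g_fd = jnp.array([(f(x.at[i].add(eps)) - f(x)) / eps for i in range(x.size)])

print(jnp.max(jnp.abs(g_ad - g_fd)))   # agreement up to FD truncation error
```
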
20 pages, 595 KiB  
Article
Learning Gaussian Bayesian Network from Censored Data Subject to Limit of Detection by the Structural EM Algorithm
by Ping-Feng Xu, Shanyi Lin, Qian-Zhen Zheng and Man-Lai Tang
Mathematics 2025, 13(9), 1482; https://doi.org/10.3390/math13091482 - 30 Apr 2025
Viewed by 332
Abstract
A Bayesian network offers powerful knowledge representations for independence, conditional independence and causal relationships among variables in a given domain. Despite its wide application, the detection limits of modern measurement technologies make the use of Bayesian networks theoretically unfounded, even when the assumption of a multivariate Gaussian distribution is satisfied. In this paper, we introduce the censored Gaussian Bayesian network (GBN), an extension of GBNs designed to handle left- and right-censored data caused by instrumental detection limits. We further propose the censored Structural Expectation-Maximization (cSEM) algorithm, an iterative score-and-search framework that integrates Monte Carlo sampling in the E-step for efficient expectation computation and employs the iterative Markov chain Monte Carlo (MCMC) algorithm in the M-step to refine the network structure and parameters. This approach addresses the non-decomposability challenge of censored-data likelihoods. Through simulation studies, we illustrate the superior performance of the cSEM algorithm compared to the existing competitors in terms of network recovery when censored data exist. Finally, the proposed cSEM algorithm is applied to single-cell data with censoring to uncover the relationships among variables. The implementation of the cSEM algorithm is available on GitHub.
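
A core ingredient of the Monte Carlo E-step for left-censored Gaussian data is the set of conditional moments of a truncated normal below the detection limit. A sketch of that single ingredient (the full structural EM over network structures is much larger):

```python
import numpy as np
from scipy.stats import truncnorm

def mc_moments_below_lod(mu, sigma, lod, n_draws=10_000, seed=0):
    """Monte Carlo E-step ingredient for a left-censored Gaussian: draw from
    X | X < LOD and estimate E[X | X < LOD] and E[X^2 | X < LOD]."""
    b = (lod - mu) / sigma                     # standardized upper bound
    rng = np.random.default_rng(seed)
    draws = truncnorm.rvs(-np.inf, b, loc=mu, scale=sigma,
                          size=n_draws, random_state=rng)
    return draws.mean(), (draws ** 2).mean()

print(mc_moments_below_lod(mu=1.0, sigma=0.8, lod=0.5))
```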
