Search Results (139)

Search Parameters:
Keywords = monte carlo markov chain technique

28 pages, 835 KiB  
Article
Progressive First-Failure Censoring in Reliability Analysis: Inference for a New Weibull–Pareto Distribution
by Rashad M. EL-Sagheer and Mahmoud M. Ramadan
Mathematics 2025, 13(15), 2377; https://doi.org/10.3390/math13152377 - 24 Jul 2025
Viewed by 142
Abstract
This paper explores statistical techniques for estimating unknown lifetime parameters using data from a progressive first-failure censoring scheme. The failure times are modeled with a new Weibull–Pareto distribution. Maximum likelihood estimators are derived for the model parameters, as well as for the survival and hazard rate functions, although these estimators do not have explicit closed-form solutions. The Newton–Raphson algorithm is employed for the numerical computation of these estimates. Confidence intervals for the parameters are approximated based on the asymptotic normality of the maximum likelihood estimators. The Fisher information matrix is calculated using the missing information principle, and the delta technique is applied to approximate confidence intervals for the survival and hazard rate functions. Bayesian estimators are developed under squared error, linear exponential, and general entropy loss functions, assuming independent gamma priors. Markov chain Monte Carlo sampling is used to obtain Bayesian point estimates and the highest posterior density credible intervals for the parameters and reliability measures. Finally, the proposed methods are demonstrated through the analysis of a real dataset. Full article
(This article belongs to the Section D1: Probability and Statistics)
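The Bayesian workflow summarized in this abstract (independent gamma priors, MCMC point estimates, HPD-style credible intervals) can be sketched with a random-walk Metropolis sampler. The snippet below is illustrative throughout: exponential lifetimes with a conjugate gamma prior stand in for the paper's new Weibull–Pareto model, and all names and tuning constants are assumptions, not the authors' implementation.

```python
import math
import random

def log_post(rate, data, a=2.0, b=1.0):
    # Log posterior for an exponential lifetime model with a Gamma(a, b)
    # prior on the rate (an illustrative stand-in for the Weibull-Pareto model)
    if rate <= 0:
        return float("-inf")
    return (a - 1 + len(data)) * math.log(rate) - (b + sum(data)) * rate

def metropolis(data, n_iter=20000, step=0.2, seed=1):
    # Random-walk Metropolis: propose a move, accept with prob min(1, ratio)
    rng = random.Random(seed)
    x, lp, chain = 1.0, log_post(1.0, data), []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop, data)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain[n_iter // 2:]  # drop the first half as burn-in

# Synthetic lifetimes with true rate 2.0
rng = random.Random(0)
data = [rng.expovariate(2.0) for _ in range(200)]
chain = metropolis(data)
post_mean = sum(chain) / len(chain)
s = sorted(chain)
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s))]  # 95% credible interval
```

Because the gamma prior is conjugate here, the sampler's mean can be checked against the exact posterior mean (a + n)/(b + Σt); the paper's model would reuse the same loop with its Weibull–Pareto log-likelihood substituted.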

35 pages, 11039 KiB  
Article
Optimum Progressive Data Analysis and Bayesian Inference for Unified Progressive Hybrid INH Censoring with Applications to Diamonds and Gold
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(8), 559; https://doi.org/10.3390/axioms14080559 - 23 Jul 2025
Viewed by 151
Abstract
A novel unified progressive hybrid censoring is introduced to combine both progressive and hybrid censoring plans to allow flexible test termination either after a prespecified number of failures or at a fixed time. This work develops both frequentist and Bayesian inferential procedures for estimating the parameters, reliability, and hazard rates of the inverted Nadarajah–Haghighi lifespan model when a sample is produced from such a censoring plan. Maximum likelihood estimators are obtained through the Newton–Raphson iterative technique. The delta method, based on the Fisher information matrix, is utilized to build the asymptotic confidence intervals for each unknown quantity. In the Bayesian methodology, Markov chain Monte Carlo techniques with independent gamma priors are implemented to generate posterior summaries and credible intervals, addressing computational intractability through the Metropolis–Hastings algorithm. Extensive Monte Carlo simulations compare the efficiency and utility of frequentist and Bayesian estimates across multiple censoring designs, highlighting the superiority of Bayesian inference using informative prior information. Two real-world applications utilizing rare minerals from gold and diamond durability studies are examined to demonstrate the adaptability of the proposed estimators to the analysis of rare events in precious materials science. By applying four different optimality criteria to multiple competing plans, an analysis of various progressive censoring strategies that yield the best performance is conducted. The proposed censoring framework is effectively applied to real-world datasets involving diamonds and gold, demonstrating its practical utility in modeling the reliability and failure behavior of rare and high-value minerals. Full article
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)
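As a hedged illustration of the delta-method interval construction mentioned in this abstract, the sketch below uses a plain exponential model rather than the paper's inverted Nadarajah–Haghighi distribution: there the inverse Fisher information gives Var(λ̂) ≈ λ̂²/n, and the variance is propagated through R(t) = exp(−λt). The function name and numbers are assumptions for illustration.

```python
import math

def delta_ci_reliability(rate_hat, n, t, z=1.96):
    # 95% CI for R(t) = exp(-rate * t) via the delta method:
    # Var(R_hat) ~= (dR/drate)^2 * Var(rate_hat), Var(rate_hat) ~= rate^2 / n
    r = math.exp(-rate_hat * t)
    slope = -t * r                       # dR/drate evaluated at the MLE
    var_r = slope ** 2 * rate_hat ** 2 / n
    half = z * math.sqrt(var_r)
    return max(0.0, r - half), min(1.0, r + half)

lo, hi = delta_ci_reliability(rate_hat=0.5, n=100, t=1.0)
```

The same recipe extends to the hazard rate: differentiate it with respect to the parameters and propagate the Fisher-matrix variance through that gradient instead.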

11 pages, 961 KiB  
Article
Viscous Cosmology in f(Q, L_m) Gravity: Insights from CC, BAO, and GRB Data
by Dheeraj Singh Rana, Sai Swagat Mishra, Aaqid Bhat and Pradyumn Kumar Sahoo
Universe 2025, 11(8), 242; https://doi.org/10.3390/universe11080242 - 23 Jul 2025
Viewed by 197
Abstract
In this article, we investigate the influence of viscosity on the evolution of the cosmos within the framework of the newly proposed f(Q, L_m) gravity. We consider a linear functional form f(Q, L_m) = αQ + βL_m with a bulk viscous coefficient ζ = ζ_0 + ζ_1 H for our analysis and obtain exact solutions to the field equations associated with a flat FLRW metric. In addition, we utilize Cosmic Chronometers (CC), CC + BAO, CC + BAO + GRB, and GRB data samples to determine the constrained values of the independent parameters in the derived exact solution. The likelihood function and the Markov Chain Monte Carlo (MCMC) sampling technique are combined to yield the posterior probability using Bayesian statistical methods. Furthermore, by comparing our results with the standard cosmological model, we find that the considered model supports the acceleration of the universe at late times. Full article

25 pages, 10024 KiB  
Article
Forecasting with a Bivariate Hysteretic Time Series Model Incorporating Asymmetric Volatility and Dynamic Correlations
by Hong Thi Than
Entropy 2025, 27(7), 771; https://doi.org/10.3390/e27070771 - 21 Jul 2025
Viewed by 220
Abstract
This study explores asymmetric volatility structures within multivariate hysteretic autoregressive (MHAR) models that incorporate conditional correlations, aiming to flexibly capture the dynamic behavior of global financial assets. The proposed framework integrates regime switching and time-varying delays governed by a hysteresis variable, enabling the model to account for both asymmetric volatility and evolving correlation patterns over time. We adopt a fully Bayesian inference approach using adaptive Markov chain Monte Carlo (MCMC) techniques, allowing for the joint estimation of model parameters, Value-at-Risk (VaR), and Marginal Expected Shortfall (MES). The accuracy of VaR forecasts is assessed through two standard backtesting procedures. Our empirical analysis involves both simulated data and real-world financial datasets to evaluate the model’s effectiveness in capturing downside risk dynamics. We demonstrate the application of the proposed method on three pairs of daily log returns involving the S&P500, Bank of America (BAC), Intercontinental Exchange (ICE), and Goldman Sachs (GS), present the results obtained, and compare them against the original model framework. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)

17 pages, 572 KiB  
Article
Statistical Analysis Under a Random Censoring Scheme with Applications
by Mustafa M. Hasaballah and Mahmoud M. Abdelwahab
Symmetry 2025, 17(7), 1048; https://doi.org/10.3390/sym17071048 - 3 Jul 2025
Cited by 1 | Viewed by 248
Abstract
The Gumbel Type-II distribution is a widely recognized and frequently utilized lifetime distribution, playing a crucial role in reliability engineering. This paper focuses on the statistical inference of the Gumbel Type-II distribution under a random censoring scheme. From a frequentist perspective, point estimates for the unknown parameters are derived using the maximum likelihood estimation method, and confidence intervals are constructed based on the Fisher information matrix. From a Bayesian perspective, Bayes estimates of the parameters are obtained using the Markov Chain Monte Carlo method, and the average lengths of credible intervals are calculated. The Bayesian inference is performed under both the squared error loss function and the general entropy loss function. Additionally, a numerical simulation is conducted to evaluate the performance of the proposed methods. To demonstrate their practical applicability, a real-world example is provided, illustrating the application and development of these inference techniques. In conclusion, the Bayesian method appears to outperform other approaches, although each method offers unique advantages. Full article

21 pages, 2212 KiB  
Article
The Whale Optimization Algorithm and Markov Chain Monte Carlo-Based Approach for Optimizing Teacher Professional Development in Creative Learning Design with Technology
by Kalliopi Rigopouli, Dimitrios Kotsifakos and Yannis Psaromiligkos
Algorithms 2025, 18(7), 407; https://doi.org/10.3390/a18070407 - 2 Jul 2025
Viewed by 330
Abstract
In this article, we present a hybrid optimization methodology using the whale optimization algorithm and the Markov Chain Monte Carlo sampling technique in a teachers' training development program regarding creativity in technology-enhanced learning design. Finding the best possible training for creativity in learning design with technology is a complex task, as many dynamic and multi-modal variables need to be taken into consideration. When designing the training, the whale optimization algorithm helped us determine the right methods, resources, content, and assessment, and a Markov Chain Monte Carlo-based approach then confirmed that these were the correct parameters for our training. In this article, we show that metaheuristic algorithms like the whale optimization algorithm, validated by a Markov chain technique like Markov Chain Monte Carlo, can help not only in areas like machine learning but also in fields without structured data, like creativity in technology-enhanced learning design. The best possible training for a teacher's professional development in creative learning design is collaborative, hands-on, and utilizes creativity definitions for the product along with technology integration learning design models. Full article
(This article belongs to the Special Issue Artificial Intelligence Algorithms and Generative AI in Education)

34 pages, 18712 KiB  
Article
Statistical Computation of Hjorth Competing Risks Using Binomial Removals in Adaptive Progressive Type II Censoring
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Mathematics 2025, 13(12), 2010; https://doi.org/10.3390/math13122010 - 18 Jun 2025
Viewed by 236
Abstract
In complex reliability applications, it is common for the failure of an individual or an item to be attributed to multiple causes known as competing risks. This paper explores the estimation of the Hjorth competing risks model based on an adaptive progressive Type II censoring scheme via a binomial removal mechanism. For parameter and reliability metric estimation, both frequentist and Bayesian methodologies are developed. Maximum likelihood estimates for the Hjorth parameters are computed numerically due to their intricate form, while the binomial removal parameter is derived explicitly. Confidence intervals are constructed using asymptotic approximations. Within the Bayesian paradigm, gamma priors are assigned to the Hjorth parameters and a beta prior for the binomial parameter, facilitating posterior analysis. Markov Chain Monte Carlo techniques yield Bayesian estimates and credible intervals for parameters and reliability measures. The performance of the proposed methods is compared using Monte Carlo simulations. Finally, to illustrate the practical applicability of the proposed methodology, two real-world competing risk data sets are analyzed: one representing the breaking strength of jute fibers and the other representing the failure modes of electrical appliances. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)

28 pages, 11942 KiB  
Article
Reliability Analysis of Improved Type-II Adaptive Progressively Inverse XLindley Censored Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Axioms 2025, 14(6), 437; https://doi.org/10.3390/axioms14060437 - 2 Jun 2025
Viewed by 347
Abstract
This study offers a newly improved Type-II adaptive progressive censoring with data sampled from an inverse XLindley (IXL) distribution for more efficient and adaptive reliability assessments. Through this sampling mechanism, we evaluate the parameters of the IXL distribution, as well as its reliability and hazard rate features. In the context of reliability, to handle flexible and time-constrained testing frameworks in high-reliability environments, we formulate maximum likelihood estimators versus Bayesian estimates derived via Markov chain Monte Carlo techniques under gamma priors, which effectively capture prior knowledge. Two patterns of asymptotic interval estimates are constructed through the normal approximation of the classical estimates and of the log-transformed classical estimates. On the other hand, from the Markovian chains, two patterns of credible interval estimates are also constructed. A robust simulation study is carried out to compare the classical and Bayesian point estimation methods, along with the four interval estimation methods. This study’s practical usefulness is demonstrated by its analysis of a real-world dataset. The results reveal that both conventional and Bayesian inferential methods function accurately, with the Bayesian outcomes surpassing those of the conventional method. Full article
(This article belongs to the Special Issue Computational Statistics and Its Applications, 2nd Edition)

27 pages, 466 KiB  
Article
An Analysis of Vectorised Automatic Differentiation for Statistical Applications
by Chun Fung Kwok, Dan Zhu and Liana Jacobi
Stats 2025, 8(2), 40; https://doi.org/10.3390/stats8020040 - 19 May 2025
Viewed by 364
Abstract
Automatic differentiation (AD) is a general method for computing exact derivatives in complex sensitivity analyses and optimisation tasks, particularly when closed-form solutions are unavailable and traditional analytical or numerical methods fall short. This paper introduces a vectorised formulation of AD grounded in matrix calculus. It aligns naturally with the matrix-oriented style prevalent in statistics, supports convenient implementations, and takes advantage of sparse matrix representation and other high-level optimisation techniques that are not available in the scalar counterpart. Our formulation is well-suited to high-dimensional statistical applications, where finite differences (FD) scale poorly because computations must be repeated for each input dimension, incurring significant overhead. It is also advantageous in simulation-intensive settings, such as Markov Chain Monte Carlo (MCMC)-based inference, where FD requires repeated sampling and multiple function evaluations, while AD computes exact derivatives in a single pass, substantially reducing computational cost. Numerical studies are presented to demonstrate the efficacy and speed of the proposed AD method compared with FD schemes. Full article
(This article belongs to the Section Computational Statistics)
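The single-pass AD versus repeated-evaluation FD contrast drawn in this abstract can be shown with a minimal forward-mode AD class built on dual numbers. This is a generic sketch, not the paper's vectorised matrix-calculus formulation; the class and function names are illustrative.

```python
class Dual:
    # Forward-mode AD value a + b*eps with eps**2 = 0: .val carries f(x),
    # .dot carries the derivative, propagated exactly through each operation.
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1      # f'(x) = 6x + 2

ad_deriv = f(Dual(2.0, 1.0)).dot      # exact derivative in one pass

h = 1e-6                              # FD needs extra evaluations per input dim
fd_deriv = (f(Dual(2.0 + h)).val - f(Dual(2.0 - h)).val) / (2 * h)
```

For m inputs, FD repeats this pair of evaluations m times; that per-dimension overhead is exactly what the abstract attributes to FD in high-dimensional and MCMC-based settings.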
22 pages, 4740 KiB  
Article
Determining the Scale Length and Height of the Milky Way’s Thick Disc Using RR Lyrae
by Roman Tkachenko, Katherine Vieira, Artem Lutsenko, Vladimir Korchagin and Giovanni Carraro
Universe 2025, 11(4), 132; https://doi.org/10.3390/universe11040132 - 17 Apr 2025
Cited by 1 | Viewed by 689
Abstract
Using the RR Lyrae surveys Gaia DR3 Specific Objects Study, PanSTARRS1 and ASAS-SN-II, we determine the Milky Way's thick disc scale length and scale height as well as the radial scale length of the galaxy's inner halo. We use a Bayesian approach to estimate these values using two independent techniques: Markov chain Monte Carlo sampling, and importance nested sampling. We consider two vertical density profiles for the thick disc. In the exponential model, the scale length of the thick disc is h_R = 2.14^{+0.19}_{-0.17} kpc, and its scale height is h_z = 0.64^{+0.06}_{-0.06} kpc. In the squared hyperbolic secant profile sech^2, those values are correspondingly h_R = 2.10^{+0.19}_{-0.17} kpc and h_z = 1.02^{+0.09}_{-0.08} kpc. The density distribution of the inner halo can be described as a power-law function with exponent n = 2.35^{+0.05}_{-0.05} and flattening q = 0.57^{+0.02}_{-0.02}. We also estimate the halo-to-disc concentration ratio as γ = 0.19^{+0.02}_{-0.02} for the exponential disc and γ = 0.32^{+0.03}_{-0.03} for the sech^2 disc. Full article
(This article belongs to the Special Issue Universe: Feature Papers 2024—"Galaxies and Clusters")

14 pages, 2366 KiB  
Article
Rice Growth Estimation and Yield Prediction by Combining the DSSAT Model and Remote Sensing Data Using the Monte Carlo Markov Chain Technique
by Yingbo Chen, Siyu Wang, Zhankui Xue, Jijie Hu, Shaojie Chen and Zunfu Lv
Plants 2025, 14(8), 1206; https://doi.org/10.3390/plants14081206 - 14 Apr 2025
Cited by 1 | Viewed by 698
Abstract
The integration of crop models and remote sensing data has become a useful method for monitoring crop growth status and crop yield based on data assimilation. The objective of this study was to use leaf area index (LAI) values and plant nitrogen accumulation (PNA) values generated from spectral indices to calibrate the Decision Support System for Agrotechnology Transfer (DSSAT) model using the Monte Carlo Markov Chain (MCMC) technique. The initial management parameters, including sowing date, sowing rate, and nitrogen rate, are recalibrated based on the relationship between the remote sensing state variables and the simulated state variables. This integrated technique was tested on independent datasets acquired from three rice field tests at the experimental site in Deqing, China. The results showed that the data assimilation method achieved the most accurate LAI (R2 = 0.939 and RMSE = 0.74) and PNA (R2 = 0.926 and RMSE = 7.3 kg/ha) estimations compared with the spectral index method. Average differences (RE, %) between the inverted initialized parameters and the original input parameters for sowing date, seeding rate, and nitrogen amount were 1.33%, 4.75%, and 8.16%, respectively. The estimated yield was in good agreement with the measured yield (R2 = 0.79 and RMSE = 661 kg/ha). The average root mean square deviation (RMSD) for the simulated values of yield was 745 kg/ha. Yield uncertainty from data assimilation between crop models and remote sensing was quantified. This study found that data assimilation of crop models and remote sensing data using the MCMC technique could improve the estimation of rice leaf area index (LAI), plant nitrogen accumulation (PNA), and yield. Data assimilation using the MCMC technique improves the prediction of LAI, PNA, and yield by solving the saturation effect of the normalized difference vegetation index (NDVI). 
The method proposed in this study can provide precise decision-making support for field management and anticipate regional yield fluctuations in advance. Full article
(This article belongs to the Special Issue Crop Nutrition Diagnosis and Regulation)

31 pages, 1126 KiB  
Review
A Comprehensive Review and Application of Bayesian Methods in Hydrological Modelling: Past, Present, and Future Directions
by Khaled Haddad
Water 2025, 17(7), 1095; https://doi.org/10.3390/w17071095 - 6 Apr 2025
Cited by 1 | Viewed by 1765
Abstract
Bayesian methods have revolutionised hydrological modelling by providing a framework for managing uncertainty, improving model calibration, and enabling more accurate predictions. This paper reviews the evolution of Bayesian methods in hydrology, from their initial applications in flood-frequency analysis to their current use in streamflow forecasting, flood risk assessment, and climate-change adaptation. It discusses the development of key Bayesian techniques, such as Markov Chain Monte Carlo (MCMC) methods, hierarchical models, and approximate Bayesian computation (ABC), and their integration with remote sensing and big data analytics. The paper also presents simulated examples demonstrating the application of Bayesian methods to flood, drought, and rainfall data, showcasing the potential of these methods to inform water-resource management, flood risk mitigation, and drought prediction. The future of Bayesian hydrology lies in expanding the use of machine learning, improving computational efficiency, and integrating large-scale datasets from remote sensing. This review serves as a resource for hydrologists seeking to understand the evolution and future potential of Bayesian methods in addressing complex hydrological challenges. Full article
(This article belongs to the Section Hydrology)

32 pages, 1098 KiB  
Article
Estimation and Bayesian Prediction for New Version of Xgamma Distribution Under Progressive Type-II Censoring
by Ahmed R. El-Saeed, Molay Kumar Ruidas and Ahlam H. Tolba
Symmetry 2025, 17(3), 457; https://doi.org/10.3390/sym17030457 - 18 Mar 2025
Cited by 1 | Viewed by 320
Abstract
This article introduces a new continuous lifetime distribution within the Gamma family, called the induced Xgamma distribution, and explores its various statistical properties. The proposed distribution’s estimation and prediction are investigated using Bayesian and non-Bayesian approaches under progressively Type-II censored data. The maximum likelihood and maximum product spacing methods are applied for the non-Bayesian approach, and some of their performances are evaluated. In the Bayesian framework, the numerical approximation technique utilizing the Metropolis–Hastings algorithm within the Markov chain Monte Carlo is employed under different loss functions, including the squared error loss, general entropy, and LINEX loss. Interval estimation methods, such as asymptotic confidence intervals, log-normal asymptotic confidence intervals, and highest posterior density intervals, are also developed. A comprehensive numerical study using Monte Carlo simulations is conducted to evaluate the performance of the proposed point and interval estimation methods through progressive Type-II censored data. Furthermore, the applicability and effectiveness of the proposed distribution are demonstrated through three real-world datasets from the fields of medicine and engineering. Full article
(This article belongs to the Special Issue Bayesian Statistical Methods for Forecasting)

24 pages, 2642 KiB  
Article
Mixed Student’s T-Distribution Regression Soft Measurement Model and Its Application Based on VI and MCMC
by Qirui Li, Cuixian Li, Zhiping Peng, Delong Cui and Jieguang He
Processes 2025, 13(3), 861; https://doi.org/10.3390/pr13030861 - 14 Mar 2025
Viewed by 599
Abstract
The conventional diagnostic techniques for ethylene cracker furnace tube coking rely on manual expertise, offline analysis and on-site inspection. However, these methods have inherent limitations, including prolonged inspection times, low accuracy and poor real-time performance, making it challenging to meet the requirements of chemical production. The need for high efficiency, reliability and safety, coupled with the inherent complexity of the production process, results in data that are multimodal, nonlinear, non-Gaussian and strongly noisy, rendering traditional data processing and analysis methods ineffective. To address these issues, this paper puts forth a novel soft measurement approach: a mixed Student's t-distribution regression soft measurement model based on Variational Inference (VI) and Markov Chain Monte Carlo (MCMC). An initial variational distribution is selected during the initialization step of VI and iteratively refined to approximate the true posterior distribution. The outcome of VI is then used to initialize the MCMC, placing the chain's starting point in a region close to the true posterior distribution and thereby accelerating its convergence. The model integrates the efficiency of VI with the accuracy of MCMC, enhancing the precision of the posterior approximation while preserving computational efficiency.
The experimental results demonstrate that the model exhibits enhanced accuracy and robustness in the diagnosis of ethylene cracker tube coking compared to the conventional Partial Least Squares Regression (PLSR), Gaussian Process Regression (GPR), Gaussian Mixture Regression (GMR), Bayesian Student’s T-Distribution Mixture Regression (STMR) and Semi-supervised Bayesian T-Distribution Mixture Regression (SsSMM). This method provides a scientific basis for optimizing and maintaining the ethylene cracker, enhancing its production efficiency and reliability, and effectively addressing the multimodal, non-Gaussian distribution and uncertainty of the coking data of the ethylene cracker furnace tube. Full article
(This article belongs to the Section Chemical Processes and Systems)
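The VI-then-MCMC initialization this abstract describes can be sketched on a deliberately simple posterior: a normal mean with a wide normal prior stands in for the mixed Student's t-distribution regression model, and the mode-seeking gradient step below is only a schematic proxy for full variational inference. All names, priors, and tuning values are illustrative assumptions.

```python
import math
import random

def log_post(mu, data, sigma=1.0, prior_sd=10.0):
    # Normal likelihood with a wide normal prior on the mean
    ll = -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)
    return ll - mu ** 2 / (2 * prior_sd ** 2)

def vi_init(data, iters=200, lr=0.1):
    # Crude variational/mode-seeking step: gradient ascent on the log
    # posterior; the resulting location seeds the MCMC chain.
    mu = 0.0
    for _ in range(iters):
        grad = sum(x - mu for x in data) - mu / 100.0
        mu += lr * grad / len(data)
    return mu

def metropolis(data, start, n_iter=5000, step=0.1, seed=3):
    # Random-walk Metropolis started from the VI solution
    rng = random.Random(seed)
    x, lp, chain = start, log_post(start, data), []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop, data)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

rng = random.Random(0)
data = [rng.gauss(5.0, 1.0) for _ in range(100)]
start = vi_init(data)            # chain starts near the posterior mode,
chain = metropolis(data, start)  # so little of the run is wasted on burn-in
```

The design point is the one the abstract makes: the cheap VI pass pays for itself by removing most of the burn-in that a cold-started chain would need.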

18 pages, 1181 KiB  
Article
Modeling and Estimation of Traffic Intensity in M/M/1 Queueing System with Balking: Classical and Bayesian Approaches
by Bhaskar Kushvaha, Dhruba Das, Asmita Tamuli, Dibyajyoti Bora, Mrinal Deka and Amit Choudhury
AppliedMath 2025, 5(1), 19; https://doi.org/10.3390/appliedmath5010019 - 21 Feb 2025
Viewed by 702
Abstract
This article focuses on both classical and Bayesian inference of traffic intensity in a single-server Markovian queueing model considering balking. To reflect real-world situations, the article introduces the concept of balking, where customers opt not to join the queue due to the perceived waiting time. The essence of this article involves a comprehensive analysis of different loss functions, namely, the squared error loss function (SELF) and the precautionary loss function (PLF), on the accuracy of the Bayesian estimation. To evaluate the performance of the Bayesian method with various priors such as inverted beta, gamma, and Jeffreys distributions, an assessment is performed using the Markov Chain Monte Carlo (MCMC) simulation technique. The efficacy of the Bayesian estimators is assessed by comparing the mean squared errors (MSEs). Full article
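The classical-versus-Bayesian comparison in this abstract can be sketched for the no-balking baseline: in a stationary M/M/1 queue the number in system follows P(N = n) = (1 − ρ)ρ^n, so a beta prior on ρ is conjugate and the posterior mean is the Bayes estimate under SELF. Balking modifies these probabilities, so the sampling model and names below are illustrative assumptions, not the paper's estimators.

```python
import random

def rho_mle(counts):
    # Classical MLE of traffic intensity from queue-length snapshots:
    # with P(N = n) = (1 - rho) * rho**n, the MLE is nbar / (1 + nbar)
    nbar = sum(counts) / len(counts)
    return nbar / (1 + nbar)

def rho_bayes_self(counts, a=1.0, b=1.0):
    # Likelihood ~ rho**sum(n) * (1 - rho)**N is conjugate to Beta(a, b),
    # so the posterior is Beta(a + sum(n), b + N); its mean is the
    # Bayes estimate under squared error loss (SELF)
    s, m = sum(counts), len(counts)
    return (a + s) / (a + s + b + m)

# Synthetic snapshots of the number in system for true rho = 0.5
rng = random.Random(42)
true_rho = 0.5
counts = []
for _ in range(500):
    n = 0
    while rng.random() < true_rho:   # geometric draw: P(N = n) = (1-rho) rho^n
        n += 1
    counts.append(n)

mle = rho_mle(counts)
bayes = rho_bayes_self(counts)
```

For the non-conjugate priors and the precautionary loss function studied in the paper, MCMC simulation replaces this closed-form posterior mean.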