Search Results (311)

Search Parameters:
Keywords = Markov Chain Monte Carlo simulation

39 pages, 23728 KB  
Article
Parametric Inference of the Power Weibull Survival Model Using a Generalized Censoring Plan: Three Applications to Symmetry and Asymmetry Scenarios
by Refah Alotaibi and Ahmed Elshahhat
Symmetry 2025, 17(12), 2142; https://doi.org/10.3390/sym17122142 - 12 Dec 2025
Abstract
Generalized censoring, combined with a power-based distribution, improves inferential efficiency by capturing more detailed failure-time information in complex testing scenarios. Conventional censoring schemes may discard substantial failure-time information, leading to inefficiencies in parameter estimation and reliability prediction. To address this limitation, we develop a comprehensive inferential framework for the alpha-power Weibull (APW) distribution under a generalized progressive hybrid Type-II censoring scheme, a flexible design that unifies classical, hybrid, and progressive censoring while guaranteeing test completion within preassigned limits. Both maximum likelihood and Bayesian estimation procedures are derived for the model parameters, reliability function, and hazard rate. Associated uncertainty quantification is provided through asymptotic confidence intervals (normal and log-normal approximations) and Bayesian credible intervals obtained via Markov chain Monte Carlo (MCMC) methods with independent gamma priors. In addition, we propose optimal censoring designs based on trace, determinant, and quantile-variance criteria to maximize inferential efficiency at the design stage. Extensive Monte Carlo simulations, assessed using four precision measures, demonstrate that the Bayesian MCMC estimators consistently outperform their frequentist counterparts in terms of bias, mean squared error, robustness, and interval coverage across a wide range of censoring levels and prior settings. Finally, the proposed methodology is validated using real-life datasets from engineering (electronic devices), clinical (organ transplant), and physical (rare metals) studies, demonstrating the APW model’s superior goodness-of-fit, reliability prediction, and inferential stability. Overall, this study demonstrates that combining generalized censoring with the APW distribution substantially enhances inferential efficiency and predictive performance, offering a robust and versatile tool for complex life-testing experiments across multiple scientific domains.
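A minimal sketch of the kind of MCMC machinery involved, assuming a plain two-parameter Weibull likelihood on complete synthetic data with independent gamma priors; the alpha-power component and the censoring scheme are omitted, and the hyperparameters a and b are invented here, not taken from the paper:

import numpy as np

rng = np.random.default_rng(1)
data = 2.0 * rng.weibull(1.5, size=60)  # synthetic complete failure times

def log_post(alpha, lam):
    # Weibull(alpha, lam) log-likelihood plus independent Gamma(a, b) log-priors
    if alpha <= 0 or lam <= 0:
        return -np.inf
    ll = np.sum(np.log(alpha) + np.log(lam) + (alpha - 1) * np.log(data)
                - lam * data**alpha)
    a, b = 2.0, 1.0  # hypothetical prior hyperparameters
    return ll + (a - 1) * np.log(alpha) - b * alpha + (a - 1) * np.log(lam) - b * lam

theta, draws = np.array([1.0, 1.0]), []
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal(2)  # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop  # Metropolis accept step
    draws.append(theta)
print(np.array(draws)[5000:].mean(axis=0))  # posterior means after burn-in

Credible intervals then come from quantiles (or HPD regions) of the retained draws.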

23 pages, 4509 KB  
Article
Data Assimilation for a Simple Hydrological Partitioning Model Using Machine Learning
by Changhwi Jeon, Chaelim Lee, Suhyung Jang and Sangdan Kim
Water 2025, 17(22), 3204; https://doi.org/10.3390/w17223204 - 9 Nov 2025
Abstract
Predicting streamflow is a core element of efficient water resource management. Traditional hydrological models are calibrated to historical observational data, so prediction errors accumulate over time. To address this issue, this study proposes an Artificial Intelligence Filter (AIF) that integrates machine learning (ML) techniques into a data assimilation framework. The AIF learns the relationship between simulated streamflow and state variables (soil moisture, aquifer water level) and updates the state based on observed streamflow. This study applied the Simple Hydrologic Partitioning Model (SHPM) to four dam basins in southeastern Korea (Andong, Hapcheon, Miryang, Namgang). Model parameters were estimated using the Markov Chain Monte Carlo (MCMC) method, and results were compared with Open Loop (OL) simulations. After applying the AIF, R² and NSE increased by an average of approximately 0.02–0.04, a 2–5% improvement, with enhanced performance in most basins. KGE decreased slightly in some basins but improved by an average of about 2%. These results demonstrate that the AIF not only enhances the accuracy of hydrological models but also improves the reliability of water resource forecasts through data assimilation, supporting efficient management decision-making.
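For reference, the skill scores quoted above are simple functions of the observed and simulated series. A sketch with invented data, using the original 2009 form of KGE:

import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 minus error variance over observed variance
    return 1 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def kge(obs, sim):
    # Kling-Gupta efficiency (2009 form): correlation, bias ratio, variability ratio
    r = np.corrcoef(obs, sim)[0, 1]
    beta = sim.mean() / obs.mean()
    alpha = sim.std() / obs.std()
    return 1 - np.sqrt((r - 1)**2 + (beta - 1)**2 + (alpha - 1)**2)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, size=365)        # synthetic daily streamflow
sim = obs + rng.normal(0, 2.0, size=365)   # imperfect open-loop-style simulation
print(round(nse(obs, sim), 3), round(kge(obs, sim), 3))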

13 pages, 450 KB  
Article
South Africa’s Two-Pot Retirement Savings Model Under Labor Market Uncertainty
by Tichaona Chikore and Farai Nyabadza
Economies 2025, 13(11), 318; https://doi.org/10.3390/economies13110318 - 7 Nov 2025
Abstract
This study addresses the critical challenge of designing retirement savings systems that effectively balance liquidity needs and long-term accumulation in contexts characterized by high unemployment and labor market instability, with a focus on South Africa. Traditional pension schemes often assume uninterrupted careers and stable incomes, assumptions frequently violated in low- and middle-income countries, leading to inadequate retirement security and consumption volatility during working life. Motivated by this gap, we develop a stochastic two-pot retirement savings model that explicitly integrates labor market uncertainty using a Markov chain-based Monte Carlo simulation. The model allocates annual contributions between an accessible savings pot and a locked retirement pot, with individuals optimizing consumption and withdrawal decisions to maximize expected lifetime utility under Constant Relative Risk Aversion (CRRA) preferences. Our findings, derived from calibration to South African labor data, reveal that high unemployment and career uncertainty significantly increase the welfare-maximizing preference for liquidity. This result challenges conventional policies prescribing fixed contribution allocations, such as the one-third/two-thirds split in the new two-pot system, and underscores the importance of flexible retirement savings designs. We conclude that tailoring pension design to labor market realities can enhance both retirement security and welfare in volatile economies.
(This article belongs to the Section Labour and Education)
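A toy version of the simulation idea, assuming a two-state employment Markov chain with invented transition probabilities and the one-third/two-thirds contribution split mentioned above; the CRRA optimization over consumption and withdrawals is omitted:

import numpy as np

rng = np.random.default_rng(42)
p_emp = {1: 0.90, 0: 0.55}  # hypothetical P(employed next year | current state)
years, n_paths, contrib = 40, 10_000, 1.0
accessible, locked = np.zeros(n_paths), np.zeros(n_paths)
for i in range(n_paths):
    employed = 1
    for t in range(years):
        if employed:
            accessible[i] += contrib / 3      # one-third to the accessible pot
            locked[i] += 2 * contrib / 3      # two-thirds to the locked pot
        else:
            accessible[i] -= min(accessible[i], contrib)  # draw down while unemployed
        employed = int(rng.uniform() < p_emp[employed])
print(locked.mean(), accessible.mean())  # average pot balances at retirement

In this toy, higher unemployment risk drains the accessible pot faster, which is the liquidity channel the paper studies in welfare terms.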

18 pages, 7443 KB  
Article
Generating Accurate Activity Patterns for Cattle Farm Management Using MCMC Simulation of Multiple-Sensor Data System
by Yukie Hashimoto, Thi Thi Zin, Pyke Tin, Ikuo Kobayashi and Hiromitsu Hama
Sensors 2025, 25(21), 6781; https://doi.org/10.3390/s25216781 - 5 Nov 2025
Abstract
This paper presents a novel Markov Chain Monte Carlo (MCMC) simulation model for analyzing multi-sensor data to enhance cattle farm management. As Precision Livestock Farming (PLF) systems become more widespread, leveraging data from technologies like 3D acceleration, pneumatic, and proximity sensors is crucial for deriving actionable insights into animal behavior. Our research addresses this need by demonstrating how MCMC can be used to accurately model and predict complex cattle activity patterns. We investigate the direct impact of these insights on optimizing key farm management areas, including feed allocation, early disease detection, and labor scheduling. Using a combination of controlled monthly experiments and the analysis of uncontrolled, real-world data, we validate our proposed approach. The results confirm that our MCMC simulation effectively processes diverse sensor inputs to generate reliable and detailed behavioral patterns. We find that this data-driven methodology provides significant advantages for developing informed management strategies, leading to improvements in the overall efficiency, productivity, and profitability of cattle operations. This work underscores the potential of using advanced statistical models like MCMC to transform multi-sensor data into tangible improvements for modern agriculture.
(This article belongs to the Special Issue Sensors and Data-Driven Precision Agriculture—Second Edition)

28 pages, 30126 KB  
Article
Reliability Inference for ZLindley Models Under Improved Adaptive Progressive Censoring: Applications to Leukemia Trials and Flood Risks
by Refah Alotaibi and Ahmed Elshahhat
Mathematics 2025, 13(21), 3499; https://doi.org/10.3390/math13213499 - 1 Nov 2025
Abstract
Modern healthcare and engineering both rely on robust reliability models, where handling censored data effectively translates into longer-lasting devices, improved therapies, and safer environments for society. To address this, we develop a novel inferential framework for the ZLindley (ZL) distribution under the improved adaptive progressive Type-II censoring strategy. The proposed approach unifies the flexibility of the ZL model—capable of representing monotonically increasing hazards—with the efficiency of an adaptive censoring strategy that guarantees experiment termination within pre-specified limits. Both classical and Bayesian methodologies are investigated: maximum likelihood and log-transformed likelihood estimators are derived alongside their asymptotic confidence intervals, while Bayesian estimation is conducted via gamma priors and Markov chain Monte Carlo methods, yielding Bayes point estimates, credible intervals, and highest posterior density (HPD) regions. Extensive Monte Carlo simulations are employed to evaluate estimator performance in terms of bias, efficiency, coverage probability, and interval length across diverse censoring designs. Results demonstrate the superiority of Bayesian inference, particularly under informative priors, and highlight the robustness of HPD intervals over traditional asymptotic approaches. To emphasize practical utility, the methodology is applied to real-world reliability datasets from clinical trials on leukemia patients and hydrological measurements from River Styx floods, demonstrating the model’s ability to capture heterogeneity, over-dispersion, and increasing risk profiles. The empirical investigations reveal that the ZLindley distribution consistently provides a better fit than well-known competitors—including the Lindley, Weibull, and Gamma models—highlighting its flexibility, robustness, and predictive utility for practical reliability modeling.
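One reusable ingredient here is the HPD interval itself: the shortest interval containing a given posterior mass, computed directly from sorted MCMC draws. A minimal sketch with a stand-in gamma sample rather than the ZLindley posterior:

import numpy as np

def hpd_interval(draws, cred=0.95):
    # Shortest interval containing `cred` mass of the sorted MCMC draws
    x = np.sort(np.asarray(draws))
    n = len(x)
    m = int(np.ceil(cred * n))
    widths = x[m - 1:] - x[:n - m + 1]   # widths of all candidate intervals
    j = np.argmin(widths)
    return x[j], x[j + m - 1]

rng = np.random.default_rng(7)
posterior = rng.gamma(3.0, 2.0, size=20000)  # stand-in posterior sample
print(hpd_interval(posterior))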

18 pages, 907 KB  
Article
Bayesian Estimation of Multicomponent Stress–Strength Model Using Progressively Censored Data from the Inverse Rayleigh Distribution
by Asuman Yılmaz
Entropy 2025, 27(11), 1095; https://doi.org/10.3390/e27111095 - 23 Oct 2025
Abstract
This paper presents a comprehensive study on the estimation of multicomponent stress–strength reliability under progressively censored data, assuming the inverse Rayleigh distribution. Both maximum likelihood estimation and Bayesian estimation methods are considered. The loss function and prior distribution play crucial roles in Bayesian inference. Therefore, Bayes estimators of the unknown model parameters are obtained under symmetric (squared error loss function) and asymmetric (linear exponential and general entropy) loss functions using gamma priors. Lindley and MCMC approximation methods are used for Bayesian calculations. Additionally, asymptotic confidence intervals based on maximum likelihood estimators and Bayesian credible intervals constructed via Markov Chain Monte Carlo methods are presented. An extensive Monte Carlo simulation study compares the efficiencies of classical and Bayesian estimators, revealing that Bayesian estimators outperform classical ones. Finally, a real-life data example is provided to illustrate the practical applicability of the proposed methods.
(This article belongs to the Section Information Theory, Probability and Statistics)
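Given MCMC draws from a posterior, the Bayes point estimates under the three losses mentioned have closed forms: the posterior mean for squared error loss, -(1/a) log E[exp(-a θ)] for the linear exponential (LINEX) loss, and (E[θ^(-q)])^(-1/q) for the general entropy loss. A sketch with a stand-in posterior sample and illustrative shape parameters a and q:

import numpy as np

rng = np.random.default_rng(3)
theta = rng.gamma(4.0, 0.5, size=50000)  # stand-in posterior draws

theta_sel = theta.mean()                 # squared error loss -> posterior mean
a = 1.0                                  # LINEX shape (illustrative)
theta_linex = -np.log(np.mean(np.exp(-a * theta))) / a
q = 0.5                                  # general entropy shape (illustrative)
theta_ge = np.mean(theta**(-q))**(-1.0 / q)
print(theta_sel, theta_linex, theta_ge)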

16 pages, 325 KB  
Article
Electricity Demand Forecasting and Risk Assessment for Campus Energy Management
by Yon-Hon Tsai and Ming-Tang Tsai
Energies 2025, 18(20), 5521; https://doi.org/10.3390/en18205521 - 20 Oct 2025
Abstract
This paper employs the Grey–Markov Model (GMM) to predict users’ electricity demand and introduces the Enhanced Monte Carlo (EMC) method to assess the reliability of the prediction results. The GMM integrates the advantages of the Grey Model (GM) and the Markov Chain to enhance prediction accuracy, while the EMC combines the Monte Carlo simulation with a dual-variable approach to conduct a comprehensive risk assessment. This framework helps decision-makers better understand electricity demand patterns and effectively manage associated risks. A university campus in southern Taiwan is selected as the case study. Historical data of monthly maximum electricity demand, including peak, semi-peak, Saturday semi-peak, and off-peak periods, were collected and organized into a database using Excel. The GMM was applied to predict the monthly maximum electricity demand for the target year, and its prediction results were compared with those obtained from the GM and Grey Differential Equation (GDE) models. The results show that the average Mean Absolute Percentage Error (MAPE) values for the GM, GDE, and GMM are 10.96341%, 9.333164%, and 6.56026%, respectively. Among the three models, the GMM exhibits the lowest average MAPE, indicating superior prediction performance. The proposed GMM demonstrates robust predictive capability and significant practical value, offering a more effective forecasting tool than the GM and GDE models. Furthermore, the EMC method is utilized to evaluate the reliability of the risk assessment. The findings of this study provide decision-makers with a reliable reference for electricity demand forecasting and risk management, thereby supporting more effective contract capacity planning.
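The accuracy metric used for the comparison is standard; a short sketch of the MAPE computation with invented demand figures:

import numpy as np

def mape(actual, forecast):
    # Mean Absolute Percentage Error, in percent
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100 * np.mean(np.abs((actual - forecast) / actual))

demand = [820, 870, 910, 950, 900, 880]    # hypothetical monthly peak demand (kW)
gmm_pred = [800, 880, 905, 935, 915, 870]  # hypothetical GMM predictions
print(round(mape(demand, gmm_pred), 2))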

17 pages, 341 KB  
Article
Inferences for the GKME Distribution Under Progressive Type-I Interval Censoring with Random Removals and Its Application to Survival Data
by Ela Verma, Mahmoud M. Abdelwahab, Sanjay Kumar Singh and Mustafa M. Hasaballah
Axioms 2025, 14(10), 769; https://doi.org/10.3390/axioms14100769 - 17 Oct 2025
Abstract
The analysis of lifetime data under censoring schemes plays a vital role in reliability studies and survival analysis, where complete information is often difficult to obtain. This work focuses on the estimation of the parameters of the recently proposed generalized Kavya–Manoharan exponential (GKME) distribution under progressive Type-I interval censoring, a censoring scheme that frequently arises in medical and industrial life-testing experiments. Estimation procedures are developed under both classical and Bayesian paradigms, providing a comprehensive framework for inference. In the Bayesian setting, parameter estimation is carried out using Markov Chain Monte Carlo (MCMC) techniques under two distinct loss functions: the squared error loss function (SELF) and the general entropy loss function (GELF). For interval estimation, asymptotic confidence intervals as well as highest posterior density (HPD) credible intervals are constructed. The performance of the proposed estimators is systematically evaluated through a Monte Carlo simulation study in terms of mean squared error (MSE) and the average lengths of the interval estimates. The practical usefulness of the developed methodology is further demonstrated through the analysis of a real dataset on survival times of guinea pigs exposed to virulent tubercle bacilli. The findings indicate that the proposed methods provide flexible and efficient tools for analyzing progressively interval-censored lifetime data.

21 pages, 2630 KB  
Article
Hierarchical Markov Chain Monte Carlo Framework for Spatiotemporal EV Charging Load Forecasting
by Xuehan Zheng, Yalun Zhu, Ming Wang, Bo Lv and Yisheng Lv
Appl. Sci. 2025, 15(20), 11094; https://doi.org/10.3390/app152011094 - 16 Oct 2025
Abstract
With the advancement of battery technology and the promotion of the “dual carbon” policy, electric vehicles (EVs) have been widely used in industrial, commercial, and civil fields, and charging infrastructure in highway service areas across the country has developed rapidly. However, the charging load of EVs in highway scenarios exhibits strong randomness and uncertainty: it is affected by multiple factors such as traffic flow, state of charge (SOC), and user charging behavior, and is difficult to model accurately with traditional mathematical models. This paper proposes a hierarchical Markov chain Monte Carlo (HMMC) simulation method to construct a charging load prediction model with spatiotemporal coupling characteristics. The model represents features such as traffic flow, SOC, and charging behavior in separate layers to reduce interference between dimensions; by constructing a Markov chain that converges to the target distribution and an inter-layer transfer mechanism, the load change process is deduced layer by layer, yielding more accurate charging load predictions. Comparative experiments with mainstream methods such as ARIMA, BP neural networks, random forests, and LSTM show that the HMMC model achieves higher prediction accuracy in highway scenarios, significantly reduces prediction errors, and improves model stability and interpretability.

26 pages, 11786 KB  
Article
Quantification of Multi-Source Road Emissions in an Urban Environment Using Inverse Methods
by Panagiotis Gkirmpas, George Tsegas, Giannis Ioannidis, Paul Tremper, Till Riedel, Eleftherios Chourdakis, Christos Vlachokostas and Nicolas Moussiopoulos
Atmosphere 2025, 16(10), 1184; https://doi.org/10.3390/atmos16101184 - 14 Oct 2025
Abstract
The spatial quantification of multiple sources within the urban environment is crucial for understanding urban air quality and implementing measures to mitigate air pollution levels. At the same time, emissions from road traffic contribute significantly to these concentrations. However, uncertainties arise when assessing the contribution of multiple sources affecting a single receptor. This study aims to evaluate an inverse dispersion modelling methodology that combines Computational Fluid Dynamics (CFD) simulations with the Metropolis–Hastings Markov Chain Monte Carlo (MCMC) algorithm to quantify multiple traffic emissions at the street scale. This approach relies solely on observational data and prior information on each source’s emission rate range and is tested within the Augsburg city centre. To address the absence of extensive measurement data of a real pollutant correlated with traffic emissions, a synthetic observational dataset of a theoretical pollutant, treated as a passive scalar, was generated from the forward dispersion model, with added Gaussian noise. Furthermore, a sensitivity analysis also explores the influence of sensor configuration and prior information on the accuracy of the emission estimates. The results indicate that, when the potential emission rate range is narrow, high-quality predictions can be achieved (ratio between true and estimated release rates, Δq ≤ 2) even with networks using data from only 10 sensors. In contrast, expanding the allowable emission range leads to reduced accuracy (2 ≤ Δq ≤ 6), particularly in networks with fewer than 50 sensors. Further research is recommended to assess the methodology’s performance using real-world measurements.
(This article belongs to the Section Air Quality)
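A stripped-down sketch of this inverse setup, assuming a linear source-receptor relationship (the paper obtains it from CFD), synthetic observations with Gaussian noise, a bounded flat prior on the emission rates, and a random-walk Metropolis-Hastings sampler; all dimensions and scales are illustrative:

import numpy as np

rng = np.random.default_rng(0)
n_src, n_sens = 3, 10
A = rng.uniform(0.1, 1.0, size=(n_sens, n_src))  # stand-in source-receptor matrix
q_true = np.array([2.0, 5.0, 1.0])
obs = A @ q_true + rng.normal(0, 0.1, n_sens)    # synthetic observations, Gaussian noise
lo, hi, sigma = 0.0, 10.0, 0.1                   # prior bounds and noise scale (illustrative)

def log_post(q):
    if np.any(q < lo) or np.any(q > hi):         # flat prior on the emission-rate box
        return -np.inf
    return -0.5 * np.sum((obs - A @ q)**2) / sigma**2

q, chain = np.full(n_src, 5.0), []
for _ in range(30000):
    prop = q + 0.05 * rng.standard_normal(n_src)  # random-walk Metropolis step
    if np.log(rng.uniform()) < log_post(prop) - log_post(q):
        q = prop
    chain.append(q)
print(np.array(chain)[10000:].mean(axis=0), q_true)

Narrowing the prior box (lo, hi) around the true rates is the sketch's analogue of the narrow emission-rate range that the paper finds necessary for sparse sensor networks.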

17 pages, 6312 KB  
Article
Black–Litterman Portfolio Optimization with Dynamic CAPM via ABC-MCMC
by Sebastián Flández, Rolando Rubilar-Torrealba, Karime Chahuán-Jiménez, Hanns de la Fuente-Mella and Claudio Elórtegui-Gómez
Mathematics 2025, 13(20), 3265; https://doi.org/10.3390/math13203265 - 12 Oct 2025
Abstract
The present research proposes a methodology for portfolio construction that integrates the Black–Litterman model with expected returns generated through simulations under a dynamic Capital Asset Pricing Model (CAPM) with conditional betas, estimated via Approximate Bayesian Computation Markov Chain Monte Carlo (ABC-MCMC). Bayesian estimation enables the incorporation of volatility regimes and the adjustment of each asset’s sensitivity to the market, thereby delivering expected returns that reflect the structural state of the assets more accurately than historical methods. This strategy is applied to the United States stock market, and the results suggest that the Black–Litterman portfolio performs competitively against portfolios optimised using the classic Markowitz model, even while maintaining the same fixed weights throughout the month. Specifically, it outperforms the minimum variance portfolio in cumulative return and attains a Sharpe ratio approaching that of the Markowitz maximum-Sharpe portfolio, although with a distinct and more concentrated asset allocation. While the maximum return portfolio attains the highest absolute profit, it does so at the expense of significantly higher volatility.
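To illustrate the ABC-MCMC idea in miniature (this is not the authors' estimator), the sketch below recovers a single static beta: a proposed beta is accepted when summary statistics of returns simulated under it fall within a tolerance eps of the observed summaries; the return series, summaries, and tolerance are all invented:

import numpy as np

rng = np.random.default_rng(5)
r_m = rng.normal(0.0, 0.01, 250)                   # synthetic market returns
beta_true = 1.2
r_i = beta_true * r_m + rng.normal(0, 0.005, 250)  # synthetic asset returns

def summary(r):
    return np.array([r.mean(), r.std(), np.corrcoef(r, r_m)[0, 1]])

s_obs, eps = summary(r_i), 0.02
beta, chain = 1.0, []
for _ in range(20000):
    prop = beta + 0.05 * rng.standard_normal()     # random walk, flat prior
    sim = prop * r_m + rng.normal(0, 0.005, 250)   # simulate from the model at `prop`
    if np.linalg.norm(summary(sim) - s_obs) < eps:  # ABC acceptance: summaries within eps
        beta = prop
    chain.append(beta)
print(np.mean(chain[5000:]))

The key point is that no likelihood is evaluated: closeness of simulated to observed summaries replaces it, which is what makes the approach workable for the conditional-beta dynamics the paper estimates.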

22 pages, 2007 KB  
Article
A Joint Diagnosis Model Using Response Time and Accuracy for Online Learning Assessment
by Xia Li, Yuxia Chen, Huali Yang and Jing Geng
Electronics 2025, 14(19), 3873; https://doi.org/10.3390/electronics14193873 - 29 Sep 2025
Abstract
Cognitive diagnosis models (CDMs) assess the proficiency of examinees in specific skills. Online education has increased the amount of data available on the response behaviour of examinees. Traditional CDMs determine the state of skills by modelling item response results while ignoring vital response time information. In this study, a CDM named RT-CDM is proposed, which models the conditional dependence between response time and response accuracy based on the speed–accuracy exchange criterion. Its continuous latent trait function and response time function, used for more precise cognitive analyses, make it a tractable, interpretable skill diagnosis model. The Markov chain Monte Carlo algorithm is used to estimate the parameters of the RT-CDM. We evaluate RT-CDM through controlled simulations and three real datasets—PISA 2015 computer-based mathematics, EdNet-KT1, and MATH—against multiple baselines, including classical CDMs (e.g., DINA/IRT), RT-extended IRT and joint models (e.g., 4P-IRT, JRT-DINA), and neural CDMs (e.g., NCD, ICD, MFNCD). Across datasets, RT-CDM consistently achieves superior predictive performance, demonstrates stable parameter recovery in simulations, and delivers stronger diagnostic interpretability by leveraging response time alongside response accuracy.

22 pages, 2815 KB  
Article
Optimization of Pavement Maintenance Planning in Cambodia Using a Probabilistic Model and Genetic Algorithm
by Nut Sovanneth, Felix Obunguta, Kotaro Sasai and Kiyoyuki Kaito
Infrastructures 2025, 10(10), 261; https://doi.org/10.3390/infrastructures10100261 - 29 Sep 2025
Abstract
Optimizing pavement maintenance and rehabilitation (M&R) strategies is essential, especially in developing countries with limited budgets. This study presents an integrated framework combining a deterioration prediction model and a genetic algorithm (GA)-based optimization model to plan cost-effective M&R strategies for flexible pavements, including asphalt concrete (AC) and double bituminous surface treatment (DBST). The GA schedules multi-year interventions by accounting for varied deterioration rates and budget constraints to maximize pavement performance. The optimization process involves generating a population of candidate solutions representing sets of road sections selected for maintenance, followed by fitness evaluation and solution evolution. A mixed Markov hazard (MMH) model is used to capture uncertainty in pavement deterioration, simulating condition transitions influenced by pavement bearing capacity, traffic load, and environmental factors. The MMH model employs an exponential hazard function and Bayesian inference via Markov Chain Monte Carlo (MCMC) to estimate deterioration rates and life expectancies. A case study on Cambodia’s road network evaluates six budget scenarios (USD 12–27 million) over a 10-year period, identifying the USD 18 million budget as the most effective. The framework enables road agencies to assess maintenance strategies under various financial and performance conditions, supporting data-driven, sustainable infrastructure management and optimal fund allocation.
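For intuition on the hazard component: with an exponential hazard, the sojourn time in condition state i is exponential with rate theta_i, so the probability of remaining in state i across an inspection interval dt is exp(-theta_i * dt), and the expected time to reach the worst state is the sum of the mean sojourn times 1/theta_i. A minimal sketch with invented rates:

import numpy as np

theta = np.array([0.20, 0.35, 0.50])  # hypothetical hazard rates for condition states 1-3
dt = 1.0                              # inspection interval (years)

p_stay = np.exp(-theta * dt)          # P(still in state i after one interval)
print(p_stay)
print((1.0 / theta).sum(), "years")   # expected life: sum of mean sojourn times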

18 pages, 3750 KB  
Article
Optimal Guidance Mechanism for EV Charging Behavior and Its Impact Assessment on Distribution Network Hosting Capacity
by Xin Yang, Fan Zhou, Ran Xu, Yalin Zhong, Jingjing Yu and Hejun Yang
Processes 2025, 13(10), 3107; https://doi.org/10.3390/pr13103107 - 28 Sep 2025
Abstract
With the rapid growth in the penetration of Electric Vehicles (EVs), their large-scale uncoordinated charging behavior presents significant challenges to the hosting capacity of traditional distribution networks (DNs). The novelty of this paper lies in its methodology, which integrates a Markov Chain Monte Carlo (MCMC) method for realistic load profiling with a bi-level optimization framework for Time-of-Use (TOU) pricing, whose effectiveness is then rigorously evaluated through an Optimal Power Flow (OPF)-based assessment of the grid’s hosting capacity. First, to compensate for the limitations of historical data, the MCMC method is employed to simulate the uncoordinated charging process of a large-scale EV fleet. Second, a bi-level optimization model is constructed: its upper level formulates a globally optimal TOU tariff that maximizes charging cost savings for EV users, while its lower level simulates the optimal economic response of the EV user population. Finally, the change in the minimum daily hosting capacity is calculated based on the OPF. Case study simulations for IEEE 33-bus and IEEE 69-bus systems demonstrate that the proposed model effectively shifts charging loads to off-peak hours, achieving stable user cost savings of 20.95%. More importantly, the findings reveal substantial security benefits from this economic strategy, validated across diverse network topologies. In the 33-bus system, the minimum daily capacity enhancement ranged from 174.63% for the most vulnerable node to 2.44% for the strongest node. In the 69-bus system, vulnerable nodes still achieved a significant 78.62% improvement. This finding highlights the limitations of purely economic assessments and underscores the necessity of the proposed integrated framework for achieving precise, location-dependent security planning.
(This article belongs to the Section Energy Systems)

26 pages, 5202 KB  
Article
Time-Varying Bivariate Modeling for Predicting Hydrometeorological Trends in Jakarta Using Rainfall and Air Temperature Data
by Suci Nur Setyawati, Sri Nurdiati, I Wayan Mangku, Ionel Haidu and Mohamad Khoirun Najib
Hydrology 2025, 12(10), 252; https://doi.org/10.3390/hydrology12100252 - 26 Sep 2025
Abstract
Changes in rainfall patterns and irregular air temperature have become essential issues in analyzing hydrometeorological trends in Jakarta. This study aims to select the best copula among stationary and non-stationary copula models and to visualize and explore the relationship between rainfall and air temperature in order to predict hydrometeorological trends. The methods used include combining univariate Lognormal and Generalized Extreme Value (GEV) distributions with Clayton, Gumbel, and Frank copulas, as well as parameter estimation using the fminsearch algorithm, Markov Chain Monte Carlo (MCMC) simulation, and a combination of both. The results show that the best model is the non-stationary Clayton copula estimated using MCMC simulation, which has the lowest Akaike Information Criterion (AIC) value. This model effectively captures extreme dependence in the lower tail of the distribution, indicating a potential increase in extreme low events such as cold droughts. Visualization of the best model through contour plots shows the center of the distribution shifting over time. This study contributes to developing dynamic hydrometeorological models for adaptation planning under changing hydrometeorological trends in Indonesia.
(This article belongs to the Special Issue Trends and Variations in Hydroclimatic Variables: 2nd Edition)
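A compact sketch of the copula-fitting step, assuming a stationary Clayton copula fitted to synthetic pseudo-observations by maximizing the log-likelihood over a grid and scoring it with AIC; the time-varying parameter and the MCMC estimation used in the paper are omitted:

import numpy as np

def clayton_loglik(theta, u, v):
    # Log-density of the Clayton copula, theta > 0
    return np.sum(np.log(1 + theta) - (1 + theta) * (np.log(u) + np.log(v))
                  - (2 + 1 / theta) * np.log(u**(-theta) + v**(-theta) - 1))

# Synthetic pseudo-observations (in practice: fitted marginal CDFs of rainfall
# and temperature), drawn via the Clayton conditional inversion
rng = np.random.default_rng(11)
n, theta_true = 500, 2.0
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = ((w**(-theta_true / (1 + theta_true)) - 1) * u**(-theta_true) + 1)**(-1 / theta_true)

thetas = np.linspace(0.1, 5, 200)
ll = np.array([clayton_loglik(t, u, v) for t in thetas])
aic = 2 * 1 - 2 * ll.max()           # AIC with one free parameter
print(thetas[ll.argmax()], aic)

Model selection as described in the abstract then amounts to repeating this for each candidate copula and keeping the one with the lowest AIC.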
