Search Results (12)

Search Parameters:
Keywords = Quasi-Monte Carlo (QMC)

22 pages, 2678 KiB  
Article
Federated Semi-Supervised Learning with Uniform Random and Lattice-Based Client Sampling
by Mei Zhang and Feng Yang
Entropy 2025, 27(8), 804; https://doi.org/10.3390/e27080804 - 28 Jul 2025
Viewed by 225
Abstract
Federated semi-supervised learning (Fed-SSL) has emerged as a powerful framework that leverages both labeled and unlabeled data distributed across clients. To reduce communication overhead, real-world deployments often adopt partial client participation, where only a subset of clients is selected in each round. However, under non-i.i.d. data distributions, the choice of client sampling strategy becomes critical, as it significantly affects training stability and final model performance. To address this challenge, we propose a novel federated averaging semi-supervised learning algorithm, called FedAvg-SSL, that considers two sampling approaches, uniform random sampling (standard Monte Carlo) and a structured lattice-based sampling, inspired by quasi-Monte Carlo (QMC) techniques, which ensures more balanced client participation through structured deterministic selection. On the client side, each selected participant alternates between updating the global model and refining the pseudo-label model using local data. We provide a rigorous convergence analysis, showing that FedAvg-SSL achieves a sublinear convergence rate with linear speedup. Extensive experiments not only validate our theoretical findings but also demonstrate the advantages of lattice-based sampling in federated learning, offering insights into the interplay among algorithm performance, client participation rates, local update steps, and sampling strategies. Full article
(This article belongs to the Special Issue Number Theoretic Methods in Statistics: Theory and Applications)
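The contrast between uniform random and lattice-style client selection can be sketched in a few lines. The scheme below is a hypothetical illustration (an equally spaced coset of clients whose offset drifts along a golden-ratio Kronecker sequence), not necessarily the exact lattice construction used by FedAvg-SSL, but it shows how structured deterministic selection keeps participation counts balanced:

```python
import math
import random

def uniform_sampling(num_clients, m, rounds, seed=0):
    """Standard Monte Carlo: draw m distinct clients uniformly each round."""
    rng = random.Random(seed)
    return [rng.sample(range(num_clients), m) for _ in range(rounds)]

def lattice_sampling(num_clients, m, rounds):
    """Hypothetical lattice-style selection: in round t, pick an equally
    spaced coset of m clients whose offset follows a golden-ratio
    (Kronecker) sequence, giving balanced deterministic participation."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0  # fractional golden ratio
    stride = num_clients // m
    schedule = []
    for t in range(rounds):
        offset = int(stride * ((t * phi) % 1.0))  # drifts over 0..stride-1
        schedule.append([(offset + j * stride) % num_clients for j in range(m)])
    return schedule

def participation_spread(schedule, num_clients):
    """Max minus min participation count across clients (smaller = fairer)."""
    counts = [0] * num_clients
    for chosen in schedule:
        for c in chosen:
            counts[c] += 1
    return max(counts) - min(counts)
```

With 100 clients, 10 selected per round, and 200 rounds, the lattice schedule keeps every client's participation count within a narrow band, while uniform sampling shows a visibly wider spread.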

25 pages, 362 KiB  
Article
Cutting-Edge Stochastic Approach: Efficient Monte Carlo Algorithms with Applications to Sensitivity Analysis
by Ivan Dimov and Rayna Georgieva
Algorithms 2025, 18(5), 252; https://doi.org/10.3390/a18050252 - 27 Apr 2025
Viewed by 555
Abstract
Many important practical problems connected to energy efficiency in buildings, ecology, metallurgy, the development of wireless communication systems, the optimization of radar technology, quantum computing, pharmacology, and seismology are described by large-scale mathematical models that are typically represented by systems of partial differential equations. Such systems often involve numerous input parameters. It is crucial to understand how susceptible the solutions are to uncontrolled variations or uncertainties within these input parameters. This knowledge helps in identifying critical factors that significantly influence the model’s outcomes and can guide efforts to improve the accuracy and reliability of predictions. Sensitivity analysis (SA) is a method used efficiently to assess the sensitivity of the output results from large-scale mathematical models to uncertainties in their input data. By performing SA, we can better manage risks associated with uncertain inputs and make more informed decisions based on the model’s outputs. In recent years, researchers have developed advanced algorithms based on the analysis of variance (ANOVA) technique for computing numerical sensitivity indicators. These methods have also incorporated computationally efficient Monte Carlo integration techniques. This paper presents a comprehensive theoretical and experimental investigation of Monte Carlo algorithms based on “symmetrized shaking” of Sobol’s quasi-random sequences. The theoretical proof demonstrates that these algorithms exhibit an optimal rate of convergence for functions with continuous and bounded first derivatives and for functions with continuous and bounded second derivatives, respectively, both in terms of probability and mean square error. For the purposes of numerical study, these approaches were successfully applied to a particular problem. A specialized software tool for the global sensitivity analysis of an air pollution mathematical model was developed. 
Sensitivity analyses were conducted for several important air pollutant levels computed with a large-scale mathematical model describing the long-distance transport of air pollutants, the Unified Danish Eulerian Model (UNI-DEM). The sensitivity of the model was explored with a focus on two distinct categories of key input parameters: chemical reaction rates and input emissions. To validate the theoretical findings and study the applicability of the algorithms across diverse problem classes, extensive numerical experiments were conducted to calculate the main sensitivity indicators, Sobol' global sensitivity indices. Various numerical integration algorithms were employed to this end: Monte Carlo, quasi-Monte Carlo (QMC), scrambled quasi-Monte Carlo methods based on Sobol' sequences, and the sensitivity analysis approach implemented in the SIMLAB software. An essential task that arose during the study was the accurate estimation of sensitivity measures that are small in value; reliable predictions based on the model require numerical integration approaches of higher accuracy for these small indices. Both the analysis and the numerical results highlight the advantages of one of the proposed approaches in terms of accuracy and efficiency, particularly for relatively small sensitivity indices.
(This article belongs to the Section Algorithms for Multidisciplinary Applications)
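The Sobol' indices at the heart of such studies can be estimated with a pick-freeze scheme driven by a Sobol' sequence. Below is a minimal sketch for a toy test function f(x1, x2) = x1 + 2·x2 on the unit square, whose first-order index for x1 is analytically 1/5; this illustrates the generic Saltelli-type estimator with SciPy's QMC generators, not the authors' UNI-DEM code:

```python
import numpy as np
from scipy.stats import qmc

def first_order_sobol_index(f, d, i, n=4096, seed=0):
    """Estimate the first-order Sobol' index S_i of f on [0,1]^d with a
    pick-freeze estimator driven by a scrambled Sobol' sequence."""
    # Draw one 2d-dimensional Sobol' sample and split it into two blocks.
    X = qmc.Sobol(d=2 * d, scramble=True, seed=seed).random(n)
    A, B = X[:, :d], X[:, d:]
    AB = A.copy()
    AB[:, i] = B[:, i]              # vary only input x_i between A and AB
    fA, fB, fAB = f(A), f(B), f(AB)
    Vi = np.mean(fB * (fAB - fA))   # Saltelli-type first-order estimator
    return Vi / np.var(fA, ddof=1)

# Toy model: Var = 5/12, variance due to x1 alone = 1/12, so S_1 = 0.2.
f = lambda X: X[:, 0] + 2.0 * X[:, 1]
S1 = first_order_sobol_index(f, d=2, i=0)
```

The same routine applies to any vectorized model; the low-discrepancy sample typically reaches a given accuracy with far fewer model evaluations than plain Monte Carlo.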
24 pages, 1545 KiB  
Article
The Representative Points of Generalized Alpha Skew-t Distribution and Applications
by Yong-Feng Zhou, Yu-Xuan Lin, Kai-Tai Fang and Hong Yin
Entropy 2024, 26(11), 889; https://doi.org/10.3390/e26110889 - 22 Oct 2024
Cited by 1 | Viewed by 983
Abstract
Assuming the underlying statistical distribution of data is critical in information theory, as it impacts the accuracy and efficiency of communication and the definition of entropy. The real-world data are widely assumed to follow the normal distribution. To better comprehend the skewness of the data, many models more flexible than the normal distribution have been proposed, such as the generalized alpha skew-t (GAST) distribution. This paper studies some properties of the GAST distribution, including the calculation of the moments, and the relationship between the number of peaks and the GAST parameters with some proofs. For complex probability distributions, representative points (RPs) are useful due to the convenience of manipulation, computation and analysis. The relative entropy of two probability distributions could have been a good criterion for the purpose of generating RPs of a specific distribution but is not popularly used due to computational complexity. Hence, this paper only provides three ways to obtain RPs of the GAST distribution, Monte Carlo (MC), quasi-Monte Carlo (QMC), and mean square error (MSE). The three types of RPs are utilized in estimating moments and densities of the GAST distribution with known and unknown parameters. The MSE representative points perform the best among all case studies. For unknown parameter cases, a revised maximum likelihood estimation (MLE) method of parameter estimation is compared with the plain MLE method. It indicates that the revised MLE method is suitable for the GAST distribution having a unimodal or unobvious bimodal pattern. This paper includes two real-data applications in which the GAST model appears adaptable to various types of data. Full article
(This article belongs to the Special Issue Number Theoretic Methods in Statistics: Theory and Applications)
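Mean-square-error representative points (also called principal points) can be computed for a given density with Lloyd's algorithm. The sketch below uses the standard normal distribution as a stand-in for the GAST density (whose pdf and cdf would replace the normal formulas); for n = 2 the optimal points are known to be ±sqrt(2/π) ≈ ±0.798:

```python
import math

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mse_rep_points(n=2, iters=200):
    """Lloyd's algorithm for MSE representative points of N(0,1):
    alternate between Voronoi boundaries (midpoints of adjacent points)
    and cell centroids (truncated-normal means). For a GAST density,
    swap in its pdf/cdf and centroid formula."""
    pts = [-2.0 + 4.0 * (k + 0.5) / n for k in range(n)]  # crude init
    for _ in range(iters):
        mids = [(a + b) / 2.0 for a, b in zip(pts, pts[1:])]
        bounds = [-math.inf] + mids + [math.inf]
        new_pts = []
        for a, b in zip(bounds, bounds[1:]):
            pa = norm_pdf(a) if math.isfinite(a) else 0.0
            pb = norm_pdf(b) if math.isfinite(b) else 0.0
            ca = norm_cdf(a) if math.isfinite(a) else 0.0
            cb = norm_cdf(b) if math.isfinite(b) else 1.0
            new_pts.append((pa - pb) / (cb - ca))  # E[X | a < X < b]
        pts = new_pts
    return pts
```

The MC and QMC representative points mentioned in the abstract are cheaper to produce (random draws or inverse-CDF images of a low-discrepancy set), but as the paper reports, MSE points of this kind tend to estimate moments and densities best.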

23 pages, 7922 KiB  
Article
Groundwater LNAPL Contamination Source Identification Based on Stacking Ensemble Surrogate Model
by Yukun Bai, Wenxi Lu, Zibo Wang and Yaning Xu
Water 2024, 16(16), 2274; https://doi.org/10.3390/w16162274 - 12 Aug 2024
Cited by 3 | Viewed by 1578
Abstract
Groundwater LNAPL (Light Non-Aqueous Phase Liquid) contamination source identification (GLCSI) is essential for effective remediation and risk assessment. Addressing the GLCSI problem often involves numerous repetitive forward simulations, which are computationally expensive and time-consuming. Establishing a surrogate model for the simulation model is an effective way to overcome this challenge. However, how to obtain high-quality samples for training the surrogate model and which method should be used to develop the surrogate model with higher accuracy remain important questions to explore. To this end, this paper innovatively adopted the quasi-Monte Carlo (QMC) method to sample from the prior space of unknown variables. Then, this paper established a variety of individual machine learning surrogate models, respectively, and screened three with higher training accuracy among them as the base-learning models (BLMs). The Stacking ensemble framework was utilized to integrate the three BLMs to establish the ensemble surrogate model for the groundwater LNAPL multiphase flow numerical simulation model. Finally, a hypothetical case of groundwater LNAPL contamination was designed. After evaluating the accuracy of the Stacking ensemble surrogate model, the differential evolution Markov chain (DE-MC) algorithm was applied to jointly identify information on groundwater LNAPL contamination source and key hydrogeological parameters. The results of this study demonstrated the following: (1) Employing the QMC method to sample from the prior space resulted in more uniformly distributed and representative samples, which improved the quality of the training data. (2) The developed Stacking ensemble surrogate model had a higher accuracy than any individual surrogate model, with an average R2 of 0.995, and reduced the computational burden by 99.56% compared to the inversion process based on the simulation model. 
(3) The application of the DE-MC algorithm effectively solved the GLCSI problem, and the mean relative error of the identification results of unknown variables was less than 5%.

16 pages, 2925 KiB  
Article
Research on Seismic Connectivity Reliability Analysis of Water Distribution System Based on CUDA
by Li Long, Huaping Yang, Yan Zhou and Yong Yang
Water 2023, 15(11), 2087; https://doi.org/10.3390/w15112087 - 31 May 2023
Cited by 1 | Viewed by 1564
Abstract
To improve the seismic connectivity reliability (SCR) analysis efficiency of water distribution systems (WDS) based on Monte Carlo (MC) simulation, the quasi-Monte Carlo (QMC) method sampled by a low-discrepancy sequence is applied. Furthermore, a parallel algorithm combined with the breadth-first search algorithm for SCR analysis of WDS based on the QMC method and Compute Unified Device Architecture (CUDA) platform was proposed. A city WDS was taken as a computational example, the accuracy and efficiency of the traditional MC algorithm and parallel algorithm were compared, and the influence of the Sobol sequence and pseudo-random number sequence was analysed. The analysis results show that when 1,000,000 simulations are performed, the maximum error of the calculation results of the two methods is 0.2%, and the parallel method can obtain a six-fold speedup ratio compared with the serial method, indicating that the proposed parallel method is correct, meets the accuracy requirements, and helps to improve the SCR analysis efficiency. When the number of simulations is the same, the simulation results based on the Sobol sequence are more accurate than those based on the pseudo-random number sequence. The proposed parallel method also achieves a good acceleration effect in the SCR analysis of large-scale WDS. Full article
(This article belongs to the Section Urban Water Management)
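The core loop of QMC-based connectivity reliability is simple: draw a low-discrepancy point, threshold each coordinate against the corresponding component's survival probability, and run a breadth-first search on the surviving edges. A toy sketch on a made-up four-pipe network (source s, sink t, two intermediate nodes, each edge surviving independently with probability 0.9), for which the exact connectivity is 1 − (1 − 0.9²)² ≈ 0.9639:

```python
from collections import deque
from scipy.stats import qmc

EDGES = [("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")]  # toy network
P_SURVIVE = 0.9

def connected(edges, src, dst):
    """Breadth-first search over the surviving edges."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def qmc_connectivity(n=4096, seed=1):
    """Estimate P(s connected to t) with a scrambled Sobol' sequence:
    coordinate k below P_SURVIVE means edge k survives this trial."""
    U = qmc.Sobol(d=len(EDGES), scramble=True, seed=seed).random(n)
    hits = 0
    for u in U:
        alive = [e for e, x in zip(EDGES, u) if x < P_SURVIVE]
        hits += connected(alive, "s", "t")
    return hits / n
```

A real WDS study would use seismic fragility curves for the per-edge probabilities and, as in the paper, push the per-sample BFS onto the GPU; the estimator itself is unchanged.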

18 pages, 1454 KiB  
Article
Analysing Quantiles in Models of Forward Term Rates
by Thomas A. McWalter, Erik Schlögl and Jacques van Appel
Risks 2023, 11(2), 29; https://doi.org/10.3390/risks11020029 - 28 Jan 2023
Viewed by 1660
Abstract
The class of forward-LIBOR market models can, under certain volatility structures, produce unrealistically high long-dated forward rates, particularly for maturities and tenors beyond the liquid market calibration instruments. This paper presents a diagnostic tool for analysing the quantiles of distributions for forward term rates in a displaced lognormal forward-LIBOR model (DLFM). In particular, we provide a quantile approximation that can be used to assess whether the modelled term rates remain within realistic bounds with a high probability. Applying this diagnostic tool (verified using Quasi-Monte Carlo (QMC) simulations), we show that realised forward term rates for long time horizons may be kept within realistic limits by appropriately damping the tail of the DLFM volatility function. Full article

20 pages, 409 KiB  
Article
Convergence of Uniformity Criteria and the Application in Numerical Integration
by Yang Huang and Yongdao Zhou
Mathematics 2022, 10(19), 3717; https://doi.org/10.3390/math10193717 - 10 Oct 2022
Cited by 3 | Viewed by 1913
Abstract
Quasi-Monte Carlo (QMC) methods have been successfully used to estimate numerical integrals arising in many applications. Most QMC methods use low-discrepancy sequences, such as digital nets and lattice rules. In this paper, we derive the convergence rates of several improved discrepancies, such as the centered L2-discrepancy, the wrap-around L2-discrepancy, and the mixture discrepancy, and propose a randomized QMC method based on a uniform design constructed under the mixture discrepancy combined with Baker's transformation. Moreover, the numerical results show that the proposed method yields better approximations than the Monte Carlo method and many other QMC methods, especially when the number of dimensions is less than 10.
(This article belongs to the Special Issue Distribution Theory and Application)
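Baker's transformation φ(x) = 1 − |2x − 1| folds a randomly shifted point set so that smooth integrands behave as if periodized, which is what drives the improved convergence such randomized QMC methods report. A minimal sketch on a 2-d Fibonacci lattice (a stand-in for the paper's mixture-discrepancy uniform design) integrating exp(x + y), whose exact integral over the unit square is (e − 1)²:

```python
import math
import random

def baker(x):
    """Baker's (tent) transformation on [0, 1]."""
    return 1.0 - abs(2.0 * x - 1.0)

def fib_lattice(n, z):
    """Rank-1 lattice: points frac(i * z / n) for a generating vector z."""
    return [[(i * zj / n) % 1.0 for zj in z] for i in range(n)]

def rqmc_baker(f, n=233, z=(1, 144), seed=0):
    """Randomized QMC: apply a uniform random shift, then fold each
    coordinate with Baker's transformation before evaluating f."""
    rng = random.Random(seed)
    shift = [rng.random() for _ in z]
    total = 0.0
    for p in fib_lattice(n, z):
        q = [baker((x + s) % 1.0) for x, s in zip(p, shift)]
        total += f(q)
    return total / n

f = lambda q: math.exp(q[0] + q[1])
est = rqmc_baker(f)  # exact value: (e - 1)^2
```

Here n = 233 and z = (1, 144) are consecutive Fibonacci numbers, a classical 2-d generating vector; the paper's designs are chosen by minimizing the mixture discrepancy instead.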

23 pages, 5821 KiB  
Article
Probabilistic Storm Surge Estimation for Landfalling Hurricanes: Advancements in Computational Efficiency Using Quasi-Monte Carlo Techniques
by Aikaterini P. Kyprioti, Ehsan Adeli, Alexandros A. Taflanidis, Joannes J. Westerink and Hendrik L. Tolman
J. Mar. Sci. Eng. 2021, 9(12), 1322; https://doi.org/10.3390/jmse9121322 - 23 Nov 2021
Cited by 10 | Viewed by 2739
Abstract
During landfalling tropical storms, predictions of the expected storm surge are critical for guiding evacuation and emergency response/preparedness decisions, both at regional and national levels. Forecast errors related to storm track, intensity, and size impact these predictions and, thus, should be explicitly accounted for. The Probabilistic tropical storm Surge (P-Surge) model is the established approach from the National Weather Service (NWS) to achieve this objective. Historical forecast errors are utilized to specify probability distribution functions for different storm features, quantifying, ultimately, the uncertainty in the National Hurricane Center advisories. Surge statistics are estimated by using the predictions across a storm ensemble generated by sampling features from the aforementioned probability distribution functions. P-Surge relies, currently, on a full factorial sampling scheme to create this storm ensemble, combining representative values for each of the storm features. This work investigates an alternative formulation that can be viewed as a seamless extension to the current NHC framework, adopting a quasi-Monte Carlo (QMC) sampling implementation with ultimate goal to reduce the computational burden and provide surge predictions with the same degree of statistical reliability, while using a smaller number of sample storms. The definition of forecast errors adopted here directly follows published NWS practices, while different uncertainty levels are considered in the examined case studies, in order to offer a comprehensive validation. This validation, considering different historical storms, clearly demonstrates the advantages QMC can offer. Full article
(This article belongs to the Section Coastal Engineering)
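The QMC alternative to a full factorial storm ensemble amounts to pushing a low-discrepancy point set through the inverse CDFs of the forecast-error distributions. A schematic sketch with made-up error magnitudes (cross-track error ~ N(0, 50 nmi), intensity error ~ N(0, 10 kt)); the real P-Surge error models are more involved and follow published NWS practice:

```python
import numpy as np
from scipy.stats import norm, qmc

def qmc_storm_ensemble(n=2048, seed=3):
    """Build a storm ensemble by mapping a scrambled Sobol' point set
    through inverse CDFs of (hypothetical) forecast-error distributions."""
    U = qmc.Sobol(d=2, scramble=True, seed=seed).random(n)
    track_err = norm.ppf(U[:, 0], loc=0.0, scale=50.0)      # nmi, made up
    intensity_err = norm.ppf(U[:, 1], loc=0.0, scale=10.0)  # kt, made up
    return track_err, intensity_err

track, intensity = qmc_storm_ensemble()
```

Because the Sobol' points cover the error space far more evenly than random draws, the ensemble reproduces the target error statistics with many fewer member storms than a full factorial grid of comparable resolution, which is the computational saving the paper quantifies.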

15 pages, 9964 KiB  
Article
Identification of Efficient Sampling Techniques for Probabilistic Voltage Stability Analysis of Renewable-Rich Power Systems
by Mohammed Alzubaidi, Kazi N. Hasan, Lasantha Meegahapola and Mir Toufikur Rahman
Energies 2021, 14(8), 2328; https://doi.org/10.3390/en14082328 - 20 Apr 2021
Cited by 24 | Viewed by 3319
Abstract
This paper presents a comparative analysis of six sampling techniques to identify an efficient and accurate sampling technique to be applied to probabilistic voltage stability assessment in large-scale power systems. In this study, six different sampling techniques are investigated and compared to each other in terms of their accuracy and efficiency, including Monte Carlo (MC), three versions of Quasi-Monte Carlo (QMC), i.e., Sobol, Halton, and Latin Hypercube, Markov Chain MC (MCMC), and importance sampling (IS) technique, to evaluate their suitability for application with probabilistic voltage stability analysis in large-scale uncertain power systems. The coefficient of determination (R2) and root mean square error (RMSE) are calculated to measure the accuracy and the efficiency of the sampling techniques compared to each other. All the six sampling techniques provide more than 99% accuracy by producing a large number of wind speed random samples (8760 samples). In terms of efficiency, on the other hand, the three versions of QMC are the most efficient sampling techniques, providing more than 96% accuracy with only a small number of generated samples (150 samples) compared to other techniques. Full article
(This article belongs to the Special Issue Voltage Stability Analysis in Power Systems)
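The efficiency gap between the samplers is visible directly in the discrepancy of their point sets. A quick comparison with SciPy's `scipy.stats.qmc` module (centered L2-discrepancy; lower means more uniform), using a modest sample in 2 dimensions rather than the paper's wind-speed scenarios:

```python
import numpy as np
from scipy.stats import qmc

n, d, seed = 256, 2, 0
samples = {
    "random (MC)": np.random.default_rng(seed).random((n, d)),
    "Sobol": qmc.Sobol(d, scramble=True, seed=seed).random(n),
    "Halton": qmc.Halton(d, scramble=True, seed=seed).random(n),
    "LHS": qmc.LatinHypercube(d, seed=seed).random(n),
}
# Centered L2-discrepancy: a scalar measure of non-uniformity.
disc = {name: qmc.discrepancy(pts, method="CD") for name, pts in samples.items()}
```

The low-discrepancy sets score markedly lower than the pseudo-random sample at the same size, which mirrors the paper's finding that the QMC variants reach high accuracy with roughly 150 samples instead of thousands.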

30 pages, 1297 KiB  
Article
p-Refined Multilevel Quasi-Monte Carlo for Galerkin Finite Element Methods with Applications in Civil Engineering
by Philippe Blondeel, Pieterjan Robbe, Cédric Van hoorickx, Stijn François, Geert Lombaert and Stefan Vandewalle
Algorithms 2020, 13(5), 110; https://doi.org/10.3390/a13050110 - 28 Apr 2020
Cited by 7 | Viewed by 4319
Abstract
Civil engineering applications are often characterized by a large uncertainty on the material parameters. Discretization of the underlying equations is typically done by means of the Galerkin Finite Element method. The uncertain material parameter can be expressed as a random field represented by, for example, a Karhunen–Loève expansion. Computation of the stochastic responses, i.e., the expected value and variance of a chosen quantity of interest, remains very costly, even when state-of-the-art Multilevel Monte Carlo (MLMC) is used. A significant cost reduction can be achieved by using a recently developed multilevel method: p-refined Multilevel Quasi-Monte Carlo (p-MLQMC). This method is based on the idea of variance reduction by employing a hierarchical discretization of the problem based on a p-refinement scheme. It is combined with a rank-1 Quasi-Monte Carlo (QMC) lattice rule, which yields faster convergence compared to the use of random Monte Carlo points. In this work, we developed algorithms for the p-MLQMC method for two dimensional problems. The p-MLQMC method is first benchmarked on an academic beam problem. Finally, we use our algorithm for the assessment of the stability of slopes, a problem that arises in geotechnical engineering, and typically suffers from large parameter uncertainty. For both considered problems, we observe a very significant reduction in the amount of computational work with respect to MLMC. Full article
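The multilevel telescope behind p-MLQMC can be sketched on a toy problem. Here a trapezoidal quadrature of a 1-d integral stands in for the Galerkin FE solve (accuracy grows with the level), the uncertain material parameter is simply a ~ U(1, 2), and a randomly shifted 1-d rank-1 lattice replaces the paper's higher-dimensional lattice rule; the sample allocation across levels is illustrative, not optimized:

```python
import math
import random

def qoi(a, level):
    """Stand-in for an FE solve at a given discretization level:
    trapezoidal approximation of Q(a) = integral_0^1 exp(a*x) dx
    on 2**level subintervals (accuracy improves with level)."""
    n = 2 ** level
    h = 1.0 / n
    s = 0.5 * (1.0 + math.exp(a))
    for i in range(1, n):
        s += math.exp(a * i * h)
    return s * h

def lattice_points(n, shift):
    """1-d rank-1 lattice {i/n} with a uniform random shift (mod 1)."""
    return [((i / n) + shift) % 1.0 for i in range(n)]

def mlqmc(L=4, n0=1024, seed=0):
    """Multilevel QMC telescope: E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}],
    spending geometrically fewer lattice samples on the finer (costlier)
    levels, since the level differences have rapidly shrinking variance."""
    rng = random.Random(seed)
    est = 0.0
    for l in range(L + 1):
        n = max(8, n0 // 4 ** l)
        shift = rng.random()
        acc = 0.0
        for u in lattice_points(n, shift):
            a = 1.0 + u  # inverse CDF of U(1, 2)
            coarse = qoi(a, l - 1) if l > 0 else 0.0
            acc += qoi(a, l) - coarse
        est += acc / n
    return est
```

The real method replaces the level hierarchy with p-refined finite elements and the uniform parameter with a Karhunen–Loève random field, but the telescoping estimator has exactly this shape.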

15 pages, 4425 KiB  
Article
Distinct Recrystallization Pathways in a Cold-Rolled Al-2%Mg Alloy Evidenced by In-Situ Neutron Diffraction
by Grigoreta M. Stoica, Luc L. Dessieux, Alexandru D. Stoica, Sven C. Vogel, Govindarajan Muralidharan, Balasubramaniam Radhakrishnan, Sarma B. Gorti, Ke An, Dong Ma and Xun-Li Wang
Quantum Beam Sci. 2018, 2(3), 17; https://doi.org/10.3390/qubs2030017 - 18 Sep 2018
Cited by 4 | Viewed by 5272
Abstract
The time-of-flight neutron diffraction data collected in-situ on Oak Ridge National Laboratory’s (ORNL, Oak Ridge, TN, USA) VULCAN and Los Alamos National Laboratory’s (LANL, Los Alamos, NM, USA) High-Pressure-Preferred-Orientation (HIPPO) diffractometers have been analyzed complementarily to show the texture evolution during annealing of a cold-rolled Al-2%Mg alloy. The texture analysis aimed to identify the components present in the initial rolling (or deformation) texture and in the thermally-activated recrystallization texture, respectively. Using a quasi-Monte-Carlo (QMC) approach, a new method has been developed to simulate the weighted texture components, and to obtain inverse pole figures for both rolling and normal directions. As such, distinct recrystallization pathways during annealing in isochronal conditions, can be revealed in terms of the evolution of the texture components and their respective volume fractions. Moreover, the recrystallization kinetics associated with the cube and random texture components are analyzed quantitatively using a similar approach developed for differential scanning calorimetry (DSC). Full article
(This article belongs to the Special Issue Strain, Stress and Texture Analysis with Quantum Beams)

19 pages, 741 KiB  
Article
Risk-Based Probabilistic Voltage Stability Assessment in Uncertain Power System
by Weisi Deng, Buhan Zhang, Hongfa Ding and Hang Li
Energies 2017, 10(2), 180; https://doi.org/10.3390/en10020180 - 5 Feb 2017
Cited by 18 | Viewed by 5342
Abstract
Risk-based assessment is a new approach to voltage stability assessment in power systems. Under several sources of uncertainty, it can evaluate the security risk of static voltage stability with wind power taken into account. In this paper, we first build a probabilistic forecast model for wind power generation based on real historical data. We then propose a new probabilistic voltage stability approach based on Conditional Value-at-Risk (CVaR) and Quasi-Monte Carlo (QMC) simulation, where QMC speeds up Monte Carlo (MC) simulation by improving the sampling technique. The CVaR-based model reveals critical characteristics of static voltage stability: the distribution of the local voltage stability margin, which captures the security risk over a forecast operating time interval, is estimated to evaluate probabilistic voltage stability. Tested on the modified IEEE New England 39-bus system and the IEEE 118-bus system, the proposed method is compared against a conventional approach, and the test results demonstrate its effectiveness and advantages.
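CVaR itself reduces to a tail average, which QMC estimates efficiently because low-discrepancy points cover the tail evenly. A generic sketch for a standard normal "loss" (not the paper's voltage stability margin model), where the exact 95% CVaR is pdf(z_0.95)/0.05 ≈ 2.063:

```python
import numpy as np
from scipy.stats import norm, qmc

def cvar_qmc(alpha=0.95, n=4096, seed=2):
    """Estimate CVaR_alpha of a standard normal loss via a scrambled
    Sobol' sequence: map the uniforms through the inverse CDF, then
    average the worst (1 - alpha) fraction of outcomes."""
    u = qmc.Sobol(d=1, scramble=True, seed=seed).random(n).ravel()
    losses = np.sort(norm.ppf(u))
    k = int(round(n * (1.0 - alpha)))  # number of tail samples
    return losses[-k:].mean()

cvar = cvar_qmc()
```

In the paper's setting the `norm.ppf` mapping would be replaced by a run of the power-flow model on each sampled wind scenario, with CVaR taken over the resulting voltage stability margins.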
