Search Results (144)

Search Parameters:
Keywords = bayesian information update

29 pages, 12360 KB  
Article
Vision-Guided Dynamic Risk Assessment for Long-Span PC Continuous Rigid-Frame Bridge Construction Through DEMATEL–ISM–DBN Modelling
by Linlin Zhao, Qingfei Gao, Yidian Dong, Yajun Hou, Liangbo Sun and Wei Wang
Buildings 2025, 15(24), 4543; https://doi.org/10.3390/buildings15244543 - 16 Dec 2025
Abstract
In response to the challenges posed by the complex evolution of risks and the static nature of traditional assessment methods during the construction of long-span prestressed concrete (PC) continuous rigid-frame bridges, this study proposes a risk assessment framework that integrates visual perception with dynamic probabilistic reasoning. By combining an improved YOLOv8 model with the Decision-Making Trial and Evaluation Laboratory–Interpretive Structural Modeling (DEMATEL–ISM) algorithm, the framework achieves intelligent identification of risk elements and causal structure modelling. On this basis, a dynamic Bayesian network (DBN) is constructed, incorporating a sliding window and forgetting factor mechanism to enable adaptive updating of conditional probability tables. Using the Tongshun River Bridge as a case study, at the identification layer, we refine onsite targets into 14 risk elements (F1–F14). For visualization, these are aggregated into four categories—“Bridge, Person, Machine, Environment”—to enhance readability. In the methodology layer, leveraging a priori causal information provided by DEMATEL–ISM, risk elements are mapped to scenario probabilities, enabling scenario-level risk assessment and grading. This establishes a traceable closed-loop process from “elements” to “scenarios.” The results demonstrate that the proposed approach effectively identifies key risk chains within the “human–machine–environment–bridge” system, revealing phase-specific peaks in human-related risks and cumulative increases in structural and environmental risks. The particle filter and Monte Carlo prediction outputs generate short-term risk evolution curves with confidence intervals, facilitating the quantitative classification of risk levels. Overall, this vision-guided dynamic risk assessment method significantly enhances the real-time responsiveness, interpretability, and foresight of bridge construction safety management and provides a promising pathway for proactive risk control in complex engineering environments. Full article
(This article belongs to the Special Issue Big Data and Machine/Deep Learning in Construction)
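The sliding-window and forgetting-factor updating of conditional probability tables described in the abstract above can be sketched in a few lines. The class name, window length, and discount value here are illustrative assumptions, not the authors' implementation.

```python
from collections import deque

class AdaptiveCPT:
    """Toy conditional probability table for one binary child node with one
    binary parent, estimated from a sliding window of recent observations
    with a forgetting factor (structure and defaults are illustrative)."""

    def __init__(self, window=50, forgetting=0.95):
        self.window = deque(maxlen=window)   # recent (parent, child) pairs
        self.forgetting = forgetting

    def observe(self, parent_state, child_state):
        self.window.append((parent_state, child_state))

    def prob(self, child_state, parent_state):
        # Newer observations get weight forgetting**age (age 0 = newest),
        # so the CPT adapts as construction-phase risks evolve.
        num = den = 0.0
        for age, (p, c) in enumerate(reversed(self.window)):
            if p != parent_state:
                continue
            w = self.forgetting ** age
            den += w
            if c == child_state:
                num += w
        return num / den if den else 0.5     # uniform fallback with no data

cpt = AdaptiveCPT(window=10, forgetting=0.9)
for _ in range(5):
    cpt.observe(1, 1)        # risk factor present, adverse outcome observed
cpt.observe(1, 0)            # one recent contrary observation
p = cpt.prob(1, 1)           # discounted estimate, pulled down by the newest pair
```

Because the newest observation carries full weight, a single contrary data point shifts the estimate more than it would under plain counting, which is the intended adaptivity of the mechanism.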

22 pages, 594 KB  
Article
A Symmetric Bayesian Framework for Swarm Information Interaction and Collective Behavior Prediction
by Hui Shen, Peng Yu, Yonghui Yang, Chenyang Li and Xue-Bo Chen
Symmetry 2025, 17(12), 2091; https://doi.org/10.3390/sym17122091 - 5 Dec 2025
Viewed by 178
Abstract
This paper studies the information interaction process in Bayesian theorem-based swarm systems. Through theoretical analysis, model construction, and simulation experiments, it explores how Bayesian decision-making utilizes information cascades to update its state step by step in group information interaction. The system operates within a theoretical framework where an underlying symmetry governs the dynamic combination of prior knowledge, neighbor information, and target guidance, leading to spontaneous aggregation behavior similar to biological swarms. A key embodiment of this symmetry is the action–reaction force parity between agents, which ensures local stability. The simulation results show that groups with different prior information exhibit a multi-stage convergence characteristic, which reveals that within each iteration step, the agent adheres to the rules for information-symmetric communication and interaction. This dynamic behavior is a true reflection of natural biological populations and provides theoretical support for practical applications such as traffic management and robot collaboration. Full article
(This article belongs to the Section Computer)
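The step-by-step cascade described above can be illustrated with a minimal example: each agent starts from its neighbor's posterior (the shared piece of information) and folds in its own noisy private signal via Bayes' rule. The signal accuracy and signal sequence are invented for illustration, not taken from the paper.

```python
def posterior_up(prior, signal, acc=0.7):
    """Bayes update of P(target is 'up') from one binary signal whose
    accuracy `acc` is an illustrative assumption."""
    like_up = acc if signal == "up" else 1 - acc
    like_down = 1 - acc if signal == "up" else acc
    return like_up * prior / (like_up * prior + like_down * (1 - prior))

# Information cascade: each agent inherits its neighbor's posterior and
# adds its own private signal, one iteration step at a time.
belief = 0.5
beliefs = []
for private_signal in ["up", "up", "down", "up"]:
    belief = posterior_up(belief, private_signal)
    beliefs.append(belief)
```

Because the update is symmetric in log-odds, one "down" signal exactly cancels one "up" signal, so after the third step the belief returns to its value after a single "up" observation.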

36 pages, 5969 KB  
Article
Policy Credibility and Carbon Border Adjustments: A Dynamic Signaling Analysis
by Haoling Zhan, Shanqi Zhou and Tian Luo
Sustainability 2025, 17(23), 10843; https://doi.org/10.3390/su172310843 - 3 Dec 2025
Viewed by 293
Abstract
This study examines how information frictions in climate policy credibility shape carbon border adjustment mechanisms when trading partners cannot fully verify each other’s commitment to green industrial policies. A dynamic signaling framework models exporters’ policy commitment capacity as private information, incorporating Bayesian belief updating subject to signal noise and reputation decay. The analysis derives a Perfect Bayesian Equilibrium characterizing optimal CBAM tariff responses conditional on importers’ evolving credibility assessments. The calibrated model achieves strong empirical validation (R² = 0.884, explaining 88% of tariff variance; rank correlation ρ = 0.950), with Monte Carlo simulations demonstrating robust internal consistency (RMSE = 2.56 percentage points). Results identify a critical belief threshold (pₜ < 0.3) triggering hyperelastic tariff responses: below this credibility level, small perception declines generate disproportionately steep border tax increases (elasticity ≥ 2), trapping countries in high-tariff equilibria despite genuine commitment. Information frictions impose aggregate welfare losses equivalent to 30% of potential coordination gains, decomposed into four sources: information opacity (40%), cognitive biases in belief formation (27%), policy distortions induced by credibility concerns (20%), and reputation maintenance costs (13%). These quantitative patterns, while illustrative within the baseline calibration, motivate testable implications regarding elasticity asymmetries and credibility persistence. The framework identifies targeted policy interventions—third-party verification of commitment durability, rule-based tariff adjustment protocols, and institutional commitment devices—systematically prioritized by marginal welfare impact to guide beliefs away from credibility traps while maintaining environmental rigor. Full article
(This article belongs to the Section Economic and Business Aspects of Sustainability)
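The belief dynamics described above, Bayesian updating from noisy commitment signals plus reputation decay, can be sketched as a one-line recursion. All numbers (hit rate, false-alarm rate, decay rate, neutral anchor) are placeholders, not the paper's calibration.

```python
def update_belief(p, signal, hit=0.8, false_alarm=0.3, decay=0.05, anchor=0.5):
    """One round of belief updating about an exporter's policy commitment.

    p           current belief P(committed)
    signal      1 = green-policy signal observed this period, 0 = not observed
    hit         P(signal | committed); assumed value
    false_alarm P(signal | not committed); this is the signal noise
    decay       reputation decay: belief drifts back toward `anchor`
    """
    if signal:
        like_c, like_n = hit, false_alarm
    else:
        like_c, like_n = 1 - hit, 1 - false_alarm
    posterior = like_c * p / (like_c * p + like_n * (1 - p))  # Bayes' rule
    return (1 - decay) * posterior + decay * anchor           # reputation decay

belief = 0.5
for s in [1, 1, 0, 1]:          # an invented sequence of observed signals
    belief = update_belief(belief, s)
```

The decay term keeps credibility from locking in at 0 or 1, which is what allows the threshold behavior around a critical belief level to matter repeatedly over time.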

26 pages, 1468 KB  
Article
Integrated Bayesian Networks and Linear Programming for Decision Optimization
by Assel Abdildayeva, Assem Shayakhmetova and Galymzhan Baurzhanuly Nurtugan
Mathematics 2025, 13(23), 3749; https://doi.org/10.3390/math13233749 - 22 Nov 2025
Viewed by 537
Abstract
This paper develops a general BN → LP framework for decision optimization under complex, structured uncertainty. A Bayesian network encodes causal dependencies among drivers and yields posterior joint probabilities; a linear program then reads expected coefficients directly from BN marginals to optimize the objective under operational constraints with explicit risk control via chance constraints or small ambiguity sets centered at the BN posterior. This mapping avoids explicit scenario enumeration and separates feasibility from credibility, so extreme but implausible cases are down-weighted rather than dictating decisions. A farm-planning case with interacting factors (weather → disease → yield; demand ↔ price; input costs) demonstrates practical feasibility. Under matched risk control, the BN → LP approach maintains the target violation rate while avoiding the over-conservatism of flat robust optimization and the optimism of independence-based stochastic programming; it also circumvents the inner minimax machinery typical of distributionally robust optimization. Tractability is governed by BN inference over the decision-relevant ancestor subgraph; empirical scaling shows that Markov-blanket pruning, mutual-information screening of weak parents, and structured/low-rank CPDs yield orders-of-magnitude savings with negligible impact on the objective. A standardized, data-and-expert construction (Dirichlet smoothing) and a systematic sensitivity analysis identify high-leverage parameters, while a receding-horizon DBN → LP extension supports online updates. The method brings the largest benefits when uncertainty is high-dimensional and coupled, and it converges to classical allocations when drivers are few and essentially independent. Full article
(This article belongs to the Special Issue Decision Making and Optimization Under Uncertainty)
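The core BN → LP mapping, reading expected objective coefficients off BN marginals and then optimizing under constraints, can be sketched with a toy two-crop farm example. The network (a single weather → yield chain per crop), the probabilities, and the profits are invented, and a brute-force grid search stands in for an LP solver to keep the sketch self-contained.

```python
from itertools import product

# Tiny illustrative chain: weather -> yield, per crop. All numbers made up.
p_wet = 0.3
p_high_yield = {  # P(high yield | weather)
    "wheat":  {"wet": 0.9, "dry": 0.4},
    "barley": {"wet": 0.5, "dry": 0.7},
}
profit = {"wheat": {"high": 120, "low": 40}, "barley": {"high": 90, "low": 60}}

def expected_profit(crop):
    """Read the LP objective coefficient directly off the BN marginals:
    E[profit] = sum over weather/yield states of probability * profit."""
    total = 0.0
    for weather, pw in (("wet", p_wet), ("dry", 1 - p_wet)):
        ph = p_high_yield[crop][weather]
        total += pw * (ph * profit[crop]["high"] + (1 - ph) * profit[crop]["low"])
    return total

c = {crop: expected_profit(crop) for crop in profit}

# One-constraint "LP": maximize c.x subject to x_wheat + x_barley <= 100,
# x >= 0, solved here by brute force over a coarse grid of allocations.
best = max(
    ((w, b) for w, b in product(range(0, 101, 10), repeat=2) if w + b <= 100),
    key=lambda x: c["wheat"] * x[0] + c["barley"] * x[1],
)
```

Note how no scenario enumeration is needed: the weather uncertainty is collapsed into the expected coefficients before the optimizer ever runs, which is the separation of credibility from feasibility the abstract describes.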

26 pages, 1227 KB  
Article
Fractional-Order Black-Winged Kite Algorithm for Moving Target Search by UAV
by Li Lv, Lei Fu, Wenjing Xiao, Zhe Zhang, Tomas Wu and Jun Du
Fractal Fract. 2025, 9(11), 726; https://doi.org/10.3390/fractalfract9110726 - 10 Nov 2025
Viewed by 465
Abstract
The nonlocality (capable of associating target dynamics across multiple time moments) and memory properties (able to retain historical trajectories) of fractional calculus serve as the core theoretical approach to resolving the “dynamic information association deficiency” in UAV mobile target search. This paper proposes the Fractional-order Black-winged Kite Algorithm (FOBKA), which transforms the search problem into an adaptability function optimization model aimed at “maximizing target capture probability” based on Bayesian theory. Addressing the limitations of the standard Black-winged Kite Algorithm (BKA), the study incorporates fractional calculus theory for enhancement: a fractional-order operator is embedded in the migration behavior phase, leveraging the memory advantage of fractional orders to precisely capture the temporal span, spatial position, and velocity evolution of targets, thereby enhancing global detection capability and convergence accuracy. Simultaneously, population individuals are initialized using motion-encoding, and the attack behavior phase combines alternating updates with a Lévy flight mechanism to balance local exploration and global search performance. To validate FOBKA’s superiority, comparative experiments were conducted against eight newly proposed meta-heuristic algorithms across six distinct test scenarios. Experimental data demonstrate that FOBKA significantly outperforms the comparison algorithms in convergence accuracy, operational robustness, and target capture probability. Full article

45 pages, 750 KB  
Article
The Price Equation Reveals a Universal Force–Metric–Bias Law of Algorithmic Learning and Natural Selection
by Steven A. Frank
Entropy 2025, 27(11), 1129; https://doi.org/10.3390/e27111129 - 31 Oct 2025
Viewed by 536
Abstract
Diverse learning algorithms, optimization methods, and natural selection share a common mathematical structure despite their apparent differences. Here, I show that a simple notational partitioning of change by the Price equation reveals a universal force–metric–bias (FMB) law: Δθ = Mf + b + ξ. The force f drives improvement in parameters, Δθ, in proportion to the slope of performance with respect to the parameters. The metric M rescales movement by inverse curvature. The bias b adds momentum or changes in the frame of reference. The noise ξ enables exploration. This framework unifies natural selection, Bayesian updating, Newton’s method, stochastic gradient descent, stochastic Langevin dynamics, Adam optimization, and most other algorithms as special cases of the same underlying process. The Price equation also reveals why Fisher information, Kullback–Leibler divergence, and d’Alembert’s principle arise naturally in learning dynamics. By exposing this common structure, the FMB law provides a principled foundation for understanding, comparing, and designing learning algorithms across disciplines. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
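The FMB decomposition Δθ = Mf + b + ξ is easy to instantiate numerically: plain gradient ascent takes M as a scalar learning rate with b = ξ = 0, while Newton's method takes M as the inverse curvature. The 1-D quadratic objective below is an invented example, not one from the paper.

```python
# Performance slope ("force") for the 1-D objective U(theta) = -(theta - 3)**2,
# so f = dU/dtheta = -2 * (theta - 3). Curvature |U''| = 2 everywhere.
def force(theta):
    return -2.0 * (theta - 3.0)

def fmb_step(theta, f, M, b=0.0, xi=0.0):
    """One step of the force-metric-bias update: dtheta = M*f + b + xi."""
    return theta + M * f + b + xi

# Gradient ascent: metric M = learning rate, no bias, no noise.
theta = 0.0
for _ in range(50):
    theta = fmb_step(theta, force(theta), M=0.1)

# Newton's method: metric = inverse curvature 1/2, reaches the optimum in one step.
theta_newton = fmb_step(0.0, force(0.0), M=0.5)
```

Both updates drive θ toward the optimum at θ = 3; only the metric differs, which is the point of the unification. Momentum methods would reuse `b` to carry the previous step, and Langevin-style exploration would draw `xi` from a noise distribution.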

21 pages, 8490 KB  
Article
BDGS-SLAM: A Probabilistic 3D Gaussian Splatting Framework for Robust SLAM in Dynamic Environments
by Tianyu Yang, Shuangfeng Wei, Jingxuan Nan, Mingyang Li and Mingrui Li
Sensors 2025, 25(21), 6641; https://doi.org/10.3390/s25216641 - 30 Oct 2025
Viewed by 1771
Abstract
Simultaneous Localization and Mapping (SLAM) utilizes sensor data to concurrently construct environmental maps and estimate its own position, finding wide application in scenarios like robotic navigation and augmented reality. SLAM systems based on 3D Gaussian Splatting (3DGS) have garnered significant attention due to their real-time, high-fidelity rendering capabilities. However, in real-world environments containing dynamic objects, existing 3DGS-SLAM methods often suffer from mapping errors and tracking drift due to dynamic interference. To address this challenge, this paper proposes BDGS-SLAM—a Bayesian Dynamic Gaussian Splatting SLAM framework specifically designed for dynamic environments. During the tracking phase, the system integrates semantic detection results from YOLOv5 to build a dynamic prior probability model based on Bayesian filtering, enabling accurate identification of dynamic Gaussians. In the mapping phase, a multi-view probabilistic update mechanism is employed, which aggregates historical observation information from co-visible keyframes. By introducing an exponential decay factor to dynamically adjust weights, this mechanism effectively restores static Gaussians that were mistakenly culled. Furthermore, an adaptive dynamic Gaussian optimization strategy is proposed. This strategy applies penalizing constraints to suppress the negative impact of dynamic Gaussians on rendering while avoiding the erroneous removal of static Gaussians and ensuring the integrity of critical scene information. Experimental results demonstrate that, compared to baseline methods, BDGS-SLAM achieves comparable tracking accuracy while generating fewer artifacts in rendered results and realizing higher-fidelity scene reconstruction. Full article
(This article belongs to the Special Issue Indoor Localization Technologies and Applications)

12 pages, 2253 KB  
Article
Enhancing Migraine Trigger Surprisal Predictions: A Bayesian Approach to Establishing Prospective Expectations
by Dana P. Turner, Emily Caplis, Twinkle Patel and Timothy T. Houle
Entropy 2025, 27(11), 1102; https://doi.org/10.3390/e27111102 - 25 Oct 2025
Viewed by 700
Abstract
Prior work has demonstrated that higher surprisal, a measure quantifying the unexpectedness of a trigger exposure, predicts headache onset over 12 to 24 h. However, these analyses relied on retrospective expectations of trigger exposure formed after extended data collection. To operationalize surprisal prospectively, Bayesian methods could update expectations dynamically over time. The objective of this study was to extend the application of surprisal theory for predicting migraine attack risk by developing methods to estimate trigger variable likelihood in real time, under conditions of limited personal observation. In a prospective daily diary study of individuals with migraine (N = 104), data were collected over 28 days, including stress, sleep, and exercise exposures. Bayesian models were applied to estimate daily expectations for each variable under uninformative and empirical priors derived from the sample. Stress was modeled using a hurdle-Gamma distribution, sleep using discrete outcomes from a Normal distribution, and exercise using a Bernoulli distribution. Surprisal was calculated based on the predictive distribution at each time point and compared to static empirical surprisal values obtained after full data collection. Dynamic Bayesian surprisal values systematically differed from retrospective empirical estimates, particularly early in the observation period. Divergence was larger and more variable under uninformative priors but attenuated over time. Empirically informed priors produced more stable, lower-bias surprisal trajectories. Substantial individual variability was observed across exposure types, especially for exercise behavior. Prospective surprisal modeling is feasible but highly sensitive to prior specification, especially in sparse data contexts (e.g., a binary exposure). Incorporating empirical or individually informed priors may improve early model calibration, though individual learning remains essential. These methods offer a foundation for real-time headache forecasting and dynamic modeling of brain–environment interactions. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
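Prospective surprisal for a binary exposure under a conjugate prior can be sketched in a few lines: a Beta-Bernoulli model yields a posterior predictive probability at each diary day, and surprisal is its negative log. The priors and the diary sequence below are illustrative, not the study's empirical priors or data.

```python
import math

def surprisal_bits(p):
    """Surprisal (in bits) of an observed event with predictive probability p."""
    return -math.log2(p)

def run_diary(observations, alpha, beta):
    """Sequential Beta-Bernoulli surprisal for a binary exposure
    (e.g. exercised today, yes/no)."""
    out = []
    for x in observations:
        p_yes = alpha / (alpha + beta)          # posterior predictive P(x = 1)
        out.append(surprisal_bits(p_yes if x else 1 - p_yes))
        alpha += x                               # conjugate Bayesian update
        beta += 1 - x
    return out

diary = [0, 0, 0, 0, 1]                          # rare exposure on day 5
flat = run_diary(diary, alpha=1, beta=1)         # uninformative prior
informed = run_diary(diary, alpha=1, beta=4)     # prior encoding "exposure is rare"
```

With the flat prior, the very first typical (no-exposure) day already costs a full bit of surprisal, while the informed prior assigns it much less, which mirrors the abstract's finding that empirically informed priors give more stable early-period trajectories.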

30 pages, 852 KB  
Article
Bayesian Model Updating of Structural Parameters Using Temperature Variation Data: Simulation
by Ujjwal Adhikari and Young Hoon Kim
Machines 2025, 13(10), 899; https://doi.org/10.3390/machines13100899 - 1 Oct 2025
Viewed by 991
Abstract
Finite element (FE) models are widely used in structural health monitoring to represent real structures and assess their condition, but discrepancies often arise between numerical and actual structural behavior due to simplifying assumptions, uncertain parameters, and environmental influences. Temperature variation, in particular, significantly affects structural stiffness and modal properties, yet it is often treated as noise in traditional model updating methods. This study treats temperature changes as valuable information for model updating and structural damage quantification. The Bayesian model updating approach (BMUA) is a probabilistic method that updates uncertain model parameters by combining prior knowledge with measured data to estimate their posterior probability distributions. However, traditional BMUA methods assume mass is known and only update stiffness. A novel BMUA framework is proposed that incorporates thermal buckling and temperature-dependent stiffness estimation and introduces an algorithm to eliminate the coupling effect between mass and stiffness by using temperature-induced stiffness changes. This enables the simultaneous updating of both parameters. The framework is validated through numerical simulations on a three-story aluminum shear frame under uniform and non-uniform temperature distributions. Under healthy and uniform temperature conditions, stiffness parameters were estimated with high accuracy, with errors below 0.5% and within uncertainty bounds, while mass parameters exhibited errors up to 13.8% that exceeded their extremely low standard deviations, indicating potential model bias. Under non-uniform temperature distributions, accuracy declined, particularly for localized damage cases, with significant deviations in both parameters. Full article

12 pages, 1642 KB  
Article
A Bayesian Approach for Designing Experiments Based on Information Criteria to Reduce Epistemic Uncertainty of Fuel Fracture During Loss-of-Coolant Accidents
by Shusuke Hamaguchi, Takafumi Narukawa and Takashi Takata
J. Nucl. Eng. 2025, 6(3), 35; https://doi.org/10.3390/jne6030035 - 1 Sep 2025
Viewed by 952
Abstract
In probabilistic risk assessment (PRA), the fracture limit of fuel cladding tubes under loss-of-coolant accident conditions plays a critical role in determining the core damage, highlighting the need for accurate modeling of cladding tube fracture behavior. However, for high-burnup cladding tubes, it is often infeasible to conduct extensive experiments due to limited material availability, high costs, and technical constraints. These limitations make it difficult to acquire sufficient data, leading to substantial epistemic uncertainty in fracture modeling. To enhance the realism of PRA results under such constraints, it is essential to develop methods that can effectively reduce epistemic uncertainty using limited experimental data. In this study, we propose a Bayesian approach for designing experimental conditions based on a widely applicable information criterion (WAIC) in order to effectively reduce the uncertainty in the prediction of fuel cladding tube fracture with limited data. We conduct numerical experiments to evaluate the effectiveness of the proposed method in comparison with conventional approaches based on empirical loss and functional variance. Two cases are considered: one where the true and predictive models share the same mathematical structure (Case 1) and one where they differ (Case 2). In Case 1, the empirical loss-based design performs best when the number of added data points is fewer than approximately 10. In Case 2, the WAIC-based design consistently achieves the lowest Bayes generalization loss, demonstrating superior robustness in situations where the true model is unknown. These results indicate that the proposed method enables more informative experimental designs on average and contributes to the effective reduction in epistemic uncertainty in practical applications. Full article
(This article belongs to the Special Issue Probabilistic Safety Assessment and Management of Nuclear Facilities)

22 pages, 894 KB  
Article
Adaptive Knowledge Assessment via Symmetric Hierarchical Bayesian Neural Networks with Graph Symmetry-Aware Concept Dependencies
by Wenyang Cao, Nhu Tam Mai and Wenhe Liu
Symmetry 2025, 17(8), 1332; https://doi.org/10.3390/sym17081332 - 15 Aug 2025
Cited by 12 | Viewed by 1169
Abstract
Traditional educational assessment systems suffer from inefficient question selection strategies that fail to optimally probe student knowledge while requiring extensive testing time. We present a novel hierarchical probabilistic neural framework that integrates Bayesian inference with symmetric deep neural architectures to enable adaptive, efficient knowledge assessment. Our method models student knowledge as latent representations within a graph-structured concept dependency network, where probabilistic mastery states, updated through variational inference, are encoded by symmetric graph properties and symmetric concept representations that preserve structural equivalences across similar knowledge configurations. The system employs a symmetric dual-network architecture: a concept embedding network that learns scale-invariant hierarchical knowledge representations from assessment data and a question selection network that optimizes symmetric information gain through deep reinforcement learning with symmetric reward structures. We introduce a novel uncertainty-aware objective function that leverages symmetric uncertainty measures to balance exploration of uncertain knowledge regions with exploitation of informative question patterns. The hierarchical structure captures both fine-grained concept mastery and broader domain understanding through multi-scale graph convolutions that preserve local graph symmetries and global structural invariances. Our symmetric information-theoretic method ensures balanced assessment strategies that maintain diagnostic equivalence across isomorphic concept subgraphs. Experimental validation on large-scale educational datasets demonstrates that our method achieves 76.3% diagnostic accuracy while reducing the question count by 35.1% compared to traditional assessments. The learned concept embeddings reveal interpretable knowledge structures with symmetric dependency patterns that align with pedagogical theory. Our work generalizes across domains and student populations through symmetric transfer learning mechanisms, providing a principled framework for intelligent tutoring systems and adaptive testing platforms. The integration of probabilistic reasoning with symmetric neural pattern recognition offers a robust solution to the fundamental trade-off between assessment efficiency and diagnostic precision in educational technology. Full article
(This article belongs to the Special Issue Advances in Graph Theory Ⅱ)

16 pages, 1720 KB  
Article
The Maghreb as a Hotspot of Diversity for the Freshwater Crab Genus Potamon (Decapoda, Potamidae)
by Nesrine Rouabhi, Djaouida Bouchelouche, Luca Vecchioni, Youness Mabrouki, Fouzi Abdelkhaleq Taybi, Federico Marrone and Francesco Paolo Faraone
Diversity 2025, 17(8), 562; https://doi.org/10.3390/d17080562 - 10 Aug 2025
Cited by 1 | Viewed by 1065
Abstract
The Maghreb region of North Africa, located at the intersection of the Palaearctic and Afrotropical zones, is a biodiversity hotspot for terrestrial and freshwater taxa, including the freshwater crabs of the genus Potamon Savigny, 1816. Recent molecular studies have suggested the presence of two distinct Potamon species in the region: Potamon algeriense Bott, 1967, and an as-yet undescribed taxon, Potamon sp. However, comprehensive data on their distribution, genetic structure, and conservation status are still lacking. In the present study, we integrate new field collections from Algeria and Morocco (2021–2023) with molecular analyses of mitochondrial (COI, ND1) and nuclear (28S rDNA) markers to assess species boundaries and genetic diversity within Potamon across the Maghreb. Phylogenetic reconstructions based on Maximum Likelihood and Bayesian Inference consistently support the presence of two well-differentiated Potamon lineages in the region, corresponding to P. algeriense in western and central Maghreb, and Potamon sp. in eastern Algeria and Tunisia. While Potamon sp. exhibits low intra-specific genetic variation, P. algeriense displays a deeply structured mitochondrial lineage composition, forming four geographically coherent subclades, each corresponding to distinct hydrological regions. In light of this, it would be advisable to revise the IUCN assessment to include both species and updated information on their distribution. Full article

34 pages, 1156 KB  
Systematic Review
Mathematical Modelling and Optimization Methods in Geomechanically Informed Blast Design: A Systematic Literature Review
by Fabian Leon, Luis Rojas, Alvaro Peña, Paola Moraga, Pedro Robles, Blanca Gana and Jose García
Mathematics 2025, 13(15), 2456; https://doi.org/10.3390/math13152456 - 30 Jul 2025
Cited by 1 | Viewed by 1400
Abstract
Background: Rock–blast design is a canonical inverse problem that joins elastodynamic partial differential equations (PDEs), fracture mechanics, and stochastic heterogeneity. Objective: Guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol, a systematic review of mathematical methods for geomechanically informed blast modelling and optimisation is provided. Methods: A Scopus–Web of Science search (2000–2025) retrieved 2415 records; semantic filtering and expert screening reduced the corpus to 97 studies. Topic modelling with Bidirectional Encoder Representations from Transformers Topic (BERTopic) and bibliometrics organised them into (i) finite-element and finite–discrete element simulations, including arbitrary Lagrangian–Eulerian (ALE) formulations; (ii) geomechanics-enhanced empirical laws; and (iii) machine-learning surrogates and multi-objective optimisers. Results: High-fidelity simulations delimit blast-induced damage with ≤0.2 m mean absolute error; extensions of the Kuznetsov–Ram equation cut median-size mean absolute percentage error (MAPE) from 27% to 15%; Gaussian-process and ensemble learners reach a coefficient of determination (R² > 0.95) while providing closed-form uncertainty; Pareto optimisers lower peak particle velocity (PPV) by up to 48% without productivity loss. Synthesis: Four themes emerge—surrogate-assisted PDE-constrained optimisation, probabilistic domain adaptation, Bayesian model fusion for digital-twin updating, and entropy-based energy metrics. Conclusions: Persisting challenges in scalable uncertainty quantification, coupled discrete–continuous fracture solvers, and rigorous fusion of physics-informed and data-driven models position blast design as a fertile test bed for advances in applied mathematics, numerical analysis, and machine-learning theory. Full article

13 pages, 559 KB  
Article
Dynamic Modeling and Online Updating of Full-Power Converter Wind Turbines Based on Physics-Informed Neural Networks and Bayesian Neural Networks
by Yunyang Xu, Bo Zhou, Xinwei Sun, Yuting Tian and Xiaofeng Jiang
Electronics 2025, 14(15), 2985; https://doi.org/10.3390/electronics14152985 - 26 Jul 2025
Viewed by 772
Abstract
This paper presents a dynamic model for full-power converter permanent magnet synchronous wind turbines based on Physics-Informed Neural Networks (PINNs). The model integrates the physical dynamics of the wind turbine directly into the loss function, enabling high-accuracy equivalent modeling with limited data and overcoming the typical “black-box” constraints and large data requirements of traditional data-driven approaches. To enhance the model’s real-time adaptability, we introduce an online update mechanism leveraging Bayesian Neural Networks (BNNs) combined with a clustering-guided strategy. This mechanism estimates uncertainty in the neural network weights in real time, accurately identifies error sources, and performs local fine-tuning on clustered data. This improves the model’s ability to track real-time errors and addresses the challenge of parameter-specific adjustments. Finally, the data-driven model is integrated into the CloudPSS platform, and its multi-scenario modeling accuracy is validated across various typical cases, demonstrating the robustness of the proposed approach. Full article
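The core PINN idea described here, a loss that sums a data misfit and a physics residual, can be sketched generically. The first-order lag dynamic below is a hypothetical stand-in for the turbine equations, not the model from the paper, and the weighting `lam` is an assumed hyperparameter:

```python
import numpy as np

def pinn_loss(x_pred, t, x_meas, u, tau=0.1, lam=1.0):
    """Composite PINN-style loss = data misfit + physics residual.

    The physics term penalises violation of an assumed first-order
    dynamic tau*dx/dt + x = u, with dx/dt approximated by finite
    differences on the network predictions (an illustrative stand-in
    for the full converter/turbine dynamics)."""
    data_loss = np.mean((x_pred - x_meas) ** 2)
    dxdt = np.gradient(x_pred, t)                 # finite-difference derivative
    physics_residual = tau * dxdt + x_pred - u
    return data_loss + lam * np.mean(physics_residual ** 2)

t = np.linspace(0.0, 1.0, 50)
u = np.ones(50)                                   # constant input
loss_exact = pinn_loss(u.copy(), t, u, u)         # steady state: both terms vanish
loss_biased = pinn_loss(1.1 * u, t, u, u)         # offset prediction: both terms penalised
```

In practice the derivative would come from automatic differentiation of the network rather than `np.gradient`, but the structure of the loss, data term plus weighted physics residual, is the same.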

26 pages, 7319 KB  
Article
Methodology for Implementing Monitoring Data into Probabilistic Analysis of Existing Embankment Dams
by Ljubo Divac, Veljko Pujević, Dejan Divac and Miloš Marjanović
Appl. Sci. 2025, 15(12), 6786; https://doi.org/10.3390/app15126786 - 17 Jun 2025
Viewed by 687
Abstract
Monitoring data provide valuable information on embankment dam behavior but are typically not integrated into a classical probabilistic safety assessment. This paper introduces a Bayesian-inspired methodology to directly integrate actual dam monitoring records into a Monte Carlo probabilistic safety assessment using a finite element framework, without recalibrating the original input parameters’ distributions. After the baseline (unweighted) set of simulations is generated, the method assigns a weight coefficient to each simulation outcome based on the likelihood of matching monitoring data, effectively updating the baseline probabilistic analysis results. This “weighted” analysis therefore produces an updated probability distribution of the dam’s factor of safety (FS) that reflects both the prior uncertainty of model parameters and actual monitoring data. To illustrate the approach, a case study of a rockfill triaxial test specimen is analyzed: a baseline probabilistic analysis yields a mean FS ~1.7, whereas the weighted analysis incorporating monitoring data reduces the mean FS to ~1.5 and narrows the variability. The weighted analysis suggests less favorable conditions than the baseline projections. This methodology offers a transparent, computationally tractable route for embedding monitoring evidence into reliability calculations, producing safety estimates that better reflect actual dam behavior. Full article
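The weighting step can be illustrated with synthetic numbers. The Gaussian error model, the linear observation mapping, sigma, and the FS population below are assumptions made for this sketch, not the case-study values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Baseline (unweighted) Monte Carlo: each realisation has a simulated FS
# and a predicted monitored quantity (a synthetic reading here).
fs_samples = rng.normal(1.7, 0.2, 10_000)        # baseline FS population, mean ~1.7
obs_pred = 30.0 - 10.0 * (fs_samples - 1.5)      # each run's predicted sensor reading

# Weight each realisation by its likelihood of matching the monitoring
# record under an assumed Gaussian error model, then renormalise.
monitored, sigma = 30.0, 2.0
w = np.exp(-0.5 * ((obs_pred - monitored) / sigma) ** 2)
w /= w.sum()

fs_prior = fs_samples.mean()                     # unweighted baseline mean
fs_post = np.sum(w * fs_samples)                 # weighted ("updated") mean
var_post = np.sum(w * (fs_samples - fs_post) ** 2)
```

With the monitored reading corresponding to a lower FS than the prior mean, the weighted mean shifts downward and the variance narrows, mirroring the qualitative behaviour reported in the abstract; the exact magnitudes here are artefacts of the synthetic setup.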
