Search Results (126)

Search Parameters:
Keywords = probabilistic events processing

27 pages, 4075 KB  
Article
Outlier Detection in Functional Data Using Adjusted Outlyingness
by Zhenghui Feng, Xiaodan Hong, Yingxing Li, Xiaofei Song and Ketao Zhang
Entropy 2026, 28(2), 233; https://doi.org/10.3390/e28020233 - 16 Feb 2026
Abstract
In signal processing and information analysis, the detection and identification of anomalies present in signals constitute a critical research focus. Accurately discerning these deviations using probabilistic, statistical, and information-theoretic methods is essential for ensuring data integrity and supporting reliable downstream analysis. Outlier detection in functional data aims to identify curves or trajectories that deviate significantly from the dominant pattern—a process vital for data cleaning and the discovery of anomalous events. This task is challenging due to the intrinsic infinite dimensionality of functional data, where outliers often appear as subtle shape deformations that are difficult to detect. Moving beyond conventional approaches that discretize curves into multivariate vectors, we introduce a novel framework that projects functional data into a low-dimensional space of meaningful features. This is achieved via a tailored weighting scheme designed to preserve essential curve variations. We then incorporate the Mahalanobis distance to detect directional outlyingness under non-Gaussian assumptions through a robustified bootstrap resampling method with data-driven threshold determination. Simulation studies validated its superior performance, demonstrating higher true positive and lower false positive rates across diverse anomaly types, including magnitude, shape-isolated, shape-persistent, and mixed outliers. The practical utility of our approach was further confirmed through applications in environmental monitoring using seawater spectral data, character trajectory analysis, and population data, underscoring its cross-domain versatility. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
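
The abstract outlines a pipeline (project curves onto low-dimensional features, score them with a Mahalanobis-type distance, set the cutoff by resampling) without implementation details. The sketch below is only a rough illustration of that generic pipeline under assumed choices; the Fourier basis, median centering, and 99th-percentile bootstrap rule are placeholders, not the authors' adjusted-outlyingness method.

```python
# Minimal sketch: flag outlying curves by projecting them onto a few basis
# functions and thresholding a Mahalanobis-type distance via bootstrap.
# The basis, centering, and cutoff rule are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated functional data: 200 smooth curves on a common grid, 5 outliers.
t = np.linspace(0, 1, 101)
curves = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((200, t.size))
curves[:5] += 1.5 * np.exp(-((t - 0.5) ** 2) / 0.01)       # shape outliers

# Step 1: project each curve onto a small Fourier-type basis (the "features").
basis = np.column_stack([np.ones_like(t),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                         np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
coefs, *_ = np.linalg.lstsq(basis, curves.T, rcond=None)     # (5, n_curves)
features = coefs.T

# Step 2: Mahalanobis distance of each feature vector from a robust center.
center = np.median(features, axis=0)
cov = np.cov(features, rowvar=False)
prec = np.linalg.inv(cov)
diff = features - center
d2 = np.einsum("ij,jk,ik->i", diff, prec, diff)

# Step 3: data-driven threshold from bootstrap resamples of the distances.
boot_q = [np.quantile(rng.choice(d2, size=d2.size, replace=True), 0.99)
          for _ in range(500)]
threshold = np.median(boot_q)

print("flagged curves:", np.where(d2 > threshold)[0])
```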

22 pages, 6191 KB  
Article
Estimations of Production Capacity Based on Simulation Models: A Case Study of Furniture Manufacturing Systems
by Damian Kolny and Robert Drobina
Appl. Sci. 2026, 16(4), 1683; https://doi.org/10.3390/app16041683 - 7 Feb 2026
Viewed by 158
Abstract
This article presents the concept of building a discrete event simulation model of a production system in terms of statistical and probabilistic models, which is based on a fragment of a broader production process in the furniture industry. The purpose of the study was to evaluate the efficiency of a single-shift production process during the start-up phase and to determine the impact of implementing two- and three-shift systems. The discrete event simulation model was developed using actual production data collected during a single-shift operation. Scenarios were then designed to identify and quantify the necessary process adjustments required for the successful implementation of two- and three-shift systems. The authors demonstrated that simulation modeling of production processes based on probabilistic distributions provides information that is essential for effective capacity planning. The proposed percentile grids enabled clear visualization and precise assessment of production resource utilization in various shift configurations, facilitating decision-making regarding capacity expansion based on previously assumed data. Full article
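
As a rough, hypothetical illustration of shift-capacity estimation from probabilistic processing-time distributions (the general idea the abstract describes, not the authors' simulation model), the following sketch samples lognormal cycle times for a single bottleneck station and reports output percentiles per shift pattern; all parameters are invented.

```python
# Minimal sketch of a stochastic shift-capacity estimate: sample processing
# times from a fitted distribution and count units finished per shift pattern.
# The lognormal parameters and shift lengths below are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def units_per_day(shift_hours, n_days=1000, median_min=6.0, sigma=0.35):
    """Monte Carlo estimate of daily output for a single bottleneck station."""
    results = []
    for _ in range(n_days):
        budget = shift_hours * 60.0          # available minutes per day
        elapsed, done = 0.0, 0
        while True:
            cycle = rng.lognormal(mean=np.log(median_min), sigma=sigma)
            if elapsed + cycle > budget:
                break
            elapsed += cycle
            done += 1
        results.append(done)
    return np.asarray(results)

for label, hours in [("1-shift", 8), ("2-shift", 16), ("3-shift", 24)]:
    out = units_per_day(hours)
    p5, p50, p95 = np.percentile(out, [5, 50, 95])
    print(f"{label}: median {p50:.0f} units/day (5th-95th pct: {p5:.0f}-{p95:.0f})")
```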

43 pages, 5548 KB  
Article
A Novel Probabilistic Model for Streamflow Analysis and Its Role in Risk Management and Environmental Sustainability
by Tassaddaq Hussain, Enrique Villamor, Mohammad Shakil, Mohammad Ahsanullah and Bhuiyan Mohammad Golam Kibria
Axioms 2026, 15(2), 113; https://doi.org/10.3390/axioms15020113 - 4 Feb 2026
Viewed by 240
Abstract
Probabilistic streamflow models play a pivotal role in quantifying hydrological uncertainty and form the backbone of modern risk management strategies for flood and drought forecasting, water allocation planning, and the design of resilient infrastructure. Unlike deterministic approaches that yield single-point estimates, these models provide a spectrum of possible outcomes, enabling a more realistic assessment of extreme events and supporting informed, sustainable water resource decisions. By explicitly accounting for natural variability and uncertainty, probabilistic models promote transparent, robust, and equitable risk evaluations, helping decision-makers balance economic costs, societal benefits, and environmental protection for long-term sustainability. In this study, we introduce the bounded half-logistic distribution (BHLD), a novel heavy-tailed probability model constructed using the T–Y method for distribution generation, where T denotes a transformer distribution and Y represents a baseline generator. Although the BHLD is conceptually related to the Pareto and log-logistic families, it offers several distinctive advantages for streamflow modeling, including a flexible hazard rate that can be unimodal or monotonically decreasing, a finite lower bound, and closed-form expressions for key risk measures such as Value at Risk (VaR) and Tail Value at Risk (TVaR). The proposed distribution is defined on a lower-bounded domain, allowing it to realistically capture physical constraints inherent in flood processes, while a log-logistic-based tail structure provides the flexibility needed to model extreme hydrological events. Moreover, the BHLD is analytically characterized through a governing differential equation and further examined via its characteristic function and the maximum entropy principle, ensuring stable and efficient parameter estimation. It integrates a half-logistic generator with a log-logistic baseline, yielding a power-law tail decay governed by the parameter β, which is particularly effective for representing extreme flows. Fundamental properties, including the hazard rate function, moments, and entropy measures, are derived in closed form, and model parameters are estimated using the maximum likelihood method. Applied to four real streamflow data sets, the BHLD demonstrates superior performance over nine competing distributions in goodness-of-fit analyses, with notable improvements in tail representation. The model facilitates accurate computation of hydrological risk metrics such as VaR, TVaR, and tail variance, uncovering pronounced temporal variations in flood risk and establishing the BHLD as a powerful and reliable tool for streamflow modeling under changing environmental conditions. Full article
(This article belongs to the Special Issue Probability Theory and Stochastic Processes: Theory and Applications)
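
The closed-form BHLD expressions are not reproduced in the abstract, so the sketch below uses scipy's log-logistic ("fisk") distribution as a stand-in fitted model to show how the quoted risk measures are computed once a distribution is in hand: VaR as a quantile and TVaR as the average of quantiles above it. The shape and scale values are hypothetical.

```python
# Minimal sketch: once a parametric flow distribution is fitted, VaR is a
# quantile and TVaR is the mean of exceedances. The log-logistic ("fisk")
# stands in here for the paper's BHLD; all parameters are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
flows = stats.fisk.rvs(c=3.0, scale=120.0, size=2000, random_state=rng)  # fake annual peaks

c, loc, scale = stats.fisk.fit(flows, floc=0.0)       # fit the stand-in model
alpha = 0.99

var_99 = stats.fisk.ppf(alpha, c, loc=loc, scale=scale)        # Value at Risk
# TVaR: conditional mean above VaR, approximated by averaging upper quantiles.
grid = np.linspace(alpha, 0.999999, 20000)
tvar_99 = stats.fisk.ppf(grid, c, loc=loc, scale=scale).mean()

print(f"VaR(99%)  ~ {var_99:8.1f} m^3/s")
print(f"TVaR(99%) ~ {tvar_99:8.1f} m^3/s")
```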

35 pages, 7867 KB  
Article
Inter-Comparison of Deep Learning Models for Flood Forecasting in Ethiopia’s Upper Awash Basin
by Girma Moges Mengistu, Addisu G. Semie, Gulilat T. Diro, Natei Ermias Benti, Emiola O. Gbobaniyi and Yonas Mersha
Water 2026, 18(3), 397; https://doi.org/10.3390/w18030397 - 3 Feb 2026
Viewed by 931
Abstract
Flood events driven by climate variability and change pose significant risks for socio-economic activities in the Awash Basin, necessitating advanced forecasting tools. This study benchmarks five deep learning (DL) architectures, Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Bidirectional LSTM (BiLSTM), and a Hybrid CNN–LSTM, for daily discharge forecasting for the Hombole catchment in the Upper Awash Basin (UAB) using 40 years of hydrometeorological observations (1981–2020). Rainfall, lagged discharge, and seasonal indicators were used as predictors. Model performance was evaluated against two baseline approaches, a conceptual HBV rainfall–runoff model as well as a climatology, using standard and hydrological metrics. Of the two baselines (climatology and HBV), the climatology showed limited skill with large bias and negative NSE, whereas the HBV model achieved moderate skill (NSE = 0.64 and KGE = 0.82). In contrast, all DL models substantially improved predictive performance, achieving test NSE values above 0.83 and low overall bias. Among them, the Hybrid CNN–LSTM provided the most balanced performance, combining local temporal feature extraction with long-term memory and yielding stable efficiency (NSE ≈ 0.84, KGE ≈ 0.90, and PBIAS ≈ −2%) across flow regimes. The LSTM and GRU models performed comparably, offering strong temporal learning and robust daily predictions, while BiLSTM improved flood timing through bidirectional sequence modeling. The CNN captured short-term variability effectively but showed weaker representation of extreme peaks. Analysis of peak-flow metrics revealed systematic underestimation of extreme discharge magnitudes across all models. However, a post-processing flow-regime classification based on discharge quantiles demonstrated high extreme-event detection skill, with deep learning models exceeding 89% accuracy in identifying extreme-flow occurrences on the test set. These findings indicate that, while magnitude errors remain for rare floods, DL models reliably discriminate flood regimes relevant for early warning. Overall, the results show that deep learning models provide clear improvements over climatology and conceptual baselines for daily streamflow forecasting in the UAB, while highlighting remaining challenges in peak-flow magnitude prediction. The study indicates promising results for the integration of deep learning methods into flood early-warning workflows; however, these results could be further improved by adopting a probabilistic forecasting framework that accounts for model uncertainty. Full article
(This article belongs to the Section Hydrology)
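
The skill scores quoted in the abstract (NSE, KGE, PBIAS) have standard definitions; the short sketch below computes them on made-up arrays purely to make the formulas concrete, and is unrelated to the study's code or data.

```python
# Minimal sketch of the skill scores quoted above (NSE, KGE, PBIAS), using
# their standard definitions; the arrays here are placeholder data.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation, variability and bias ratios."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values mean the model over-predicts volume."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

rng = np.random.default_rng(3)
q_obs = np.abs(rng.gamma(2.0, 20.0, size=365))            # synthetic daily flow
q_sim = q_obs * (1 + 0.1 * rng.standard_normal(365))      # noisy "forecast"
print(f"NSE={nse(q_obs, q_sim):.2f}  KGE={kge(q_obs, q_sim):.2f}  "
      f"PBIAS={pbias(q_obs, q_sim):.1f}%")
```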

18 pages, 387 KB  
Article
Transfer of Quantum Information and Genesis of Superfluid Vacuum in the Pre-Inflationary Universe
by Konstantin G. Zloshchastiev
Universe 2026, 12(2), 33; https://doi.org/10.3390/universe12020033 - 26 Jan 2026
Viewed by 238
Abstract
We conjecture that during the time period preceding the inflationary epoch, the background matter was initially a condensate formed from a many-body system of indistinguishable particles whose states were in a quantum superposition. This resulted in the occurrence of a statistical ensemble of spacetimes, thus causing the probabilistic uncertainty in the spacetime geometry of the pre-inflationary multiverse. Then, at a certain moment in time, a measurement event occurred, which broke the linear superposition and reduced the primordial geometrical multiverse to a single state. This process can be described as a quantum Shannon information transfer, which induces logarithmic nonlinearity in the evolution equations of the background system. The latter, therefore, transformed into a logarithmic quantum liquid of a superfluid type and formed the physical vacuum. This measurement also generated the primary mass energy necessary for the Universe’s further evolution into the inflationary epoch, followed by the contemporary “dark energy” era. Full article
(This article belongs to the Section Cosmology)

36 pages, 5408 KB  
Article
A Risk-Informed Framework for Public Safety Around Dams
by Tareq Salloum and Ernest Forman
CivilEng 2026, 7(1), 5; https://doi.org/10.3390/civileng7010005 - 10 Jan 2026
Viewed by 476
Abstract
This paper presents a quantitative framework for assessing and managing public-safety risks around dams. The framework integrates a hazard–event–objective–control structure with the Analytic Hierarchy Process (AHP) to transform qualitative judgments into quantitative risk measures. Likelihoods, consequences, and overall risk are expressed on a ratio scale, allowing results to be aggregated, compared, and communicated in monetary terms. Probabilistic simulation accounts for uncertainty and generates outputs such as Value-at-Risk (VaR), loss-exceedance curves, and societal F–N charts, providing a clear picture of both expected and extreme outcomes. Optimization identifies control portfolios that achieve the greatest risk reduction for available budgets. A hypothetical dam case study demonstrates the framework’s application and highlights its ability to identify high-value safety investments. The framework offers dam owners and regulators a transparent, data-driven basis for prioritizing public-safety improvements and supports both facility-level (micro) and program-level (macro) decision-making consistent with international risk-tolerability and ALARP principles. Full article
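
The AHP step the framework relies on is standard: pairwise judgments are converted into ratio-scale priorities via the principal eigenvector of the comparison matrix. The sketch below illustrates that generic step on an invented 3x3 judgment matrix; it is not drawn from the paper's case study.

```python
# Minimal sketch of the AHP step named above: turn a pairwise-comparison
# matrix of judgments into priority weights via its principal eigenvector.
# The 3x3 judgment matrix (three hypothetical hazards) is made up.
import numpy as np

# A[i, j] = how much more important hazard i is than hazard j (Saaty scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # normalized priorities

# Consistency ratio: CI / RI, with Saaty's random index RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58

print("priority weights:", np.round(weights, 3))
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
```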

18 pages, 16226 KB  
Article
Liquefaction Hazard Assessment and Mapping Across the Korean Peninsula Using Amplified Liquefaction Potential Index
by Woo-Hyun Baek and Jae-Soon Choi
Appl. Sci. 2026, 16(2), 612; https://doi.org/10.3390/app16020612 - 7 Jan 2026
Viewed by 238
Abstract
Liquefaction is a critical mechanism amplifying earthquake-induced damage, necessitating systematic hazard assessment through spatially distributed mapping. This study presents a nationwide liquefaction hazard assessment framework for South Korea, integrating site classification, liquefaction potential index (LPI) computation, and probabilistic damage evaluation. Sites across the Korean Peninsula were stratified into five geotechnical categories (S1–S5) based on soil characteristics. LPI values were computed incorporating site-specific amplification coefficients for nine bedrock acceleration levels corresponding to seismic recurrence intervals of 500, 1000, 2400, and 4800 years per Korean seismic design specifications. Subsurface characterization utilized standard penetration test (SPT) data from 121,821 boreholes, with an R-based analytical program enabling statistical processing and spatial visualization. Damage probability assessment employed Iwasaki’s LPI severity classification across site categories. Results indicate that at 0.10 g peak ground acceleration (500-year event), four regions exhibit severe liquefaction susceptibility. This geographic footprint expands to seven regions at 0.14 g (1000-year event) and eight regions at 0.18 g. For the 2400-year design basis earthquake (0.22 g), all eight identified high-risk zones reach critical thresholds simultaneously. Site-specific analysis reveals stark contrasts in vulnerability: S2 sites demonstrate 99% very low to low damage probability, whereas S3, S4, and S5 sites face 33%, 51%, and 99% severe damage risk, respectively. This study establishes a scalable, evidence-based framework enabling efficient large-scale liquefaction hazard assessment for governmental risk management applications. Full article
(This article belongs to the Special Issue Soil Dynamics and Earthquake Engineering)
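
For readers unfamiliar with the liquefaction potential index, the sketch below evaluates Iwasaki's standard LPI integral on a hypothetical factor-of-safety profile. The paper's site-specific amplification coefficients and SPT-based safety factors are not reproduced; only the generic LPI arithmetic is shown.

```python
# Minimal sketch of the liquefaction potential index (LPI) in Iwasaki's
# formulation: LPI = integral over the top 20 m of F(z) * w(z), with
# w(z) = 10 - 0.5 z and F(z) = 1 - FS wherever the factor of safety FS < 1.
# The factor-of-safety profile below is hypothetical.
import numpy as np

depth = np.arange(0.5, 20.0, 1.0)                  # layer mid-depths [m]
fs = np.array([0.7, 0.8, 0.9, 1.1, 0.95, 1.3, 1.2, 1.4, 1.1, 1.6,
               1.8, 2.0, 2.2, 2.1, 2.5, 2.4, 2.6, 2.8, 3.0, 3.2])

severity = np.where(fs < 1.0, 1.0 - fs, 0.0)       # F(z): only FS < 1 counts
weight = 10.0 - 0.5 * depth                        # linear depth weighting
lpi = np.trapz(severity * weight, depth)

# Iwasaki's severity classes: 0 very low, 0-5 low, 5-15 high, >15 very high.
print(f"LPI = {lpi:.1f}")
```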

17 pages, 1399 KB  
Article
Quality Performance Criterion Model for Distributed Automated Control Systems Based on Markov Processes for Smart Grid
by Waldemar Wojcik, Ainur Ormanbekova, Muratkali Jamanbayev, Maria Yukhymchuk and Vladyslav Lesko
Appl. Sci. 2025, 15(24), 12917; https://doi.org/10.3390/app152412917 - 8 Dec 2025
Viewed by 281
Abstract
This paper addresses the problem of decision-making support for the modernization of distributed automated control systems (ACS) in power engineering by proposing an integral quality criterion that combines similarity-driven Markov process modeling with geometric programming. The methodology transforms the transition rate matrix of a continuous-time Markov chain (CTMC) into a matrix polynomial, enabling the derivation of normalized similarity indices and the development of a criterion-based model to quantify relative variations in system quality without requiring global optimization. The proposed approach yields a generalized criterion model that facilitates the ranking of modernization alternatives and the evaluation of the sensitivity of optimal decisions to parameter variations. The practical implementation is demonstrated through updated state transition graphs, quality functions, and UML-based architectures of diagnostic-ready evaluation modules. The scientific contribution of this work lies in the integration of similarity-based Markov modeling with the mathematical framework of geometric programming into a unified criterion model for the quantitative assessment of functional readiness under multistate conditions and probabilistic failures. The methodology enables the comparison of modernization scenarios using a unified integral indicator, assessment of sensitivity to structural and parametric changes, and seamless integration of quality evaluation into SCADA/Smart Grid environments as part of real-time diagnostics. The accuracy of the assessment depends on the adequacy of transition rate identification and the validity of the Markovian assumption. Future extensions include the real-time estimation of transition rates from event streams, generalization to semi-Markov processes, and multicriteria optimization considering cost, risk, and readiness. Full article
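
The paper's similarity indices and geometric-programming criterion are not given in the abstract; the sketch below shows only the underlying CTMC building block it starts from, solving pi Q = 0 with normalization for the stationary distribution of a transition-rate matrix and reading off a steady-state readiness figure. The three-state rates are hypothetical.

```python
# Minimal sketch of the CTMC building block mentioned above: solve pi Q = 0
# with sum(pi) = 1 for the stationary distribution of a transition-rate
# matrix, then read off steady-state readiness. The 3-state rates (working,
# degraded, failed) are hypothetical; the paper's similarity indices and
# geometric-programming criterion are not reproduced here.
import numpy as np

# Generator Q: off-diagonal entries are transition rates [1/h], rows sum to 0.
Q = np.array([[-0.02,  0.015, 0.005],    # working  -> degraded / failed
              [ 0.10, -0.14,  0.04 ],    # degraded -> repaired / failed
              [ 0.05,  0.00, -0.05 ]])   # failed   -> repaired

# Stationary distribution: append the normalization row to pi Q = 0.
n = Q.shape[0]
A = np.vstack([Q.T, np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

for state, p in zip(["working", "degraded", "failed"], pi):
    print(f"{state:9s}: {p:.3f}")
print(f"steady-state readiness (not failed): {pi[0] + pi[1]:.3f}")
```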

18 pages, 5442 KB  
Article
Tail-Aware Forecasting of Precipitation Extremes Using STL-GEV and LSTM Neural Networks
by Haoyu Niu, Samantha Murray, Fouad Jaber, Bardia Heidari and Nick Duffield
Hydrology 2025, 12(11), 284; https://doi.org/10.3390/hydrology12110284 - 30 Oct 2025
Cited by 2 | Viewed by 1515
Abstract
Accurate prediction of extreme precipitation events remains a critical challenge in hydrological forecasting due to their rare occurrence and complex statistical behavior. These extreme events are becoming more frequent and intense under the influence of climate change. Their unpredictability not only hampers water resource management and disaster preparedness but also leads to disproportionate impacts on vulnerable communities and critical infrastructure. Therefore, in this article, we introduce a hybrid modeling framework that combines Generalized Extreme Value (GEV) distribution fitting with deep learning models to forecast monthly maximum precipitation extremes. Long Short-term Memory models (LSTMs) are proposed to predict the cumulative distribution (CDF) values of the GEV-fitted remainder series. This approach transforms the forecasting problem into a bounded probabilistic learning task, improving model stability and interpretability. Crucially, a tail-weighted loss function is designed to emphasize rare but high-impact events in the training process, addressing the inherent class imbalance in extreme precipitation predictions. Results demonstrate strong predictive performance in both the CDF and residual domains, with the proposed model accurately identifying anomalously high precipitation months. This hybrid GEV–deep learning approach offers a promising solution for early warning systems and long-term climate resilience planning in hydrologically sensitive regions. Full article
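
As a hedged illustration of two ingredients named above, the sketch below fits a GEV to a synthetic remainder series, maps it to bounded CDF values, and evaluates a simple tail-weighted squared error. The STL decomposition and the LSTM are omitted, and the weighting form is an assumption rather than the paper's loss function.

```python
# Minimal sketch of two ingredients named above: mapping a remainder series
# to GEV CDF values, and a tail-weighted squared loss that up-weights the
# upper tail. The STL step and the LSTM itself are omitted; the weighting
# form is an assumption, not the paper's loss.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
remainder = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=15.0,
                                 size=360, random_state=rng)  # monthly maxima

# Fit a GEV and transform the series to bounded CDF values in (0, 1).
c, loc, scale = stats.genextreme.fit(remainder)
u = stats.genextreme.cdf(remainder, c, loc=loc, scale=scale)

def tail_weighted_mse(u_true, u_pred, gamma=4.0):
    """Squared error weighted by (1 + gamma * u_true): extremes count more."""
    u_true, u_pred = np.asarray(u_true), np.asarray(u_pred)
    w = 1.0 + gamma * u_true
    return float(np.mean(w * (u_true - u_pred) ** 2))

u_pred = np.clip(u + 0.05 * rng.standard_normal(u.size), 0, 1)  # stand-in forecast
print(f"plain MSE        : {np.mean((u - u_pred) ** 2):.4f}")
print(f"tail-weighted MSE: {tail_weighted_mse(u, u_pred):.4f}")
```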

17 pages, 297 KB  
Article
Psychosocial Representations of Gender-Based Violence Among University Students from Northwestern Italy
by Ilaria Coppola, Marta Tironi, Elisa Berlin, Laura Scudieri, Fabiola Bizzi, Chiara Rollero and Nadia Rania
Behav. Sci. 2025, 15(10), 1373; https://doi.org/10.3390/bs15101373 - 8 Oct 2025
Cited by 1 | Viewed by 1800
Abstract
The aim of the study was to explore the psychosocial perceptions that young adults have regarding gender-based violence, including those based on their personal experiences, and to highlight perceptions related to social media and how its use might be connected to gender-based violence. The participants were 40 university students from Northwestern Italy with an average age of 21.8 years (range: 19–25); 50% were women. Sampling was non-probabilistic and followed a purposive convenience strategy. Semi-structured interviews were conducted online and audio-recorded, and data were analyzed using the reflective thematic approach. The results revealed that young adults are very aware, at a theoretical level, of “offline” physical, psychological, and verbal gender-based violence and its effects, while they do not give much consideration to online violence, despite often being victims of it, as revealed by their accounts, for example, through unsolicited explicit images or persistent harassment on social media. Therefore, the results of this research highlight the need to develop primary prevention programs focused on increasing awareness and providing young people with more tools to identify when they have been victims of violence, both online and offline, and to process the emotional experiences associated with such events. Full article
(This article belongs to the Special Issue Psychological Research on Sexual and Social Relationships)
30 pages, 1809 KB  
Article
Safety of LNG-Fuelled Cruise Ships in Comparative Risk Assessment
by Elvis Čapalija, Peter Vidmar and Marko Perkovič
J. Mar. Sci. Eng. 2025, 13(10), 1896; https://doi.org/10.3390/jmse13101896 - 2 Oct 2025
Viewed by 2114
Abstract
Although liquefied natural gas (LNG) is already widely used as a marine fuel, its use on large cruise ships is a relatively new development. By the end of 2024, twenty-four LNG-fuelled cruise ships were in operation, each carrying several thousand passengers and making frequent port calls. These operational characteristics increase the potential risks compared to conventional cargo ships and require a rigorous safety assessment. In this study, the safety of LNG-fuelled cruise ships is assessed using the Formal Safety Assessment (FSA) framework prescribed by the International Maritime Organization (IMO). The assessment includes a hazard identification (HAZID), a risk analysis, an evaluation of risk control options, a cost–benefit analysis and recommendations for decision-making. Given the limited operational data on LNG-fuelled cruise ships, event trees are developed on the basis of LNG tanker incidents, adjusted to reflect passenger-related risks and cruise-specific operating conditions. A statistical overview of marine casualties involving cruise ships and LNG carriers of more than 20,000 GT over the last 35 years provides a further basis for the analysis. To ensure compliance, the study also analyses class requirements and regulatory frameworks, including risk assessments for ship design, bunker operations and emergency preparedness. These assessments, which are carried out at component, ship and process level, remain essential for safety validation and regulatory approval. The results provide a comprehensive framework for assessing LNG safety in the cruise sector by combining existing safety data, regulatory standards and probabilistic risk modelling. Recent work also confirms that event tree modelling identifies critical accident escalation pathways, particularly in scenarios involving passenger evacuation and port operations, which are under-researched in current practice. The results contribute to the wider debate on alternative fuels and support evidence-based decision-making by ship operators, regulators and industry stakeholders. Full article
(This article belongs to the Special Issue Maritime Security and Risk Assessments—2nd Edition)
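
Event-tree quantification itself is simple arithmetic: an initiating-event frequency is split along conditional branch probabilities into outcome frequencies. The sketch below illustrates this with entirely hypothetical numbers; it does not reproduce the study's LNG event trees.

```python
# Minimal sketch of the event-tree arithmetic used in such FSA studies: an
# initiating-event frequency is split along branch probabilities into outcome
# frequencies. All numbers below are hypothetical placeholders, not values
# from the study.
initiating_freq = 1e-3            # LNG release events per ship-year (assumed)

p_immediate = 0.05                # P(immediate ignition | release)
p_delayed = 0.10                  # P(delayed ignition | no immediate ignition)

outcomes = {
    "jet fire (immediate ignition)":    initiating_freq * p_immediate,
    "flash fire / explosion (delayed)": initiating_freq * (1 - p_immediate) * p_delayed,
    "safe dispersion (no ignition)":    initiating_freq * (1 - p_immediate) * (1 - p_delayed),
}

for name, freq in outcomes.items():
    print(f"{name:36s}: {freq:.2e} per ship-year")
```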

36 pages, 4030 KB  
Article
Impact of High Penetration of Sustainable Local Energy Communities on Distribution Network Protection and Reliability
by Samuel Borroy Vicente, Luis Carlos Parada, María Teresa Villén Martínez, Aníbal Antonio Prada Hurtado, Andrés Llombart Estopiñán and Luis Hernandez-Callejo
Appl. Sci. 2025, 15(19), 10401; https://doi.org/10.3390/app151910401 - 25 Sep 2025
Viewed by 1006
Abstract
The growing integration of renewable-based distributed energy resources within local energy communities is significantly reshaping the operational dynamics of medium voltage distribution networks, particularly affecting their reliability and protection schemes. This work investigates the technical impacts of the high penetration of distributed generation within sustainable local energy communities on the effectiveness of fault detection, location, isolation, and service restoration processes, from the point of view of Distribution System Operators. From a supply continuity perspective, the methodology of the present work comprises a comprehensive, quantitative, system-level assessment based on probabilistic, scenario-based simulations of fault events on a CIGRE benchmark distribution network. The models incorporate component fault rates and repair times derived from EPRI databases and compute standard IEEE indices over a one-year horizon, considering manual, hybrid, and fully automated operation scenarios. The results highlight the significant potential of automation to enhance supply continuity. However, the qualitative assessment carried out through laboratory-based Hardware-in-the-Loop tests reveals critical vulnerabilities in fault-detection devices, particularly when inverter-based distributed generation units contribute to fault currents. Consequently, quantitative evaluations based on a sensitivity analysis incorporating these findings, varying the reliability of fault-detection systems, indicate that the reliability improvements expected from increased automation levels are significantly deteriorated if protection malfunctions occur due to fault current contributions from distributed generation. These results underscore the need for the evolution of protection technologies in medium voltage networks to ensure reliability under future scenarios characterised by high shares of distributed energy resources and local energy communities. Full article
(This article belongs to the Section Energy Science and Technology)
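
The IEEE indices referred to above (e.g., SAIFI, SAIDI) are customer-weighted averages of load-point failure rates and outage durations. The sketch below computes them for four invented load points, purely to show the arithmetic behind the reported continuity-of-supply results.

```python
# Minimal sketch of the IEEE reliability indices referred to above (SAIFI,
# SAIDI, CAIDI), computed from per-load-point failure rates and outage
# durations. The four load points and their numbers are hypothetical.
import numpy as np

customers = np.array([800, 1200, 600, 400])       # customers per load point
failures = np.array([0.20, 0.35, 0.50, 0.15])     # interruptions per year
unavail = np.array([0.5, 1.2, 2.0, 0.4])          # outage hours per year

saifi = np.sum(failures * customers) / customers.sum()   # interruptions/customer
saidi = np.sum(unavail * customers) / customers.sum()    # hours/customer
caidi = saidi / saifi                                    # hours/interruption

print(f"SAIFI = {saifi:.2f} interruptions/customer/yr")
print(f"SAIDI = {saidi:.2f} h/customer/yr")
print(f"CAIDI = {caidi:.2f} h/interruption")
```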

16 pages, 2069 KB  
Article
“Can I Use My Leg Too?” Dancing with Uncertainty: Exploring Probabilistic Thinking Through Embodied Learning in a Jerusalem Art High School Classroom
by Dafna Efron and Alik Palatnik
Educ. Sci. 2025, 15(9), 1248; https://doi.org/10.3390/educsci15091248 - 18 Sep 2025
Viewed by 763
Abstract
Despite increased interest in embodied learning, the role of sensorimotor activity in shaping students’ probabilistic reasoning remains underexplored. This design-based study examines how high school students develop key probabilistic concepts, including sample space, certainty, and event probability, through whole-body movement activities situated in an authentic classroom setting. Grounded in embodied cognition theory, we introduce a two-axis interpretive framework. One axis spans sensorimotor exploration and formal reasoning, drawing from established continuums in the literature. The second axis, derived inductively from our analysis, contrasts engagement with distraction, foregrounding the affective and attentional dimensions of embodied participation. Students engaged in structured yet open-ended movement sequences that elicited intuitive insights. This approach, epitomized by one student’s spontaneous question, “Can I use my leg too?”, captures the agentive and improvisational character of the embodied learning environment. Through five analyzed classroom episodes, we trace how students shifted between bodily exploration and formalization, often through nonlinear trajectories shaped by play, uncertainty, and emotionally driven reflection. While moments of insight emerged organically, they were also fragile, as they were affected by ambiguity and the difficulty in translating physical actions into mathematical language. Our findings underscore the pedagogical potential of embodied design for probabilistic learning while also highlighting the need for responsive teaching that balances structure with improvisation and supports affective integration throughout the learning process. Full article

26 pages, 2889 KB  
Article
Advanced Implementation of the Asymmetric Distribution Expectation-Maximum Algorithm in Fault-Tolerant Control for Turbofan Acceleration
by Xinhai Zhang, Jia Geng, Kang Wang, Ming Li and Zhiping Song
Aerospace 2025, 12(9), 829; https://doi.org/10.3390/aerospace12090829 - 16 Sep 2025
Viewed by 679
Abstract
For the safety and performance of turbofan engines, the fault-tolerant control of acceleration schedules is becoming increasingly necessary. However, traditional probabilistic approaches struggle to satisfy the single-side surge boundary limits and control asymmetry. Moreover, the baseline fault-tolerance requirement of the acceleration schedule cannot depend on whether fault detection exists, and model-dependent data approaches inherently limit their generalizability. To address all these challenges, this paper proposes a probabilistic viewpoint of non-frequency and non-Bayesian schools, and the asymmetric distribution expectation-maximum algorithm (ADEMA) based on this viewpoint, along with their detailed theoretical derivations. The surge boundary enhances safety requirements for the acceleration control; therefore, simulations and verifications consider the disturbance combinations involving a single significant fault alongside normal deviations from other factors, including minor faults. In the event of such disturbances, ADEMA can effectively prevent the acceleration process from approaching the surge boundary, both at sea level and within the flight envelope. It demonstrates the smallest median estimation error (0.27% at sea level and 0.96% within the flight envelope) compared to other methods, such as the Bayesian weighted average method. Although its maintenance of performance is not exceptionally strong, its independence from model-data makes it a valuable reference. Full article
(This article belongs to the Section Aeronautics)

35 pages, 638 KB  
Article
On the Relativity of Quantumness as Implied by Relativity of Arithmetic and Probability
by Marek Czachor
Entropy 2025, 27(9), 922; https://doi.org/10.3390/e27090922 - 2 Sep 2025
Cited by 2 | Viewed by 1089
Abstract
A hierarchical structure of isomorphic arithmetics is defined by a bijection g_ℝ: ℝ → ℝ. It entails a hierarchy of probabilistic models, with probabilities p_k = g^k(p), where g is the restriction of g_ℝ to the interval [0, 1], g^k is the k-th iterate of g, and k is an arbitrary integer (positive, negative, or zero; g^0(x) = x). The relation between p and g^k(p), k > 0, is analogous to the one between probability and neural activation function. For k ≪ −1, g^k(p) is essentially white noise (all processes are equally probable). The choice of k = 0 is physically as arbitrary as the choice of origin of a line in space, hence what we regard as experimental binary probabilities, p_exp, can be given by any k, p_exp = g^k(p). Quantum binary probabilities are defined by g(p) = sin²(πp/2). With this concrete form of g, one finds that any two neighboring levels of the hierarchy are related to each other in a quantum–subquantum relation. In this sense, any model in the hierarchy is probabilistically quantum in appropriate arithmetic and calculus. And the other way around: any model is subquantum in appropriate arithmetic and calculus. Probabilities involving more than two events are constructed by means of trees of binary conditional probabilities. We discuss from this perspective singlet-state probabilities and Bell inequalities. We find that singlet state probabilities involve simultaneously three levels of the hierarchy: quantum, hidden, and macroscopic. As a by-product of the analysis, we discover a new (arithmetic) interpretation of the Fubini–Study geodesic distance. Full article
(This article belongs to the Special Issue Quantum Measurement)
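
The map g(p) = sin²(πp/2) is given explicitly in the abstract, so its hierarchy of levels p_k = g^k(p) can be illustrated numerically: the sketch below iterates g forward and backward from an arbitrary starting probability. The starting value and range of k are arbitrary choices, not taken from the paper.

```python
# Minimal numerical illustration of the hierarchy described above: iterate
# the bijection g(p) = sin^2(pi * p / 2) on [0, 1] to move between levels
# p_k = g^k(p). The starting probability and number of levels are arbitrary.
import numpy as np

def g(p):
    return np.sin(0.5 * np.pi * p) ** 2

def g_inv(p):
    return 2.0 * np.arcsin(np.sqrt(p)) / np.pi

p0 = 0.3                                    # an arbitrary "level-0" probability
levels = {0: p0}
for k in range(1, 4):                       # levels above (forward iterates)
    levels[k] = g(levels[k - 1])
for k in range(-1, -4, -1):                 # levels below (inverse iterates)
    levels[k] = g_inv(levels[k + 1])

for k in sorted(levels):
    print(f"k = {k:+d}:  p_k = {levels[k]:.6f}")

# Neighboring levels satisfy p_{k+1} = sin^2(pi * p_k / 2), i.e. each level
# looks "quantum" relative to the one below it, as the abstract states.
```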
