Search Results (4,183)

Search Parameters:
Keywords = probabilistic models

25 pages, 22453 KB  
Article
A Safe and Efficient Navigation Framework for Ground Vehicles on Uneven Terrain Considering Kinematic Constraints and Terrain Traversability
by Jingyao Gai, Zhiyang Guo, Huimin Su, Wang Qing, Kangye Wei, Zhiqiang Cai and Mingzhang Pan
Sensors 2026, 26(5), 1481; https://doi.org/10.3390/s26051481 - 26 Feb 2026
Abstract
Ground vehicles navigating uneven terrain must simultaneously guarantee motion safety and efficiency. Safety requires that the planned waypoints lie in highly traversable terrain, while ensuring vehicle reachability to these waypoints, which must be kinematically feasible. Efficiency demands fewer detours and smoother paths that avoid excessive vehicle acceleration and steering. However, existing path planning research for uneven terrain fails to comprehensively integrate vehicle kinematic constraints, terrain factors, path smoothness, rollover risk, and total path length. To address this problem, this paper proposes a novel navigation framework. It first integrates terrain slope, flatness, elevation variation, and sparsity to generate a 2D global terrain traversability cost map. Subsequently, a three-phase path planning algorithm integrates A*, guided Rapidly-exploring Random Tree (RRT), and our proposed Kinematic and Terrain-Aware Probabilistic Roadmap (KT-PRM) local re-planning algorithm, which jointly considers multiple factors including ground vehicle kinematic constraints, terrain factors, path smoothness, rollover risk, and path length. This three-phase combination delivers safe, smooth, and short global paths over uneven terrain within a relatively short planning time. Finally, Nonlinear Model Predictive Control (NMPC) is employed for path tracking in the framework. Experiments were conducted in both simulated and real-world uneven terrain environments. The results demonstrated that the three-phase path planning algorithm integrated with our proposed KT-PRM algorithm achieves comprehensive performance in generating safer, smoother, and shorter paths. Our proposed navigation framework achieves safer and more efficient navigation compared with existing navigation frameworks. Full article
(This article belongs to the Section Vehicular Sensing)
27 pages, 1015 KB  
Article
Stature Estimation in Forensic Anthropology: Addressing the Current Status, Challenges and Future Prospects
by Tooba Siddiqui, Peter Zioupos and Nicholas Marquez-Grant
Forensic Sci. 2026, 6(1), 23; https://doi.org/10.3390/forensicsci6010023 - 26 Feb 2026
Abstract
Background: Stature estimation is a key parameter in biological profiling of human skeletal remains and thus in aiding in the identification process of the deceased. Various methods, including anatomical approaches, multifactorial regressions and organic correlation approaches, have been developed to estimate stature. Challenges arise from incomplete or altered remains, outdated reference samples, and the lack of population specific studies. This paper critically examines genetic influences and population-specific factors, the present status quo, recent developments and the challenges in stature estimation in forensic anthropology. Methods: The study appraises the current status, challenges, and future directions of stature estimation in forensic anthropology and bioarchaeology. The open-source literature is systematically identified, and relevant osteological and imaging-based studies are screened and prioritised when they report original empirical data or clearly defined methodological developments, enabling consistent extraction of sample and population descriptors, modelling strategies, and limitations. Included studies are then synthesised comparatively and summarised in a study table to support direct cross-study evaluation. Results: Stature estimation accuracy is shown to depend on population, ancestry, sex, time period, and environment, with cross-population or cross-temporal equation transfer identified as a key source of bias. Limitations include fragmented/altered remains and non-representative reference samples, while newer imaging and probabilistic approaches improve performance when supported by suitable data and explicit error quantification. Conclusions: In the present article, stature estimation is positioned as a context-dependent, evidence-weighted process rather than a fixed calculation. Full article
36 pages, 2825 KB  
Article
Life as Counterfactual Geometry: An Adversarial Theory of Biological Function
by Călin Gheorghe Buzea, Florin Nedeff, Diana Mirilă, Valentin Nedeff, Maricel Agop, Lăcrămioara Ochiuz, Lucian Dobreci and Decebal Vasincu
Entropy 2026, 28(3), 255; https://doi.org/10.3390/e28030255 - 26 Feb 2026
Abstract
Living systems exhibit anticipation, adaptability, and resilience that cannot be fully explained by stimulus–response models, static homeostasis, or convergence-based optimization. This work addresses this gap by proposing a theoretical framework in which a central aspect of biological function is understood through the geometry and stability of distributions over unrealized but accessible future trajectories. We formalize these distributions as a counterfactual manifold, defined as a probabilistically supported subset of path space induced by a system’s effective internal dynamics. Using tools from information geometry and dynamical systems theory, we analyze adaptive systems that modify the laws governing their own future trajectories and construct explicit dual-channel adversarial dynamics that couple processes expanding future possibilities with antagonistic processes enforcing feasibility constraints. We show that adaptive systems of this kind are generically unstable, tending toward either collapse of accessible futures or unbounded sensitivity to perturbation. Constructive adversarial dynamics are sufficient to stabilize counterfactual geometry without requiring convergence to a fixed point. A minimal adversarial model reveals three generic regimes: collapse, runaway sensitivity, and bounded non-convergent regulation. The framework yields operational, falsifiable predictions through measurable proxies based on response diversity, perturbation sensitivity, recovery geometry, and boundary residence, allowing these regimes to be discriminated using finite observations without reconstructing underlying state-space dynamics. Interpreting disease as instability of counterfactual geometry provides a unifying language for understanding rigidity, volatility, and context dependence across biological domains. 
Rather than replacing mechanistic models, the proposed framework offers a higher-level geometric and dynamical perspective in which such models can be embedded and compared, shifting attention from component-level dysfunction to the stability of biological futures and establishing a principled foundation for analyzing disease, intervention, and adaptability across scales. Full article
16 pages, 2311 KB  
Article
The Novel Models for Identifying the Vertical Structure of Urban Vegetation from UAV LiDAR Data
by Hang Yang, Rongxin Deng, Xinmeng Jing, Zhen Dong, Xiaoyu Yang, Jingyi Li and Zhiwen Mei
Remote Sens. 2026, 18(5), 692; https://doi.org/10.3390/rs18050692 - 26 Feb 2026
Abstract
Accurate quantification of vegetation vertical structure is crucial for analyzing the ecological functions of urban green spaces. However, constrained by the complexity of vegetation structure and spatial heterogeneity, current approaches for extracting vegetation vertical structure by airborne LiDAR have limitations in terms of layer boundary identification stability, threshold dependency, and ecological plausibility. This study developed two integrated UAV LiDAR-based stratification frameworks for identifying urban riparian vegetation vertical structure by combining established statistical modeling and signal processing techniques: (1) a Gaussian Mixture Model with Bayesian Information Criterion (GMM-BIC)-based probabilistic stratification framework; (2) a Savitzky–Golay filtering and Pruned Exact Linear Time (SG-PELT)-based change-point detection framework. Furthermore, the ecological height constraint was incorporated into the model to achieve biological adjustments. Two models were applied in the study area and compared using reference data. The results showed that the GMM-BIC method achieved an overall classification accuracy of 91.06%, with a macro-averaged F1-score of 87.77%, while the SG-PELT method attained an overall accuracy of 84.57%, with a macro-averaged F1-score of 79.20%. These results demonstrate that both models can effectively identify the vertical structure of urban vegetation. In particular, the two models exhibited distinct characteristics across different scenarios. The GMM-BIC model showed superior stratification accuracy in regions where vegetation height distribution displayed pronounced multi-peak characteristics and distinct differences among height segments. In comparison, the SG-PELT model demonstrated greater sensitivity in areas with significant height variation and clearly defined abrupt transitions between layers. 
These models could provide new methodologies for monitoring vegetation vertical structure and offer data support for biodiversity monitoring and ecological function assessment within urban ecosystems. Full article
21 pages, 3430 KB  
Article
Comparative Evaluation of Brine Leakage Models in Legacy Wells: Analytical, Transient, and Mechanistic Approaches for CO2 Storage Integrity
by Ahmed Alsubaih, Bruno Fernande, Mojdeh Delshad and Kamy Sepehrnoori
Energies 2026, 19(5), 1154; https://doi.org/10.3390/en19051154 - 26 Feb 2026
Abstract
Geologic carbon storage (GCS) is expanding rapidly as a cornerstone decarbonization option, but its climate value depends on maintaining long-term containment of CO2 and displaced formation brine. Legacy wells—many drilled and abandoned before modern barrier standards—remain one of the most credible and controllable pathways for unintended upward migration. To support transparent, fit-for-purpose risk screening, this study benchmarks three leakage-modeling philosophies across a common six-layer scenario: (i) a reservoir-scale analytical solution for layered aquifers, (ii) a semi-analytical pressure-transient model that captures rock–fluid compressibility and breakthrough time, and (iii) a new mechanistic wellbore-scale model that explicitly represents dominant annular failure pathways (micro-annuli, cement fractures, casing breaches, and cement–formation interface flow) with pathway-specific hydraulic losses. Results show that model choice and physics assumptions drive order-of-magnitude differences in predicted brine rates: after 1000 days, the analytical model predicts ~1.7 bbls/day, the pressure-transient model exceeds 8 bbls/day, whereas the mechanistic model yields damage-dependent outcomes (~0.2–0.4 bbls/day for moderate–severe cement damage and up to ~3.5 bbls/day for open-channel conditions). These findings demonstrate that neglecting wellbore hydraulic resistance can systematically overstate leakage risk, while mechanistic pathway representation enables more realistic, condition-dependent screening. Future work will focus on model calibration to field/monitoring data, probabilistic parameterization of defect geometries, and extension to multiphase/reactive leakage to support operational decision-making and regulatory assurance. Full article
(This article belongs to the Section A: Sustainable Energy)
20 pages, 820 KB  
Article
A Risk-Based Universal Calibration Interval Model Using Monte Carlo Simulation
by Dmytro Malakhov, Tatiana Kelemenová and Michal Kelemen
Appl. Sci. 2026, 16(5), 2230; https://doi.org/10.3390/app16052230 - 26 Feb 2026
Abstract
Sustainable manufacturing requires modern intelligent approaches to monitoring products of the manufacturing process. An integral part of intelligent manufacturing is the measurement of geometric parameters of products, which allows diagnosing the state of the manufacturing process, optimizing it and predicting its further development. For these reasons, it is necessary to monitor the condition of measuring instruments, as decision-making is based on the data provided by them. Calibration intervals of measuring instruments are commonly defined using fixed time-based rules that are not explicitly linked to measurement uncertainty growth or conformity risk. This practice may lead to either unnecessary recalibration or an increased probability of using out-of-tolerance instruments. In this study, a Monte Carlo-based methodology for determining recalibration intervals is proposed, in which recalibration decisions are derived from the probabilistic evolution of measurement error over time. Measurement uncertainty is modeled as a time-dependent stochastic process combining calibration uncertainty, drift behavior, and repeatability. Monte Carlo simulation is used to propagate uncertainty and to estimate both the expanded uncertainty and the probability that the measurement error exceeds the maximum permissible error (MPE). The recalibration interval is defined as the earliest time at which this probability exceeds a predefined acceptable risk threshold. A numerical experiment using realistic synthetic data representative of a typical dimensional measuring instrument demonstrates that probability-based and uncertainty-based criteria may lead to substantially different recalibration intervals. The results confirm that risk-informed recalibration intervals provide a more transparent and metrologically justified alternative to fixed schedules while remaining fully compatible with ISO/IEC 17025 and GUM principles. 
The proposed approach is instrument-agnostic and readily applicable in calibration laboratories and industrial measurement systems. Full article
(This article belongs to the Special Issue Advanced Digital Design and Intelligent Manufacturing, 2nd Edition)
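The recalibration logic this abstract describes — simulate the stochastic growth of measurement error over time and pick the earliest time at which the probability of exceeding the maximum permissible error crosses a risk threshold — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the linear-drift error model and every parameter value (`u_cal`, `drift_rate`, `u_rep`, `mpe`, the risk threshold) are hypothetical placeholders.

```python
import random

def recalibration_interval(u_cal=0.002, drift_rate=5e-5, u_rep=0.001,
                           mpe=0.01, risk=0.05, horizon_days=1500,
                           step=30, n_sim=20_000, seed=0):
    """Earliest time (days) at which P(|error| > MPE) exceeds `risk`.

    Assumed error model (for illustration only): at time t the instrument
    error is e(t) = e_cal + d*t + e_rep, with e_cal ~ N(0, u_cal) the
    calibration offset, d ~ N(drift_rate, drift_rate/2) a per-day drift
    slope, and e_rep ~ N(0, u_rep) the repeatability term.
    """
    rng = random.Random(seed)
    # One calibration offset and one drift slope per simulated instrument.
    sims = [(rng.gauss(0.0, u_cal), rng.gauss(drift_rate, drift_rate / 2))
            for _ in range(n_sim)]
    for t in range(0, horizon_days + 1, step):
        # Fraction of simulated instruments out of tolerance at time t.
        p_oot = sum(abs(e0 + d * t + rng.gauss(0.0, u_rep)) > mpe
                    for e0, d in sims) / n_sim
        if p_oot > risk:
            return t
    return None  # risk threshold never crossed within the horizon
```

Tightening the acceptable risk shortens the interval, which is the behavior the abstract contrasts with fixed time-based schedules.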
26 pages, 1831 KB  
Article
Joint Action of Wind and Temperature for a Long-Span Cable-Stayed Bridge in Plateau Canyon Regions Using SHM Data and Copula-Based Probabilistic Modeling
by Jiang Liu, Zefan Liu, Zhiyuan Ma, Yagang Tong, Chendi Wang, Licheng Zhu and Yongjian Liu
Buildings 2026, 16(5), 916; https://doi.org/10.3390/buildings16050916 - 25 Feb 2026
Abstract
Current bridge design codes specify combination coefficients for wind–temperature joint actions, yet few studies have addressed these for bridges in plateau canyon regions. This study investigates the joint actions and combination coefficients for Haihuang Bridge, which is in a plateau canyon region surrounded by mountains. Using long-term structural health monitoring data, trivariate normal copulas and Con-KRP were applied to estimate joint probabilities of wind speed and air temperature in different directions. The combination coefficients range from 0.68 to 0.92 for temperature actions and 0.56 to 0.75 for wind actions, obtained based on the principle that bivariate Con-KRP equals univariate Con-KRP. Significant differences in the joint actions are found in different directions. Furthermore, the combination coefficients in the plateau canyon region are much larger than those in the subtropical coastal plain region, indicating a need for further study on the regional difference. Full article
(This article belongs to the Special Issue Advances in Steel-Concrete Composite Structure—2nd Edition)
44 pages, 5435 KB  
Article
Techno-Economic Assessment of Integrated CO2 Liquefaction and Waste Energy Recovery Using Low-GWP Zeotropic Mixtures for Maritime Applications
by Luis Alfonso Díaz-Secades, Aitor Nicolás Fernández Álvarez, Raquel Martínez Martínez, Pablo A. Rico Lázaro, Jonas W. Ringsberg and C. Guedes Soares
J. Mar. Sci. Eng. 2026, 14(5), 420; https://doi.org/10.3390/jmse14050420 - 25 Feb 2026
Abstract
The increasing regulatory pressure on the maritime sector to decarbonize, driven in part by market-based mechanisms at the European level, is accelerating the development of onboard carbon management and energy-efficiency solutions. In this context, this study evaluates an integrated architecture that combines a CO2 liquefaction system with organic Rankine cycles. The system captures 66% of the total CO2 emitted by ship engines and is capable of recovering up to 2600.8 kW of energy from onboard hot and cold sources. To identify the most suitable working fluids, an extensive screening of 208 low-GWP zeotropic mixtures is conducted, assessing their thermophysical behavior and energy recovery performance. A detailed thermo-economic assessment is undertaken, including the calculation of CO2-equivalent savings, GHG abatement cost, and payback periods. To account for fuel price variability, probabilistic modelling based on Monte Carlo sampling is applied to estimate the distribution of discounted payback outcomes. The results demonstrate that Novec 649-based zeotropic mixtures combined with the proposed architecture reduce fuel consumption and enhance onboard CO2 management while remaining safe and economically viable across a wide range of operating scenarios. Full article
23 pages, 10908 KB  
Article
MSF: Multi-Level Spatiotemporal Filtering for Event Denoising via Motion Estimation
by Jiuhe Wang, Kun Yu, Xinghua Xu and Nanliang Shan
Sensors 2026, 26(5), 1437; https://doi.org/10.3390/s26051437 - 25 Feb 2026
Abstract
Event cameras provide microsecond-level temporal resolution, low latency, and high dynamic range, enabling robust perception under fast motion and challenging lighting conditions. Nevertheless, event streams are susceptible to background activity, thermal noise, and hot pixels. Their sparse and irregular patterns can corrupt event structures and degrade downstream tasks. We propose MSF, a multi-level spatiotemporal filtering framework that couples motion-compensated aggregation with neighborhood-level verification. In each temporal window, MSF estimates a constant 2D optical flow by maximizing a robust, density-normalized contrast objective on the image of warped events (IWE). We further incorporate polarity–gradient decorrelation to suppress mixed-polarity noise and an explicit peak-suppression regularizer to avoid hot-pixel-induced degeneracy. The motion parameters are optimized via coarse grid initialization followed by gradient-ascent refinement. Based on the estimated motion, MSF performs hierarchical event selection: central events are extracted from high-confidence aggregated regions, local events are recovered through joint spatial–temporal–directional–polarity consistency, and weak border events are identified using a density-normalized probabilistic support model that rewards support from reliable structures while penalizing self-clustering. Experiments on four public benchmarks (DVSNOISE20, DVSMOTION20, DVSCLEAN, and E-MLB) show that MSF consistently improves the Event Structural Ratio (ESR) and outperforms representative baselines across diverse motion regimes and severe low-light noise. Full article
(This article belongs to the Special Issue Event-Driven Vision Sensor Architectures and Application Scenarios)
14 pages, 900 KB  
Article
Alignment-Free Machine Learning Serotype Classification of the Dengue Virus
by Vladimir Gajdov, Isidora Prosic, Mihaela Kavran, Filip Bosilkov, Tamas Petrovic, Jelena Konstantinov and Gospava Lazic
Viruses 2026, 18(3), 280; https://doi.org/10.3390/v18030280 - 25 Feb 2026
Abstract
Dengue virus (DENV) serotyping is essential for epidemiological surveillance, clinical risk assessment, and vaccine evaluation, as the four dengue serotypes differ in pathogenicity, immune interactions, and population dynamics. Existing subtyping methods largely rely on sequence alignment and phylogenetic inference, which can be computationally intensive and unreliable for short, fragmented, or error-prone sequences commonly generated in diagnostic and surveillance settings. There is a need for fast, alignment-free serotyping approaches that maintain high accuracy across heterogeneous sequence lengths while remaining scalable, transparent, and suitable for real-world diagnostic inputs. We demonstrate that compact 3-mer composition features are sufficient for highly accurate dengue virus serotyping when coupled with a lineage-aware Random Forest classification framework. Using 64 normalized 3-mer frequency features per sequence with ambiguity masking and enforcing strict cluster-aware validation at both 99% and 95% nucleotide identity thresholds, our approach achieved near-perfect accuracy and macro-F1 scores on held-out internal test sets. To further ensure independence, external validation datasets were filtered to remove exact sequence matches and any sequences sharing ≥99% or ≥95% nucleotide identity with internal data. On these strictly independent external datasets, the model maintained 100% accuracy and macro-F1 performance, confirming robust generalization beyond database redundancy. Robustness analyses showed stable performance under contiguous sequence truncation down to 300 bp and in the presence of ambiguous nucleotides, indicating resilience to realistic diagnostic inputs. These results demonstrate that a lightweight, alignment-free, machine learning approach can rival alignment-dependent methods while maintaining strict lineage-aware evaluation controls. 
The proposed framework combines high predictive accuracy, probabilistic reliability, computational efficiency, and reproducible validation design, making it well suited for large-scale genomic surveillance, rapid pre-screening, and diagnostic decision-support applications. Full article
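The feature-extraction step described in this abstract — 64 normalized 3-mer frequencies with ambiguity masking — is simple enough to sketch directly; the lineage-aware Random Forest classifier and validation design are the paper's contribution and are not reproduced here. `kmer_features` is an illustrative name, not the authors' code.

```python
from itertools import product

# All 64 canonical 3-mers in a fixed order, so every sequence maps to the
# same 64-dimensional feature vector.
KMERS = ["".join(p) for p in product("ACGT", repeat=3)]
KMER_INDEX = {k: i for i, k in enumerate(KMERS)}

def kmer_features(seq):
    """Normalized 3-mer frequencies with ambiguity masking.

    Windows containing any non-ACGT symbol (N, R, Y, ...) are skipped
    rather than guessed, mirroring the masking idea in the abstract.
    """
    counts = [0] * 64
    total = 0
    s = seq.upper()
    for i in range(len(s) - 2):
        idx = KMER_INDEX.get(s[i:i + 3])
        if idx is not None:          # mask windows with ambiguous bases
            counts[idx] += 1
            total += 1
    if total == 0:
        return [0.0] * 64
    return [c / total for c in counts]
```

Because the vector is short and alignment-free, it can be computed in a single pass over fragmented or error-prone reads before being handed to any downstream classifier.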
18 pages, 2183 KB  
Article
Annual Load Scenario Generation Using a Hybrid STL and Improved DDPM Approach
by Heran Kang, Hongyang Liu, Jianfei Liu, Ruichen Hao, Xiang Wang, Wenbo Hu, Jie Chen, Wei Yue, Haibo Li and Zongxiang Lu
Inventions 2026, 11(2), 21; https://doi.org/10.3390/inventions11020021 - 24 Feb 2026
Abstract
To address the limitations of existing annual load scenario generation methods, including insufficient ability to represent long-term trends, excessive randomness in generated scenarios, and inadequate consideration of special holiday conditions, in this paper, an annual load curve generation method is proposed that integrates Seasonal–Trend decomposition using Loess (STL) with an improved denoising diffusion probabilistic model (DDPM). In the proposed method, the STL algorithm is first applied to decompose the annual load curve into a trend component and a daily seasonal component. The trend component is used as a baseline to ensure that the generated load curves remain consistent with the actual long-term trend characteristics. On this basis, an improved diffusion-based denoising model is employed to achieve controllable generation of different types of daily load scenarios. Finally, the generated daily load scenarios are aggregated with the trend component on an hourly basis to construct annual load scenario curves that simultaneously preserve realistic trend behavior and stochastic fluctuations. A case study based on a city in China is used to evaluate the proposed method. The results demonstrate that both the generated daily load scenarios and annual load scenarios outperform existing benchmark methods across multiple quantitative evaluation metrics, thereby validating the effectiveness of the proposed load scenario generation approach. Full article
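The decomposition step this abstract builds on can be illustrated with a simplified additive split of an hourly load series into trend, daily-seasonal, and residual parts. Note this is a stand-in using a plain centered moving average, not STL proper (which uses Loess smoothing and robust iteration); `decompose_daily` and its `period` default are assumptions for illustration.

```python
def decompose_daily(load, period=24):
    """Split an hourly series into trend + daily seasonal + residual.

    A simplified additive decomposition: moving-average trend and a
    mean per-hour-of-day seasonal profile, as a rough stand-in for STL.
    """
    n = len(load)
    half = period // 2
    # Moving average over one full period as the trend estimate;
    # edge positions reuse the nearest full window.
    trend = []
    for i in range(n):
        lo = max(0, min(i - half, n - period))
        trend.append(sum(load[lo:lo + period]) / period)
    detrended = [x - t for x, t in zip(load, trend)]
    # Seasonal component: mean detrended value at each hour of day.
    profile = [0.0] * period
    counts = [0] * period
    for i, d in enumerate(detrended):
        profile[i % period] += d
        counts[i % period] += 1
    profile = [s / c for s, c in zip(profile, counts)]
    seasonal = [profile[i % period] for i in range(n)]
    resid = [x - t - s for x, t, s in zip(load, trend, seasonal)]
    return trend, seasonal, resid
```

In the paper's pipeline the trend component anchors the long-term shape of the generated annual curve, while the generative model resamples the daily (seasonal plus residual) variation.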
24 pages, 2751 KB  
Article
Regression Analysis Under Interval-Valued Targets as an Imprecise Classification Problem
by Lev Utkin, Stanislav Kogan, Andrei Konstantinov and Vladimir Muliukha
Algorithms 2026, 19(3), 166; https://doi.org/10.3390/a19030166 - 24 Feb 2026
Abstract
Regression analysis with interval-valued outcomes presents a fundamental challenge in modeling data where uncertainty is inherent rather than incidental. Such data, arising naturally in fields ranging from meteorology to finance, require methods that preserve information about both central tendency and dispersion. We introduce a novel class of attention-based regression models that reformulates interval-valued regression as a multiclass classification task. The key idea behind the model is in partitioning the outcome domain into basic intervals derived from training data intersections and representing each interval-valued observation as a set of feasible discrete probability distributions over these intervals. This imprecise probabilistic representation allows us to train a classification-style model by minimizing the expected log-likelihood over all consistent distributions. We propose two training algorithms: a Monte Carlo sampling approach and a more efficient joint optimization method that simultaneously updates both the constrained probability distributions and model parameters. The model incorporates a kernel-based aggregation mechanism using trainable dot-product attention, where attention weights are computed from input features but applied to the probability distributions over basic intervals. Numerical experiments with real datasets illustrate the approach. By introducing the class of attention-based models for interval-valued regression, this work offers a novel perspective on applying machine learning to uncertain data. Codes implementing the proposed models are publicly available. Full article
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)
22 pages, 1981 KB  
Article
Air Traffic Noise Prediction Method Based on Machine Learning Driven by Quick Access Recorder
by Zhixing Tang, Yijie Fan, Xuanting Chen, Xinyan Shi, Zhaolun Niu, Yuming Zhong, Meng Jia and Xiaowei Tang
Aerospace 2026, 13(3), 208; https://doi.org/10.3390/aerospace13030208 - 24 Feb 2026
Abstract
Accurate prediction of air traffic noise is critical for advancing environmentally sustainable operations in high density terminal areas. Conventional noise prediction models often exhibit significant limitations due to discrepancies between actual and nominal flight trajectories. To overcome this challenge, this study introduces a probabilistic framework that integrates real air-traffic-flow data to generate realistic flight trajectory distributions. The proposed methodology extracts key operational features—including trajectory distribution probabilities, and essential trajectory operation features—within a machine learning architecture. Furthermore, we develop a dedicated air traffic noise prediction model for clustered flight paths that explicitly incorporates traffic flow patterns, enabling high-fidelity simulation of noise propagation under actual air traffic operation. The framework is validated using a QAR (Quick Access Recorder) dataset from the terminal area of Changsha Huanghua International Airport. Experimental results demonstrate the model’s high predictive accuracy for both air traffic noise distribution and its influence, coupled with computational efficiency and practical applicability. The findings indicate that the proposed approach successfully addresses the challenge of predicting air traffic noise from divergent, real-world flight trajectories, offering a robust method for supporting noise-abatement strategies and sustainable aviation-planning initiatives. Full article
(This article belongs to the Special Issue AI, Machine Learning and Automation for Air Traffic Control (ATC))
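The abstract above describes weighting noise contributions by trajectory distribution probabilities over clustered flight paths. As a rough illustration of that idea, not the authors' model, the sketch below energy-averages per-cluster single-event noise levels at a receiver point. The function name, cluster probabilities, and dB levels are all hypothetical; the only fixed ingredient is the standard decibel energy-averaging formula.

```python
import math

def expected_noise_db(cluster_probs, cluster_levels_db):
    """Energy-average noise over clustered flight paths.

    cluster_probs: probability that a flight follows each clustered path
    cluster_levels_db: single-event noise level (dB) each cluster
        produces at the receiver point

    Decibels add in the energy domain, so levels are converted to
    energy, averaged with the path probabilities, then converted back.
    """
    assert abs(sum(cluster_probs) - 1.0) < 1e-9
    energy = sum(p * 10 ** (level / 10)
                 for p, level in zip(cluster_probs, cluster_levels_db))
    return 10 * math.log10(energy)

# Two equally likely paths at 70 dB and 60 dB: the result sits between
# the two but closer to the louder path, because of energy averaging.
print(round(expected_noise_db([0.5, 0.5], [70.0, 60.0]), 2))  # → 67.4
```

The same aggregation would run once per grid point to build a noise contour under a given traffic-flow mix.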
19 pages, 999 KB  
Article
Comparing Sexual and Gender Minority and Cisgender Heterosexual Missourians’ Breast and Colorectal Cancer Screening Prevalence: The 2022 Missouri County-Level Study
by Jane A. McElroy and Kevin D. Everett
Cancers 2026, 18(5), 729; https://doi.org/10.3390/cancers18050729 - 24 Feb 2026
Abstract
Background: Cancer screening disparities remain understudied, particularly among underrepresented groups at the county level. This study compared the use of preventive breast and colorectal cancer screening services between sexual and gender minority (SGM) adults and cisgender heterosexual adults in Missouri. Methods: The 2022 Missouri County-Level Study, a probabilistic survey of health-related behaviors in each county, was used to estimate breast and colorectal cancer (BC and CRC) screening prevalence. Screening prevalence was calculated using weighted samples, and regression models were used to adjust for demographic composition and age eligibility for both cancer sites. Results: Compared to cisgender heterosexual adults (n = 48,257), SGM adults (n = 2801) were significantly younger and more likely to reside in urban areas and be employed. Statewide, county-representative prevalence of breast cancer screening in the last 2 years was 75.6%, and colorectal cancer screening (i.e., colonoscopy in the last 10 years or sigmoidoscopy in the last 5 years) was 63.1%. In age-adjusted models for BC screening (participants ages 40–74), age had a curvilinear association, increasing at younger ages but declining in later years. For CRC screening (ages 45–75), age showed a strong, stable, positive effect. SGM adults had similar odds of breast cancer screening; for CRC, SGM adults had higher odds of ever being screened but odds of up-to-date screening similar to those of cisgender heterosexual adults. Differences largely reflect eligibility windows and initiation versus maintenance dynamics. Conclusions: In this large sample of Missouri county residents, breast cancer and colorectal cancer screening rates were comparable between SGM adults and cisgender heterosexual adults for up-to-date screening probability. Improving statewide cancer prevention will require addressing the broader structural and regional barriers that suppress screening uptake across Missouri communities.
Impact: These findings demonstrate the importance of using age-appropriate, guideline-aligned analyses to accurately assess cancer screening equity and avoid overstating disparities among SGM populations. By identifying where differences do not exist, this work helps focus resources on the structural and regional barriers that continue to limit cancer prevention for all Missourians.
(This article belongs to the Special Issue Disparities in Cancer Prevention, Screening, Diagnosis and Management)
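The prevalence figures above come from weighted samples, meaning each respondent's screening indicator counts in proportion to their survey design weight rather than equally. A minimal sketch of that arithmetic, assuming nothing about the study's actual weighting scheme (the function and example values are hypothetical):

```python
def weighted_prevalence(screened_flags, weights):
    """Survey-weighted prevalence estimate.

    screened_flags: True/False per respondent (screened or not)
    weights: survey design weight per respondent

    Returns the weight carried by screened respondents divided by
    the total weight, i.e. a weighted proportion.
    """
    total = sum(weights)
    hit = sum(w for screened, w in zip(screened_flags, weights) if screened)
    return hit / total

# Three respondents; the first carries twice the weight of the others.
print(weighted_prevalence([True, False, True], [2.0, 1.0, 1.0]))  # → 0.75
```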
33 pages, 820 KB  
Article
The Kerper–Bowron Method: A Foundational Change for Service Contract Claim Estimation and Accounting
by John Kerper and Lee Bowron
Risks 2026, 14(3), 44; https://doi.org/10.3390/risks14030044 - 24 Feb 2026
Abstract
The Kerper–Bowron Method (KB Method) is a patent-pending approach that revolutionizes service contract loss estimation and accounting by introducing precise, contract-level forecasting of expected losses and cancellations. Building on a prior 2007 paper, this update presents the Earned Contract formula, aligning with Solvency II and modern accounting standards. By leveraging a probabilistic exposure base and Generalized Linear Models, the KB Method enhances accuracy in claim and cancellation liabilities as well as other liability and asset estimates across global service contract markets. This methodology offers superior precision, automation, and compliance, redefining actuarial and financial practices for vehicle and other service contracts.
(This article belongs to the Special Issue Advances in Risk Models and Actuarial Science)
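The abstract describes a probabilistic exposure base: a contract contributes exposure in each period only with the probability that it is still in force. The sketch below shows that bookkeeping in its simplest form; it is not the patented KB Method, and the hazard rates and persistency values are hypothetical.

```python
def expected_claims(monthly_claim_rates, monthly_persistency):
    """Expected lifetime claims for one contract, weighting each
    month's claim rate by the probability the contract survives
    (is not cancelled) to that month.

    monthly_claim_rates: expected claims per in-force contract-month
    monthly_persistency: probability of remaining in force through
        each month, applied after that month's exposure accrues
    """
    in_force = 1.0  # probability the contract is still active
    total = 0.0
    for rate, keep in zip(monthly_claim_rates, monthly_persistency):
        total += in_force * rate   # exposure-weighted claim frequency
        in_force *= keep           # survival to the next month
    return total

# Two months: 1% then 2% claim rates, 90% monthly persistency.
# Month 1 contributes 0.01; month 2 contributes 0.9 * 0.02 = 0.018.
print(expected_claims([0.01, 0.02], [0.9, 0.9]))  # → 0.028
```

In a GLM setting this survival-weighted contract-month would serve as the exposure offset, replacing the cruder written- or earned-contract counts.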