Search Results (136)

Search Parameters:
Keywords = bayesian score test

19 pages, 660 KB  
Article
Molecular Autopsy by Exome Sequencing Identifies in Fraternal Twins a CARD11 p.Ser995Leu Variant Within GUK Domain
by Juan Fernández-Cadena, Edwin W. Naylor, Heidi Reinhard and Arindam Bhattacharjee
Int. J. Transl. Med. 2026, 6(1), 5; https://doi.org/10.3390/ijtm6010005 - 28 Jan 2026
Viewed by 226
Abstract
Background: We describe the post-mortem analysis of a CARD11 variant allele, p.Ser995Leu, identified in fraternal twins who died in early infancy with no identifiable cause of death. CARD11 variants through varied inheritance models can alter immune function through loss- or gain-of-function mechanisms, involving distinct protein domains; yet the significance of GUK domain variants remains poorly characterized. Twin autopsies showed non-specific findings, such as pulmonary macrophage accumulation and splenic white pulp expansion, but without infection or structural abnormalities. Methods: Whole-exome sequencing, performed as part of molecular autopsies, identified the shared CARD11 p.Ser995Leu variant, previously classified as a variant of uncertain significance (VUS). We assessed evolutionary conservation across CARD family proteins and species and predicted functional impact using in silico tools, which estimate the likelihood that a variant is deleterious. AlphaFold-based structural modeling emphasized qualitative biophysical assessment. Using epidemiological data, population allele frequency, and Bayesian ACMG variant classification, we assessed competing hypotheses under an autosomal dominant model. Results: The p.Ser995Leu substitution affects a conserved, surface-exposed β-sheet within the GUK domain. While CADD scores exceeded 20, other predictive algorithms offered only partial support of pathogenicity. Structural modeling suggested a potential GUK domain destabilization. Integrating genetic, pathologic, immunologic, and probabilistic modeling, we propose a biologically plausible model in which the variant, like other GUK variants, may alter NF-κB or other signaling pathways and is likely pathogenic. 
Conclusions: While the CARD11 p.Ser995Leu variant’s contribution to disease is uncertain without functional validation or parental testing, and phenotypic findings are non-specific, the presence of an ultra-rare GUK domain variant in both twins, combined with in silico and statistical modeling, supports its interpretation as likely pathogenic or high risk. The results highlight the challenges of data-limited post-mortem variant interpretation.
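The Bayesian ACMG classification mentioned above is typically computed by combining per-criterion evidence likelihood ratios with a prior probability of pathogenicity (the Tavtigian-style formulation). A minimal stdlib sketch; the prior and per-criterion odds below are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of Bayesian ACMG variant classification:
# combine independent evidence likelihood ratios with a prior.
# The prior and per-criterion odds are illustrative assumptions.

def posterior_pathogenicity(prior: float, evidence_odds: list[float]) -> float:
    """Posterior probability of pathogenicity from a prior and evidence odds."""
    combined = 1.0
    for lr in evidence_odds:
        combined *= lr
    prior_odds = prior / (1.0 - prior)
    post_odds = combined * prior_odds
    return post_odds / (1.0 + post_odds)

# Example: prior 0.10; two moderate criteria (~4.33) and one supporting (~2.08)
p = posterior_pathogenicity(0.10, [4.33, 4.33, 2.08])
print(f"posterior P(pathogenic) = {p:.3f}")
```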

16 pages, 1095 KB  
Article
Effects of a Modular Sleep System on Subjective Sleep Quality and Physiological Stability in Elite Athletes
by Robert Percy Marshall, Fabian Hennes, Niklas Hennecke, Thomas Stöggl, René Schwesig, Helge Riepenhof and Jan-Niklas Droste
Appl. Sci. 2026, 16(3), 1194; https://doi.org/10.3390/app16031194 - 23 Jan 2026
Viewed by 547
Abstract
Background: Sleep is a key determinant of recovery and performance in elite athletes, yet its optimization extends beyond sleep duration alone and encompasses multiple subjective and physiological dimensions. Environmental factors, including the sleep surface, represent modifiable components of sleep that may influence perceived sleep quality. This study aimed to examine whether an individually adjustable modular sleep system improves subjective sleep quality in elite athletes and whether alterations in objective sleep metrics, circadian timing, or nocturnal autonomic physiology accompany such changes. Methods: Forty-three elite athletes participated in this pre–post-intervention study (without a control group). Subjective sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI), while objective sleep and physiological parameters were recorded using a wearable device (Oura Ring, 3rd generation). Outcomes were averaged across three consecutive nights at baseline (T0) and post-intervention (T1). Baseline values were derived from the final three nights of a standardized pre-intervention monitoring period (minimum 7 nights), and post-intervention values from the final three nights following a standardized intervention exposure period (minimum 14 nights). Statistical analyses included paired frequentist tests and complementary Bayesian paired-sample analyses. Results: Subjective sleep quality improved significantly following the intervention, with a mean reduction in PSQI score of 0.67 points (p < 0.001). In contrast, no meaningful changes were observed in total sleep time (p = 0.28), REM duration (p = 0.26), circadian timing (p = 0.47), or nocturnal minimum heart rate (p = 0.42).
Conclusions: An individually adjustable sleep system thus appears able to improve perceived sleep quality in elite athletes without disrupting sleep architecture, circadian regulation, or nocturnal autonomic function. In athletes whose sleep duration and physiological sleep metrics are already near optimal, such micro-environmental interventions may offer a feasible, low-risk means of enhancing recovery by targeting subjective sleep quality, a dimension that dissociates from objective sleep measures. Optimizing the sleep surface may therefore represent a practical adjunct to existing recovery strategies in high-performance sport.
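The frequentist half of the paired analyses described above can be sketched in a few stdlib lines; the PSQI difference scores below are made up for illustration, and the complementary Bayesian paired-sample analysis is omitted:

```python
import math
from statistics import mean, stdev

# Paired t-test on within-subject differences (baseline minus post PSQI).
# The difference values are made up for illustration; a Bayesian
# paired-sample analysis (e.g., a JZS Bayes factor) would complement this.
diffs = [1, 0, 1, 1, 0, 1, 2, 0, 1, 0]

n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # df = n - 1
print(f"t({n - 1}) = {t_stat:.2f}")
```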

27 pages, 1619 KB  
Article
Uncertainty-Aware Multimodal Fusion and Bayesian Decision-Making for DSS
by Vesna Antoska Knights, Marija Prchkovska, Luka Krašnjak and Jasenka Gajdoš Kljusurić
AppliedMath 2026, 6(1), 16; https://doi.org/10.3390/appliedmath6010016 - 20 Jan 2026
Viewed by 333
Abstract
Uncertainty-aware decision-making increasingly relies on multimodal sensing pipelines that must fuse correlated measurements, propagate uncertainty, and trigger reliable control actions. This study develops a unified mathematical framework for multimodal data fusion and Bayesian decision-making under uncertainty. The approach integrates adaptive Covariance Intersection (aCI) for correlation-robust sensor fusion, a Gaussian state–space backbone with Kalman filtering, heteroskedastic Bayesian regression with full posterior sampling via an affine-invariant MCMC sampler, and a Bayesian likelihood-ratio test (LRT) coupled to a risk-sensitive proportional–derivative (PD) control law. Theoretical guarantees are provided by bounding the state covariance under stability conditions, establishing convexity of the aCI weight optimization on the simplex, and deriving a Bayes-risk-optimal decision threshold for the LRT under symmetric Gaussian likelihoods. A proof-of-concept agro-environmental decision-support application is considered, where heterogeneous data streams (IoT soil sensors, meteorological stations, and drone-derived vegetation indices) are fused to generate early-warning alarms for crop stress and to adapt irrigation and fertilization inputs. The proposed pipeline reduces predictive variance and sharpens posterior credible intervals (up to 34% narrower 95% intervals and 44% lower NLL/Brier score under heteroskedastic modeling), while a Bayesian uncertainty-aware controller achieves 14.2% lower water usage and 35.5% fewer false stress alarms compared to a rule-based strategy. The framework is mathematically grounded yet domain-independent, providing a probabilistic pipeline that propagates uncertainty from raw multimodal data to operational control actions, and can be transferred beyond agriculture to robotics, signal processing, and environmental monitoring applications.
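Covariance Intersection fuses two estimates with unknown cross-correlation via Pf⁻¹ = ω P1⁻¹ + (1 − ω) P2⁻¹, with ω chosen on [0, 1] to minimize, e.g., the fused trace. A 2×2 grid-search sketch with illustrative covariances (plain CI, not the paper's adaptive weighting):

```python
# Covariance Intersection for two 2x2 covariances with unknown correlation:
#   Pf^-1 = w * P1^-1 + (1 - w) * P2^-1,  w chosen to minimize trace(Pf).
# P1 and P2 below are illustrative.

def inv2(m):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def ci_fuse(P1, P2):
    """Grid-search w in [0, 1] minimizing the trace of the fused covariance."""
    I1, I2 = inv2(P1), inv2(P2)
    best = None
    for k in range(101):
        w = k / 100
        Iw = [[w * I1[i][j] + (1 - w) * I2[i][j] for j in range(2)]
              for i in range(2)]
        Pf = inv2(Iw)
        tr = Pf[0][0] + Pf[1][1]
        if best is None or tr < best[0]:
            best = (tr, w, Pf)
    return best

trace, w, Pf = ci_fuse([[4.0, 0.0], [0.0, 1.0]], [[1.0, 0.0], [0.0, 4.0]])
print(f"w = {w:.2f}, trace(Pf) = {trace:.2f}")
```

By symmetry of the two example covariances, the optimal weight is 0.5 and the fused trace (3.2) beats either input alone (trace 5.0).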
(This article belongs to the Section Probabilistic & Statistical Mathematics)

32 pages, 999 KB  
Article
A Robust Hybrid Metaheuristic Framework for Training Support Vector Machines
by Khalid Nejjar, Khalid Jebari and Siham Rekiek
Algorithms 2026, 19(1), 70; https://doi.org/10.3390/a19010070 - 13 Jan 2026
Viewed by 160
Abstract
Support Vector Machines (SVMs) are widely used in critical decision-making applications, such as precision agriculture, due to their strong theoretical foundations and their ability to construct an optimal separating hyperplane in high-dimensional spaces. However, the effectiveness of SVMs is highly dependent on the efficiency of the optimization algorithm used to solve their underlying dual problem, which is often complex and constrained. Classical solvers, such as Sequential Minimal Optimization (SMO) and Stochastic Gradient Descent (SGD), present inherent limitations: SMO ensures numerical stability but lacks scalability and is sensitive to heuristics, while SGD scales well but suffers from unstable convergence and limited suitability for nonlinear kernels. To address these challenges, this study proposes a novel hybrid optimization framework based on Open Competency Optimization and Particle Swarm Optimization (OCO–PSO) to enhance the training of SVMs. The proposed approach combines the global exploration capability of PSO with the adaptive competency-based learning mechanism of OCO, enabling efficient exploration of the solution space, avoidance of local minima, and strict enforcement of dual constraints on the Lagrange multipliers. Across multiple datasets spanning medical (diabetes), agricultural yield, signal processing (sonar and ionosphere), and imbalanced synthetic data, the proposed OCO-PSO–SVM consistently outperforms classical SVM solvers (SMO and SGD) as well as widely used classifiers, including decision trees and random forests, in terms of accuracy, macro-F1-score, Matthews correlation coefficient (MCC), and ROC-AUC. On the Ionosphere dataset, OCO-PSO achieves an accuracy of 95.71%, an F1-score of 0.954, and an MCC of 0.908, matching the accuracy of random forest while offering superior interpretability through its kernel-based structure. 
In addition, the proposed method yields a sparser model with only 66 support vectors compared to 71 for standard SVC (a reduction of approximately 7%), while strictly satisfying the dual constraints with a near-zero violation of 1.3×10⁻³. Notably, the optimal hyperparameters identified by OCO-PSO (C = 2, γ ≈ 0.062) differ substantially from those obtained via Bayesian optimization for SVC (C = 10, γ ≈ 0.012), indicating that the proposed approach explores alternative yet equally effective regions of the hypothesis space. The statistical significance and robustness of these improvements are confirmed through extensive validation using 1000 bootstrap replications, paired Student’s t-tests, Wilcoxon signed-rank tests, and Holm–Bonferroni correction. These results demonstrate that the proposed metaheuristic hybrid optimization framework constitutes a reliable, interpretable, and scalable alternative for training SVMs in complex and high-dimensional classification tasks.
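The PSO half of such a hybrid can be sketched on a toy box-constrained problem standing in for the SVM dual's 0 ≤ α ≤ C constraints. This is plain PSO, not the paper's OCO–PSO, and the objective, bounds, and coefficients are illustrative:

```python
import random

# Plain PSO minimizing a toy quadratic under box constraints 0 <= x_i <= C,
# standing in for the box-constrained SVM dual. Illustrative only.
random.seed(1)

def objective(x):
    return sum((xi - 1.0) ** 2 for xi in x)  # known minimum at x = (1, 1)

def pso(dim=2, C=2.0, n_particles=10, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, C) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (g[d] - pos[i][d]))
                # clip to the box so the constraints stay satisfied
                pos[i][d] = min(C, max(0.0, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(g):
                    g = pos[i][:]
    return g, objective(g)

best, val = pso()
print(f"best = {best}, f = {val:.6f}")
```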

24 pages, 1788 KB  
Article
Uncertainty-Aware Machine Learning for NBA Forecasting in Digital Betting Markets
by Matteo Montrucchio, Enrico Barbierato and Alice Gatti
Information 2026, 17(1), 56; https://doi.org/10.3390/info17010056 - 8 Jan 2026
Viewed by 706
Abstract
This study introduces a fully uncertainty-aware forecasting framework for NBA games that integrates team-level performance metrics, rolling-form indicators, and spatial shot-chart embeddings. The predictive backbone is a recurrent neural network equipped with Monte Carlo dropout, yielding calibrated sequential probabilities. The model is evaluated against strong baselines including logistic regression, XGBoost, convolutional models, a GRU sequence model, and both market-only and non-market-only benchmarks. All experiments rely on strict chronological partitioning (train ≤ 2022, validation 2023, test 2024), ablation tests designed to eliminate any circularity with bookmaker odds, and cross-season robustness checks spanning 2012–2024. Predictive performance is assessed through accuracy, Brier score, log-loss, AUC, and calibration metrics (ECE/MCE), complemented by SHAP-based interpretability to verify that only pre-game information influences predictions. To quantify economic value, calibrated probabilities are fed into a frictionless betting simulator using fractional-Kelly staking, an expected-value threshold, and bootstrap-based uncertainty estimation. Empirically, the uncertainty-aware model delivers systematically better calibration than non-Bayesian baselines and benefits materially from the combination of shot-chart embeddings and recent-form features. Economic value emerges primarily in less-efficient segments of the market: The fused predictor outperforms both market-only and non-market-only variants on moneylines, while spreads and totals show limited exploitable edge, consistent with higher pricing efficiency. Sensitivity studies across Kelly multipliers, EV thresholds, odds caps, and sequence lengths confirm that the findings are robust to modelling and decision-layer perturbations. 
The paper contributes a reproducible, decision-focused framework linking uncertainty-aware prediction to economic outcomes, clarifying when predictive lift can be monetized in NBA markets, and outlining methodological pathways for improving robustness, calibration, and execution realism in sports forecasting.
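The fractional-Kelly staking rule used by the betting simulator is essentially a one-liner; the probability, odds, and Kelly fraction below are illustrative:

```python
def kelly_stake(p: float, decimal_odds: float, fraction: float = 0.25) -> float:
    """Fractional-Kelly stake as a share of bankroll.

    Full Kelly for a binary bet at net odds b = decimal_odds - 1:
        f* = (b * p - (1 - p)) / b
    Scaled by `fraction`; a negative edge means no bet.
    """
    b = decimal_odds - 1.0
    f_star = (b * p - (1.0 - p)) / b
    return max(0.0, fraction * f_star)

# Model says 55% at even money (decimal odds 2.0), quarter Kelly:
print(f"{kelly_stake(0.55, 2.0):.4f}")  # -> 0.0250
```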

38 pages, 7841 KB  
Article
Bayesian-Optimized Explainable AI for CKD Risk Stratification: A Dual-Validated Framework
by Jianbo Huang, Bitie Lan, Zhicheng Liao, Donghui Zhao and Mengdi Hou
Symmetry 2026, 18(1), 81; https://doi.org/10.3390/sym18010081 - 3 Jan 2026
Viewed by 583
Abstract
Chronic kidney disease (CKD) impacts more than 850 million people globally, yet existing machine learning methodologies for risk stratification encounter substantial challenges: computationally intensive hyperparameter tuning, model opacity that conflicts with clinical interpretability standards, and class imbalance leading to systematic prediction bias. We constructed an integrated architecture that combines XGBoost with Optuna-driven Bayesian optimization, evaluated against 19 competing hyperparameter tuning approaches and tested on CKD patients using dual-paradigm statistical validation. The architecture delivered 93.43% accuracy, 93.13% F1-score, and 97.59% ROC-AUC—representing gains of 6.22 percentage points beyond conventional XGBoost and 7.0–26.8 percentage points compared to 20 baseline algorithms. Tree-structured Parzen Estimator optimization necessitated merely 50 trials compared to 540 for grid search and 1069 for FLAML, whereas Boruta feature selection accomplished 54.2% dimensionality reduction with no performance compromise. Over 30 independent replications, the model exhibited remarkable stability (cross-validation standard deviation: 0.0121, generalization gap: −1.13%) alongside convergent evidence between frequentist and Bayesian paradigms (all p < 0.001, mean CI-credible interval divergence < 0.001, effect sizes d = 0.665–5.433). Four separate explainability techniques (SHAP, LIME, accumulated local effects, Eli5) consistently identified CKD stage and albumin-creatinine ratio as principal predictors, aligning with KDIGO clinical guidelines. Clinical utility evaluation demonstrated 98.4% positive case detection at 50% screening threshold alongside near-optimal calibration (mean absolute error: 0.138), while structural equation modeling revealed hyperuricemia (β = −3.19, p < 0.01) as the most potent modifiable risk factor. 
This dual-validated architecture demonstrates that streamlined hyperparameter optimization combined with convergent multi-method interpretability enables precise CKD risk stratification with clinical guideline alignment, supporting evidence-informed screening protocols.
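The trial-budget comparison above (50 TPE trials vs. 540 grid points) comes down to how a tuner spends evaluations. As a simplified stand-in for TPE, a stdlib random-search sketch over one hyperparameter with an illustrative objective:

```python
import random

# Plain random search over one hyperparameter, as a simplified stand-in
# for TPE-style sequential tuning. Objective and budget are illustrative.
random.seed(0)

def validation_loss(x: float) -> float:
    return (x - 2.0) ** 2  # pretend the optimum is at x = 2

best_x, best_loss = None, float("inf")
for trial in range(50):  # fixed trial budget, cf. the paper's 50 TPE trials
    x = random.uniform(-5.0, 5.0)
    loss = validation_loss(x)
    if loss < best_loss:
        best_x, best_loss = x, loss

print(f"best x = {best_x:.3f}, loss = {best_loss:.4f}")
```

TPE improves on this by proposing new trials from a density model of past good trials rather than uniformly, which is how it reaches comparable optima in far fewer evaluations than grid search.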

34 pages, 5399 KB  
Article
Improving Individual and Regional Rainfall–Runoff Modeling in North American Watersheds Through Feature Selection and Hyperparameter Optimization
by Bahareh Ghanati and Joan Serra-Sagristà
Mathematics 2025, 13(23), 3828; https://doi.org/10.3390/math13233828 - 29 Nov 2025
Cited by 1 | Viewed by 540
Abstract
Precise rainfall-runoff modeling (RRM) is vital for disaster management, resource conservation, and mitigation. Recent deep learning-based methods, such as long short-term memory (LSTM) networks, often struggle with major challenges, including temporal sensitivity, feature selection, generalizability, and hyperparameter tuning. The objective of this study is to develop an accurate and generalizable rainfall–runoff modeling framework that addresses the four aforementioned challenges. We propose a novel RRM framework that integrates transductive LSTM (TLSTM) to capture fine-grained temporal changes, off-policy proximal policy optimization (PPO) combined with Shapley Additive exPlanations (SHAP)-based reward functions for feature selection, an enhanced generative adversarial network (GAN) for online data augmentation, and Bayesian optimization hyperband (BOHB) for efficient hyperparameter tuning. TLSTM uses transductive learning, where samples near the test point are given extra weight, to capture fine-grained temporal shifts. Off-policy PPO contributes to this process by selecting features sensitive to temporal patterns in RRM. Our improved GAN conducts online data augmentation by excluding some gradients, increasing diversity and relevance in synthetic data. Finally, BOHB accelerates hyperparameter tuning by merging Bayesian optimization with the scaling efficiency of Hyperband. We evaluate our model using the Comprehensive Attributes and Meteorology for Large-Sample Studies (CAMELS) dataset under individual and regional scenarios. It achieves Nash–Sutcliffe efficiency (NSE) scores of 0.588 and 0.873, surpassing the baseline scores of 0.548 and 0.830, respectively. The generalizability of our approach was assessed on the hydro-climatic datasets for North America (HYSETS), also yielding improved performance. These improvements indicate more accurate capture of flow dynamics and peak events, supporting a robust and interpretable framework for RRM. Full article

17 pages, 10990 KB  
Article
Study of Intelligent Identification of Radionuclides Using a CNN–Meta Deep Hybrid Model
by Xiangting Meng, Ziyi Wang, Yu Sun, Zhihao Dong, Xiaoliang Liu, Huaiqiang Zhang and Xiaodong Wang
Appl. Sci. 2025, 15(22), 12285; https://doi.org/10.3390/app152212285 - 19 Nov 2025
Viewed by 587
Abstract
The rapid and accurate identification of radionuclides and the quantitative analysis of their activities have long been key research areas in the field of nuclear spectrum data processing. Traditional nuclear spectrum analysis methods heavily rely on manual feature extraction, making them highly susceptible to interference from factors such as energy resolution, calibration drift, and spectral peak overlap when dealing with complex mixed-radionuclide spectra, ultimately leading to degraded identification performance and accuracy. Based on multi-nuclide energy spectral data acquired via Geant4 simulation, this study compares the performance of partial least squares regression (PLSR), random forest (RF), a convolutional neural network (CNN), and a hybrid CNN–Meta model for radionuclide identification and quantitative activity analysis under conditions of raw energy spectra, Z-score normalization, and min-max normalization. To maximize the potential of each model, principal component selection, Bayesian hyperparameter optimization, iteration tuning, and meta-learning optimization were employed. Model performance was comprehensively evaluated using the coefficient of determination (R²), root mean square error (RMSE), mean relative error (MRE), and computational time. The results demonstrate that deep learning models can effectively capture nonlinear relationships within complex energy spectra, enabling accurate radionuclide identification and activity quantification. Specifically, the CNN achieved a globally optimal test RMSE of 0.00566 and an R² of 0.999 with raw energy spectra. CNN–Meta exhibited superior adaptability and generalization under min-max normalization, reducing test error by 70.8% compared to RF, while requiring only 49% of the total computation time of the CNN model.
RF was relatively insensitive to preprocessing but yielded higher absolute errors, whereas PLSR was limited by its linear nature and failed to capture the nonlinear characteristics of complex energy spectra. In conclusion, the CNN–Meta hybrid model demonstrates superior performance in both accuracy and efficiency, providing a reliable and effective approach for the rapid identification of radionuclides and quantitative analysis of activity in complex energy spectra.
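The preprocessing variants compared above (Z-score and min-max normalization) are one-liners; a stdlib sketch on illustrative channel counts:

```python
from statistics import mean, pstdev

def zscore(xs):
    """Standardize to zero mean, unit (population) standard deviation."""
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s for x in xs]

def minmax(xs):
    """Rescale linearly into [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

counts = [120.0, 80.0, 200.0, 160.0]  # illustrative spectrum-channel counts
print(minmax(counts))
print(zscore(counts))
```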

21 pages, 2788 KB  
Article
Gaussian Process-Based Multi-Fidelity Bayesian Optimization for Optimal Calibration Point Selection
by Hua Zhuo, Jungang Ma, Mei Yang, Yikun Zhao, Lifang Yao, Yan Xu and Kun Yang
Sensors 2025, 25(22), 7030; https://doi.org/10.3390/s25227030 - 18 Nov 2025
Viewed by 1128
Abstract
Temperature and humidity calibration chambers, which provide controlled environments for instrument testing and validation, are widely applied in the aerospace and biomedicine fields. However, traditional fixed calibration points fail to adapt to complex operational requirements and exhibit problems including a limited coverage range and low efficiency. To address these challenges, this study develops a Gaussian Process-based Multi-Fidelity Bayesian Optimization (GP-MFBO) framework for optimal selection of temperature and humidity calibration points. The framework integrates the following three key components: (1) a three-layer progressive multi-fidelity modeling system comprising physical analytical models, computational fluid dynamics (CFD) numerical simulations, and experimental verification; (2) a systematic uncertainty quantification system covering model uncertainty, parameter uncertainty, and observation uncertainty; and (3) an adaptive acquisition function that balances uncertainty penalty mechanisms and multi-fidelity information gain evaluation. The experimental results demonstrate that the proposed GP-MFBO method achieves optimal calibration point combinations with a temperature uniformity score of 0.149 and humidity uniformity score of 2.38, approaching theoretical optimal solutions within 4.5% and 3.6%, respectively. Compared to standard Gaussian process, Co-Kriging, two-stage optimization, polynomial regression, and traditional single-fidelity methods, GP-MFBO achieves uniformity score improvements of up to 81.7% and 76.3% for temperature and humidity, respectively. The prediction confidence interval coverage reaches 94.2%, outperforming all comparative methods. This research provides a rigorous theoretical foundation and technical solution for the scientific design and reliable operation of large-space temperature and humidity calibration systems.
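At the core of any GP-based Bayesian optimizer is an acquisition function evaluated on the GP posterior. A stdlib sketch of vanilla expected improvement for minimization (not the paper's adaptive multi-fidelity acquisition); the posterior mean/std numbers are illustrative:

```python
import math

def expected_improvement(mu: float, sigma: float, best: float) -> float:
    """EI for minimization at a point with GP posterior N(mu, sigma^2),
    given the best objective value observed so far."""
    if sigma <= 0.0:
        return max(0.0, best - mu)
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))      # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (best - mu) * cdf + sigma * pdf

# Candidate whose posterior mean ties the incumbent: EI = sigma * pdf(0)
print(f"{expected_improvement(1.0, 0.5, 1.0):.4f}")  # -> 0.1995
```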
(This article belongs to the Special Issue Intelligent Sensor Calibration: Techniques, Devices and Methodologies)

14 pages, 406 KB  
Article
Cognitive Flexibility Predicts Live-Fire Rifle Marksmanship in Airborne Cadets: A Pilot Study
by Dariusz Jamro, John A. Dewey, Grzegorz Żurek, Rui Lucena and Maciej Lachowicz
Brain Sci. 2025, 15(11), 1150; https://doi.org/10.3390/brainsci15111150 - 27 Oct 2025
Viewed by 730
Abstract
Background: Executive functions may underpin performance in live-fire tasks, whereas evidence for global physical fitness is mixed. We quantified the associations between cognitive flexibility (CF), inhibitory control (IC), overall physical fitness, and rifle marksmanship in cadets, and derived a parsimonious predictive model. Methods: Twenty second-year male airborne cadets (mean age 21.7 ± 2.2 years) completed a live-fire Basic Rifle Marksmanship (BRM) qualification (40 targets at 50–300 m); the Color Trails Test (CTT-1 and CTT-2; interference index) to index CF and processing speed; a stop-signal–style task (CogniFit) to assess IC indexed by NO-GO accuracy and GO-trial response time; and the Army Combat Fitness Test (ACFT). Associations were examined with Spearman correlations. Multiple linear regression with backward elimination and Bayesian model comparison evaluated predictive models. Results: Faster CTT-2 performance was associated with higher BRM scores (ρ = −0.48, p = 0.032), with a similar association for CTT-1 (ρ = −0.46, p = 0.042). The best-fitting regression model included CTT-2 time and IC accuracy (adjusted R² = 0.345; RMSE = 7.03), with CTT-2 time the only significant predictor of BRM (b = −0.330, p = 0.006). Bayesian model comparison independently favored a parsimonious CTT-2-only model (P(M|data) = 0.222; BF_M = 5.41; BF_10 = 1.00; R² = 0.352). ACFT scores were not significantly associated with BRM. Conclusions: CF and processing speed are key correlates of live-fire rifle marksmanship in cadets, suggesting value in integrating executive-function elements into marksmanship training. Replication in larger cohorts is warranted.
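For tie-free data, the Spearman correlations reported above reduce to the classic rank-difference formula; a sketch on made-up CTT-2 times and marksmanship scores:

```python
def spearman_rho(x, y):
    """Spearman's rho via the rank-difference formula (assumes no ties):
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    n = len(x)

    def ranks(v):
        order = sorted(range(n), key=lambda i: v[i])
        r = [0] * n
        for rank, i in enumerate(order):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Illustrative CTT-2 completion times (s) and marksmanship scores:
times = [61, 75, 52, 88, 70]
scores = [30, 24, 33, 20, 27]
print(spearman_rho(times, scores))  # -> -1.0 (perfectly anticorrelated toy data)
```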
(This article belongs to the Section Cognitive, Social and Affective Neuroscience)

17 pages, 2230 KB  
Article
The Reassuring Absence of Acute Stress Effects on IQ Test Performance
by Osman Akan, Mustafa Yildirim and Oliver T. Wolf
J. Intell. 2025, 13(10), 131; https://doi.org/10.3390/jintelligence13100131 - 19 Oct 2025
Viewed by 2877
Abstract
Acute stress impairs executive functions, and these higher-order cognitive processes are often positively associated with intelligence. Even though intelligence is generally stable over time, performance in an intelligence test can be influenced by a variety of factors, including psychological processes like motivation or attention. For instance, test anxiety has been shown to correlate with individual differences in intelligence test performance, and theoretical accounts exist for causality in both directions. However, the potential impact of acute stress before or during an intelligence test remains elusive. Here, in a research context, we investigated the effects of test anxiety and acute stress as well as their interaction on performance in the short version of the Intelligence Structure Test 2000 in its German version (I-S-T 2000 R). Forty male participants completed two sessions scheduled 28 days apart, with the order counterbalanced across participants. In both sessions, participants underwent either the socially evaluated cold-pressor test (SECPT) or a non-stressful control procedure, followed by administration of I-S-T 2000 R (parallelized versions on both days). The SECPT is a widely used laboratory paradigm that elicits a stress response through the combination of psychosocial and physical components. Trait test anxiety scores were obtained via the German Test Anxiety Inventory (TAI-G). Stress induction was successful as indicated by physiological and subjective markers, including salivary cortisol concentrations. We applied linear mixed models to investigate the effects of acute stress (elicited by our stress manipulation) and test anxiety on the intelligence quotient (IQ). The analysis revealed that neither factor had a significant effect, nor was there a significant interaction between them. Consistent with these findings, Bayesian analyses provided evidence supporting the absence of these effects. 
Notably, IQ scores increased significantly from the first to the second testing day. These results suggest that neither test anxiety nor acute stress significantly impacts intelligence test performance. However, the improvement due to repeated testing calls for caution when retesting, both in scientific and clinical settings. Full article
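The retest analysis described above can be illustrated in miniature. Below is a minimal sketch that tests a practice (retest) effect on simulated scores, using a paired t-test as a simplified stand-in for the linear mixed models the study applied; all numbers are invented and are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 40  # sample size matching the study; the values below are simulated

# Hypothetical IQ scores: session 2 includes a small retest gain,
# mimicking the practice effect reported in the abstract.
session1 = rng.normal(100, 15, n)
session2 = session1 + rng.normal(3, 5, n)

t, p = stats.ttest_rel(session2, session1)
print(f"practice effect: t = {t:.2f}, p = {p:.4f}")
```

A full analysis would instead fit a mixed model with session, stress condition, and test anxiety as fixed effects and participant as a random effect; the paired test above only captures the within-subject session comparison.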
(This article belongs to the Section Contributions to the Measurement of Intelligence)
27 pages, 6859 KB  
Article
An Explainable Machine Learning Framework for the Hierarchical Management of Hot Pepper Damping-Off in Intensive Seedling Production
by Zhaoyuan Wang, Kaige Liu, Longwei Liang, Changhong Li, Tao Ji, Jing Xu, Huiying Liu and Ming Diao
Horticulturae 2025, 11(10), 1258; https://doi.org/10.3390/horticulturae11101258 - 17 Oct 2025
Viewed by 1061
Abstract
Facility-based cultivation is the main form of production in the global vegetable industry. As an important vegetable crop, hot peppers are easily threatened by many diseases in the facility microclimate. Traditional disease detection methods are time-consuming and allow disease to proliferate, so timely detection and inhibition of disease development have become a focus of agricultural practice worldwide. This article proposes a generalizable and explainable machine learning model for hot pepper damping-off in intensive seedling production while maintaining high model accuracy. Using Kalman-filter smoothing, SMOTE-ENN resampling of imbalanced samples, feature selection, and other data preprocessing methods, 19 baseline models were developed for prediction. After statistical testing of the results, a Bayesian optimization algorithm was used for hyperparameter tuning of the five best-performing models, and the Extremely Randomized Trees (ET) model was identified as the most suitable for this research scenario. The model achieved an F1-score of 0.9734 and an AUC of 0.9969 in predicting the severity of hot pepper damping-off, and explainability analysis was carried out with SHAP (SHapley Additive exPlanations). Based on these results, hierarchical management strategies for different severity levels are interpreted. Combined with the model's front-end visualization interface, this helps farmers anticipate disease trends and precisely regulate environmental factors during seedling raising, which is of great significance for disease prevention and control and for reducing the impact of disease on hot pepper growth and development. Full article
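The SMOTE resampling step mentioned in this abstract can be sketched in a few lines. The following is a minimal, pure-NumPy illustration of SMOTE-style oversampling, which synthesizes new minority-class samples by interpolating between a minority point and one of its nearest minority neighbors. It is a didactic sketch with an invented function name, not the SMOTE-ENN pipeline used in the paper (SMOTE-ENN additionally cleans the result with edited nearest neighbors).

```python
import numpy as np

def smote_like(X_min, n_new, k=3, seed=0):
    """Minimal SMOTE-style oversampling: interpolate between a minority
    sample and one of its k nearest minority neighbors (sketch only)."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nn = np.argsort(d)[1:k + 1]        # k nearest, skipping the point itself
        j = rng.choice(nn)
        lam = rng.random()                 # interpolation weight in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(smote_like(X_min, 3, k=2, seed=1).shape)
```

Because each synthetic point lies on a segment between two minority samples, the new points stay inside the bounding box of the minority class.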
(This article belongs to the Special Issue New Trends in Smart Horticulture)
29 pages, 1325 KB  
Article
Digital Stratigraphy—A Pattern Analysis Framework Integrating Computer Forensics, Criminology, and Forensic Archaeology for Crime Scene Investigation
by Romil Rawat, Hitesh Rawat, Mandakini Ingle, Anjali Rawat, Anand Rajavat and Ashish Dibouliya
Forensic Sci. 2025, 5(4), 48; https://doi.org/10.3390/forensicsci5040048 - 17 Oct 2025
Viewed by 1558
Abstract
Background/Objectives—Traditional forensic investigations often analyze digital, physical, and criminological evidence separately, leading to fragmented timelines and reduced accuracy in reconstructing complex events. To address these gaps, this study proposes the Digital Stratigraphy Framework (DSF), inspired by archaeological stratigraphy, to integrate heterogeneous evidence into structured, temporally ordered layers. DSF aims to reduce asynchronous inconsistencies, minimize false associations, and enhance interpretability across digital, behavioral, geospatial, and excavation evidence. Methods—DSF employs Hierarchical Pattern Mining (HPM) to detect recurring behavioral patterns and Forensic Sequence Alignment (FSA) to synchronize evidence layers temporally and contextually. The framework was tested on the CSI-DS2025 dataset containing 25,000 multimodal, stratified records, including digital logs, geospatial data, criminological reports, and excavation notes. Evaluation used 10-fold cross-validation, Bayesian hyperparameter tuning, and structured train-validation-test splits. Metrics included accuracy, precision, recall, F1-score, and Stratigraphic Reconstruction Consistency (SRC), alongside ablation and runtime assessments. Results—DSF achieved 92.6% accuracy, 93.1% precision, 90.5% recall, 91.3% F1-score, and an SRC of 0.89, outperforming baseline models. False associations were reduced by 18%, confirming effective cross-layer alignment and computational efficiency. Conclusions—By applying stratigraphic principles to forensic analytics, DSF enables accurate, interpretable, and legally robust evidence reconstruction. The framework establishes a scalable foundation for real-time investigative applications and multi-modal evidence integration, offering significant improvements over traditional fragmented approaches. Full article
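The core idea behind the framework's layer alignment, merging heterogeneous time-stamped evidence streams into one temporally ordered sequence, can be sketched simply. Below is a minimal illustration using `heapq.merge` on three already-sorted streams; the stream contents and event names are invented, and this is not the paper's Forensic Sequence Alignment algorithm.

```python
import heapq
from datetime import datetime

# Hypothetical evidence streams (all entries invented for illustration),
# each already sorted by timestamp.
digital = [("2025-01-01T09:00", "login"), ("2025-01-01T09:45", "file_delete")]
geospatial = [("2025-01-01T09:20", "phone at site A")]
excavation = [("2025-01-01T09:30", "layer 2 disturbed")]

def parse(stream, kind):
    """Attach a stream label and parse timestamps for chronological merging."""
    return [(datetime.fromisoformat(t), kind, e) for t, e in stream]

timeline = list(heapq.merge(parse(digital, "digital"),
                            parse(geospatial, "geospatial"),
                            parse(excavation, "excavation")))
for t, kind, event in timeline:
    print(t.time(), kind, event)
```

`heapq.merge` preserves the sorted order of each input while interleaving them, which is the basic operation needed before any cross-layer consistency check.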
(This article belongs to the Special Issue Feature Papers in Forensic Sciences)
22 pages, 315 KB  
Article
Associations Between Psychological Coping Skills and Player Behaviors During Transition Moments in Male Youth Football
by Francisco Pires, Maria Inês Vigário, Sandra S. Ferreira and António Vicente
Sports 2025, 13(10), 363; https://doi.org/10.3390/sports13100363 - 13 Oct 2025
Viewed by 1539
Abstract
Sport performance results from the interaction of tactical, technical, physiological and psychological factors, but psychological aspects are often minimized or analyzed in a decontextualized manner. This exploratory pilot study aimed to contribute to the development of a diagnostic framework that links individual behaviors during football attack–defense transition moments (ADT) with psychological attributes. Twenty male U14 players were assessed across five official matches regarding their ADT performance indicators. The Athletic Coping Skills Inventory (ACSI-28) and the Resilience Scale (RS) were applied during the competition. Statistical analyses included correlation tests and Bayesian analysis. Players showed a significant tendency to sustain ball recovery behaviors after possession loss (p = 0.004). Psychological resilience and athletic coping skills varied substantially between individuals, with no positional differences, and RS scores were significantly below the high-resilience threshold (147; p = 0.013). A moderate positive correlation emerged between RS Factor 1 and the ACSI-28 subscale “Coping with Adversity” (r = 0.574, p = 0.008). Posterior distributions provide exploratory signals suggesting possible positive associations between two psychological constructs and individual ADT behaviors: “Concentration” in relation to the maintenance of recovery actions (Mode = 0.439; 95% CI [0.030, 0.721]) and “Goal Setting” in relation to the rapid initiation of recovery actions (Mode = 0.465; 95% CI [0.059, 0.734]). Nevertheless, Bayes Factors favored the null model overall, indicating that these signals are weak and require replication. By contrast, most psychological constructs, including resilience, showed no reliable evidence of correlation with recovery-related actions. 
The findings highlight the need to further research the integration of psychological assessment into football performance diagnostics, while also indicating that psychological factors alone are insufficient to fully explain youth players’ individual ADT behaviors. Full article
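A rough feel for the correlation-plus-Bayes-factor analysis can be given in a few lines. The sketch below computes a Pearson correlation on simulated data and a BIC-based approximation to the Bayes factor BF01 (Wagenmakers, 2007); the abstract does not state which Bayes factor the authors computed, so this approximation is our assumption, and the data are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 20                      # sample size matching the study; data are simulated
resilience = rng.normal(130, 10, n)
coping = 0.5 * resilience + rng.normal(0, 10, n)

r, p = stats.pearsonr(resilience, coping)

# BIC-based approximation of BF01 (evidence for H0: no correlation),
# following Wagenmakers (2007); shared constants cancel in the difference.
bic_h1 = n * np.log(1 - r**2) + np.log(n)   # H1 has one extra parameter
bic_h0 = 0.0
bf01 = np.exp((bic_h1 - bic_h0) / 2)
print(f"r = {r:.3f}, p = {p:.3f}, BF01 = {bf01:.3f}")
```

BF01 greater than 1 favors the null model, matching the pattern the abstract reports for most constructs; values well below 1 would favor a genuine association.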
23 pages, 3359 KB  
Article
Capsule Neural Networks with Bayesian Optimization for Pediatric Pneumonia Detection from Chest X-Ray Images
by Szymon Salamon and Wojciech Książek
J. Clin. Med. 2025, 14(20), 7212; https://doi.org/10.3390/jcm14207212 - 13 Oct 2025
Cited by 1 | Viewed by 926
Abstract
Background: Pneumonia in children poses a serious threat to life and health, making early detection critically important. In this regard, artificial intelligence methods can provide valuable support. Methods: Capsule networks and Bayesian optimization are modern techniques that were employed to build effective models for predicting pneumonia from chest X-ray images. The medical images underwent essential preprocessing, were divided into training, validation, and testing sets, and were subsequently used to develop the models. Results: The designed capsule neural network model with Bayesian optimization achieved the following final results: an accuracy of 95.1%, sensitivity of 98.9%, specificity of 85.4%, precision (PPV) of 94.8%, negative predictive value (NPV) of 96.2%, F1-score of 96.8%, and a Matthews correlation coefficient (MCC) of 0.877. In addition, the model was complemented with an explainability analysis using Grad-CAM, which demonstrated that its predictions rely predominantly on clinically relevant pulmonary regions. Conclusions: The proposed model demonstrates high accuracy and shows promise for potential use in clinical practice. It may also be applied to other tasks in medical image analysis. Full article
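The Matthews correlation coefficient reported here (0.877) can be computed directly from the confusion-matrix counts. Below is a minimal pure-NumPy sketch, illustrative only and not the authors' evaluation code:

```python
import numpy as np

def mcc(y_true, y_pred):
    """Matthews correlation coefficient from binary labels (sketch)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    denom = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(mcc([1, 1, 0, 0], [1, 1, 0, 0]))  # perfect prediction -> 1.0
```

Unlike accuracy, MCC accounts for all four confusion-matrix cells, which is why it is a useful summary alongside sensitivity and specificity on the imbalanced pediatric X-ray data described above.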
(This article belongs to the Special Issue Artificial Intelligence and Deep Learning in Medical Imaging)