Search Results (6,265)

Search Parameters:
Keywords = evaluation of prediction tools

30 pages, 2728 KB  
Article
Supervisory Monitoring and Control Using Chemical Process Simulators and SCADA Systems
by Rebecca Bastos Boschoski and Lizandro de Sousa Santos
Methane 2026, 5(1), 8; https://doi.org/10.3390/methane5010008 - 5 Feb 2026
Abstract
A digital twin (DT) is an automation strategy that integrates a physical plant with an adaptive, real-time simulation environment, with bidirectional communication between them. In process engineering, DTs promise real-time monitoring, prediction of future conditions, predictive maintenance, process optimization, and control. Dashboards for process monitoring are becoming increasingly relevant for tracking key metrics and supervising industrial units in real time. Supervisory Control and Data Acquisition (SCADA) systems are widely used for process automation; ScadaBR is an open-source, freely licensed SCADA platform. This work presents the development of a computational tool that integrates Aspen HYSYS/Python with the ScadaBR system for real-time monitoring and supervision of dynamic models. The virtual plant, which replicates the system’s physical behavior, was connected to the SCADA platform via the Modbus protocol, enabling bidirectional data exchange between the simulated model and the supervisory interface. The system supports operational analysis and control strategy validation. Two case studies were analyzed: (i) a simplified catalytic hydrocracking process, implemented in the Python environment, and (ii) a heat exchanger network process, simulated using the HYSYS simulator. In the second case, the process was dynamically simulated, with real-time monitoring of a simple dynamic indicator that correlates the feed methane concentration with heat transfer fluids. The results demonstrate the feasibility and applicability of the proposed approach for educational purposes, operator training, and process engineering validation, fostering a more realistic and interactive simulation environment. Furthermore, the results show that the tool is promising for dynamic monitoring of environmental and energy indices, demonstrating that methane consumption relative to process feed can be evaluated and controlled over time. Full article
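The entry above describes bidirectional data exchange between a simulated plant and a SCADA supervisor over Modbus. As a rough, hypothetical sketch only (not the authors' tool), the snippet below shows how a Python-side model could publish a computed indicator to, and read a setpoint back from, a Modbus TCP server of the kind ScadaBR polls. It assumes the pymodbus package (3.x import path; older releases use pymodbus.client.sync), and the register addresses, scaling, and update rate are invented; exact call signatures vary between pymodbus releases.

```python
# Hypothetical sketch: push a simulated indicator to a Modbus TCP server and
# read back an operator setpoint. Register addresses (0, 10) are made up;
# import path assumes pymodbus >= 3.x (older releases use pymodbus.client.sync).
import time
import random
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("127.0.0.1", port=502)  # address of the Modbus server
client.connect()

for step in range(10):
    # Stand-in for one step of the dynamic model (e.g., a methane-per-feed index).
    indicator = 50 + 10 * random.random()
    # Write the indicator (scaled to an integer register) for the SCADA side to plot.
    client.write_register(0, int(indicator * 100))
    # Read a setpoint the operator may have changed in the supervisory interface.
    response = client.read_holding_registers(10, count=1)
    if not response.isError():
        setpoint = response.registers[0] / 100.0
        print(f"step {step}: indicator={indicator:.2f}, setpoint={setpoint:.2f}")
    time.sleep(1.0)

client.close()
```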
21 pages, 1480 KB  
Article
Early Detection of Chronic Kidney Disease in Men Using Lifestyle and Demographic Indicators: A Machine Learning Approach for Primary Healthcare Settings
by Mc Neil Valencia, Jun Kim, Zeeshan Abbas and Seung Won Lee
Healthcare 2026, 14(3), 405; https://doi.org/10.3390/healthcare14030405 - 5 Feb 2026
Abstract
Background/Objective: Chronic kidney disease (CKD) is a major global health concern associated with significant morbidity, mortality, and healthcare burden. This study aimed to develop an explainable machine learning framework that integrates lifestyle, sociodemographic, and biochemical factors for early CKD risk prediction among middle-aged men using public health survey data. Methods: Data from 968 male participants were preprocessed by removing missing values, deriving eGFR and ACR, and labeling CKD status. Five machine learning algorithms, (i.e., Random Forest, AdaBoost, Naïve Bayes, SVM, and XGBoost) were trained and evaluated using accuracy, precision, recall, and F1-score. Model interpretability was assessed using SHAP, LIME, Boruta, and Pearson’s correlation analyses. Results: AdaBoost yielded the best performance (accuracy = 0.7258, F1 = 0.6457, recall = 0.6923), with robust generalization confirmed by the precision–recall curve (AP = 0.715). SHAP and LIME revealed that serum creatinine, blood urea nitrogen, urinary creatinine, and age were major predictors, whereas lifestyle and metabolic indicators such as BMI, sodium and sugar intake, and sleep duration emerged as secondary factors for CKD. Conclusions: This study demonstrates the effectiveness of an explainable machine learning model that integrates lifestyle, sociodemographic and biochemical data for early CKD prediction among middle-aged men. The AdaBoost-based framework shows strong potential for implementation as a clinical decision-support tool within EHR systems and may contribute to personalized and preventive interventions. It emphasizes the growing importance of modifiable behaviors in kidney disease development and supports future work involving multiple cohorts and temporal model expansion to improve risk stratification for individuals at risk of kidney disease. Full article
(This article belongs to the Special Issue Digital Health and AI for Chronic Disease Control and Management)
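For readers who want to reproduce the kind of evaluation reported above (accuracy, precision, recall, F1, and an average-precision check on the precision–recall curve), here is a minimal scikit-learn sketch. It runs on synthetic stand-in data, not the study's survey cohort, and the class balance is an assumption.

```python
# Minimal sketch of an AdaBoost classifier with the metrics named in the
# abstract (accuracy, precision, recall, F1, average precision).
# Data are synthetic stand-ins, not the public health survey used by the authors.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, average_precision_score)

X, y = make_classification(n_samples=968, n_features=20, weights=[0.7, 0.3],
                           random_state=0)  # imbalanced toy data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y,
                                          random_state=0)

model = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
proba = model.predict_proba(X_te)[:, 1]

print("accuracy ", accuracy_score(y_te, pred))
print("precision", precision_score(y_te, pred))
print("recall   ", recall_score(y_te, pred))
print("F1       ", f1_score(y_te, pred))
print("avg. precision (PR curve)", average_precision_score(y_te, proba))
```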
12 pages, 241 KB  
Article
Assessment of the Effectiveness of Pharmaceutical Advice in Selected Digestive Disorders: Perspectives of Patients and Pharmacists as Part of a Pilot “Minor Ailments” Service
by Piotr Merks, Urszula Religioni, Régis Vaillancourt, Dariusz Świetlik, Katarzyna Plagens-Rotman, Ewelina Drelich, Mariola Borowska, Piotr Bromber, Justyna Kaźmierczak, Eliza Blicharska, Paweł Piatkiewicz, Aneta Królak-Ulińska, Radosław Sierpiński, Sebastian Sikorski and Zbigniew Doniec
Diseases 2026, 14(2), 59; https://doi.org/10.3390/diseases14020059 - 5 Feb 2026
Abstract
Introduction: Minor digestive ailments are a common reason for individuals to visit pharmacies, and can be efficiently managed through structured pharmaceutical advice. This study aimed to evaluate the effectiveness of advice provided by pharmacists in community pharmacies from the perspectives of both patients and pharmacists. The primary focus of the study was not on assessing the effectiveness of a specific medication, but rather on the pharmaceutical advice provided. Materials and Methods: This prospective multicenter observational study was conducted between January and March 2025 in community pharmacies across Poland among adult patients with dyspepsia without alarm symptoms and included two visits: an initial visit and a follow-up phone call after 7–14 days. Symptom severity across seven domains was assessed using a GSRS-based tool, and data on adherence, treatment regimen, patient satisfaction, and acceptable costs of the two-visit service were collected. Statistical analyses (p < 0.05) using both parametric and non-parametric tests were performed on data from 100 participants who completed the study, with cost data serving as a proxy for willingness to pay. Results: Most patients (92.7%) reported symptom improvement, with a median time to relief of 3 days and good treatment adherence. The greatest benefits were observed for abdominal pain and flatulence, and higher baseline symptom severity was consistently associated with greater improvement. Service acceptability was high, and patients’ reported willingness to pay suggests perceived value and potential economic feasibility of the service. Conclusions: Structured pharmaceutical advice for digestive ailments (including triage, education, management plans, and monitoring of effects) led to rapid and clinically significant improvements in most patients. This approach demonstrates high adherence rates and positive acceptability. The stability of effects across different demographic groups, along with a predictable pattern of changes in various domains, supports the expansion of this service and customization of educational messages. Full article
21 pages, 7173 KB  
Article
Influence of Fines Content and Particle Shape on Limiting Void Ratios of Sand Mixtures: DEM and AI Approaches
by Weichao Tang, Xiaoli Zhu, Zhehao Zhu, Huaqiao Zhong and Xiufeng Zhang
Buildings 2026, 16(3), 661; https://doi.org/10.3390/buildings16030661 - 5 Feb 2026
Abstract
The mechanical behavior of sand–fines mixtures is governed by their limiting void ratios, which are sensitive to fines content and particle morphology. Conventional empirical correlations often fail to generalize to a wide range of soils, limiting their applicability in engineering design. This study develops an integrated approach combining laboratory calibration, discrete element method (DEM) simulations incorporating realistic particle morphologies and machine learning to predict maximum and minimum void ratios. Glass beads were first tested to validate DEM contact parameters, after which sand particles obtained through 3D scanning were employed to capture morphological effects. Correlation and partial least squares analyses confirmed fines content as the dominant factor, while particle shape also contributed to packing behavior. A fully connected neural network (FCNN) was trained to establish predictive relationships, demonstrating closer agreement with DEM simulations than traditional empirical formulations. The proposed approach provides a reliable and generalizable tool for evaluating packing characteristics and offers new insights into the role of particle morphology in the mechanical response of sand–fines mixtures. Full article
(This article belongs to the Section Building Materials, and Repair & Renovation)
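As a loose illustration of the fully connected neural network step described above, and not the authors' DEM-calibrated model, the sketch below fits a small multi-output regressor mapping fines content and a generic shape descriptor to maximum and minimum void ratios. The input features and the toy data-generating rule are assumptions.

```python
# Sketch: small fully connected network predicting (e_max, e_min) from fines
# content and a particle-shape descriptor. The synthetic data rule below is an
# assumption for illustration only, not the paper's DEM results.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
fines = rng.uniform(0.0, 0.4, 300)           # fines content (fraction)
shape = rng.uniform(0.6, 0.95, 300)          # roundness-like descriptor (assumed)
e_max = 0.90 - 0.50 * fines + 0.10 * (1 - shape) + rng.normal(0, 0.01, 300)
e_min = 0.55 - 0.35 * fines + 0.05 * (1 - shape) + rng.normal(0, 0.01, 300)

X = np.column_stack([fines, shape])
Y = np.column_stack([e_max, e_min])
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

fcnn = make_pipeline(StandardScaler(),
                     MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                                  random_state=0))
fcnn.fit(X_tr, Y_tr)
print("R^2 on held-out samples:", fcnn.score(X_te, Y_te))
```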
27 pages, 10049 KB  
Review
Cardiovascular CT in Bicuspid Aortic Valve Disease: A State-of-the-Art Narrative Review of Advances, Clinical Integration, and Future Directions
by Muhammad Ali Jawed, Cagri Ayhan, Robert Byrne, Sandeep Singh Hothi, Sherif Sultan, Mark Spence and Osama Soliman
J. Clin. Med. 2026, 15(3), 1268; https://doi.org/10.3390/jcm15031268 - 5 Feb 2026
Abstract
Bicuspid Aortic Valve (BAV) disease is recognized as the most common congenital heart condition and is frequently associated with complex valvular and aortic disorders. Cardiovascular computed tomography (CT) has become essential for diagnosing BAV, planning procedures, and evaluating patients after treatment. This is largely due to CT’s high spatial resolution and its ability to perform volume imaging effectively. This review provides an up-to-date overview of the increasing role of cardiovascular CT in the management of bicuspid aortic valve (BAV). It covers various aspects, including BAV morphology, optimal sizing for transcatheter aortic valve replacement (TAVR), and post-procedural monitoring. We highlight significant innovations, such as supra-annular sizing techniques and artificial intelligence (AI)-guided analysis, that position CT at the nexus of anatomy, function, and targeted treatment. Additionally, we address controversies concerning inconsistencies in sizing algorithms, recent classification challenges, and radiation exposure. Future development areas include AI predictive tools, radiomic phenotyping, and CT-guided precision medicine. This synthesis aims to provide clinicians and researchers with a high-level guide to the clinical integration of cardiovascular CT and its future in the BAV population. This review provides the most current, comprehensive synthesis on the pivotal role of cardiovascular CT in BAV management, offering a roadmap for integrating advanced imaging into clinical practice and guiding future research priorities. Full article
(This article belongs to the Special Issue Advances in Cardiovascular Computed Tomography (CT))
19 pages, 4153 KB  
Review
Imaging and Artificial Intelligence in Forensic Reconstruction and PMI/PMSI Estimation of Human Remains in Terrestrial and Aquatic Contexts
by Alessia Leggio, Ricardo Ortega-Ruiz and Giulia Iacobellis
Forensic Sci. 2026, 6(1), 13; https://doi.org/10.3390/forensicsci6010013 - 5 Feb 2026
Abstract
The application of advanced imaging techniques, particularly computed tomography (CT), photogrammetric scanning, and three-dimensional reconstructions of body surfaces and skeletal remains, is becoming a crucial component of Forensic Anthropology. These tools enable a non-invasive and highly standardized analysis of both intact cadavers and human remains recovered from terrestrial or aquatic environments, providing reliable support in identification processes, traumatological reconstruction, and the assessment of taphonomic processes. In the context of estimating the Post-Mortem Interval (PMI) and the Post-Mortem Submersion Interval (PMSI), digital imaging allows for the objective and reproducible documentation of morphological changes associated with decomposition, saponification, skeletonization, and taphonomic patterns specific to the recovery environment. Specifically, CT enables the precise assessment of gas accumulation, transformations in residual soft tissues, and structural bone modifications, while photogrammetry and 3D reconstructions facilitate the longitudinal monitoring of transformative processes in both terrestrial and underwater contexts. These observations enhance the reliability of PMI/PMSI estimates through integrated models that combine morphometric, taphonomic, and environmental data. Beyond PMI/PMSI estimation, imaging techniques play a central role in anthropological bioprofiling, facilitating the estimation of age, sex, and stature, the analysis of dental characteristics, and the evaluation of antemortem or perimortem trauma, including damage caused by terrestrial or aquatic fauna. Three-dimensional documentation also provides a permanent, shareable archive suitable for comparative analyses, ensuring transparency and reproducibility in investigations. Although not a complete substitute for traditional autopsy or anthropological examination, imaging serves as an essential complement, particularly in cases where the integrity of remains must be preserved or where environmental conditions hinder the direct handling of osteological material. Future directions include the development of AI-based predictive models for PMI/PMSI estimation using automated analysis of post-mortem changes, greater standardization of imaging protocols for aquatic remains, and the use of digital sensors and multimodal techniques to characterize microstructural alterations not detectable by the naked eye. The integration of high-resolution imaging and advanced analytical algorithms promises to further enhance the reconstructive accuracy and interpretative capacity of Forensic Anthropology. Full article
24 pages, 662 KB  
Article
Quality-by-Design Compounding of Semisolids Using an Electronic Mortar and Pestle Device for Compounding Pharmacies: Uniformity, Stability, and Cleaning
by Hudson Polonini, Carolina Schettino Kegele, Savvas Koulouridas and Marcone Augusto Leal de Oliveira
Pharmaceutics 2026, 18(2), 205; https://doi.org/10.3390/pharmaceutics18020205 - 4 Feb 2026
Abstract
Background/Objectives: Manual preparation of semisolid formulations (creams, ointments, gels) is prone to variability in mixing energy and time, which may compromise uniform API distribution. This study aimed to evaluate an Electronic Mortar and Pestle (EMP; Unguator™) as a standardized compounding tool, with objectives to: (i) validate stability-indicating UHPLC methods; (ii) assess content uniformity across jar strata; (iii) quantify the impact of mixing time and rotation speed via design of experiments (DOE); and (iv) verify cleaning effectiveness and cross-contamination risk. Methods: Five representative formulations were compounded: urea 40%, clobetasol 0.05%, diclofenac 2.5% in hyaluronic acid 3% gel, urea 10% + salicylic acid 1%, and hydroquinone 5%. UHPLC methods were validated per ICH Q2(R2) and stress-tested under acid, base, oxidative, thermal, and UV conditions. Homogeneity was assessed by stratified sampling (top/middle/bottom). A 3² factorial DOE (time: 2/6/10 min; speed: 600/1500/2400 rpm) modeled effects on % label claim and RSD. Cleaning validation employed hydroquinone as a tracer, with swab sampling pre-/post-use and post-sanitization analyzed by HPLC. Results: All UHPLC methods met specificity, linearity, precision, accuracy, and sensitivity criteria and were stability-indicating (Rs ≥ 1.5). Formulations achieved 90–110% label claim with strata CV ≤ 5%. DOE revealed speed as the dominant factor for clobetasol, urea, and diclofenac, while time was more influential for salicylic acid; gels exhibited curvature, indicating diminishing returns at high rpm. Model-predicted optima were implementable on the Unguator™ with minor rounding of rpm/time. Cleaning validation confirmed post-sanitization residues below LOQ and <10 ppm acceptance. Conclusions: The Unguator™ provides a practical, parameter-controlled route for compounding pharmacies to standardize semisolid preparations, achieving reproducible layer-to-layer content uniformity within predefined criteria under the evaluated conditions through programmable set-points and validated cycles. DOE-derived rpm–time relationships define an operational design space within the studied ranges and support selection of implementable device settings and set-points. Importantly, the DOE-derived “optima” in this study are optimized for assay-based content uniformity (mean % label claim and strata variability). Cleaning validation supports a closed, low-cross-contamination workflow, facilitating consistent routines for both routine and complex formulations. Overall, the work implements selected QbD elements (QTPP: Quality Target Product Profile; CQA: Critical Quality Attribute definition; CPP: Critical Process Parameter identification; operational design space; and a proposed control strategy) and should be viewed as a step toward broader lifecycle QbD implementation in compounding. Full article
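To make the 3² factorial design named in the abstract concrete, the sketch below enumerates the nine time × speed runs and fits a quadratic response surface to a hypothetical % label-claim response. The response values are invented placeholders, not the study's measurements, and the optimum search is only illustrative.

```python
# Sketch of the 3x3 (3^2) factorial layout named in the abstract and a
# quadratic response-surface fit. The % label-claim values are placeholders.
import itertools
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

times = [2, 6, 10]            # min
speeds = [600, 1500, 2400]    # rpm
runs = np.array(list(itertools.product(times, speeds)), dtype=float)

# Hypothetical responses (% label claim) for the nine runs, for illustration.
label_claim = np.array([93.0, 96.5, 98.0,
                        95.0, 99.0, 100.5,
                        96.0, 100.0, 101.0])

surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(runs, label_claim)

# Predict over a grid to locate an implementable optimum (nearest device set-point).
grid = np.array(list(itertools.product(np.linspace(2, 10, 9),
                                       np.linspace(600, 2400, 10))))
best = grid[np.argmax(surface.predict(grid))]
print("predicted best set-point: %.1f min at %.0f rpm" % (best[0], best[1]))
```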
17 pages, 1778 KB  
Article
Differentiating Borderline from Malignant Ovarian-Adnexal Tumours: A Multimodal Predictive Approach Joining Clinical, Analytic, and MRI Parameters
by Lledó Cabedo, Carmen Sebastià, Meritxell Munmany, Adela Saco, Eduardo Gallardo, Olatz Sáenz de Argandoña, Gonzalo Peón, Josep Lluís Carrasco and Carlos Nicolau
Cancers 2026, 18(3), 516; https://doi.org/10.3390/cancers18030516 - 4 Feb 2026
Abstract
Objectives: To improve the differentiation of borderline ovarian-adnexal tumours (BOTs) from malignant ovarian-adnexal masses, most of which fall into the indeterminate O-RADS MRI 4 category, by developing a multimodal predictive model that integrates clinical, analytic, and MRI parameters. Methods: This retrospective, single-centre study included 248 women who underwent standardised MRI for ovarian-adnexal mass characterisation between 2019 and 2024. Of these, 201 had true ovarian-adnexal masses (114 benign, 22 borderline, and 65 malignant), confirmed by histopathology or stability after ≥12-month follow-up. Forty-one clinical, laboratory, and imaging variables were initially assessed, and after a bivariate evaluation, 18 final predictors with clinical relevance were selected for model construction with thresholds learned from the data. A classification and regression tree (CART) model (“Full Model”) was applied as a second-stage tool after O-RADS MRI scoring, using 10-fold cross-validation to prevent overfitting. A pruned “Simplified Model” was also derived to enhance interpretability. Results: O-RADS MRI performed well at the extremes (scores 2–3 and 5) but showed limited discrimination between BOTs and malignancies within category 4 (PPV for borderline = 0.50). The decision-tree models significantly improved diagnostic performance, increasing overall accuracy from 0.856 with O-RADS MRI alone to 0.905 (Simplified Model) and 0.955 (Full Model). The PPV for BOTs within the intermediate O-RADS MRI 4 category increased from 0.49 with O-RADS MRI alone to 0.77 and 0.90 with the simplified and full models, respectively, while maintaining high accuracy for benign and malignant lesions. Conclusions: In this retrospective single-centre cohort, the addition of an interpretable rule-based predictive model as a second-line tool within O-RADS MRI category 4 was associated with improved discrimination between borderline and invasive malignant ovarian-adnexal tumours. These findings suggest that multimodal integration of clinical, laboratory, and MRI features may help refine risk stratification in indeterminate cases; however, external validation in prospective multicentre cohorts is required before clinical implementation. Full article
(This article belongs to the Special Issue Gynecological Cancer: Prevention, Diagnosis, Prognosis and Treatment)
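As a generic illustration of the second-stage CART step described above, rather than the study's fitted tree, the sketch below trains a full and a depth-limited ("simplified") decision tree with 10-fold cross-validation on synthetic stand-in predictors; the feature count and class balance are assumptions.

```python
# Sketch: CART classifiers with 10-fold cross-validation, echoing the two-stage
# workflow described in the abstract. Data are synthetic, not the study cohort.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Toy stand-in for 18 clinical/laboratory/MRI predictors and a 3-class outcome
# (benign / borderline / malignant).
X, y = make_classification(n_samples=201, n_features=18, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

full_tree = DecisionTreeClassifier(random_state=0)
simplified_tree = DecisionTreeClassifier(max_depth=3, random_state=0)  # "pruned"

for name, tree in [("full", full_tree), ("simplified", simplified_tree)]:
    scores = cross_val_score(tree, X, y, cv=10, scoring="accuracy")
    print(f"{name} model: mean 10-fold accuracy = {scores.mean():.3f}")
```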
29 pages, 4499 KB  
Article
Surrogate-Assisted Many-Objective Optimization of Injection Molding: Effects of Objective Selection and Sampling Density
by T. Marques, J. B. Melo, A. J. Pontes and A. Gaspar-Cunha
Appl. Sci. 2026, 16(3), 1578; https://doi.org/10.3390/app16031578 - 4 Feb 2026
Abstract
In injection molding, advanced numerical modeling tools, such as Moldex3D, can significantly improve product development by optimizing part functionality, structural integrity, and material efficiency. However, the complex and nonlinear interdependencies between the several decision variables and objectives, considering the various operational phases, constitute a challenge to the inherent complexity of injection molding processes. This complexity often exceeds the capacity of conventional optimization methods, necessitating more sophisticated analytical approaches. Consequently, this research aims to evaluate the potential of integrating intelligent algorithms, specifically the selection of objectives using Principal Component Analysis and Mutual Information/Clustering, metamodels using Artificial Neural Networks, and optimization using Multi-Objective Evolutionary Algorithms, to manage and solve complex, real-world injection molding problems effectively. Using surrogate modeling to reduce computational costs, the study systematically investigates multiple methodological approaches, algorithmic configurations, and parameter-tuning strategies to enhance the robustness and reliability of predictive and optimization outcomes. The research results highlight the significant potential of data-mining methodologies, demonstrating their ability to capture and model complex relationships among variables accurately and to optimize conflicting objectives efficiently. In due course, the enhanced capabilities provided by these integrated data-mining techniques result in substantial improvements in mold design, process efficiency, product quality, and overall economic viability within the injection molding industry. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
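The objective-selection step mentioned above (PCA over a set of candidate objectives) can be sketched generically as follows. The objective matrix is random placeholder data rather than Moldex3D output, and the 95% retained-variance threshold is an assumption, so this shows only the shape of the idea, not the authors' pipeline.

```python
# Sketch: reduce a many-objective matrix with PCA and report how many
# components retain most of the variance. The objective values are random
# placeholders, not simulation outputs.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_designs, n_objectives = 200, 8        # sampled process settings x objectives
objectives = rng.normal(size=(n_designs, n_objectives))
# Make one objective largely redundant with another, as often happens in practice.
objectives[:, 3] = 0.9 * objectives[:, 0] + 0.1 * rng.normal(size=n_designs)

Z = StandardScaler().fit_transform(objectives)
pca = PCA().fit(Z)
cumulative = np.cumsum(pca.explained_variance_ratio_)
kept = int(np.searchsorted(cumulative, 0.95) + 1)   # 95% threshold is an assumption
print("components needed for 95% variance:", kept)
print("loadings of first component:", np.round(pca.components_[0], 2))
```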
16 pages, 794 KB  
Article
Development and Validation of the Low Sit–High Step Test for Assessing Lower-Extremity Function in Sarcopenia
by Serpil Demir, Burak Elçin, Ramazan Mert, İbrahim Kök, Onur Öz, Ethem Kavukçu and Nilüfer Balcı
Diagnostics 2026, 16(3), 480; https://doi.org/10.3390/diagnostics16030480 - 4 Feb 2026
Abstract
Objectives: This study aimed to evaluate the validity, reliability, and diagnostic accuracy of the Low Sit–High Step (LS–HS) Test as an original, cost-effective, and clinically practical tool for assessing lower-extremity muscle strength and function, with a specific focus on its sensitivity in detecting early-stage sarcopenia. Methods: This cross-sectional study included 205 participants divided into four groups: probable sarcopenia, sarcopenia, and two control groups (young and middle-to-older adults). The LS–HS Test was compared across groups and against standard assessments to evaluate its efficacy in measuring lower-extremity function. Reliability was verified through Cronbach’s alpha and ICC. Multinomial logistic regression was used to determine the test’s predictive power, while ROC analysis assessed its diagnostic accuracy for sarcopenia screening. Results: The LS–HS scores were significantly higher in participants with probable sarcopenia and sarcopenia (p< 0.05). Multinomial logistic regression revealed that the LS–HS performance was a significant predictor of both probable sarcopenia and sarcopenia (p < 0.001). The test demonstrated excellent internal consistency (Cronbach’s α = 0.938) and very high inter-rater and test–retest reliability (ICC = 0.998). ROC analysis confirmed high diagnostic accuracy in distinguishing both probable sarcopenia (AUC = 0.768) and sarcopenia (AUC = 0.704) (all p< 0.01). Conclusions: The LS–HS Test is a valid, reliable, and sensitive tool for assessing lower-extremity functional capacity. Its ability to identify early functional decline, often manifesting before significant muscle mass loss, positions it as an effective alternative to traditional assessments in routine clinical practice, particularly for the early detection and monitoring of the sarcopenia spectrum. Full article
(This article belongs to the Section Clinical Diagnosis and Prognosis)
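For the reliability and diagnostic-accuracy statistics cited above, a minimal sketch follows: Cronbach's alpha from its standard definition and an ROC AUC via scikit-learn, both computed on invented item scores and labels rather than the LS–HS study data.

```python
# Sketch: Cronbach's alpha (standard formula) and ROC AUC for a screening score.
# The item scores and sarcopenia labels below are invented for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

def cronbach_alpha(items: np.ndarray) -> float:
    """items: array of shape (n_subjects, n_items)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=205)
items = latent[:, None] + rng.normal(scale=0.4, size=(205, 4))   # 4 correlated test items
labels = (latent + rng.normal(scale=0.8, size=205) > 0.5).astype(int)  # toy "sarcopenia"
total_score = items.sum(axis=1)

print("Cronbach's alpha:", round(cronbach_alpha(items), 3))
print("ROC AUC of total score:", round(roc_auc_score(labels, total_score), 3))
```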
23 pages, 2643 KB  
Article
Rapid Monitoring and Quantification of Primary and Secondary Oxidative Markers in Edible Oils During Deep Frying Using Near-Infrared Spectroscopy and Chemometrics
by Taha Mehany, José M. González-Sáiz and Consuelo Pizarro
Foods 2026, 15(3), 557; https://doi.org/10.3390/foods15030557 - 4 Feb 2026
Abstract
Background: Oxidative degradation during deep frying negatively affects the nutritional quality and stability of edible oils. Rapid, non-destructive methods to monitor oxidation, particularly in antioxidant-enriched oils, are therefore of growing interest. Materials and Methods: This study investigates the potential of near-infrared (NIR) spectroscopy combined with chemometric modeling—specifically the Stepwise Decorrelation of Variables (SELECT) algorithm and Ordinary Least Squares (OLS) regression—to quantitatively assess oxidation dynamics in edible oils enriched with hydroxytyrosol extract from olive fruit during deep frying. Extra virgin olive oil, virgin olive oil, refined olive oil, refined sunflower oil, and high-oleic sunflower oil were evaluated under controlled thermal degradation conditions. Results: Variable selection identified key NIR spectral regions related to acidity, conjugated dienes (K232), secondary oxidation indices (K270 and ΔK), peroxide value (PV), anisidine value (AnV), and the total oxidation (TOTOX) index. From 700 measured wavelengths, a limited number were sufficient for robust prediction (16–30 wavelengths depending on the parameter), with critical sensitivity observed around 1792 nm and 1392 nm. The optimized NIR–SELECT–OLS models showed strong predictive performance across oil types (R2 > 0.90; explained variance > 85%). Conclusions: The results demonstrate that hydroxytyrosol enrichment enhances the oxidative and nutritional stability of edible oils during deep frying. Moreover, the integration of NIR spectroscopy with chemometric modeling provides an effective, non-destructive tool for real-time monitoring of oil oxidation, supporting sustainable quality control, process optimization, and antioxidant fortification in functional edible oils. Full article
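As a generic stand-in for the SELECT-plus-OLS step described above (the actual Stepwise Decorrelation of Variables algorithm is not reproduced here), the sketch below runs scikit-learn's sequential forward selection over simulated "spectra" and fits an ordinary least-squares model on the chosen channels. The spectra, the response, and the reduced problem size (200 channels with 8 selected, instead of 700 with 16–30) are all assumptions made to keep the toy example fast.

```python
# Sketch: forward wavelength selection followed by OLS, a rough analogue of the
# NIR-SELECT-OLS workflow in the abstract. Spectra and the response (e.g., a
# peroxide value) are simulated; SequentialFeatureSelector is a generic
# substitute for the authors' SELECT algorithm.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 200
spectra = rng.normal(size=(n_samples, n_wavelengths))
informative = [30, 95, 150]                      # hypothetical informative channels
response = spectra[:, informative] @ np.array([2.0, -1.5, 1.0]) \
           + rng.normal(0, 0.3, n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, response, random_state=0)

selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=8,
                                     direction="forward", cv=5)
selector.fit(X_tr, y_tr)
cols = selector.get_support(indices=True)

ols = LinearRegression().fit(X_tr[:, cols], y_tr)
print("selected channel indices:", cols)
print("R^2 on held-out samples:", round(ols.score(X_te[:, cols], y_te), 3))
```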
37 pages, 975 KB  
Review
Wearable Biosensing and Machine Learning for Data-Driven Training and Coaching Support
by Rubén Madrigal-Cerezo, Natalia Domínguez-Sanz and Alexandra Martín-Rodríguez
Biosensors 2026, 16(2), 97; https://doi.org/10.3390/bios16020097 - 4 Feb 2026
Abstract
Background: Artificial Intelligence (AI) and Machine Learning (ML) are increasingly integrated into sport and exercise through wearable biosensing systems that enable continuous monitoring and data-driven training adaptation. However, their practical value for coaching depends on the validity of biosensor data, the robustness of analytical models, and the conditions under which these systems have been empirically evaluated. Methods: A structured narrative review was conducted using Scopus, PubMed, Web of Science, and Google Scholar (2010–2026), synthesising empirical and applied evidence on wearable biosensing, signal processing, and ML-based adaptive training systems. To enhance transparency, an evidence map of core empirical studies was constructed, summarising sensing modalities, cohort sizes, experimental settings (laboratory vs. field), model types, evaluation protocols, and key outcomes. Results: Evidence from field and laboratory studies indicates that wearable biosensors can reliably capture physiological (e.g., heart rate variability), biomechanical (e.g., inertial and electromyographic signals), and biochemical (e.g., sweat lactate and electrolytes) markers relevant to training load, fatigue, and recovery, provided that signal quality control and calibration procedures are applied. ML models trained on these data can support training adaptation and recovery estimation, with improved performance over traditional workload metrics in endurance, strength, and team-sport contexts when evaluated using athlete-wise or longitudinal validation schemes. Nevertheless, the evidence map also highlights recurring limitations, including sensitivity to motion artefacts, inter-session variability, distribution shift between laboratory and field settings, and overconfident predictions when contextual or psychosocial inputs are absent. Conclusions: Current empirical evidence supports the use of AI-driven biosensor systems as decision-support tools for monitoring and adaptive training, but not as autonomous coaching agents. Their effectiveness is bounded by sensor reliability, appropriate validation protocols, and human oversight. The most defensible model emerging from the evidence is human–AI collaboration, in which ML enhances precision and consistency in data interpretation, while coaches retain responsibility for contextual judgement, ethical decision-making, and athlete-centred care. Full article
(This article belongs to the Special Issue Wearable Sensors for Precise Exercise Monitoring and Analysis)
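The athlete-wise validation scheme highlighted above can be made concrete with scikit-learn's GroupKFold, as sketched below on synthetic wearable-style features; the feature meanings, athlete grouping, and fatigue label are invented for illustration.

```python
# Sketch: athlete-wise cross-validation with GroupKFold, so no athlete appears
# in both training and test folds. Features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_sessions, n_features, n_athletes = 600, 12, 30
X = rng.normal(size=(n_sessions, n_features))     # e.g., HRV, IMU, sweat-lactate summaries
athlete_id = rng.integers(0, n_athletes, n_sessions)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, n_sessions) > 0).astype(int)  # toy "high fatigue"

model = RandomForestClassifier(n_estimators=200, random_state=0)
cv = GroupKFold(n_splits=5)
scores = cross_val_score(model, X, y, cv=cv, groups=athlete_id, scoring="roc_auc")
print("athlete-wise ROC AUC per fold:", np.round(scores, 3))
```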
35 pages, 7867 KB  
Article
Inter-Comparison of Deep Learning Models for Flood Forecasting in Ethiopia’s Upper Awash Basin
by Girma Moges Mengistu, Addisu G. Semie, Gulilat T. Diro, Natei Ermias Benti, Emiola O. Gbobaniyi and Yonas Mersha
Water 2026, 18(3), 397; https://doi.org/10.3390/w18030397 - 3 Feb 2026
Abstract
Flood events driven by climate variability and change pose significant risks for socio-economic activities in the Awash Basin, necessitating advanced forecasting tools. This study benchmarks five deep learning (DL) architectures, Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Bidirectional LSTM (BiLSTM), and a Hybrid CNN–LSTM, for daily discharge forecasting for the Hombole catchment in the Upper Awash Basin (UAB) using 40 years of hydrometeorological observations (1981–2020). Rainfall, lagged discharge, and seasonal indicators were used as predictors. Model performance was evaluated against two baseline approaches, a conceptual HBV rainfall–runoff model as well as a climatology, using standard and hydrological metrics. Of the two baselines (climatology and HBV), the climatology showed limited skill with large bias and negative NSE, whereas the HBV model achieved moderate skill (NSE = 0.64 and KGE = 0.82). In contrast, all DL models substantially improved predictive performance, achieving test NSE values above 0.83 and low overall bias. Among them, the Hybrid CNN–LSTM provided the most balanced performance, combining local temporal feature extraction with long-term memory and yielding stable efficiency (NSE ≈ 0.84, KGE ≈ 0.90, and PBIAS ≈ −2%) across flow regimes. The LSTM and GRU models performed comparably, offering strong temporal learning and robust daily predictions, while BiLSTM improved flood timing through bidirectional sequence modeling. The CNN captured short-term variability effectively but showed weaker representation of extreme peaks. Analysis of peak-flow metrics revealed systematic underestimation of extreme discharge magnitudes across all models. However, a post-processing flow-regime classification based on discharge quantiles demonstrated high extreme-event detection skill, with deep learning models exceeding 89% accuracy in identifying extreme-flow occurrences on the test set. These findings indicate that, while magnitude errors remain for rare floods, DL models reliably discriminate flood regimes relevant for early warning. Overall, the results show that deep learning models provide clear improvements over climatology and conceptual baselines for daily streamflow forecasting in the UAB, while highlighting remaining challenges in peak-flow magnitude prediction. The study indicates promising results for the integration of deep learning methods into flood early-warning workflows; however, these results could be further improved by adopting a probabilistic forecasting framework that accounts for model uncertainty. Full article
(This article belongs to the Section Hydrology)
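The skill scores quoted above (NSE, KGE, PBIAS) follow standard definitions, which the short sketch below implements in NumPy on toy observed and simulated discharge series; the series are placeholders, not Hombole gauge data.

```python
# Sketch: standard hydrological skill scores used in the abstract.
# NSE   = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
# KGE   = 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2),
#         with r = correlation, alpha = std(sim)/std(obs), beta = mean(sim)/mean(obs)
# PBIAS = 100 * sum(sim - obs) / sum(obs)
import numpy as np

def nse(obs, sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def pbias(obs, sim):
    return 100 * np.sum(sim - obs) / np.sum(obs)

# Toy daily discharge series standing in for observed and forecast flows.
rng = np.random.default_rng(0)
obs = 50 + 20 * np.sin(np.linspace(0, 12 * np.pi, 365)) + rng.normal(0, 5, 365)
sim = obs + rng.normal(0, 6, 365)

print(f"NSE={nse(obs, sim):.2f}  KGE={kge(obs, sim):.2f}  PBIAS={pbias(obs, sim):.1f}%")
```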
16 pages, 615 KB  
Article
Multimodal Large Language Model for Fracture Detection in Emergency Orthopedic Trauma: A Diagnostic Accuracy Study
by Sadık Emre Erginoğlu, Nuri Koray Ülgen, Nihat Yiğit, Ali Said Nazlıgül and Mehmet Orçun Akkurt
Diagnostics 2026, 16(3), 476; https://doi.org/10.3390/diagnostics16030476 - 3 Feb 2026
Abstract
Background: Rapid and accurate fracture detection is critical in emergency departments (EDs), where high patient volume and time pressure increase the risk of diagnostic error, particularly in radiographic interpretation. Multimodal large language models (LLMs) with image-recognition capability have recently emerged as general-purpose tools for clinical decision support, but their diagnostic performance within routine emergency department imaging workflows in orthopedic trauma remains unclear. Methods: In this retrospective diagnostic accuracy study, we included 1136 consecutive patients referred from the ED to orthopedics between 1 January and 1 June 2025 at a single tertiary center. Given the single-center, retrospective design, the findings should be interpreted as hypothesis-generating and may not be fully generalizable to other institutions. Emergency radiographs and clinical data were processed by a multimodal LLM (2025 version) via an official API using a standardized, deterministic prompt. The model’s outputs (“Fracture present”, “No fracture”, or “Uncertain”) were compared with final diagnoses established by blinded orthopedic specialists, which served as the reference standard. Diagnostic agreement was analyzed using Cohen’s kappa (κ), sensitivity, specificity, accuracy, and 95% confidence intervals (CIs). False-negative (FN) cases were defined as instances where the LLM reported “no acute fracture” but the specialist identified a fracture. The evaluated system is a general-purpose multimodal LLM and was not trained specifically on orthopedic radiographs. Results: Overall, the LLM showed good diagnostic agreement with orthopedic specialists, with concordant results in 808 of 1136 patients (71.1%; κ = 0.634; 95% CI: 68.4–73.7). The model achieved balanced performance with sensitivity of 76.9% and specificity of 66.8%. The highest agreement was observed in knee trauma (91.7%), followed by wrist (78.8%) and hand (69.6%). False-negative cases accounted for 184 patients (16.2% of the total cohort), representing 32.4% of all LLM-negative assessments. Most FN fractures were non-displaced (82.6%), and 17.4% of FN cases required surgical treatment. Ankle and foot regions showed the highest FN rates (30.4% and 17.4%, respectively), reflecting the anatomical and radiographic complexity of these areas. Positive predictive value (PPV) and negative predictive value (NPV) were 69.4% and 74.5%, respectively, with likelihood ratios indicating moderate shifts in post-test probability. Conclusions: In an emergency department-to-orthopedics consultation cohort reflecting routine clinical workflow, a multimodal LLM demonstrated moderate-to-good diagnostic agreement with orthopedic specialists, broadly within the range reported in prior fracture-detection AI studies; however, these comparisons are indirect because model architectures, training strategies, datasets, and endpoints differ across studies. However, its limited ability to detect non-displaced fractures—especially in anatomically complex regions like the ankle and foot—carries direct patient safety implications and confirms that specialist review remains indispensable. At present, such models may be explored as hypothesis-generating triage or decision-support tools, with mandatory specialist confirmation, rather than as standalone diagnostic systems. Prospective, multi-center studies using high-resolution imaging and anatomically optimized algorithms are needed before routine clinical adoption in emergency care. Full article
(This article belongs to the Special Issue Applications of Artificial Intelligence in Orthopedics)
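The agreement and accuracy statistics reported above (Cohen's κ, sensitivity, specificity, PPV, NPV, likelihood ratios) can all be derived from the 2×2 table of model output versus specialist diagnosis. The sketch below does this with scikit-learn and NumPy on simulated labels, not the study's 1136-patient cohort, so the printed numbers only illustrate the calculation.

```python
# Sketch: agreement and diagnostic-accuracy metrics from predicted vs. reference
# fracture labels (1 = fracture, 0 = no fracture). Labels here are simulated.
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

rng = np.random.default_rng(0)
reference = rng.integers(0, 2, 1136)                    # specialist diagnosis (toy)
flip = rng.random(1136) < 0.25                          # ~25% disagreement
predicted = np.where(flip, 1 - reference, reference)    # model output (toy)

tn, fp, fn, tp = confusion_matrix(reference, predicted).ravel()
sens = tp / (tp + fn)
spec = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
lr_pos, lr_neg = sens / (1 - spec), (1 - sens) / spec

print("kappa:", round(cohen_kappa_score(reference, predicted), 3))
print(f"sensitivity={sens:.3f} specificity={spec:.3f} PPV={ppv:.3f} NPV={npv:.3f}")
print(f"LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```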
27 pages, 916 KB  
Review
Enzymatic Hydrolysis of Lignocellulosic Biomass: Structural Features, Process Aspects, Kinetics, and Computational Tools
by Darlisson Santos, Joyce Gueiros Wanderley Siqueira, Marcos Gabriel Lopes da Silva, Maria Donato, Girleide da Silva, Bruna Pratto, Allan Almeida Albuquerque, Emmanuel Damilano Dutra and Jorge Luíz Silveira Sonego
Biomass 2026, 6(1), 13; https://doi.org/10.3390/biomass6010013 - 3 Feb 2026
Abstract
This manuscript provides a comprehensive review of the enzymatic hydrolysis of lignocellulosic biomass, emphasizing how chemical composition, structural features, inhibitory compounds, and process configurations collectively influence the conversion of structural polysaccharides into fermentable sugars. Variability among herbaceous, woody, and residual biomasses results in differences in cellulose, hemicellulose, lignin content, and crystallinity, which strongly affect enzyme accessibility. The review discusses key inhibitory mechanisms, including nonproductive cellulase adsorption onto lignin, interference from phenolic derivatives and pretreatment by-products, and inhibition caused by accumulating mono- and oligosaccharides. Process configurations such as SHF, SSF, PSSF, and consolidated bioprocessing are compared, with SSF often achieving superior performance by mitigating end-product inhibition. The manuscript also highlights the growing relevance of computational modeling and simulation tools, which support kinetic prediction, the evaluation of transport limitations, and the optimization of operating conditions in high-solids systems. Additionally, recent advances in artificial intelligence are presented as powerful approaches for modeling nonlinear hydrolysis behavior, estimating kinetic parameters, identifying rate-limiting steps, and improving predictive accuracy in complex bioprocesses. Overall, the integration of experimental insights with advanced modeling, simulation, and AI-based strategies is essential for overcoming current limitations and enhancing the technical feasibility and industrial competitiveness of lignocellulosic bioconversion. Full article
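To give a concrete flavour of the kinetic-prediction tools the review discusses, the sketch below integrates a simple Michaelis–Menten rate law with competitive product (sugar) inhibition using SciPy. The rate form and every parameter value are generic textbook-style assumptions, not a model taken from the review.

```python
# Sketch: enzymatic hydrolysis with Michaelis-Menten kinetics and competitive
# product inhibition, dP/dt = k*E*S / (Km*(1 + P/Ki) + S). Parameter values are
# illustrative assumptions, not values from the review.
from scipy.integrate import solve_ivp

k, E, Km, Ki = 0.8, 1.0, 10.0, 5.0   # rate constant, enzyme load, constants (toy)

def hydrolysis(t, y):
    S, P = y                          # residual substrate, released sugar (g/L)
    rate = k * E * S / (Km * (1 + P / Ki) + S)
    return [-rate, rate]

sol = solve_ivp(hydrolysis, (0, 72), y0=[100.0, 0.0], dense_output=True)
for hour in (6, 24, 48, 72):
    S, P = sol.sol(hour)
    print(f"t={hour:>2} h: substrate={S:6.1f} g/L, sugar={P:6.1f} g/L")
```

The slowdown in sugar release as P accumulates mirrors the end-product inhibition the review identifies as a key limitation of high-solids hydrolysis.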