Search Results (52)

Search Parameters:
Keywords = RFS screening tools

21 pages, 2684 KB  
Article
Construction of Yunnan Flue-Cured Tobacco Yield Integrated Learning Prediction Model Driven by Meteorological Data
by Yunshuang Wang, Jinheng Zhang, Xiaoyi Bai, Mengyan Zhao, Xianjin Jin and Bing Zhou
Agronomy 2025, 15(10), 2436; https://doi.org/10.3390/agronomy15102436 - 21 Oct 2025
Viewed by 248
Abstract
Timely and accurate prediction of flue-cured tobacco yield is crucial for stable yields and income growth. Based on Yunnan Province yield data and meteorological data (from the NASA POWER database) for 2003 to 2023, this study constructed a coupled framework of polynomial regression and a Stacking ensemble model. Four trend yield separation methods were compared, with polynomial regression selected as optimal for capturing long-term trends. A total of 135 meteorological features were built from flue-cured tobacco’s growth-period data, and 17 core features were screened via Pearson’s correlation analysis and Recursive Feature Elimination (RFE). With Random Forest (RF), Multi-Layer Perceptron (MLP), and Support Vector Regression (SVR) as base models, a ridge regression meta-model was developed to predict meteorological yield. The final results were obtained by integrating trend and meteorological yields, and core influencing factors were analyzed via SHapley Additive exPlanations (SHAP). The results showed that the Stacking model had the best predictive performance, significantly outperforming the single models; August was the optimal prediction lead time; and the day–night temperature difference in the August maturity stage and the solar radiation in the April transplantation stage were the core yield-influencing factors. This framework provides a practical yield prediction tool for Yunnan’s flue-cured tobacco areas and offers important empirical support for exploring meteorology–yield interactions in subtropical plateau crops. Full article
(This article belongs to the Section Precision and Digital Agriculture)
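A minimal sketch (not the authors' code) of the stacking setup described in this abstract: RF, MLP, and SVR base learners combined by a ridge-regression meta-model, with RFE-based feature screening beforehand. The data, feature counts, and hyperparameters below are placeholders chosen only to mirror the numbers quoted in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.feature_selection import RFE
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 135))   # 135 candidate meteorological features (synthetic stand-in)
y = rng.normal(size=200)          # detrended (meteorological) yield component

# Screen down to 17 features with RFE wrapped around a random forest (step=10 for speed).
selector = RFE(RandomForestRegressor(n_estimators=200, random_state=0),
               n_features_to_select=17, step=10)
X_sel = selector.fit_transform(X, y)

# Stacking: RF / MLP / SVR base models, ridge regression as the meta-model.
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=300, random_state=0)),
        ("mlp", make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0))),
        ("svr", make_pipeline(StandardScaler(), SVR(C=10.0))),
    ],
    final_estimator=Ridge(alpha=1.0),
    cv=5,
)
stack.fit(X_sel, y)
print("Training R^2:", stack.score(X_sel, y))
```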

11 pages, 894 KB  
Article
AI-Based Prediction of Bone Conduction Thresholds Using Air Conduction Audiometry Data
by Chul Young Yoon, Junhun Lee, Jiwon Kim, Sunghwa You, Chanbeom Kwak and Young Joon Seo
J. Clin. Med. 2025, 14(18), 6549; https://doi.org/10.3390/jcm14186549 - 17 Sep 2025
Viewed by 445
Abstract
Background/Objectives: This study evaluated the feasibility of predicting bone conduction (BC) thresholds and classifying air–bone gap (ABG) status using only air conduction (AC) data obtained from pure tone audiometry (PTA). Methods: A total of 60,718 PTA records from five tertiary hospitals in the Republic of Korea were utilized. Input features included AC thresholds (0.25–8 kHz), age, and sex, while outputs were BC thresholds (0.25–4 kHz) and ABG classification based on 10 dB and 15 dB criteria. Five machine learning models—deep neural network (DNN), long short-term memory (LSTM), bidirectional LSTM (BiLSTM), random forest (RF), and extreme gradient boosting (XGB)—were trained using 5-fold cross-validation with Synthetic Minority Over-sampling Technique (SMOTE). Model performance was evaluated based on accuracy, sensitivity, precision, and F1 score under ±5 dB and ±10 dB thresholds for BC prediction. Results: LSTM and BiLSTM outperformed DNN in predicting BC thresholds, achieving ~60% accuracy within ±5 dB and ~80% within ±10 dB. For ABG classification, all models performed better with the 10 dB criterion than the 15 dB. Tree-based models (RF, XGB) achieved the highest classification accuracy (up to 0.512) and precision (up to 0.827). Confidence intervals for all metrics were within ±0.01, indicating stable results. Conclusions: AI models can accurately predict BC thresholds and ABG status using AC data alone. These findings support the integration of AI-driven tools into clinical audiology and telemedicine, particularly for remote screening and diagnosis. Future work should focus on clinical validation and implementation to expand accessibility in hearing care. Full article
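A hypothetical helper (not from the paper) illustrating the tolerance-based accuracy used above: a predicted bone-conduction threshold counts as correct if it falls within ±5 dB or ±10 dB of the measured value. The threshold values here are invented for illustration.

```python
import numpy as np

def tolerance_accuracy(y_true, y_pred, tol_db):
    """Fraction of predictions within ±tol_db of the measured threshold."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.abs(y_true - y_pred) <= tol_db))

measured = [25, 40, 55, 10, 30]    # illustrative BC thresholds (dB HL)
predicted = [22, 48, 52, 12, 41]   # illustrative model outputs
print(tolerance_accuracy(measured, predicted, tol_db=5))   # 0.6
print(tolerance_accuracy(measured, predicted, tol_db=10))  # 0.8
```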

26 pages, 4288 KB  
Article
Risk-Informed Dual-Threshold Screening for SPT-Based Liquefaction: A Probability-Calibrated Random Forest Approach
by Hani S. Alharbi
Buildings 2025, 15(17), 3206; https://doi.org/10.3390/buildings15173206 - 5 Sep 2025
Viewed by 684
Abstract
Soil liquefaction poses a significant risk to foundations during earthquakes, prompting the need for simple, risk-aware screening tools that go beyond single deterministic boundaries. This study creates a probability-calibrated dual-threshold screening rule using a random forest (RF) classifier trained on 208 SPT case histories with quality-based weights (A/B/C = 1.0/0.70/0.40). The model is optimized with random search and calibrated through isotonic regression. Iso-probability contours from 1000 bootstrap samples produce paired thresholds for fines-corrected, overburden-normalized blow count N1,60,CS and normalized cyclic stress ratio CSR7.5,1 at target liquefaction probabilities Pliq = 5%, 20%, 50%, 80%, and 95%, with 90% confidence intervals. On an independent test set (n = 42), the calibrated model achieves AUC = 0.95, F1 = 0.92, and a better Brier score than the uncalibrated RF. The screening rule classifies a site as susceptible when N1,60,CS is at or below and CSR7.5,1 is at or above the probability-specific thresholds. Designed for level ground, free field, and clean-to-silty sand sites, this tool maintains the familiarity of SPT-based charts while making risk assessment transparent and auditable for different facility importance levels. Sensitivity tests show its robustness to reasonable rescaling of quality weights. The framework offers transparent thresholds with uncertainty bands for routine preliminary assessments and to guide the need for more detailed, site-specific analyses. Full article
(This article belongs to the Section Construction Management, and Computers & Digitization)
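A hedged sketch (assumptions, not the authors' implementation) of the core modeling step described above: a quality-weighted random forest whose probabilities are calibrated with isotonic regression. The case-history data, the toy liquefaction rule, and the query point are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV

rng = np.random.default_rng(1)
n = 208
X = np.column_stack([rng.uniform(2, 40, n),       # N1,60,CS (fines-corrected blow count)
                     rng.uniform(0.05, 0.6, n)])  # CSR7.5,1 (normalized cyclic stress ratio)
y = (X[:, 1] > 0.008 * X[:, 0] + 0.1).astype(int)  # toy liquefaction label, not real case data

# Case-quality weights A/B/C = 1.0/0.70/0.40, as quoted in the abstract.
quality = rng.choice(["A", "B", "C"], size=n)
weights = np.array([{"A": 1.0, "B": 0.70, "C": 0.40}[q] for q in quality])

rf = RandomForestClassifier(n_estimators=500, random_state=1)
calibrated = CalibratedClassifierCV(rf, method="isotonic", cv=5)
calibrated.fit(X, y, sample_weight=weights)

# Calibrated probability of liquefaction for a candidate site (N1,60,CS = 12, CSR = 0.25):
print(calibrated.predict_proba([[12, 0.25]])[0, 1])
```

In practice, iso-probability contours (e.g., Pliq = 5–95%) would be traced over a grid of the two inputs and bootstrapped to obtain the paired thresholds with confidence bands.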

13 pages, 951 KB  
Article
Association of Vitamin D Deficiency with Local Muscle–Fat Ratio in Geriatric Palliative Care Patients: An Ultrasonographic Study
by Ayfer Durak and Umut Safer
Healthcare 2025, 13(17), 2188; https://doi.org/10.3390/healthcare13172188 - 1 Sep 2025
Viewed by 560
Abstract
Background/Objectives: Vitamin D deficiency is linked to muscle loss and fat changes in older adults, but data regarding palliative patients are limited. Ultrasound offers a practical tool to assess these changes. This study explores the relationship between vitamin D levels and ultrasound-measured muscle, fat, and their ratio in older adult palliative patients. Methods: This prospective cross-sectional study was conducted in a tertiary palliative care unit (June–September 2024). A total of 187 patients were grouped by serum vitamin D levels (<50 vs. ≥50 nmol/L). Demographic and clinical variables included sex, BMI, Activities of Daily Living (ADLs), calf circumference (CC), and comorbidities. Ultrasonography assessed muscle thickness (MT), subcutaneous fat thickness (SFT), and cross-sectional area (CSA) of Rectus Femoris (RF) and Biceps Brachii (BB). MT/SFT ratio was calculated. Logistic regression identified independent predictors. Results: Mean age was 75.1 ± 14.4 years; 55.6% of participants were female. Vitamin D deficiency (67.9%) was significantly associated with female sex (p = 0.037), ADL dependency (p < 0.001), lower BMI (p = 0.020), and reduced CC (p = 0.006). RF-MT, RF-SFT, RF-CSA, BB-MT, and BB-CSA were lower in the deficient group. RF-MT/SFT ratio was higher (p = 0.049). ADL dependency (p = 0.002) and RF-MT/SFT (p = 0.015) were independent predictors. Conclusions: Vitamin D deficiency was linked to a higher muscle-to-fat ratio, mainly due to fat loss rather than muscle gain. This may misrepresent muscle preservation and should be interpreted cautiously. Although vitamin D levels appear to be associated with physical function, additional prospective cohort and interventional supplementation studies are warranted to determine whether routine screening and targeted vitamin D supplementation can effectively support physical function in this population. Full article

20 pages, 11319 KB  
Article
Using Certainty Factor as a Spatial Sample Filter for Landslide Susceptibility Mapping: The Case of the Upper Jinsha River Region, Southeastern Tibetan Plateau
by Xin Zhou, Ke Jin, Xiaohui Sun, Yunkai Ruan, Yiding Bao, Xiulei Li and Li Tang
ISPRS Int. J. Geo-Inf. 2025, 14(9), 339; https://doi.org/10.3390/ijgi14090339 - 1 Sep 2025
Viewed by 651
Abstract
Landslide susceptibility mapping (LSM) faces persistent challenges in defining representative stable samples, as conventional random selection often includes unstable areas, introducing spatial bias and compromising model accuracy. To address this, we redefine the certainty factor (CF) method—traditionally used for factor weighting—as a spatial screening tool for stable zone delineation and apply it to the tectonically active upper Jinsha River region (937 km², southeastern Tibetan Plateau). Our approach first generates a preliminary susceptibility map via CF, using the natural breaks method to define low- and very low-susceptibility zones (CF < 0.1) as statistically stable regions. Non-landslide samples are selected exclusively from these zones for support vector machine (SVM) modeling with five-fold cross-validation. Key results: CF-guided sampling achieves training/testing AUC of 0.924/0.920, surpassing random sampling (0.882/0.878) by 4.8% and reducing the ROC standard deviation by 32%. The final map shows 88.49% of known landslides concentrated in the 25.70% of the area classed as high/very high susceptibility, aligning with geological controls (e.g., 92% of high-susceptibility units in soft lithologies within 500 m of faults). Despite using a simpler SVM, our framework outperforms more advanced models (ANN: AUC = 0.890; RF: AUC = 0.870) in the same region, indicating that physically guided sample curation matters more than algorithmic complexity. This transferable framework embeds geological prior knowledge into machine learning, offering high-precision risk zoning for disaster mitigation in data-scarce mountainous regions. Full article
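A hedged sketch of the sampling idea described above: a preliminary certainty-factor (CF) susceptibility score restricts non-landslide (negative) samples to statistically stable cells (CF < 0.1), after which an SVM is trained and evaluated with five-fold cross-validation. The CF scores, conditioning factors, and landslide inventory below are synthetic placeholders.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_cells = 5000
features = rng.normal(size=(n_cells, 8))      # conditioning factors per raster cell (placeholder)
cf_score = rng.uniform(-1, 1, size=n_cells)   # preliminary CF susceptibility (placeholder)
is_landslide = (features[:, 0] + 0.5 * features[:, 1]
                + rng.normal(scale=1.0, size=n_cells)) > 2.2  # toy inventory mask

pos_idx = np.where(is_landslide)[0]
stable_idx = np.where((~is_landslide) & (cf_score < 0.1))[0]          # candidate stable cells
neg_idx = rng.choice(stable_idx, size=len(pos_idx), replace=False)    # 1:1 positive:negative ratio

X = np.vstack([features[pos_idx], features[neg_idx]])
y = np.r_[np.ones(len(pos_idx)), np.zeros(len(neg_idx))]

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("5-fold AUC:", cross_val_score(svm, X, y, cv=5, scoring="roc_auc").mean())
```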

33 pages, 30680 KB  
Article
Quantitative Structure–Activity Relationship Study of Cathepsin L Inhibitors as SARS-CoV-2 Therapeutics Using Enhanced SVR with Multiple Kernel Function and PSO
by Shaokang Li, Zheng Li, Peijian Zhang and Aili Qu
Int. J. Mol. Sci. 2025, 26(17), 8423; https://doi.org/10.3390/ijms26178423 - 29 Aug 2025
Viewed by 651
Abstract
Cathepsin L (CatL) is a critical protease involved in cleaving the spike protein of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), facilitating viral entry into host cells. Inhibition of CatL is essential for preventing SARS-CoV-2 cell entry, making it a potential therapeutic target for drug development. Six QSAR models were established to predict the inhibitory activity (expressed as IC50 values) of candidate compounds against CatL. These models were developed using the statistical heuristic method (HM), the evolutionary algorithm gene expression programming (GEP), and the ensemble method random forest (RF), along with the kernel-based machine learning algorithm support vector regression (SVR) configured with various kernels: radial basis function (RBF), a linear–RBF hybrid (LMIX2-SVR), and a linear–RBF–polynomial hybrid (LMIX3-SVR). The particle swarm optimization algorithm was applied to optimize the multi-parameter SVM models, ensuring low complexity and fast convergence. The properties of novel CatL inhibitors were explored through molecular docking analysis. The LMIX3-SVR model exhibited the best performance, with R² values of 0.9676 and 0.9632 for the training and test sets and RMSE values of 0.0834 and 0.0322, respectively. Five-fold cross-validation (R² = 0.9043) and leave-one-out cross-validation (R² = 0.9525) demonstrated the strong predictive ability and robustness of the model, confirming the validity of the five selected descriptors. Based on these results, the IC50 values of 578 newly designed compounds were predicted using the HM model, and the top five candidate compounds with the best physicochemical properties were further verified with the Property Explorer Applet (PEA). The LMIX3-SVR model significantly advances QSAR modeling for drug discovery, providing a robust tool for designing and screening new drug molecules. This study contributes to the identification of novel CatL inhibitors, which aids in the development of effective therapeutics for SARS-CoV-2. Full article
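An illustrative sketch (assumptions, not the authors' code) of a mixed linear/RBF kernel for SVR, the kind of hybrid kernel (LMIX2-SVR) described above. The mixing weight and gamma are fixed here for brevity; in practice they would be tuned, e.g., by particle swarm optimization, and the descriptors and targets below are synthetic.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel

def mixed_kernel(X, Y, weight=0.5, gamma=0.1):
    """Convex combination of a linear and an RBF kernel (still a valid kernel)."""
    return weight * linear_kernel(X, Y) + (1 - weight) * rbf_kernel(X, Y, gamma=gamma)

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 5))                              # five molecular descriptors (placeholder)
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=120)    # pIC50-like target (synthetic)

model = SVR(kernel=lambda A, B: mixed_kernel(A, B, weight=0.7, gamma=0.05), C=10.0)
model.fit(X, y)
print("Training R^2:", model.score(X, y))
```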

19 pages, 2221 KB  
Article
Leveraging Deep Learning to Enhance Malnutrition Detection via Nutrition Risk Screening 2002: Insights from a National Cohort
by Nadir Yalçın, Merve Kaşıkcı, Burcu Kelleci-Çakır, Kutay Demirkan, Karel Allegaert, Meltem Halil, Mutlu Doğanay and Osman Abbasoğlu
Nutrients 2025, 17(16), 2716; https://doi.org/10.3390/nu17162716 - 21 Aug 2025
Cited by 2 | Viewed by 1116
Abstract
Purpose: This study aimed to develop and validate a new machine learning (ML)-based screening tool for a two-step prediction of the need for and type of nutritional therapy (enteral, parenteral, or combined) using Nutrition Risk Screening 2002 (NRS-2002) and other demographic parameters from the Optimal Nutrition Care for All (ONCA) national cohort data. Methods: This multicenter retrospective cohort study included 191,028 patients, with data on age, gender, body mass index (BMI), NRS-2002 score, presence of cancer, and hospital unit type. In the first step, classification models estimated whether patients required nutritional therapy, while the second step predicted the type of therapy. The dataset was divided into 60% training, 20% validation, and 20% test sets. Random Forest (RF), Artificial Neural Network (ANN), deep learning (DL), Elastic Net (EN), and Naive Bayes (NB) algorithms were used for classification. Performance was evaluated using AUC, accuracy, balanced accuracy, MCC, sensitivity, specificity, PPV, NPV, and F1-score. Results: Of the patients, 54.6% were male, 9.2% had cancer, and 49.9% were hospitalized in internal medicine units. According to NRS-2002, 11.6% were at risk of malnutrition (≥3 points). The DL algorithm performed best in both classification steps. The top three variables for determining the need for nutritional therapy were severe illness, reduced dietary intake in the last week, and mild impaired nutritional status (AUC = 0.933). For determining the type of nutritional therapy, the most important variables were severe illness, severely impaired nutritional status, and ICU admission (AUC = 0.741). Adding gender, cancer status, and ward type to NRS-2002 improved AUC by 0.6% and 3.27% for steps 1 and 2, respectively. Conclusions: Incorporating gender, cancer status, and ward type into the widely used and validated NRS-2002 led to the development of a new scale that accurately classifies nutritional therapy type. This ML-enhanced model has the potential to be integrated into clinical workflows as a decision support system to guide nutritional therapy, although further external validation with larger multinational cohorts is needed. Full article
(This article belongs to the Section Clinical Nutrition)
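A minimal sketch (placeholder data, not the ONCA cohort) of the two-step scheme described above: step 1 classifies whether nutritional therapy is needed, and step 2 classifies the therapy type for patients flagged in step 1. Random forests stand in here for the deep-learning model that performed best in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
# Synthetic features: age, sex, BMI, NRS-2002 score, cancer flag, ICU flag.
X = np.column_stack([
    rng.integers(18, 95, n), rng.integers(0, 2, n), rng.uniform(15, 40, n),
    rng.integers(0, 8, n), rng.integers(0, 2, n), rng.integers(0, 2, n),
]).astype(float)
needs_therapy = (X[:, 3] >= 3).astype(int)   # toy rule: NRS-2002 score >= 3
therapy_type = rng.integers(0, 3, n)         # 0 = enteral, 1 = parenteral, 2 = combined (synthetic)

X_tr, X_te, need_tr, need_te, type_tr, type_te = train_test_split(
    X, needs_therapy, therapy_type, test_size=0.2, random_state=4)

step1 = RandomForestClassifier(random_state=4).fit(X_tr, need_tr)
mask = need_tr == 1                                   # step 2 is trained only on therapy cases
step2 = RandomForestClassifier(random_state=4).fit(X_tr[mask], type_tr[mask])

needs_pred = step1.predict(X_te)
type_pred = np.where(needs_pred == 1, step2.predict(X_te), -1)   # -1 = no therapy predicted
print("Step-1 accuracy:", (needs_pred == need_te).mean())
```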

26 pages, 6361 KB  
Article
Improving the Generalization Performance of Debris-Flow Susceptibility Modeling by a Stacking Ensemble Learning-Based Negative Sample Strategy
by Jiayi Li, Jialan Zhang, Jingyuan Yu, Yongbo Chu and Haijia Wen
Water 2025, 17(16), 2460; https://doi.org/10.3390/w17162460 - 19 Aug 2025
Viewed by 846
Abstract
To address the negative sample selection bias and limited interpretability of traditional debris-flow susceptibility models, this study proposes a framework that enhances generalization by integrating negative sample screening via a stacking ensemble model with an interpretable random forest. Using Wenchuan County, Sichuan Province, as the study area, 19 influencing factors were selected, encompassing topographic, geological, environmental, and anthropogenic variables. First, a stacking ensemble—comprising logistic regression (LR), decision tree (DT), gradient boosting decision tree (GBDT), and random forest (RF)—was employed as a preliminary classifier to identify very low-susceptibility areas as reliable negative samples, achieving a balanced 1:1 ratio of positive to negative instances. Subsequently, a stacking–random forest model (Stacking-RF) was trained for susceptibility zonation, and SHAP (Shapley additive explanations) was applied to quantify each factor’s contribution. The results show that: (1) the stacking ensemble achieved a test-set AUC (area under the receiver operating characteristic curve) of 0.9044, confirming its effectiveness in screening dependable negative samples; (2) the random forest model attained a test-set AUC of 0.9931, with very high-susceptibility zones—covering 15.86% of the study area—encompassing 92.3% of historical debris-flow events; (3) SHAP analysis identified the distance to a road and point-of-interest (POI) kernel density as the primary drivers of debris-flow susceptibility. The method quantified nonlinear impact thresholds, revealing significant susceptibility increases when road distance was less than 500 m or POI kernel density ranged between 50 and 200 units/km²; and (4) cross-regional validation in Qingchuan County demonstrated that the proposed model improved the capture rate for high/very high-susceptibility areas by 48.86 percentage points, from 4.55% to 53.41%, with a site density of 0.0469 events/km² in very high-susceptibility zones. Overall, this framework offers a high-precision and interpretable debris-flow risk management tool, highlights the substantial influence of anthropogenic factors such as roads and land development, and introduces a “negative-sample screening with cross-regional generalization” strategy to support land-use planning and disaster prevention in mountainous regions. Full article
(This article belongs to the Special Issue Intelligent Analysis, Monitoring and Assessment of Debris Flow)
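A sketch of the stacking classifier (LR, DT, GBDT, and RF base learners) used above as a preliminary screen for reliable negative samples. The data are placeholders, and logistic regression is assumed here as the meta-learner because the abstract does not name one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, StackingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 19))    # 19 influencing factors (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

stack = StackingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(max_depth=6, random_state=5)),
        ("gbdt", GradientBoostingClassifier(random_state=5)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
print("Cross-validated AUC:", cross_val_score(stack, X, y, cv=5, scoring="roc_auc").mean())
# Cells in the lowest predicted-probability band would then be kept as negative samples.
```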

22 pages, 1837 KB  
Article
Anthropometric Measurements for Predicting Low Appendicular Lean Mass Index for the Diagnosis of Sarcopenia: A Machine Learning Model
by Ana M. González-Martin, Edgar Samid Limón-Villegas, Zyanya Reyes-Castillo, Francisco Esparza-Ros, Luis Alexis Hernández-Palma, Minerva Saraí Santillán-Rivera, Carlos Abraham Herrera-Amante, César Octavio Ramos-García and Nicoletta Righini
J. Funct. Morphol. Kinesiol. 2025, 10(3), 276; https://doi.org/10.3390/jfmk10030276 - 17 Jul 2025
Viewed by 1797
Abstract
Background: Sarcopenia is a progressive muscle disease that compromises mobility and quality of life in older adults. Although dual-energy X-ray absorptiometry (DXA) is the standard for assessing Appendicular Lean Mass Index (ALMI), it is costly and often inaccessible. This study aims to develop machine learning models using anthropometric measurements to predict low ALMI for the diagnosis of sarcopenia. Methods: A cross-sectional study was conducted on 183 Mexican adults (67.2% women and 32.8% men, ≥60 years old). ALMI was measured using DXA, and anthropometric data were collected following the International Society for the Advancement of Kinanthropometry (ISAK) protocols. Predictive models were developed using Logistic Regression (LR), Decision Trees (DTs), Random Forests (RFs), Artificial Neural Networks (ANNs), and LASSO regression. The dataset was split into training (70%) and testing (30%) sets. Model performance was evaluated using classification performance metrics and the area under the ROC curve (AUC). Results: ALMI indicated strong correlations with BMI, corrected calf girth, and arm relaxed girth. Among models, DT achieved the best performance in females (AUC = 0.84), and ANN indicated the highest AUC in males (0.92). Regarding the prediction of low ALMI, specificity values were highest in DT for females (100%), while RF performed best in males (92%). The key predictive variables varied depending on sex, with BMI and calf girth being the most relevant for females and arm girth for males. Conclusions: Anthropometry combined with machine learning provides an accurate, low-cost approach for identifying low ALMI in older adults. This method could facilitate sarcopenia screening in clinical settings with limited access to advanced diagnostic tools. Full article
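A minimal sketch (synthetic data) of the sex-stratified modeling described above: separate classifiers trained for women and men on anthropometric predictors (e.g., BMI, calf girth, arm girth) to flag low ALMI, evaluated by ROC AUC. A decision tree is used for both sexes for simplicity; the study compared several model families.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(10)
n = 183
bmi = rng.uniform(18, 35, n)
calf_girth = rng.uniform(28, 42, n)
arm_girth = rng.uniform(22, 38, n)
sex = rng.integers(0, 2, n)    # 0 = female, 1 = male
low_almi = (0.3 * bmi + 0.5 * calf_girth + rng.normal(scale=3, size=n) < 26).astype(int)  # toy label

X = np.column_stack([bmi, calf_girth, arm_girth])
for label, mask in (("female", sex == 0), ("male", sex == 1)):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X[mask], low_almi[mask], test_size=0.3, random_state=10, stratify=low_almi[mask])
    clf = DecisionTreeClassifier(max_depth=3, random_state=10).fit(X_tr, y_tr)
    print(label, "AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 2))
```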

20 pages, 5288 KB  
Article
Spectral Estimation of Nitrogen Content in Cotton Leaves Under Coupled Nitrogen and Phosphorus Conditions
by Shunyu Qiao, Wenjin Fu, Jiaqiang Wang, Xiaolong An, Fuqing Li, Weiyang Liu and Chongfa Cai
Agronomy 2025, 15(7), 1701; https://doi.org/10.3390/agronomy15071701 - 14 Jul 2025
Viewed by 567
Abstract
With the increasing application of hyperspectral technology, rapid and accurate monitoring of cotton leaf nitrogen concentration (LNC) has become an effective tool over large areas. This study used Tahe No. 2 cotton seeds with four nitrogen levels (0, 200, 350, 500 kg ha⁻¹) and four phosphorus levels (0, 100, 200, 300 kg ha⁻¹). Spectral data were acquired using an ASD FieldSpec HandHeld2 portable spectrometer, which measures spectral reflectance over the 325–1075 nm band at a spectral resolution of 1 nm. LNC determination and spectral estimation were conducted at six growth stages: squaring, initial bloom, peak bloom, initial boll, peak boll, and boll opening. Thirty-seven spectral indices (SIs) were selected. First derivative (FD), standard normal variate (SNV), multiplicative scatter correction (MSC), and Savitzky–Golay (SG) transformations were applied to preprocess the spectra. Feature bands were screened using partial least squares discriminant analysis (PLS–DA), and support vector machine (SVM) and random forest (RF) models were used for accuracy validation. The results revealed that (1) LNC increased during early growth, peaked at the full-flowering (peak bloom) stage, and then gradually declined. (2) The best LNC recognition models were SVM–MSC in the squaring stage, SVM–FD in the initial bloom stage, SVM–FD in the peak bloom stage, SVM–FD in the initial boll stage, RF–SNV in the peak boll stage, and SVM–FD in the boll opening stage. FD showed the best performance among the four preprocessing treatments, with SVM outperforming RF in terms of higher R² and lower RMSE values. The SVM–FD model effectively improved the accuracy and robustness of LNC prediction from hyperspectral leaf spectra, providing valuable guidance for large-scale information production in high-standard cotton fields. Full article
(This article belongs to the Section Precision and Digital Agriculture)
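A hedged sketch of two of the spectral pre-treatments named above: standard normal variate (SNV) and a Savitzky–Golay first derivative applied to reflectance spectra sampled at 1 nm over 325–1075 nm. The spectra are synthetic and the window/polynomial settings are illustrative, not the authors' choices.

```python
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.arange(325, 1076)   # 325–1075 nm at 1 nm resolution (751 bands)
rng = np.random.default_rng(6)
spectra = rng.uniform(0.05, 0.6, size=(10, wavelengths.size))   # 10 synthetic leaf spectra

def snv(X):
    """Standard normal variate: center and scale each spectrum individually."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

snv_spectra = snv(spectra)
fd_spectra = savgol_filter(spectra, window_length=15, polyorder=2, deriv=1, axis=1)

print(snv_spectra.shape, fd_spectra.shape)   # both (10, 751)
```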

16 pages, 6992 KB  
Article
Micromagnetic and Quantitative Prediction of Hardness and Impact Energy in Martensitic Stainless Steels Using Mutual Information Parameter Screening and Random Forest Modeling Methods
by Changjie Xu, Haijiang Dong, Zhengxiang Yan, Liting Wang, Mengshuai Ning, Xiucheng Liu and Cunfu He
Materials 2025, 18(7), 1685; https://doi.org/10.3390/ma18071685 - 7 Apr 2025
Cited by 1 | Viewed by 695
Abstract
This study proposes a novel modelling approach that integrates mutual information (MI)-based parameter screening with random forest (RF) modelling to achieve accurate quantitative prediction of surface hardness and impact energy in two martensitic stainless steels (1Cr13 and 2Cr13). Preliminary analyses indicated that the magnetic parameters derived from Barkhausen noise (MBN) and incremental permeability (IP) measurements showed limited linear correlations with the target properties (surface hardness and impact energy). To address this challenge, an MI feature screening method was developed to identify both the linear and non-linear parameter dependencies that are critical for predicting the target mechanical properties. The selected features were then fed into an RF model, which outperformed traditional multiple linear regression in handling the complex, non-monotonic relationships between magnetic signatures and mechanical performance. A key advantage of the proposed MI-RF framework lies in its robustness to small sample sizes: it achieved high prediction accuracy (e.g., R² > 0.97 for hardness and R² > 0.86 for impact energy) using limited experimental data. By leveraging MI’s ability to capture multivariate dependencies and RF’s ensemble learning power, the framework effectively mitigates overfitting and improves generalisation. In addition to demonstrating a promising tool for the non-destructive evaluation of martensitic steels, this study provides a transferable paradigm for the quantitative assessment of other mechanical properties by magnetic feature fusion. Full article
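A sketch of the MI-RF idea described above: rank candidate magnetic features by mutual information with the target property, keep the top-scoring ones, and fit a random forest regressor. The data are synthetic placeholders, not MBN/IP measurements, and the number of retained features is arbitrary.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 20))   # 20 candidate magnetic parameters, small sample (synthetic)
hardness = np.sin(X[:, 0]) + X[:, 3] ** 2 + 0.1 * rng.normal(size=60)   # non-linear toy target

# Mutual information captures non-linear dependencies that Pearson correlation would miss.
mi = mutual_info_regression(X, hardness, random_state=7)
top = np.argsort(mi)[::-1][:6]   # keep the six most informative parameters
print("Selected feature indices:", top)

rf = RandomForestRegressor(n_estimators=300, random_state=7).fit(X[:, top], hardness)
print("Training R^2:", rf.score(X[:, top], hardness))
```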

18 pages, 1256 KB  
Article
The Effect of Naturally Acquired Immunity on Mortality Predictors: A Focus on Individuals with New Coronavirus
by Mónica Queipo, Jorge Mateo, Ana María Torres and Julia Barbado
Biomedicines 2025, 13(4), 803; https://doi.org/10.3390/biomedicines13040803 - 27 Mar 2025
Viewed by 779
Abstract
Background/Objectives: The spread of the COVID-19 pandemic has spurred the development of advanced healthcare tools to effectively manage patient outcomes. This study aims to identify key predictors of mortality in hospitalized patients with some level of natural immunity, but not yet vaccinated, using machine learning techniques. Methods: A total of 363 patients with COVID-19 admitted to Río Hortega University Hospital in Spain between the second and fourth waves of the pandemic were included in this study. Key characteristics related to both the patient’s pre-admission status and hospital stay were screened using the Random Forest (RF) machine learning technique. Results: Of the 19 variables identified as having the greatest influence on predicting mortality, the most powerful ones could be identified at the time of hospital admission. These included the CURB-65 community-acquired pneumonia severity scale, age, the Glasgow Coma Scale (GCS), and comorbidities, as well as laboratory results. Some variables associated with hospitalization and intensive care unit (ICU) admission (acute renal failure, shock, PRONO sessions, and the Acute Physiology and Chronic Health Evaluation [APACHE-II] scale) showed a certain degree of significance. The Random Forest (RF) method showed high accuracy, with a precision of >95%. Conclusions: This study shows that natural immunity generates significant changes in the evolution of the disease. As shown here, machine learning models are an effective tool for improving personalized patient care across different periods of the pandemic. Full article

18 pages, 2075 KB  
Article
Proposed Comprehensive Methodology Integrated with Explainable Artificial Intelligence for Prediction of Possible Biomarkers in Metabolomics Panel of Plasma Samples for Breast Cancer Detection
by Cemil Colak, Fatma Hilal Yagin, Abdulmohsen Algarni, Ali Algarni, Fahaid Al-Hashem and Luca Paolo Ardigò
Medicina 2025, 61(4), 581; https://doi.org/10.3390/medicina61040581 - 25 Mar 2025
Cited by 2 | Viewed by 1846
Abstract
Aim: Breast cancer (BC) is the most common type of cancer in women, accounting for more than 30% of new female cancers each year. Although various treatments are available for BC, most cancer-related deaths are due to incurable metastases. Therefore, the early diagnosis and treatment of BC are crucial before metastasis. Mammography and ultrasonography are primarily used in the clinic for the initial identification and staging of BC; these methods are useful for general screening but have limitations in terms of sensitivity and specificity. Omics-based biomarkers, such as metabolomics, can substantially improve the accuracy of early diagnosis and disease-progression tracking and support personalized treatment plans tailored to each tumor’s specific molecular profile. Metabolomics technology is a feasible and comprehensive method for early disease detection and biomarker identification at the molecular level. This research aimed to establish an interpretable predictive artificial intelligence (AI) model using plasma-based metabolomics panel data to identify potential biomarkers that distinguish BC individuals from healthy controls. Methods: A cohort of 138 BC patients and 76 healthy controls was studied. Plasma metabolites were examined using LC-TOFMS and GC-TOFMS techniques. Extreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), Adaptive Boosting (AdaBoost), and Random Forest (RF) were evaluated using performance metrics such as Receiver Operating Characteristic-Area Under the Curve (ROC AUC), accuracy, sensitivity, specificity, and F1 score. ROC and Precision-Recall (PR) curves were generated for comparative analysis. SHapley Additive exPlanations (SHAP) analysis was applied to the optimal prediction model for interpretability. Results: The RF algorithm showed improved accuracy (0.963 ± 0.043) and sensitivity (0.977 ± 0.051); however, LightGBM achieved the highest ROC AUC (0.983 ± 0.028). RF also achieved the best Precision-Recall Area Under the Curve (PR AUC) at 0.989. SHAP analysis identified glycerophosphocholine and pentosidine as the most significant discriminatory metabolites. Uracil, glutamine, and butyrylcarnitine were also among the significant metabolites. Conclusions: Metabolomics biomarkers and an explainable AI (XAI)-based prediction model showed significant diagnostic accuracy and sensitivity in the detection of BC. The proposed XAI system using interpretable metabolite data can serve as a clinical decision support tool to improve early diagnosis processes. Full article
(This article belongs to the Special Issue Insights and Advances in Cancer Biomarkers)
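A hedged sketch of the explainability step described above: fit a tree-based classifier on metabolite features and rank the metabolites by mean absolute SHAP value. It requires the `shap` package; the data are placeholders, and a gradient boosting classifier stands in for the boosting models compared in the study.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(8)
n_samples, n_metabolites = 214, 30   # 138 BC + 76 controls in the study; metabolite count is arbitrary
X = rng.normal(size=(n_samples, n_metabolites))
y = (X[:, 0] - X[:, 5] + 0.5 * rng.normal(size=n_samples) > 0).astype(int)   # toy class labels

clf = GradientBoostingClassifier(random_state=8).fit(X, y)

explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X)             # (n_samples, n_features) for this single-output booster
importance = np.abs(shap_values).mean(axis=0)      # mean |SHAP| per metabolite
print("Top discriminatory feature indices:", np.argsort(importance)[::-1][:5])
```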

23 pages, 1840 KB  
Review
Fusion-Based Approaches and Machine Learning Algorithms for Forest Monitoring: A Systematic Review
by Abdullah Al Saim and Mohamed H. Aly
Wild 2025, 2(1), 7; https://doi.org/10.3390/wild2010007 - 11 Mar 2025
Cited by 3 | Viewed by 3298
Abstract
Multi-source remote sensing fusion and machine learning are effective tools for forest monitoring. This study aimed to analyze various fusion techniques, their application with machine learning algorithms, and their assessment in estimating forest type and aboveground biomass (AGB). A keyword search across Web of Science, Science Direct, and Google Scholar yielded 920 articles. After rigorous screening, 72 relevant articles were analyzed. Results showed a growing trend in optical and radar fusion, with notable use of hyperspectral images, LiDAR, and field measurements in fusion-based forest monitoring. Machine learning algorithms, particularly Random Forest (RF), Support Vector Machine (SVM), and K-Nearest Neighbor (KNN), leverage features from fused sources, with proper variable selection enhancing accuracy. Standard evaluation metrics include Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), Overall Accuracy (OA), User’s Accuracy (UA), Producer’s Accuracy (PA), the confusion matrix, and the Kappa coefficient. This review provides a comprehensive overview of prevalent techniques, data sources, and evaluation metrics by synthesizing current research and highlighting data fusion’s potential to improve forest monitoring accuracy. The study underscores the importance of spectral, topographic, textural, and environmental variables and of sensor frequency, and identifies key research gaps, including the need for standardized evaluation protocols and further exploration of multi-temporal fusion for dynamic forest change monitoring. Full article
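A small illustration (synthetic labels, not data from the review) of two of the classification metrics listed above, overall accuracy (OA) and the Kappa coefficient, computed alongside the confusion matrix.

```python
from sklearn.metrics import confusion_matrix, accuracy_score, cohen_kappa_score

y_true = ["conifer", "broadleaf", "mixed", "conifer", "broadleaf", "conifer", "mixed", "broadleaf"]
y_pred = ["conifer", "broadleaf", "conifer", "conifer", "mixed", "conifer", "mixed", "broadleaf"]

print(confusion_matrix(y_true, y_pred, labels=["conifer", "broadleaf", "mixed"]))
print("OA:", accuracy_score(y_true, y_pred))                     # 0.75 for this toy example
print("Kappa:", round(cohen_kappa_score(y_true, y_pred), 3))     # chance-corrected agreement
```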

24 pages, 4723 KB  
Article
Genotyping Identification of Maize Based on Three-Dimensional Structural Phenotyping and Gaussian Fuzzy Clustering
by Bo Xu, Chunjiang Zhao, Guijun Yang, Yuan Zhang, Changbin Liu, Haikuan Feng, Xiaodong Yang and Hao Yang
Agriculture 2025, 15(1), 85; https://doi.org/10.3390/agriculture15010085 - 2 Jan 2025
Cited by 1 | Viewed by 1050
Abstract
The maize tassel represents one of the most pivotal organs dictating maize yield and quality. Investigating its phenotypic information constitutes an exceedingly crucial task within the realm of breeding work, given that an optimal tassel structure is fundamental for attaining high maize yields. High-throughput phenotyping technologies furnish significant tools to augment the efficiency of analyzing maize tassel phenotypic information. Towards this end, we engineered a fully automated multi-angle digital imaging apparatus dedicated to maize tassels. This device was employed to capture images of tassels from 1227 inbred maize lines falling under three genotype classifications (NSS, TST, and SS). By leveraging the 3D reconstruction algorithm SFM (Structure from Motion), we promptly obtained point clouds of the maize tassels. Subsequently, we harnessed the TreeQSM algorithm, which is custom-designed for extracting tree topological structures, to extract 11 archetypal structural phenotypic parameters of the maize tassels. These encompassed main spike diameter, crown height, main spike length, stem length, stem diameter, the number of branches, total branch length, average crown diameter, maximum crown diameter, convex hull volume, and crown area. Finally, we compared the GFC (Gaussian Fuzzy Clustering algorithm) used in this study with commonly used algorithms, such as RF (Random Forest), SVM (Support Vector Machine), and BPNN (BP Neural Network), as well as k-Means, HCM (Hierarchical), and FCM (Fuzzy C-Means). We then conducted a correlation analysis between the extracted phenotypic parameters of the maize tassel structure and the genotypes of the maize materials. The research results showed that the Gaussian Fuzzy Clustering algorithm was the optimal choice for clustering maize genotypes. Specifically, its classification accuracies for the Non-Stiff Stalk (NSS) genotype and the Tropical and Subtropical (TST) genotype reached 67.7% and 78.5%, respectively. Moreover, among the materials with different maize genotypes, the number of branches, the total branch length, and the main spike length were the three indicators with the highest variability, while the crown volume, the average crown diameter, and the crown area were the three indicators with the lowest variability. This not only provided an important reference for the in-depth exploration of the variability of the phenotypic parameters of maize tassels but also opened up a new approach for screening breeding materials. Full article
(This article belongs to the Section Crop Genetics, Genomics and Breeding)
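A hedged sketch of clustering tassels by their 11 structural parameters. scikit-learn's GaussianMixture is used here only as a stand-in for the paper's Gaussian fuzzy clustering (both assign soft, probabilistic memberships), with k-means shown for comparison; the feature matrix is synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
X = rng.normal(size=(300, 11))   # 11 tassel structure parameters per plant (synthetic)
X = StandardScaler().fit_transform(X)

gmm = GaussianMixture(n_components=3, random_state=9).fit(X)    # three genotype groups (NSS/TST/SS)
soft_membership = gmm.predict_proba(X)                          # fuzzy-style membership degrees
hard_labels = KMeans(n_clusters=3, n_init=10, random_state=9).fit_predict(X)

print(soft_membership[:3].round(2))   # soft assignments for the first three tassels
print(hard_labels[:10])               # hard k-means labels for comparison
```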
