Search Results (23,043)

Search Parameters:
Keywords = Accuracy assessment

13 pages, 3044 KiB  
Article
Improving Event Data in Football Matches: A Case Study Model for Synchronizing Passing Events with Positional Data
by Alberto Cortez, Bruno Gonçalves, João Brito and Hugo Folgado
Appl. Sci. 2025, 15(15), 8694; https://doi.org/10.3390/app15158694 (registering DOI) - 6 Aug 2025
Abstract
In football, accurately pinpointing key events like passes is vital for analyzing player and team performance. Despite continuous technological advancements, existing tracking systems still face challenges in accurately synchronizing event and positional data. This case study proposes a new method to synchronize event and positional data collected during football matches. Three datasets were used to perform this study: a dataset created by applying a custom algorithm that synchronizes positional and event data, referred to as the optimized synchronization dataset (OSD); a simple temporal alignment between positional and event data, referred to as the raw synchronization dataset (RSD); and manual notational data (MND) from the match video footage, considered the ground-truth observations. The timestamp of each pass in both synchronized datasets was compared to the ground-truth observations (MND). Spatial differences in the OSD were also compared to the RSD data and to the original data from the provider. Root mean square error (RMSE) and mean absolute error (MAE) were utilized to assess the accuracy of both procedures. More accurate results were observed for the optimized dataset, with RMSE values of RSD = 75.16 ms (milliseconds) and OSD = 72.7 ms, and MAE values of RSD = 60.50 ms and OSD = 59.73 ms. Spatial accuracy also improved, with the OSD showing reduced deviation from the RSD compared to the original event data. The mean positional deviation was reduced from 1.59 ± 0.82 m in the original event data to 0.41 ± 0.75 m in the RSD. In conclusion, the model offers a more accurate method for synchronizing independent event and positional datasets. This is particularly beneficial for applications where precise timing and spatial location of actions are critical. In contrast to previous synchronization methods, this approach simplifies the process by using an automated technique based on patterns of ball velocity. This streamlines synchronization across datasets, reduces the need for manual intervention, and makes the method more practical for routine use in applied settings. Full article
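As a rough illustration of the timing-accuracy assessment described above (not the authors' implementation), the sketch below computes RMSE and MAE between hypothetical synchronized pass timestamps and ground-truth annotations; all values are invented for demonstration.

```python
import numpy as np

# Hypothetical pass timestamps in milliseconds: ground truth (MND) vs. two synchronized datasets
mnd = np.array([1200.0, 5430.0, 9875.0, 14210.0])   # manual notational ground truth
rsd = np.array([1275.0, 5370.0, 9940.0, 14150.0])   # raw synchronization dataset
osd = np.array([1265.0, 5385.0, 9930.0, 14160.0])   # optimized synchronization dataset

def rmse(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

def mae(pred, truth):
    return float(np.mean(np.abs(pred - truth)))

for name, data in [("RSD", rsd), ("OSD", osd)]:
    print(f"{name}: RMSE = {rmse(data, mnd):.2f} ms, MAE = {mae(data, mnd):.2f} ms")
```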

17 pages, 3074 KiB  
Article
Finite Element Model Updating for a Continuous Beam-Arch Composite Bridge Based on the RSM and a Nutcracker Optimization Algorithm
by Weihua Zhou, Hongyin Yang, Jing Hao, Mengxiang Zhai, Hongyou Cao, Zhangjun Liu and Kang Wang
Sensors 2025, 25(15), 4831; https://doi.org/10.3390/s25154831 (registering DOI) - 6 Aug 2025
Abstract
Accurate finite element (FE) models are essential for the safety assessment of civil engineering structures. However, obtaining reliable model parameters for existing bridges remains challenging due to the inability to conduct static load tests without disrupting traffic flow. To address this, the study proposes an FE model updating framework that integrates the response surface method and the nutcracker optimization algorithm (NOA). This framework is characterized by the incorporation of ambient vibration data into parameter optimization, thereby enhancing model accuracy. The stochastic subspace identification method is first adopted to extract the bridge’s natural frequencies from vibration data. The response surface method is then employed to construct a response surface function that approximates the FE model. The NOA is subsequently applied to iteratively optimize this response surface function, ensuring rapid convergence and the precise adjustment of the FE model parameters. To validate the effectiveness of the proposed framework, a continuous beam–arch composite bridge with a span of 204.783 m was selected as a case study. The results indicate that the proposed method reduced the average frequency error from 5.58% to 2.75% by updating the model parameters. While the whale optimization algorithm required 21 iterations and the grey wolf optimizer needed 41 iterations to converge near the minimum, the NOA achieved this in merely 13 iterations, demonstrating the NOA’s superior convergence speed. Furthermore, the NOA significantly outperformed both the whale optimization algorithm and the grey wolf optimizer in reducing the error of the first transverse vibration frequency. Full article
(This article belongs to the Section Fault Diagnosis & Sensors)
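A minimal sketch of the surrogate-based updating idea, under strong assumptions: the response surface is approximated by a quadratic least-squares fit to hypothetical trial runs, and SciPy's differential evolution stands in for the nutcracker optimization algorithm, which is not publicly packaged.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical trial runs: two FE parameters (e.g., stiffness scaling factors) -> frequency error (%)
X = np.array([[0.80, 0.90], [1.00, 1.00], [1.20, 1.10],
              [0.90, 1.20], [1.10, 0.80], [1.05, 1.05]])
y = np.array([6.1, 5.6, 4.9, 5.2, 5.0, 4.8])

# Quadratic response surface: constant, linear, interaction and squared terms
def design(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2], axis=-1)

coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)
surrogate = lambda p: float(design(np.atleast_2d(p))[0] @ coef)

# Stand-in global optimizer; the paper uses the nutcracker optimization algorithm instead
result = differential_evolution(surrogate, bounds=[(0.8, 1.2), (0.8, 1.2)], seed=0)
print("updated parameters:", result.x, "predicted frequency error (%):", round(result.fun, 2))
```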

11 pages, 1461 KiB  
Article
Comparative Analysis of Orbital Morphology Accuracy in 3D Models Based on Cone-Beam and Fan-Beam Computed Tomography Scans for Reconstructive Planning
by Natalia Bielecka-Kowalska, Bartosz Bielecki-Kowalski and Marcin Kozakiewicz
J. Clin. Med. 2025, 14(15), 5541; https://doi.org/10.3390/jcm14155541 (registering DOI) - 6 Aug 2025
Abstract
Background/Objectives: Orbital reconstruction remains one of the most demanding procedures in maxillofacial surgery. It requires not only precise anatomical knowledge but also poses multiple intraoperative challenges. Limited surgical visibility—especially in transconjunctival or transcaruncular approaches—demands exceptional precision from the surgeon. At the same time, the complex anatomical structure of the orbit, its rich vascularization and innervation, and the risk of severe postoperative complications—such as diplopia, sensory deficits, impaired ocular mobility, or in the most serious cases, post-traumatic blindness due to nerve injury or orbital compartment syndrome—necessitate the highest level of surgical accuracy. In this context, patient-specific implants (PSIs), commonly fabricated from zirconium oxide or ultra-high-density polyethylene, have become invaluable. Within CAD-based reconstructive planning, especially for orbital implants, critical factors include the implant’s anatomical fit, passive stabilization on intact bony structures, and non-interference with orbital soft tissues. Above all, precise replication of the orbital dimensions is essential for optimal clinical outcomes. This study compares the morphological accuracy of orbital structures based on anthropometric measurements from 3D models generated from fan-beam computed tomography (FBCT) and cone-beam computed tomography (CBCT). Methods: A cohort of 500 Caucasian patients aged 8 to 88 years was analyzed. Three-dimensional (3D) models of the orbits were generated from FBCT and CBCT scans. Anthropometric measurements were taken to evaluate the morphological accuracy of the orbital structures. The assessed parameters included orbital depth, orbital width, the distance from the infraorbital rim to the infraorbital foramen, the distance between the piriform aperture and the infraorbital foramen, and the distance from the zygomatico-orbital foramen to the infraorbital rim. Results: Statistically significant differences were observed between virtual models derived from FBCT and those based on CBCT in several key parameters. Discrepancies were particularly evident in measurements of orbital depth, orbital width, the distance from the infraorbital rim to the infraorbital foramen, the distance between the piriform aperture and the infraorbital foramen, and the distance from the zygomatico-orbital foramen to the infraorbital rim. Conclusions: The statistically significant discrepancies in selected orbital dimensions—particularly in regions of so-called thin bone—demonstrate that FBCT remains the gold standard in the planning and design of CAD/CAM patient-specific orbital implants. Despite its advantages, including greater accessibility and lower radiation dose, CBCT shows limited reliability in the context of orbital and infraorbital reconstruction planning. Full article
(This article belongs to the Special Issue State-of-the-Art Innovations in Oral and Maxillofacial Surgery)
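To illustrate the kind of paired comparison such a study relies on (the paper's actual statistical procedure is not detailed in this abstract), the sketch below runs a paired non-parametric test on hypothetical orbital-depth measurements taken from FBCT- and CBCT-based models of the same patients.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired orbital-depth measurements (mm) from FBCT- and CBCT-based 3D models
fbct = np.array([42.1, 44.3, 41.8, 43.5, 45.0, 42.7, 43.9, 44.1])
cbct = np.array([41.5, 43.6, 41.0, 43.0, 44.1, 42.0, 43.2, 43.3])

stat, p = wilcoxon(fbct, cbct)      # paired non-parametric test for systematic differences
bias = float(np.mean(cbct - fbct))  # mean CBCT minus FBCT difference
print(f"Wilcoxon p = {p:.4f}, mean bias (CBCT - FBCT) = {bias:.2f} mm")
```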

16 pages, 7134 KiB  
Article
The Impact of an Object’s Surface Material and Preparatory Actions on the Accuracy of Optical Coordinate Measurement
by Danuta Owczarek, Ksenia Ostrowska, Jerzy Sładek, Adam Gąska, Wiktor Harmatys, Krzysztof Tomczyk, Danijela Ignjatović and Marek Sieja
Materials 2025, 18(15), 3693; https://doi.org/10.3390/ma18153693 (registering DOI) - 6 Aug 2025
Abstract
Optical coordinate measurement is a universal technique that aligns with the rapid development of industrial technologies and new materials. Nevertheless, can this technique be consistently effective when applied to the precise measurement of all types of materials? As shown in this article, an analysis of optical measurement systems reveals that some materials cause difficulties during the scanning process. This article details the matting process, which, as demonstrated, yields lower measurement uncertainty values than the pre-matting state, and identifies materials for which applying a matting spray significantly improves measurement quality. The authors propose a classification of materials into easy-to-scan and hard-to-scan groups, along with specific procedures to improve measurements, especially for the latter. Tests were conducted in an accredited Laboratory of Coordinate Metrology using an articulated arm with a laser probe. Measured objects included spheres made of ceramic, tungsten carbide (including a matte finish), aluminum oxide, titanium nitride-coated steel, and photopolymer resin, with reference diameters established by a high-precision Leitz PMM 12106 coordinate measuring machine. Diameters were determined from point clouds obtained via optical measurements using the best-fit method, both before and after matting. Color measurements using a spectrocolorimeter supplemented this study to assess the effect of matting on surface color. The results revealed correlations between the material type and measurement accuracy. Full article
(This article belongs to the Section Optical and Photonic Materials)
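As a generic example of determining a sphere diameter from a point cloud with a best-fit approach (the laboratory's actual fitting software is not specified here), the sketch below performs an algebraic least-squares sphere fit on synthetic, lightly noisy scan data.

```python
import numpy as np

# Synthetic point cloud sampled from a sphere (mm) with slight scanner-like noise
rng = np.random.default_rng(0)
center_true, r_true = np.array([10.0, -5.0, 3.0]), 12.7
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = center_true + r_true * dirs + rng.normal(scale=0.02, size=(500, 3))

# Algebraic least-squares sphere fit: ||p||^2 = 2 p.c + (r^2 - ||c||^2)
A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
b = np.sum(pts ** 2, axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
center = sol[:3]
radius = float(np.sqrt(sol[3] + center @ center))
print(f"fitted diameter: {2 * radius:.3f} mm")
```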

14 pages, 1252 KiB  
Article
Non-Invasive Prediction of Atrial Fibrosis Using a Regression Tree Model of Mean Left Atrial Voltage
by Javier Ibero, Ignacio García-Bolao, Gabriel Ballesteros, Pablo Ramos, Ramón Albarrán-Rincón, Leire Moriones, Jean Bragard and Inés Díaz-Dorronsoro
Biomedicines 2025, 13(8), 1917; https://doi.org/10.3390/biomedicines13081917 (registering DOI) - 6 Aug 2025
Abstract
Background: Atrial fibrosis is a key contributor to atrial cardiomyopathy and can be assessed invasively using mean left atrial voltage (MLAV) from electroanatomical mapping. However, the invasive nature of this procedure limits its clinical applicability. Machine learning (ML), particularly regression tree-based models, may offer a non-invasive approach for predicting MLAV using clinical and echocardiographic data, improving non-invasive atrial fibrosis characterisation beyond current dichotomous classifications. Methods: We prospectively included and followed 113 patients with paroxysmal or persistent atrial fibrillation (AF) undergoing pulmonary vein isolation (PVI) with ultra-high-density voltage mapping (uHDvM), from whom MLAV was estimated. Standardised two-dimensional transthoracic echocardiography was performed before ablation, and clinical and echocardiographic variables were analysed. A regression tree model was constructed using the Classification and Regression Trees (CART) algorithm to identify key predictors of MLAV. Results: The regression tree model exhibited moderate predictive accuracy (R2 = 0.63; 95% CI: 0.55–0.71; root mean squared error = 0.90; 95% CI: 0.82–0.98), with indexed minimum LA volume and passive emptying fraction emerging as the most influential variables. No significant differences in AF recurrence-free survival were found among MLAV tertiles or model-generated groups (log-rank p = 0.319 and p = 0.126, respectively). Conclusions: We present a novel ML-based regression tree model for non-invasive prediction of MLAV, identifying minimum LA volume and passive emptying fraction as the most significant predictors. This model offers an accessible, non-invasive tool for refining atrial cardiomyopathy characterisation by reflecting the fibrotic substrate as a continuum, a crucial advancement over existing dichotomous approaches to guide tailored therapeutic strategies. Full article
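A minimal sketch of a CART-style regression on this kind of problem, using scikit-learn's DecisionTreeRegressor and synthetic echo-style predictors (the variable names and effect sizes are assumptions, not the study's data):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic predictors: indexed minimum LA volume (mL/m2) and passive emptying fraction (%)
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(10, 40, 300), rng.uniform(10, 50, 300)])
# Synthetic MLAV (mV): lower voltage with larger LA volume, higher with better emptying
y = 3.0 - 0.05 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(scale=0.3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_tr, y_tr)
pred = tree.predict(X_te)
print(f"R2 = {r2_score(y_te, pred):.2f}, RMSE = {mean_squared_error(y_te, pred) ** 0.5:.2f} mV")
```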

14 pages, 1437 KiB  
Article
Age-Stratified Classification of Common Middle Ear Pathologies Using Pressure-Less Acoustic Immittance (PLAI™) and Machine Learning
by Aleksandar Miladinović, Francesco Bassi, Miloš Ajčević and Agostino Accardo
Healthcare 2025, 13(15), 1921; https://doi.org/10.3390/healthcare13151921 (registering DOI) - 6 Aug 2025
Abstract
Background/Objective: This study explores a novel approach for diagnosing common middle ear pathologies using Pressure-Less Acoustic Immittance (PLAI™), a non-invasive alternative to conventional tympanometry. Methods: A total of 516 ear measurements were collected and stratified into three age groups: 0–3, 3–12, and 12+ years, reflecting key developmental stages. PLAI™-derived acoustic parameters, including resonant frequency, peak admittance, canal volume, and resonance peak frequency boundaries, were analyzed using Random Forest classifiers, with SMOTE addressing class imbalance and SHAP values assessing feature importance. Results: Age-specific models demonstrated superior diagnostic accuracy compared to non-stratified approaches, with macro F1-scores of 0.79, 0.84, and 0.78, respectively. Resonant frequency, ear canal volume, and peak admittance consistently emerged as the most informative features. Notably, age-based stratification significantly reduced false negative rates for conditions such as Otitis Media with Effusion and tympanic membrane retractions, enhancing clinical reliability. These results underscore the relevance of age-aware modeling in pediatric audiology and validate PLAI™ as a promising tool for early, pressure-free middle ear diagnostics. Conclusions: While further validation on larger, balanced cohorts is recommended, this study supports the integration of machine learning and acoustic immittance into more accurate, developmentally informed screening frameworks. Full article
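To make the SMOTE-plus-Random-Forest setup concrete, here is a hedged sketch on synthetic imbalanced data for one hypothetical age stratum; the SHAP interpretability step is omitted, and feature counts and class proportions are assumptions.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Stand-in for one age stratum: four acoustic features, imbalanced three-class labels
X, y = make_classification(n_samples=400, n_features=4, n_informative=3, n_redundant=0,
                           n_classes=3, n_clusters_per_class=1,
                           weights=[0.6, 0.3, 0.1], random_state=0)

# Oversample minority classes inside each CV fold, then fit a Random Forest
pipe = Pipeline([("smote", SMOTE(random_state=0)),
                 ("rf", RandomForestClassifier(n_estimators=200, random_state=0))])
scores = cross_val_score(pipe, X, y, cv=5, scoring="f1_macro")
print(f"macro F1 (5-fold CV): {scores.mean():.2f}")
```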

20 pages, 1938 KiB  
Article
A Fuzzy MCDM-Based Deep Multi-View Clustering Approach for Large-Scale Multi-View Data Analysis
by Yueyao Li and Bin Wu
Symmetry 2025, 17(8), 1253; https://doi.org/10.3390/sym17081253 (registering DOI) - 6 Aug 2025
Abstract
Multidimensional clustering of large-scale multi-view data is an important topic because it makes it possible to combine a variety of manifestations of a complex information set. Nevertheless, comparing and selecting the most suitable deep clustering method is not an easy task, especially when several opposing criteria are applied. Multi-criteria decision-making (MCDM) techniques provide systematic approaches to making such judgments, although they are often limited in their ability to handle uncertainty, imprecise judgments, and interdependencies in practice. To solve these problems, this paper suggests a circular Fermatean fuzzy technique for order preference by similarity to ideal solution (CFF-TOPSIS) method, which combines improved fuzzy modeling with MCDM to make the decision-making process accurate and sound. By exploiting the intrinsic symmetry of TOPSIS, where distances to positive and negative ideal solutions are treated symmetrically, the proposed model integrates five evaluation criteria for assessing clustering adequacy, namely clustering accuracy, scalability, computational complexity, robustness, and interpretability, to critically evaluate five alternative clustering methods based on the input of three decision-makers. This measurement is performed efficiently by the CFF-TOPSIS method based on the uncertainty and subjective judgment contained within circular Fermatean fuzzy sets (CFFSs). The model is reliable and superior to existing models, as confirmed by sensitivity and comparative analyses. The suggested approach provides a systematic and flexible method for making decisions in complex big-data settings, while maintaining symmetry in the evaluation of alternatives and criteria. Full article
(This article belongs to the Section Mathematics)
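For orientation, the sketch below runs classical crisp TOPSIS, whose symmetric distances to the positive and negative ideal solutions underpin the method; the circular Fermatean fuzzy extension is omitted, and the decision matrix and weights are purely hypothetical.

```python
import numpy as np

# Hypothetical crisp decision matrix: 5 clustering methods x 5 criteria
# (accuracy, scalability, computational complexity, robustness, interpretability)
D = np.array([[0.82, 0.7, 0.4, 0.75, 0.6],
              [0.78, 0.9, 0.3, 0.70, 0.5],
              [0.85, 0.6, 0.6, 0.80, 0.7],
              [0.80, 0.8, 0.5, 0.65, 0.8],
              [0.76, 0.5, 0.2, 0.85, 0.4]])
w = np.array([0.30, 0.20, 0.15, 0.20, 0.15])          # illustrative criterion weights
benefit = np.array([True, True, False, True, True])   # complexity treated as a cost criterion

R = D / np.linalg.norm(D, axis=0)                     # vector-normalize each criterion column
V = R * w                                             # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)             # distance to the positive ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)              # distance to the negative ideal solution
closeness = d_neg / (d_pos + d_neg)                   # symmetric relative closeness
print("ranking (best first):", np.argsort(-closeness) + 1)
```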

15 pages, 2070 KiB  
Article
Machine Learning for Personalized Prediction of Electrocardiogram (EKG) Use in Emergency Care
by Hairong Wang and Xingyu Zhang
J. Pers. Med. 2025, 15(8), 358; https://doi.org/10.3390/jpm15080358 (registering DOI) - 6 Aug 2025
Abstract
Background: Electrocardiograms (EKGs) are essential tools in emergency medicine, often used to evaluate chest pain, dyspnea, and other symptoms suggestive of cardiac dysfunction. Yet, EKGs are not universally administered to all emergency department (ED) patients. Understanding and predicting which patients receive an EKG may offer insights into clinical decision making, resource allocation, and potential disparities in care. This study examines whether integrating structured clinical data with free-text patient narratives can improve prediction of EKG utilization in the ED. Methods: We conducted a retrospective observational study to predict electrocardiogram (EKG) utilization using data from 13,115 adult emergency department (ED) visits in the nationally representative 2021 National Hospital Ambulatory Medical Care Survey–Emergency Department (NHAMCS-ED), leveraging both structured features—demographics, vital signs, comorbidities, arrival mode, and triage acuity, with the most influential selected via Lasso regression—and unstructured patient narratives transformed into numerical embeddings using Clinical-BERT. Four supervised learning models—Logistic Regression (LR), Support Vector Machine (SVM), Random Forest (RF) and Extreme Gradient Boosting (XGB)—were trained on three inputs (structured data only, text embeddings only, and a late-fusion combined model); hyperparameters were optimized by grid search with 5-fold cross-validation; performance was evaluated via AUROC, accuracy, sensitivity, specificity and precision; and interpretability was assessed using SHAP values and Permutation Feature Importance. Results: EKGs were administered in 30.6% of adult ED visits. Patients who received EKGs were more likely to be older, White, Medicare-insured, and to present with abnormal vital signs or higher triage severity. Across all models, the combined data approach yielded superior predictive performance. The SVM and LR achieved the highest area under the ROC curve (AUC = 0.860 and 0.861) when using both structured and unstructured data, compared to 0.772 with structured data alone and 0.823 and 0.822 with unstructured data alone. Similar improvements were observed in accuracy, sensitivity, and specificity. Conclusions: Integrating structured clinical data with patient narratives significantly enhances the ability to predict EKG utilization in the emergency department. These findings support a personalized medicine framework by demonstrating how multimodal data integration can enable individualized, real-time decision support in the ED. Full article
(This article belongs to the Special Issue Machine Learning in Epidemiology)
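A minimal sketch of one late-fusion variant, assuming synthetic stand-ins for the structured triage features and the narrative embeddings; probabilities from two logistic models are simply averaged, which may differ from the paper's exact fusion scheme.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: structured triage features and pre-computed narrative embeddings
rng = np.random.default_rng(0)
n = 1000
X_struct = rng.normal(size=(n, 6))     # e.g., scaled age, vital signs, acuity
X_text = rng.normal(size=(n, 32))      # e.g., a reduced Clinical-BERT embedding
logit = 1.2 * X_struct[:, 0] + 0.8 * X_text[:, 0] - 0.5
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

idx_tr, idx_te = train_test_split(np.arange(n), random_state=0)
m_struct = LogisticRegression(max_iter=1000).fit(X_struct[idx_tr], y[idx_tr])
m_text = LogisticRegression(max_iter=1000).fit(X_text[idx_tr], y[idx_tr])

# Late fusion by averaging the two models' predicted probabilities
p_fused = 0.5 * (m_struct.predict_proba(X_struct[idx_te])[:, 1]
                 + m_text.predict_proba(X_text[idx_te])[:, 1])
print(f"fused AUROC: {roc_auc_score(y[idx_te], p_fused):.3f}")
```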

18 pages, 617 KiB  
Article
GNR: Genetic-Embedded Nuclear Reaction Optimization with F-Score Filter for Gene Selection in Cancer Classification
by Shahad Alkamli and Hala Alshamlan
Int. J. Mol. Sci. 2025, 26(15), 7587; https://doi.org/10.3390/ijms26157587 - 6 Aug 2025
Abstract
The classification of cancer based on gene expression profiles is a central challenge in precision oncology due to the high dimensionality and low sample size inherent in microarray datasets. Effective gene selection is crucial for improving classification accuracy while minimizing computational overhead and model complexity. This study introduces Genetic-Embedded Nuclear Reaction Optimization (GNR), a novel hybrid metaheuristic that enhances the conventional Nuclear Reaction Optimization (NRO) algorithm by embedding a genetic uniform crossover mechanism into its fusion phase. The proposed algorithm leverages a two-stage process: an initial F-score filtering step to reduce dimensionality, followed by GNR-driven optimization to identify compact, informative gene subsets. Evaluations were conducted on six widely used microarray cancer datasets, with Support Vector Machines (SVM) employed as classifiers and performance assessed via Leave-One-Out Cross-Validation (LOOCV). Results show that GNR consistently outperforms the original NRO and several benchmark hybrid algorithms, achieving 100% classification accuracy with significantly smaller gene subsets across all datasets. These findings confirm the efficacy of the genetic-embedded fusion strategy in enhancing local exploitation while preserving the global search capabilities of NRO, thereby offering a robust and interpretable approach for gene selection in cancer classification. Full article
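The sketch below illustrates only the first stage of such a pipeline: an ANOVA F-score filter followed by a linear SVM evaluated with LOOCV on synthetic microarray-like data; the GNR metaheuristic search itself is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic high-dimensional, small-sample stand-in for a microarray dataset
X, y = make_classification(n_samples=60, n_features=500, n_informative=10,
                           n_redundant=0, random_state=0)

# Stage 1 only: an F-score filter keeps the top-ranked genes before the metaheuristic search
pipe = make_pipeline(SelectKBest(f_classif, k=20), SVC(kernel="linear"))
accuracy = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy with F-score-filtered features: {accuracy:.2f}")
```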

22 pages, 7820 KiB  
Article
A Junction Temperature Prediction Method Based on Multivariate Linear Regression Using Current Fall Characteristics of SiC MOSFETs
by Haihong Qin, Yang Zhang, Yu Zeng, Yuan Kang, Ziyue Zhu and Fan Wu
Sensors 2025, 25(15), 4828; https://doi.org/10.3390/s25154828 - 6 Aug 2025
Abstract
The junction temperature (Tj) is a key parameter reflecting the thermal behavior of silicon carbide (SiC) MOSFETs and is essential for condition monitoring and reliability assessment in power electronic systems. However, the limited temperature sensitivity of switching characteristics makes it difficult for traditional single temperature-sensitive electrical parameters (TSEPs) to achieve accurate estimation. To address this challenge and enable practical thermal sensing applications, this study proposes an accurate, application-oriented Tj estimation method based on multivariate linear regression (MLR) using turn-off current fall time (tfi) and fall loss (Efi) as complementary TSEPs. First, the feasibility of using current fall time and current fall energy loss as TSEPs is demonstrated. Then, a coupled junction temperature prediction model is developed based on multivariate linear regression using tfi and Efi. The proposed method is experimentally validated through comparative analysis. Experimental results demonstrate that the proposed method achieves high prediction accuracy, highlighting the effectiveness and superiority of the MLR approach based on the current fall phase characteristics of SiC MOSFETs. This method offers promising prospects for enhancing the condition monitoring, reliability assessment, and intelligent sensing capabilities of power electronics systems. Full article
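A two-predictor linear regression of this kind is straightforward to sketch; the calibration points below are invented for illustration and are not the paper's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical calibration data: fall time (ns) and fall loss (uJ) versus junction temperature (deg C)
t_fi = np.array([18.0, 19.1, 20.3, 21.6, 22.8, 24.1])
E_fi = np.array([42.0, 44.5, 47.2, 50.1, 53.0, 56.2])
T_j = np.array([25.0, 50.0, 75.0, 100.0, 125.0, 150.0])

X = np.column_stack([t_fi, E_fi])
model = LinearRegression().fit(X, T_j)          # Tj = a0 + a1*tfi + a2*Efi

# Estimate Tj for a new measured turn-off event
print("predicted Tj (deg C):", round(float(model.predict([[21.0, 48.5]])[0]), 1))
```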

11 pages, 443 KiB  
Article
Cognitive Screening with the Italian International HIV Dementia Scale in People Living with HIV: A Cross-Sectional Study in the cART Era
by Maristella Belfiori, Francesco Salis, Sergio Angioni, Claudia Bonalumi, Diva Cabeccia, Camilla Onnis, Nicola Pirisi, Francesco Ortu, Paola Piano, Stefano Del Giacco and Antonella Mandas
Infect. Dis. Rep. 2025, 17(4), 95; https://doi.org/10.3390/idr17040095 (registering DOI) - 6 Aug 2025
Abstract
Background: HIV-associated neurocognitive disorders (HANDs) continue to be a significant concern, despite the advancements in prognosis achieved through Combination Antiretroviral Therapy (cART). Neuropsychological assessment, recommended by international guidelines for HANDs diagnosis, can be resource-intensive. Brief screening tools, like the International HIV Dementia Scale (IHDS) and the Montreal Cognitive Assessment (MoCA), are crucial in facilitating initial evaluations. This study aims to assess the Italian IHDS (IHDS-IT) and evaluate its sensitivity and specificity in detecting cognitive impairment in HIV patients. Methods: This cross-sectional study involved 294 patients aged ≥30 years, evaluated at the Immunology Unit of the University of Cagliari. Cognitive function was assessed using the MoCA and IHDS. Laboratory parameters, such as CD4 nadir, current CD4 count, and HIV-RNA levels, were also collected. Statistical analyses included Spearman’s correlation, Receiver Operating Characteristic analysis, and the Youden J statistic to identify the optimal IHDS-IT cut-off for cognitive impairment detection. Results: The IHDS and MoCA scores showed a moderate positive correlation (Spearman’s rho = 0.411, p < 0.0001). ROC analysis identified an IHDS-IT cut-off of ≤9, yielding an Area Under the Curve (AUC) of 0.76, sensitivity of 71.7%, and specificity of 67.2%. At this threshold, 73.1% of patients with MoCA scores below 23 also presented abnormal IHDS scores, highlighting the complementary utility of both cognitive assessment instruments. Conclusions: The IHDS-IT exhibited fair diagnostic accuracy for detecting cognitive impairment, with a lower optimal cut-off than previously reported. The observed differences may reflect this study cohort’s demographic and clinical characteristics, including advanced age and long-lasting HIV infection. Further longitudinal studies are necessary to validate these findings and to confirm the proposed IHDS cut-off over extended periods. Full article
(This article belongs to the Section HIV-AIDS)
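The mechanics of deriving an optimal cut-off via ROC analysis and the Youden J statistic can be sketched as follows; the simulated IHDS scores and group sizes are invented and do not reflect the study cohort.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated IHDS scores: impaired patients (per MoCA) tend to score lower
rng = np.random.default_rng(2)
impaired = rng.normal(8.0, 1.5, 120)
normal = rng.normal(10.5, 1.5, 170)
scores = np.concatenate([impaired, normal])
labels = np.concatenate([np.ones(120), np.zeros(170)])

# Lower IHDS indicates impairment, so negate the score before the ROC analysis
fpr, tpr, thr = roc_curve(labels, -scores)
youden = tpr - fpr
best = int(np.argmax(youden))
print(f"AUC = {roc_auc_score(labels, -scores):.2f}, optimal cut-off: IHDS <= {-thr[best]:.1f}")
```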

18 pages, 8000 KiB  
Article
Phenology-Aware Machine Learning Framework for Chlorophyll Estimation in Cotton Using Hyperspectral Reflectance
by Chunbo Jiang, Yi Cheng, Yongfu Li, Lei Peng, Gangshang Dong, Ning Lai and Qinglong Geng
Remote Sens. 2025, 17(15), 2713; https://doi.org/10.3390/rs17152713 - 6 Aug 2025
Abstract
Accurate and non-destructive monitoring of leaf chlorophyll content (LCC) is essential for assessing crop photosynthetic activity and nitrogen status in precision agriculture. This study introduces a phenology-aware machine learning framework that combines hyperspectral reflectance data with various regression models to estimate leaf chlorophyll content (LCC) in cotton at six key reproductive stages. Field experiments utilized synchronized spectral and SPAD measurements, incorporating spectral transformations—such as vegetation indices (VIs), first-order derivatives, and trilateration edge parameters (TEPs, a new set of geometric metrics for red-edge characterization)—for evaluation. Five regression approaches were evaluated, including univariate and multivariate linear models, along with three machine learning algorithms: Random Forest, K-Nearest Neighbor, and Support Vector Regression. Random Forest consistently outperformed the other models, achieving the highest R2 (0.85) and the lowest RMSE (4.1) during the bud stage. Notably, the optimal prediction accuracy was achieved with fewer than five spectral features. The proposed framework demonstrates the potential for scalable, stage-specific monitoring of chlorophyll dynamics and offers valuable insights for large-scale crop management applications. Full article
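A hedged sketch of the Random Forest regression step for a single growth stage, using synthetic reflectance values, NDVI as one example vegetation index, and an assumed stand-in for a red-edge derivative feature (not the paper's TEPs):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic reflectance features for one growth stage
rng = np.random.default_rng(3)
red = rng.uniform(0.03, 0.12, 200)                   # reflectance near 670 nm
nir = rng.uniform(0.35, 0.55, 200)                   # reflectance near 800 nm
ndvi = (nir - red) / (nir + red)                     # one candidate vegetation index
red_edge_slope = rng.uniform(0.002, 0.010, 200)      # stand-in first-derivative feature
spad = 20 + 35 * ndvi + 800 * red_edge_slope + rng.normal(scale=2.0, size=200)

X = np.column_stack([ndvi, red_edge_slope])
rf = RandomForestRegressor(n_estimators=300, random_state=0)
print("cross-validated R2:", cross_val_score(rf, X, spad, cv=5, scoring="r2").mean().round(2))
```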

19 pages, 487 KiB  
Review
Smart Clothing and Medical Imaging Innovations for Real-Time Monitoring and Early Detection of Stroke: Bridging Technology and Patient Care
by David Sipos, Kata Vészi, Bence Bogár, Dániel Pető, Gábor Füredi, József Betlehem and Attila András Pandur
Diagnostics 2025, 15(15), 1970; https://doi.org/10.3390/diagnostics15151970 - 6 Aug 2025
Abstract
Stroke is a significant global health concern characterized by the abrupt disruption of cerebral blood flow, leading to neurological impairment. Accurate and timely diagnosis—enabled by imaging modalities such as computed tomography (CT) and magnetic resonance imaging (MRI)—is essential for differentiating stroke types and initiating interventions like thrombolysis, thrombectomy, or surgical management. In parallel, recent advancements in wearable technology, particularly smart clothing, offer new opportunities for stroke prevention, real-time monitoring, and rehabilitation. These garments integrate various sensors, including electrocardiogram (ECG) electrodes, electroencephalography (EEG) caps, electromyography (EMG) sensors, and motion or pressure sensors, to continuously track physiological and functional parameters. For example, ECG shirts monitor cardiac rhythm to detect atrial fibrillation, smart socks assess gait asymmetry for early mobility decline, and EEG caps provide data on neurocognitive recovery during rehabilitation. These technologies support personalized care across the stroke continuum, from early risk detection and acute event monitoring to long-term recovery. Integration with AI-driven analytics further enhances diagnostic accuracy and therapy optimization. This narrative review explores the application of smart clothing in conjunction with traditional imaging to improve stroke management and patient outcomes through a more proactive, connected, and patient-centered approach. Full article

31 pages, 1803 KiB  
Article
A Hybrid Machine Learning Approach for High-Accuracy Energy Consumption Prediction Using Indoor Environmental Quality Sensors
by Bibars Amangeldy, Nurdaulet Tasmurzayev, Timur Imankulov, Baglan Imanbek, Waldemar Wójcik and Yedil Nurakhov
Energies 2025, 18(15), 4164; https://doi.org/10.3390/en18154164 - 6 Aug 2025
Abstract
Accurate forecasting of energy consumption in buildings is essential for achieving energy efficiency and reducing carbon emissions. However, many existing models rely on limited input variables and overlook the complex influence of indoor environmental quality (IEQ). In this study, we assess the performance of hybrid machine learning ensembles for predicting hourly energy demand in a smart office environment using high-frequency IEQ sensor data. Environmental variables including carbon dioxide concentration (CO2), particulate matter (PM2.5), total volatile organic compounds (TVOCs), noise levels, humidity, and temperature were recorded over a four-month period. We evaluated two ensemble configurations combining support vector regression (SVR) with either Random Forest or LightGBM as base learners and Ridge regression as a meta-learner, alongside single-model baselines such as SVR and artificial neural networks (ANN). The SVR combined with Random Forest and Ridge regression demonstrated the highest predictive performance, achieving a mean absolute error (MAE) of 1.20, a mean absolute percentage error (MAPE) of 8.92%, and a coefficient of determination (R2) of 0.82. Feature importance analysis using SHAP values, together with non-parametric statistical testing, identified TVOCs, humidity, and PM2.5 as the most influential predictors of energy use. These findings highlight the value of integrating high-resolution IEQ data into predictive frameworks and demonstrate that such data can significantly improve forecasting accuracy. This effect is attributed to the direct link between these IEQ variables and the activation of energy-intensive systems; fluctuations in humidity drive HVAC energy use for dehumidification, while elevated pollutant levels (TVOCs, PM2.5) trigger increased ventilation to maintain indoor air quality, thus raising the total energy load. Full article
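One of the two described ensemble configurations maps directly onto scikit-learn's StackingRegressor; the sketch below uses synthetic data as a stand-in for the hourly IEQ features, so scores are illustrative only.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for hourly IEQ features (CO2, PM2.5, TVOC, noise, humidity, temperature)
X, y = make_regression(n_samples=600, n_features=6, noise=5.0, random_state=0)

# SVR and Random Forest base learners with a Ridge meta-learner (one of the two configurations)
stack = StackingRegressor(
    estimators=[("svr", make_pipeline(StandardScaler(), SVR(C=10.0))),
                ("rf", RandomForestRegressor(n_estimators=200, random_state=0))],
    final_estimator=Ridge(alpha=1.0))
print("cross-validated R2:", cross_val_score(stack, X, y, cv=5, scoring="r2").mean().round(2))
```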

23 pages, 3337 KiB  
Article
Imbalance Charge Reduction in the Italian Intra-Day Market Using Short-Term Forecasting of Photovoltaic Generation
by Cristina Ventura, Giuseppe Marco Tina and Santi Agatino Rizzo
Energies 2025, 18(15), 4161; https://doi.org/10.3390/en18154161 - 5 Aug 2025
Abstract
In the Italian intra-day electricity market (MI-XBID), where energy positions can be adjusted up to one hour before delivery, imbalance charges due to forecast errors from non-programmable renewable sources represent a critical issue. This work focuses on photovoltaic (PV) systems, whose production variability makes them particularly sensitive to forecast accuracy. To address these challenges, a comprehensive methodology for assessing and mitigating imbalance penalties by integrating a short-term PV forecasting model with a battery energy storage system is proposed. Unlike conventional approaches that focus exclusively on improving statistical accuracy, this study emphasizes the economic and regulatory impact of forecast errors under the current Italian imbalance settlement framework. A hybrid physical-artificial neural network is developed to forecast PV power one hour in advance, combining historical production data and clear-sky irradiance estimates. The resulting imbalances are analyzed using regulatory tolerance thresholds. Simulation results show that, by adopting a control strategy aimed at maintaining the battery’s state of charge around 50%, imbalance penalties can be completely eliminated using a storage system sized for just over 2 equivalent hours of storage capacity. The methodology provides a practical tool for market participants to quantify the benefits of storage integration and can be generalized to other electricity markets where tolerance bands for imbalances are applied. Full article
(This article belongs to the Special Issue Advanced Forecasting Methods for Sustainable Power Grid: 2nd Edition)
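A very simplified sketch of the imbalance-plus-storage accounting: hourly forecast errors are absorbed within the battery's available headroom and any residual beyond a tolerance band is counted as penalized. The tolerance value, energies, and battery size are hypothetical, and the paper's actual control additionally steers the state of charge back toward 50%.

```python
import numpy as np

# Hypothetical hourly energies (kWh): committed market position vs. actual PV production
committed = np.array([120.0, 150.0, 160.0, 140.0, 90.0])
actual = np.array([110.0, 158.0, 150.0, 152.0, 85.0])
tolerance = 0.05                      # illustrative tolerance band on committed energy

capacity, soc = 80.0, 0.5             # battery capacity (kWh) and initial state of charge
penalized_hours = 0
for c, a in zip(committed, actual):
    err = a - c                       # surplus (>0) or deficit (<0) before battery action
    headroom = capacity * (1.0 - soc) if err > 0 else capacity * soc
    exchanged = float(np.clip(err, -headroom, headroom))   # charge surplus / discharge deficit
    soc += exchanged / capacity
    residual = err - exchanged
    if abs(residual) > tolerance * c:
        penalized_hours += 1
print("hours with residual imbalance beyond tolerance:", penalized_hours)
```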
