Search Results (12,266)

Search Parameters:
Keywords = cost-effective approach

33 pages, 7296 KiB  
Article
Unsupervised Binary Classifier-Based Object Detection Algorithm with Integrated Background Subtraction Suitable for Use with Aerial Imagery
by Gabija Veličkaitė, Ignas Daugėla and Ivan Suzdalev
Appl. Sci. 2025, 15(15), 8608; https://doi.org/10.3390/app15158608 (registering DOI) - 3 Aug 2025
Abstract
This research presents the development of a novel object detection algorithm designed to identify humans in natural outdoor environments using minimal computational resources. The proposed system, SARGAS, combines a custom convolutional neural network (CNN) classifier with MOG2 background subtraction and partial affine transformations for camera stabilization. A secondary CNN refines detections and reduces false positives. Unlike conventional supervised models, SARGAS is trained in a partially unsupervised manner, learning to recognize feature patterns without requiring labeled data. The algorithm achieved a recall of 93%, demonstrating strong detection capability even under challenging conditions. However, the overall accuracy reached 65%, due to a higher rate of false positives—an expected trade-off when maximizing recall. This bias is intentional, as missing a human target in search and rescue applications carries a higher cost than producing additional false detections. While supervised models, such as YOLOv5, perform well on data resembling their training sets, they exhibit significant performance degradation on previously unseen footage. In contrast, SARGAS generalizes more effectively, making it a promising candidate for real-world deployment in environments where labeled training data is limited or unavailable. The results establish a solid foundation for further improvements and suggest that unsupervised CNN-based approaches hold strong potential in object detection tasks. Full article
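The core of the SARGAS pipeline is a background-subtraction stage (MOG2 in the actual system) whose foreground regions are then passed to a CNN classifier. As a rough illustration of that first stage only, here is a minimal NumPy frame-differencing sketch; the threshold, frame sizes, and running-average model are illustrative stand-ins, not the paper's MOG2 implementation.

```python
import numpy as np

def running_background(frames, alpha=0.05):
    """Maintain an exponential running-average background model.
    Returns per-frame foreground masks (True = moving-object candidate)."""
    bg = frames[0].astype(float)
    masks = []
    for f in frames[1:]:
        diff = np.abs(f.astype(float) - bg)
        masks.append(diff > 25)            # hypothetical intensity threshold
        bg = (1 - alpha) * bg + alpha * f  # slowly absorb scene changes
    return masks

# Synthetic 8x8 grayscale frames: static scene, then a bright "object" appears.
scene = np.full((8, 8), 50, dtype=np.uint8)
moving = scene.copy()
moving[2:4, 2:4] = 200
masks = running_background([scene, scene, moving])
print(masks[0].sum(), masks[1].sum())  # 0 foreground pixels, then 4
```

In the full system the surviving foreground blobs, not raw frames, would be cropped and handed to the classifier, which keeps the computational cost low.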
29 pages, 9514 KiB  
Article
Kennaugh Elements Allow Early Detection of Bark Beetle Infestation in Temperate Forests Using Sentinel-1 Data
by Christine Hechtl, Sarah Hauser, Andreas Schmitt, Marco Heurich and Anna Wendleder
Forests 2025, 16(8), 1272; https://doi.org/10.3390/f16081272 (registering DOI) - 3 Aug 2025
Abstract
Climate change is generally having a negative impact on forest health by inducing drought stress and favouring the spread of pest species, such as bark beetles. The terrestrial monitoring of bark beetle infestation is very time-consuming, especially in the early stages, and therefore not feasible for extensive areas, emphasising the need for a comprehensive approach based on remote sensing. Although numerous studies have researched the use of optical data for this task, radar data remains comparatively underexplored. Therefore, this study uses the weekly and cloud-free acquisitions of Sentinel-1 in the Bavarian Forest National Park. Time series analysis within a Multi-SAR framework using Random Forest enables the monitoring of moisture content loss and, consequently, the assessment of tree vitality, which is crucial for the detection of stress conditions conducive to bark beetle outbreaks. High accuracies are achieved in predicting future bark beetle infestation (R2 of 0.83–0.89). These results demonstrate that forest vitality trends ranging from healthy to bark beetle-affected states can be mapped, supporting early intervention strategies. The standard deviation of 0.44 to 0.76 years indicates that the model deviates on average by half a year, mainly due to the uncertainty in the reference data. This temporal uncertainty is acceptable, as half a year provides a sufficient window to identify stressed forest areas and implement targeted management actions before bark beetle damage occurs. The successful application of this technique to extensive test sites in the state of North Rhine-Westphalia proves its transferability. For the first time, the results clearly demonstrate the expected relationship between radar backscatter expressed in the Kennaugh elements K0 and K1 and bark beetle infestation, thereby providing an opportunity for the continuous and cost-effective monitoring of forest health from space. Full article
(This article belongs to the Section Forest Health)

17 pages, 1318 KiB  
Article
Mobile and Wireless Autofluorescence Detection Systems and Their Application for Skin Tissues
by Yizhen Wang, Yuyang Zhang, Yunfei Li and Fuhong Cai
Biosensors 2025, 15(8), 501; https://doi.org/10.3390/bios15080501 (registering DOI) - 3 Aug 2025
Abstract
Skin autofluorescence (SAF) detection technology represents a noninvasive, convenient, and cost-effective optical detection approach. It can be employed for the differentiation of various diseases, including metabolic diseases and dermatitis, as well as for monitoring treatment efficacy. Distinct from diffuse reflection signals, the autofluorescence signals of biological tissues are relatively weak, making them challenging to capture with photoelectric sensors. Moreover, the absorption and scattering properties of biological tissues substantially attenuate the autofluorescence signal, thereby worsening the signal-to-noise ratio. This has also imposed limitations on the development and application of compact-sized autofluorescence detection systems. In this study, a compact LED light source and a CMOS sensor were utilized as the excitation and detection devices for skin tissue autofluorescence, respectively, to construct a mobile and wireless skin tissue autofluorescence detection system. This system can achieve the detection of skin tissue autofluorescence with a high signal-to-noise ratio, driven by a simple power supply and a single-chip microcontroller. The detection time is less than 0.1 s. To enhance the stability of the system, a pressure sensor was incorporated. This pressure sensor can monitor the pressure exerted by the skin on the detection system during the testing process, thereby improving the accuracy of the detection signal. The developed system features a compact structure, user-friendliness, and a favorable signal-to-noise ratio of the detection signal, holding significant application potential in future assessments of skin aging and the risk of diabetic complications. Full article
18 pages, 594 KiB  
Article
Leveraging Dynamic Pricing and Real-Time Grid Analysis: A Danish Perspective on Flexible Industry Optimization
by Sreelatha Aihloor Subramanyam, Sina Ghaemi, Hessam Golmohamadi, Amjad Anvari-Moghaddam and Birgitte Bak-Jensen
Energies 2025, 18(15), 4116; https://doi.org/10.3390/en18154116 (registering DOI) - 3 Aug 2025
Abstract
Flexibility is advocated as an effective solution to address the growing need to alleviate grid congestion, necessitating efficient energy management strategies for industrial operations. This paper presents a mixed-integer linear programming (MILP)-based optimization framework for a flexible asset in an industrial setting, aiming to minimize operational costs and enhance energy efficiency. The method integrates dynamic pricing and real-time grid analysis, alongside a state estimation model using Extended Kalman Filtering (EKF) that improves the accuracy of system state predictions. Model Predictive Control (MPC) is employed for real-time adjustments. Real-world case studies from the aquaculture industry and industrial power grids in Denmark demonstrate the approach. By leveraging dynamic pricing and grid signals, the system enables adaptive pump scheduling, achieving a 27% reduction in energy costs while maintaining voltage stability within 0.95–1.05 p.u. and ensuring operational safety. These results confirm the effectiveness of grid-aware, flexible control in reducing costs and enhancing stability, supporting the transition toward smarter, sustainable industrial energy systems. Full article
(This article belongs to the Section F1: Electrical Power System)
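The scheduling idea behind price-driven pump operation can be shown with a toy example: pick the cheapest contiguous window for a fixed-duration run under a Time-of-Use tariff. The paper solves a full MILP with MPC and EKF state estimation; this brute-force sketch and the hourly prices below are illustrative only.

```python
# Toy Time-of-Use scheduler: find the cheapest contiguous window for a
# fixed-duration pump run. (A stand-in for the paper's MILP; prices and
# duration are hypothetical.)
def cheapest_window(prices, duration):
    costs = [sum(prices[t:t + duration]) for t in range(len(prices) - duration + 1)]
    start = min(range(len(costs)), key=costs.__getitem__)
    return start, costs[start]

# Hypothetical hourly prices (EUR/kWh) over 8 hours; pump must run 3 h.
prices = [0.40, 0.35, 0.20, 0.15, 0.18, 0.30, 0.45, 0.50]
start, cost = cheapest_window(prices, 3)
print(start, round(cost, 2))  # starts at hour 2, total 0.53
```

A real MILP additionally encodes grid constraints (e.g. the 0.95–1.05 p.u. voltage band) and process constraints such as minimum water turnover, which is why an optimization solver rather than enumeration is used.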
27 pages, 2929 KiB  
Article
Comparative Performance Analysis of Gene Expression Programming and Linear Regression Models for IRI-Based Pavement Condition Index Prediction
by Mostafa M. Radwan, Majid Faissal Jassim, Samir A. B. Al-Jassim, Mahmoud M. Elnahla and Yasser A. S. Gamal
Eng 2025, 6(8), 183; https://doi.org/10.3390/eng6080183 (registering DOI) - 3 Aug 2025
Abstract
Traditional Pavement Condition Index (PCI) assessments are highly resource-intensive, demanding substantial time and labor while generating significant carbon emissions through extensive field operations. To address these sustainability challenges, this research presents an innovative methodology utilizing Gene Expression Programming (GEP) to determine PCI values based on International Roughness Index (IRI) measurements from Iraqi road networks, offering an environmentally conscious and resource-efficient approach to pavement management. The study incorporated 401 samples of IRI and PCI data through comprehensive visual inspection procedures. The developed GEP model exhibited exceptional predictive performance, with coefficient of determination (R2) values achieving 0.821 for training, 0.858 for validation, and 0.8233 overall, successfully accounting for approximately 82–85% of PCI variance. Prediction accuracy remained robust, with Mean Absolute Error (MAE) values of 12–13 units and Root Mean Square Error (RMSE) values of 11.209 and 11.00 for training and validation sets, respectively. The lower validation RMSE suggests effective generalization without overfitting. Strong correlations between predicted and measured values exceeded 0.90, with acceptable relative absolute error values ranging from 0.403 to 0.387, confirming model effectiveness. Comparative analysis reveals that GEP outperforms alternative regression methods in generalization capacity, particularly in real-world applications. This sustainable methodology represents a cost-effective alternative to conventional PCI evaluation, significantly reducing environmental impact through decreased field operations, lower fuel consumption, and minimized traffic disruption. By streamlining pavement management while maintaining assessment reliability and accuracy, this approach supports environmentally responsible transportation systems and aligns with contemporary sustainability goals in infrastructure management. Full article
(This article belongs to the Section Chemical, Civil and Environmental Engineering)
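The baseline against which GEP is compared is a simple regression from IRI to PCI. A minimal sketch of such a fit on synthetic data (the slope, noise level, and sample values below are invented; the study's 401 field samples are not reproduced here):

```python
import numpy as np

# Illustrative only: fit PCI ~ a*IRI + b on synthetic data and report R^2,
# the metric the paper uses to compare GEP against linear regression.
rng = np.random.default_rng(0)
iri = rng.uniform(1.0, 12.0, 100)                # roughness, m/km (synthetic)
pci = 100 - 7.5 * iri + rng.normal(0, 4, 100)    # synthetic ground truth
a, b = np.polyfit(iri, pci, 1)
pred = a * iri + b
r2 = 1 - np.sum((pci - pred) ** 2) / np.sum((pci - pci.mean()) ** 2)
print(round(a, 1), round(r2, 2))
```

GEP goes beyond this by evolving the functional form itself (products, powers, conditionals) rather than assuming linearity, which is where its reported generalization advantage comes from.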

20 pages, 19537 KiB  
Article
Submarine Topography Classification Using ConDenseNet with Label Smoothing Regularization
by Jingyan Zhang, Kongwen Zhang and Jiangtao Liu
Remote Sens. 2025, 17(15), 2686; https://doi.org/10.3390/rs17152686 (registering DOI) - 3 Aug 2025
Abstract
The classification of submarine topography and geomorphology is essential for marine resource exploitation and ocean engineering, with wide-ranging implications in marine geology, disaster assessment, resource exploration, and autonomous underwater navigation. Submarine landscapes are highly complex and diverse. Traditional visual interpretation methods are not only inefficient and subjective but also lack the precision required for high-accuracy classification. While many machine learning and deep learning models have achieved promising results in image classification, limited work has been performed on integrating backscatter and bathymetric data for multi-source processing. Existing approaches often suffer from high computational costs and excessive hyperparameter demands. In this study, we propose a novel approach that integrates pruning-enhanced ConDenseNet with label smoothing regularization to reduce misclassification, strengthen the cross-entropy loss function, and significantly lower model complexity. Our method improves classification accuracy by 2% to 10%, reduces the number of hyperparameters by 50% to 96%, and cuts computation time by 50% to 85.5% compared to state-of-the-art models, including AlexNet, VGG, ResNet, and Vision Transformer. These results demonstrate the effectiveness and efficiency of our model for multi-source submarine topography classification. Full article
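Label smoothing regularization, one of the two ingredients named above, mixes the one-hot target with a uniform distribution so the cross-entropy loss penalizes over-confident predictions. A minimal NumPy sketch (eps=0.1 is a common default, not necessarily the paper's setting):

```python
import numpy as np

def smoothed_cross_entropy(logits, target, eps=0.1):
    """Cross-entropy with label smoothing: the one-hot target is blended
    with a uniform distribution over the k classes."""
    k = logits.shape[-1]
    z = logits - logits.max()               # numerical stability
    log_p = z - np.log(np.exp(z).sum())     # log-softmax
    smooth = np.full(k, eps / k)
    smooth[target] += 1.0 - eps
    return float(-(smooth * log_p).sum())

logits = np.array([4.0, 1.0, 0.5])
hard = smoothed_cross_entropy(logits, 0, eps=0.0)   # plain cross-entropy
soft = smoothed_cross_entropy(logits, 0, eps=0.1)
print(hard < soft)  # smoothing adds loss when the model is very confident
```

Because the smoothed target never reaches probability 1, the optimal logits stay finite, which is what reduces misclassification on ambiguous seabed classes.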

18 pages, 5178 KiB  
Article
Quantification of Suspended Sediment Concentration Using Laboratory Experimental Data and Machine Learning Model
by Sathvik Reddy Nookala, Jennifer G. Duan, Kun Qi, Jason Pacheco and Sen He
Water 2025, 17(15), 2301; https://doi.org/10.3390/w17152301 (registering DOI) - 2 Aug 2025
Abstract
Monitoring sediment concentration in water bodies is crucial for assessing water quality, ecosystems, and environmental health. However, physical sampling and sensor-based approaches are labor-intensive and unsuitable for large-scale, continuous monitoring. This study employs machine learning models to estimate suspended sediment concentration (SSC) using images captured under natural-light (RGB) and near-infrared (NIR) conditions. A controlled dataset of approximately 1300 images with SSC values ranging from 1000 mg/L to 150,000 mg/L was developed, incorporating temperature, time of image capture, and solar irradiance as additional features. Random forest regression and gradient boosting regression were trained on mean RGB values, red reflectance, time of capture, and temperature for natural-light images, achieving up to 72.96% accuracy within a 30% relative error. In contrast, NIR images leveraged gray-level co-occurrence matrix texture features and temperature, reaching 83.08% accuracy. Comparative analysis showed that ensemble models outperformed deep learning models like Convolutional Neural Networks and Multi-Layer Perceptrons, which struggled with high-dimensional feature extraction. These findings suggest that using machine learning models and RGB and NIR imagery offers a scalable, non-invasive, and cost-effective means of sediment monitoring in support of water quality assessment and environmental management. Full article
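The gray-level co-occurrence matrix (GLCM) features used for the NIR images can be sketched compactly: count how often each pair of gray levels appears in adjacent pixels, then derive a texture statistic such as contrast. This toy version uses a single horizontal offset on tiny synthetic patches; real pipelines typically use a library routine (e.g. scikit-image's graycomatrix) with several offsets and angles.

```python
import numpy as np

def glcm_contrast(img, levels=4):
    """GLCM over horizontally adjacent pixels, plus the 'contrast' feature.
    Illustrative sketch; img must hold integer gray levels in [0, levels)."""
    m = np.zeros((levels, levels))
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[i, j] += 1
    m /= m.sum()                              # normalize to a distribution
    idx = np.arange(levels)
    return float(((idx[:, None] - idx[None, :]) ** 2 * m).sum())

flat = np.zeros((4, 4), dtype=int)            # uniform patch: no texture
noisy = np.array([[0, 3], [3, 0]] * 2)        # alternating bright/dark
print(glcm_contrast(flat), glcm_contrast(noisy))  # 0.0 vs 9.0
```

High contrast reflects rapid local intensity changes, which in this application correlates with suspended-particle texture in the NIR imagery.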

11 pages, 690 KiB  
Article
Leadless Pacemaker Implantation During Extraction in Patients with Active Infection: A Comprehensive Analysis of Safety, Patient Benefits and Costs
by Aviv Solomon, Maor Tzuberi, Anat Berkovitch, Eran Hoch, Roy Beinart and Eyal Nof
J. Clin. Med. 2025, 14(15), 5450; https://doi.org/10.3390/jcm14155450 (registering DOI) - 2 Aug 2025
Abstract
Background: Cardiac implantable electronic device (CIED) infections necessitate extraction and subsequent pacing interventions. Conventional methods after removing the infected CIED system involve temporary or semi-permanent pacing followed by delayed permanent pacemaker (PPM) implantation. Leadless pacemakers (LPs) may offer an alternative, allowing immediate PPM implantation without increasing infection risks. Our objective is to evaluate the safety and cost-effectiveness of LP implantation during the same procedure of CIED extraction, compared to conventional two-stage approaches. Methods: Pacemaker-dependent patients with systemic or pocket infection undergoing device extraction and LP implantation during the same procedure at Sheba Medical Center, Israel, were compared to a historical group of patients undergoing a semi-permanent (SP) pacemaker implantation during the procedure, followed by a permanent pacemaker implantation. Results: The cohort included 87 patients, 45 undergoing LP implantation and 42 SP implantation during the extraction procedure. The LP group demonstrated shorter intensive care unit stay (1 ± 3 days vs. 7 ± 12 days, p < 0.001) and overall hospital days (11 ± 24 days vs. 17 ± 17 days, p < 0.001). Rates of infection relapse and one-year mortality were comparable between groups. Economic analysis revealed comparable total costs, despite the higher initial expense of LPs. Conclusions: LP implantation during CIED extraction offers significant clinical and logistical advantages, including reduced hospital stays and streamlined treatment, with comparable safety and cost-effectiveness to conventional approaches. Full article
(This article belongs to the Section Cardiology)
25 pages, 2100 KiB  
Article
Flexible Demand Side Management in Smart Cities: Integrating Diverse User Profiles and Multiple Objectives
by Nuno Souza e Silva and Paulo Ferrão
Energies 2025, 18(15), 4107; https://doi.org/10.3390/en18154107 (registering DOI) - 2 Aug 2025
Abstract
Demand Side Management (DSM) plays a crucial role in modern energy systems, enabling more efficient use of energy resources and contributing to the sustainability of the power grid. This study examines DSM strategies within a multi-environment context encompassing residential, commercial, and industrial sectors, with a focus on diverse appliance types that exhibit distinct operational characteristics and user preferences. Initially, a single-objective optimization approach using Genetic Algorithms (GAs) is employed to minimize the total energy cost under a real Time-of-Use (ToU) pricing scheme. This heuristic method allows for the effective scheduling of appliance operations while factoring in their unique characteristics such as power consumption, usage duration, and user-defined operational flexibility. This study extends the optimization problem to a multi-objective framework that incorporates the minimization of CO2 emissions under a real annual energy mix while also accounting for user discomfort. The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is utilized for this purpose, providing a Pareto-optimal set of solutions that balances these competing objectives. The inclusion of multiple objectives ensures a comprehensive assessment of DSM strategies, aiming to reduce environmental impact and enhance user satisfaction. Additionally, this study monitors the Peak-to-Average Ratio (PAR) to evaluate the impact of DSM strategies on load balancing and grid stability. It also analyzes the impact of considering different periods of the year with the associated ToU hourly schedule and CO2 emissions hourly profile. A key innovation of this research is the integration of detailed, category-specific metrics that enable the disaggregation of costs, emissions, and user discomfort across residential, commercial, and industrial appliances. 
This granularity enables stakeholders to implement tailored strategies that align with specific operational goals and regulatory compliance. Also, the emphasis on a user discomfort indicator allows us to explore the flexibility available in such DSM mechanisms. The results demonstrate the effectiveness of the proposed multi-objective optimization approach in achieving significant cost savings that may reach 20% for industrial applications, while the order of magnitude of the trade-offs involved in terms of emissions reduction, improvement in discomfort, and PAR reduction is quantified for different frameworks. The outcomes not only underscore the efficacy of applying advanced optimization frameworks to real-world problems but also point to pathways for future research in smart energy management. This comprehensive analysis highlights the potential of advanced DSM techniques to enhance the sustainability and resilience of energy systems while also offering valuable policy implications. Full article
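NSGA-II returns a Pareto-optimal set: solutions not dominated by any other on all objectives at once. The dominance test at the heart of that idea fits in a few lines; the cost/emissions/discomfort points below are made up for illustration (all objectives minimized).

```python
# Minimal Pareto filter for a cost / emissions / discomfort trade-off,
# the kind of non-dominated set NSGA-II searches for. Candidate values
# are invented for illustration.
def non_dominated(points):
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (cost EUR, kg CO2, discomfort score)
candidates = [(100, 50, 1), (90, 60, 2), (120, 55, 1), (90, 45, 3)]
front = non_dominated(candidates)
print(front)  # (120, 55, 1) is dominated by (100, 50, 1) and is dropped
```

NSGA-II itself adds genetic search and crowding-distance selection on top of this test so the front stays well spread across the competing objectives.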

24 pages, 1396 KiB  
Article
Design of Experiments Leads to Scalable Analgesic Near-Infrared Fluorescent Coconut Nanoemulsions
by Amit Chandra Das, Gayathri Aparnasai Reddy, Shekh Md. Newaj, Smith Patel, Riddhi Vichare, Lu Liu and Jelena M. Janjic
Pharmaceutics 2025, 17(8), 1010; https://doi.org/10.3390/pharmaceutics17081010 (registering DOI) - 1 Aug 2025
Abstract
Background: Pain is a complex phenomenon characterized by unpleasant experiences with profound heterogeneity influenced by biological, psychological, and social factors. According to the National Health Interview Survey, 50.2 million U.S. adults (20.5%) experience pain on most days, with the annual cost of prescription medication for pain reaching approximately USD 17.8 billion. Theranostic pain nanomedicine therefore emerges as an attractive analgesic strategy with the potential for increased efficacy, reduced side-effects, and treatment personalization. Theranostic nanomedicine combines drug delivery and diagnostic features, allowing for real-time monitoring of analgesic efficacy in vivo using molecular imaging. However, clinical translation of these nanomedicines is challenging due to complex manufacturing methodologies, lack of standardized quality control, and potentially high costs. Quality by Design (QbD) can navigate these challenges and lead to the development of an optimal pain nanomedicine. Our lab previously reported a macrophage-targeted perfluorocarbon nanoemulsion (PFC NE) that demonstrated analgesic efficacy across multiple rodent pain models in both sexes. Here, we report PFC-free, biphasic nanoemulsions formulated with a biocompatible and non-immunogenic plant-based coconut oil loaded with a COX-2 inhibitor and a clinical-grade indocyanine green (ICG) near-infrared fluorescent (NIRF) dye for parenteral theranostic analgesic nanomedicine. Methods: Critical process parameters and material attributes were identified through the FMECA (Failure Modes, Effects, and Criticality Analysis) method and optimized using a 3 × 2 full-factorial design of experiments. We investigated the impact of the oil-to-surfactant ratio (w/w) with three different surfactant systems on the colloidal properties of the NE.
Small-scale (100 mL) batches were manufactured using sonication and microfluidization, and the final formulation was scaled up to 500 mL with microfluidization. The colloidal stability of the NE was assessed using dynamic light scattering (DLS), and drug quantification was conducted through reverse-phase HPLC. An in vitro drug release study was conducted using the dialysis bag method, accompanied by HPLC quantification. The formulation was further evaluated for cell viability, cellular uptake, and COX-2 inhibition in the RAW 264.7 macrophage cell line. Results: Nanoemulsion droplet size increased with a higher oil-to-surfactant ratio (w/w) but was not significantly affected by the type of surfactant system used. Thermal cycling and serum stability studies confirmed NE colloidal stability upon exposure to high and low temperatures and biological fluids. We also demonstrated the necessity of a solubilizer for long-term fluorescence stability of ICG. The nanoemulsion showed no cellular toxicity and effectively inhibited PGE2 in activated macrophages. Conclusions: To our knowledge, this is the first instance of a celecoxib-loaded theranostic platform developed using a plant-derived hydrocarbon oil, applying the QbD approach that demonstrated COX-2 inhibition. Full article
(This article belongs to the Special Issue Quality by Design in Pharmaceutical Manufacturing)
16 pages, 2640 KiB  
Article
Reactive Aerosol Jet Printing of Ag Nanoparticles: A New Tool for SERS Substrate Preparation
by Eugenio Gibertini, Lydia Federica Gervasini, Jody Albertazzi, Lorenzo Maria Facchetti, Matteo Tommasini, Valentina Busini and Luca Magagnin
Coatings 2025, 15(8), 900; https://doi.org/10.3390/coatings15080900 (registering DOI) - 1 Aug 2025
Abstract
The detection of trace chemicals at low and ultra-low concentrations is critical for applications in environmental monitoring, medical diagnostics, food safety and other fields. Conventional detection techniques often lack the required sensitivity, specificity, or cost-effectiveness, making real-time, in situ analysis challenging. Surface-enhanced Raman spectroscopy (SERS) is a powerful analytical tool, offering improved sensitivity through the enhancement of Raman scattering by plasmonic nanostructures. While noble metals such as Ag and Au are currently the reference choices for SERS substrates, fabrication methods should balance enhancement efficiency, reproducibility and scalability. In this study, we propose a novel approach for SERS substrate fabrication using reactive Aerosol Jet Printing (r-AJP) as an innovative additive manufacturing technique. The r-AJP process enables in-flight Ag seed reduction and nucleation of Ag nanoparticles (NPs) by mixing silver nitrate and ascorbic acid aerosols before deposition, as suggested by computational fluid dynamics (CFD) simulations. The resulting coatings were characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM) analyses, revealing the formation of nanoporous crystalline Ag agglomerates partially covered by residual matter. The as-prepared SERS substrates exhibited remarkable SERS activity, demonstrating a high enhancement factor (10⁶) for rhodamine 6G (R6G) detection. Our findings highlight the potential of r-AJP as a scalable and cost-effective fabrication strategy for next-generation SERS sensors, paving the way for the development of a new additive manufacturing tool for noble metal material deposition. Full article
(This article belongs to the Section Surface Characterization, Deposition and Modification)
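An enhancement factor of the order quoted above is commonly estimated as EF = (I_SERS / N_SERS) / (I_ref / N_ref): signal per molecule on the substrate versus in a bulk reference. Whether this exact estimator was used here is an assumption; the intensities and molecule counts below are invented purely to show the arithmetic.

```python
# EF = signal per probed molecule on the SERS substrate, divided by
# signal per molecule in a non-enhanced reference measurement.
# All numbers below are hypothetical.
def enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    return (i_sers / n_sers) / (i_ref / n_ref)

ef = enhancement_factor(i_sers=5e4, n_sers=1e6, i_ref=1e3, n_ref=2e10)
print(f"{ef:.1e}")  # 1.0e+06, i.e. the ~10^6 regime reported for R6G
```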
25 pages, 2859 KiB  
Article
Feature-Based Normality Models for Anomaly Detection
by Hui Yie Teh, Kevin I-Kai Wang and Andreas W. Kempa-Liehr
Sensors 2025, 25(15), 4757; https://doi.org/10.3390/s25154757 (registering DOI) - 1 Aug 2025
Abstract
Detecting previously unseen anomalies in sensor data is a challenging problem for artificial intelligence when sensor-specific and deployment-specific characteristics of the time series need to be learned from a short calibration period. From the application point of view, this challenge becomes increasingly important because many applications are gravitating towards utilising low-cost sensors for Internet of Things deployments. While these sensors offer cost-effectiveness and customisation, their data quality does not match that of their high-end counterparts. To improve sensor data quality while addressing the challenges of anomaly detection in Internet of Things applications, we present an anomaly detection framework that learns a normality model of sensor data. The framework models the typical behaviour of individual sensors, which is crucial for the reliable detection of sensor data anomalies, especially when dealing with sensors observing significantly different signal characteristics. Our framework learns sensor-specific normality models from a small set of anomaly-free training data while employing an unsupervised feature engineering approach to select statistically significant features. The selected features are subsequently used to train a Local Outlier Factor anomaly detection model, which adaptively determines the boundary separating normal data from anomalies. The proposed anomaly detection framework is evaluated on three real-world public environmental monitoring datasets with heterogeneous sensor readings. The sensor-specific normality models are learned from extremely short calibration periods (as short as the first 3 days or 10% of the total recorded data) and outperform four other state-of-the-art anomaly detection approaches with respect to F1-score (between 5.4% and 9.3% better) and Matthews correlation coefficient (between 4.0% and 7.6% better). Full article
(This article belongs to the Special Issue Innovative Approaches to Cybersecurity for IoT and Wireless Networks)
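The pipeline this abstract describes (learn a sensor-specific normality model from a short anomaly-free calibration window, then let a Local Outlier Factor boundary flag deviations) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window length, the four statistical features, and the `n_neighbors` value are assumptions chosen for the sketch.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def rolling_features(series, window=24):
    """Simple per-window statistics as features (illustrative choice)."""
    feats = []
    for i in range(window, len(series)):
        w = series[i - window:i]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.asarray(feats)

def fit_normality_model(calibration, window=24, n_neighbors=20):
    """Learn a sensor-specific normality model from anomaly-free data only."""
    X = rolling_features(calibration, window)
    # novelty=True lets the fitted LOF model score unseen samples.
    model = LocalOutlierFactor(n_neighbors=n_neighbors, novelty=True)
    model.fit(X)
    return model

def detect_anomalies(model, stream, window=24):
    """Boolean mask over the stream: True where LOF flags an anomaly."""
    X = rolling_features(stream, window)
    return model.predict(X) == -1
```

In use, one model would be fitted per sensor from its own short calibration period, so each boundary adapts to that sensor's signal characteristics.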

20 pages, 865 KiB  
Review
Barriers and Facilitators to Artificial Intelligence Implementation in Diabetes Management from Healthcare Workers’ Perspective: A Scoping Review
by Giovanni Cangelosi, Andrea Conti, Gabriele Caggianelli, Massimiliano Panella, Fabio Petrelli, Stefano Mancin, Matteo Ratti and Alice Masini
Medicina 2025, 61(8), 1403; https://doi.org/10.3390/medicina61081403 - 1 Aug 2025
Abstract
Background and Objectives: Diabetes is a global public health challenge, with increasing prevalence worldwide. The implementation of artificial intelligence (AI) in the management of this condition offers potential benefits in improving healthcare outcomes. This study primarily investigates the barriers and facilitators perceived by healthcare professionals in the adoption of AI. Secondarily, by analyzing both quantitative and qualitative data collected, it aims to support the potential development of AI-based programs for diabetes management, with particular focus on a possible bottom-up approach. Materials and Methods: A scoping review was conducted following PRISMA-ScR guidelines for reporting and registered in the Open Science Framework (OSF) database. The study selection process was conducted in two phases—title/abstract screening and full-text review—independently by three researchers, with a fourth resolving conflicts. Data were extracted and assessed using Joanna Briggs Institute (JBI) tools. The included studies were synthesized narratively, combining both quantitative and qualitative analyses to ensure methodological rigor and contextual depth. Results: The adoption of AI tools in diabetes management is influenced by several barriers, including perceived unsatisfactory clinical performance, high costs, issues related to data security and decision-making transparency, as well as limited training among healthcare workers. Key facilitators include improved clinical efficiency, ease of use, time-saving, and organizational support, which contribute to broader acceptance of the technology. Conclusions: The active and continuous involvement of healthcare workers represents a valuable opportunity to develop more effective, reliable, and well-integrated AI solutions in clinical practice. Our findings emphasize the importance of a bottom-up approach and highlight how adequate training and organizational support can help overcome existing barriers, promoting sustainable and equitable innovation aligned with public health priorities.
(This article belongs to the Special Issue Advances in Public Health and Healthcare Management for Chronic Care)
14 pages, 2795 KiB  
Article
Obtaining Rotational Stiffness of Wind Turbine Foundation from Acceleration and Wind Speed SCADA Data
by Jiazhi Dai, Mario Rotea and Nasser Kehtarnavaz
Sensors 2025, 25(15), 4756; https://doi.org/10.3390/s25154756 - 1 Aug 2025
Abstract
Monitoring the health of wind turbine foundations is essential for ensuring their operational safety. This paper presents a cost-effective approach to obtaining the rotational stiffness of wind turbine foundations using only the acceleration and wind speed measurements already present in SCADA data, thus reducing reliance on the moment and tilt sensors currently used to obtain foundation stiffness. First, a convolutional neural network model is applied to map acceleration and wind speed data within a moving window to the corresponding moment and tilt values. The rotational stiffness of the foundation is then estimated by fitting a line in the moment-tilt plane. The results indicate that such a mapping model can provide stiffness values that are, on average, within 7% of ground-truth stiffness values. Second, the developed mapping model is re-trained using synthetic acceleration and wind speed data generated by an autoencoder-based generative AI network. The results indicate that although the exact magnitude of a stiffness drop cannot be determined, the drops themselves can be detected. This mapping model can be used not only to lower the cost of obtaining foundation rotational stiffness but also to sound an alarm when a foundation starts deteriorating.
(This article belongs to the Special Issue Sensors Technology Applied in Power Systems and Energy Management)
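The slope-fit step of this approach can be illustrated with a minimal numpy sketch: once a mapping model has produced moment and tilt series, the rotational stiffness is simply the slope of the least-squares line in the moment-tilt plane. The synthetic values below are invented for illustration and are not the paper's data.

```python
import numpy as np

def estimate_rotational_stiffness(tilt_rad, moment_nm):
    """Fit moment = k * tilt + c by least squares; the slope k is the
    foundation's rotational stiffness (in Nm/rad)."""
    slope, _intercept = np.polyfit(tilt_rad, moment_nm, deg=1)
    return slope
```

A drop in the fitted slope between successive monitoring windows would then serve as the deterioration signal described in the abstract.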

20 pages, 3027 KiB  
Article
Evolutionary Game Analysis of Multi-Agent Synergistic Incentives Driving Green Energy Market Expansion
by Yanping Yang, Xuan Yu and Bojun Wang
Sustainability 2025, 17(15), 7002; https://doi.org/10.3390/su17157002 - 1 Aug 2025
Abstract
Achieving the construction sector’s dual carbon objectives necessitates scaling green energy adoption in new residential buildings. The current literature critically overlooks four unresolved problems: oversimplified penalty mechanisms, ignoring escalating regulatory costs; static subsidies misaligned with market maturity evolution; systematic exclusion of innovation feedback from energy suppliers; and underexplored behavioral evolution of building owners. This study establishes a government–suppliers–owners evolutionary game framework with dynamically calibrated policies, simulated using MATLAB multi-scenario analysis. Novel findings demonstrate: (1) A dual-threshold penalty effect where excessive fines diminish policy returns due to regulatory costs, requiring dynamic calibration distinct from fixed-penalty approaches; (2) Market-maturity-phased subsidies increasing owner adoption probability by 30% through staged progression; (3) Energy suppliers’ cost-reducing innovations as pivotal feedback drivers resolving coordination failures, overlooked in prior tripartite models; (4) Owners’ adoption motivation shifts from short-term economic incentives to environmentally driven decisions under policy guidance. The framework resolves these gaps through integrated dynamic mechanisms, providing policymakers with evidence-based regulatory thresholds, energy suppliers with cost-reduction targets, and academia with replicable modeling tools.
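The tripartite game in this study is simulated in MATLAB; the core mechanism can nevertheless be conveyed with a single-population replicator sketch, where the fraction of owners adopting green energy grows whenever adoption pays better than rejection. The payoffs, step size, and Euler integration here are illustrative assumptions, not the paper's model.

```python
def replicator_step(x, payoff_adopt, payoff_reject, dt=0.01):
    """One Euler step of the replicator equation
    dx/dt = x * (1 - x) * (pi_adopt - pi_reject),
    where x is the fraction of owners choosing adoption."""
    return x + dt * x * (1.0 - x) * (payoff_adopt - payoff_reject)

def simulate_adoption(x0, payoff_adopt, payoff_reject, steps=2000, dt=0.01):
    """Iterate the replicator dynamics and return the final adoption share."""
    x = x0
    for _ in range(steps):
        x = replicator_step(x, payoff_adopt, payoff_reject, dt)
    return x
```

Phased subsidies and calibrated penalties enter such a model by shifting the payoff gap over time, which is how the staged progression described in finding (2) steers the population toward adoption.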
