Search Results (8,727)

Search Parameters:
Keywords = Internet-of-things (IoT)

28 pages, 5125 KB  
Article
Dual-Branch Hyperspectral Open-Set Classification with Reconstruction–Prototype Fusion for Satellite IoT Perception
by Jialing Tang, Shengwei Lei, Jingqi Liu, Ning Lv and Haibin Qi
Remote Sens. 2025, 17(22), 3722; https://doi.org/10.3390/rs17223722 - 14 Nov 2025
Abstract
The satellite Internet of Things (SatIoT) enables real-time acquisition and large-scale coverage of hyperspectral imagery, providing essential data support for decision-making in domains such as geological exploration, environmental monitoring, and urban management. Hyperspectral remote sensing classification constitutes a critical component of intelligent applications driven by the SatIoT, yet it faces two major challenges: the massive data volume imposes heavy storage and processing burdens on conventional satellite systems, while dimensionality reduction often compromises classification accuracy; furthermore, mainstream neural network models are constrained by insufficient labeled data and spectral shifts, frequently leading to misclassification of unknown categories and degradation of cross-regional performance. To address these issues, this study proposes an open-set hyperspectral classification method with dual branches of reconstruction and prototype-based classification. Specifically, building upon an autoencoder, we design a spectral–spatial attention module and an information residual connection module that accurately capture spectral–spatial features, improving the reconstruction accuracy of known classes while adapting to the high-dimensional characteristics of satellite data. Prototype representations of unknown classes are constructed by incorporating classification confidence, enabling effective separation in the feature space and targeted recognition of unknown categories in complex scenarios. By jointly leveraging prototype distance and reconstruction error, the proposed method achieves synergistic improvement in both accurate classification of known classes and reliable detection of unknown ones.
Comparative experiments and visualization analyses on three publicly available datasets (Salinas-A, PaviaU, and Dioni) demonstrate that the proposed approach significantly outperforms baseline methods such as MDL4OW and IADMRN in terms of unknown detection rate (UDR), open-set overall accuracy (OpenOA), and open-set F1 score, while on the Salinas-A dataset, the performance gap between closed-set and open-set classification is as small as 1.82%, highlighting superior robustness. Full article
27 pages, 915 KB  
Systematic Review
The Impact of IoT-Enabled Routing Optimization on Waste Collection Distance: A Systematic Review and Meta-Analysis
by Rafael R. Maciel, Adler Diniz de Souza, Rodrigo M. A. Almeida and João Paulo R. R. Leite
Logistics 2025, 9(4), 161; https://doi.org/10.3390/logistics9040161 - 14 Nov 2025
Abstract
Background: Waste collection is a critical logistical challenge in urban management, and while Internet of Things (IoT) technologies are increasingly used to optimize collection routes, a systematic, quantitative synthesis of their impact is lacking. This study aims to bridge this gap by quantifying the effect of IoT-enabled routing optimization on waste collection distances. Methods: We conducted a systematic review and meta-analysis following the PRISMA protocol, searching the Scopus, IEEE Xplore, and ACM Digital Library databases. This process yielded 11 eligible studies, providing 21 distinct samples for quantitative synthesis. Results: The analysis reveals that IoT-enabled routing optimization reduces collection distance by a combined average of 21.51%. A significant disparity was found between study types, with simulation-based approaches reporting higher reductions (−39.79%) compared to real-world deployments (−12.37%). No statistically significant performance differences were observed across different routing algorithm categories or Vehicle Routing Problem (VRP) variants. Conclusions: These findings provide robust quantitative evidence of the significant efficiency gains from implementing IoT-based smart waste management systems. The gap between simulated and real-world results underscores the need for practitioners to set realistic expectations, while our analysis supports the adoption of these technologies for more sustainable urban logistics. Full article
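The pooled-effect arithmetic behind such a synthesis can be illustrated with a minimal sketch. The per-sample distance reductions below are hypothetical stand-ins, not the 21 samples from the review, and a simple unweighted mean replaces the weighted model a real meta-analysis would use:

```python
# Hypothetical per-sample distance reductions (%), split by study type;
# values are illustrative, not the review's data.
simulation = [-45.0, -38.5, -35.9]   # simulation-based studies
real_world = [-15.2, -11.8, -10.1]   # real-world deployments

def mean(xs):
    return sum(xs) / len(xs)

pooled = mean(simulation + real_world)          # combined average reduction
gap = mean(simulation) - mean(real_world)       # simulation vs. deployment disparity
print(round(pooled, 2), round(mean(simulation), 2), round(mean(real_world), 2))
```

A real synthesis would weight each sample (e.g., by inverse variance) before pooling, so the combined figure here is only illustrative of the calculation, not of the review's 21.51% estimate.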
23 pages, 1177 KB  
Review
A Survey on Privacy Preservation Techniques in IoT Systems
by Rupinder Kaur, Tiago Rodrigues, Nourin Kadir and Rasha Kashef
Sensors 2025, 25(22), 6967; https://doi.org/10.3390/s25226967 - 14 Nov 2025
Abstract
The Internet of Things (IoT) has become deeply embedded in modern society, enabling applications across smart homes, healthcare, industrial automation, and environmental monitoring. However, as billions of interconnected devices continuously collect and exchange sensitive data, privacy and security concerns have escalated. This survey systematically reviews the state-of-the-art privacy-preserving techniques in IoT systems, emphasizing approaches that protect user data during collection, transmission, and storage. Peer-reviewed studies from 2016 to 2025 and technical reports were analyzed to examine applied mechanisms, datasets, and analytical models. Our analysis shows that blockchain and federated learning are the most prevalent decentralized privacy-preserving methods, while homomorphic encryption and differential privacy have recently gained traction for lightweight and edge-based IoT implementations. Despite these advancements, challenges persist, including computational overhead, limited scalability, and real-time performance constraints in resource-constrained devices. Furthermore, gaps remain in cross-domain interoperability, energy-efficient cryptographic designs, and privacy solutions for Unmanned Aerial Vehicle (UAV) and vehicular IoT systems. This survey offers a comprehensive overview of current research trends, identifies critical limitations, and outlines promising future directions to guide the design of secure and privacy-aware IoT architectures. Full article
(This article belongs to the Special Issue Security and Privacy in Wireless Sensor Networks (WSNs))
32 pages, 4190 KB  
Article
AegisGuard: A Multi-Stage Hybrid Intrusion Detection System with Optimized Feature Selection for Industrial IoT Security
by Mounir Mohammad Abou Elasaad, Samir G. Sayed and Mohamed M. El-Dakroury
Sensors 2025, 25(22), 6958; https://doi.org/10.3390/s25226958 - 14 Nov 2025
Abstract
The rapid expansion of the Industrial Internet of Things (IIoT) within smart grid infrastructures has increased the risk of sophisticated cyberattacks, where severe class imbalance and stringent real-time requirements continue to hinder the effectiveness of conventional intrusion detection systems (IDSs). Existing approaches often achieve high accuracy on specific datasets but lack generalizability, interpretability, and stability when deployed across heterogeneous IIoT environments. This paper introduces AegisGuard, a hybrid intrusion detection framework that integrates an adaptive four-stage sampling process with a calibrated ensemble learning strategy. The sampling module dynamically combines SMOTE, SMOTE-ENN, ADASYN, and controlled undersampling to mitigate the extreme imbalance between benign and malicious traffic. A quantum-inspired feature selection mechanism then fuses statistical, informational, and model-based significance measures through a trust-aware weighting scheme to retain only the most discriminative attributes. The optimized ensemble, comprising Random Forest, Extra Trees, LightGBM, XGBoost, and CatBoost, undergoes Optuna-based hyperparameter tuning and post-training probability calibration to minimize false alarms while preserving accuracy. Experimental evaluation on four benchmark datasets demonstrates the robustness and scalability of AegisGuard. On the CIC-IoT 2023 dataset, it achieves 99.6% accuracy and a false alarm rate of 0.31%, while maintaining comparable performance on TON-IoT (98.3%), UNSW-NB15 (98.4%), and Bot-IoT (99.4%). The proposed framework reduces feature dimensionality by 54% and memory usage by 65%, enabling near real-time inference (0.42 s per sample) suitable for operational IIoT environments. Full article
(This article belongs to the Section Internet of Things)
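The trust-aware fusion of statistical, informational, and model-based feature scores described in the abstract can be sketched as a weighted sum followed by top-k selection. The feature names, scores, and trust weights below are hypothetical, and the paper's quantum-inspired mechanism is certainly more elaborate:

```python
# Sketch: fuse three per-feature relevance scores with trust weights, keep top-k.
# All names and numbers are illustrative, not from the paper.
features = ["pkt_rate", "flow_dur", "ttl_var", "syn_ratio"]
scores = {
    "statistical": [0.9, 0.2, 0.4, 0.8],    # e.g., normalized ANOVA F-scores
    "informational": [0.7, 0.1, 0.5, 0.9],  # e.g., normalized mutual information
    "model_based": [0.8, 0.3, 0.2, 0.7],    # e.g., normalized tree importances
}
trust = {"statistical": 0.3, "informational": 0.3, "model_based": 0.4}

fused = [
    sum(trust[m] * scores[m][i] for m in scores)  # trust-weighted combination
    for i in range(len(features))
]
top2 = sorted(zip(features, fused), key=lambda kv: kv[1], reverse=True)[:2]
print(top2)
```

The point of such a fusion is that a feature must score well under several independent significance measures, weighted by how much each measure is trusted, before it survives selection.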
26 pages, 7162 KB  
Article
A Reconfigurable Channel Receiver Employing Free-Running Oscillator and Frequency Estimation for IoT Applications
by Meng Liu
Electronics 2025, 14(22), 4435; https://doi.org/10.3390/electronics14224435 - 13 Nov 2025
Abstract
The rapid development of the Internet of Things (IoT) has imposed increasingly stringent power consumption requirements on receiver design. Unlike phase-locked loops (PLLs), free-running oscillators eliminate power-hungry loop circuitry. However, the inherent frequency offset of free-running oscillators introduces uncertainty in the intermediate frequency (IF), preventing the receiver from aligning with the desired channel. To address this, we present a reconfigurable channel receiver employing a free-running oscillator and frequency estimation for low-power IoT applications. The proposed receiver first captures a specific preamble sequence corresponding to the desired channel through multiple parallel sub-channels implemented in the digital baseband (DBB), which collectively cover the expected IF frequency range. The desired IF frequency is estimated using the proposed preamble-based frequency estimation (PBFE) algorithm. After frequency estimation, the receiver switches to a single-channel mode and tunes its passband center frequency to the estimated IF frequency, enabling high-sensitivity data reception. Measurement results demonstrate that the PBFE algorithm achieves reliable frequency estimation with a minimum IF signal-to-noise ratio (SNR) of 2 dB and an estimation error below 22 kHz. In single-channel mode, with a residual frequency offset of 30 kHz, an 8-point energy accumulation decoding scheme achieves a bit error rate (BER) of 10⁻³ at an IF SNR of 5.2 dB. Compared with the case of the original 50 kHz IF frequency offset, the required SNR is improved by 4.1 dB. Full article
(This article belongs to the Section Circuit and Signal Processing)
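The underlying idea of estimating an uncertain IF by searching candidate frequencies, which the PBFE algorithm refines using a known preamble, can be sketched with a brute-force tone correlation. The sample rate, IF value, and search grid below are illustrative assumptions, not the paper's parameters:

```python
import cmath
import math

# Sketch: locate an uncertain IF tone by correlating the received preamble
# against candidate frequencies and picking the strongest match.
fs = 1_000_000        # sample rate in Hz (hypothetical)
true_if = 452_000     # actual IF after free-running-LO mixing (hypothetical)
n = 512
x = [math.cos(2 * math.pi * true_if * k / fs) for k in range(n)]  # noiseless preamble tone

def tone_energy(sig, f):
    """Magnitude of the correlation of sig with a complex tone at frequency f."""
    acc = sum(s * cmath.exp(-2j * math.pi * f * k / fs) for k, s in enumerate(sig))
    return abs(acc)

candidates = range(400_000, 500_001, 1_000)  # 1 kHz grid over the expected IF range
est = max(candidates, key=lambda f: tone_energy(x, f))
print(est)
```

With noise added, the estimate degrades gracefully; the paper's parallel sub-channel DBB structure effectively performs this search in hardware before handing off to single-channel reception.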
20 pages, 4080 KB  
Article
From Street Canyons to Corridors: Adapting Urban Propagation Models for an Indoor IQRF Network
by Talip Eren Doyan, Bengisu Yalcinkaya, Deren Dogan, Yaser Dalveren and Mohammad Derawi
Sensors 2025, 25(22), 6950; https://doi.org/10.3390/s25226950 - 13 Nov 2025
Abstract
Among wireless communication technologies underlying Internet of Things (IoT)-based smart buildings, IQRF (Intelligent Connectivity Using Radio Frequency) technology is a promising candidate due to its low power consumption, cost-effectiveness, and wide coverage. However, effectively modeling the propagation characteristics of IQRF in complex indoor environments for simple and accurate network deployment remains challenging, as architectural elements like walls and corners cause substantial signal attenuation and unpredictable propagation behavior. This study investigates the applicability of a site-specific modeling approach, originally developed for urban street canyons, to characterize peer-to-peer (P2P) IQRF links operating at 868 MHz in typical indoor scenarios, including line-of-sight (LoS), one-turn, and two-turn non-line-of-sight (NLoS) configurations. The received signal powers are compared with well-known empirical models, including International Telecommunication Union Radiocommunication Sector (ITU-R) P.1238-9 and WINNER II, and with ray-tracing simulations. The results show that while ITU-R P.1238-9 achieves lower prediction error under LoS conditions with a root mean square error (RMSE) of 5.694 dB, the site-specific approach achieves substantially higher accuracy in NLoS scenarios, maintaining RMSE values below 3.9 dB for one- and two-turn links. Furthermore, ray-tracing simulations exhibited notably larger deviations, with RMSE values ranging from 7.522 dB to 16.267 dB and lower correlation with measurements. These results demonstrate the potential of site-specific modeling to provide practical, computationally efficient, and accurate insights for IQRF network deployment planning in smart building environments. Full article
(This article belongs to the Section Internet of Things)
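The RMSE figure used to rank the propagation models is straightforward to compute from paired measured and predicted power levels; the values below are illustrative, not the paper's measurement data:

```python
import math

# Sketch: RMSE between measured and model-predicted received power (dBm),
# the metric used to compare propagation models. Values are illustrative.
measured = [-62.1, -70.4, -78.9, -84.3]
predicted = [-60.0, -72.5, -76.1, -88.0]

rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured))
print(round(rmse, 3))
```

Because the errors are squared before averaging, RMSE penalizes the occasional large miss (typical of NLoS corners) more heavily than a mean absolute error would.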
23 pages, 2168 KB  
Review
Electrospun Nanofiber Platforms for Advanced Sensors in Livestock-Derived Food Quality and Safety Monitoring: A Review
by Karna Ramachandraiah, Elizabeth M. Martin and Alya Limayem
Sensors 2025, 25(22), 6947; https://doi.org/10.3390/s25226947 - 13 Nov 2025
Abstract
Over the past two decades, the meat industry has faced increasing pressure to prevent foodborne outbreaks and reduce economic losses associated with delayed detection of spoilage. This demand has accelerated the development of on-site, real-time sensing tools capable of identifying early signs of contamination. Electrospun nanofiber (NF) platforms have emerged as particularly promising due to their large surface area, tunable porosity, and versatile chemistry, which make them ideal scaffolds for immobilizing enzymes, antibodies, or aptamers while preserving bioactivity under field conditions. These NFs have been integrated into optical, electrochemical, and resistive devices, each enhancing response time and sensitivity for key targets ranging from volatile organic compounds indicating early decay to specific bacterial markers and antibiotic residues. In practical applications, NF matrices enhance signal generation (SERS hotspots), facilitate analyte diffusion through three-dimensional networks, and stabilize delicate biorecognition elements for repeated use. This review summarizes major NF fabrication strategies, representative sensor designs for meat quality monitoring, and performance considerations relevant to industrial deployment, including reproducibility, shelf life, and regulatory compliance. The integration of such platforms with data networks and Internet of Things (IoT) nodes offers a path toward continuous, automated surveillance throughout processing and cold-chain logistics. By addressing current technical and regulatory challenges, NF-based biosensors have the potential to significantly reduce waste and safeguard public health through early detection of contamination before it escalates into costly recalls. Full article
(This article belongs to the Section Smart Agriculture)
44 pages, 13672 KB  
Article
A Hybrid Positioning Framework for Large-Scale Three-Dimensional IoT Environments
by Shima Koulaeizadeh, Hatef Javadi, Sudabeh Gholizadeh, Saeid Barshandeh, Giuseppe Loseto and Nicola Epicoco
Sensors 2025, 25(22), 6943; https://doi.org/10.3390/s25226943 - 13 Nov 2025
Abstract
The Internet of Things (IoT) and Edge Computing (EC) play an essential role in today’s communication systems, supporting diverse applications in industry, healthcare, and environmental monitoring; however, these technologies face a major challenge in accurately determining the geographic origin of sensed data, as such data are meaningful only when their source location is known. The use of the Global Positioning System (GPS) is often impractical or inefficient in many environments due to limited satellite coverage, high energy consumption, and environmental interference. This paper employs the Distance Vector-Hop (DV-Hop), Jellyfish Search (JS), and Artificial Rabbits Optimization (ARO) algorithms and presents an innovative GPS-free positioning framework for three-dimensional (3D) EC environments. In the proposed framework, the basic DV-Hop and multi-angulation algorithms are generalized to three-dimensional environments. Next, both algorithms are structurally modified and integrated in a complementary manner to balance exploration and exploitation. Furthermore, a Lévy flight-based perturbation phase and a local search mechanism are incorporated to enhance convergence speed and solution precision. To evaluate performance, sixteen 3D IoT environments with different configurations were simulated, and the results were compared with nine state-of-the-art localization algorithms using MSE, NLE, ALE, and LEV metrics. The quantitative relative improvement ratio test demonstrates that the proposed method is, on average, 39% more accurate than its competitors. Full article
(This article belongs to the Section Sensor Networks)
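The DV-Hop principle that the framework generalizes to 3D can be sketched in a few lines: anchors with known positions convert measured hop counts into an average per-hop distance, which blind nodes then multiply by their own hop counts to estimate range. The coordinates and hop counts below are hypothetical:

```python
import math

# Sketch of the DV-Hop idea in 3D. Illustrative values only.
anchors = {"A": (0.0, 0.0, 0.0), "B": (30.0, 0.0, 0.0), "C": (0.0, 40.0, 0.0)}
hops_between = {("A", "B"): 3, ("A", "C"): 4}  # hop counts learned by flooding

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Average hop size as computed at anchor A from known anchor geometry
total_d = sum(dist(anchors[a], anchors[b]) for a, b in hops_between)
total_h = sum(hops_between.values())
hop_size = total_d / total_h

node_hops_to_A = 2               # hop count from a blind node to anchor A
est_range = hop_size * node_hops_to_A
print(hop_size, est_range)
```

A full DV-Hop implementation would repeat this range estimate against several anchors and solve a multilateration problem for the node's 3D position, which is the stage the paper's JS/ARO metaheuristics optimize.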
32 pages, 684 KB  
Systematic Review
Artificial Intelligence (AI) in Construction Safety: A Systematic Literature Review
by Sharmin Jahan Badhan and Reihaneh Samsami
Buildings 2025, 15(22), 4084; https://doi.org/10.3390/buildings15224084 - 13 Nov 2025
Abstract
The construction industry remains among the most hazardous sectors globally, facing persistent safety challenges despite advancements in occupational health and safety (OHS) measures. The objective of this study is to systematically analyze the use of Artificial Intelligence (AI) in construction safety management and to identify the most effective techniques, data modalities, and validation practices. The method involved a systematic review of 122 peer-reviewed studies published between 2016 and 2025 and retrieved from major academic databases. The selected studies were classified by AI technologies, including Machine Learning (ML), Deep Learning (DL), Computer Vision (CV), Natural Language Processing (NLP), and the Internet of Things (IoT), and by their applications in real-time hazard detection, predictive analytics, and automated compliance monitoring. The results show that DL and CV models, particularly Convolutional Neural Network (CNN) and You Only Look Once (YOLO)-based frameworks, are the most frequently implemented for personal protective equipment recognition and proximity monitoring, while ML approaches such as Support Vector Machines (SVM) and ensemble algorithms perform effectively on structured and sensor-based data. Major challenges identified include data quality, generalizability, interpretability, privacy, and integration with existing workflows. The paper concludes that explainable, scalable, and user-centric AI integrated with Building Information Modeling (BIM), Augmented Reality (AR) or Virtual Reality (VR), and wearable technologies is essential to enhance safety performance and achieve sustainable digital transformation in construction environments. Full article
36 pages, 4374 KB  
Review
Spectrum Sensing in Cognitive Radio Internet of Things: State-of-the-Art, Applications, Challenges, and Future Prospects
by Akeem Abimbola Raji and Thomas O. Olwal
J. Sens. Actuator Netw. 2025, 14(6), 109; https://doi.org/10.3390/jsan14060109 - 13 Nov 2025
Abstract
The proliferation of Internet of Things (IoT) devices due to remarkable developments in mobile connectivity has caused a tremendous increase in the consumption of broadband spectrum in fifth generation (5G) mobile access. In order to secure the continued growth of IoT, there is a need for efficient management of communication resources in 5G wireless access. Cognitive radio (CR) has been advanced to maximally utilize spectrum bands in the radio communication network. The integration of CR into IoT networks is a promising technology that is aimed at productive utilization of the spectrum, with a view to making more spectral bands available to IoT devices for communication. An important function of CR is spectrum sensing (SS), which enables maximum utilization of the spectrum in radio networks. Existing SS techniques demonstrate poor performance in noisy channel states and are not immune to the dynamic effects of wireless channels. This article presents a comprehensive review of various approaches commonly used for SS. Furthermore, multi-agent deep reinforcement learning (MADRL) is proposed for enhancing the accuracy of spectrum detection in erratic wireless channels. Finally, we highlight challenges that currently exist in SS in CRIoT networks and further state future research directions in this regard. Full article
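Classical energy detection, the simplest of the SS techniques such a review covers, reduces to comparing average sample energy against a threshold. The signal model, amplitude, and threshold below are illustrative assumptions, not drawn from the article:

```python
import math
import random

# Sketch of energy-detection spectrum sensing: declare the channel occupied
# when average sample energy exceeds a threshold. Illustrative parameters.
random.seed(0)
n = 4096
noise_var = 1.0

def energy(samples):
    return sum(s * s for s in samples) / len(samples)

noise_only = [random.gauss(0.0, math.sqrt(noise_var)) for _ in range(n)]
# Same noise plus a unit-amplitude primary-user tone (adds ~0.5 to mean energy)
with_signal = [x + math.cos(0.3 * k) for k, x in enumerate(noise_only)]

threshold = 1.2 * noise_var  # set above the expected noise-only energy
print(energy(noise_only) > threshold, energy(with_signal) > threshold)
```

The well-known weakness this illustrates is the threshold's dependence on the noise variance: if `noise_var` is misestimated in a noisy or dynamic channel, detection collapses, which motivates the learning-based alternatives the article surveys.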
17 pages, 3261 KB  
Article
Scalable Generation of Synthetic IoT Network Datasets: A Case Study with Cooja
by Hrant Khachatrian, Aram Dovlatyan, Greta Grigoryan and Theofanis P. Raptis
Future Internet 2025, 17(11), 518; https://doi.org/10.3390/fi17110518 - 13 Nov 2025
Abstract
Predicting the behavior of Internet of Things (IoT) networks under irregular topologies and heterogeneous battery conditions remains a significant challenge. Simulation tools can capture these effects but can require high manual effort and computational capacity, motivating the use of machine learning surrogates. This work introduces an automated pipeline for generating large-scale IoT network datasets by bringing together the Contiki-NG firmware, parameterized topology generation, and Slurm-based orchestration of Cooja simulations. The system supports a variety of network structures, scalable node counts, randomized battery allocations, and routing protocols to reproduce diverse failure modes. As a case study, we conduct over 10,000 Cooja simulations with 15–75 battery-powered motes arranged in sparse grid topologies and operating the RPL routing protocol, consuming 1300 CPU-hours in total. The simulations capture realistic failure modes, including unjoined nodes despite physical connectivity and cascading disconnects caused by battery depletion. The resulting graph-structured datasets are used for two prediction tasks: (1) estimating the last successful message delivery time for each node and (2) predicting network-wide spatial coverage. Graph neural network models trained on these datasets outperform baseline regression models and topology-aware heuristics while evaluating substantially faster than full simulations. The proposed framework provides a reproducible foundation for data-driven analysis of energy-limited IoT networks. Full article
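One of the two prediction targets, network-wide spatial coverage, can be approximated by rasterizing the deployment area and counting grid points within radio range of at least one alive mote. The positions, range, and grid resolution below are hypothetical, not taken from the dataset:

```python
# Sketch: spatial coverage as the fraction of grid points within radio range
# of an alive mote. All values are illustrative.
motes = [(10, 10), (30, 10), (10, 30)]   # alive mote positions (m)
radio_range = 15.0
grid = [(x, y) for x in range(0, 41, 5) for y in range(0, 41, 5)]  # 5 m raster

def covered(p):
    # Compare squared distances to avoid a square root per point
    return any((p[0] - mx) ** 2 + (p[1] - my) ** 2 <= radio_range ** 2
               for mx, my in motes)

coverage = sum(covered(p) for p in grid) / len(grid)
print(coverage)
```

As motes deplete their batteries and drop out of `motes`, this fraction shrinks, which is the quantity a learned surrogate can predict far faster than re-running a full Cooja simulation.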
50 pages, 3556 KB  
Article
RAVE-HD: A Novel Sequential Deep Learning Approach for Heart Disease Risk Prediction in e-Healthcare
by Muhammad Jaffar Khan, Basit Raza and Muhammad Faheem
Diagnostics 2025, 15(22), 2866; https://doi.org/10.3390/diagnostics15222866 - 12 Nov 2025
Abstract
Background/Objectives: Heart disease (HD) has recently become the foremost cause of death worldwide, underlining the importance of early and correct diagnosis to improve patient outcomes. Although Internet of Things (IoT)-enabled machine learning approaches have demonstrated encouraging outcomes in screening, existing approaches often face challenges such as imbalanced dataset handling, identification of influential features, and the ability to adapt to evolving HD data forms. To tackle the aforementioned challenges, we present a sequential hybrid approach, RAVE-HD (ResNet And Vanilla RNN Ensemble for HD), that combines a number of cutting-edge techniques to enhance screening. Methods: The preprocessing phase includes duplicate removal and feature scaling for data consistency. Recursive Feature Elimination is employed to extract the most informative features, while a proximity-weighted random synthetic sampling technique addresses class imbalance to reduce class biases. The proposed RAVE model in the RAVE-HD approach sequentially integrates a Residual Network (ResNet) for high-level feature extraction and a Vanilla Recurrent Neural Network to capture the non-linearity of the feature relationships present in the HDHI medical dataset. Results: Compared to ResNet and Vanilla RNN baselines, the proposed RAVE model attained superior results: 92.06% accuracy and 97.12% ROC-AUC. Stratified 10-fold cross-validation confirmed the robustness of RAVE, while sensitivity-to-prevalence analysis demonstrated stable recall and predictable precision across varying disease prevalence levels. Additional evaluations, including bootstrap and DeLong analyses, showed statistical significance (p < 0.001) of the discriminative gains of RAVE. Minimum Clinically Important Difference (MCID) evaluation confirmed clinically meaningful improvements (3%) over strong baselines. Cross-dataset validation using the CVD dataset verified robust generalization (92.4% accuracy). SHAP analysis provided interpretability to build clinical trust. Conclusions: RAVE-HD shows promise as a reliable, explainable, and scalable solution for large-scale HD screening, consistently performing well across diverse evaluations and datasets. Through statistical validation, the RAVE-HD approach emerges as a practical decision-support tool for HD predictive screening. Full article
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)
32 pages, 2954 KB  
Review
From Traditional Machine Learning to Fine-Tuning Large Language Models: A Review for Sensors-Based Soil Moisture Forecasting
by Md Babul Islam, Antonio Guerrieri, Raffaele Gravina, Declan T. Delaney and Giancarlo Fortino
Sensors 2025, 25(22), 6903; https://doi.org/10.3390/s25226903 - 12 Nov 2025
Abstract
Smart Agriculture (SA) combines cutting-edge technologies such as the Internet of Things (IoT), Artificial Intelligence (AI), and real-time sensing systems with traditional farming practices to enhance productivity, optimize resource use, and support environmental sustainability. A key aspect of SA is the continuous monitoring of field conditions, particularly Soil Moisture (SM), which plays a crucial role in crop growth and water management. Accurate forecasting of SM allows farmers to make timely irrigation decisions, improve field management, and conserve water. To support this, recent studies have increasingly adopted soil sensors, local weather data, and AI-based data-driven models for SM forecasting. In the literature, most existing review articles lack a structured framework and often overlook recent advancements, including privacy-preserving Federated Learning (FL), Transfer Learning (TL), and the integration of Large Language Models (LLMs). To address this gap, this paper proposes a novel taxonomy for SM forecasting and presents a comprehensive review of existing approaches, including traditional machine learning, deep learning, and hybrid models. Using the PRISMA methodology, we reviewed over 189 papers and selected 68 peer-reviewed studies published between 2017 and 2025. These studies are analyzed based on sensor types, input features, AI techniques, data durations, and evaluation metrics. Six guiding research questions were developed to shape the review and inform the taxonomy. Finally, this work identifies promising research directions, such as the application of TinyML for edge deployment, explainable AI for improved transparency, and privacy-aware model training. This review aims to provide researchers and practitioners with valuable insights for building accurate, scalable, and trustworthy SM forecasting systems to advance SA. Full article
(This article belongs to the Special Issue Feature Papers in the Internet of Things Section 2025)
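To make the idea of an AI-based data-driven SM forecaster concrete, here is a minimal sketch: an AR(1) linear model fit by ordinary least squares on lagged soil-moisture readings. The sensor series, sampling interval, and feature choice are illustrative assumptions, not taken from any of the surveyed studies, which typically use richer inputs (weather data, deep or hybrid models).

```python
# Minimal sketch of a data-driven soil-moisture (SM) forecaster:
# fit y[t] = a * y[t-1] + b on past readings, then roll it forward.

def fit_ar1(series):
    """Fit y[t] = a * y[t-1] + b by ordinary least squares; return (a, b)."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def forecast(series, steps, a, b):
    """Roll the fitted AR(1) model forward `steps` readings."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = a * last + b
        out.append(last)
    return out

# Hourly volumetric SM readings (%), drying after irrigation (synthetic).
sm = [38.0, 36.9, 35.9, 35.0, 34.2, 33.5, 32.9, 32.4]
a, b = fit_ar1(sm)
print(forecast(sm, 3, a, b))  # continues the drying trend
```

In practice the surveyed approaches replace this single-lag model with multivariate inputs and learned architectures, but the fit-then-roll-forward structure is the same.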

36 pages, 3031 KB  
Systematic Review
Exploring Smart Furniture: A Systematic Review of Integrated Technologies, Functionalities, and Applications
by Inês Mimoso, Marcelo Brites-Pereira, Leovaldo Alcântara, Maria Inês Morgado, Gualter Morgado, Inês Saavedra, Francisco José Melero Muñoz, Juliana Louceiro and Elísio Costa
Sensors 2025, 25(22), 6900; https://doi.org/10.3390/s25226900 - 12 Nov 2025
Abstract
Smart furniture represents a growing field that integrates Internet of Things (IoT), embedded systems and assistive technologies, yet lacks a comprehensive synthesis of its components and applications. This PRISMA-guided systematic review analysed 35 studies published between 2014 and 2024, sourced from PubMed, Web of Science and Scopus. The included studies presented prototypes of smart furniture that used IoT, sensors or automation. The focus was on extracting data related to technological configurations, functional uses, validation methods, maturity levels and commercialisation. Three technological pillars emerged: data collection (n = 31 studies), transmission/processing (n = 30), and actuation (n = 22), often combined into multifunctional systems (n = 14). Health monitoring was the dominant application (n = 15), followed by environmental control (n = 8) and assistive functions for older adults (n = 8). Validation methods varied: 37% relied solely on laboratory testing, while 20% involved only end-users. Only one solution surpassed Technology Readiness Level (TRL) 7 and is currently on the market. Current research remains pre-commercial, with gaps in AI integration, long-term validation, and participatory design. Smart furniture shows promise for healthcare and independent living, but requires standardised evaluation, ethical data practices, and co-creation to achieve market readiness. Full article
(This article belongs to the Section Intelligent Sensors)
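The three pillars the review identifies (data collection, transmission/processing, actuation) can be sketched as a minimal pipeline. The chair pressure sensor, the 1 kPa occupancy threshold, and the posture-reminder LED are illustrative assumptions, not devices from the reviewed prototypes.

```python
# Minimal sketch of a smart-furniture pipeline: sense -> process -> actuate.
import random

def collect(seat_occupied: bool) -> dict:
    """Pillar 1: data collection - read a (simulated) chair pressure sensor."""
    pressure_kpa = random.uniform(8.0, 12.0) if seat_occupied else 0.0
    return {"pressure_kpa": pressure_kpa}

def process(sample: dict) -> dict:
    """Pillar 2: processing - derive an occupancy event from the raw sample."""
    return {"occupied": sample["pressure_kpa"] > 1.0}

def actuate(event: dict) -> str:
    """Pillar 3: actuation - drive a (simulated) posture-reminder LED."""
    return "led_on" if event["occupied"] else "led_off"

# One pass through the pipeline, as a multifunctional system would run it.
print(actuate(process(collect(seat_occupied=True))))   # -> led_on
print(actuate(process(collect(seat_occupied=False))))  # -> led_off
```

The reviewed multifunctional systems (n = 14) essentially chain several such sense/process/actuate loops over shared transport and processing layers.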

36 pages, 2540 KB  
Review
A Framework-Driven Evaluation and Survey of MCU Fault Injection Resilience for IoT
by Igor Seniushin, Natalya Glazyrina, Yernat Atanbayev, Kamal Bairamov, Yenlik Satiyeva, Olzhas Nurman and Mamyr Altaibek
Appl. Sci. 2025, 15(22), 11991; https://doi.org/10.3390/app152211991 - 12 Nov 2025
Abstract
With the increasing prevalence of Internet of Things (IoT) devices in areas such as authentication, data protection, and access control, general-purpose microcontrollers (MCUs) have become the primary platform for security-critical applications. However, the cost of mounting fault injection (FI) attacks has decreased significantly in recent years, making them a viable threat to MCU-based devices. We present a framework-driven perspective with a comparative survey of MCU fault injection resilience for IoT; the survey supports, and is organized around, the procedural evaluation framework we introduce. We first discuss the basic security requirements, then categorize the common types of hardware intrusion, including side-channel attacks, fault injection attacks, and invasive methods. We synthesize the security technologies reported by MCU vendors, such as TrustZone/TEE, Physical Unclonable Functions (PUFs), secure boot, flash encryption, secure debugging, and tamper detection, in the context of FIA scenarios. A comparison of representative MCUs (STM32U585, NXP LPC55S69, Nordic nRF54L15, Espressif ESP32-C6, and Renesas RA8M1) highlights cost–security trade-offs relevant to token-class deployments. We position this work as a framework/perspective rather than a formal systematic review: an evidence-first FI evaluation protocol for token-class MCUs, a portable checklist unifying PSA/SESIP/CC expectations, and a set of concrete case studies (e.g., ESP32-C6 secure boot hardening). Full article
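To illustrate the class of countermeasure fault-injection resilience evaluations probe for, here is a simplified simulation (not vendor code, and not from the surveyed MCUs) of why security-critical firmware avoids a single boolean branch: a one-bit fault on one comparison flips the outcome, whereas storing the decision and its complement forces the attacker to corrupt two values consistently. The `fault_mask` parameter models a single injected bit-flip.

```python
# Sketch of a fault-injection countermeasure: complementary redundant checks.
EXPECTED = 0xA5A5A5A5  # illustrative reference value (e.g., a boot digest word)

def single_check(measured: int, fault_mask: int = 0) -> bool:
    """One comparison: a fault that flips the result bit defeats it."""
    ok = int(measured == EXPECTED) ^ fault_mask  # fault_mask=1 models a glitch
    return bool(ok)

def redundant_check(measured: int, fault_mask: int = 0) -> bool:
    """Keep the decision and its complement; inconsistency means tampering."""
    ok = int(measured == EXPECTED) ^ fault_mask
    not_ok = int(measured != EXPECTED)
    if ok == not_ok:        # copies agree only when one was corrupted
        return False        # fail safe: treat as an attack
    return bool(ok)

# A glitched single check accepts a wrong value; the redundant check does not.
print(single_check(0xDEADBEEF, fault_mask=1))     # -> True (bypassed)
print(redundant_check(0xDEADBEEF, fault_mask=1))  # -> False (caught)
```

Real firmware hardens this further (random delays, hardened constants instead of 0/1 booleans, double reads from flash), which is exactly what an evidence-first FI evaluation protocol has to exercise.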
