IoT, Volume 6, Issue 4 (December 2025) – 22 articles

Cover Story: The rapid growth of the Internet of Things has exposed fundamental challenges in trust, identity management, and access control across large-scale, heterogeneous, and resource-constrained environments. In this review, we examine how blockchain technologies address these challenges through decentralized trust frameworks, self-sovereign identity models, smart contract-based access control, and privacy-preserving cryptographic mechanisms. By synthesizing recent advances, architectures, and real-world IoT use cases, we highlight both the strengths and limitations of blockchain-enabled security solutions, identify key regulatory and scalability challenges, and outline future research directions for building secure, trustworthy, and interoperable IoT ecosystems.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on its "PDF Full-text" link and use the free Adobe Reader to open it.
43 pages, 2472 KB  
Article
Privacy-Preserving Federated Learning for Distributed Financial IoT: A Blockchain-Based Framework for Secure Cryptocurrency Market Analytics
by Oleksandr Kuznetsov, Saltanat Adilzhanova, Serhiy Florov, Valerii Bushkov and Danylo Peremetchyk
IoT 2025, 6(4), 78; https://doi.org/10.3390/iot6040078 - 11 Dec 2025
Viewed by 352
Abstract
The proliferation of Internet of Things (IoT) devices in financial markets has created distributed ecosystems where cryptocurrency exchanges, trading platforms, and market data providers operate as autonomous edge nodes generating massive volumes of sensitive financial data. Collaborative machine learning across these distributed financial IoT nodes faces fundamental challenges: institutions possess valuable proprietary data but cannot share it directly due to competitive concerns, regulatory constraints, and trust management requirements in decentralized networks. This study presents a privacy-preserving federated learning framework tailored for distributed financial IoT systems, combining differential privacy with Shamir secret sharing to enable secure collaborative intelligence across blockchain-based cryptocurrency trading networks. We implement per-layer gradient clipping and Rényi differential privacy composition to minimize utility loss while maintaining formal privacy guarantees in edge computing scenarios. Using 5.6 million orderbook observations from 11 cryptocurrency pairs collected across distributed exchange nodes, we evaluate three data partitioning strategies simulating realistic heterogeneity patterns in financial IoT deployments. Our experiments reveal that federated edge learning imposes 9–15 percentage point accuracy degradation compared to centralized cloud processing, driven primarily by data distribution heterogeneity across autonomous nodes. Critically, adding differential privacy (ε = 3.0) and cryptographic secret sharing increases this degradation by less than 0.3 percentage points when mechanisms are calibrated appropriately for edge devices. The framework achieves 62–66.5% direction accuracy on cryptocurrency price movements, with confidence-based execution generating 71–137 basis points average profit per trade. These results demonstrate the practical viability of privacy-preserving collaborative intelligence for distributed financial IoT while identifying that the federated optimization gap dominates privacy mechanism costs. Our findings offer architectural insights for designing trustworthy distributed systems in blockchain-enabled financial IoT ecosystems.
(This article belongs to the Special Issue Blockchain-Based Trusted IoT)
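The per-layer gradient clipping with Gaussian noising that this abstract describes is the core step of most differentially private training loops. A minimal NumPy sketch of that step alone (clip_norm and noise_mult are illustrative values, not the paper's calibration):

```python
import numpy as np

# Minimal sketch of the per-layer clip-and-noise step used in differentially
# private training; clip_norm and noise_mult are illustrative, not the
# paper's calibration.
def clip_and_noise_per_layer(grads, clip_norm=1.0, noise_mult=1.1, rng=None):
    """Scale each layer's gradient to at most clip_norm in L2 norm, then add
    Gaussian noise with std noise_mult * clip_norm (the Gaussian mechanism)."""
    rng = rng or np.random.default_rng()
    private = []
    for g in grads:                                    # one array per layer
        scale = min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
        noisy = g * scale + rng.normal(0.0, noise_mult * clip_norm, g.shape)
        private.append(noisy)
    return private

layer_grads = [np.ones((4, 4)), np.ones(4) * 10.0]     # toy two-layer gradients
print([np.linalg.norm(g) for g in clip_and_noise_per_layer(layer_grads)])
```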
20 pages, 324 KB  
Review
LPWAN Technologies for IoT: Real-World Deployment Performance and Practical Comparison
by Dmitrijs Orlovs, Artis Rusins, Valters Skrastiņš and Janis Judvaitis
IoT 2025, 6(4), 77; https://doi.org/10.3390/iot6040077 - 10 Dec 2025
Viewed by 455
Abstract
Low Power Wide Area Networks (LPWANs) have emerged as essential connectivity solutions for the Internet of Things (IoT), addressing requirements for long-range, energy-efficient communication that traditional wireless technologies cannot meet. With LPWAN connections projected to grow at a 26% compound annual growth rate until 2027, understanding real-world performance is crucial for technology selection. This review examines four leading LPWAN technologies—LoRaWAN, Sigfox, Narrowband IoT (NB-IoT), and LTE-M. It analyzes 20 peer-reviewed studies from 2015–2025 reporting real-world deployment metrics across power consumption, range, data rate, scalability, availability, and security. Across these studies, practical performance diverges from vendor specifications. In the cited rural and urban deployments, LoRaWAN achieves 2+ year battery life and 11 km rural range but suffers collision limitations above 1000 devices per gateway. Sigfox demonstrates exceptional range (280 km record) with minimal power consumption but remains constrained by 12-byte payloads and security vulnerabilities. NB-IoT provides robust performance with 96–100% packet delivery ratios at −127 dBm on the tested commercial networks, and supports tens of thousands of devices per cell, though mobility increases energy consumption. In the cited trials, LTE-M offers the highest throughput and sub-200 ms latency but fails beyond −113 dBm, where NB-IoT maintains connectivity. NB-IoT emerges as optimal for large-scale stationary deployments, while LTE-M suits high-throughput mobile applications.
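The −127 dBm and −113 dBm figures cited above translate directly into a simple coverage check. A toy sketch using only the numbers from this abstract (the function and constant names are illustrative):

```python
# Hypothetical received-power check against the sensitivity figures cited in
# the review; constants come from the abstract, everything else is invented.
SENSITIVITY_DBM = {"NB-IoT": -127, "LTE-M": -113}

def reachable(tech: str, rx_power_dbm: float) -> bool:
    """True if the received signal power is at or above the cited floor."""
    return rx_power_dbm >= SENSITIVITY_DBM[tech]

print(reachable("LTE-M", -120))   # False: below the cited -113 dBm floor
print(reachable("NB-IoT", -120))  # True: NB-IoT maintains connectivity here
```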
16 pages, 5273 KB  
Article
Fog Computing and Graph-Based Databases for Remote Health Monitoring in IoMT Settings
by Karrar A. Yousif, Jorge Calvillo-Arbizu and Agustín W. Lara-Romero
IoT 2025, 6(4), 76; https://doi.org/10.3390/iot6040076 - 3 Dec 2025
Viewed by 240
Abstract
Remote patient monitoring is a promising and transformative pillar of healthcare. However, deploying such systems at scale—across thousands of patients and Internet of Medical Things (IoMT) devices—demands robust, low-latency, and scalable storage systems. This research examines the application of Fog Computing for remote patient monitoring in IoMT settings, where large data volumes, low latency, and secure management of confidential healthcare information are essential. We propose a four-layer IoMT–Fog–Cloud architecture in which Fog nodes, equipped with graph-based databases (Neo4j), conduct local processing, filtering, and integration of heterogeneous health data before transmitting it to cloud servers. To assess the viability of our approach, we implemented a containerised Fog node and simulated multiple patient-device networks using a real-world dataset. System performance was evaluated using 11 scenarios with varying numbers of devices and data transmission frequencies. Performance metrics include CPU load, memory footprint, and query latency. The results demonstrate that Neo4j can efficiently ingest and query millions of health observations with an acceptable latency of less than 500 ms, even in extreme scenarios involving more than 12,000 devices transmitting data every 50 ms. Resource consumption remained well below the critical thresholds, highlighting the suitability of the proposed approach for Fog nodes. Combining Fog computing and Neo4j is a novel approach that meets the latency and real-time data ingestion requirements of IoMT environments. It is therefore suitable for supporting delay-sensitive monitoring programmes in which rapid detection of anomalies is critical (e.g., a prompt response to cardiac emergencies or early detection of respiratory deterioration in patients with chronic obstructive pulmonary disease), even at a large scale.
(This article belongs to the Special Issue IoT-Based Assistive Technologies and Platforms for Healthcare)
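For readers unfamiliar with graph-based ingestion on a Fog node, a minimal sketch using the official neo4j Python driver; the Bolt address, credentials, and node/relationship schema here are hypothetical, not taken from the paper:

```python
from neo4j import GraphDatabase  # official Neo4j Python driver

# Hypothetical fog-node ingestion; endpoint and schema are illustrative.
driver = GraphDatabase.driver("bolt://fog-node:7687", auth=("neo4j", "password"))

def ingest_observation(patient_id: str, device_id: str, value: float, ts: int):
    """Attach one vital-sign observation to its patient and device nodes."""
    query = (
        "MERGE (p:Patient {id: $pid}) "
        "MERGE (d:Device {id: $did}) "
        "CREATE (o:Observation {value: $val, ts: $ts}) "
        "CREATE (d)-[:MEASURED]->(o)<-[:HAS_OBSERVATION]-(p)"
    )
    with driver.session() as session:
        session.run(query, pid=patient_id, did=device_id, val=value, ts=ts)

ingest_observation("p-001", "ecg-42", 72.0, 1733750000)
```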
49 pages, 6479 KB  
Article
IoT-Driven Destination Prediction in Smart Urban Mobility: A Comparative Study of Markov Chains and Hidden Markov Models
by João Batista Firmino Junior, Francisco Dantas Nobre Neto, Bruno Neiva Moreno and Tiago Brasileiro Araújo
IoT 2025, 6(4), 75; https://doi.org/10.3390/iot6040075 - 3 Dec 2025
Viewed by 306
Abstract
The increasing availability of IoT-enabled mobility data and intelligent transportation systems in Smart Cities demands efficient and interpretable models for destination prediction. This study presents a comparative analysis between Markov Chains and Hidden Markov Models (HMMs) applied to urban mobility trajectories, evaluated through mean precision values. To ensure methodological rigor, the Smart Sampling with Data Filtering (SSDF) method was developed, integrating trajectory segmentation, spatial tessellation, frequency aggregation, and 10-fold cross-validation. Using data from 23 vehicles in the Vehicle Energy Dataset (VED) and a filtering threshold based on trajectory recurrence, the results show that the HMM achieved 61% precision versus 59% for Markov Chains (p = 0.0248). Incorporating day-of-week contextual information led to statistically significant improvements in 78.3% of cases for precision (95.7% for recall, 87.0% for F1-score). The remaining 21.7% indicate that model selection should balance model complexity against the precision-efficiency trade-off. The proposed SSDF method establishes a replicable foundation for evaluating probabilistic models in IoT-based mobility systems, contributing to scalable, explainable, and sustainable Smart City transportation analytics.
(This article belongs to the Special Issue IoT-Driven Smart Cities)
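A first-order Markov Chain destination predictor of the kind compared here can be stated in a few lines. A minimal sketch over tessellated cell IDs (the cells and trajectories below are illustrative):

```python
from collections import Counter, defaultdict

# Minimal first-order Markov Chain predictor over tessellated cells;
# cell IDs and trajectories are illustrative, not from the VED dataset.
def fit_transitions(trajectories):
    """Count cell-to-cell transitions across all trajectories."""
    counts = defaultdict(Counter)
    for traj in trajectories:
        for src, dst in zip(traj, traj[1:]):
            counts[src][dst] += 1
    return counts

def predict_next(counts, current_cell):
    """Return the most frequent successor of the current cell."""
    successors = counts.get(current_cell)
    return successors.most_common(1)[0][0] if successors else None

counts = fit_transitions([["A", "B", "C"], ["A", "B", "D"], ["B", "C"]])
print(predict_next(counts, "B"))  # "C" (2 of 3 observed transitions)
```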
26 pages, 946 KB  
Article
Optimizing IoMT Security: Performance Trade-Offs Between Neural Network Architectural Design, Dimensionality Reduction, and Class Imbalance Handling
by Heyfa Ammar and Asma Cherif
IoT 2025, 6(4), 74; https://doi.org/10.3390/iot6040074 - 29 Nov 2025
Viewed by 204
Abstract
The proliferation of Internet of Medical Things (IoMT) devices in healthcare requires robust intrusion detection systems to protect sensitive data and ensure patient safety. While existing neural network-based Intrusion Detection Systems have shown considerable effectiveness, significant challenges persist—particularly class imbalance and high data dimensionality. Although various approaches have been proposed to mitigate these issues, their actual impact on detection accuracy remains insufficiently explored. This study investigates advanced Artificial Neural Network (ANN) architectures and preprocessing strategies for intrusion detection in IoMT environments, addressing the critical challenges of feature dimensionality and class imbalance. Leveraging the WUSTL-EHMS-2020 dataset—a dataset specifically designed for IoMT cybersecurity research—we systematically examine the performance of multiple neural network designs. We implement and evaluate five distinct ANN architectures: the Standard Feedforward Network, the Enhanced Channel ANN, Dual-Branch Addition and Concatenation ANNs, and the Shortcut Connection ANN. To mitigate the class imbalance challenge, we compare three balancing approaches: the Synthetic Minority Over-sampling Technique (SMOTE), Hybrid Over-Under Sampling, and the Weighted Cross-Entropy Loss Function. Performance analysis reveals nuanced insights across the different architectures and balancing strategies. SMOTE-based models achieved average AUC scores ranging from 0.8491 to 0.8766. Hybrid sampling strategies improved performance, with AUC increasing to 0.8750. The weighted cross-entropy loss function demonstrated the most consistent performance. The most significant finding emerges from the Dual-Branch ANN with addition operations and a weighted loss function, which achieved 0.9403 Accuracy, 0.8786 AUC, a 0.8716 F1-Score, 0.8650 Precision, and 0.8786 Recall. Compared to the baseline from related work, it demonstrates a substantial increase of 8.45% in F1-Score and improvements of 18.67% in AUC and Recall, highlighting the model’s superiority at identifying potential security threats and minimizing false negatives.
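The two balancing families compared in this abstract, synthetic oversampling and loss weighting, can be sketched with scikit-learn and imbalanced-learn. This is a toy illustration on synthetic data, not the WUSTL-EHMS-2020 pipeline:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight
from imblearn.over_sampling import SMOTE  # requires imbalanced-learn

# Toy imbalanced labels standing in for attack/normal traffic.
y = np.array([0] * 90 + [1] * 10)
X = np.random.default_rng(0).normal(size=(100, 8))

# Option 1: oversample the minority class with SMOTE.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
print(np.bincount(y_bal))  # classes now balanced

# Option 2: keep the data as-is and weight the loss instead.
weights = compute_class_weight(class_weight="balanced", classes=np.unique(y), y=y)
print(dict(zip(np.unique(y), weights)))  # minority class weighted ~9x higher
```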
27 pages, 2355 KB  
Article
An IoT-Enabled Digital Twin Architecture with Feature-Optimized Transformer-Based Triage Classifier on a Cloud Platform
by Haider Q. Mutashar, Hiba A. Abu-Alsaad and Sawsan M. Mahmoud
IoT 2025, 6(4), 73; https://doi.org/10.3390/iot6040073 - 26 Nov 2025
Viewed by 382
Abstract
It is essential to assign the correct triage level to patients as soon as they arrive in the emergency department in order to save lives, especially during peak demand. However, many healthcare systems estimate triage levels by manual, eyes-on evaluation, which can be inconsistent and time-consuming. This study creates a full Digital Twin-based architecture for patient monitoring and automated triage level recommendation using IoT sensors, AI, and cloud-based services. The system monitors all patients’ vital signs through embedded sensors. The readings are used to update the Digital Twin instances that represent the present condition of the patients. These data are then used for triage prediction using a pretrained model that can predict the patients’ triage levels. The training of the model used the synthetic minority over-sampling technique combined with Tomek links to lessen the degree of data imbalance. Additionally, Lagrange element optimization was applied to select the most informative features. The final triage level is predicted using the Tabular Prior-Data Fitted Network, a transformer-based model tailored for tabular data classification. This combination achieved an overall accuracy of 87.27%. The proposed system demonstrates the potential of integrating digital twins and AI to improve decision support in emergency healthcare environments.
25 pages, 601 KB  
Article
Multi-Flow Complex Event Optimization in the Edge: A Smart Street Scenario
by Halit Uyanık and Tolga Ovatman
IoT 2025, 6(4), 72; https://doi.org/10.3390/iot6040072 - 21 Nov 2025
Viewed by 289
Abstract
Internet of Things (IoT) devices can be used to provide safety, security, and other services that ensure that smart systems work as intended. However, the increasing complexity of these tasks raises the potential for performance loss when limited resources are not utilized appropriately. Distributed complex event processing (CEP) applications can be used to execute multiple unique tasks on sensor data. Since these operations can require a variety of data from multiple sensors across separate task steps, non-optimal code and data management can lead to increased processing delays. In this study, a mathematical model for optimizing critical-path performance across multiple independent CEP flows is proposed. The model optimally assigns both where code is executed and where its respective data is placed. The proposed solution is implemented within an open-source library alongside operator placement heuristics from the literature. The approaches are tested within a realistic smart-street scenario, and consumer delays, algorithm runtimes, and delivery ratios within different time windows are reported. The results indicate that the proposed approach reduces delivery times for the critical CEP paths better than the heuristic solutions, at the cost of increased optimization runtimes.
(This article belongs to the Special Issue IoT and Distributed Computing)
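The placement question described here, deciding jointly where code runs and where its data lives, can be illustrated with a brute-force miniature. All nodes, delay figures, and the three-operator flow are invented; the paper solves this with a mathematical optimization model, not exhaustive search:

```python
from itertools import product

# Miniature of joint operator placement: three chained CEP operators, three
# candidate nodes, data originating at edge-1. Delay figures are invented.
NODES = ["edge-1", "edge-2", "cloud"]
PROC = {"edge-1": 5.0, "edge-2": 6.0, "cloud": 2.0}   # ms per operator

def hop(a, b):
    """Transfer delay between nodes (0 on the same node)."""
    if a == b:
        return 0.0
    return 25.0 if "cloud" in (a, b) else 4.0

def path_delay(placement, source="edge-1"):
    """End-to-end delay: ingress hop + processing + inter-operator hops."""
    delay = hop(source, placement[0]) + sum(PROC[n] for n in placement)
    delay += sum(hop(a, b) for a, b in zip(placement, placement[1:]))
    return delay

best = min(product(NODES, repeat=3), key=path_delay)
print(best, path_delay(best))   # keeping the flow at the edge wins here
```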
24 pages, 814 KB  
Article
A Machine Learning Approach to Detect Denial of Sleep Attacks in Internet of Things (IoT)
by Ishara Dissanayake, Anuradhi Welhenge and Hesiri Dhammika Weerasinghe
IoT 2025, 6(4), 71; https://doi.org/10.3390/iot6040071 - 20 Nov 2025
Cited by 1 | Viewed by 438
Abstract
The Internet of Things (IoT) has rapidly evolved into a central component of today’s technological landscape, enabling seamless connectivity and communication among a vast array of devices. It underpins automation, real-time monitoring, and smart infrastructure, serving as a foundation for Industry 4.0 and paving the way toward Industry 5.0. Despite the potential of IoT systems to transform industries, these systems face a number of challenges, most notably the lack of processing power, storage space, and battery life. Whereas cloud and fog computing help to relieve computational and storage constraints, energy limitations remain a severe impediment to long-term autonomous operation. Among the threats that exploit this weakness, the Denial-of-Sleep (DoSl) attack is particularly problematic because it prevents nodes from entering low-power states, leading to battery depletion and degraded network performance. This research investigates machine-learning (ML) and deep-learning (DL) methods for identifying such energy-wasting behaviors to protect IoT energy resources. A dataset was generated in a simulated IoT environment under multiple DoSl attack conditions to validate the proposed approach. Several ML and DL models were trained and tested on this data to discover distinctive power-consumption patterns related to the attacks. The experimental results confirm that the proposed models can effectively detect anomalous behaviors associated with DoSl activity, demonstrating their potential for energy-aware threat detection in IoT networks. Specifically, the Random Forest and Decision Tree classifiers achieved accuracies of 98.57% and 97.86%, respectively, on the held-out 25% test set, while the Long Short-Term Memory (LSTM) model reached 97.92% accuracy under a chronological split, confirming effective temporal generalization. All evaluations were conducted in a simulated environment, and the paper also outlines potential pathways for future physical testbed deployment.
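The shape of the classification pipeline, a tree ensemble over power-consumption features with a 25% hold-out as in the paper, can be sketched on synthetic data (the feature distributions below are invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for power-consumption windows: normal vs. DoSl traffic.
rng = np.random.default_rng(1)
X_normal = rng.normal(1.0, 0.1, size=(500, 6))   # low, steady current draw
X_attack = rng.normal(2.5, 0.4, size=(500, 6))   # elevated draw under attack
X = np.vstack([X_normal, X_attack])
y = np.array([0] * 500 + [1] * 500)

# Mirror the paper's held-out 25% test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```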
21 pages, 4097 KB  
Article
Lightweight Quantized XGBoost for Botnet Detection in Resource-Constrained IoT Networks
by Mohammed Rauf Ali Khan, Abdulaziz Y. Barnawi, Adnan Munir, Zainab Alsalman and Dario Marcelo Satan Sanunga
IoT 2025, 6(4), 70; https://doi.org/10.3390/iot6040070 - 18 Nov 2025
Viewed by 513
Abstract
The rapid expansion of IoT devices has introduced significant security challenges, with malware authors constantly evolving their techniques to exploit vulnerabilities in IoT networks. Despite this growing threat, progress in developing effective detection solutions remains limited. In this study, we present an ML-based framework for detecting and classifying network threats targeting IoT environments. Using the CTU-IoT-Malware-Capture 2023 dataset and the UNSW Bot-IoT dataset, we transformed the task into a structured multi-class classification problem to better reflect real-world detection challenges. Our primary contribution lies in demonstrating the effectiveness of post-training quantization on gradient-boosted models, specifically a Quantized XGB variant enhanced with histogram-based quantization. This approach significantly reduces model size and inference time without sacrificing accuracy. The proposed model achieved high classification accuracies of 99.93% and 99.99% on the two datasets, while the quantization step led to 1.42× and 3× improvements in inference speed, and reductions in model size by 3.61× and 2.71×, respectively, making it well-suited for deployment in resource-constrained IoT settings. This work demonstrates not only the effectiveness of gradient boosting in handling complex traffic data but also introduces an efficient optimization strategy for real-time IoT threat detection.
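Histogram-based post-training quantization can be sketched generically: bucket float values into 256 bins and store uint8 indices plus a small codebook. This illustrates the general idea only, not the paper's exact scheme:

```python
import numpy as np

# Generic histogram-based post-training quantization sketch: uint8 indices
# plus a codebook in place of float32 values (roughly 4x smaller storage).
def quantize(values: np.ndarray, n_bins: int = 256):
    edges = np.histogram_bin_edges(values, bins=n_bins)
    centers = (edges[:-1] + edges[1:]) / 2           # codebook: n_bins floats
    idx = np.clip(np.digitize(values, edges) - 1, 0, n_bins - 1)
    return idx.astype(np.uint8), centers

leaves = np.random.default_rng(2).normal(size=10_000).astype(np.float32)
idx, codebook = quantize(leaves)
print(np.abs(codebook[idx] - leaves).max())          # bounded per-value error
```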
34 pages, 14464 KB  
Article
Modular IoT Architecture for Monitoring and Control of Office Environments Based on Home Assistant
by Yevheniy Khomenko and Sergii Babichev
IoT 2025, 6(4), 69; https://doi.org/10.3390/iot6040069 - 17 Nov 2025
Viewed by 1076
Abstract
Cloud-centric IoT frameworks remain dominant; however, they introduce major challenges related to data privacy, latency, and system resilience. Existing open-source solutions often lack standardized principles for scalable, local-first deployment and do not adequately integrate fault tolerance with hybrid automation logic. This study presents a practical and extensible local-first IoT architecture designed for full operational autonomy using open-source components. The proposed system features a modular, layered design that includes device, communication, data, management, service, security, and presentation layers. It integrates MQTT, Zigbee, REST, and WebSocket protocols to enable reliable publish–subscribe and request–response communication among heterogeneous devices. A hybrid automation model combines rule-based logic with lightweight data-driven routines for context-aware decision-making. The implementation uses Proxmox-based virtualization with Home Assistant as the core automation engine and operates entirely offline, ensuring privacy and continuity without cloud dependency. The architecture was deployed in a real-world office environment and evaluated under workload and fault-injection scenarios. Results demonstrate stable operation with MQTT throughput exceeding 360,000 messages without packet loss, automatic recovery from simulated failures within three minutes, and energy savings of approximately 28% compared to baseline manual control. Compared to established frameworks such as FIWARE and IoT-A, the proposed approach achieves enhanced modularity, local autonomy, and hybrid control capabilities, offering a reproducible model for privacy-sensitive smart environments.
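A local-first telemetry publication of the kind this architecture relies on takes one call with paho-mqtt; the broker address and topic layout below are illustrative, not the paper's deployment:

```python
import json

import paho.mqtt.publish as publish  # paho-mqtt convenience helper

# Sketch of a local-first sensor publication: a device publishes a reading
# to an on-premises broker that Home Assistant subscribes to. The broker
# address and topic layout are illustrative.
reading = {"temperature_c": 22.4, "humidity_pct": 41}
publish.single(
    topic="office/room-3/climate",
    payload=json.dumps(reading),
    hostname="192.168.1.10",  # local Mosquitto broker; traffic stays on the LAN
    qos=1,                    # at-least-once delivery for telemetry
)
```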
20 pages, 9789 KB  
Article
WireDepth: IoT-Enabled Multi-Sensor Depth Monitoring for Precision Subsoiling in Sugarcane
by Saman Abdanan Mehdizadeh, Aghajan Bahadori, Manocheher Ebadian, Mohammad Hasan Sadeghian, Mansour Nasr Esfahani and Yiannis Ampatzidis
IoT 2025, 6(4), 68; https://doi.org/10.3390/iot6040068 - 14 Nov 2025
Viewed by 357
Abstract
Subsoil compaction is a major constraint in sugarcane production, limiting yields and reducing resource-use efficiency. This study presents WireDepth, an innovative cloud-connected monitoring system that leverages edge computing and IoT technologies for real-time, spatially aware analysis and visualization of subsoiling depth. The system integrates ultrasonic, laser, inclinometer, and potentiometer sensors mounted on the subsoiler, with on-board microcontroller processing and dual wireless connectivity (LoRaWAN and NB-IoT/LTE-M) for robust data transmission. A cloud platform delivers advanced analytics, including 3D depth maps and operational efficiency metrics. System accuracy was assessed using 300 reference depth measurements, with Root Mean Square Error (RMSE) and Mean Absolute Percentage Error (MAPE) calculated per sensor. The inclinometer and potentiometer achieved the highest accuracy (MAPE of 0.92% and 0.84%, respectively), with no significant deviation from field measurements (paired t-tests, p > 0.05). Ultrasonic and laser sensors exhibited higher errors, particularly at shallow depths, due to soil debris interference. Correlation analysis confirmed a significant effect of depth on sensor accuracy, with laser sensors showing the strongest association (Pearson r = 0.457, p < 0.001). Field validation in commercial sugarcane fields demonstrated that WireDepth improves subsoiling precision, reduces energy waste, and supports sustainable production by enhancing soil structure and root development. These findings advance precision agriculture by offering a scalable, real-time solution for subsoiling management, with broad implications for yield improvement in compaction-affected systems.
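The two accuracy metrics used above, RMSE and MAPE, are computed per sensor against reference depth measurements. A minimal sketch (the arrays below are illustrative, not the study's 300 reference points):

```python
import numpy as np

# Per-sensor accuracy metrics against reference depths; arrays illustrative.
def rmse(ref, est):
    return float(np.sqrt(np.mean((np.asarray(ref) - np.asarray(est)) ** 2)))

def mape(ref, est):
    ref, est = np.asarray(ref, float), np.asarray(est, float)
    return float(np.mean(np.abs((ref - est) / ref)) * 100)

ref = [30.0, 32.5, 28.0, 35.0]   # reference depths (cm)
est = [30.2, 32.1, 28.3, 34.8]   # e.g. inclinometer-derived depths
print(f"RMSE = {rmse(ref, est):.3f} cm, MAPE = {mape(ref, est):.2f}%")
```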
26 pages, 1043 KB  
Article
Centralized Two-Tiered Tree-Based Intrusion-Detection System (C2T-IDS)
by Hisham Abdul Karim Yassine, Mohammed El Saleh, Bilal Ezzeddine Nakhal and Abdallah El Chakik
IoT 2025, 6(4), 67; https://doi.org/10.3390/iot6040067 - 5 Nov 2025
Viewed by 798
Abstract
The exponential growth of Internet of Things (IoT) devices introduces significant security challenges due to their resource constraints and diverse attack surfaces. To address these issues, this paper proposes the Centralized Two-Tiered Tree-Based Intrusion Detection System (C2T-IDS), a lightweight framework designed for efficient and scalable threat detection in IoT networks. The system employs a hybrid edge-centralized architecture, where the first tier, deployed on edge gateways, performs real-time binary classification to detect anomalous traffic using optimized tree-based models. The second tier, hosted on a centralized server, conducts detailed multi-class classification to diagnose specific attack types using advanced ensemble methods. Evaluated on the realistic CIC-IoT-2023 dataset, C2T-IDS achieves a Macro F1-Score of up to 0.94 in detection and 0.80 in diagnosis, outperforming direct multi-class classification by 5–15%. With inference times as low as 6 milliseconds on edge devices, the framework demonstrates a practical balance between accuracy, efficiency, and deployability, offering a robust solution for securing resource-constrained IoT environments.
(This article belongs to the Special Issue IoT and Distributed Computing)
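The two-tier structure can be sketched as a pair of classifiers: a cheap binary screen at the edge, then a multi-class diagnoser trained only on attack traffic. The data here is synthetic, and Random Forest is one stand-in for the tree-based and ensemble models the paper evaluates:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Structural sketch of a two-tier IDS on synthetic flow features.
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 10))
is_attack = rng.random(1000) < 0.3
attack_type = np.where(is_attack, rng.integers(1, 5, 1000), 0)

# Tier 1 (edge): binary anomaly screen over all traffic.
tier1 = RandomForestClassifier(n_estimators=50, random_state=3).fit(X, is_attack)

# Tier 2 (central server): multi-class diagnosis, trained on attacks only.
mask = attack_type > 0
tier2 = RandomForestClassifier(n_estimators=200, random_state=3)
tier2.fit(X[mask], attack_type[mask])

flagged = tier1.predict(X)             # cheap edge-side screening
diagnosis = tier2.predict(X[flagged])  # detailed server-side labeling
print(flagged.sum(), "flows escalated to the central tier")
```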
25 pages, 774 KB  
Review
A Systematic Review for Ammonia Monitoring Systems Based on the Internet of Things
by Adriel Henrique Monte Claro da Silva, Mikaelle Karoline da Silva, Augusto Santos and Luis Arturo Gómez-Malagón
IoT 2025, 6(4), 66; https://doi.org/10.3390/iot6040066 - 30 Oct 2025
Viewed by 1340
Abstract
Ammonia is a gas primarily produced for use in agriculture, refrigeration systems, chemical manufacturing, and power generation. Despite its benefits, improper management of ammonia poses significant risks to human health and the environment. Consequently, monitoring ammonia is essential for enhancing industrial safety and preventing leaks that can lead to environmental contamination. Given the abundance and diversity of studies on Internet of Things (IoT) systems for gas detection, the main objective of this paper is to systematically review the literature to identify emerging research trends and opportunities. This review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, focusing on sensor technologies, microcontrollers, communication technologies, IoT platforms, and applications. The main findings indicate that most studies employed sensors from the MQ family (particularly the MQ-135 and MQ-137) and microcontrollers based on the Xtensa architecture (ESP32 and ESP8266) and ARM Cortex-A processors (Raspberry Pi 3B+/4), with Wi-Fi as the predominant communication technology, and Blynk and ThingSpeak as the primary cloud-based IoT platforms. The most frequent applications were agriculture and environmental monitoring. These findings highlight the growing maturity of IoT technologies in ammonia sensing, while also pointing to challenges such as sensor reliability, energy efficiency, and the development of integrated solutions with Artificial Intelligence.
28 pages, 2443 KB  
Article
Blockchain for Secure IoT: A Review of Identity Management, Access Control, and Trust Mechanisms
by Behnam Khayer, Siamak Mirzaei, Hooman Alavizadeh and Ahmad Salehi Shahraki
IoT 2025, 6(4), 65; https://doi.org/10.3390/iot6040065 - 28 Oct 2025
Cited by 1 | Viewed by 1855
Abstract
Blockchain technologies offer transformative potential in terms of addressing the security, trust, and identity management issues that exist in large-scale Internet of Things (IoT) deployments. This narrative review provides a comprehensive survey of various studies, focusing on decentralized identity management, trust mechanisms, smart contracts, privacy preservation, and real-world IoT applications. According to the literature, blockchain-based solutions provide robust authentication through mechanisms such as Physical Unclonable Functions (PUFs), enhance transparency via smart contract-enabled reputation systems, and significantly mitigate vulnerabilities, including single points of failure and Sybil attacks. Smart contracts enable secure interactions by automating resource allocation, access control, and verification. Cryptographic tools, including zero-knowledge proofs (ZKPs), proxy re-encryption, and Merkle trees, further improve data privacy and device integrity. Despite these advantages, challenges persist in areas such as scalability, regulatory and compliance issues, privacy and security concerns, resource constraints, and interoperability. By reviewing the current state-of-the-art literature, this review emphasizes the importance of establishing standardized protocols, performance benchmarks, and robust regulatory frameworks to achieve scalable and secure blockchain-integrated IoT solutions, and provides emerging trends and future research directions for the integration of blockchain technology into the IoT ecosystem.
(This article belongs to the Special Issue Blockchain-Based Trusted IoT)
20 pages, 4527 KB  
Article
Compost Monitoring System for Kitchen Waste Management: Development, Deployment and Analysis
by Sasirekha Gurla Venkata Kameswari, Arun Basavaraju, Chandrashekhar Siva Kumar and Jyotsna Bapat
IoT 2025, 6(4), 64; https://doi.org/10.3390/iot6040064 - 27 Oct 2025
Viewed by 1400
Abstract
Composting can be perceived as an art and a science of converting organic waste into a rich and nutritious soil amendment—compost. The existing literature discusses which parameters need to be monitored during composting and what actions should be taken to optimize the process. In this paper, the development, deployment and data analytics of a compost monitoring system are presented, considering not only the parameters to be measured but also the topology, mechanical design and battery operation details that are crucial for deploying the system. Because temperature plays an important role in the composting process, a contactless method of monitoring compost temperature using thermal imaging has been investigated. Screenshots of the successfully developed system, plots of the obtained data and the inferences drawn from them are presented. This work not only contributes to composting data, which is scarce, but also brings out the advantages of using thermal images in addition to temperature sensor probes.
30 pages, 1166 KB  
Article
Case-Based Data Quality Management for IoT Logs: A Case Study Focusing on Detection of Data Quality Issues
by Alexander Schultheis, Yannis Bertrand, Joscha Grüger, Lukas Malburg, Ralph Bergmann and Estefanía Serral Asensio
IoT 2025, 6(4), 63; https://doi.org/10.3390/iot6040063 - 23 Oct 2025
Viewed by 745
Abstract
Smart manufacturing applications increasingly rely on time-series data from Industrial IoT sensors, yet these data streams often contain data quality issues (DQIs) that affect analysis and disrupt production. While traditional Machine Learning methods are difficult to apply due to the small amount of data available, the knowledge-based approach of Case-Based Reasoning (CBR) offers a way to reuse previously gained experience. We introduce the first end-to-end CBR framework that both detects and remedies DQIs in near real time, even when only a handful of annotated fault instances are available. Our solution encodes expert experience in the four CBR knowledge containers: (i) a vocabulary that represents sensor streams and their context in the DataStream format; (ii) a case base populated with fault-annotated event logs; (iii) tailored similarity measures—including a weighted Dynamic Time Warping variant and structure-aware list mapping—that isolate the signatures of missing-value, missing-sensor, and time-shift errors; and (iv) lightweight adaptation rules that recommend concrete repair actions or, where appropriate, invoke automated imputation and alignment routines. A case study is used to examine and present the suitability of the approach for a specific application domain. Although the case study demonstrates only limited capabilities in identifying DQIs, we aim to support transparent evaluation and future research by publishing (1) a prototype of the CBR system and (2) a publicly accessible, meticulously annotated sensor-log benchmark. Together, these resources provide a reproducible baseline and a modular foundation for advancing similarity metrics, expanding the DQI taxonomy, and enabling knowledge-intensive reasoning in IoT data quality management.
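Of the tailored similarity measures listed, the weighted Dynamic Time Warping variant is the most reusable. A minimal one-dimensional sketch of weighted DTW (not the authors' exact formulation):

```python
import numpy as np

# Minimal weighted DTW distance between two sensor streams, in the spirit of
# the similarity measures described above; not the authors' exact variant.
def weighted_dtw(a, b, w=1.0):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = w * abs(a[i - 1] - b[j - 1])   # weighted local distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

clean = [1.0, 1.2, 1.1, 1.3, 1.2]
shifted = [0.0, 1.0, 1.2, 1.1, 1.3]               # time-shifted variant
print(weighted_dtw(clean, shifted))               # stays small: shapes align
```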
20 pages, 448 KB  
Article
Toward Scalable and Sustainable Detection Systems: A Behavioural Taxonomy and Utility-Based Framework for Security Detection in IoT and IIoT
by Ali Jaddoa, Hasanein Alharbi, Abbas Hommadi and Hussein A. Ismael
IoT 2025, 6(4), 62; https://doi.org/10.3390/iot6040062 - 21 Oct 2025
Viewed by 650
Abstract
Resource-constrained IoT and IIoT systems require detection architectures that balance accuracy with energy efficiency, scalability, and contextual awareness. This paper presents a conceptual framework informed by a systematic review of energy-aware detection systems (XDS), unifying intrusion and anomaly detection systems (IDS and ADS) within a single framework. The proposed taxonomy captures six key dimensions: energy-awareness, adaptivity, modularity, offloading support, domain scope, and attack coverage. Applying this framework to the recent literature reveals recurring limitations, including static architectures, limited runtime coordination, and narrow evaluation settings. To address these challenges, we introduce a utility-based decision model for multi-layer task placement, guided by operational metrics such as energy cost, latency, and detection complexity. Unlike review-only studies, this work contributes both a synthesis of current limitations and the design of a novel six-dimensional taxonomy and utility-based layered architecture. The study concludes with future directions that support the development of adaptable, sustainable, and context-aware XDS architectures for heterogeneous environments.
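The utility-based placement idea can be illustrated with a toy scorer that weighs energy and latency costs per layer and rejects layers that cannot host the task. All figures and weights below are invented for illustration:

```python
# Toy utility-based placement: score each layer, reject infeasible ones,
# place the detection task at the highest-utility layer. Numbers invented.
LAYERS = {
    #          (energy mJ, latency ms, capacity)
    "device": (5.0, 1.0, 0.2),
    "edge":   (2.0, 8.0, 0.6),
    "cloud":  (0.5, 60.0, 1.0),
}

def utility(layer, task_complexity, w_energy=0.5, w_latency=0.5):
    energy, latency, capacity = LAYERS[layer]
    if task_complexity > capacity:          # layer cannot host this task
        return float("-inf")
    # Normalize each cost against the worst layer, then negate it.
    return -(w_energy * energy / 5.0 + w_latency * latency / 60.0)

task = 0.5                                  # normalized detection complexity
best = max(LAYERS, key=lambda layer: utility(layer, task))
print(best)  # "edge": the device can't fit it, the cloud pays too much latency
```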
29 pages, 1829 KB  
Review
A Comprehensive Review of Cybersecurity Threats to Wireless Infocommunications in the Quantum-Age Cryptography
by Ivan Laktionov, Grygorii Diachenko, Dmytro Moroz and Iryna Getman
IoT 2025, 6(4), 61; https://doi.org/10.3390/iot6040061 - 16 Oct 2025
Viewed by 1943
Abstract
The dynamic growth in the dependence of numerous industrial sectors, businesses, and critical infrastructure on infocommunication technologies necessitates the enhancement of their resilience to cyberattacks and radio-frequency threats. This article addresses a relevant scientific and applied issue: formulating prospective directions for improving the effectiveness of cybersecurity approaches for infocommunication networks through a comparative analysis and logical synthesis of state-of-the-art applied research on cyber threats to the information security of mobile and satellite networks, including threats related to the rapid development of quantum computing technologies. The article presents results on the systematisation of cyberattacks at the physical, signalling and cryptographic levels, as well as threats to cryptographic protocols and authentication systems. Particular attention is given to the prospects for implementing post-quantum cryptography, hybrid cryptographic models and the integration of threat detection mechanisms based on machine learning and artificial intelligence algorithms. The article proposes a classification of current threats according to architectural levels, analyses typical protocol vulnerabilities in next-generation mobile networks and satellite communications, and identifies key research gaps in existing cybersecurity approaches. Based on a critical analysis of the scientific and applied literature, the article identifies key areas for future research: developing lightweight cryptographic algorithms, standardising post-quantum cryptographic models, creating adaptive cybersecurity frameworks and optimising protection mechanisms for resource-constrained devices within information and digital networks.
(This article belongs to the Special Issue Cybersecurity in the Age of the Internet of Things)
30 pages, 782 KB  
Article
BiLSTM-Based Fault Anticipation for Predictive Activation of FRER in Time-Sensitive Industrial Networks
by Mohamed Seliem, Utz Roedig, Cormac Sreenan and Dirk Pesch
IoT 2025, 6(4), 60; https://doi.org/10.3390/iot6040060 - 2 Oct 2025
Viewed by 864
Abstract
Frame Replication and Elimination for Reliability (FRER) in Time-Sensitive Networking (TSN) enhances fault tolerance by duplicating critical traffic across disjoint paths. However, always-on FRER configurations introduce persistent redundancy overhead, even under nominal network conditions. This paper proposes a predictive FRER activation framework that anticipates faults using a Key Performance Indicator (KPI)-driven bidirectional Long Short-Term Memory (BiLSTM) model. By continuously analyzing multivariate KPIs—such as latency, jitter, and retransmission rates—the model forecasts potential faults and proactively activates FRER. Redundancy is deactivated upon KPI recovery or after a defined minimum protection window, thereby reducing bandwidth usage without compromising reliability. The framework includes a Python-based simulation environment, a real-time visualization dashboard built with Streamlit, and a fully integrated runtime controller. The experimental results demonstrate substantial improvements in link utilization while preserving fault protection, highlighting the effectiveness of anticipatory redundancy strategies in industrial TSN environments.
(This article belongs to the Special Issue AIoT-Enabled Sustainable Smart Manufacturing)
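A shape-level PyTorch sketch of the KPI-driven BiLSTM predictor follows; the dimensions, window length, and 0.8 activation threshold are illustrative, not the paper's configuration:

```python
import torch
import torch.nn as nn

# Shape-level sketch of a KPI-driven BiLSTM fault anticipator: a window of
# multivariate KPIs (latency, jitter, retransmission rate) in, a fault
# probability out. All dimensions and the threshold are illustrative.
class FaultAnticipator(nn.Module):
    def __init__(self, n_kpis=3, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_kpis, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)  # 2x: forward + backward states

    def forward(self, x):                     # x: (batch, window, n_kpis)
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out[:, -1]))  # last-step fault prob.

model = FaultAnticipator()
kpi_window = torch.randn(1, 50, 3)            # 50 samples of 3 KPIs
if model(kpi_window).item() > 0.8:            # predicted fault: pre-arm FRER
    print("activate FRER replication")
```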
20 pages, 5553 KB  
Article
Transmit Power Optimization for Intelligent Reflecting Surface-Assisted Coal Mine Wireless Communication Systems
by Yang Liu, Xiaoyue Li, Bin Wang and Yanhong Xu
IoT 2025, 6(4), 59; https://doi.org/10.3390/iot6040059 - 25 Sep 2025
Viewed by 560
Abstract
The adverse propagation environment in underground coal mine tunnels caused by enclosed spaces, rough surfaces, and dense scatterers severely degrades reliable wireless signal transmission, which further impedes the deployment of IoT applications such as gas monitors and personnel positioning terminals. However, conventional power enhancement solutions are infeasible in the underground coal mine scenario due to strict explosion-proof safety regulations and battery-powered IoT devices. To address this challenge, we propose singular value decomposition-based Lagrangian optimization (SVD-LOP) to minimize transmit power at the mining base station (MBS) for IRS-assisted coal mine wireless communication systems. In particular, we first establish a three-dimensional twin cluster geometry-based stochastic model (3D-TCGBSM) to accurately characterize the underground coal mine channel. On this basis, we formulate the MBS transmit power minimization problem constrained by the user signal-to-noise ratio (SNR) target and IRS phase shifts. To solve this non-convex problem, we propose the SVD-LOP algorithm, which performs SVD on the channel matrix to decouple the complex channel coupling and introduces Lagrange multipliers. Furthermore, we develop a low-complexity successive convex approximation (LC-SCA) algorithm to reduce computational complexity, which constructs a convex approximation of the objective function based on a first-order Taylor expansion and enables suboptimal solutions. Simulation results demonstrate that the proposed SVD-LOP and LC-SCA algorithms achieve transmit power peaks of 20.8 dBm and 21.4 dBm, respectively, slightly lower than the 21.8 dBm observed for the SDR algorithm; all remain well below the explosion-proof safety threshold while achieving significant power reduction. Computational complexity analysis reveals that the proposed SVD-LOP and LC-SCA algorithms achieve O(N³) and O(N²), respectively, offering substantial reductions compared to the SDR algorithm’s O(N⁷). Moreover, both proposed algorithms exhibit robust convergence across varying user SNR targets while maintaining stable performance gains under different tunnel roughness scenarios.
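The SVD step at the heart of SVD-LOP decouples the channel into parallel subchannels, after which the power needed to meet an SNR target follows in closed form on each. A NumPy sketch of that decoupling alone, on a random channel; this omits the IRS phase-shift optimization and the Lagrangian machinery:

```python
import numpy as np

# SVD decoupling sketch: decompose the channel, then compute the minimum
# power to hit an SNR target on the strongest subchannel. Figures invented.
rng = np.random.default_rng(4)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)

U, s, Vh = np.linalg.svd(H)          # H = U @ diag(s) @ Vh
snr_target = 10 ** (10 / 10)         # 10 dB target, linear scale
noise_power = 1e-3                   # watts, illustrative

# On the best subchannel: SNR = p * s_max^2 / noise  =>  p = SNR*noise/s_max^2
p_min = snr_target * noise_power / s[0] ** 2
print(f"min transmit power: {10 * np.log10(p_min * 1000):.1f} dBm")
```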
21 pages, 6984 KB  
Article
Acoustic Trap Design for Biodiversity Detection
by Chingiz Seyidbayli, Bárbara Fengler, Daniel Szafranski and Andreas Reinhardt
IoT 2025, 6(4), 58; https://doi.org/10.3390/iot6040058 - 24 Sep 2025
Viewed by 1533
Abstract
Real-time insect monitoring is essential for sustainable agriculture and biodiversity conservation. The traditional method of attracting insects to colored glue traps and manually counting the catch is time-intensive and requires specialized taxonomic expertise. Moreover, these traps are often lethal to pests and beneficial insects alike, raising both ecological and ethical concerns. Camera-based trap designs have recently emerged to lower the amount of manual labor involved in determining insect species, yet they are still deadly to the catch. This study presents the design and evaluation of a non-lethal acoustic monitoring system capable of detecting and classifying insect species based on their sound signatures. A first prototype was developed with a focus on low self-noise and suitability for autonomous field deployment. The system was initially validated through laboratory experiments, and subsequently tested in six rapeseed fields over a 25-day period. More than 3400 h of acoustic data were successfully collected without system failures. Key findings highlight the importance of carefully selecting each component to minimize self-noise, as insect sounds are extremely low in amplitude. The results also underscore the need for efficient data and energy management strategies in long-term field deployments. This paper aims to share the development process, design decisions, technical challenges, and practical lessons learned over the course of building our IoT sensor system. By outlining what worked, what did not, and what should be improved, this work contributes to the advancement of non-invasive insect monitoring technologies.
18 pages, 2920 KB  
Article
UniTwin: Enabling Multi-Digital Twin Coordination for Modeling Distributed and Complex Systems
by Tim Markus Häußermann, Joel Lehmann, Florian Kolb, Alessa Rache and Julian Reichwald
IoT 2025, 6(4), 57; https://doi.org/10.3390/iot6040057 - 23 Sep 2025
Viewed by 856
Abstract
The growing complexity and scale of Cyber–Physical Systems (CPSs) have led to an increasing need for the holistic orchestration of multiple Digital Twins (DTs). Therefore, an extension to the UniTwin framework is introduced within this paper. UniTwin is a containerized, cloud-native DT framework. This extension enables the hierarchical aggregation of DTs across various abstraction levels. Traditional DT frameworks often lack mechanisms for dynamic composition at the level of entire systems. This is essential for modeling distributed systems in heterogeneous environments. UniTwin addresses this gap by grouping DTs into composite entities with an aggregation mechanism. The aggregation mechanism is demonstrated in a smart manufacturing case study, which covers the orchestration of a production line for personalized shopping cart chips. It uses modular DTs provided for each device within the production line. A System-Aggregated Digital Twin (S-ADT) is used to orchestrate the individual DTs, mapping the devices in the production line. Therefore, the production line adapts and reconfigures according to user-defined parameters. This validates the flexibility and practicality of the aggregation mechanism. This work contributes an aggregation mechanism for the UniTwin framework, paving the way for adaptable DTs for complex CPSs in domains like smart manufacturing, logistics, and infrastructure.