Search Results (135)

Search Parameters:
Keywords = IIoT architectures

10 pages, 637 KiB  
Proceeding Paper
Improving Industrial Control System Cybersecurity with Time-Series Prediction Models
by Velizar Varbanov and Tatiana Atanasova
Eng. Proc. 2025, 101(1), 4; https://doi.org/10.3390/engproc2025101004 - 22 Jul 2025
Viewed by 88
Abstract
Traditional security detection methods struggle to identify zero-day attacks in Industrial Control Systems (ICSs), particularly within critical infrastructures (CIs) integrated with the Industrial Internet of Things (IIoT). These attacks exploit unknown vulnerabilities, leveraging the complexity of physical and digital system interconnections, making them difficult to detect. The integration of legacy ICS networks with modern computing and networking technologies has expanded the attack surface, increasing susceptibility to cyber threats. Anomaly detection systems play a crucial role in safeguarding these infrastructures by identifying deviations from normal operations. This study investigates the effectiveness of deep learning-based anomaly detection models in revealing operational anomalies that could indicate potential cyber-attacks. We implemented and evaluated a hybrid deep learning architecture combining Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks to analyze ICS telemetry data. The CNN-LSTM model excels in identifying time-dependent anomalies and enables near real-time detection of cyber-attacks, significantly improving security monitoring capabilities for IIoT-integrated critical infrastructures.
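The abstract names the architecture but gives no code; a minimal PyTorch sketch of a CNN-LSTM window detector of this kind follows, with layer sizes, window length, and feature count as assumptions rather than the paper's configuration.

```python
# Minimal sketch of a CNN-LSTM anomaly detector for ICS telemetry windows.
# Layer sizes, window length, and feature count are illustrative assumptions.
import torch
import torch.nn as nn

class CNNLSTMDetector(nn.Module):
    def __init__(self, n_features=16, hidden=64):
        super().__init__()
        # 1-D convolution extracts local patterns within each time window
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models longer-range temporal dependencies
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)          # anomaly score per window

    def forward(self, x):                         # x: (batch, time, features)
        z = self.conv(x.transpose(1, 2))          # -> (batch, channels, time/2)
        out, _ = self.lstm(z.transpose(1, 2))
        return torch.sigmoid(self.head(out[:, -1]))  # score in [0, 1]

scores = CNNLSTMDetector()(torch.randn(8, 50, 16))   # 8 windows, 50 steps
```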

28 pages, 1293 KiB  
Article
A Lightweight Double-Deep Q-Network for Energy Efficiency Optimization of Industrial IoT Devices in Thermal Power Plants
by Shuang Gao, Yuntao Zou and Li Feng
Electronics 2025, 14(13), 2569; https://doi.org/10.3390/electronics14132569 - 25 Jun 2025
Viewed by 330
Abstract
Industrial Internet of Things (IIoT) deployments in thermal power plants face significant energy efficiency challenges due to harsh operating conditions and device resource constraints. This paper presents gradient memory double-deep Q-network (GM-DDQN), a lightweight reinforcement learning approach for energy optimization on resource-constrained IIoT devices. At its core, GM-DDQN introduces the gradient memory mechanism, a novel memory-efficient alternative to experience replay. This core innovation, combined with a simplified neural network architecture and efficient parameter quantization, collectively reduces memory requirements by 99% and computation time by 85–90% compared to standard methods. Experimental evaluations across three realistic simulated thermal power plant scenarios demonstrate that GM-DDQN improves energy efficiency by 42% compared to fixed policies and 27% compared to threshold-based approaches, extending battery lifetime from 8–9 months to 14–15 months while maintaining 96–97% PSR. The method enables sophisticated reinforcement learning directly on IIoT edge devices without requiring cloud connectivity, reducing maintenance costs and improving monitoring reliability in industrial environments.
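The gradient memory mechanism is the paper's contribution and is not specified in the abstract; the sketch below shows only the standard double-DQN target computation that GM-DDQN builds on.

```python
# Standard double-DQN target: the online network selects the next action,
# the target network evaluates it, which reduces Q-value overestimation.
# The paper's gradient memory replacement for experience replay is not
# described in the abstract and is therefore not sketched here.
import torch

def double_dqn_target(online_net, target_net, reward, next_state, done,
                      gamma=0.99):
    with torch.no_grad():
        best_action = online_net(next_state).argmax(dim=1, keepdim=True)
        next_q = target_net(next_state).gather(1, best_action).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q
```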

34 pages, 2216 KiB  
Article
An Optimized Transformer–GAN–AE for Intrusion Detection in Edge and IIoT Systems: Experimental Insights from WUSTL-IIoT-2021, EdgeIIoTset, and TON_IoT Datasets
by Ahmad Salehiyan, Pardis Sadatian Moghaddam and Masoud Kaveh
Future Internet 2025, 17(7), 279; https://doi.org/10.3390/fi17070279 - 24 Jun 2025
Viewed by 396
Abstract
The rapid expansion of Edge and Industrial Internet of Things (IIoT) systems has intensified the risk and complexity of cyberattacks. Detecting advanced intrusions in these heterogeneous and high-dimensional environments remains challenging. As the IIoT becomes integral to critical infrastructure, ensuring security is crucial to prevent disruptions and data breaches. Traditional IDS approaches often fall short against evolving threats, highlighting the need for intelligent and adaptive solutions. While deep learning (DL) offers strong capabilities for pattern recognition, single-model architectures often lack robustness. Thus, hybrid and optimized DL models are increasingly necessary to improve detection performance and address data imbalance and noise. In this study, we propose an optimized hybrid DL framework that combines a transformer, generative adversarial network (GAN), and autoencoder (AE) components, referred to as Transformer–GAN–AE, for robust intrusion detection in Edge and IIoT environments. To enhance the training and convergence of the GAN component, we integrate an improved chimp optimization algorithm (IChOA) for hyperparameter tuning and feature refinement. The proposed method is evaluated using three recent and comprehensive benchmark datasets, WUSTL-IIoT-2021, EdgeIIoTset, and TON_IoT, widely recognized as standard testbeds for IIoT intrusion detection research. Extensive experiments are conducted to assess the model's performance compared to several state-of-the-art techniques, including standard GAN, convolutional neural network (CNN), deep belief network (DBN), time-series transformer (TST), bidirectional encoder representations from transformers (BERT), and extreme gradient boosting (XGBoost). Evaluation metrics include accuracy, recall, AUC, and run time. Results demonstrate that the proposed Transformer–GAN–AE framework outperforms all baseline methods, achieving a best accuracy of 98.92%, along with superior recall and AUC values. The integration of IChOA enhances GAN stability and accelerates training by optimizing hyperparameters. Together with the transformer for temporal feature extraction and the AE for denoising, the hybrid architecture effectively addresses complex, imbalanced intrusion data. The proposed optimized Transformer–GAN–AE model demonstrates high accuracy and robustness, offering a scalable solution for real-world Edge and IIoT intrusion detection.
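Of the three components, the AE's denoising role is the easiest to illustrate from the abstract alone; a minimal sketch follows, with dimensions as assumptions. The transformer and GAN branches and the IChOA tuner go beyond what the abstract specifies and are omitted.

```python
# Minimal denoising autoencoder of the kind used as the AE component;
# layer widths and noise level are assumptions. The transformer, GAN,
# and IChOA hyperparameter search are omitted.
import torch
import torch.nn as nn

class DenoisingAE(nn.Module):
    def __init__(self, n_features=40, bottleneck=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                 nn.Linear(32, bottleneck))
        self.dec = nn.Sequential(nn.Linear(bottleneck, 32), nn.ReLU(),
                                 nn.Linear(32, n_features))

    def forward(self, x):
        # Corrupt the input; reconstructing the clean signal teaches
        # the network to strip noise from intrusion features.
        noisy = x + 0.1 * torch.randn_like(x)
        return self.dec(self.enc(noisy))

model = DenoisingAE()
x = torch.randn(64, 40)
loss = nn.functional.mse_loss(model(x), x)   # reconstruction objective
```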

22 pages, 2535 KiB  
Article
Research on a Secure and Reliable Runtime Patching Method for Cyber–Physical Systems and Internet of Things Devices
by Zesheng Xi, Bo Zhang, Aniruddha Bhattacharjya, Yunfan Wang and Chuan He
Symmetry 2025, 17(7), 983; https://doi.org/10.3390/sym17070983 - 21 Jun 2025
Viewed by 357
Abstract
Recent advances in technologies such as blockchain, the Internet of Things (IoT), Cyber–Physical Systems (CPSs), and the Industrial Internet of Things (IIoT) have driven the digitalization and intelligent transformation of modern industries. However, embedded control devices within power system communication infrastructures have become increasingly susceptible to cyber threats due to escalating software complexity and extensive network exposure. Conventional patching techniques, both static and dynamic, often fail to satisfy the stringent requirements of real-time responsiveness and computational efficiency in the resource-constrained environments of power grids. To address this limitation, we propose a hardware-assisted runtime patching framework tailored for embedded systems in critical power system networks. Our method integrates binary-level vulnerability modeling, execution-trace-driven fault localization, and lightweight patch synthesis, enabling dynamic, in-place code redirection without disrupting ongoing operations. By constructing a system-level instruction flow model, the framework leverages on-chip debug registers to deploy patches at runtime, ensuring minimal operational impact. Experimental evaluations within a simulated substation communication architecture show that the proposed approach reduces patch latency by 92% relative to static techniques while incurring less than 3% CPU overhead. This work offers a scalable, real-time, model-driven defense strategy that enhances the cyber–physical resilience of embedded systems in modern power systems, contributing new insights into the intersection of runtime security and grid infrastructure reliability.
(This article belongs to the Section Computer)
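The paper's mechanism operates on binary code via on-chip debug registers; as a high-level analogy only, the sketch below shows in-place redirection through a dispatch table, so a patched handler replaces a vulnerable one without stopping the process.

```python
# High-level analogy of runtime in-place redirection: calls route through
# a dispatch table, so a hot patch swaps the handler atomically while the
# system keeps running. The paper's actual mechanism (debug registers
# redirecting binary code) operates far below this level.
dispatch = {}

def register(name):
    def wrap(fn):
        dispatch[name] = fn
        return fn
    return wrap

@register("parse_frame")
def parse_frame_v1(frame: bytes) -> bytes:
    return frame[:64]                 # vulnerable: silently truncates

def hot_patch(name, fn):
    dispatch[name] = fn               # atomic swap, no downtime

def parse_frame_v2(frame: bytes) -> bytes:
    if len(frame) > 64:
        raise ValueError("oversized frame rejected")
    return frame

hot_patch("parse_frame", parse_frame_v2)          # applied at runtime
assert dispatch["parse_frame"](b"\x01" * 64) == b"\x01" * 64
```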

27 pages, 4717 KiB  
Article
Enhancing Bidirectional Modbus TCP ↔ RTU Gateway Performance: A UDP Mechanism and Markov Chain Approach
by Shuang Zhao, Qinghai Zhang, Qingjian Zhao, Xiaoqian Zhang, Yang Guo, Shilei Lu, Liqiang Song and Zhengxu Zhao
Sensors 2025, 25(13), 3861; https://doi.org/10.3390/s25133861 - 21 Jun 2025
Viewed by 1040
Abstract
In the Industrial Internet of Things (IIoT) field, the diversity of devices and protocols leads to interconnection challenges. Conventional Modbus Transmission Control Protocol (TCP) to Remote Terminal Unit (RTU) gateways suffer from the high overhead and latency of the TCP protocol stack. To enhance real-time communication while ensuring reliability, this study applies Markov chain theory to analyze User Datagram Protocol (UDP) transmission characteristics. An Advanced UDP (AUDP) protocol is proposed by integrating a Cyclic Redundancy Check (CRC) mechanism, retransmission mechanism, Transaction ID matching mechanism, and exponential backoff mechanism at the UDP application layer. Based on AUDP, a Modbus AUDP-RTU gateway is designed with a lightweight architecture to achieve bidirectional conversion between Modbus AUDP and Modbus RTU. Experimental validation and Markov chain-based modeling demonstrate that the proposed gateway significantly reduces communication latency compared to Modbus TCP-RTU and exhibits higher reliability than Modbus UDP-RTU.
(This article belongs to the Section Internet of Things)
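A client-side sketch of the four AUDP mechanisms the abstract lists (CRC check, Transaction ID matching, retransmission, exponential backoff) is given below; the frame layout, timeout values, and function names are assumptions, not the paper's specification.

```python
# Sketch of AUDP client-side reliability over plain UDP. Frame layout
# (big-endian transaction ID + payload + little-endian CRC-16/Modbus)
# and the timeout schedule are illustrative assumptions.
import socket, struct

def crc16_modbus(data: bytes) -> int:
    crc = 0xFFFF
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def audp_request(sock, addr, tid: int, payload: bytes, retries=3):
    pdu = struct.pack(">H", tid) + payload
    frame = pdu + struct.pack("<H", crc16_modbus(pdu))
    timeout = 0.05                        # initial timeout (assumed)
    for _ in range(retries):
        sock.sendto(frame, addr)
        sock.settimeout(timeout)
        try:
            resp, _ = sock.recvfrom(512)
        except socket.timeout:
            timeout *= 2                  # exponential backoff
            continue
        body, crc = resp[:-2], struct.unpack("<H", resp[-2:])[0]
        # Accept only an uncorrupted reply to this exact transaction
        if crc == crc16_modbus(body) and body[:2] == struct.pack(">H", tid):
            return body[2:]
    raise TimeoutError("no valid reply after retries")
```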

26 pages, 1636 KiB  
Article
Blockchain Solutions for Enhancing Security and Privacy in Industrial IoT
by Meryam Essaid and Hongtaek Ju
Appl. Sci. 2025, 15(12), 6835; https://doi.org/10.3390/app15126835 - 17 Jun 2025
Viewed by 593
Abstract
The Industrial Internet of Things (IIoT) has revolutionized smart manufacturing by enhancing automation, operational efficiency, and data-driven decision making. However, the interconnected nature of IIoT devices raises significant concerns about security and system integrity. This paper examines the application of blockchain technology to address these challenges, with a focus on data integrity, access control, and traceability. This paper proposes a blockchain-based framework that leverages decentralized security, smart contracts, and edge computing to mitigate vulnerabilities, including unauthorized access and data manipulation. The framework is evaluated for practicality, scalability, and constraints within IIoT environments. Additionally, this paper discusses the integration of complementary security mechanisms, such as Zero Trust architecture and AI-driven anomaly detection, to provide a comprehensive cybersecurity solution for the Industrial Internet of Things (IIoT).
(This article belongs to the Special Issue Advanced Blockchain Technology for the Internet of Things)
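The framework itself is not reproduced here; the sketch below illustrates only the tamper-evidence property that hash chaining gives an IIoT data log, the building block beneath any blockchain-based integrity scheme. Consensus, smart contracts, and edge integration are out of scope.

```python
# Minimal hash-chained log: each block commits to its predecessor, so
# editing any past record invalidates every later hash. This is the
# integrity primitive only, not the paper's full framework.
import hashlib, json, time

def new_block(records, prev_hash):
    block = {"ts": time.time(), "records": records, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, cur in zip(chain, chain[1:]):
        body = {k: cur[k] for k in ("ts", "records", "prev")}
        if cur["prev"] != prev["hash"] or cur["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
    return True

chain = [new_block(["genesis"], "0")]
chain.append(new_block([{"sensor": "pump-7", "temp": 81.2}],
                       chain[-1]["hash"]))
assert verify(chain)
```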

20 pages, 1300 KiB  
Article
QPUF: Quantum Physical Unclonable Functions for Security-by-Design of Industrial Internet-of-Things
by Venkata K. V. V. Bathalapalli, Saraju P. Mohanty, Chenyun Pan and Elias Kougianos
Cryptography 2025, 9(2), 34; https://doi.org/10.3390/cryptography9020034 - 27 May 2025
Viewed by 1176
Abstract
This research investigates the integration of quantum hardware-assisted security into critical applications, including the Industrial Internet-of-Things (IIoT), Smart Grid, and Smart Transportation. The Quantum Physical Unclonable Functions (QPUF) architecture has emerged as a robust security paradigm, harnessing the inherent randomness of quantum hardware to generate unique and tamper-resistant cryptographic fingerprints. This work explores the potential of Quantum Computing for Security-by-Design (SbD) in the Industrial Internet-of-Things (IIoT), aiming to establish security as a fundamental and inherent feature. SbD in Quantum Computing focuses on ensuring the security and privacy of quantum computing applications by leveraging the fundamental principles of quantum mechanics, which underpin the quantum computing infrastructure. This research presents a scalable and sustainable security framework for the trusted attestation of smart industrial entities in Quantum Industrial Internet-of-Things (QIoT) applications within Industry 4.0. Central to this approach is the QPUF, which leverages quantum mechanical principles to generate unique, tamper-resistant fingerprints. The proposed QPUF circuit logic has been deployed on IBM quantum systems and simulators for validation. The experimental results demonstrate the enhanced randomness and an intra-Hamming distance of approximately 50% on the IBM quantum hardware, along with improved reliability despite varying error rates, coherence, and decoherence times. Furthermore, the circuit achieved 100% reliability on Google's Cirq simulator and 95% reliability on IBM's quantum simulator, highlighting the QPUF's potential in advancing quantum-centric security solutions.
(This article belongs to the Special Issue Emerging Topics in Hardware Security)
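The paper's QPUF circuit logic is not given in the abstract; the sketch below is a generic illustration of the evaluation flow only (measure a small circuit repeatedly, compare responses by Hamming distance), assuming qiskit and qiskit-aer are installed.

```python
# Generic QPUF-style evaluation flow: run a small circuit, take the
# majority measurement outcome as the response, compare responses by
# Hamming distance. The all-Hadamard circuit is a stand-in, not the
# paper's circuit logic.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

n = 4
qc = QuantumCircuit(n)
qc.h(range(n))            # superposition: outcomes depend on hardware noise
qc.measure_all()

sim = AerSimulator()

def response(shots=256):
    counts = sim.run(qc, shots=shots).result().get_counts()
    return max(counts, key=counts.get)        # majority outcome

r1, r2 = response(), response()
intra_hd = sum(a != b for a, b in zip(r1, r2)) / n
print(f"{r1} vs {r2}: intra-Hamming distance {intra_hd:.2f}")
```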

25 pages, 11142 KiB  
Article
Enhanced Heat-Powered Batteryless IIoT Architecture with NB-IoT for Predictive Maintenance in the Oil and Gas Industry
by Raúl Aragonés, Joan Oliver and Carles Ferrer
Sensors 2025, 25(8), 2590; https://doi.org/10.3390/s25082590 - 19 Apr 2025
Cited by 2 | Viewed by 584
Abstract
The carbon footprint associated with human activity, particularly from energy-intensive industries such as iron and steel, aluminium, cement, oil and gas, and petrochemicals, contributes significantly to global warming. These industries face unique challenges in achieving Industry 4.0 goals due to the widespread adoption of industrial Internet of Things (IIoT) technologies, which require reliable and efficient power solutions. Conventional wireless devices powered by lithium batteries have limitations, including a reduced lifespan in high-temperature environments, incompatibility with explosive atmospheres, and high maintenance costs. This paper proposes a novel approach to address these challenges by leveraging residual heat to power IIoT devices, eliminating the need for batteries and enabling autonomous operation. Based on the Seebeck effect, thermoelectric energy harvesters transduce waste heat from industrial surfaces, such as pipes or chimneys, into sufficient electrical energy to power IoT nodes for applications like the condition monitoring and predictive maintenance of rotating machinery. The methodology presented standardises the modelling and simulation of IoT Waste Heat Recovery Systems (IoT-WHRSs), demonstrating their feasibility through statistical analysis of IoT-WHRS architectures. Furthermore, this technology has been successfully implemented in a petroleum refinery, where it benefits from the NB-IoT standard for long-range, robust, and secure communications, ensuring reliable data transmission in harsh industrial environments. The results highlight the potential of this solution to reduce costs, improve safety, and enhance efficiency in demanding industrial applications, making it a valuable tool for the energy transition.
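The harvested power budget follows from the Seebeck relation; a back-of-envelope estimate is sketched below, with every number an illustrative assumption rather than a measurement from the paper.

```python
# Back-of-envelope harvest estimate: open-circuit voltage V = S * dT,
# maximum power into a matched load P = V^2 / (4 * R). All values are
# illustrative assumptions.
S_module = 0.05   # effective Seebeck coefficient of a TEG module [V/K]
dT       = 40.0   # temperature gradient across the module [K]
R_int    = 2.0    # module internal resistance [ohm]

V_oc  = S_module * dT            # 2.0 V open-circuit
P_max = V_oc**2 / (4 * R_int)    # 0.5 W at matched load
print(f"open-circuit {V_oc:.2f} V, matched-load {P_max*1000:.0f} mW")
```

At this scale, even a modest pipe-surface gradient comfortably exceeds the duty-cycled power draw of an NB-IoT sensing node, which is the feasibility argument the abstract makes.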

23 pages, 3925 KiB  
Article
An Edge-Computing-Based Integrated Framework for Network Traffic Analysis and Intrusion Detection to Enhance Cyber–Physical System Security in Industrial IoT
by Tamara Zhukabayeva, Zulfiqar Ahmad, Aigul Adamova, Nurdaulet Karabayev and Assel Abdildayeva
Sensors 2025, 25(8), 2395; https://doi.org/10.3390/s25082395 - 10 Apr 2025
Viewed by 1269
Abstract
Industrial Internet of Things (IIoT) environments need to implement reliable security measures because of the growth in network traffic and overall connectivity. Accordingly, this work provides an architecture for network traffic analysis and intrusion detection with the help of edge computing and machine-learning methods. The study uses k-means and DBSCAN techniques to examine the flow of traffic in a network and to discover several groups of behavior and possible anomalies. An assessment of the two clustering methods shows that k-means achieves a silhouette score of 0.612, while DBSCAN achieves 0.473. For intrusion detection, k-nearest neighbors (KNN), random forest (RF), and logistic regression (LR) were used and evaluated. The analysis revealed that both KNN and RF yielded near-perfect precision, recall, and F1 scores, close to the maximum possible value of 1.00, as demonstrated by both ROC and precision–recall curves. Confusion matrices show that RF had better precision and recall for both benign and attack traffic, while KNN and LR achieved good detection with slight fluctuations. With the integration of edge computing, the framework gains real-time data processing and thus lower security-system latency. This work enriches knowledge of the IIoT by offering a detailed solution to the issue of cybersecurity in IoT systems, based on well-grounded performance assessments and appropriate use of current technologies. The results thus support the effectiveness of the proposed framework in improving security, providing tangible improvements over current approaches by identifying potential threats within a network.
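A reproducible approximation of the clustering comparison is sketched below on synthetic data; the paper's traffic features and parameter choices are not reproduced, so the scores will differ.

```python
# Silhouette comparison of k-means vs. DBSCAN on synthetic flow features;
# the dataset and the eps / n_clusters settings are stand-ins.
from sklearn.cluster import KMeans, DBSCAN
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

X, _ = make_blobs(n_samples=1000, centers=4, random_state=0)
X = StandardScaler().fit_transform(X)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
db = DBSCAN(eps=0.3, min_samples=10).fit_predict(X)

print("k-means silhouette:", silhouette_score(X, km))
if len(set(db)) > 1:                 # silhouette needs >= 2 clusters
    print("DBSCAN silhouette:", silhouette_score(X, db))
```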

17 pages, 2587 KiB  
Article
A Cyber Manufacturing IoT System for Adaptive Machine Learning Model Deployment by Interactive Causality-Enabled Self-Labeling
by Yutian Ren, Yuqi He, Xuyin Zhang, Aaron Yen and Guann-Pyng Li
Machines 2025, 13(4), 304; https://doi.org/10.3390/machines13040304 - 8 Apr 2025
Viewed by 568
Abstract
Machine learning (ML) has been demonstrated to improve productivity in many manufacturing applications. To host these ML applications, several software and Industrial Internet of Things (IIoT) systems have been proposed for manufacturing applications to deploy ML applications and provide real-time intelligence. Recently, an interactive causality-enabled self-labeling method has been proposed to advance adaptive ML applications in cyber–physical systems, especially manufacturing, by automatically adapting and personalizing ML models after deployment to counter data distribution shifts. The unique features of the self-labeling method require a novel software system to support dynamism at various levels. This paper proposes the AdaptIoT system, comprising an end-to-end data streaming pipeline, ML service integration, and an automated self-labeling service. The self-labeling service consists of causal knowledge bases and automated full-cycle self-labeling workflows to adapt multiple ML models simultaneously. AdaptIoT employs a containerized microservice architecture to deliver a scalable and portable solution for small and medium-sized manufacturers. A field demonstration of a self-labeling adaptive ML application is conducted with a makerspace and shows reliable performance with comparable accuracy at 98.3%.
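Reduced to its core, the self-labeling idea can be sketched in a few lines: a downstream effect event assigns a label to the sensor window that preceded it by the causal delay. AdaptIoT's microservices and causal knowledge bases are omitted, and the delay constant and function names are assumptions.

```python
# Core of interactive-causality self-labeling, heavily simplified: when
# an effect event arrives, label the unlabeled sensor window closest to
# the expected cause time and queue it for retraining.
from collections import deque

CAUSAL_DELAY = 5.0             # assumed cause-to-effect delay [s]
windows = deque(maxlen=1000)   # (timestamp, feature_vector), unlabeled
training_queue = []            # self-labeled examples for adaptation

def on_sensor_window(ts, features):
    windows.append((ts, features))

def on_effect_event(ts, label):
    target = ts - CAUSAL_DELAY
    if windows:
        cause = min(windows, key=lambda w: abs(w[0] - target))
        training_queue.append((cause[1], label))   # new labeled example
```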

19 pages, 1237 KiB  
Article
Cyberattack Detection Systems in Industrial Internet of Things (IIoT) Networks in Big Data Environments
by Abdullah Orman
Appl. Sci. 2025, 15(6), 3121; https://doi.org/10.3390/app15063121 - 13 Mar 2025
Cited by 2 | Viewed by 1848
Abstract
The rapid expansion of the Industrial Internet of Things (IIoT) has revolutionized industrial automation and introduced significant cybersecurity challenges, particularly for supervisory control and data acquisition (SCADA) systems. Traditional intrusion detection systems (IDSs) often struggle to effectively identify and mitigate complex cyberthreats, such as denial-of-service (DoS) and distributed denial-of-service (DDoS) attacks. This study proposes an advanced IDS framework integrating machine learning, deep learning, and hybrid models to enhance cybersecurity in IIoT environments. Using the WUSTL-IIoT-2021 dataset, multiple classification models—including decision tree, random forest, multilayer perceptron (MLP), convolutional neural networks (CNNs), and hybrid deep learning architectures—were systematically evaluated based on key performance metrics, including accuracy, precision, recall, and F1 score. This research introduces several key innovations. First, it presents a comparative analysis of machine learning, deep learning, and hybrid models within a unified experimental framework, offering a comprehensive evaluation of various approaches. Second, while existing studies frequently favor hybrid models, findings from this study reveal that the standalone MLP model outperforms other architectures, achieving the highest detection accuracy of 99.99%. This outcome highlights the critical role of dataset-specific feature distributions in determining model effectiveness and calls for a more nuanced approach when selecting detection models for IIoT cybersecurity applications. Additionally, the study explores a broad range of hyperparameter configurations, optimizing model effectiveness for IIoT-specific intrusion detection. These contributions provide valuable insights for developing more efficient and adaptable IDS solutions in IIoT networks.
(This article belongs to the Special Issue Trends and Prospects for Wireless Sensor Networks and IoT)
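A sketch of the standalone MLP pipeline that performed best in the study follows; synthetic data stands in for WUSTL-IIoT-2021, and the layer sizes are assumptions rather than the paper's tuned hyperparameters.

```python
# Standalone MLP intrusion detector, sketched with scikit-learn; the
# synthetic imbalanced data is a stand-in for WUSTL-IIoT-2021.
from sklearn.datasets import make_classification
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(128, 64),
                                  max_iter=300, random_state=0))
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```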

23 pages, 9395 KiB  
Article
MAS-LSTM: A Multi-Agent LSTM-Based Approach for Scalable Anomaly Detection in IIoT Networks
by Zhenkai Qin, Qining Luo, Xunyi Nong, Xiaolong Chen, Hongfeng Zhang and Cora Un In Wong
Processes 2025, 13(3), 753; https://doi.org/10.3390/pr13030753 - 5 Mar 2025
Viewed by 1545
Abstract
The increasing complexity of interconnected systems in the Internet of Things (IoT) demands advanced methodologies for real-time security and management. This study presents MAS-LSTM, an anomaly-detection framework that combines multi-agent systems (MASs) with long short-term memory (LSTM) networks. By training agents on IoT traffic datasets (NF-ToN-IoT, NF-BoT-IoT, and their V2 versions), MAS-LSTM offers scalable, decentralized anomaly detection. The LSTM networks capture temporal dependencies, enhancing anomaly detection in time-series data. This framework overcomes key limitations of existing methods, such as scalability in heterogeneous traffic and computational efficiency in resource-constrained IIoT environments. Additionally, it leverages graph signal processing for adaptive and modular detection across diverse IoT scenarios. Experimental results demonstrate its effectiveness, achieving F1 scores of 0.9861 and 0.8413 on NF-BoT-IoT and NF-ToN-IoT, respectively. For V2 versions, MAS-LSTM achieves F1 scores of 0.9965 and 0.9678. These results highlight its robustness in handling large-scale IIoT traffic. Despite challenges in real-world deployment, such as adversarial attacks and communication overhead, future research could focus on self-supervised learning and lightweight architectures for resource-constrained environments.
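The per-agent LSTM is a standard sequence detector (compare the CNN-LSTM sketch earlier in this listing); what distinguishes the framework is the multi-agent layer, sketched below with stub detectors and an assumed mean-score fusion rule, since the abstract does not state how agent outputs are combined.

```python
# Multi-agent layer only: each agent wraps an independent detector
# trained on one traffic dataset; a coordinator fuses their scores.
# The mean-vs-threshold fusion rule is an assumption.
from statistics import mean

class Agent:
    def __init__(self, name, detector):
        self.name = name
        self.detector = detector      # callable: flow features -> score

    def score(self, flow):
        return self.detector(flow)

def coordinator(agents, flow, threshold=0.5):
    scores = {a.name: a.score(flow) for a in agents}
    return mean(scores.values()) > threshold, scores

agents = [Agent("nf-ton-iot", lambda f: 0.2),    # stub detectors
          Agent("nf-bot-iot", lambda f: 0.9)]
is_anomaly, per_agent = coordinator(agents, flow={"bytes": 512})
```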

60 pages, 1482 KiB  
Systematic Review
Federated Learning for Cloud and Edge Security: A Systematic Review of Challenges and AI Opportunities
by Latifa Albshaier, Seetah Almarri and Abdullah Albuali
Electronics 2025, 14(5), 1019; https://doi.org/10.3390/electronics14051019 - 3 Mar 2025
Cited by 9 | Viewed by 6961
Abstract
The ongoing evolution of cloud computing requires sustained attention to security, privacy, and compliance issues. The purpose of this paper is to systematically review the current literature regarding the application of federated learning (FL) and artificial intelligence (AI) to improve cloud computing security while preserving privacy, delivering real-time threat detection, and meeting regulatory requirements. The current research follows a systematic literature review (SLR) approach, which examined 30 studies published between 2020 and 2024 and followed the PRISMA 2020 checklist. The analysis shows that FL provides significant privacy risk reduction by 25%, especially in healthcare and similar domains, and it improves threat detection by 40% in critical infrastructure areas. A total of 80% of reviewed implementations showed improved privacy, but challenges like communication overhead and resource limitations persist, with 50% of studies reporting latency issues. To overcome these obstacles, this study also explores some emerging solutions, which include model compression, hybrid federated architectures, and cryptographic enhancements. Additionally, this paper demonstrates the unexploited capability of FL for real-time decision-making in dynamic edge environments and highlights its potential across autonomous systems, the Industrial Internet of Things (IIoT), and cybersecurity frameworks. Finally, the paper presents a deployment strategy for FL models that enables scalable, secure, and privacy-preserving operations, supporting robust cloud security solutions in the AI era.
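Federated averaging (FedAvg) is the baseline aggregation step most FL systems build on: clients train locally and the server averages their weights, weighted by local sample counts, so raw data never leaves the device. A minimal sketch with toy shapes:

```python
# Server-side FedAvg: weighted average of client model weights.
import numpy as np

def fedavg(client_weights, client_sizes):
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Two clients, one weight matrix each (toy shapes)
w_a = [np.ones((3, 3))]
w_b = [np.zeros((3, 3))]
global_w = fedavg([w_a, w_b], client_sizes=[300, 100])  # 0.75 everywhere
```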

32 pages, 498 KiB  
Review
A Survey on the Applications of Cloud Computing in the Industrial Internet of Things
by Elias Dritsas and Maria Trigka
Big Data Cogn. Comput. 2025, 9(2), 44; https://doi.org/10.3390/bdcc9020044 - 17 Feb 2025
Cited by 3 | Viewed by 3784
Abstract
The convergence of cloud computing and the Industrial Internet of Things (IIoT) has significantly transformed industrial operations, enabling intelligent, scalable, and efficient systems. This survey provides a comprehensive analysis of the role cloud computing plays in IIoT ecosystems, focusing on its architectural frameworks, service models, and application domains. By leveraging centralized, edge, and hybrid cloud architectures, IIoT systems achieve enhanced real-time processing capabilities, streamlined data management, and optimized resource allocation. Moreover, this study delves into integrating artificial intelligence (AI) and machine learning (ML) in cloud platforms to facilitate predictive analytics, anomaly detection, and operational intelligence in IIoT environments. Security challenges, including secure device-to-cloud communication and privacy concerns, are addressed with innovative solutions like blockchain and AI-powered intrusion detection systems. Future trends, such as adopting 5G, serverless computing, and AI-driven adaptive services, are also discussed, offering a forward-looking perspective on this rapidly evolving domain. Finally, this survey contributes to a well-rounded understanding of cloud computing's multifaceted aspects and highlights its pivotal role in driving the next generation of industrial innovation and operational excellence.
(This article belongs to the Special Issue Application of Cloud Computing in Industrial Internet of Things)

27 pages, 3199 KiB  
Article
Hybrid CNN–BiLSTM–DNN Approach for Detecting Cybersecurity Threats in IoT Networks
by Bright Agbor Agbor, Bliss Utibe-Abasi Stephen, Philip Asuquo, Uduak Onofiok Luke and Victor Anaga
Computers 2025, 14(2), 58; https://doi.org/10.3390/computers14020058 - 10 Feb 2025
Cited by 3 | Viewed by 2378
Abstract
The Internet of Things (IoT) ecosystem is rapidly expanding, driven by continuous innovation but accompanied by increasingly sophisticated cybersecurity threats. Protecting IoT devices from these emerging vulnerabilities has become a critical priority. This study addresses the limitations of existing IoT threat detection methods, which often struggle with the dynamic nature of IoT environments and the growing complexity of cyberattacks. To overcome these challenges, a novel hybrid architecture combining Convolutional Neural Networks (CNN), Bidirectional Long Short-Term Memory (BiLSTM), and Deep Neural Networks (DNN) is proposed for accurate and efficient IoT threat detection. The model's performance is evaluated using the IoT-23 and Edge-IIoTset datasets, which encompass over ten distinct attack types. The proposed framework achieves a remarkable 99% accuracy on both datasets, outperforming existing state-of-the-art IoT cybersecurity solutions. Advanced optimization techniques, including model pruning and quantization, are applied to enhance deployment efficiency in resource-constrained IoT environments. The results highlight the model's robustness and its adaptability to diverse IoT scenarios, which address key limitations of prior approaches. This research provides a robust and efficient solution for IoT threat detection, establishing a foundation for advancing IoT security and addressing the evolving landscape of cyber threats while driving future innovations in the field.
(This article belongs to the Special Issue Multimedia Data and Network Security)
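A sketch of the hybrid CNN-BiLSTM-DNN stack follows, in Keras; filter counts, units, window shape, and the class count are assumptions rather than the paper's configuration.

```python
# Hybrid CNN -> BiLSTM -> DNN classifier sketch; all sizes are assumed.
import tensorflow as tf
from tensorflow.keras import layers

n_classes = 11   # assumed ("over ten distinct attack types")
model = tf.keras.Sequential([
    layers.Input(shape=(50, 30)),              # 50 steps, 30 features (assumed)
    layers.Conv1D(32, 3, activation="relu"),   # CNN: local patterns
    layers.MaxPooling1D(2),
    layers.Bidirectional(layers.LSTM(64)),     # BiLSTM: bidirectional context
    layers.Dense(64, activation="relu"),       # DNN classification head
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```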
