Search Results (390)

Search Parameters:
Keywords = distributed denial of services attack

27 pages, 3983 KB  
Article
Low-Latency DDoS Detection for IIoT and SCADA Networks Using Proximal Policy Optimisation and Deep Reinforcement Learning
by Mikiyas Alemayehu, Mohamed Chahine Ghanem, Hamza Kheddar, Dipo Dunsin, Chaker Abdelaziz Kerrache and Geetanjali Rathee
Information 2026, 17(5), 412; https://doi.org/10.3390/info17050412 - 26 Apr 2026
Viewed by 102
Abstract
Industrial Internet of Things (IIoT) and SCADA-connected networks are increasingly vulnerable to Distributed Denial of Service (DDoS) attacks, which can disrupt time-sensitive industrial processes and compromise operational continuity. Effective mitigation requires accurate and low-latency attack detection at the network edge, where industrial gateways operate under strict constraints in computation, memory, and energy. This study investigates Deep Reinforcement Learning (DRL) for real-time binary DDoS detection and proposes a detector based on Proximal Policy Optimisation (PPO) for deployment in resource-constrained IIoT environments. Four DRL agents, namely Deep Q-Network (DQN), Double DQN, Dueling DQN, and PPO, are trained and evaluated within a unified experimental pipeline incorporating automatic label mapping, numerical feature selection, robust scaling, and class balancing. Experiments are conducted on three representative benchmark datasets: CIC-DDoS2019, Edge-IIoTset, and CICIoT23. Performance is assessed using accuracy, precision, recall, F1-score, false positive rate, false negative rate, and CPU inference latency. The reward function is asymmetric: +1 for correct classification, −1 for false positive, and −2 for false negative, penalising missed attacks more heavily for IIoT safety. The results show that PPO provides a competitive accuracy–latency tradeoff across all three datasets, achieving the highest mean accuracy of 97.65% and ranking first on CIC-DDoS2019 with a score of 95.92%, while remaining competitive on Edge-IIoTset (99.11%) and CICIoT23 (97.92%). PPO also converges faster than the value-based baselines. Inference latency is below 0.8 ms per sample on a standard CPU (Intel i7-11800H), confirming real-time feasibility. To support practical deployment, the trained PPO policies are exported to ONNX format (≈9 KB per model), enabling lightweight and PyTorch-independent inference on industrial edge gateways. Full article
(This article belongs to the Special Issue Reinforcement Learning for Cyber Security: Methods and Applications)
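The asymmetric reward described in this abstract can be written directly. The label convention used below (1 = attack, 0 = benign) is an assumption for illustration; the abstract does not state it.

```python
def ddos_reward(prediction: int, label: int) -> float:
    """Asymmetric reward from the abstract: +1 for a correct classification,
    -1 for a false positive, -2 for a false negative, so missed attacks are
    penalised more heavily for IIoT safety. The label convention
    (1 = attack, 0 = benign) is an assumption."""
    if prediction == label:
        return 1.0
    if label == 1:      # attack classified as benign: false negative
        return -2.0
    return -1.0         # benign classified as attack: false positive
```

The heavier false-negative penalty biases the learned policy toward catching attacks even at the cost of some extra false alarms.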
24 pages, 1869 KB  
Article
Neuro-Fuzzy Approach for Detecting DDoS Attacks in IoT Environments Applied to Biosignal Monitoring
by Angela M. Parra and Marcia M. Bayas
Technologies 2026, 14(5), 253; https://doi.org/10.3390/technologies14050253 - 24 Apr 2026
Viewed by 238
Abstract
Distributed denial-of-service (DDoS) attacks pose a critical threat to the availability of the Internet of Medical Things (IoMT). This paper proposes an intrusion detection system (IDS) based on a hybrid neuro-fuzzy-inspired approach to identify DDoS attacks in IoMT environments. The architecture combines an ensemble of decision trees, a sigmoidal smoothing mechanism, and a multilayer neural meta-classifier, enabling the modeling of nonlinear relationships between legitimate and malicious traffic without requiring explicit fuzzy rules or a formal fuzzy inference mechanism. The evaluation was conducted using the public DoS/DDoS-MQTT-IoT dataset, which was extended by incorporating legitimate traffic generated by electrocardiography (ECG) monitoring devices to approximate real operational IoMT conditions. The model was validated using stratified cross-validation and bootstrap procedures. In the extended IoMT scenario including ECG traffic, the proposed approach achieved an area under the ROC curve (AUC) of 0.904 and an F1 score of 0.823. Finally, the IDS was integrated into an intrusion detection and prevention system (IDPS) capable of detecting anomalous traffic patterns within three seconds and automatically blocking malicious IP addresses after repeated detections. Full article
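The sigmoidal smoothing step between the decision-tree ensemble and the neural meta-classifier can be sketched as squashing the raw attack-vote fraction through a logistic curve. The gain `k` and midpoint `mid` below are illustrative assumptions, not values from the paper.

```python
import math

def smooth_vote(attack_votes: int, n_trees: int,
                k: float = 10.0, mid: float = 0.5) -> float:
    """Map an ensemble's raw attack-vote fraction to a smooth score in (0, 1)
    via a logistic curve, suitable as input to a meta-classifier.
    Gain k and midpoint mid are illustrative assumptions."""
    frac = attack_votes / n_trees
    return 1.0 / (1.0 + math.exp(-k * (frac - mid)))
```

Smoothing the vote avoids the hard step function of a majority vote, giving the downstream meta-classifier a graded, differentiable-style signal.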

28 pages, 1805 KB  
Article
Intelligent Threat Defense Mechanisms for 5G APIs
by Asif Yasin, Seyed Ebrahim Hosseini, Muhammad Nadeem and Shahbaz Pervez
Future Internet 2026, 18(5), 223; https://doi.org/10.3390/fi18050223 - 22 Apr 2026
Viewed by 234
Abstract
As 5G Standalone Core networks grow, Application Programming Interfaces (APIs) have become a key part of how network systems talk to each other. They allow different functions to share data and complete tasks quickly. However, this also makes them targets for attacks. 5G Standalone Core networks rely on Service-Based Architecture (SBA), where network functions communicate through exposed APIs. These APIs are attractive targets for cyberattacks because they are externally accessible, handle sensitive control-plane operations, and exchange structured data using Hypertext Transfer Protocol version 2 (HTTP/2) and JavaScript Object Notation (JSON) protocols. Most older security tools work using fixed rules, which cannot always detect new or changing threats. This study aimed to fix that gap by using Artificial Intelligence to make API security smarter. Two AI models were tested: Long Short-Term Memory (LSTM), which learns from past traffic, and Reinforcement Learning (RL), which learns by adapting to network behavior. Both were used to assess API traffic and assign a real-time risk score. Synthetic traffic was created using Python, including both normal API calls and different types of attacks such as Distributed Denial-of-Service (DDoS), brute force, and Structured Query Language (SQL) injection. The results show that both the LSTM and RL models outperformed traditional rule-based systems. They found more threats, gave fewer false alerts, and responded faster. RL was especially strong at handling unknown or changing attacks. Experimental results show that the proposed LSTM and RL models achieved approximately 95% detection accuracy, significantly outperforming the static rule-based baseline model, which achieved 58% accuracy. The results demonstrate the effectiveness of adaptive AI-based security mechanisms for detecting evolving API threats. This research shows that AI can help protect 5G APIs in a smarter and more flexible way.
It can support telecom networks by making threat detection faster, more accurate, and ready for future challenges. Full article
(This article belongs to the Section Cybersecurity)
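The real-time risk score described in this abstract can be illustrated with a simple running score that decays between requests and is bumped by per-request anomaly signals. The decay constant and the [0, 1] anomaly-signal convention are invented for illustration; the paper's actual LSTM/RL scoring is not specified in the abstract.

```python
class ApiRiskScore:
    """Toy per-client risk score for API traffic: exponentially weighted
    average of anomaly signals, so the score rises under sustained
    suspicious activity and decays when traffic looks normal.
    The decay constant is an illustrative assumption."""

    def __init__(self, decay: float = 0.9):
        self.decay = decay
        self.score = 0.0

    def update(self, anomaly_signal: float) -> float:
        # anomaly_signal in [0, 1], e.g. produced by a model over recent traffic
        self.score = self.decay * self.score + (1.0 - self.decay) * anomaly_signal
        return self.score
```

A gateway could compare this score against a threshold to rate-limit or block a client, which is one simple way to turn per-request model outputs into a blocking decision.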

21 pages, 498 KB  
Article
An Evaluation of Supervised Machine Learning Pipelines for the Identification of Distributed Denial-of-Service Attacks Using Conventional and Computational Performance Metrics
by Adrian Kwiecien and Waddah Saeed
Math. Comput. Appl. 2026, 31(2), 62; https://doi.org/10.3390/mca31020062 - 13 Apr 2026
Viewed by 275
Abstract
Distributed denial-of-service (DDoS) attacks, a type of Denial-of-Service (DoS) attack in which the targeted server, service or network is overloaded with malicious traffic originating from various sources with the aim of making such targets inaccessible for legitimate users, continue to pose a pertinent threat to the availability and integrity of organisational digital assets. While many studies have shown that machine learning models can provide high predictive accuracy in detecting such attacks, they often fail to evaluate the practicality of deploying such models to production. This study aims to address this gap by evaluating a large number of pipelines based on five popular supervised classifiers for detecting DDoS attacks using the CICDDoS2019 dataset. The study employs a comprehensive methodology that combines manual feature removal with automated encoding, scaling and feature selection integrated within pipelines. A total of 210 pipelines formed from five classifiers, three feature selectors, two hyperparameter tuners and seven train–test splits were initially evaluated. Pipeline performance was assessed using both conventional and computational performance metrics. To identify the champion pipeline, a two-step approach was employed: composite scoring for shortlisting and statistical testing using Friedman and post hoc Nemenyi tests. The champion pipeline was shown to be Decision Tree coupled with Recursive Feature Elimination (with 20 features selected) and Grid Search hyperparameter tuning with a 90-10 train–test split. It achieved the best balance of predictive capability and computational overhead, with an MCC of 0.993 ± 0.024, training time of 0.194 ± 0.001 s, inference time of 0.000998 ± 0.00008 s, CPU time of 0.194 ± 0.008 s and average memory usage of 15,167 ± 322 kilobytes across training and inference.
The findings highlight the importance of a holistic and more nuanced approach when selecting a champion pipeline that is not only effective but also feasible for deployment in resource-constrained environments. Full article
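The composite-scoring shortlisting step can be sketched as a weighted sum over min–max-normalised metrics, with predictive quality rewarded and time/memory costs penalised. The weights and the metric triple used below are illustrative assumptions; the paper's actual composite formula is not given in the abstract.

```python
def composite_scores(pipelines, weights=(0.5, 0.25, 0.25)):
    """Rank pipelines by a weighted composite of min-max-normalised metrics.
    `pipelines` maps name -> (mcc, train_time_s, memory_kb); MCC counts as
    higher-is-better, time and memory as lower-is-better. Weights are
    illustrative assumptions."""
    def norm(vals, invert=False):
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0          # guard against identical values
        return [(hi - v) / span if invert else (v - lo) / span for v in vals]

    names = list(pipelines)
    mcc, t, mem = zip(*(pipelines[n] for n in names))
    w_mcc, w_t, w_mem = weights
    scores = [w_mcc * a + w_t * b + w_mem * c
              for a, b, c in zip(norm(mcc), norm(t, True), norm(mem, True))]
    return dict(zip(names, scores))
```

A shortlist of top composite scores would then go to the Friedman and post hoc Nemenyi tests for the statistically grounded final choice.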

24 pages, 806 KB  
Article
EGGA: An Error-Guided Generative Augmentation and Optimized ML-Based IDS for EV Charging Network Security
by Li Yang and G. Kirubavathi
Future Internet 2026, 18(4), 202; https://doi.org/10.3390/fi18040202 - 13 Apr 2026
Viewed by 289
Abstract
Electric Vehicle Charging Systems (EVCSs) are increasingly connected with the Internet of Things (IoT) and smart grid infrastructure, yet they face growing cyber risks due to expanded attack interfaces. These systems are vulnerable to various attacks that potentially impact both charging operations and user privacy. Intrusion Detection Systems (IDSs) are essential for identifying suspicious activities and mitigating risks to protect EVCS networks, but conventional ML-based IDSs are often unable to achieve optimal performance due to imbalanced datasets, complex traffic distributions, and human design limitations. In practice, EVCS traffic is typically multi-class, imbalanced, and safety-critical, where both missed attacks and false alarms can lead to denial of charging, service interruption, unnecessary incident escalation, financial loss, and reduced user trust. Automated ML (AutoML) and Generative Artificial Intelligence (GAI) have emerged as promising solutions in cybersecurity. Existing GAI and augmentation methods are mostly class-frequency-driven, but this does not necessarily improve the error-prone regions where IDSs actually fail. In this paper, we propose a GAI and an AutoML-based IDS that incorporates a Conditional Generative Adversarial Network (cGAN) with the optimized XGBoost model to improve the effectiveness of intrusion detection in EVCS networks and IoT systems. The proposed framework involves two techniques: (1) a novel cGAN-based error-guided generative augmentation (EGGA) method that extracts misclassified samples and generates a more robust training set for IDS development, and (2) an optimized IDS model that automatically constructs an optimized XGBoost model based on Bayesian Optimization with Tree-structured Parzen Estimator (BO-TPE). 
The main algorithmic novelty lies in EGGA, which uses model errors to guide generative augmentation toward difficult decision regions, while the overall pipeline represents a practical system-level integration of EGGA, XGBoost, and BO-TPE. To the best of our knowledge, this is the first work that combines GAI and AutoML to specifically improve detection on hard samples, enabling more autonomous and reliable identification of diverse cyber attacks in EV charging networks and IoT systems. Experiments are conducted on two benchmark EVCS and cybersecurity datasets, CICEVSE2024 and CICIDS2017, demonstrating consistent and statistically meaningful improvements over state-of-the-art IDS models. This research highlights the importance of combining automation, generative balancing, and optimized learning to strengthen cybersecurity solutions for EV charging networks and IoT systems. Full article
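The error-guided idea in EGGA, using misclassified samples to steer augmentation toward hard decision regions, can be sketched without the cGAN. Below, hard samples are simply jittered with Gaussian noise; the real method trains a conditional GAN on them, and the noise scale and copy count here are illustrative assumptions.

```python
import random

def error_guided_augment(X, y, predictions, n_copies=3, noise=0.05, seed=0):
    """Return extra (x, label) pairs synthesised around misclassified samples.
    A noise-jitter stand-in for EGGA's cGAN generator: each sample the model
    got wrong is perturbed n_copies times to densify hard regions.
    noise and n_copies are illustrative assumptions."""
    rng = random.Random(seed)
    extra_X, extra_y = [], []
    for x, label, pred in zip(X, y, predictions):
        if pred != label:                      # hard sample: model erred here
            for _ in range(n_copies):
                extra_X.append([v + rng.gauss(0.0, noise) for v in x])
                extra_y.append(label)
    return extra_X, extra_y
```

The augmented set would then be concatenated with the original training data before refitting the (BO-TPE-tuned) XGBoost model.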

27 pages, 3109 KB  
Article
Early Detection of Virtual Machine Failures in Cloud Computing Using Quantum-Enhanced Support Vector Machine
by Bhargavi Krishnamurthy, Saikat Das and Sajjan G. Shiva
Mathematics 2026, 14(7), 1229; https://doi.org/10.3390/math14071229 - 7 Apr 2026
Viewed by 299
Abstract
Cloud computing is one of the essential computing platforms for modern enterprises: in 2025, 84 percent of large businesses used cloud computing services to enable remote working and greater operational flexibility at reduced cost. Cloud environments are dynamic and multitenant, often demanding high computational resources for real-time processing. However, cloud system behavior is subject to various kinds of anomalies in which data patterns deviate from normal traffic, including performance, security, resource, and network anomalies. These anomalies disrupt the normal operation of cloud systems by increasing latency, reducing throughput, violating service level agreements (SLAs), and causing virtual machine failures. Among these, virtual machine failure, in which the normal operation of a virtual machine is interrupted and services degrade, is particularly consequential. Virtual machine failures arise from resource exhaustion, malware, packet loss, Distributed Denial of Service attacks, and similar causes. Hence, there is a need to detect impending virtual machine failures and prevent them through proactive measures. Traditional machine learning techniques often struggle with high-dimensional data and nonlinear correlations, resulting in poor real-time adaptation, whereas quantum machine learning is a promising approach that deals effectively with combinatorially complex, high-dimensional data. In this paper, a novel quantum-enhanced support vector machine (QSVM) is designed as an optimized binary classifier that combines the principles of quantum computing and the support vector machine. It encodes classical data into quantum states, and feature mapping transforms the data into a high-dimensional Hilbert space.
Quantum kernel evaluation is performed to evaluate similarities. Through effective optimization, optimal hyperplanes are designed to detect the anomalous behavior of virtual machines. This results in the exponential speed-up of operation and prevents the local minima through entanglement and superposition operation. The performance of the proposed QSVM is analyzed using the QuCloudSim 1.0 simulator and further validated using expected value analysis methodology. Full article
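The quantum-kernel step can be illustrated classically: a toy angle-encoding feature map sends each feature to a single-qubit state, and the kernel is the squared overlap of the resulting product states. This is a didactic stand-in under assumed encoding, not the paper's actual circuit or feature map.

```python
import math

def angle_state(x: float):
    """Encode one feature as a qubit amplitude pair (cos(x/2), sin(x/2)).
    This specific encoding is an illustrative assumption."""
    return (math.cos(x / 2.0), math.sin(x / 2.0))

def quantum_kernel(xs, ys) -> float:
    """Kernel |<phi(x)|phi(y)>|^2 for product states of angle-encoded
    features; per qubit the overlap is cos((x - y) / 2), so the kernel
    is the product of cos^2((x_i - y_i) / 2)."""
    overlap = 1.0
    for x, y in zip(xs, ys):
        ax, ay = angle_state(x), angle_state(y)
        overlap *= ax[0] * ay[0] + ax[1] * ay[1]   # real inner product
    return overlap ** 2
```

A classical SVM trained on the resulting kernel matrix would then play the role of the optimal-hyperplane step described in the abstract.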

23 pages, 630 KB  
Article
Depth-First Search-Based Malicious Node Detection with Honeypot Technology in Wireless Sensor Networks
by Sercan Demirci, Doğan Yıldız, Durmuş Özkan Şahin and Asmaa Alaadin
Mathematics 2026, 14(6), 1050; https://doi.org/10.3390/math14061050 - 20 Mar 2026
Viewed by 366
Abstract
Wireless sensor networks (WSNs) are highly susceptible to Denial-of-Service (DoS) attacks due to their resource-constrained and distributed nature. In this study, we propose a novel trust-based malicious node detection mechanism that leverages a Depth-First Search (DFS) strategy to trace and identify attack sources within clustered WSN architectures efficiently. The proposed approach dynamically evaluates trust scores between nodes to detect anomalous behaviors and employs a honeypot-based redirection system to isolate compromised nodes from the main communication flow. This combination enhances detection accuracy while minimizing false positives and energy overhead. The method is implemented and evaluated using a custom simulation environment. Comparative experimental results against state-of-the-art techniques such as the Evolved Trust Updating Mechanism (EVO) and Multi-agent Trust-based Intrusion Detection System (MULTI) demonstrate that our Trust-Based Honeypot (TBHP) achieves superior performance in terms of detection rate, false-alarm rate, and network lifetime extension. Full article
(This article belongs to the Topic Recent Advances in Security, Privacy, and Trust)
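The DFS-based trace described in this abstract can be sketched as a depth-first walk over the cluster topology that flags nodes whose trust score falls below a threshold, producing candidates for honeypot redirection. The threshold value is an illustrative assumption.

```python
def dfs_detect(topology, trust, root, threshold=0.5):
    """Depth-first walk from the cluster head `root` over `topology`
    (node -> list of neighbours); returns nodes whose trust score in
    `trust` falls below `threshold`, i.e. candidates for isolation via
    honeypot redirection. The threshold is an illustrative assumption."""
    flagged, visited, stack = [], set(), [root]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        if trust.get(node, 1.0) < threshold:
            flagged.append(node)
        stack.extend(n for n in topology.get(node, []) if n not in visited)
    return flagged
```

In the paper's scheme the trust scores themselves are updated dynamically from observed node behavior; here they are taken as given inputs.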

18 pages, 1430 KB  
Article
Multi-Layer Traffic Analysis Framework for DDoS Attacks in Software-Defined IoT Networks
by Keerthana Balaji and Mamatha Balachandra
Future Internet 2026, 18(3), 164; https://doi.org/10.3390/fi18030164 - 19 Mar 2026
Viewed by 291
Abstract
Both the data plane and the control plane are targets for Distributed Denial of Service (DDoS) attacks in the Software-Defined Internet of Things (SDIoT). Available studies rely on observations from a single network layer, which limits cross-layer attack analysis. This paper presents a synchronized, phase-aware, multi-layer traffic collection framework that mimics SDIoT environments under diverse DDoS attack scenarios. The collected data comprise metrics captured at the host, switch, and controller layers during normal, attack, and post-attack phases with strict temporal alignment. To capture diverse DDoS attack behaviors in SDIoT environments, representative data plane attacks, including volumetric flooding and switch-level flow table saturation, were used, and a control plane attack targeting the SDN controller was also implemented. The evaluation was conducted on a Mininet-based SDIoT testbed with a POX controller, with each scenario executed across five independent runs with statistical validation. The proposed framework enables reproducible and time-aligned multi-layer analysis through standardized orchestration and automated logging. Results indicate that SDIoT DDoS behavior manifests differently across traffic-, state-, and resource-level metrics, and that accurate characterization benefits from temporally aligned multi-layer monitoring rather than packet rate analysis alone. Full article
(This article belongs to the Special Issue Cybersecurity, Privacy, and Trust in Intelligent Networked Systems)

15 pages, 1117 KB  
Article
Application of Impulsive SIRQ Models for the Development of Forecasting and Cyberattack Mitigation Scenarios
by Valentyn Sobchuk, Vitalii Savchenko, Bohdan Stepanchenko and Halyna Haidur
Axioms 2026, 15(3), 229; https://doi.org/10.3390/axioms15030229 - 19 Mar 2026
Viewed by 309
Abstract
This paper proposes an impulsive SIRQ model for the analysis of computer network resilience against malware propagation and distributed denial-of-service (DDoS) attacks. The model extends classical epidemic frameworks by combining the continuous-time dynamics of malicious object spreading with discrete control actions corresponding to mass updates, node isolation, and access control policies. A qualitative analysis of the resulting system of impulsive differential equations is performed. The basic reproduction number R0 is identified as a threshold parameter characterizing the intensity of attack propagation, and sufficient conditions for the global asymptotic stability of the infection-free state are established. It is shown that, under periodic impulsive control, the infection-free state can be stabilized with respect to the target population coordinates even when R0>1. An exponential decay estimate for the total active threat is derived, guaranteeing the asymptotic extinction of the infected and quarantined node populations. The proposed approach provides quantitative criteria for the effectiveness of impulsive cyber defense strategies and offers a theoretical foundation for the design of adaptive multi-layer protection systems for critical information infrastructures. Practical interpretation of the results illustrates the dependence of the critical impulsive control period on the model parameters and demonstrates the applicability of the approach to cybersecurity strategy design. Full article
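The continuous-plus-impulsive dynamics can be sketched with explicit Euler steps for a simple SIRQ system, where every T time units an impulsive control action (mass patching/isolation) moves a fraction of infected nodes into quarantine. All rate constants and initial conditions below are illustrative assumptions, not values from the paper.

```python
def simulate_sirq(beta=0.4, gamma=0.1, delta=0.2, q_frac=0.5,
                  period=5.0, dt=0.01, t_end=60.0):
    """Euler-integrate S' = -beta*S*I, I' = beta*S*I - gamma*I,
    Q' = -delta*Q, R' = gamma*I + delta*Q; every `period` time units an
    impulse quarantines fraction q_frac of the infected nodes. Returns
    the final infected fraction. All constants are illustrative."""
    S, I, Q, R = 0.95, 0.05, 0.0, 0.0
    next_impulse = period
    for k in range(1, int(t_end / dt) + 1):
        t = k * dt
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        dQ = -delta * Q
        dR = gamma * I + delta * Q
        S += dS * dt; I += dI * dt; Q += dQ * dt; R += dR * dt
        if t >= next_impulse:            # discrete control action
            moved = q_frac * I
            I -= moved; Q += moved
            next_impulse += period
    return I
```

With beta/gamma well above 1 the uncontrolled system sustains an outbreak, while a sufficiently strong and frequent impulse drives the infected population toward extinction, matching the threshold behavior described in the abstract.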

28 pages, 1099 KB  
Article
DELP-Net: A Differentiable Entropy Layer Pyramid Network for End-to-End Low-Rate DoS Detection
by Jinyi Wang, Congyuan Xu and Jun Yang
Entropy 2026, 28(3), 328; https://doi.org/10.3390/e28030328 - 15 Mar 2026
Viewed by 286
Abstract
Low-rate Denial-of-Service (LDoS) attacks exploit periodic traffic pulses to trigger congestion while maintaining a low average rate, making them highly stealthy and difficult to distinguish from legitimate bursty traffic using threshold-based or simple statistical detectors. To address this challenge, this paper proposes DELP-Net, an end-to-end Differentiable Entropy Layer Pyramid Network for window-level online LDoS detection directly from raw traffic. DELP-Net combines a multi-scale one-dimensional convolutional pyramid with a differentiable Rényi-entropy-driven attention mechanism to capture distributional regularity and weak repetitive patterns characteristic of LDoS traffic. In addition, an entropy-conditioned temporal convolutional network is employed to model cross-window periodic dependencies in a lightweight manner, together with an entropy-regularized hybrid loss to enhance robustness under complex background traffic. Experiments on the low-rate DoS dataset show that DELP-Net achieves an average F1 score of 0.9877 across six LDoS attack types, with a detection rate of 98.69% and a false-positive rate of 1.15%, demonstrating its effectiveness and suitability for practical online intrusion detection deployments. Full article
(This article belongs to the Section Multidisciplinary Applications)
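The Rényi entropy driving DELP-Net's attention can be computed for the empirical distribution of a traffic window; for α → 1 it reduces to Shannon entropy. This is a small illustrative implementation of the quantity itself, not the network's differentiable layer.

```python
import math
from collections import Counter

def renyi_entropy(samples, alpha=2.0):
    """Rényi entropy H_a = log(sum_i p_i^a) / (1 - a) of the empirical
    distribution of `samples` (e.g. destination ports within one traffic
    window); falls back to Shannon entropy in the alpha -> 1 limit."""
    n = len(samples)
    probs = [c / n for c in Counter(samples).values()]
    if abs(alpha - 1.0) < 1e-9:              # Shannon limit
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)
```

Highly repetitive LDoS pulses concentrate the distribution and lower the entropy relative to legitimate bursty traffic, which is the distributional regularity the attention mechanism is designed to exploit.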

24 pages, 1662 KB  
Article
Optimal Synergistic Attack Strategy Targeting Energy Storage and Load Sides in Integrated Energy Systems
by Shan Cheng, Siyu Wan and Weiwei Liu
Energies 2026, 19(5), 1300; https://doi.org/10.3390/en19051300 - 5 Mar 2026
Viewed by 285
Abstract
With the large-scale integration of distributed energy resources, modern energy systems are becoming increasingly dependent on communication networks for monitoring and control. This growing reliance exposes integrated energy systems (IESs) to potential cyber threats, as attackers may exploit vulnerabilities in communication protocols to disrupt system operation. However, most existing studies primarily investigate the stable operation of electro–thermal coupled systems from a defensive standpoint, while paying limited attention to the potential economic damage that could be induced from an attacker’s perspective. Motivated by this gap, this paper develops an optimal coordinated attack strategy targeting both energy storage units and load-side resources from the attacker’s viewpoint. First, an economic dispatch model for an electricity–heat–gas integrated energy system is established, and a fully distributed solution algorithm is proposed to obtain the optimal economic operating cost. Subsequently, by compromising energy storage and load units with relatively low security levels, a three-stage coordinated cyber-attack framework is designed for the IES. In the first two stages, covert data integrity attacks (DIAs) are launched to inject falsified power information into the system. In the third stage, a denial-of-service (DoS) attack is introduced to operate in synergy with the DIAs, forcing the system to converge to a feasible yet economically suboptimal operating point. The optimal initiation timing of the DoS attack is derived through theoretical analysis. Simulation results demonstrate that the proposed strategy can induce an economic loss of approximately 21.7% while maintaining system feasibility. 
By revealing these latent vulnerabilities from an attacker-oriented perspective, this study provides a theoretical basis for the development of proactive defense mechanisms, thereby enhancing the long-term economic and operational security of future integrated energy systems. Full article

20 pages, 2485 KB  
Article
Gated Residual Chebyshev KAN for Lightweight IoT DDoS Detection
by Fray L. Becerra-Suarez, Edwin Valencia-Castillo, Ana G. Borrero-Ramírez and Manuel G. Forero
J. Cybersecur. Priv. 2026, 6(2), 47; https://doi.org/10.3390/jcp6020047 - 4 Mar 2026
Viewed by 654
Abstract
Distributed denial-of-service (DDoS) attacks have become a critical threat to Internet of Things (IoT) infrastructures due to their high traffic dynamics, strong class imbalance, and strict resource constraints at the edge. This paper proposes ChebyKANRes, a lightweight intrusion detection model that combines Chebyshev polynomial expansions to parameterize learnable univariate transformations, a gate mechanism to modulate feature flow, and residual connections to stabilize optimization in deeper KAN-style stacks. Experiments were conducted on the CICIoT2023 dataset focusing on benign traffic and 12 DDoS subtypes, using a reproducible pipeline with stratified splitting, cross-validation (k = 5), and early stopping. The proposed model consistently improves multi-class performance (Accuracy: 0.9983) over an optimized MLP baseline (Accuracy: 0.9641), while maintaining a compact size suitable for edge deployment (≈123 k parameters; ~0.47 MB). Within CICIoT2023 and the evaluated split/training protocol, the proposed ChebyKANRes configuration shows improved imbalance-robust multiclass detection while maintaining a compact model size and comparable batch inference time. Full article
(This article belongs to the Section Security Engineering & Applications)
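The Chebyshev expansion underlying ChebyKANRes uses the standard recurrence T0(x) = 1, T1(x) = x, T_{k+1}(x) = 2x·T_k(x) − T_{k−1}(x); a KAN-style edge then learns one coefficient per basis term. The coefficients below are placeholders standing in for trained parameters.

```python
def chebyshev_features(x: float, degree: int):
    """Return [T_0(x), ..., T_degree(x)] via the recurrence
    T_{k+1} = 2x*T_k - T_{k-1}; inputs are assumed scaled to [-1, 1]."""
    feats = [1.0, x][: degree + 1]
    for _ in range(degree - 1):
        feats.append(2.0 * x * feats[-1] - feats[-2])
    return feats

def cheby_edge(x: float, coeffs):
    """Learnable univariate transform of one KAN-style edge: a weighted
    sum of Chebyshev basis values (coeffs would be trained parameters;
    here they are placeholders)."""
    return sum(c * t for c, t in
               zip(coeffs, chebyshev_features(x, len(coeffs) - 1)))
```

Stacking such edges with the abstract's gating and residual connections gives the gated residual Chebyshev KAN architecture, with the polynomial degree controlling each edge's expressiveness.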

38 pages, 10593 KB  
Article
Real-World Experimental Evaluation of DDoS and DRDoS Attacks on Industrial IoT Communication in an Automated Cyber-Physical Production Line
by Tibor Horak, Roman Ruzarovsky, Roman Zelník, Martin Csekei and Ján Šido
Machines 2026, 14(3), 258; https://doi.org/10.3390/machines14030258 - 25 Feb 2026
Viewed by 1002
Abstract
Automated production lines are increasingly being expanded with Industrial Internet of Things (IIoT) devices, creating complex Cyber-Physical Systems (CPSs) that connect physical production with control and information infrastructure. However, the convergence of Information Technology (IT) and Operational Technology (OT) layers creates new entry points for attacks targeting communication availability. Most existing studies analyze Distributed Denial of Service (DDoS) attacks primarily in simulation or testbed environments, with limited experimental verification of their impact on real-world production systems. This article presents an experimental evaluation of the impact of DDoS and Distributed Reflection Denial of Service (DRDoS) attacks carried out directly on a physical automated production line with integrated IIoT infrastructure during real operation. Three attack scenarios (TCP SYN flood, TCP ACK flood, and ICMP reflected attack) were implemented, targeting Programmable Logic Controllers (PLCs), Radio-Frequency Identification (RFID) subsystems, and selected IIoT devices. The results showed rapid degradation of deterministic PROFINET communication, disruption of the link between the OT and IT layers, loss of digital product representation, and physical interruption of the production process. Based on the findings, a minimally invasive security solution based on perimeter protection was designed and experimentally verified. The results emphasize the need to design IIoT-based manufacturing systems with an emphasis on network segmentation and architectural separation of the IT and OT layers. Full article

31 pages, 2986 KB  
Systematic Review
A Systematic Review of Machine-Learning-Based Detection of DDoS Attacks in Software-Defined Networks
by Surendren Ganeshan and R Kanesaraj Ramasamy
Future Internet 2026, 18(2), 109; https://doi.org/10.3390/fi18020109 - 19 Feb 2026
Viewed by 1035
Abstract
Software-Defined Networking (SDN) has emerged as a fundamental architecture for future Internet systems by enabling centralized control, programmability, and fine-grained traffic management. However, the logical centralization of the SDN control plane also introduces critical vulnerabilities, particularly to Distributed Denial-of-Service (DDoS) attacks that can severely disrupt network availability and performance. To address these challenges, machine-learning (ML) techniques have been increasingly adopted to enable intelligent, adaptive, and data-driven DDoS detection mechanisms within SDN environments. This study presents a PRISMA-guided systematic literature review of recent ML-based approaches for DDoS detection in SDN-based networks. A comprehensive search of IEEE Xplore, ACM Digital Library, ScienceDirect, and Google Scholar identified 38 primary studies published between 2021 and 2025. The selected studies were systematically analyzed to examine learning paradigms, experimental environments, evaluation metrics, datasets, and emerging architectural trends. The synthesis reveals that while single machine-learning classifiers remain dominant in the literature, hybrid and ensemble-based approaches are increasingly adopted to improve detection robustness under dynamic and high-volume traffic conditions. Experimental evaluations are predominantly conducted using SDN emulation platforms such as Mininet integrated with controllers, including Ryu and OpenDaylight, with performance commonly measured using accuracy, precision, recall, and F1 score, alongside emerging system-level metrics such as detection latency and controller resource utilization. Public datasets, including CICIDS2017, CICDDoS2019, and InSDN, are widely used, although a significant portion of studies rely on custom SDN-generated datasets to capture control-plane-specific behaviors. 
Despite notable advances in detection accuracy, several challenges persist, including limited generalization to low-rate and unknown attacks, dependency on synthetic traffic, and insufficient validation under real-time operational conditions. Based on the synthesized findings, this review highlights key research directions toward intelligent, scalable, and resilient DDoS defense mechanisms for future Internet architectures, emphasizing adaptive learning, lightweight deployment, and integration with programmable networking infrastructures. Full article
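The four headline metrics named in the review (accuracy, precision, recall, F1 score) all derive from the binary confusion matrix. As a quick reference, with the counts invented purely for illustration:

```python
# Standard binary-classification metrics as used to evaluate DDoS
# detectors; tp/fp/fn/tn counts below are illustrative only.
def detection_metrics(tp, fp, fn, tn):
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # flagged traffic that is truly attack
    recall = tp / (tp + fn)             # attacks caught (detection rate)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = detection_metrics(tp=950, fp=50, fn=30, tn=970)
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```

In a DDoS setting the recall/false-negative side is usually weighted most heavily, since a missed attack is costlier than a false alarm, which is why the review also tracks detection latency as a system-level metric.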

33 pages, 630 KB  
Article
PID Control for Uncertain Systems with Integral Measurements and DoS Attacks Using a Binary Encoding Scheme
by Nan Hou, Yanshuo Wu, Hongyu Gao, Zhongrui Hu and Xianye Bu
Entropy 2026, 28(2), 225; https://doi.org/10.3390/e28020225 - 15 Feb 2026
Viewed by 390
Abstract
In this paper, an observer-based proportional-integral-derivative (PID) controller is designed for a class of uncertain nonlinear systems with integral measurements, denial-of-service (DoS) attacks and bounded stochastic noises under a binary encoding scheme (BES). Parameter uncertainty is modeled by a norm-bounded multiplicative term. Integral measurements are considered to reflect the delayed signal collection of the sensor. For communication, the BES is applied to the signal transmission from the sensor to the observer and from the controller to the actuator. Random bit flipping, which may be caused by channel noises, is modeled as a stochastic noise. Randomly occurring DoS attacks, which may arise on the shared network and completely block the transmitted signals, are also taken into account. Three sets of Bernoulli-distributed random variables are adopted to model the random occurrence of uncertainties, bit flipping and DoS attacks. The aim of this paper is to design an observer-based PID controller that guarantees the closed-loop system reaches exponential ultimate boundedness in mean square (EUBMS). By virtue of Lyapunov stability theory, stochastic analysis techniques and the matrix inequality method, a sufficient condition is developed for designing the observer-based PID controller such that the closed-loop system achieves EUBMS performance and the ultimate upper bound of the controlled output is minimized. The gain matrices of the observer-based controller are obtained explicitly by solving an optimization problem with several matrix inequality constraints. Two simulation examples adequately demonstrate the effectiveness of the proposed control method. Full article
(This article belongs to the Special Issue Information Theory in Control Systems, 3rd Edition)
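The effect of randomly occurring DoS attacks on the controller-to-actuator channel can be mimicked in a few lines: a discrete PID loop whose control packet is dropped with Bernoulli probability, the actuator holding its last received input. All gains, the scalar first-order plant, and the drop rate below are illustrative assumptions, not the article's observer-based design:

```python
import random

# Discrete PID loop under Bernoulli packet drops (a crude stand-in for
# randomly occurring DoS attacks); gains and plant are illustrative.
def simulate(p_dos=0.2, steps=200, seed=1):
    random.seed(seed)
    kp, ki, kd = 1.2, 0.4, 0.05
    x, ref = 0.0, 1.0            # scalar plant state and setpoint
    integ, prev_err = 0.0, 0.0
    u_applied = 0.0              # actuator holds the last received input
    for _ in range(steps):
        err = ref - x
        integ += err
        u = kp * err + ki * integ + kd * (err - prev_err)
        prev_err = err
        if random.random() > p_dos:   # control packet gets through
            u_applied = u
        x = 0.8 * x + 0.1 * u_applied  # simple stable first-order plant
    return abs(ref - x)

print(simulate())  # residual tracking error; small despite 20% drops
```

Because the nominal closed loop is stable and the actuator holds its last input during an attack, the state still settles near the setpoint; the article's contribution is to guarantee such boundedness rigorously (EUBMS) under uncertainty, bit flipping and DoS simultaneously.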
