Search Results (1,667)

Search Parameters:
Keywords = network traffic generator

22 pages, 1959 KB  
Article
GSF-LLM: Graph-Enhanced Spatio-Temporal Fusion-Based Large Language Model for Traffic Prediction
by Honggang Wang, Ye Li, Wenzhi Zhao, Haozhe Zhu, Jin Zhang and Xuening Wu
Sensors 2025, 25(21), 6698; https://doi.org/10.3390/s25216698 (registering DOI) - 2 Nov 2025
Abstract
Accurate traffic prediction is essential for intelligent transportation systems, urban mobility management, and traffic optimization. However, existing deep learning approaches often struggle to jointly capture complex spatial dependencies and temporal dynamics, and they are prone to overfitting when modeling large-scale traffic networks. To address these challenges, we propose the GSF-LLM (graph-enhanced spatio-temporal fusion-based large language model), a novel framework that integrates large language models (LLMs) with graph-based spatio-temporal learning. GSF-LLM employs a spatio-temporal fusion module to jointly encode spatial and temporal correlations, combined with a partially frozen graph attention (PFGA) mechanism to model topological dependencies while mitigating overfitting. Furthermore, a low-rank adaptation (LoRA) strategy is adopted to fine-tune a subset of LLM parameters, improving training efficiency and generalization. Experiments on multiple real-world traffic datasets demonstrate that GSF-LLM consistently outperforms state-of-the-art baselines, showing strong potential for extension to related tasks such as data imputation, trajectory generation, and anomaly detection.
(This article belongs to the Section Intelligent Sensors)
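The low-rank adaptation (LoRA) strategy mentioned in this abstract can be illustrated with a minimal PyTorch sketch: a frozen base projection is augmented with trainable low-rank factors, so only a small fraction of parameters is updated during fine-tuning. The layer size, rank, and scaling below are arbitrary assumptions, not the GSF-LLM configuration.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base projection plus a trainable low-rank update (generic LoRA idea)."""
    def __init__(self, in_features, out_features, rank=8, alpha=16.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)   # base model weights stay frozen
        self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        # frozen path plus low-rank correction (B @ A) applied to x
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(768, 768)
trainable = [n for n, p in layer.named_parameters() if p.requires_grad]
print(trainable)  # only the two low-rank factors would be updated during fine-tuning
```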

33 pages, 5642 KB  
Article
Feature-Optimized Machine Learning Approaches for Enhanced DDoS Attack Detection and Mitigation
by Ahmed Jamal Ibrahim, Sándor R. Répás and Nurullah Bektaş
Computers 2025, 14(11), 472; https://doi.org/10.3390/computers14110472 (registering DOI) - 1 Nov 2025
Abstract
Distributed denial of service (DDoS) attacks pose a serious risk to the operational stability of corporate networks, often leading to service disruptions, financial damage, and a loss of trust and credibility. The increasing sophistication and scale of these threats highlight the pressing need for advanced mitigation strategies. Despite the numerous existing studies on DDoS detection, many rely on large, redundant feature sets and lack validation for real-time applicability, leading to high computational complexity and limited generalization across diverse network conditions. This study addresses this gap by proposing a feature-optimized and computationally efficient ML framework for DDoS detection and mitigation using a benchmark dataset. The proposed approach serves as a foundational step toward developing a low-complexity model suitable for future real-time and hardware-based implementation. The dataset was systematically preprocessed to identify critical parameters, such as Packet Length Min, Total Backward Packets, Avg Fwd Segment Size, and others. Several ML algorithms, including Logistic Regression, Decision Tree, Random Forest, Gradient Boosting, and CatBoost, are applied to develop models for detecting and mitigating abnormal network traffic. The developed ML models demonstrate high performance, achieving 99.78% accuracy with Decision Tree and 99.85% with Random Forest, representing improvements of 1.53% and 0.74% over previous work, respectively. In addition, the Decision Tree algorithm achieved 99.85% accuracy for mitigation, with an inference time as low as 0.004 s, demonstrating its suitability for identifying DDoS attacks in real time. Overall, this research presents an effective approach for DDoS detection, emphasizing the integration of ML models into existing security systems to enhance real-time threat mitigation.
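The feature-optimized pipeline described above (select a small set of informative flow features, then train tree-based classifiers) can be sketched with scikit-learn. The code below uses synthetic data and generic components (SelectKBest, RandomForestClassifier) purely for illustration; it is not the paper's preprocessing or its benchmark dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Synthetic stand-in for a flow-level DDoS dataset with many redundant features.
X, y = make_classification(n_samples=5000, n_features=40, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Feature-optimized pipeline: keep only the most informative features, then classify.
clf = Pipeline([
    ("select", SelectKBest(mutual_info_classif, k=10)),
    ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
])
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```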

16 pages, 1425 KB  
Article
Unlocking Few-Shot Encrypted Traffic Classification: A Contrastive-Driven Meta-Learning Approach
by Zheng Li, Jian Wang, Ya-Fei Song and Shao-Hua Yue
Electronics 2025, 14(21), 4245; https://doi.org/10.3390/electronics14214245 - 30 Oct 2025
Viewed by 153
Abstract
The classification of encrypted traffic is critical for network security, yet it faces a significant “few-shot” challenge as novel applications with scarce labeled data continuously emerge. This complexity arises from the high-dimensional, noisy nature of traffic data, making it difficult for models to generalize from few examples. Existing paradigms, such as meta-learning from scratch or standard pre-train/fine-tune methods, often fail in this scenario. To address this gap, we propose Contrastive Learning Meta-Flow (CL-MetaFlow), a novel two-stage learning framework that uniquely synergizes the strengths of contrastive representation learning and meta-learning adaptation. In the first stage, a robust feature encoder is pre-trained using supervised contrastive learning on known traffic classes, shaping a highly discriminative and metric-friendly embedding space. In the second stage, this pre-trained encoder initializes a Prototypical Network, enabling rapid and effective adaptation to new, unseen classes from only a few samples. Extensive experiments on benchmark datasets (ISCX-VPN-2016 and ISCX-Tor-2017) demonstrate the superiority of our approach. Notably, in a five-way five-shot setting, CL-MetaFlow achieves a Macro F1-Score of 0.620, significantly outperforming from-scratch ProtoNet (0.384), a standard fine-tuning baseline (0.160), and strong pre-training counterparts like SimCLR+ProtoNet (0.545) and a re-implemented T-Sanitation (0.591). Our work validates that a high-quality, domain-adapted feature prior is the key to unlocking high-performance few-shot learning in complex network environments, providing a practical and powerful solution for real-world traffic analysis.
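The second-stage Prototypical Network classifies a query by its distance to class prototypes computed from a handful of support embeddings. A minimal sketch of that episode logic follows, with random vectors standing in for the contrastively pre-trained encoder's outputs; shapes and the 5-way 5-shot setup are illustrative assumptions.

```python
import torch

def prototypical_logits(support, support_labels, queries, n_way):
    """Negative squared Euclidean distance to class prototypes (generic ProtoNet step)."""
    prototypes = torch.stack([
        support[support_labels == c].mean(dim=0) for c in range(n_way)
    ])                                               # (n_way, dim)
    dists = torch.cdist(queries, prototypes) ** 2    # (n_query, n_way)
    return -dists                                    # higher score = closer prototype

# Toy 5-way 5-shot episode; in practice these would be encoder embeddings of flows.
n_way, k_shot, dim = 5, 5, 64
support = torch.randn(n_way * k_shot, dim)
labels = torch.arange(n_way).repeat_interleave(k_shot)
queries = torch.randn(15, dim)
pred = prototypical_logits(support, labels, queries, n_way).argmax(dim=1)
print(pred)
```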

22 pages, 588 KB  
Article
Hybrid AI-Based Framework for Generating Realistic Attack-Related Network Flow Data for Cybersecurity Digital Twins
by Eider Iturbe, Javier Arcas, Gabriel Gaminde, Erkuden Rios and Nerea Toledo
Appl. Sci. 2025, 15(21), 11574; https://doi.org/10.3390/app152111574 - 29 Oct 2025
Viewed by 130
Abstract
In the context of cybersecurity digital twin environments, the ability to simulate realistic network traffic is critical for validating and training intrusion detection systems. However, generating synthetic data that accurately reflects the complex, time-dependent nature of network flows remains a significant challenge. This paper presents an AI-based data generation approach designed to generate multivariate temporal network flow data that accurately reflects adversarial scenarios. The proposed method integrates a Long Short-Term Memory (LSTM) architecture trained to capture the temporal dynamics of both normal and attack traffic, ensuring the synthetic data preserves realistic, sequence-aware behavioral patterns. To further enhance data fidelity, a combination of deep learning-based generative models and statistical techniques is employed to synthesize both numerical and categorical features while maintaining the correct proportions and temporal relationships between attack and normal traffic. A key contribution of the framework is its ability to generate high-fidelity synthetic data that supports the simulation of realistic, production-like cybersecurity scenarios. Experimental results demonstrate the effectiveness of the approach in generating data that supports robust machine learning-based detection systems, making it a valuable tool for cybersecurity validation and training in digital twin environments.
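The LSTM component described above learns the temporal dynamics of flow sequences; rolled out autoregressively, such a model can emit new, sequence-aware flow feature vectors. The sketch below shows only that generic next-step idea with assumed feature and window sizes; the paper's full framework additionally combines generative models and statistical techniques for categorical features and attack/normal proportions.

```python
import torch
import torch.nn as nn

class FlowLSTM(nn.Module):
    """Predicts the next multivariate flow-feature vector from a window of past flows."""
    def __init__(self, n_features=10, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # next-step feature vector

model = FlowLSTM()
window = torch.randn(1, 20, 10)            # 20 past flows, 10 numeric features each
generated = []
for _ in range(5):                         # roll the model forward to synthesize new flows
    nxt = model(window)
    generated.append(nxt)
    window = torch.cat([window[:, 1:], nxt.unsqueeze(1)], dim=1)
print(torch.stack(generated).shape)        # (5, 1, 10) synthetic flow vectors
```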

20 pages, 1100 KB  
Article
Data Distribution Strategies for Mixed Traffic Flows in Software-Defined Networks: A QoE-Driven Approach
by Hongming Li, Hao Li, Yuqing Ji and Ziwei Wang
Appl. Sci. 2025, 15(21), 11573; https://doi.org/10.3390/app152111573 - 29 Oct 2025
Viewed by 119
Abstract
The rapid proliferation of heterogeneous applications, from latency-critical video delivery to bandwidth-intensive file transfers, poses increasing challenges for modern communication networks. Traditional traffic engineering approaches often fall short in meeting diverse Quality of Experience (QoE) requirements under such conditions. To overcome these limitations, this study proposes a QoE-driven distribution framework for mixed traffic in Software-Defined Networking (SDN) environments. The framework integrates flow categorization, adaptive path selection, and feedback-based optimization to dynamically allocate resources in alignment with application-level QoE metrics. By prioritizing delay-sensitive flows while ensuring efficient handling of high-volume traffic, the approach achieves balanced performance across heterogeneous service demands. In our 15-RSU Mininet tests under service number = 1 and offered demand = 10 ms, JOGAF attains max end-to-end delays of 415.74 ms, close to the 399.64 ms achieved by DOGA, while reducing the number of active hosts from 5 to 3 compared with DOGA. By contrast, HNOGA exhibits delay growth of up to 7716.16 ms with 2 working hosts, indicating poorer suitability for latency-sensitive flows. These results indicate that JOGAF achieves near-DOGA latency with substantially lower host activation, offering a practical energy-aware alternative for mixed-traffic SDN deployments. Beyond generic communication scenarios, the framework also shows strong potential in Intelligent Transportation Systems (ITS), where SDN-enabled vehicular networks require adaptive, user-centric service quality management. This work highlights the necessity of coupling classical traffic engineering concepts with SDN programmability to address the multifaceted challenges of next-generation networking. Moreover, it establishes a foundation for scalable, adaptive data distribution strategies capable of enhancing user experience while maintaining robustness across dynamic traffic environments.
(This article belongs to the Section Electrical, Electronics and Communications Engineering)
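The core idea of QoE-driven distribution, routing delay-sensitive flows on low-latency paths while steering bulk transfers toward high-bandwidth links, can be sketched with networkx. The topology, weights, and selection rule below are invented for illustration and do not reproduce the JOGAF, DOGA, or HNOGA algorithms evaluated in the paper.

```python
import networkx as nx

# Toy topology: edges carry both a latency estimate (ms) and residual bandwidth (Mbps).
G = nx.Graph()
G.add_edge("s1", "s2", latency=5, bandwidth=100)
G.add_edge("s2", "s4", latency=5, bandwidth=40)
G.add_edge("s1", "s3", latency=12, bandwidth=400)
G.add_edge("s3", "s4", latency=12, bandwidth=400)

def pick_path(graph, src, dst, delay_sensitive):
    """Delay-sensitive flows minimize latency; bulk flows prefer high residual bandwidth."""
    if delay_sensitive:
        return nx.shortest_path(graph, src, dst, weight="latency")
    # For bulk transfers, weight edges by inverse bandwidth so wide links are preferred.
    return nx.shortest_path(graph, src, dst,
                            weight=lambda u, v, d: 1.0 / d["bandwidth"])

print(pick_path(G, "s1", "s4", delay_sensitive=True))    # ['s1', 's2', 's4']
print(pick_path(G, "s1", "s4", delay_sensitive=False))   # ['s1', 's3', 's4']
```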

19 pages, 4023 KB  
Article
RL-Based Resource Allocation in SDN-Enabled 6G Networks
by Ivan Radosavljević, Petar D. Bojović and Živko Bojović
Future Internet 2025, 17(11), 497; https://doi.org/10.3390/fi17110497 - 29 Oct 2025
Viewed by 297
Abstract
Dynamic and efficient resource allocation is critical for Software-Defined Networking (SDN)-enabled sixth-generation (6G) networks to ensure adaptability and optimized utilization of network resources. This paper proposes a reinforcement learning (RL)-based framework that integrates an actor–critic model with a modular SDN interface for fine-grained, queue-level bandwidth scheduling. The framework further incorporates a stochastic traffic generator for training and a virtualized multi-slice platform testbed for a realistic beyond-5G/6G evaluation. Experimental results show that the proposed RL model significantly outperforms a baseline forecasting model: it converges faster, showing notable improvements after 240 training epochs, achieves higher cumulative rewards, and reduces packet drops under dynamic traffic conditions. Moreover, the RL-based scheduling mechanism exhibits improved adaptability to traffic fluctuations, although both approaches face challenges under node outage conditions. These findings confirm that queue-level reinforcement learning enhances responsiveness and reliability in 6G networks, while also highlighting open challenges in fault-tolerant scheduling.
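The actor–critic model at the heart of the proposed framework can be illustrated with a single advantage actor–critic update in PyTorch. The state (e.g., queue occupancies) and discrete bandwidth-share actions below are simplifications assumed for the sketch, not the paper's state or action space.

```python
import torch
import torch.nn as nn

state_dim, n_actions = 8, 4          # e.g. queue occupancies -> discrete bandwidth share
actor = nn.Sequential(nn.Linear(state_dim, 32), nn.Tanh(), nn.Linear(32, n_actions))
critic = nn.Sequential(nn.Linear(state_dim, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(list(actor.parameters()) + list(critic.parameters()), lr=1e-3)

def update(state, next_state, reward, gamma=0.99):
    """One advantage actor-critic step on a single (state, reward, next_state) transition."""
    dist = torch.distributions.Categorical(logits=actor(state))
    action = dist.sample()
    value = critic(state)
    target = reward + gamma * critic(next_state).detach()
    advantage = target - value
    actor_loss = -dist.log_prob(action) * advantage.detach()   # policy-gradient term
    critic_loss = advantage.pow(2)                              # value-regression term
    loss = (actor_loss + critic_loss).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return action.item()

a = update(torch.randn(state_dim), torch.randn(state_dim), reward=1.0)
print("chosen bandwidth level:", a)
```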

33 pages, 1134 KB  
Review
A Comprehensive Review of DDoS Detection and Mitigation in SDN Environments: Machine Learning, Deep Learning, and Federated Learning Perspectives
by Sidra Batool, Muhammad Aslam, Edore Akpokodje and Syeda Fizzah Jilani
Electronics 2025, 14(21), 4222; https://doi.org/10.3390/electronics14214222 - 29 Oct 2025
Viewed by 390
Abstract
Software-defined networking (SDN) has reformed the traditional approach to managing and configuring networks by isolating the data plane from the control plane. This isolation enables centralized control over network resources, enhanced programmability, and the ability to dynamically apply and enforce security and traffic policies. The architectural shift offers numerous advantages, such as increased flexibility, scalability, and improved network management, but it also introduces new and notable security challenges such as Distributed Denial-of-Service (DDoS) attacks. Such attacks overwhelm the target with malicious traffic, and even short-lived DDoS incidents can drastically impact the entire network’s stability, performance, and availability. This comprehensive review provides a detailed investigation of SDN principles, the nature of DDoS threats in such environments, and the strategies used to detect and mitigate these attacks. It offers an in-depth categorization of state-of-the-art detection techniques based on machine learning, deep learning, and federated learning in domain-specific and general-purpose SDN scenarios, analyzing each method for its effectiveness. The paper further evaluates the strengths and weaknesses of these techniques, highlighting their applicability in different SDN contexts, and outlines the key performance metrics used to evaluate these detection mechanisms. A further contribution is the classification of the datasets commonly used for training and validating DDoS detection models into two major categories: legacy-compatible datasets adapted from traditional network environments, and SDN-contextual datasets generated specifically to reflect the characteristics of modern SDN systems. Finally, the paper suggests directions for future research, including enhancing the robustness of detection models, integrating privacy-preserving techniques into collaborative learning, and developing more comprehensive and realistic SDN-specific datasets to strengthen SDN infrastructures against DDoS threats.

22 pages, 1339 KB  
Article
AI-Powered Security for IoT Ecosystems: A Hybrid Deep Learning Approach to Anomaly Detection
by Deepak Kumar, Priyanka Pramod Pawar, Santosh Reddy Addula, Mohan Kumar Meesala, Oludotun Oni, Qasim Naveed Cheema, Anwar Ul Haq and Guna Sekhar Sajja
J. Cybersecur. Priv. 2025, 5(4), 90; https://doi.org/10.3390/jcp5040090 - 27 Oct 2025
Viewed by 428
Abstract
The rapid expansion of the Internet of Things (IoT) has introduced new vulnerabilities that traditional security mechanisms often fail to address effectively. Signature-based intrusion detection systems cannot adapt to zero-day attacks, while rule-based solutions lack scalability for the diverse and high-volume traffic in IoT environments. To strengthen the security framework for IoT, this paper proposes a deep learning-based anomaly detection approach that integrates Convolutional Neural Networks (CNNs) and Bidirectional Gated Recurrent Units (BiGRUs). The model is further optimized using the Moth–Flame Optimization (MFO) algorithm for automated hyperparameter tuning. To mitigate class imbalance in benchmark datasets, we employ Generative Adversarial Networks (GANs) for synthetic sample generation alongside Z-score normalization. The proposed CNN–BiGRU + MFO framework is evaluated on two widely used datasets, UNSW-NB15 and UCI SECOM. Experimental results demonstrate superior performance compared to several baseline deep learning models, achieving improvements across accuracy, precision, recall, F1-score, and ROC–AUC. These findings highlight the potential of combining hybrid deep learning architectures with evolutionary optimization for effective and generalizable intrusion detection in IoT systems.
(This article belongs to the Special Issue Cybersecurity in the Age of AI and IoT: Challenges and Innovations)
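A compact version of the CNN–BiGRU idea, a 1-D convolution over per-record features followed by a bidirectional GRU and a classifier head, is sketched below in PyTorch. Layer sizes are placeholders, and the MFO hyperparameter tuning and GAN-based class balancing described in the abstract are omitted.

```python
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    """1-D convolution over the feature vector, then a bidirectional GRU and a classifier."""
    def __init__(self, n_features=32, n_classes=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.bigru = nn.GRU(16, 32, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, n_features)
        x = self.conv(x.unsqueeze(1))          # (batch, 16, n_features/2)
        x = x.transpose(1, 2)                  # (batch, time, channels) for the GRU
        _, h = self.bigru(x)                   # h: (2, batch, 32)
        h = torch.cat([h[0], h[1]], dim=1)     # concatenate both directions
        return self.fc(h)

model = CNNBiGRU()
print(model(torch.randn(4, 32)).shape)         # (4, 2) class logits
```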

29 pages, 1491 KB  
Article
Towards Sustainable Urban Mobility: Evaluating the Effective Connectivity of Cycling Networks in Mixed Traffic Environments of Nanjing, China
by Zhaoqiu Tan and Jinru Wang
Sustainability 2025, 17(21), 9528; https://doi.org/10.3390/su17219528 - 26 Oct 2025
Viewed by 336
Abstract
Promoting cycling in mixed-traffic environments remains a global challenge, hinging on the development of well-connected, low-stress networks. However, existing evaluation frameworks often lack comprehensiveness, overlooking the multifaceted nature of cyclists’ experiences. This study addresses this gap by proposing a novel multidimensional evaluation framework for assessing the effective connectivity of urban cycling networks. The framework integrates four critical dimensions: (1) structural connectivity of the basic road network, (2) dynamic interference from mixed traffic, (3) comfort of the cycling environment, and (4) cross-barrier connectivity. Using Nanjing, China, as a case study, we applied a hybrid Analytic Hierarchy Process (AHP)–Grey Clustering method to derive objective indicator weights and conduct a comprehensive evaluation. The results yield a composite score of 3.2568 (on a 0–4 scale), classifying Nanjing’s cycling network connectivity at the “Four-Star” level, indicating a generally positive developmental trajectory. Nevertheless, spatial disparities persist: the urban core faces intense traffic interference, while peripheral areas are hindered by network fragmentation and poor permeability. Key challenges include frequent vehicle–cyclist conflicts at intersections, inadequate nighttime illumination, suboptimal pavement conditions, and excessive detours caused by natural barriers such as the Yangtze River. This study provides urban planners and policymakers with a robust and systematic diagnostic tool to identify deficiencies and prioritize targeted interventions, ultimately contributing to sustainable urban mobility by enhancing the resilience, equity, and attractiveness of cycling networks in complex mixed-traffic settings.
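The AHP step of the hybrid AHP–Grey Clustering method derives indicator weights from a pairwise-comparison matrix via its principal eigenvector. The worked example below uses an invented comparison matrix over the study's four dimensions; the actual judgments and weights are those reported in the paper.

```python
import numpy as np

# Hypothetical pairwise comparisons between the four dimensions in the study
# (structural connectivity, traffic interference, comfort, cross-barrier connectivity).
A = np.array([
    [1.0, 2.0, 3.0, 2.0],
    [0.5, 1.0, 2.0, 1.0],
    [1/3, 0.5, 1.0, 0.5],
    [0.5, 1.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized AHP priority vector

# Consistency check (RI = 0.90 is Saaty's random index for a 4x4 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), "CR:", round(ci / 0.90, 3))
```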

29 pages, 3861 KB  
Article
Mitigating Crossfire Attacks via Topology Spoofing Based on ENRNN-MTD
by Dexian Chang, Xiaobing Zhang, Jiajia Sun and Chen Fang
Appl. Sci. 2025, 15(21), 11432; https://doi.org/10.3390/app152111432 - 25 Oct 2025
Viewed by 297
Abstract
Crossfire attacks disrupt network services by targeting critical links of server groups, causing traffic congestion and server failures that prevent legitimate users from accessing services. To counter this threat, this study proposes a novel topology spoofing defense mechanism based on a sequence-based Graph Neural Network–Moving Target Defense (ENRNN-MTD). During the reconnaissance phase, the method employs a GNN to generate multiple random and diverse virtual topologies, which are mapped to various external hosts. This obscures the real internal network structure and complicates the attacker’s ability to accurately identify it. In the attack phase, an IP random-hopping mechanism using a chaotic sequence is introduced to conceal node information and increase the cost of launching attacks, thereby enhancing the protection of critical services. Experimental results demonstrate that, compared to existing defense mechanisms, the proposed approach exhibits significant advantages in terms of deception topology randomness, defensive effectiveness, and system load management.
(This article belongs to the Special Issue IoT Technology and Information Security)
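The IP random-hopping idea, using a chaotic sequence to pick the next virtual address so an attacker cannot predict node locations, can be sketched with a logistic map. The map, seed, and address pool below are assumptions for illustration; the paper's specific chaotic sequence and hopping protocol are not reproduced here.

```python
def logistic_map(x0=0.37, r=3.99):
    """Chaotic logistic-map iterator used as a hard-to-predict hop driver."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

def hop_schedule(pool, steps, seed=0.37):
    """Map each chaotic value onto an address from a virtual-IP pool."""
    gen = logistic_map(seed)
    return [pool[int(next(gen) * len(pool)) % len(pool)] for _ in range(steps)]

virtual_ips = [f"10.0.{i}.1" for i in range(1, 9)]
print(hop_schedule(virtual_ips, steps=5))      # e.g. a new virtual address per hop epoch
```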

33 pages, 1433 KB  
Article
Hybrid Time Series Transformer–Deep Belief Network for Robust Anomaly Detection in Mobile Communication Networks
by Anita Ershadi Oskouei, Mehrdad Kaveh, Francisco Hernando-Gallego and Diego Martín
Symmetry 2025, 17(11), 1800; https://doi.org/10.3390/sym17111800 - 25 Oct 2025
Viewed by 392
Abstract
The rapid evolution of 5G and emerging 6G networks has increased system complexity, data volume, and security risks, making anomaly detection vital for ensuring reliability and resilience. However, existing machine learning (ML)-based approaches still face challenges related to poor generalization, weak temporal modeling, and degraded accuracy under heterogeneous and imbalanced real-world conditions. To overcome these limitations, a hybrid time series transformer–deep belief network (HTST-DBN) is introduced, integrating the sequential modeling strength of TST with the hierarchical feature representation of DBN, while an improved orchard algorithm (IOA) performs adaptive hyper-parameter optimization. The framework also embodies the concept of symmetry and asymmetry. The IOA introduces controlled symmetry-breaking between exploration and exploitation, while the TST captures symmetric temporal patterns in network traffic whose asymmetric deviations often indicate anomalies. The proposed method is evaluated across four benchmark datasets (ToN-IoT, 5G-NIDD, CICDDoS2019, and Edge-IoTset) that capture diverse network environments, including 5G core traffic, IoT telemetry, mobile edge computing, and DDoS attacks. Experimental evaluation is conducted by benchmarking HTST-DBN against several state-of-the-art models, including TST, bidirectional encoder representations from transformers (BERT), DBN, deep reinforcement learning (DRL), convolutional neural network (CNN), and random forest (RF) classifiers. The proposed HTST-DBN achieves outstanding performance, with the highest accuracy reaching 99.61%, alongside strong recall and area under the curve (AUC) scores. The HTST-DBN framework presents a scalable and reliable solution for anomaly detection in next-generation mobile networks. Its hybrid architecture, reinforced by hyper-parameter optimization, enables effective learning in complex, dynamic, and heterogeneous environments, making it suitable for real-world deployment in future 5G/6G infrastructures.
(This article belongs to the Special Issue AI-Driven Optimization for EDA: Balancing Symmetry and Asymmetry)

33 pages, 2850 KB  
Review
Network Traffic Analysis Based on Graph Neural Networks: A Scoping Review
by Ruonan Wang, Jinjing Zhao, Hongzheng Zhang, Liqiang He, Hu Li and Minhuan Huang
Big Data Cogn. Comput. 2025, 9(11), 270; https://doi.org/10.3390/bdcc9110270 - 24 Oct 2025
Viewed by 638
Abstract
Network traffic analysis is crucial for understanding network behavior and identifying underlying applications, protocols, and service groups. The increasing complexity of network environments, driven by the evolution of the Internet, poses significant challenges to traditional analytical approaches. Graph Neural Networks (GNNs) have recently garnered considerable attention in network traffic analysis due to their ability to model complex relationships within network flows and between communicating entities. This scoping review systematically surveys major academic databases, employing predefined eligibility criteria to identify and synthesize key research in the field, following the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) methodology. We present a comprehensive overview of a generalized architecture for GNN-based traffic analysis and categorize recent methods into three primary types: node prediction, edge prediction, and graph prediction. We discuss challenges in network traffic analysis, summarize solutions from various methods, and provide practical recommendations for model selection. This review also compiles publicly available datasets and open-source code, serving as valuable resources for further research. Finally, we outline future research directions to advance this field. This work offers an updated understanding of GNN applications in network traffic analysis and provides practical guidance for researchers and practitioners.

20 pages, 1257 KB  
Article
Detecting AI-Generated Network Traffic Using Transformer–MLP Ensemble
by Byeongchan Kim, Abhishek Chaudhary and Sunoh Choi
Appl. Sci. 2025, 15(21), 11338; https://doi.org/10.3390/app152111338 - 22 Oct 2025
Viewed by 322
Abstract
The rapid growth of generative artificial intelligence (AI) has enabled diverse applications but also introduced new attack techniques. Similar to deepfake media, generative AI can be exploited to create AI-generated traffic that evades existing intrusion detection systems (IDSs). This paper proposes a Dual Detection System to detect such synthetic network traffic in the Message Queuing Telemetry Transport (MQTT) protocol widely used in Internet of Things (IoT) environments. The system operates in two stages: (i) primary filtering with a Long Short-Term Memory (LSTM) model to detect malicious traffic, and (ii) secondary verification with a Transformer–MLP ensemble to identify AI-generated traffic. Experimental results show that the proposed method achieves an average accuracy of 99.1 ± 0.6% across different traffic types (normal, malicious, and AI-generated), with nearly 100% detection of synthetic traffic. These findings demonstrate that the proposed dual detection system effectively overcomes the limitations of single-model approaches and significantly enhances detection performance.
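The two-stage structure, an LSTM that first filters malicious traffic and a Transformer-based verifier that then flags AI-generated traffic, can be outlined as follows. Feature dimensions, thresholds, and the untrained toy models are assumptions, and the MLP member of the ensemble is omitted for brevity.

```python
import torch
import torch.nn as nn

class LSTMFilter(nn.Module):
    """Stage 1: binary malicious-vs-normal score for a packet/flow sequence."""
    def __init__(self, n_features=16):
        super().__init__()
        self.lstm = nn.LSTM(n_features, 32, batch_first=True)
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out[:, -1]))

class TransformerVerifier(nn.Module):
    """Stage 2: decides whether a flagged sequence looks AI-generated."""
    def __init__(self, n_features=16):
        super().__init__()
        self.proj = nn.Linear(n_features, 32)
        layer = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        z = self.encoder(self.proj(x)).mean(dim=1)    # mean-pool over time steps
        return torch.sigmoid(self.head(z))

stage1, stage2 = LSTMFilter(), TransformerVerifier()
flows = torch.randn(8, 20, 16)                         # 8 sequences of 20 steps
malicious = stage1(flows).squeeze(1) > 0.5             # primary filtering
if malicious.any():
    ai_generated = stage2(flows[malicious]).squeeze(1) > 0.5  # secondary verification
    print(ai_generated)
```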

24 pages, 3824 KB  
Article
BiTAD: An Interpretable Temporal Anomaly Detector for 5G Networks with TwinLens Explainability
by Justin Li Ting Lau, Ying Han Pang, Charilaos Zarakovitis, Heng Siong Lim, Dionysis Skordoulis, Shih Yin Ooi, Kah Yoong Chan and Wai Leong Pang
Future Internet 2025, 17(11), 482; https://doi.org/10.3390/fi17110482 - 22 Oct 2025
Viewed by 314
Abstract
The transition to 5G networks brings unprecedented speed, ultra-low latency, and massive connectivity. Nevertheless, it introduces complex traffic patterns and broader attack surfaces that render traditional intrusion detection systems (IDSs) ineffective. Existing rule-based methods and classical machine learning approaches struggle to capture the temporal and dynamic characteristics of 5G traffic, while many deep learning models lack interpretability, making them unsuitable for high-stakes security environments. To address these challenges, we propose the Bidirectional Temporal Anomaly Detector (BiTAD), a deep temporal learning architecture for anomaly detection in 5G networks. BiTAD leverages dual-direction temporal sequence modelling with attention to encode both past and future dependencies while focusing on critical segments within network sequences. Like many deep models, BiTAD faces interpretability challenges. To resolve its “black-box” nature, a dual-perspective explainability module, coined TwinLens, is proposed. This module integrates SHAP and TimeSHAP to provide global feature attribution and temporal relevance, delivering dual-perspective interpretability. Evaluated on the public 5G-NIDD dataset, BiTAD demonstrates superior detection performance compared to existing models. TwinLens enables transparent insights by identifying which features were most influential to anomaly predictions and when. By jointly addressing the limitations in temporal modelling and interpretability, our work contributes a practical IDS framework tailored to the demands of next-generation mobile networks.
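As a generic picture of dual-direction temporal modelling with attention, the sketch below pools a bidirectional LSTM's outputs with a learned attention over time steps and also returns the attention weights (the kind of temporal relevance TwinLens surfaces via TimeSHAP). It is not BiTAD itself, and all dimensions are placeholders.

```python
import torch
import torch.nn as nn

class BiTemporalDetector(nn.Module):
    """Bidirectional LSTM with attention pooling over time; a generic stand-in
    for dual-direction temporal modelling of traffic sequences."""
    def __init__(self, n_features=20, hidden=32, n_classes=2):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                              # x: (batch, time, n_features)
        h, _ = self.bilstm(x)                          # (batch, time, 2*hidden)
        scores = torch.softmax(self.attn(h), dim=1)    # attention over time steps
        context = (scores * h).sum(dim=1)              # weighted summary of the sequence
        return self.head(context), scores.squeeze(-1)

model = BiTemporalDetector()
logits, attention = model(torch.randn(4, 50, 20))
print(logits.shape, attention.shape)                   # (4, 2) and (4, 50)
```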

28 pages, 990 KB  
Article
Cross-Domain Adversarial Alignment for Network Anomaly Detection Through Behavioral Embedding Enrichment
by Cristian Salvador-Najar and Luis Julián Domínguez Pérez
Computers 2025, 14(11), 450; https://doi.org/10.3390/computers14110450 - 22 Oct 2025
Viewed by 284
Abstract
Detecting anomalies in network traffic is a central task in cybersecurity and digital infrastructure management. Traditional approaches rely on statistical models, rule-based systems, or machine learning techniques to identify deviations from expected patterns, but often face limitations in generalization across domains. This study proposes a cross-domain data enrichment framework that integrates behavioral embeddings with network traffic features through adversarial autoencoders. Each network traffic record is paired with the most similar behavioral profile embedding from user web activity data (Charles dataset) using cosine similarity, thereby providing contextual enrichment for anomaly detection. The proposed system comprises (i) behavioral profile clustering via autoencoder embeddings and (ii) cross-domain latent alignment through adversarial autoencoders, with a discriminator to enable feature fusion. A Deep Feedforward Neural Network trained on the enriched feature space achieves 97.17% accuracy, 96.95% precision, 97.34% recall, and 97.14% F1-score, with stable cross-validation performance (99.79% average accuracy across folds). Behavioral clustering quality is supported by a silhouette score of 0.86 and a Davies–Bouldin index of 0.57. To assess robustness and transferability, the framework was evaluated on the UNSW-NB15 and the CIC-IDS2017 datasets, where results confirmed consistent performance and reliability when compared to traffic-only baselines. This supports the feasibility of cross-domain alignment and shows that adversarial training enables stable feature integration without evidence of overfitting or memorization.
(This article belongs to the Section ICT Infrastructures for Cybersecurity)
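The enrichment step, pairing each traffic record with its most similar behavioral profile embedding by cosine similarity and concatenating the two, can be sketched with NumPy and scikit-learn. The data and the shared dimensionality below are assumptions; the adversarial autoencoder alignment and the downstream classifier are not shown.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
traffic = rng.normal(size=(100, 24))     # flow-level feature vectors
behavior = rng.normal(size=(10, 24))     # behavioral profile embeddings (same dim assumed)

# For every traffic record, find the most similar behavioral profile...
sims = cosine_similarity(traffic, behavior)           # (100, 10)
nearest = sims.argmax(axis=1)

# ...and append that profile to the flow features as contextual enrichment.
enriched = np.concatenate([traffic, behavior[nearest]], axis=1)
print(enriched.shape)                                 # (100, 48)
```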
