Search Results (46)

Search Parameters:
Keywords = SDN measurement

29 pages, 466 KB  
Review
From Counters to Telemetry: A Survey of Programmable Network-Wide Monitoring
by Nofel Yaseen
Network 2025, 5(3), 38; https://doi.org/10.3390/network5030038 - 16 Sep 2025
Viewed by 1537
Abstract
Network monitoring is becoming increasingly challenging as networks grow in scale, speed, and complexity. The evolution of monitoring approaches reflects a shift from device-centric, localized techniques toward network-wide observability enabled by modern networking paradigms. Early methods like SNMP polling and NetFlow provided basic insights but struggled with real-time visibility in large, dynamic environments. The emergence of Software-Defined Networking (SDN) introduced centralized control and a global view of network state, opening the door to more coordinated and programmable measurement strategies. More recently, programmable data planes (e.g., P4-based switches) and in-band telemetry frameworks have allowed fine-grained, line-rate data collection directly from traffic, reducing overhead and latency compared to traditional polling. These developments mark a move away from single-point or per-flow analysis toward holistic monitoring woven throughout the network fabric. In this survey, we systematically review the state of the art in network-wide monitoring. We define key concepts (topologies, flows, telemetry, observability) and trace the progression of monitoring architectures from traditional networks to SDN to fully programmable networks. We introduce a taxonomy spanning local device measures, path-level techniques, global network-wide methods, and hybrid approaches. Finally, we summarize open research challenges and future directions, highlighting that modern networks demand monitoring frameworks that are not only scalable and real-time but also tightly integrated with network control and automation.
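
A minimal sketch of the polling-style measurement the survey contrasts with telemetry: link utilization estimated from two successive byte-counter reads, as an SNMP monitor would do. The counter here is simulated; a real monitor would issue an SNMP or gNMI read of the interface counter.

```python
import random
import time

def read_byte_counter(device: str, interface: str) -> int:
    """Stand-in for an SNMP GET of an interface byte counter (e.g.,
    ifHCInOctets); here the counter is simulated."""
    read_byte_counter.value += random.randint(10_000, 1_000_000)
    return read_byte_counter.value

read_byte_counter.value = 0

def poll_utilization(device, interface, capacity_bps, interval_s=1.0):
    """Classic polling: estimate utilization from two successive counter
    samples. The poll interval bounds how quickly changes become visible,
    which is the overhead/latency trade-off in-band telemetry avoids."""
    first = read_byte_counter(device, interface)
    time.sleep(interval_s)
    second = read_byte_counter(device, interface)
    bits = (second - first) * 8                # octets -> bits
    return bits / (interval_s * capacity_bps)  # fraction of link capacity

print(f"utilization ~ {poll_utilization('sw1', 'eth0', 1e9):.2%}")
```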

31 pages, 2736 KB  
Article
Unseen Attack Detection in Software-Defined Networking Using a BERT-Based Large Language Model
by Mohammed N. Swileh and Shengli Zhang
AI 2025, 6(7), 154; https://doi.org/10.3390/ai6070154 - 11 Jul 2025
Cited by 1 | Viewed by 1391
Abstract
Software-defined networking (SDN) represents a transformative shift in network architecture by decoupling the control plane from the data plane, enabling centralized and flexible management of network resources. However, this architectural shift introduces significant security challenges, as SDN’s centralized control becomes an attractive target for various types of attacks. While the body of current research on attack detection in SDN has yielded important results, several critical gaps remain that require further exploration. Addressing challenges in feature selection, broadening the scope beyond Distributed Denial of Service (DDoS) attacks, strengthening attack decisions based on multi-flow analysis, and building models capable of detecting unseen attacks on which they have not been explicitly trained are essential steps toward advancing security measures in SDN environments. In this paper, we introduce a novel approach that leverages Natural Language Processing (NLP) and the pre-trained Bidirectional Encoder Representations from Transformers (BERT)-base-uncased model to enhance the detection of attacks in SDN environments. Our approach transforms network flow data into a format interpretable by language models, allowing BERT-base-uncased to capture intricate patterns and relationships within network traffic. By utilizing Random Forest for feature selection, we optimize model performance and reduce computational overhead, ensuring efficient and accurate detection. Attack decisions are made based on several flows, providing stronger and more reliable detection of malicious traffic. Furthermore, our proposed method is specifically designed to detect previously unseen attacks, offering a solution for identifying threats that the model was not explicitly trained on. To rigorously evaluate our approach, we conducted experiments in two scenarios: one focused on detecting known attacks, achieving an accuracy, precision, recall, and F1-score of 99.96%, and another on detecting previously unseen attacks, where our model achieved 99.96% in all metrics. These results demonstrate the robustness and precision of our framework in detecting evolving threats and reinforce its potential to improve the security and resilience of SDN networks.
(This article belongs to the Special Issue Artificial Intelligence for Network Management)
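
The core move the abstract describes, serializing flow records into text a language model can score, can be sketched as below with the HuggingFace transformers library. This is an illustration, not the authors' pipeline: the flow fields are invented, and the two-label classification head is untrained here, so the score is meaningful only after fine-tuning on labeled flows.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # 2-class head: benign vs. attack

def flow_to_text(flow: dict) -> str:
    """Serialize flow features into a pseudo-sentence; these field names are
    illustrative, not the paper's Random-Forest-selected features."""
    return " ".join(f"{k} {v}" for k, v in flow.items())

flow = {"proto": "tcp", "duration": 0.37, "pkts": 52, "bytes": 40311, "flags": "syn"}
inputs = tokenizer(flow_to_text(flow), return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print(f"P(attack) = {probs[0, 1].item():.3f}")  # meaningful only after fine-tuning
```

A per-flow score like this would then be aggregated over several flows before an attack decision is made, as the abstract describes.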

18 pages, 835 KB  
Article
Reliability Evaluation of Two-Stage Uncertain Multi-State Weighted k-Out-of-n Systems
by Chun Wei, Haiyan Shi and Zhiqiang Zhang
Symmetry 2025, 17(6), 912; https://doi.org/10.3390/sym17060912 - 9 Jun 2025
Viewed by 498
Abstract
A class of complex systems can be structurally decomposed into several typical multi-state weighted k/n systems. In view of this, this paper proposes a model to evaluate the reliability of such systems. The system reliability evaluation process is divided into two critical stages: (1) a primary stage of structural reliability analysis (system-level) and (2) a secondary stage of subsystem reliability verification (component-level). In practical applications, when there is a lack of operational data, expert experience is needed to evaluate the states and availability weights of components, which is a typical application scenario for uncertainty theory. Under the framework of uncertainty theory, two-stage reliability measures and importance measures are defined, and corresponding calculation formulas are derived. To improve computational efficiency, a binary search algorithm is proposed. Finally, an SDN application case is presented to demonstrate the effectiveness of the theory and method. This study provides an approach to the reliability modeling of complex systems (multiple pipeline transmissions, power systems, distributed computing, etc.) in the absence of operational data.
(This article belongs to the Section Mathematics)
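
For intuition, a weighted k-out-of-n system works when the total weight of its working components reaches the threshold k. The sketch below evaluates this classical (probabilistic) reliability by enumeration; the paper's uncertainty-theoretic measures and binary search algorithm are not reproduced here.

```python
from itertools import product

def weighted_k_out_of_n_reliability(weights, probs, k):
    """Probability that the total weight of working components reaches k.
    Exhaustive enumeration over component states; exponential in n, so it
    suits only small illustrative systems."""
    total = 0.0
    for states in product([0, 1], repeat=len(weights)):
        p = 1.0
        for s, q in zip(states, probs):
            p *= q if s else (1 - q)
        if sum(w for s, w in zip(states, weights) if s) >= k:
            total += p
    return total

# Example: three components with weights 2, 3, 5; system needs weight >= 5.
print(weighted_k_out_of_n_reliability([2, 3, 5], [0.9, 0.8, 0.95], 5))
```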

22 pages, 5204 KB  
Article
Faulty Links’ Fast Recovery Method Based on Deep Reinforcement Learning
by Wanwei Huang, Wenqiang Gui, Yingying Li, Qingsong Lv, Jia Zhang and Xi He
Algorithms 2025, 18(5), 241; https://doi.org/10.3390/a18050241 - 24 Apr 2025
Cited by 1 | Viewed by 702
Abstract
Aiming to address the high recovery delay and link congestion issues in the communication network of Wide-Area Measurement Systems (WAMSs), this paper introduces Software-Defined Networking (SDN) and proposes a deep reinforcement learning-based faulty-link fast recovery method (DDPG-LBBP). The DDPG-LBBP method takes delay and link utilization as the optimization objectives and uses a gated recurrent neural network to accelerate algorithm convergence and output the optimal link weights for load balancing. By designing maximally disjoint backup paths, the method ensures the independence of the primary and backup paths, effectively preventing secondary failures caused by path overlap. The experiment compares the (1+2ε)-BPCA, FFRLI, and LIR methods using IEEE 30 and IEEE 57 benchmark power system communication network topologies. Experimental results show that DDPG-LBBP outperforms the others in faulty-link recovery delay, packet loss rate, and recovery success rate. Specifically, compared to the strongest baseline, (1+2ε)-BPCA, recovery delay is decreased by about 12.26% and recovery success rate is improved by about 6.91%. Additionally, packet loss rate is decreased by about 15.31% compared to the strongest baseline for that metric, FFRLI.
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)
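
The maximally disjoint backup-path idea can be sketched independently of the DDPG machinery: compute a primary path, then recompute with primary edges heavily penalized so the backup overlaps it only when unavoidable. A hedged sketch with networkx, using a made-up topology with delay weights:

```python
import networkx as nx

def primary_and_backup(g: nx.Graph, src, dst, penalty=1_000.0):
    """Delay-shortest primary path, then a maximally disjoint backup: edges
    on the primary path are penalized rather than removed, so a backup still
    exists even when full disjointness is impossible."""
    primary = nx.shortest_path(g, src, dst, weight="delay")
    primary_edges = set(zip(primary, primary[1:]))

    def w(u, v, d):  # callable weight: penalize reuse of primary edges
        shared = (u, v) in primary_edges or (v, u) in primary_edges
        return d["delay"] + (penalty if shared else 0.0)

    backup = nx.shortest_path(g, src, dst, weight=w)
    return primary, backup

g = nx.Graph()
g.add_weighted_edges_from(
    [(1, 2, 1.0), (2, 4, 1.0), (1, 3, 2.0), (3, 4, 2.0), (2, 3, 1.5)],
    weight="delay")
print(primary_and_backup(g, 1, 4))  # ([1, 2, 4], [1, 3, 4])
```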

28 pages, 1185 KB  
Review
Integrating Blockchains with the IoT: A Review of Architectures and Marine Use Cases
by Andreas Polyvios Delladetsimas, Stamatis Papangelou, Elias Iosif and George Giaglis
Computers 2024, 13(12), 329; https://doi.org/10.3390/computers13120329 - 6 Dec 2024
Cited by 4 | Viewed by 3935
Abstract
This review examines the integration of blockchain technology with the IoT (BIoT) in the Marine Internet of Things (MIoT) and Internet of Underwater Things (IoUT), with applications in areas such as oceanographic monitoring and naval defense. These environments present distinct challenges, including a limited communication bandwidth, energy constraints, and secure data handling needs. Enhancing BIoT systems requires a strategic selection of computing paradigms, such as edge and fog computing, and lightweight nodes to reduce latency and improve data processing in resource-limited settings. While a blockchain can improve data integrity and security, it can also introduce complexities, including interoperability issues, high energy consumption, standardization challenges, and costly transitions from legacy systems. The solutions reviewed here include lightweight consensus mechanisms to reduce computational demands. They also utilize established platforms, such as Ethereum and Hyperledger, or custom blockchains designed to meet marine-specific requirements. Additional approaches incorporate technologies such as fog and edge layers, software-defined networking (SDN), the InterPlanetary File System (IPFS) for decentralized storage, and AI-enhanced security measures, all adapted to each application’s needs. Future research will need to prioritize scalability, energy efficiency, and interoperability for effective BIoT deployment.
(This article belongs to the Special Issue When Blockchain Meets IoT: Challenges and Potentials)

29 pages, 671 KB  
Article
A Detailed Inspection of Machine Learning Based Intrusion Detection Systems for Software Defined Networks
by Saif AlDeen AlSharman, Osama Al-Khaleel and Mahmoud Al-Ayyoub
IoT 2024, 5(4), 756-784; https://doi.org/10.3390/iot5040034 - 11 Nov 2024
Cited by 1 | Viewed by 2174
Abstract
The growing use of the Internet of Things (IoT) across a vast number of sectors in our daily life noticeably exposes IoT internet-connected devices, which generate, share, and store sensitive data, to a wide range of cyber threats. Software Defined Networks (SDNs) can play a significant role in enhancing the security of IoT networks against any potential attacks. The goal of the SDN approach to network administration is to enhance network performance and monitoring. This is achieved by allowing more dynamic and programmatically efficient network configuration, hence simplifying networks through centralized management and control. Manufacturers face many difficulties in managing the risks associated with evolving technology, as the technology itself introduces a variety of vulnerabilities and dangers. Therefore, Intrusion Detection Systems (IDSs) are an essential component for monitoring suspicious behaviors. While IDSs can be implemented more simply due to the centralized view of an SDN, the effectiveness of modern detection methods, which are mainly based on machine learning (ML) or deep learning (DL), depends on the quality of the data used in their modeling. Anomaly-based detection systems employed in SDNs are hampered by the lack of publicly available data, especially for the data plane. The large majority of existing literature relies on data from conventional networks. This study aims to generate multiple types of Distributed Denial of Service (DDoS) and Denial of Service (DoS) attacks over the data plane (Southbound) portion of an SDN implementation. Cutting-edge virtualization technology is used to simulate a real-world distributed environment with Docker orchestration. The collected dataset contains examples of both benign and suspicious forms of attacks on the data plane of an SDN infrastructure. We also conduct an experimental evaluation of our collected dataset with well-known machine learning-based techniques and statistical measures to prove their usefulness. Both resources we build in this work (the dataset we create and the baseline models we train on it) can be useful for researchers and practitioners working on improving the security of IoT networks by using SDN technologies.
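
A baseline evaluation of the kind the abstract mentions might look like the following scikit-learn sketch; the CSV file name and the label column are placeholders for the published dataset's actual layout.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical flow-level CSV; the file name and the "label" column
# (benign vs. attack) are assumptions for this sketch.
df = pd.read_csv("sdn_dataplane_flows.csv")
X = df.drop(columns=["label"])
y = df["label"]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```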

17 pages, 5706 KB  
Article
Dynamic Routing Using Fuzzy Logic for URLLC in 5G Networks Based on Software-Defined Networking
by Yan-Jing Wu, Menq-Chyun Chen, Wen-Shyang Hwang and Ming-Hua Cheng
Electronics 2024, 13(18), 3694; https://doi.org/10.3390/electronics13183694 - 18 Sep 2024
Cited by 1 | Viewed by 1787
Abstract
Software-defined networking (SDN) is an emerging networking technology with a central point, called the controller, on the control plane. This controller communicates with the application and data planes. In fifth-generation (5G) mobile wireless networks and beyond, specific levels of service quality are defined for different traffic types. Ultra-reliable low-latency communication (URLLC) is one of the key services in 5G. This paper presents a fuzzy logic (FL)-based dynamic routing (FLDR) mechanism with congestion avoidance for URLLC on SDN-based 5G networks. By periodically monitoring the network status and making forwarding decisions on the basis of fuzzy inference rules, the FLDR mechanism can not only reroute in real time but also cope with network status uncertainty owing to FL’s fault tolerance capabilities. Three input parameters (normalized throughput, packet delay, and link utilization) were employed as crisp inputs to the FL control system because they had a more accurate correlation with the network performance measures we studied. The crisp output of the FL control system, i.e., path weight, and a predefined threshold of packet loss ratio on a path were applied to make routing decisions. We evaluated the performance of the proposed FLDR mechanism on the Mininet simulator by installing three additional modules (topology discovery, monitoring, and FL-based rerouting) on the traditional control plane of SDN. The superiority of the proposed FLDR over other existing FL-based routing schemes was demonstrated using three performance measures (system throughput, packet loss rate, and packet delay) versus traffic load in the system.
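
The flavor of such an FL control system can be sketched in plain Python: triangular membership functions over the three crisp inputs, two illustrative rules, and Sugeno-style weighted-average defuzzification into a path weight. The membership shapes and rules below are invented for illustration, not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def path_weight(throughput, delay, utilization):
    """Toy fuzzy inference: two rules, weighted-average defuzzification.
    Inputs normalized to [0, 1]; a higher weight marks a more attractive path."""
    good_tp  = tri(throughput, 0.4, 1.0, 1.6)   # saturates near 1.0
    low_del  = tri(delay, -0.6, 0.0, 0.6)       # peaks at zero delay
    low_util = tri(utilization, -0.6, 0.0, 0.7)
    # Rule 1: good throughput AND low delay AND low utilization -> weight 1.0
    r1 = min(good_tp, low_del, low_util)
    # Rule 2: poor conditions on any input -> weight 0.1
    r2 = max(1 - good_tp, 1 - low_del, 1 - low_util)
    return (r1 * 1.0 + r2 * 0.1) / (r1 + r2 + 1e-9)

print(path_weight(0.9, 0.1, 0.3))  # healthy path -> higher weight
print(path_weight(0.2, 0.8, 0.9))  # congested path -> weight near 0.1
```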

19 pages, 380 KB  
Article
Minimizing the Density of Switch–Controller Latencies over Total Latency for Software-Defined Networks
by Andres Viveros, Pablo Adasme, Ali Dehghan Firoozabadi and Enrique San Juan
Algorithms 2024, 17(9), 393; https://doi.org/10.3390/a17090393 - 5 Sep 2024
Cited by 1 | Viewed by 1215
Abstract
This study examines the problem of minimizing the amount and distribution of time delays or latencies experienced by data as they travel from one point to another within a software-defined network (SDN). For this purpose, a model is proposed that seeks to represent the minimization of the distances between network switches in proportion to the total nodes in a network. The highlights of this study are two mixed-integer quadratic models derived from an initial fractional formulation. The first is obtained by transforming the objective function of the original fractional model into equivalent constraints. The second is obtained by splitting each term of the fraction with an additional variable. Both models capture the relationship between switches and controllers with quadratic terms. For this reason, an algorithm is proposed that can solve these problems in a shorter CPU time than the proposed models. In this research, we used real benchmarks and randomly generated networks, which were solved by all the proposed models. In addition, a few larger random networks were considered to better evaluate the performance of the proposed algorithm. All these instances are evaluated for different density scenarios. More precisely, we impose a constraint on the number of controllers for each network. All tests were performed using our models and the computational power of the Gurobi solver to find the optimal solutions for most of the instances. To the best of our knowledge, this work represents a novel mathematical representation of the latency density management problem in an SDN to measure the efficiency of the network. A detailed analysis of the test results shows that the effectiveness of the proposed models is closely related to the size of the studied networks. Furthermore, the second model outperforms the first in terms of CPU times, the optimal solutions obtained, and the reduced MIP gaps obtained using the solver. These findings provide a deep understanding of how the models operate and how the optimization dynamics contribute to improving the efficiency and performance of SDNs.
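
For orientation, a heavily simplified controller-placement model in gurobipy: assign each switch to one of at most K open controllers, minimizing total switch-controller latency. This linear sketch omits the paper's fractional objective and quadratic reformulations, and the latency matrix is made up.

```python
import gurobipy as gp
from gurobipy import GRB

# Toy latency matrix d[s][c]: switch-to-candidate-controller latencies (ms).
d = [[2, 7, 9], [4, 3, 8], [9, 4, 2], [6, 8, 3]]
S, C, K = range(len(d)), range(len(d[0])), 2  # open at most K controllers

m = gp.Model("controller_placement")
x = m.addVars(S, C, vtype=GRB.BINARY)  # x[s,c]=1: switch s assigned to c
y = m.addVars(C, vtype=GRB.BINARY)     # y[c]=1: controller c is opened

m.addConstrs(x.sum(s, "*") == 1 for s in S)          # each switch assigned once
m.addConstrs(x[s, c] <= y[c] for s in S for c in C)  # only to open controllers
m.addConstr(y.sum() <= K)                            # controller-count limit
m.setObjective(gp.quicksum(d[s][c] * x[s, c] for s in S for c in C), GRB.MINIMIZE)
m.optimize()
print([(s, c) for s in S for c in C if x[s, c].X > 0.5])
```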

19 pages, 1186 KB  
Article
PrismParser: A Framework for Implementing Efficient P4-Programmable Packet Parsers on FPGA
by Parisa Mashreghi-Moghadam, Tarek Ould-Bachir and Yvon Savaria
Future Internet 2024, 16(9), 307; https://doi.org/10.3390/fi16090307 - 27 Aug 2024
Viewed by 1518
Abstract
The increasing complexity of modern networks and their evolving needs demand flexible, high-performance packet processing solutions. The P4 language excels in specifying packet processing in software-defined networks (SDNs). Field-programmable gate arrays (FPGAs) are ideal for P4-based packet parsers due to their reconfigurability and ability to handle data transmitted at high speed. This paper introduces three FPGA-based P4-programmable packet parsing architectural designs, called base, overlay, and pipeline, that translate P4 specifications into adaptable hardware implementations, each optimized for a different packet parsing performance target. As modern network infrastructures evolve, the need for multi-tenant environments becomes increasingly critical. Multi-tenancy allows multiple independent users or organizations to share the same physical network resources while maintaining isolation and customized configurations. The rise of 5G and cloud computing has accelerated the demand for network slicing and virtualization technologies, enabling efficient resource allocation and management for multiple tenants. By leveraging P4-programmable packet parsers on FPGAs, our framework addresses these challenges by providing flexible and scalable solutions for multi-tenant network environments. The base parser offers a simple design for essential packet parsing, using minimal resources for high-speed processing. The overlay parser extends the base design for parallel processing, supporting various bus sizes and throughputs. The pipeline parser boosts throughput by segmenting parsing into multiple stages. The efficiency of the proposed approaches is evaluated through detailed resource consumption metrics measured on an Alveo U280 board, demonstrating throughputs of 15.2 Gb/s for the base design, 15.2 Gb/s to 64.42 Gb/s for the overlay design, and up to 282 Gb/s for the pipelined design. These results demonstrate a range of high performance across varying throughput requirements. The proposed approach yields streaming packet parsers directly from P4 programs while ensuring low latency and high throughput, supporting parsing graphs with up to seven transitioning nodes and four connections between nodes. The functionality of the parsers was tested on enterprise networks, a firewall, and a 5G Access Gateway Function graph.
(This article belongs to the Special Issue Convergence of Edge Computing and Next Generation Networking)
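
A P4 parser is a state machine whose states extract headers and whose transitions branch on header fields. The following software analogue in Python (not P4, and unrelated to the paper's FPGA designs) walks an Ethernet -> IPv4 -> TCP/UDP parse graph:

```python
import struct

def parse(packet: bytes) -> dict:
    """Software analogue of a P4 parse graph: each state extracts one header
    and transitions on a header field until reaching 'accept'."""
    headers, state, off = {}, "ethernet", 0
    while state != "accept":
        if state == "ethernet":
            ethertype = struct.unpack_from("!H", packet, off + 12)[0]
            headers["ethernet"] = {"ethertype": ethertype}
            off += 14
            state = "ipv4" if ethertype == 0x0800 else "accept"
        elif state == "ipv4":
            ver_ihl, proto = packet[off], packet[off + 9]
            headers["ipv4"] = {"ihl": ver_ihl & 0x0F, "proto": proto}
            off += (ver_ihl & 0x0F) * 4
            state = {6: "tcp", 17: "udp"}.get(proto, "accept")
        elif state in ("tcp", "udp"):
            sport, dport = struct.unpack_from("!HH", packet, off)
            headers[state] = {"sport": sport, "dport": dport}
            state = "accept"
    return headers

# Minimal Ethernet+IPv4+UDP frame (addresses zeroed for brevity).
frame = (bytes(12) + b"\x08\x00"                       # Ethernet, IPv4 ethertype
         + bytes([0x45]) + bytes(8) + bytes([17]) + bytes(10)  # IPv4, proto UDP
         + struct.pack("!HH", 5353, 53))               # UDP ports
print(parse(frame))
```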

51 pages, 3714 KB  
Review
Network Security Challenges and Countermeasures for Software-Defined Smart Grids: A Survey
by Dennis Agnew, Sharon Boamah, Arturo Bretas and Janise McNair
Smart Cities 2024, 7(4), 2131-2181; https://doi.org/10.3390/smartcities7040085 - 2 Aug 2024
Cited by 8 | Viewed by 4953
Abstract
The rise of grid modernization has been prompted by the escalating demand for power, the deteriorating state of infrastructure, and the growing concern regarding the reliability of electric utilities. The smart grid encompasses recent advancements in electronics, technology, telecommunications, and computer capabilities. Smart grid telecommunication frameworks provide bidirectional communication to facilitate grid operations. Software-defined networking (SDN) is a proposed approach for monitoring and regulating telecommunication networks, which allows for enhanced visibility, control, and security in smart grid systems. Nevertheless, the integration of telecommunications infrastructure exposes smart grid networks to potential cyberattacks. Attackers may exploit unauthorized access to intercept communications, introduce fabricated data into system measurements, overwhelm communication channels with false data packets, or attack centralized controllers to disable network control. An ongoing, thorough examination of cyber attacks and protection strategies for smart grid networks is essential due to the ever-changing nature of these threats. Previous surveys on smart grid security lack modern methodologies and, to the best of our knowledge, most, if not all, focus on only one sort of attack or protection. This survey examines the most recent security techniques, simultaneous multi-pronged cyber attacks, and defense utilities in order to address the challenges of future SDN smart grid research. The objective is to identify future research requirements, describe the existing security challenges, and highlight emerging threats and their potential impact on the deployment of the software-defined smart grid (SD-SG).

22 pages, 2721 KB  
Article
Federated Learning-Based Security Attack Detection for Multi-Controller Software-Defined Networks
by Abrar Alkhamisi, Iyad Katib and Seyed M. Buhari
Algorithms 2024, 17(7), 290; https://doi.org/10.3390/a17070290 - 2 Jul 2024
Cited by 7 | Viewed by 2409
Abstract
Multi-controller Software-Defined Networking (MC-SDN) is a promising architecture for managing complex, expansive, and evolving large-scale modern network environments. Despite the rich operational flexibility of MC-SDN, it is imperative to protect the network deployment against potential vulnerabilities that lead to misuse and malicious activities on data planes. Security holes in MC-SDN significantly impact network survivability, leaving the data plane vulnerable to potential security threats and unintended consequences. Accordingly, this work designs a Federated learning-based Security (FedSec) strategy that detects attacks on the MC-SDN. FedSec ensures packet routing services among the nodes by maintaining a flow table frequently updated according to the global model knowledge. By executing the FedSec algorithm only on the network-centric nodes selected based on importance measurements, FedSec reduces system complexity and enhances attack detection and classification accuracy. Finally, the experimental results illustrate the significance of the proposed FedSec strategy regarding various metrics.
(This article belongs to the Special Issue Supervised and Unsupervised Classification Algorithms (2nd Edition))
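
The aggregation step of such a federated scheme can be sketched as a FedAvg-style weighted average of per-node model weights. Weighting nodes by an importance score is an assumption meant to echo the abstract's importance measurements, not the FedSec algorithm itself.

```python
import numpy as np

def federated_average(client_weights, importance):
    """FedAvg-style aggregation: combine per-node model weights into a
    global model, weighting each node by a normalized importance score."""
    scores = np.asarray(importance, dtype=float)
    scores /= scores.sum()
    return [
        sum(s * layers[i] for s, layers in zip(scores, client_weights))
        for i in range(len(client_weights[0]))
    ]

# Three nodes, each holding a tiny two-layer model (made-up shapes).
rng = np.random.default_rng(0)
clients = [[rng.normal(size=(4, 2)), rng.normal(size=2)] for _ in range(3)]
global_model = federated_average(clients, importance=[0.5, 0.3, 0.2])
print([w.shape for w in global_model])  # [(4, 2), (2,)]
```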

36 pages, 3662 KB  
Article
Enhancing Network Slicing Security: Machine Learning, Software-Defined Networking, and Network Functions Virtualization-Driven Strategies
by José Cunha, Pedro Ferreira, Eva M. Castro, Paula Cristina Oliveira, Maria João Nicolau, Iván Núñez, Xosé Ramon Sousa and Carlos Serôdio
Future Internet 2024, 16(7), 226; https://doi.org/10.3390/fi16070226 - 27 Jun 2024
Cited by 28 | Viewed by 8278
Abstract
The rapid development of 5G networks and the anticipation of 6G technologies have ushered in an era of highly customizable network environments facilitated by the innovative concept of network slicing. This technology allows the creation of multiple virtual networks on the same physical infrastructure, each optimized for specific service requirements. Despite its numerous benefits, network slicing introduces significant security vulnerabilities that must be addressed to prevent exploitation by increasingly sophisticated cyber threats. This review explores the application of cutting-edge technologies—Artificial Intelligence (AI), specifically Machine Learning (ML), Software-Defined Networking (SDN), and Network Functions Virtualization (NFV)—in crafting advanced security solutions tailored for network slicing. AI’s predictive threat detection and automated response capabilities are analysed, highlighting its role in maintaining service integrity and resilience. Meanwhile, SDN and NFV are scrutinized for their ability to enforce flexible security policies and manage network functionalities dynamically, thereby enhancing the adaptability of security measures to meet evolving network demands. Thoroughly examining the current literature and industry practices, this paper identifies critical research gaps in security frameworks and proposes innovative solutions. We advocate for a holistic security strategy integrating ML, SDN, and NFV to enhance data confidentiality, integrity, and availability across network slices. The paper concludes with future research directions to develop robust, scalable, and efficient security frameworks capable of supporting the safe deployment of network slicing in next-generation networks.
(This article belongs to the Special Issue Privacy and Security in Computing Continuum and Data-Driven Workflows)

14 pages, 800 KB  
Article
Open Source Software-Defined Networking Controllers—Operational and Security Issues
by Aleksandra Mardaus, Edyta Biernacka, Robert Wójcik and Jerzy Domżał
Electronics 2024, 13(12), 2329; https://doi.org/10.3390/electronics13122329 - 15 Jun 2024
Viewed by 2289
Abstract
The Software-Defined Networking concept plays an important role in network management. The central controller, which is the main element of SDN, allows the provision of traffic engineering and security solutions in single- and multiple-layer networks based on optical transmission. In this work, we compare selected open-source implementations of SDN controllers. Throughput and latency measurements were analyzed using the CBench program. Simulations of a link failure and a controller failure were conducted using the API provided by the Mininet network simulator. To detect security vulnerabilities, a dedicated program, called sdnpwn, was used. This work provides an overview of the selected controllers, indicating their strengths and weaknesses. Moreover, some implementation suggestions and recommendations are presented.
(This article belongs to the Special Issue New Insight into Network Virtualization and Management)
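
The link-failure experiment can be reproduced in outline with Mininet's Python API: build a small topology, fail a link, and compare connectivity before and after. A sketch, assuming a controller listening locally on port 6653:

```python
from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import LinearTopo

# Minimal failure experiment: bring a link down, let the controller
# reconverge, and measure connectivity before and after.
net = Mininet(topo=LinearTopo(k=3),  # 3 switches in a line, one host each
              controller=lambda name: RemoteController(
                  name, ip="127.0.0.1", port=6653))  # assumed controller address
net.start()
print("baseline loss: %s%%" % net.pingAll())
net.configLinkStatus("s1", "s2", "down")   # simulate the link failure
print("after failure: %s%%" % net.pingAll())
net.configLinkStatus("s1", "s2", "up")
net.stop()
```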

14 pages, 1504 KB  
Article
Feature-Selection-Based DDoS Attack Detection Using AI Algorithms
by Muhammad Saibtain Raza, Mohammad Nowsin Amin Sheikh, I-Shyan Hwang and Mohammad Syuhaimi Ab-Rahman
Telecom 2024, 5(2), 333-346; https://doi.org/10.3390/telecom5020017 - 17 Apr 2024
Cited by 12 | Viewed by 6063
Abstract
SDN has the ability to transform network design by providing increased versatility and effective regulation. Its programmable centralized controller gives network administration employees more authority, allowing for more seamless supervision. However, centralization makes it vulnerable to a variety of attack vectors, with distributed denial of service (DDoS) attacks posing a serious concern. Feature-selection-based Machine Learning (ML) techniques are more effective than traditional signature-based Intrusion Detection Systems (IDSs) at identifying new threats in the context of defending against DDoS attacks. In this study, NGBoost is compared with four additional ML algorithms: convolutional neural network (CNN), Stochastic Gradient Descent (SGD), Decision Tree, and Random Forest, in order to assess the effectiveness of DDoS detection on the CICDDoS2019 dataset. It focuses on important measures such as F1 score, recall, accuracy, and precision. We examine NetBIOS, a layer-7 attack, and SYN, a layer-4 attack. Our investigation shows that Natural Gradient Boosting and Convolutional Neural Networks, in particular, show promise for tabular data categorization. In conclusion, we discuss specific study results on defending against DDoS attacks. These experimental findings offer a framework for making decisions.
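
A minimal two-stage pipeline in this spirit, assuming the ngboost package: Random Forest importances select features, then a Natural Gradient Boosting classifier is trained on the reduced set. The data below is synthetic; the real study uses CICDDoS2019 flows.

```python
import numpy as np
from ngboost import NGBClassifier               # assumes the ngboost package
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for CICDDoS2019 flow features (the real dataset has
# ~80 columns; shapes and labels here are illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 20))
y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int)   # pretend attack label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Stage 1: keep only the features a Random Forest considers informative.
selector = SelectFromModel(RandomForestClassifier(n_estimators=100, random_state=1))
X_tr_sel = selector.fit_transform(X_tr, y_tr)
X_te_sel = selector.transform(X_te)

# Stage 2: Natural Gradient Boosting on the reduced feature set.
ngb = NGBClassifier(verbose=False).fit(X_tr_sel, y_tr)
print("F1:", f1_score(y_te, ngb.predict(X_te_sel)))
```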

18 pages, 3370 KB  
Article
Multi-Stage Learning Framework Using Convolutional Neural Network and Decision Tree-Based Classification for Detection of DDoS Pandemic Attacks in SDN-Based SCADA Systems
by Onur Polat, Muammer Türkoğlu, Hüseyin Polat, Saadin Oyucu, Hüseyin Üzen, Fahri Yardımcı and Ahmet Aksöz
Sensors 2024, 24(3), 1040; https://doi.org/10.3390/s24031040 - 5 Feb 2024
Cited by 15 | Viewed by 3358
Abstract
Supervisory Control and Data Acquisition (SCADA) systems, which play a critical role in monitoring, managing, and controlling industrial processes, face flexibility, scalability, and management difficulties arising from traditional network structures. Software-defined networking (SDN) offers a new opportunity to overcome the challenges traditional SCADA networks face, based on the concept of separating the control and data planes. Although integrating the SDN architecture into SCADA systems offers many advantages, it cannot by itself address security concerns against cyber-attacks such as distributed denial of service (DDoS). The centralized management and programmability features of SDN lead attackers to carry out attacks that specifically target the SDN controller and data plane. If DDoS attacks against the SDN-based SCADA network are not detected and precautions are not taken, they can cause chaos and have severe consequences. By detecting a possible DDoS attack at an early stage, security measures that can reduce the impact of the attack can be taken immediately, and the likelihood of being a direct victim of the attack decreases. This study proposes a multi-stage learning model using a 1-dimensional convolutional neural network (1D-CNN) and decision tree-based classification to detect DDoS attacks in SDN-based SCADA systems effectively. A new dataset containing various attack scenarios on a specific experimental network topology was created for the training and testing phases of this model. According to the experimental results, the proposed model achieved a 97.8% accuracy rate in DDoS-attack detection. The proposed multi-stage learning model shows that high-performance results can be achieved in detecting DDoS attacks against SDN-based SCADA systems.
(This article belongs to the Special Issue Intelligent Solutions for Cybersecurity)
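
The multi-stage idea (deep features from a 1D-CNN, final classification by a decision tree) can be sketched as below with Keras and scikit-learn; the window shape, synthetic data, and tiny architecture are placeholders, not the paper's model.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic flow windows: 500 samples, 32 time steps, 8 features each.
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 32, 8)).astype("float32")
y = (X[:, :, 0].mean(axis=1) > 0).astype(int)   # pretend DDoS label

# Stage 1: a small 1D-CNN trained end-to-end, then reused as a feature extractor.
inputs = keras.Input(shape=(32, 8))
h = layers.Conv1D(16, 3, activation="relu")(inputs)
h = layers.MaxPooling1D(2)(h)
h = layers.Conv1D(32, 3, activation="relu")(h)
features = layers.GlobalAveragePooling1D(name="features")(h)
outputs = layers.Dense(1, activation="sigmoid")(features)
cnn = keras.Model(inputs, outputs)
cnn.compile(optimizer="adam", loss="binary_crossentropy")
cnn.fit(X, y, epochs=3, batch_size=32, verbose=0)

# Stage 2: a decision tree classifies the CNN's learned representation.
extractor = keras.Model(inputs, cnn.get_layer("features").output)
tree = DecisionTreeClassifier(max_depth=5).fit(extractor.predict(X, verbose=0), y)
print("train acc:", tree.score(extractor.predict(X, verbose=0), y))
```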
