Search Results (58)

Search Parameters:
Keywords = massive Machine-Type Communication (mMTC)

36 pages, 738 KB  
Article
Activity Detection and Channel Estimation Based on Correlated Hybrid Message Passing for Grant-Free Massive Random Access
by Xiaofeng Liu, Xinrui Gong and Xiao Fu
Entropy 2025, 27(11), 1111; https://doi.org/10.3390/e27111111 - 28 Oct 2025
Viewed by 281
Abstract
Massive machine-type communications (mMTC) in future 6G networks will involve a vast number of devices with sporadic traffic. Grant-free access has emerged as an effective strategy to reduce the access latency and processing overhead by allowing devices to transmit without prior permission, making accurate active user detection and channel estimation (AUDCE) crucial. In this paper, we investigate the joint AUDCE problem in wideband massive access systems. We develop an innovative channel prior model that captures the dual correlation structure of the channel using three state variables: active indication, channel supports, and channel values. By integrating Markov chains with coupled Gaussian distributions, the model effectively describes both the structural and numerical dependencies within the channel. We propose the correlated hybrid message passing (CHMP) algorithm based on Bethe free energy (BFE) minimization, which adaptively updates model parameters without requiring prior knowledge of user sparsity or channel priors. Simulation results show that the CHMP algorithm accurately detects active users and achieves precise channel estimation.
(This article belongs to the Topic Advances in Sixth Generation and Beyond (6G&B))
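The CHMP algorithm itself is too involved for a short listing, but the joint problem it targets is easy to state in code. Below is a minimal, hypothetical Python/NumPy baseline, correlation-based activity detection followed by least-squares channel estimation on the detected support, not the authors' method; all dimensions, the preamble model, and the detection threshold are illustrative assumptions.

```python
import numpy as np

# Toy grant-free random access setup (dimensions chosen arbitrarily):
# N potential users, K active, preambles of length L, M BS antennas.
rng = np.random.default_rng(0)
N, K, L, M, snr_db = 100, 5, 40, 8, 15

A = rng.normal(size=(L, N)) / np.sqrt(L)          # non-orthogonal preamble matrix
active = rng.choice(N, size=K, replace=False)     # ground-truth active set
H = np.zeros((N, M), dtype=complex)
H[active] = (rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M))) / np.sqrt(2)

Y = A @ H                                         # received signal, L x M
Y += 10 ** (-snr_db / 20) * (
    rng.normal(size=Y.shape) + 1j * rng.normal(size=Y.shape)
) / np.sqrt(2)

# Detection: correlate each preamble with the received signal and threshold
# the per-user energy (a crude stand-in for message-passing inference).
energy = np.linalg.norm(A.T @ Y, axis=1) ** 2 / M
detected = np.where(energy > 3 * np.median(energy))[0]

# Estimation: least squares restricted to the detected support.
H_hat = np.linalg.pinv(A[:, detected]) @ Y

print("true active:", sorted(active))
print("detected   :", sorted(detected))
```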

16 pages, 1350 KB  
Article
The Synergistic Impact of 5G on Cloud-to-Edge Computing and the Evolution of Digital Applications
by Saleh M. Altowaijri and Mohamed Ayari
Mathematics 2025, 13(16), 2634; https://doi.org/10.3390/math13162634 - 16 Aug 2025
Cited by 1 | Viewed by 2956
Abstract
The integration of 5G technology with cloud and edge computing is redefining the digital landscape by enabling ultra-fast connectivity, low-latency communication, and scalable solutions across diverse application domains. This paper investigates the synergistic impact of 5G on cloud-to-edge architectures, emphasizing its transformative role in revolutionizing sectors such as healthcare, smart cities, industrial automation, and autonomous systems. Key advancements in 5G—including Enhanced Mobile Broadband (eMBB), Ultra-Reliable Low-Latency Communication (URLLC), and Massive Machine-Type Communications (mMTC)—are examined for their role in enabling real-time data processing, edge intelligence, and IoT scalability. In addition to conceptual analysis, the paper presents simulation-based evaluations comparing 5G cloud-to-edge systems with traditional 4G cloud models. Quantitative results demonstrate significant improvements in latency, energy efficiency, reliability, and AI prediction accuracy. The study also explores challenges in infrastructure deployment, cybersecurity, and latency management while highlighting the growing opportunities for innovation in AI-driven automation and immersive consumer technologies. Future research directions are outlined, focusing on energy-efficient designs, advanced security mechanisms, and equitable access to 5G infrastructure. Overall, this study offers comprehensive insights and performance benchmarks that will serve as a valuable resource for researchers and practitioners working to advance next-generation digital ecosystems.
(This article belongs to the Special Issue Innovations in Cloud Computing and Machine Learning Applications)

15 pages, 1529 KB  
Article
Peak Age of Information Optimization in Cell-Free Massive Random Access Networks
by Zhiru Zhao, Yuankang Huang and Wen Zhan
Electronics 2025, 14(13), 2714; https://doi.org/10.3390/electronics14132714 - 4 Jul 2025
Viewed by 561
Abstract
With the vigorous development of Internet of Things technologies, Cell-Free Radio Access Network (CF-RAN), leveraging its distributed coverage and single/multi-antenna Access Point (AP) coordination advantages, has become a key technology for supporting massive Machine-Type Communication (mMTC). However, under the grant-free random access mechanism, this network architecture faces the problem of information freshness degradation due to channel congestion. To address this issue, a joint decoding model based on logical grouping architecture is introduced to analyze the correlation between the successful packet transmission probability and the Peak Age of Information (PAoI) in both single-AP and multi-AP scenarios. On this basis, a global Particle Swarm Optimization (PSO) algorithm is designed to dynamically adjust the channel access probability to minimize the average PAoI across the network. To reduce signaling overhead, a PSO algorithm based on local topology information is further proposed to achieve collaborative optimization among neighboring APs. Simulation results demonstrate that the global PSO algorithm can achieve performance closely approximating the optimum, while the local PSO algorithm maintains similar performance without the need for global information. It is especially suitable for large-scale access scenarios with wide area coverage, providing an efficient solution for optimizing information freshness in CF-RAN.
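As a rough illustration of the optimization step, the sketch below runs a global PSO over a single access probability in a simplified single-AP, slotted-ALOHA-style model, using the mean inter-success interval as a stand-in for average PAoI. The traffic model and the PAoI proxy are assumptions, not the paper's analytical expressions; the known optimum p = 1/n for this toy model gives a sanity check.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # devices contending on one AP (illustrative)

def avg_paoi(p):
    # Slotted-ALOHA-style per-slot success probability; PAoI is proxied by
    # the mean inter-success interval 1/q plus one slot (a simplification).
    q = n * p * (1 - p) ** (n - 1)
    return np.inf if q <= 0 else 1.0 / q + 1.0

# Minimal global PSO over the scalar access probability p in (0, 1).
particles = rng.uniform(0.001, 0.999, size=20)
vel = np.zeros_like(particles)
pbest = particles.copy()
pbest_val = np.array([avg_paoi(p) for p in particles])
gbest = pbest[pbest_val.argmin()]

for _ in range(100):
    r1, r2 = rng.random(20), rng.random(20)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - particles) + 1.5 * r2 * (gbest - particles)
    particles = np.clip(particles + vel, 0.001, 0.999)
    vals = np.array([avg_paoi(p) for p in particles])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = particles[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()]

print(f"PSO access probability ~ {gbest:.4f}  (analytical optimum 1/n = {1/n:.4f})")
```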

18 pages, 1372 KB  
Article
Resource Allocation in 5G Cellular IoT Systems with Early Transmissions at the Random Access Phase
by Anastasia Daraseliya, Eduard Sopin, Vyacheslav Begishev, Yevgeni Koucheryavy and Konstantin Samouylov
Sensors 2025, 25(7), 2264; https://doi.org/10.3390/s25072264 - 3 Apr 2025
Viewed by 1083
Abstract
While the market for massive machine-type communications (mMTC) is evolving at an unprecedented pace, the standardization bodies, including 3GPP, are lagging behind with the standardization of truly 5G-grade cellular Internet-of-Things (CIoT) systems. As an intermediate solution, an early data transmission mechanism that encapsulates the data into the preambles has recently been proposed for 4G/5G Narrowband IoT (NB-IoT) technology. This mechanism is also expected to become a part of future CIoT systems. The aim of this paper is to propose a model for CIoT systems with and without early transmission functionality and to assess the optimal distribution of resources between the random access and data transmission phases. To this end, the developed model captures both phases explicitly, as well as the different traffic compositions in the downlink and uplink directions. Our numerical results demonstrate that early transmission functionality can drastically decrease the delay of uplink packets, by up to 20–40%, even in the presence of downlink traffic sharing the same set of resources. However, it also affects the optimal share of resources allocated to the random access and data transmission phases. As a result, the optimal performance of 5G mMTC technologies with or without early transmission mode can only be attained if dynamic resource allocation is implemented.
(This article belongs to the Section Internet of Things)
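The resource-split trade-off can be illustrated with a deliberately stylized model, not the paper's analytical one: more units for random access mean fewer preamble collisions, while more units for data drain queues faster. All constants below (preambles per unit, packets per unit, arrival load) are invented for illustration.

```python
import numpy as np

# Stylized trade-off: R resource units per frame are split between the
# random access phase (preambles) and the data transmission phase.
R, n_devices, packets_per_unit = 100, 200, 5.0

def mean_delay(ra_units):
    preambles = 10 * ra_units                          # assumed preambles per RA unit
    p_success = (1 - 1 / preambles) ** (n_devices - 1) # no-collision probability
    ra_delay = 1 / p_success                           # mean RA attempts (geometric)
    data_rate = (R - ra_units) * packets_per_unit
    if data_rate <= n_devices:                         # unstable data queue
        return np.inf
    data_delay = 1 / (data_rate - n_devices)           # M/M/1-style waiting term
    return ra_delay + data_delay

splits = np.arange(1, R)
delays = [mean_delay(s) for s in splits]
best = splits[int(np.argmin(delays))]
print(f"best RA share: {best}/{R} units, mean delay ~ {min(delays):.3f} frames")
```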

24 pages, 8199 KB  
Article
Redefining 6G Network Slicing: AI-Driven Solutions for Future Use Cases
by Robert Botez, Daniel Zinca and Virgil Dobrota
Electronics 2025, 14(2), 368; https://doi.org/10.3390/electronics14020368 - 18 Jan 2025
Cited by 10 | Viewed by 5530
Abstract
The evolution from 5G to 6G networks is driven by the need to meet stringent requirements for ultra-reliable, low-latency, and high-throughput communication. The new services are called Further-Enhanced Mobile Broadband (feMBB), Extremely Reliable and Low-Latency Communications (ERLLCs), Ultra-Massive Machine-Type Communications (umMTCs), Massive Ultra-Reliable Low-Latency Communications (mURLLCs), and Mobile Broadband Reliable Low-Latency Communications (MBRLLCs). Network slicing emerges as a critical enabler in 6G, providing virtualized, end-to-end network segments tailored to diverse application needs. Despite its significance, existing datasets for slice selection are limited to 5G or LTE-A contexts and lack relevance to these enhanced requirements. In this work, we present a novel synthetic dataset tailored to 6G network slicing. By analyzing the emerging service requirements, we generated traffic parameters including latency, packet loss, jitter, and transfer rates. Machine Learning (ML) models such as Random Forest (RF), Decision Tree (DT), XGBoost, Support Vector Machine (SVM), and Feedforward Neural Network (FNN) were trained on this dataset, achieving over 99% accuracy in both slice classification and handover prediction. Our results highlight the potential of this dataset as a valuable tool for developing AI-assisted 6G network slicing mechanisms. While still in its early stages, the dataset lays a foundation for future research. As 6G standardization advances, we aim to refine the dataset and models, ultimately enabling real-time, intelligent slicing solutions for next-generation networks.
(This article belongs to the Special Issue Advances in IoT Security)
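A minimal version of the slice-classification step might look as follows: synthetic per-slice traffic profiles (the feature centers are invented, not taken from the paper's dataset) and a scikit-learn Random Forest, one of the model families the authors evaluate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
slices = ["feMBB", "ERLLC", "umMTC"]  # subset of the paper's 6G slice types

def sample(slice_id, n):
    # Hypothetical per-slice profiles: latency (ms), jitter (ms),
    # packet loss (%), rate (Mbps). Centers are illustrative only.
    centers = {0: (20, 5, 0.5, 2000), 1: (1, 0.1, 0.001, 50), 2: (50, 10, 1.0, 1)}
    c = np.array(centers[slice_id], dtype=float)
    return rng.normal(c, 0.15 * c + 1e-6, size=(n, 4))

X = np.vstack([sample(i, 500) for i in range(3)])
y = np.repeat(np.arange(3), 500)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(f"slice classification accuracy: {clf.score(Xte, yte):.3f}")
```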

12 pages, 1157 KB  
Article
Multi-Layered Unsupervised Learning Driven by Signal-to-Noise Ratio-Based Relaying for Vehicular Ad Hoc Network-Supported Intelligent Transport System in eHealth Monitoring
by Ali Nauman, Adeel Iqbal, Tahir Khurshaid and Sung Won Kim
Sensors 2024, 24(20), 6548; https://doi.org/10.3390/s24206548 - 11 Oct 2024
Cited by 1 | Viewed by 1995
Abstract
Every year, about 1.19 million people are killed in traffic accidents; hence, the United Nations has set a goal of halving the number of road traffic deaths and injuries by 2030. In line with this objective, technological innovations in telecommunication, particularly those brought about by the rise of 5G networks, have contributed to the development of modern Vehicle-to-Everything (V2X) communication systems. New Radio V2X (NR-V2X), introduced in the latest Third Generation Partnership Project (3GPP) releases, allows user devices to exchange information without relying on roadside infrastructure. This, together with Massive Machine Type Communication (mMTC) and Ultra-Reliable Low Latency Communication (URLLC), has significantly increased the reliability, coverage, and efficiency of vehicular communication networks. The use of artificial intelligence (AI), especially K-means clustering, has been very promising in supporting efficient data exchange in vehicular ad hoc networks (VANETs). K-means is an unsupervised machine learning (ML) technique that groups vehicles located near each other geographically so that they can communicate directly within these clusters while also allowing for inter-cluster communication via cluster heads. This paper proposes a multi-layered VANET-enabled Intelligent Transportation System (ITS) framework powered by unsupervised learning to optimize communication efficiency, scalability, and reliability. By leveraging AI in VANET solutions, the proposed framework aims to address road safety challenges and contribute to global efforts to meet the United Nations' 2030 target. Additionally, the framework's robust communication and data processing capabilities can be extended to eHealth monitoring systems, enabling real-time health data transmission and processing for continuous patient monitoring and timely medical interventions. This paper's contributions include exploring AI-driven approaches for enhanced data interaction, improved safety in VANET-based ITS environments, and potential applications in eHealth monitoring.
(This article belongs to the Special Issue Intelligent Sensors and Control for Vehicle Automation)
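The clustering idea reduces to a few lines: K-means groups vehicles by position, and within each cluster the highest-SNR vehicle is elected cluster head, mirroring the SNR-based relaying described above. Positions, SNR values, the cluster count, and the head-selection rule below are all assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Vehicle positions on a 2 km road segment (metres); SNR values are synthetic.
pos = np.column_stack([rng.uniform(0, 2000, 120), rng.uniform(-10, 10, 120)])
snr_db = rng.uniform(0, 30, 120)

k = 6  # number of clusters; in practice this would be tuned or adapted
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pos)

# Within each geographic cluster, elect the highest-SNR vehicle as cluster head.
for c in range(k):
    members = np.where(labels == c)[0]
    head = members[np.argmax(snr_db[members])]
    print(f"cluster {c}: {len(members):3d} vehicles, head = vehicle {head} "
          f"({snr_db[head]:.1f} dB)")
```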

13 pages, 392 KB  
Article
Grant-Free Random Access Enhanced by Massive MIMO and Non-Orthogonal Preambles
by Hao Jiang, Hongming Chen, Hongming Hu and Jie Ding
Electronics 2024, 13(11), 2179; https://doi.org/10.3390/electronics13112179 - 3 Jun 2024
Viewed by 1606
Abstract
Massive multiple input multiple output (MIMO) enabled grant-free random access (mGFRA) stands out as a promising random access (RA) solution, effectively addressing the need for massive access in massive machine-type communications (mMTCs) while ensuring high spectral efficiency and minimizing signaling overhead. However, the bottleneck of mGFRA is mainly dominated by orthogonal preamble collisions, since the orthogonal preamble pool is small and of fixed size. In this paper, we explore the potential of non-orthogonal preambles to overcome this limitation and enhance the success probability of mGFRA without extending the length of the preamble. Given the RA procedure of mGFRA, we analyze the factors influencing the success rate of mGFRA with non-orthogonal preambles and propose to use two types of sequences, namely Gold sequences and Gaussian distribution sequences, as the preambles for mGFRA. Simulation results demonstrate the effectiveness of these two types of non-orthogonal preambles in improving the success probability of mGFRA. Moreover, the impact of system parameters on the performance of mGFRA with non-orthogonal preambles is examined and discussed.
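The core trade-off, a preamble pool far larger than the preamble length in exchange for non-zero cross-correlation, can be seen directly with Gaussian-distribution preambles (one of the two sequence types studied; Gold sequence generation via LFSRs is omitted here for brevity). Pool size and length below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
L = 64            # preamble length; an orthogonal pool is capped at L sequences

# Gaussian-distribution preambles: the pool can far exceed L, at the cost of
# non-zero cross-correlation (the collision/interference trade-off above).
pool = 512
P = rng.normal(size=(L, pool))
P /= np.linalg.norm(P, axis=0)                   # unit-norm columns

G = np.abs(P.T @ P)                              # pairwise correlations
np.fill_diagonal(G, 0.0)
print(f"pool size {pool} vs. orthogonal limit {L}")
print(f"max |cross-correlation|:  {G.max():.3f}")
print(f"mean |cross-correlation|: {G.mean():.3f}  (0 for an orthogonal pool)")
```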

24 pages, 1104 KB  
Article
A Learning-Based Energy-Efficient Device Grouping Mechanism for Massive Machine-Type Communication in the Context of Beyond 5G Networks
by Rubbens Boisguene, Ibrahim Althamary and Chih-Wei Huang
J. Sens. Actuator Netw. 2024, 13(3), 33; https://doi.org/10.3390/jsan13030033 - 28 May 2024
Cited by 4 | Viewed by 2154
Abstract
With the increasing demand for high data rates, low delay, and extended battery life, managing massive machine-type communication (mMTC) in the beyond 5G (B5G) context is challenging. MMTC devices, which play a role in developing the Internet of Things (IoT) and smart cities, need to transmit short amounts of data periodically within a specific time frame. Although blockchain technology is utilized for secure data storage and transfer, and digital twin technology provides real-time monitoring and management of the devices, issues such as tight delay constraints and network congestion persist. Without a proper data transmission strategy, most devices would fail to transmit in time, thus defeating their relevance and purpose. This work investigates the problem of massive random access channel (RACH) attempts while emphasizing energy efficiency and access latency for mission-critical mMTC devices in B5G networks. Using machine learning techniques, we propose an attention-based reinforcement learning model that orchestrates the device grouping strategy to optimize device placement. The model thus guarantees a higher probability of success for the devices during data transmission access, eventually leading to more efficient energy consumption. Through thorough quantitative simulations, we demonstrate that the proposed learning-based approach significantly outperforms other baseline grouping methods.
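The grouping objective can be made concrete with a toy calculation (a simple baseline, not the paper's attention-based RL model): per-device RACH success probability as a function of group sizes, with mean access attempts as a crude energy proxy. The group count and preamble pool size below are assumed.

```python
import numpy as np

# Grouping objective sketch: N devices are split into G RACH groups, each
# with P preambles. A device succeeds if no other device in its group
# picks the same preamble.
N, G, P = 160, 4, 54

def success_prob(group_sizes):
    # Per-device success probability under uniform random preamble choice.
    sizes = np.asarray(group_sizes, dtype=float)
    per_group = (1 - 1 / P) ** np.maximum(sizes - 1, 0)
    return float(np.sum(sizes * per_group) / sizes.sum())

balanced = np.full(G, N // G)             # what a good grouping policy targets
skewed = np.array([100, 40, 15, 5])       # a poor grouping of the same devices

for name, sizes in [("balanced", balanced), ("skewed", skewed)]:
    p = success_prob(sizes)
    # Mean (geometric) access attempts per device: a crude energy proxy.
    print(f"{name:8s} {sizes.tolist()}: success {p:.3f}, mean attempts {1/p:.2f}")
```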

18 pages, 834 KB  
Article
A Multi-Agent Reinforcement Learning-Based Grant-Free Random Access Protocol for mMTC Massive MIMO Networks
by Felipe Augusto Dutra Bueno, Alessandro Goedtel, Taufik Abrão and José Carlos Marinello
J. Sens. Actuator Netw. 2024, 13(3), 30; https://doi.org/10.3390/jsan13030030 - 30 Apr 2024
Cited by 2 | Viewed by 2717
Abstract
The expected huge number of connected devices in Internet of Things (IoT) applications characterizes the massive machine-type communication (mMTC) scenario, a prominent use case of beyond fifth-generation (B5G) systems. To meet mMTC connectivity requirements, grant-free (GF) random access (RA) protocols are seen as a promising solution due to the small amount of data that MTC devices usually transmit. In this paper, we propose a GF RA protocol based on a multi-agent reinforcement learning approach, applied to aid IoT devices in selecting the least congested RA pilots. The rewards obtained by the devices in collision cases reflect the congestion level of the chosen pilot. To enable the operation of the proposed method in a realistic B5G network scenario, and aiming to reduce signaling overhead and centralized processing, the rewards in our proposed method are computed by the devices themselves, taking advantage of the large number of base station antennas. Numerical results demonstrate the superior performance of the proposed method in terms of latency, network throughput, and per-device throughput compared with other protocols.
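A stripped-down single-cell version of the idea, stateless per-device Q-learning over pilots with congestion-scaled collision penalties, might look like this. The reward scaling and all parameters are assumptions, and the massive-MIMO-based reward computation at the devices is not modeled.

```python
import numpy as np

rng = np.random.default_rng(6)
n_dev, n_pilots, episodes, eps, lr = 30, 10, 3000, 0.1, 0.1

# One stateless Q-table per device over the available RA pilots; tiny random
# initialization breaks the symmetry between otherwise identical agents.
Q = rng.normal(0, 0.01, (n_dev, n_pilots))

for _ in range(episodes):
    explore = rng.random(n_dev) < eps
    choice = np.where(explore, rng.integers(0, n_pilots, n_dev), Q.argmax(axis=1))
    counts = np.bincount(choice, minlength=n_pilots)
    for d in range(n_dev):
        k = counts[choice[d]]
        # +1 on a collision-free pilot; on collision, a penalty growing with
        # the congestion level of the chosen pilot (exact scaling assumed).
        r = 1.0 if k == 1 else -(k - 1) / n_dev
        Q[d, choice[d]] += lr * (r - Q[d, choice[d]])

final = Q.argmax(axis=1)
print("devices per pilot:", np.bincount(final, minlength=n_pilots).tolist())
```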

22 pages, 7068 KB  
Article
Field-Programmable Gate Array-Based Implementation of Zero-Trust Stream Data Encryption for Enabling 6G-Narrowband Internet of Things Massive Device Access
by Wen-Chung Tsai
Sensors 2024, 24(3), 853; https://doi.org/10.3390/s24030853 - 28 Jan 2024
Cited by 4 | Viewed by 2642
Abstract
With the advent of 6G Narrowband IoT (NB-IoT) technology, IoT security faces inevitable challenges due to the application requirements of Massive Machine-Type Communications (mMTCs). In response, a 6G base station (gNB) and User Equipment (UE) require increased capacity to handle a larger number of connections while maintaining reasonable performance during operation. To address this developmental trend and overcome the associated technological hurdles, this paper proposes a hardware-accelerated, software co-designed mechanism to support streaming data transmissions and secure zero-trust inter-endpoint communications. The proposed implementation aims to offload processing effort from the micro-processor and enhance overall system performance through hardware/software co-design in endpoint communications. Experimental results demonstrate that the proposed secure mechanism, based on non-repeating keys and implemented in an FPGA, can save 85.61%, 99.71%, and 95.68% of the micro-processor's processing time in key block generation, non-repeating checks, and data block transfers, respectively.
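The non-repeating-key discipline is straightforward to model in software; the paper performs the equivalent generation and checking in FPGA hardware, so the class and method names below are ours, a sketch of the logic only.

```python
import os

class KeyBlockStream:
    """Software model of non-repeating key-block generation."""

    def __init__(self, block_bytes=16):
        self.block_bytes = block_bytes
        self.used = set()            # a hardware analogue would be a CAM/hash table

    def next_key(self) -> bytes:
        # Regenerate until a never-before-seen block appears (non-repeat check).
        while True:
            k = os.urandom(self.block_bytes)
            if k not in self.used:
                self.used.add(k)
                return k

def encrypt_block(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR of a 128-bit data block with its unique key block.
    return bytes(a ^ b for a, b in zip(data, key))

stream = KeyBlockStream()
for i in range(3):
    data = i.to_bytes(16, "big")              # dummy 128-bit data block
    key = stream.next_key()
    ct = encrypt_block(data, key)
    assert encrypt_block(ct, key) == data     # XOR decryption round-trips
print("3 blocks encrypted with unique, non-repeating keys")
```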

27 pages, 2174 KB  
Article
Market and Sharing Alternatives for the Provision of Massive Machine-Type and Ultra-Reliable Low-Latency Communications Services over a 5G Network
by Edison Moreno-Cardenas and Luis Guijarro
Electronics 2023, 12(24), 4994; https://doi.org/10.3390/electronics12244994 - 13 Dec 2023
Viewed by 1871
Abstract
The objective of this paper is to analyze economic alternatives for the provision of ultra-reliable low-latency communication (URLLC) and massive machine-type communication (mMTC) services over a fifth-generation (5G) network. Two business models, a monopoly and a duopoly, are studied and two 5G network scenarios are analyzed: a 5G network where the network resources are shared between the two services without service priority, and a 5G network with network slicing that allows for URLLC traffic to have a higher priority. Microeconomics is used to model the behavior of users and operators, and game theory is used to model the strategic interaction between users and operators. The results show that a monopoly over a 5G network with network slicing is the most efficient way to provide both URLLC and mMTC services.

20 pages, 1373 KB  
Article
An E2E Network Slicing Framework for Slice Creation and Deployment Using Machine Learning
by Sujitha Venkatapathy, Thiruvenkadam Srinivasan, Han-Gue Jo and In-Ho Ra
Sensors 2023, 23(23), 9608; https://doi.org/10.3390/s23239608 - 4 Dec 2023
Cited by 6 | Viewed by 3611
Abstract
Network slicing shows promise as a means to endow 5G networks with flexible and dynamic features. Network function virtualization (NFV) and software-defined networking (SDN) are the key methods for deploying network slicing, which will enable end-to-end (E2E) isolation services, permitting each slice to be customized depending on service requirements. The goal of this investigation is to construct network slices through a machine learning algorithm and to allocate resources for the newly created slices efficiently using dynamic programming. A substrate network is constructed with a list of key performance indicators (KPIs) like CPU capacity, bandwidth, delay, link capacity, and security level. After that, network slices are produced by employing a multi-layer perceptron (MLP) using the adaptive moment estimation (ADAM) optimization algorithm. For each requested service, the network slices are categorized as massive machine-type communications (mMTC), enhanced mobile broadband (eMBB), or ultra-reliable low-latency communications (uRLLC). After network slicing, resources are provided to the requested services. In order to maximize the total user access rate and resource efficiency, Dijkstra's algorithm, which determines the shortest path between nodes in the substrate network, is adopted for resource allocation. The simulation output shows that the proposed model allocates optimal slices to the requested services with high resource efficiency and reduced total bandwidth utilization.
(This article belongs to the Section Sensor Networks)
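The path-computation step is classical Dijkstra. A self-contained sketch on a toy substrate network follows, with edge weights standing in for whichever KPI combination (delay, bandwidth, security level) the framework scores; the topology and weights here are invented.

```python
import heapq

# Substrate network as an adjacency map; weights are illustrative delays.
graph = {
    "A": [("B", 2), ("C", 5)],
    "B": [("A", 2), ("C", 1), ("D", 4)],
    "C": [("A", 5), ("B", 1), ("D", 1)],
    "D": [("B", 4), ("C", 1)],
}

def dijkstra(src, dst):
    # Standard Dijkstra: returns (cost, path) of the minimum-cost route.
    pq, seen = [(0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = dijkstra("A", "D")
print(f"allocate slice along {' -> '.join(path)} (total delay {cost})")
```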

10 pages, 4979 KB  
Article
Path Loss Characterization in an Outdoor Corridor Environment for IoT-5G in a Smart Campus University at 850 MHz and 3.5 GHz Frequency Bands
by Juan Muñoz, David Mancipe, Herman Fernández, Lorenzo Rubio, Vicent M. Rodrigo Peñarrocha and Juan Reig
Sensors 2023, 23(22), 9237; https://doi.org/10.3390/s23229237 - 17 Nov 2023
Cited by 5 | Viewed by 2810
Abstract
The usage scenarios defined in the ITU-M2150-1 recommendation for IMT-2020 systems, including enhanced Mobile Broadband (eMBB), Ultra-reliable Low-latency Communication (URLLC), and massive Machine Type Communication (mMTC), allow different services to be accessed through the set of Radio Interface Technologies (RITs), of which Long-Term Evolution (LTE) and New Radio (NR) are components. The potential of the low and medium frequency bands allocated by the Federal Communications Commission (FCC) for the fifth generation of mobile communications (5G) is described, and the Internet of Things (IoT) applications that fall under the mMTC use case are framed. In this sense, a propagation channel measurement campaign was carried out at 850 MHz and 3.5 GHz in a covered corridor environment located in an open space within the facilities of the Pedagogical and Technological University of Colombia campus. The measurements were carried out in the time domain using a channel sounder based on a Universal Software Radio Peripheral (USRP) to obtain the received signal power levels over a range of transmitter–receiver separation distances from 2.00 m to 67.5 m. A link budget was then proposed to describe the path loss behavior as a function of these distances and to obtain the parameters of the close-in free space reference distance (CI) and floating intercept (FI) path loss prediction models. These parameters were estimated from the measurements using the Minimum Mean Square Error (MMSE) approach. The estimated path loss exponent (PLE) values for the CI and FI path loss models at 850 MHz and 3.5 GHz are in the range of 2.21 to 2.41. This suggests that, in this type of outdoor corridor scenario, multipath components contribute little constructive interference to the received signal. These results can be used in simulation tools to evaluate path loss behavior and optimize the deployment of device and sensor network infrastructure to enable 5G-IoT connectivity in smart university campus scenarios.
(This article belongs to the Special Issue Internet of Things for Smart City Application)
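The CI model fit reduces to a one-parameter least-squares problem: PL(d) = FSPL(f, d0) + 10 n log10(d/d0), with the PLE n estimated by MMSE. The sketch below fits n on synthetic measurements, since the campaign data are not reproduced here; the true PLE and shadowing standard deviation are assumed values.

```python
import numpy as np

rng = np.random.default_rng(7)
c, f, d0 = 3e8, 3.5e9, 1.0                        # speed of light, carrier, 1 m ref.
fspl_d0 = 20 * np.log10(4 * np.pi * d0 * f / c)   # free-space loss at d0 (dB)

# Synthetic stand-in for the measurements: true PLE 2.3, log-normal
# shadowing with 3 dB standard deviation (both assumed).
d = rng.uniform(2.0, 67.5, 300)
pl_meas = fspl_d0 + 10 * 2.3 * np.log10(d / d0) + rng.normal(0, 3.0, d.size)

# CI model: PL(d) = FSPL(f, d0) + 10 * n * log10(d / d0).
# MMSE estimate of n is a one-parameter least-squares fit.
x = 10 * np.log10(d / d0)
n_hat = np.sum((pl_meas - fspl_d0) * x) / np.sum(x ** 2)
print(f"estimated PLE n = {n_hat:.2f} (paper reports 2.21-2.41 across bands/models)")
```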

44 pages, 1561 KB  
Article
A Comprehensive Study on the Role of Machine Learning in 5G Security: Challenges, Technologies, and Solutions
by Hussam N. Fakhouri, Sadi Alawadi, Feras M. Awaysheh, Imad Bani Hani, Mohannad Alkhalaileh and Faten Hamad
Electronics 2023, 12(22), 4604; https://doi.org/10.3390/electronics12224604 - 10 Nov 2023
Cited by 54 | Viewed by 8510
Abstract
Fifth-generation (5G) mobile networks have already marked their presence globally, revolutionizing entertainment, business, healthcare, and other domains. While this leap forward brings numerous advantages in speed and connectivity, it also poses new challenges for security protocols. Machine learning (ML) and deep learning (DL) have been employed to augment traditional security measures, promising to mitigate risks and vulnerabilities. This paper conducts an exhaustive study to assess ML and DL algorithms' role and effectiveness within the 5G security landscape. Also, it offers a profound dissection of the 5G network's security paradigm, particularly emphasizing the transformative role of ML and DL as enabling security tools. This study starts by examining the unique architecture of 5G and its inherent vulnerabilities, contrasting them with emerging threat vectors. Next, we conduct a detailed analysis of the network's underlying segments, such as network slicing, Massive Machine-Type Communications (mMTC), and edge computing, revealing their associated security challenges. By scrutinizing current security protocols and international regulatory impositions, this paper delineates the existing 5G security landscape. Finally, we outline the capabilities of ML and DL in redefining 5G security. We detail their application in enhancing anomaly detection, fortifying predictive security measures, and strengthening intrusion prevention strategies. This research sheds light on the present-day 5G security challenges and offers a visionary perspective, highlighting the intersection of advanced computational methods and future 5G security.
(This article belongs to the Section Networks)

32 pages, 419 KB  
Article
The 6G Ecosystem as Support for IoE and Private Networks: Vision, Requirements, and Challenges
by Carlos Serôdio, José Cunha, Guillermo Candela, Santiago Rodriguez, Xosé Ramón Sousa and Frederico Branco
Future Internet 2023, 15(11), 348; https://doi.org/10.3390/fi15110348 - 25 Oct 2023
Cited by 57 | Viewed by 6638
Abstract
The emergence of the sixth generation of cellular systems (6G) signals a transformative era and ecosystem for mobile communications, driven by demands from technologies like the internet of everything (IoE), V2X communications, and factory automation. To support this connectivity, mission-critical applications are emerging with challenging network requirements. The primary goals of 6G include providing sophisticated and high-quality services, further-enhanced mobile broadband (feMBB), extremely reliable low-latency communication (ERLLC), long-distance and high-mobility communications (LDHMC), ultra-massive machine-type communications (umMTC), extremely low-power communications (ELPC), holographic communications, and quality of experience (QoE), grounded in incorporating massive broad-bandwidth machine-type (mBBMT), mobile broad-bandwidth and low-latency (MBBLL), and massive low-latency machine-type (mLLMT) communications. In attaining these objectives, 6G faces challenges that demand inventive solutions incorporating AI, softwarization, cloudification, virtualization, and slicing features. Technologies like network function virtualization (NFV), network slicing, and software-defined networking (SDN) play pivotal roles in this integration, which facilitates efficient resource utilization, responsive service provisioning, expanded coverage, enhanced network reliability, increased capacity, densification, heightened availability, safety, security, and reduced energy consumption. The paper presents innovative network infrastructure concepts, such as resource-as-a-service (RaaS) and infrastructure-as-a-service (IaaS), featuring management and service orchestration mechanisms, including nomadic networks, AI-aware networking strategies, and dynamic management of diverse network resources. This paper provides an in-depth survey of the wireless evolution leading to 6G networks, addressing future issues and challenges associated with 6G technology in support of V2X environments, considering challenges in architecture, spectrum, air interface, reliability, availability, density, flexibility, mobility, and security.
(This article belongs to the Special Issue Moving towards 6G Wireless Technologies)