Search Results (67)

Search Parameters:
Keywords = 3rd Generation Partnership Project (3GPP)

17 pages, 8254 KB  
Article
QoS-Aware Downlink Paging Control for UAV-Assisted 5G-Advanced Networks with On-Demand Coverage
by Conghao Li, Haizhi Yu, Weidong Gao, Dengyan Wang, Shouhui Lai, Xu Zhao, Hongzhi Zhang and Gengshuo Liu
Drones 2026, 10(3), 191; https://doi.org/10.3390/drones10030191 - 10 Mar 2026
Viewed by 332
Abstract
To meet the energy-saving requirements of user equipment (UE) operating in Radio Resource Control idle/inactive states (RRC_IDLE/RRC_INACTIVE) in the 3rd-Generation Partnership Project (3GPP) 5G-Advanced (5G-A) networks, the New Radio (NR) downlink paging procedure relies on periodic monitoring and frequent synchronization signal block (SSB) measurements, which wastes energy when no paging arrivals occur. Meanwhile, heterogeneous Quality of Service (QoS) constraints make it difficult for fixed-parameter Idle Discontinuous Reception and Paging Early Indication mechanisms (IDRX/PEI) to balance energy, delay, and reliability. This paper develops a UAV-assisted 5G-A paging control framework that maps services into multiple QoS classes and models QoS violation risk and system energy consumption under unified accounting, including UE monitoring/reception energy and unmanned aerial vehicle (UAV) forwarding energy. We then propose a QoS-aware risk-driven paging strategy: an offline Long Short-Term Memory (LSTM) predictor is trained to estimate the time-to-next-arrival (TTNA) of paging events and produce a bounded urgency/risk signal to initialize class-dependent thresholds, while online triggering and QoS-feedback-based threshold adaptation regulate the empirical violation rate toward target constraints under varying loads, enabling a controllable energy–delay trade-off. A simulation-based evaluation is conducted to compare the proposed method with representative baselines (Enhanced Paging Monitoring (EPM), Split Paging Occasion (SPOP), and Predicted Paging Early Indication (PPEI)) and to examine the impact of SSB overhead and UAV relaying on the energy–delay–reliability trade-offs. Full article
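As background for the IDRX mechanism this abstract builds on, baseline paging timing in cellular systems follows a deterministic rule. The sketch below implements the LTE-era formula from 3GPP TS 36.304 (the NR rule in TS 38.304 adds an offset term); it is standard background, not the paper's proposed strategy, and the parameter values are illustrative.

```python
def paging_frame_and_occasion(ue_id: int, T: int, nB: int):
    """Paging frame (PF) and paging-occasion index (i_s) per the LTE rule
    in 3GPP TS 36.304:  SFN mod T == (T div N) * (UE_ID mod N),
    with N = min(T, nB) and Ns = max(1, nB // T).
    ue_id = IMSI mod 1024; T is the DRX cycle in radio frames.
    """
    N = min(T, nB)
    Ns = max(1, nB // T)
    pf = (T // N) * (ue_id % N)   # first SFN in the cycle satisfying the rule
    i_s = (ue_id // N) % Ns       # selects the PO subframe from a lookup table
    return pf, i_s

# Example: DRX cycle T = 128 frames, nB = T (one paging occasion per PF)
print(paging_frame_and_occasion(ue_id=517, T=128, nB=128))  # → (5, 0)
```

A UE only wakes to monitor its own PF/PO, which is exactly the periodic monitoring whose energy cost the paper attacks.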

18 pages, 1202 KB  
Article
A Data-Driven Distributed Autonomous Architecture for the 6G Network
by Qiuyue Gao, Jinyan Li and Yanxia Xing
Electronics 2026, 15(1), 102; https://doi.org/10.3390/electronics15010102 - 25 Dec 2025
Viewed by 854
Abstract
Although technological innovation, service diversification, and the evolution and defects of current networks all motivate a new design, research on the 6th-generation (6G) network architecture is still lacking. One of the challenges is that the architectural design must take multiple stakeholders into account: customers, operators, and vendors. To address service-oriented and network-oriented design requirements, this article proposes a data-driven distributed autonomous architecture (DDAA) for 6G with a three-layer, four-plane logical hierarchy. The architecture is simplified into four network function units (NFUs), whose interactions are carried over dual-bus interfaces, i.e., the service-based interface (SBI) and the data transmission interface (DTI). In addition, it is user-data-centric and rendered as distributed autonomous domains (ADs) of different scales to better adapt to customized services. Different transition stages from the 5th generation (5G) to 6G are discussed. A network simplification evaluation is further provided by walking through several 3rd Generation Partnership Project (3GPP) signaling procedures, inspiring further research and subsequent standardization of the 6G network architecture. Full article
(This article belongs to the Special Issue 6G and Beyond: Architectures, Challenges, and Opportunities)

22 pages, 2718 KB  
Article
Joint Beam Position Grouping and RO Allocation for LEO Satellite Communication Systems
by Bojun Guo, Yiming Zhu, Yi Zheng, Yafei Wang, Mengyao Cao, Wenjin Wang and Li Chai
Electronics 2025, 14(23), 4731; https://doi.org/10.3390/electronics14234731 - 30 Nov 2025
Viewed by 652
Abstract
International organizations such as the 3rd Generation Partnership Project (3GPP) and the International Telecommunication Union (ITU) regard non-terrestrial networks (NTNs) as an essential component of the sixth-generation (6G) mobile communication technology and have advanced relevant standardization efforts. Low Earth orbit (LEO) satellite communication (SatCom) constitutes a key part of NTNs, and efficient uplink random access (RA) is crucial for establishing initial connections in LEO SatCom systems. However, the long propagation delay and wide coverage of LEO satellites substantially increase access latency and collision probability due to the limited number of beams and their constrained coverage areas. In addition, the highly non-uniform spatial distribution of user equipment (UE) further aggravates access inefficiency. To this end, this paper investigates joint beam position grouping and RA channel (RACH) occasions (ROs) allocation (JBPGRA) for LEO SatCom systems. Specifically, we develop a system model for RA under beam hopping and identify the key factors that influence RA performance. Furthermore, we derive expressions for both the instantaneous signal-to-interference-plus-noise ratio (SINR) and the average SINR under a given non-uniform UE spatial distribution. Building on this analysis, the JBPGRA problem is formulated as an integer linear programming problem that seeks to maximize RA success while conserving RO resources under non-uniform UE distribution. To achieve a practical solution, we propose an efficient JBPGRA algorithm composed of beam position classification, sparse beam position grouping, and RO allocation modules. 
Simulation results demonstrate that, under the same UE density, the proposed JBPGRA scheme achieves over 29% higher access success rate in dense beam positions compared with the uniform baseline adopted in existing SatCom systems, while reducing RO consumption by more than 49% and decreasing the number of beam position groups by over 57%. Full article
(This article belongs to the Special Issue Advances in Satellite/UAV Communications)

24 pages, 4372 KB  
Article
Performance Analysis of Multi-OEM TV White Space Radios in Outdoor Environments
by Mla Vilakazi, Koketso Makaleng, Lwando Ngcama, Mofolo Mofolo and Luzango Mfupe
Appl. Sci. 2025, 15(18), 9977; https://doi.org/10.3390/app15189977 - 12 Sep 2025
Cited by 1 | Viewed by 2061
Abstract
The television white space (TVWS) spectrum presents a promising opportunity to extend wireless broadband access, particularly in rural, underserved, and hard-to-reach communities. To leverage this potential, low-power radio communication equipment must efficiently utilise the TVWS spectrum on a secondary basis while ensuring strict compliance with regulatory requirements to prevent harmful interference to primary services. This paper presents a comparative performance analysis of TVWS radio equipment from three original equipment manufacturers (OEMs). The equipment under test is identified by OEM, as follows: OEM 1 and OEM 2 from South Korea, and OEM 3 from the USA. We evaluated their performance in two real-world field scenarios, namely outdoor short-distance and outdoor long-distance. The evaluation was based on the following key metrics: (i) spectrum utilisation efficiency (SUE), (ii) received signal strength (RSS), (iii) downlink throughput, and (iv) connectivity to the Geo-Location Spectrum Database (GLSD) in compliance with the South African TVWS regulatory framework. The overall preliminary experimental results indicate that in both scenarios, white space devices (WSDs) based on the Institute of Electrical and Electronics Engineers (IEEE) 802.11af Standard demonstrated better performance than those based on the 3rd Generation Partnership Project Long-Term Evolution-Advanced (3GPP LTE-A) Standard in terms of the SUE, downlink throughput, and RSS metrics. All WSDs under test demonstrated sufficient compliance with the regulatory requirement metric. Full article
(This article belongs to the Special Issue Applications of Wireless and Mobile Communications)

17 pages, 2946 KB  
Article
Generalized Frequency Division Multiplexing—Based Direct Mapping—Multiple-Input Multiple-Output Mobile Electroencephalography Communication Technique
by Chin-Feng Lin and Kun-Yu Chen
Appl. Sci. 2025, 15(17), 9451; https://doi.org/10.3390/app15179451 - 28 Aug 2025
Cited by 1 | Viewed by 859
Abstract
Electroencephalography (EEG) communication technology with ultra-low power consumption, high transmission data rates, and low latency plays a significant role in mHealth, telemedicine, and the Internet of Medical Things (IoMT). In this paper, a generalized frequency division multiplexing (GFDM)-based direct mapping (DM) multiple-input multiple-output (MIMO) mobile EEG communication technology (MECT) is proposed for the above-mentioned applications. The (2000, 1000) low-density parity-check (LDPC) code, four-quadrature amplitude modulation (4-QAM), a power assignment mechanism, and the 3rd Generation Partnership Project (3GPP) cluster delay line (CDL) channel model D were integrated into the proposed MECT. The transmission bit error rates (BERs), mean square errors (MSEs), and Pearson correlation coefficients (PCCs) of the original and received EEG signals were evaluated. Simulation results show that, with a signal-to-noise ratio (SNR) of 14.51 dB and a channel estimation error (CEE) of 5%, the BER, MSE, and PCC of the original and received EEG signals were 9.9777 × 10−8, 1.440 × 10−5, and 0.999999998, respectively, whereas, with an SNR of 15.0004 dB and a CEE of 10%, they were 9.9777 × 10−8, 1.4368 × 10−5, and 0.999999997622151, respectively. At a BER of 9.9777 × 10−8, the corresponding power saving (PS) is 40%. As the CEE changes from 0% to 5% and from 5% to 10%, the N0 values of the proposed MECT decrease by approximately 0.0022 and 0.002, respectively. The MECT thus offers excellent EEG signal transmission performance. Full article
(This article belongs to the Special Issue Communication Technology for Smart Mobility Systems)

25 pages, 3374 KB  
Article
A GNSS–Cellular Network Hybridization Strategy for Robust Positioning
by María Jesús Jiménez-Martínez, Mónica Zabala Haro, Ángel Martín Furonés and Ana Anquela Julián
Appl. Sci. 2025, 15(11), 6300; https://doi.org/10.3390/app15116300 - 4 Jun 2025
Viewed by 1782
Abstract
The hybridization of cellular networks and GNSS systems has gained increasing attention, especially in urban canyons and indoor environments where GNSS performance degrades significantly. Hybrid localization is part of the 3rd Generation Partnership Project (3GPP) standard, offering an effective solution when satellite visibility is limited. Additional cellular measurements can enhance the accuracy and reliability of standalone UE positioning. Hybrid methods offer multiple benefits: improved availability, continuity, and integrity; better signal penetration due to proximity; lower power consumption; and, in harsh environments, potentially more accurate positioning than GNSS alone. Moreover, GNSS chipsets in mobile phones or smartwatches are typically power-intensive. This work presents a user-level hybridization method that enables UE to receive both GNSS and 4G/5G data and autonomously determine whether to apply hybrid positioning. The developed algorithms improve the precision and reliability, allowing user-driven decisions based on data quality. The system was tested under static conditions across various scenarios: outdoors, in urban canyons, and indoors. The results show that, while hybridization enhances positioning, the 4G-only solution often performs better in terms of vertical accuracy. Standard deviation metrics help guide the selection of the most precise option in real time. Full article
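The final selection step, choosing the most precise solution from standard-deviation metrics in real time, can be sketched as a simple comparison. The mode names and sigma values below are hypothetical placeholders, not the paper's measurements.

```python
import math

def pick_solution(candidates):
    """Return the positioning mode whose combined 3D standard deviation
    (sigma_E, sigma_N, sigma_U, in metres) is smallest. Mode names and
    sigma values are illustrative, not the paper's data."""
    def combined(sigmas):
        return math.sqrt(sum(s * s for s in sigmas))
    return min(candidates, key=lambda mode: combined(candidates[mode]))

solutions = {
    "gnss-only": (1.8, 2.1, 4.9),
    "hybrid":    (1.2, 1.4, 3.5),
    "4g-only":   (2.5, 2.3, 2.8),  # often best vertically, per the results
}
print(pick_solution(solutions))  # → hybrid
```

A per-axis comparison (e.g. preferring "4g-only" when only vertical accuracy matters) would follow the same pattern with a different key function.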
(This article belongs to the Special Issue Mapping and Localization for Intelligent Vehicles in Urban Canyons)

30 pages, 1552 KB  
Review
3GPP Evolution from 5G to 6G: A 10-Year Retrospective
by Xingqin Lin
Telecom 2025, 6(2), 32; https://doi.org/10.3390/telecom6020032 - 20 May 2025
Cited by 12 | Viewed by 13297
Abstract
The 3rd Generation Partnership Project (3GPP) evolution of mobile communication technologies from 5G to 6G has been a transformative journey spanning a decade, shaped by six releases from Release 15 to Release 20. This article provides a retrospective of this evolution, highlighting the technical advancements, challenges, and milestones that have defined the transition from the foundational 5G era to the emergence of 6G. Starting with Release 15, which marked the birth of 5G and its New Radio (NR) air interface, the journey progressed through Release 16, where 5G was qualified as an International Mobile Telecommunications-2020 (IMT-2020) technology, and Release 17, which expanded 5G into new domains such as non-terrestrial networks. Release 18 ushered in the 5G-Advanced era, incorporating novel technologies like artificial intelligence. Releases 19 and 20 continue this momentum, focusing on commercially driven enhancements while laying the groundwork for the 6G era. This article explores how 3GPP technology evolution has shaped the telecommunications landscape over the past decade, bridging two mobile generations. It concludes with insights into learned lessons, future challenges, and opportunities, offering guidelines on 6G evolution for 2030 and beyond. Full article

17 pages, 642 KB  
Article
A Distributed Trustable Framework for AI-Aided Anomaly Detection
by Nikolaos Nomikos, George Xylouris, Gerasimos Patsourakis, Vasileios Nikolakakis, Anastasios Giannopoulos, Charilaos Mandilaris, Panagiotis Gkonis, Charalabos Skianis and Panagiotis Trakadas
Electronics 2025, 14(3), 410; https://doi.org/10.3390/electronics14030410 - 21 Jan 2025
Cited by 9 | Viewed by 2608
Abstract
The evolution towards sixth-generation (6G) networks requires new architecture enhancements to support the broad device ecosystem, comprising users, machines, autonomous vehicles, and Internet-of-things devices. Moreover, high heterogeneity in the desired quality-of-service (QoS) is expected, as 6G networks will offer extremely low-latency and high-throughput services and error-free communication. This complex environment raises significant challenges in resource management while adhering to security and privacy constraints due to the plethora of data generation endpoints. Considering the advances in AI/ML-aided integration in wireless networks and recent efforts on the network data analytics function (NWDAF) by the 3rd generation partnership project (3GPP), this work presents an AI/ML-aided distributed trustable engine (DTE), collecting data from diverse sources of the 6G infrastructure and deploying ML methods for anomaly detection against diverse threat types. Moreover, we present the DTE architecture and its components, providing data management, AI/ML model training, and classification capabilities for anomaly detection. To promote privacy-aware networking, a federated learning (FL) framework to extend the DTE is discussed. Then, the anomaly detection capabilities of the AI/ML-aided DTE are presented in detail, together with the ML model training process, which considers various ML models. For this purpose, we use two open datasets representing attack scenarios in the core and the edge parts of the network. Experimental results, including an ensemble learning method and different supervised learning alternatives, show that the AI/ML-aided DTE can efficiently train ML models with reduced dimensionality and deploy them in diverse cybersecurity scenarios to improve anomaly detection in 6G networks. Full article
(This article belongs to the Special Issue Recent Advances and Challenges in IoT, Cloud and Edge Coexistence)

13 pages, 6831 KB  
Article
Demonstration of a Hybrid B5G System Integrating VLC and RF-Based Technologies with Access Networks
by Tomás Powell Villena Andrade, Celso Henrique de Souza Lopes, Letícia Carneiro de Souza and Arismar Cerqueira Sodré Junior
Appl. Sci. 2025, 15(2), 955; https://doi.org/10.3390/app15020955 - 19 Jan 2025
Cited by 1 | Viewed by 1771
Abstract
Visible-light communication (VLC) has emerged as a promising technology to provide the very high-throughput wireless communications demanded by beyond-fifth-generation (B5G) applications. However, few works are found in the literature regarding the integration of VLC systems with other wireless communications technologies and with access networks. In this context, and as a proof of concept, we implement and experimentally evaluate a hybrid network architecture based on VLC, radio-over-fiber (RoF), free-space optics (FSO), fiber-wireless (FiWi), and millimeter-waves (mm-waves) for B5G applications. Such optical networks make use of fiber-optic links based on RoF technology as backhauls, whereas their fronthauls may be implemented using either FSO or RoF. Finally, a triple-wireless-access network is ensured by VLC, FiWi, and mm-wave links. The latter use a real 5G new radio (5G NR) signal. The system performance is evaluated in terms of the root mean square error vector magnitude (EVMRMS) parameter in accordance with the 3rd-Generation Partnership Project (3GPP) requirements. The experimental results demonstrate a total maximal theoretical throughput of approximately 1.66 Gbps, aligning with the digital performance requirements set by 3GPP. Full article
(This article belongs to the Special Issue Visible Light Communications (VLC) Networks)

15 pages, 529 KB  
Article
A Throughput Analysis Using a Non-Saturated Markov Chain Model for LTE-LAA and WLAN Coexistence
by Mun-Suk Kim
Mathematics 2025, 13(1), 59; https://doi.org/10.3390/math13010059 - 27 Dec 2024
Cited by 1 | Viewed by 1063
Abstract
To address the severe spectrum shortage in mobile networks, the 3rd Generation Partnership Project (3GPP) standardized Long Term Evolution (LTE)-License Assisted Access (LAA) technology. The LTE-LAA system ensures efficient coexistence with other existing unlicensed systems by incorporating listen-before-talk functionality and conducting random backoff operations similar to those in the IEEE 802.11 distributed coordination function. In this paper, we propose an analytical model to calculate the throughput of each system in a scenario where a single LTE-LAA system shares an unlicensed channel with multiple wireless local area network (WLAN) systems. The LTE-LAA system is utilized for supplementary downlink transmission from the LTE-LAA eNodeB (eNB) to LTE-LAA devices. Our proposed analytical model uses a Markov chain to represent the random backoff operations of the LTE-LAA eNB and WLAN nodes under non-saturated traffic conditions and to calculate the impact of the clear channel assessment (CCA) performed by the LTE-LAA eNB. Through numerical results, we demonstrate how the throughput of both the LTE-LAA and WLAN systems is determined by the contention window size and CCA threshold of the LTE-LAA eNB. Full article
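For readers unfamiliar with such backoff models, the classic saturated-traffic fixed point of Bianchi's DCF model, which non-saturated Markov chains like the paper's generalize, can be solved numerically as below. This is a simplified reference model, not the authors' non-saturated analysis, and the parameter values are illustrative.

```python
def bianchi_fixed_point(n: int, W: int, m: int, iters: int = 2000):
    """Numerically solve Bianchi's saturated DCF fixed point:
        tau = 2(1-2p) / ((1-2p)(W+1) + p*W*(1 - (2p)**m))
        p   = 1 - (1 - tau)**(n-1)
    n: contending nodes, W: minimum contention window, m: backoff stages.
    Damped iteration keeps the oscillating map stable.
    """
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        nxt = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
        )
        tau = 0.5 * (tau + nxt)       # damping
    p = 1.0 - (1.0 - tau) ** (n - 1)  # make the returned pair self-consistent
    return tau, p

tau, p = bianchi_fixed_point(n=10, W=32, m=5)
print(f"per-slot tx prob tau={tau:.4f}, collision prob p={p:.4f}")
```

The paper's model replaces the saturation assumption with non-saturated traffic states and adds the LTE-LAA eNB's CCA behaviour on top of this kind of chain.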

28 pages, 6182 KB  
Article
Toward an Era of Secure 5G Convergence Applications: Formal Security Verification of 3GPP AKMA with TLS 1.3 PSK Option
by Yongho Ko, I Wayan Adi Juliawan Pawana, Taeho Won, Philip Virgil Astillo and Ilsun You
Appl. Sci. 2024, 14(23), 11152; https://doi.org/10.3390/app142311152 - 29 Nov 2024
Cited by 3 | Viewed by 2876
Abstract
The 5th Generation Mobile Communication (5G) plays a significant role in the Fourth Industrial Revolution (4IR), facilitating significant improvements and innovations in various fields. The 3rd Generation Partnership Project (3GPP) is currently standardizing the Authentication and Key Management for Application (AKMA) system for 5G convergence applications (5G cAPPs). Transport Layer Security (TLS) is recommended as the application-specific Ua* protocol between User Equipment (UE) and the Application Function (AF) to securely transmit the AKMA identifiers of the UE as well as guarantee traffic protection. Among TLS protocols, session resumption in TLS 1.2 and the Pre-Shared Key (PSK) modes of TLS 1.3 are particularly desirable for Ua*. Unfortunately, the integration of the PSK options of TLS 1.3, namely the PSK-only, PSK-(EC)DHE, and 0-RTT (0 Round-Trip Time) modes, with AKMA has not yet been thoroughly investigated; hence, security, performance, compatibility, and effectiveness remain uncertain. In response, this paper explores the integration of the TLS 1.3 PSK options with AKMA and investigates the said metrics by conducting formal security verification and emulating exemplary applications. According to the formal verification and experimental results, the PSK-(EC)DHE mode trades efficiency for greater security strength. The 0-RTT mode, on the other hand, demonstrates better efficiency but exhibits drawbacks in forward secrecy and replay protection. The results suggest that the 0-RTT mode has to be improved to ensure seamless integration of the TLS 1.3 PSK options with AKMA. In addition, adjustments to the AKMA architecture are also imperative to enhance the security level. Full article
(This article belongs to the Special Issue Edge-Enabled Big Data Intelligence for 6G and IoT Applications)

32 pages, 2926 KB  
Article
Mitigating Security Vulnerabilities in 6G Networks: A Comprehensive Analysis of the DMRN Protocol Using SVO Logic and ProVerif
by Ilsun You, Jiyoon Kim, I Wayan Adi Juliawan Pawana and Yongho Ko
Appl. Sci. 2024, 14(21), 9726; https://doi.org/10.3390/app14219726 - 24 Oct 2024
Cited by 7 | Viewed by 2781
Abstract
The rapid evolution of mobile and optical communication technologies is driving the transition from 5G to 6G networks. This transition inevitably brings about changes in authentication scenarios, as new security demands emerge that go beyond the capabilities of existing frameworks. Therefore, it is necessary to address these evolving requirements and the associated key challenges: ensuring Perfect Forward Secrecy (PFS) to protect communications even if long-term keys are compromised and integrating Post-Quantum Cryptography (PQC) techniques to defend against the threats posed by quantum computing. These are essential for both radio and optical communications, which are foundational elements of future 6G infrastructures. The DMRN Protocol, introduced in 2022, represents a major advancement by offering both PFS and PQC while maintaining compatibility with existing 3rd Generation Partnership Project (3GPP) standards. Given the looming quantum-era challenges, it is imperative to analyze the protocol’s security architecture through formal verification. Accordingly, we formally analyze the DMRN Protocol using SVO logic and ProVerif to assess its effectiveness in mitigating attack vectors, such as malicious or compromised serving networks (SNs) and home network (HN) masquerading. Our research found that the DMRN Protocol has vulnerabilities in key areas such as mutual authentication and key exchange. In light of these findings, our study provides critical insights into the design of secure and quantum-safe authentication protocols for the transition to 6G networks. Furthermore, by identifying the DMRN Protocol’s vulnerabilities and discussing countermeasures to address them, this study lays the groundwork for the future standardization of secure 6G Authentication and Key Agreement protocols. Full article
(This article belongs to the Special Issue Intelligent Optical Signal Processing in Optical Fiber Communication)

20 pages, 947 KB  
Article
Evaluating the Impact of Pre-Configured Uplink Resources in Narrowband IoT
by Muhammad Tahir Abbas, Karl-Johan Grinnemo, Anna Brunstrom, Pascal Jörke, Johan Eklund, Stefan Alfredsson, Mohammad Rajiullah and Christian Wietfeld
Sensors 2024, 24(17), 5706; https://doi.org/10.3390/s24175706 - 2 Sep 2024
Cited by 2 | Viewed by 2873
Abstract
Deploying Cellular Internet of Things (CIoT) devices in urban and remote areas faces significant energy efficiency challenges. This is especially true for Narrowband IoT (NB-IoT) devices, which are expected to function on a single charge for up to 10 years while transmitting small amounts of data daily. The 3rd Generation Partnership Project (3GPP) has introduced energy-saving mechanisms in Releases 13 to 16, including Early Data Transmission (EDT) and Preconfigured Uplink Resources (PURs). These mechanisms extend battery life and reduce latency by enabling data transmission without an active Radio Resource Control (RRC) connection or Random Access Procedure (RAP). This paper examines these mechanisms using the LENA-NB simulator in the ns-3 environment, which is a comprehensive framework for studying various aspects of NB-IoT. The LENA-NB simulator has been extended with PURs, and our analysis shows that PURs significantly enhance battery life and latency efficiency, particularly in high-density environments. Compared to the default RAP method, PURs reduce energy consumption by more than 2.5 times and increase battery life by 1.6 times. Additionally, PURs achieve latency reductions of 2.5–3.5 times. The improvements with PURs are most notable for packets up to 125 bytes. Our findings highlight PURs’ potential to enable more efficient and effective CIoT deployments across various scenarios. This study represents a detailed analysis of latency and energy consumption in a simulated environment, advancing the understanding of PURs’ benefits. Full article
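The headline numbers translate into battery life through simple energy bookkeeping. The sketch below shows the arithmetic under purely hypothetical per-transmission and sleep-power figures; only the roughly 2.5x RAP-versus-PUR energy ratio is taken from the abstract, the absolute joule values are assumptions.

```python
def battery_life_years(battery_wh: float, tx_per_day: int,
                       energy_per_tx_j: float, sleep_power_w: float) -> float:
    """Rough NB-IoT battery-life estimate: the daily budget is the uplink
    transmission cost plus the deep-sleep floor. All figures here are
    illustrative assumptions, not measurements from the paper."""
    daily_j = tx_per_day * energy_per_tx_j + sleep_power_w * 86_400
    return (battery_wh * 3600.0) / (daily_j * 365.0)

# Hypothetical numbers: a 5 Wh cell, one small report per day, and a
# legacy RAP transmission costing ~2.5x the energy of a PUR uplink.
life_rap = battery_life_years(5.0, 1, 2.5, 2e-5)
life_pur = battery_life_years(5.0, 1, 1.0, 2e-5)
print(f"RAP: {life_rap:.1f} years, PUR: {life_pur:.1f} years")
```

With these assumed figures the per-day sleep floor keeps the battery-life gain (about 1.55x here) below the 2.5x energy ratio, which is consistent with the abstract's 1.6x battery-life figure.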
(This article belongs to the Section Internet of Things)

27 pages, 3597 KB  
Article
A Blockchain-Assisted Security Protocol for Group Handover of MTC Devices in 5G Wireless Networks
by Ronghao Ma, Jianhong Zhou and Maode Ma
Sensors 2024, 24(7), 2331; https://doi.org/10.3390/s24072331 - 6 Apr 2024
Cited by 7 | Viewed by 3417
Abstract
In the realm of the fifth-generation (5G) wireless cellular networks, renowned for their dense connectivity, there lies a substantial facilitation of a myriad of Internet of Things (IoT) applications, which can be supported by the massive machine-type communication (MTC) technique, a fundamental communication framework. In some scenarios, a large number of machine-type communication devices (MTCD) may simultaneously enter the communication coverage of a target base station. However, the current handover mechanism specified by the 3rd Generation Partnership Project (3GPP) Release 16 incurs high signaling overhead within the access and core networks, which may have negative impacts on network efficiency. Additionally, other existing solutions are vulnerable to malicious attacks such as Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks, and to the failure of Key Forward Secrecy (KFS). To address this challenge, this paper proposes an efficient and secure handover authentication protocol for a group of MTCDs supported by blockchain technology. This protocol leverages the decentralized nature of blockchain technology and combines it with certificateless aggregate signatures to mutually authenticate the identity of a base station and a group of MTCDs. This approach can reduce signaling overhead and avoid key escrow while significantly lowering the risk associated with single points of failure. Additionally, the protocol protects device anonymity by encrypting device identities into temporary anonymous identity markers derived via the Elliptic Curve Diffie–Hellman (ECDH) scheme, abandoning serial numbers to prevent linkage attacks. The resilience of the proposed protocol against predominant malicious attacks has been rigorously validated through the application of the BAN logic and Scyther tool, underscoring its robust security attributes. 
Furthermore, compared with existing solutions, the proposed protocol significantly reduces the authentication cost for a group of MTCDs during handover while preserving security, demonstrating strong efficiency. Full article
(This article belongs to the Special Issue Feature Papers in Communications Section 2023)
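The temporary-anonymous-identity idea in the abstract above can be illustrated with a toy, stdlib-only sketch. Classic finite-field Diffie–Hellman stands in here for ECDH (the paper uses elliptic-curve groups), the group parameters are deliberately small demo values, and all names (`temp_anonymous_id`, the IMEI-style string) are illustrative assumptions, not from the paper: each side combines the DH shared secret, the device's real identifier, and a per-session nonce into a hash, so both ends derive the same unlinkable marker without the real identifier ever appearing on the air interface.

```python
import hashlib
import secrets

# Toy parameters for illustration only: real deployments use standardized
# elliptic-curve groups (i.e. ECDH), not a small finite-field group.
P = 2**127 - 1  # Mersenne prime, fine for a demo, far too weak in practice
G = 5

def dh_keypair():
    """Generate a Diffie-Hellman private/public key pair."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def temp_anonymous_id(real_id: str, own_priv: int, peer_pub: int,
                      session_nonce: bytes) -> str:
    """Derive a temporary anonymous identity marker.

    Hashes the DH shared secret with the device's real identifier and a
    per-session nonce, so the marker changes every session (linkage
    resistance) while both sides compute the same value.
    """
    shared = pow(peer_pub, own_priv, P)  # DH shared secret
    material = (shared.to_bytes((P.bit_length() + 7) // 8, "big")
                + real_id.encode() + session_nonce)
    return hashlib.sha256(material).hexdigest()

# Device and base station each generate a key pair and exchange public values.
d_priv, d_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
nonce = secrets.token_bytes(16)

tid_device = temp_anonymous_id("IMEI-123456", d_priv, b_pub, nonce)
tid_station = temp_anonymous_id("IMEI-123456", b_priv, d_pub, nonce)
assert tid_device == tid_station  # both sides derive the same marker
```

Because the marker depends on a fresh nonce, two sessions of the same device yield unrelated identifiers, which is the property that defeats linkage attacks.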
38 pages, 1021 KB  
Review
A Systematic Survey on 5G and 6G Security Considerations, Challenges, Trends, and Research Areas
by Paul Scalise, Matthew Boeding, Michael Hempel, Hamid Sharif, Joseph Delloiacovo and John Reed
Future Internet 2024, 16(3), 67; https://doi.org/10.3390/fi16030067 - 20 Feb 2024
Cited by 64 | Viewed by 15595
Abstract
With the rapid rollout and growing adoption of 3GPP 5th Generation (5G) cellular services, including in critical infrastructure sectors, it is important to review security mechanisms, risks, and potential vulnerabilities within this vital technology. Numerous security capabilities need to work together to ensure and maintain a sufficiently secure 5G environment that places user privacy and security at the forefront. Confidentiality, integrity, and availability are all pillars of a privacy and security framework that define major aspects of 5G operations. They are incorporated and considered in the design of the 5G standard by the 3rd Generation Partnership Project (3GPP) with the goal of providing a highly reliable network operation for all. Through a comprehensive review, we aim to analyze the ever-evolving landscape of 5G, including any potential attack vectors and proposed measures to mitigate or prevent these threats. This paper presents a comprehensive survey of the state-of-the-art research that has been conducted in recent years regarding 5G systems, focusing on the main components in a systematic approach: the Core Network (CN), Radio Access Network (RAN), and User Equipment (UE). Additionally, we investigate the utilization of 5G in time-dependent, ultra-confidential, and private communications built around a Zero Trust approach. In today’s world, where everything is more connected than ever, Zero Trust policies and architectures can be highly valuable in operations containing sensitive data. Realizing a Zero Trust Architecture entails continuous verification of all devices, users, and requests, regardless of their location within the network, and grants permission only to authorized entities. 
Finally, developments and proposed methods of new 5G and future 6G security approaches, such as Blockchain technology, post-quantum cryptography (PQC), and Artificial Intelligence (AI) schemes, are also discussed to better understand the full landscape of current and future research within this telecommunications domain. Full article
(This article belongs to the Special Issue 5G Security: Challenges, Opportunities, and the Road Ahead)
