Search Results (81)

Search Parameters:
Keywords = MTC (machine-type communication)

36 pages, 738 KB  
Article
Activity Detection and Channel Estimation Based on Correlated Hybrid Message Passing for Grant-Free Massive Random Access
by Xiaofeng Liu, Xinrui Gong and Xiao Fu
Entropy 2025, 27(11), 1111; https://doi.org/10.3390/e27111111 - 28 Oct 2025
Viewed by 647
Abstract
Massive machine-type communications (mMTC) in future 6G networks will involve a vast number of devices with sporadic traffic. Grant-free access has emerged as an effective strategy to reduce access latency and processing overhead by allowing devices to transmit without prior permission, making accurate active user detection and channel estimation (AUDCE) crucial. In this paper, we investigate the joint AUDCE problem in wideband massive access systems. We develop a channel prior model that captures the dual correlation structure of the channel using three state variables: active indication, channel supports, and channel values. By integrating Markov chains with coupled Gaussian distributions, the model effectively describes both the structural and numerical dependencies within the channel. We propose the correlated hybrid message passing (CHMP) algorithm based on Bethe free energy (BFE) minimization, which adaptively updates model parameters without requiring prior knowledge of user sparsity or channel priors. Simulation results show that the CHMP algorithm accurately detects active users and achieves precise channel estimation.
(This article belongs to the Topic Advances in Sixth Generation and Beyond (6G&B))
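As a rough, self-contained illustration of the grant-free setting this abstract describes (sporadic device activity on shared non-orthogonal pilots), the sketch below runs a naive correlation-energy detector in NumPy. It is a baseline for the problem only, not the authors' CHMP algorithm, and every dimension and threshold is invented:

```python
# Toy grant-free activity detection setup (NOT the paper's CHMP algorithm):
# a naive correlation-energy detector, assuming Gaussian pilots and
# i.i.d. channels. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, L, M, p_active = 100, 40, 8, 0.05  # devices, pilot length, antennas, activity prob.

A = rng.normal(size=(L, N)) / np.sqrt(L)        # non-orthogonal pilot matrix
active = rng.random(N) < p_active               # sporadic activity pattern
H = rng.normal(size=(N, M)) * active[:, None]   # channels, zero for inactive devices
Y = A @ H + 0.05 * rng.normal(size=(L, M))      # received signal with noise

energy = np.sum((A.T @ Y) ** 2, axis=1)         # per-device correlation energy
detected = energy > 5 * np.median(energy)       # crude threshold (illustrative)
print("hits:", int(np.sum(detected & active)), "of", int(active.sum()))
```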

16 pages, 1350 KB  
Article
The Synergistic Impact of 5G on Cloud-to-Edge Computing and the Evolution of Digital Applications
by Saleh M. Altowaijri and Mohamed Ayari
Mathematics 2025, 13(16), 2634; https://doi.org/10.3390/math13162634 - 16 Aug 2025
Cited by 1 | Viewed by 5427
Abstract
The integration of 5G technology with cloud and edge computing is redefining the digital landscape by enabling ultra-fast connectivity, low-latency communication, and scalable solutions across diverse application domains. This paper investigates the synergistic impact of 5G on cloud-to-edge architectures, emphasizing its transformative role in revolutionizing sectors such as healthcare, smart cities, industrial automation, and autonomous systems. Key advancements in 5G, including Enhanced Mobile Broadband (eMBB), Ultra-Reliable Low-Latency Communication (URLLC), and Massive Machine-Type Communications (mMTC), are examined for their role in enabling real-time data processing, edge intelligence, and IoT scalability. In addition to conceptual analysis, the paper presents simulation-based evaluations comparing 5G cloud-to-edge systems with traditional 4G cloud models. Quantitative results demonstrate significant improvements in latency, energy efficiency, reliability, and AI prediction accuracy. The study also explores challenges in infrastructure deployment, cybersecurity, and latency management while highlighting the growing opportunities for innovation in AI-driven automation and immersive consumer technologies. Future research directions are outlined, focusing on energy-efficient designs, advanced security mechanisms, and equitable access to 5G infrastructure. Overall, this study offers comprehensive insights and performance benchmarks that will serve as a valuable resource for researchers and practitioners working to advance next-generation digital ecosystems.
(This article belongs to the Special Issue Innovations in Cloud Computing and Machine Learning Applications)

15 pages, 1529 KB  
Article
Peak Age of Information Optimization in Cell-Free Massive Random Access Networks
by Zhiru Zhao, Yuankang Huang and Wen Zhan
Electronics 2025, 14(13), 2714; https://doi.org/10.3390/electronics14132714 - 4 Jul 2025
Viewed by 758
Abstract
With the vigorous development of Internet of Things technologies, Cell-Free Radio Access Network (CF-RAN), leveraging its distributed coverage and single/multi-antenna Access Point (AP) coordination advantages, has become a key technology for supporting massive Machine-Type Communication (mMTC). However, under the grant-free random access mechanism, this network architecture faces the problem of information freshness degradation due to channel congestion. To address this issue, a joint decoding model based on a logical grouping architecture is introduced to analyze the relationship between the successful packet transmission probability and the Peak Age of Information (PAoI) in both single-AP and multi-AP scenarios. On this basis, a global Particle Swarm Optimization (PSO) algorithm is designed to dynamically adjust the channel access probability to minimize the average PAoI across the network. To reduce signaling overhead, a PSO algorithm based on local topology information is further proposed to achieve collaborative optimization among neighboring APs. Simulation results demonstrate that the global PSO algorithm achieves performance closely approximating the optimum, while the local PSO algorithm maintains similar performance without the need for global information, making it especially suitable for large-scale access scenarios with wide-area coverage and providing an efficient solution for optimizing information freshness in CF-RAN.
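The PSO component is generic enough to sketch. The snippet below tunes a single access probability against a toy slotted-ALOHA surrogate for average PAoI; it is not the paper's CF-RAN model, and the objective function is purely illustrative:

```python
# Minimal particle swarm optimization (PSO) sketch for tuning a channel access
# probability p, using a toy slotted-ALOHA surrogate for average PAoI.
import numpy as np

rng = np.random.default_rng(1)
n_devices = 50

def avg_paoi(p):
    # per-slot success probability of a tagged device under slotted ALOHA
    q = p * (1.0 - p) ** (n_devices - 1)
    return 2.0 / q - 1.0  # toy PAoI proxy (geometric inter-delivery times)

# PSO over the scalar p in (0, 1)
n_particles, iters, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5
x = rng.uniform(0.001, 0.999, n_particles)
v = np.zeros(n_particles)
pbest, pbest_f = x.copy(), avg_paoi(x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0.001, 0.999)
    f = avg_paoi(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print(f"best access probability ~ {gbest:.4f} (analytical optimum: 1/n = {1/n_devices})")
```

For this surrogate the swarm should converge near the known slotted-ALOHA optimum p = 1/n, which is a quick sanity check on the implementation.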

18 pages, 1372 KB  
Article
Resource Allocation in 5G Cellular IoT Systems with Early Transmissions at the Random Access Phase
by Anastasia Daraseliya, Eduard Sopin, Vyacheslav Begishev, Yevgeni Koucheryavy and Konstantin Samouylov
Sensors 2025, 25(7), 2264; https://doi.org/10.3390/s25072264 - 3 Apr 2025
Cited by 1 | Viewed by 1395
Abstract
While the market for massive machine-type communications (mMTC) is evolving at an unprecedented pace, the standardization bodies, including 3GPP, are lagging behind with the standardization of truly 5G-grade cellular Internet-of-Things (CIoT) systems. As an intermediate solution, an early data transmission mechanism encapsulating the data into the preambles has recently been proposed for 4G/5G Narrowband IoT (NB-IoT) technology. This mechanism is also expected to become part of future CIoT systems. The aim of this paper is to propose a model for CIoT systems with and without early transmission functionality and to assess the optimal distribution of resources between the random access and data transmission phases. To this end, the developed model captures both phases explicitly, as well as the different traffic composition in the downlink and uplink directions. Our numerical results demonstrate that the use of early transmission functionality allows one to drastically decrease the delay of uplink packets, by up to 20–40%, even in the presence of downlink traffic sharing the same set of resources. However, it also affects the optimal share of resources allocated to the random access and data transmission phases. As a result, the optimal performance of 5G mMTC technologies with or without early transmission mode can only be attained if dynamic resource allocation is implemented.
(This article belongs to the Section Internet of Things)
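The resource-split trade-off at the heart of this abstract can be illustrated with a deliberately simple queueing toy: split a fixed capacity between the random access and data transmission phases and search for the share that minimizes total delay. The M/M/1-style delay terms and all rates below are hypothetical, not the paper's CIoT model:

```python
# Toy illustration of splitting a fixed resource budget between the random
# access (RA) and data transmission (DT) phases. Rates are invented.
import numpy as np

lam_ra, lam_dt = 4.0, 6.0   # arrival rates at each phase (illustrative)
mu_total = 20.0             # total service capacity to split

def total_delay(share):
    mu_ra, mu_dt = mu_total * share, mu_total * (1 - share)
    if mu_ra <= lam_ra or mu_dt <= lam_dt:
        return np.inf                           # unstable split
    return 1 / (mu_ra - lam_ra) + 1 / (mu_dt - lam_dt)

shares = np.linspace(0.01, 0.99, 981)
delays = np.array([total_delay(s) for s in shares])
best = shares[np.argmin(delays)]
print(f"best RA share ~ {best:.2f}, total delay ~ {delays.min():.3f}")
```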

24 pages, 8199 KB  
Article
Redefining 6G Network Slicing: AI-Driven Solutions for Future Use Cases
by Robert Botez, Daniel Zinca and Virgil Dobrota
Electronics 2025, 14(2), 368; https://doi.org/10.3390/electronics14020368 - 18 Jan 2025
Cited by 14 | Viewed by 6885
Abstract
The evolution from 5G to 6G networks is driven by the need to meet stringent requirements for ultra-reliable, low-latency, and high-throughput communication. The new services are called Further-Enhanced Mobile Broadband (feMBB), Extremely Reliable and Low-Latency Communications (ERLLCs), Ultra-Massive Machine-Type Communications (umMTCs), Massive Ultra-Reliable Low-Latency Communications (mURLLCs), and Mobile Broadband Reliable Low-Latency Communications (MBRLLCs). Network slicing emerges as a critical enabler in 6G, providing virtualized, end-to-end network segments tailored to diverse application needs. Despite its significance, existing datasets for slice selection are limited to 5G or LTE-A contexts and lack relevance to these enhanced requirements. In this work, we present a novel synthetic dataset tailored to 6G network slicing. By analyzing the emerging service requirements, we generated traffic parameters, including latency, packet loss, jitter, and transfer rates. Machine Learning (ML) models such as Random Forest (RF), Decision Tree (DT), XGBoost, Support Vector Machine (SVM), and Feedforward Neural Network (FNN) were trained on this dataset, achieving over 99% accuracy in both slice classification and handover prediction. Our results highlight the potential of this dataset as a valuable tool for developing AI-assisted 6G network slicing mechanisms. While still in its early stages, the dataset lays a foundation for future research. As 6G standardization advances, we aim to refine the dataset and models, ultimately enabling real-time, intelligent slicing solutions for next-generation networks.
(This article belongs to the Special Issue Advances in IoT Security)
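A minimal sketch of the classification workflow this abstract describes follows, in the same spirit (traffic features in, slice label out). The feature ranges, the labeling rule, and the slice names are invented for illustration; they are not the authors' dataset:

```python
# Sketch of slice classification on synthetic traffic features.
# Feature ranges and the labeling rule are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 3000
latency = rng.uniform(0.1, 100, n)     # ms
jitter = rng.uniform(0.01, 10, n)      # ms
loss = rng.uniform(0, 5, n)            # %
rate = rng.uniform(0.01, 10_000, n)    # Mbps

# Hypothetical rule: very low latency -> URLLC-like, very high rate ->
# eMBB-like, everything else mMTC-like.
labels = np.where(latency < 5, "uRLLC-like",
         np.where(rate > 1000, "eMBB-like", "mMTC-like"))

X = np.column_stack([latency, jitter, loss, rate])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

Because the labels here are generated by a deterministic rule over the same features, near-perfect accuracy is expected; a real dataset would of course be noisier.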

12 pages, 1157 KB  
Article
Multi-Layered Unsupervised Learning Driven by Signal-to-Noise Ratio-Based Relaying for Vehicular Ad Hoc Network-Supported Intelligent Transport System in eHealth Monitoring
by Ali Nauman, Adeel Iqbal, Tahir Khurshaid and Sung Won Kim
Sensors 2024, 24(20), 6548; https://doi.org/10.3390/s24206548 - 11 Oct 2024
Cited by 1 | Viewed by 2197
Abstract
Every year, about 1.19 million people are killed in traffic accidents; hence, the United Nations has a goal of halving the number of road traffic deaths and injuries by 2030. In line with this objective, technological innovations in telecommunication, particularly brought about by the rise of 5G networks, have contributed to the development of modern Vehicle-to-Everything (V2X) systems for communication. New Radio V2X (NR-V2X), introduced in the latest Third Generation Partnership Project (3GPP) releases, allows user devices to exchange information without relying on roadside infrastructure. This, together with Massive Machine Type Communication (mMTC) and Ultra-Reliable Low Latency Communication (URLLC), has significantly increased the reliability, coverage, and efficiency of vehicular communication networks. The use of artificial intelligence (AI), especially K-means clustering, has been very promising in terms of supporting efficient data exchange in vehicular ad hoc networks (VANETs). K-means is an unsupervised machine learning (ML) technique that groups vehicles located near each other geographically so that they can communicate with one another directly within these clusters, while also allowing for inter-cluster communication via cluster heads. This paper proposes a multi-layered VANET-enabled Intelligent Transportation System (ITS) framework powered by unsupervised learning to optimize communication efficiency, scalability, and reliability. By leveraging AI in VANET solutions, the proposed framework aims to address road safety challenges and contribute to global efforts to meet the United Nations' 2030 target. Additionally, this framework's robust communication and data processing capabilities can be extended to eHealth monitoring systems, enabling real-time health data transmission and processing for continuous patient monitoring and timely medical interventions. This paper's contributions include exploring AI-driven approaches for enhanced data interaction, improved safety in VANET-based ITS environments, and potential applications in eHealth monitoring.
(This article belongs to the Special Issue Intelligent Sensors and Control for Vehicle Automation)
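The clustering idea described in the abstract reduces to a few lines: cluster vehicles by position and take, as cluster head, the vehicle nearest each centroid. Positions and the cluster count below are illustrative assumptions:

```python
# Minimal sketch of K-means vehicle grouping with nearest-to-centroid
# cluster-head selection. Map size and cluster count are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
positions = rng.uniform(0, 1000, size=(200, 2))   # 200 vehicles on a 1 km^2 map

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(positions)
heads = [
    # within cluster k, pick the vehicle closest to the centroid c
    int(np.argmin(np.linalg.norm(positions - c, axis=1)
                  + np.where(km.labels_ == k, 0.0, np.inf)))
    for k, c in enumerate(km.cluster_centers_)
]
print("cluster-head vehicle indices:", heads)
```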

13 pages, 392 KB  
Article
Grant-Free Random Access Enhanced by Massive MIMO and Non-Orthogonal Preambles
by Hao Jiang, Hongming Chen, Hongming Hu and Jie Ding
Electronics 2024, 13(11), 2179; https://doi.org/10.3390/electronics13112179 - 3 Jun 2024
Viewed by 1886
Abstract
Massive multiple-input multiple-output (MIMO)-enabled grant-free random access (mGFRA) stands out as a promising random access (RA) solution, effectively addressing the need for massive access in massive machine-type communications (mMTCs) while ensuring high spectral efficiency and minimizing signaling overhead. However, the bottleneck of mGFRA is mainly dominated by orthogonal preamble collisions, since the orthogonal preamble pool is small and of a fixed size. In this paper, we explore the potential of non-orthogonal preambles to overcome this limitation and enhance the success probability of mGFRA without extending the length of the preamble. Given the RA procedure of mGFRA, we analyze the factors influencing the success rate of mGFRA with non-orthogonal preambles and propose to use two types of sequences, namely Gold sequences and Gaussian distribution sequences, as the preambles for mGFRA. Simulation results demonstrate the effectiveness of these two types of non-orthogonal preambles in improving the success probability of mGFRA. Moreover, the impact of system parameters on the performance of mGFRA with non-orthogonal preambles is examined and discussed.
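The trade-off the abstract exploits is easy to see numerically: Gaussian-drawn preambles are not orthogonal, but the pool can be made far larger than the preamble length. The sizes below are illustrative, and this shows only the Gaussian case, not the Gold-sequence construction:

```python
# Non-orthogonal Gaussian preambles: pool size >> preamble length, at the
# cost of nonzero cross-correlation. All dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(4)
L, pool = 64, 512                                # an orthogonal pool would cap at 64
P = rng.normal(size=(pool, L))
P /= np.linalg.norm(P, axis=1, keepdims=True)    # unit-norm preambles

G = P @ P.T
np.fill_diagonal(G, 0.0)
print(f"max cross-correlation: {np.abs(G).max():.3f}")  # small but nonzero
```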

24 pages, 1104 KB  
Article
A Learning-Based Energy-Efficient Device Grouping Mechanism for Massive Machine-Type Communication in the Context of Beyond 5G Networks
by Rubbens Boisguene, Ibrahim Althamary and Chih-Wei Huang
J. Sens. Actuator Netw. 2024, 13(3), 33; https://doi.org/10.3390/jsan13030033 - 28 May 2024
Cited by 4 | Viewed by 2461
Abstract
With the increasing demand for high data rates, low delay, and extended battery life, managing massive machine-type communication (mMTC) in the beyond-5G (B5G) context is challenging. mMTC devices, which play a role in developing the Internet of Things (IoT) and smart cities, need to transmit short amounts of data periodically within a specific time frame. Although blockchain technology is utilized for secure data storage and transfer, while digital twin technology provides real-time monitoring and management of the devices, issues such as constrained time delays and network congestion persist. Without a proper data transmission strategy, most devices would fail to transmit in time, undermining their relevance and purpose. This work investigates the problem of massive random access channel (RACH) attempts while emphasizing energy efficiency and access latency for mMTC devices with critical missions in B5G networks. Using machine learning techniques, we propose an attention-based reinforcement learning model that orchestrates the device grouping strategy to optimize device placement. The model thus guarantees a higher probability of success for the devices during data transmission access, eventually leading to more efficient energy consumption. Through thorough quantitative simulations, we demonstrate that the proposed learning-based approach significantly outperforms the other baseline grouping methods.

18 pages, 834 KB  
Article
A Multi-Agent Reinforcement Learning-Based Grant-Free Random Access Protocol for mMTC Massive MIMO Networks
by Felipe Augusto Dutra Bueno, Alessandro Goedtel, Taufik Abrão and José Carlos Marinello
J. Sens. Actuator Netw. 2024, 13(3), 30; https://doi.org/10.3390/jsan13030030 - 30 Apr 2024
Cited by 3 | Viewed by 3110
Abstract
The expected huge number of connected devices in Internet of Things (IoT) applications characterizes the massive machine-type communication (mMTC) scenario, one prominent use case of beyond-fifth-generation (B5G) systems. To meet mMTC connectivity requirements, grant-free (GF) random access (RA) protocols are seen as a promising solution due to the small amount of data that MTC devices usually transmit. In this paper, we propose a GF RA protocol based on a multi-agent reinforcement learning approach, applied to aid IoT devices in selecting the least congested RA pilots. The rewards obtained by the devices in collision cases reflect the congestion level of the chosen pilot. To enable the operation of the proposed method in a realistic B5G network scenario, and aiming to reduce signaling overhead and centralized processing, the rewards in our proposed method are computed by the devices themselves, taking advantage of a large number of base station antennas. Numerical results demonstrate the superior performance of the proposed method in terms of latency, network throughput, and per-device throughput compared with other protocols.
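The core learning loop can be caricatured as a stateless multi-agent bandit: each device keeps a preference value per pilot and learns from collision feedback to avoid congested pilots. The reward shaping and all constants below are illustrative, not the paper's protocol:

```python
# Toy multi-agent bandit for pilot selection: devices learn from collision
# feedback to prefer less congested pilots. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_dev, n_pilots, rounds, eps, lr = 60, 16, 500, 0.1, 0.1
Q = np.zeros((n_dev, n_pilots))   # per-device preference per pilot

for _ in range(rounds):
    greedy = Q.argmax(axis=1)
    explore = rng.random(n_dev) < eps
    choice = np.where(explore, rng.integers(0, n_pilots, n_dev), greedy)
    counts = np.bincount(choice, minlength=n_pilots)
    # reward 1 on a singleton pilot; negative reward scaled by congestion level
    reward = np.where(counts[choice] == 1, 1.0, -(counts[choice] - 1) / n_dev)
    Q[np.arange(n_dev), choice] += lr * (reward - Q[np.arange(n_dev), choice])

final = Q.argmax(axis=1)
counts = np.bincount(final, minlength=n_pilots)
print(f"devices in collision under greedy policy: {int((counts[final] > 1).sum())}/{n_dev}")
```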

17 pages, 961 KB  
Review
The Advantage of the 5G Network for Enhancing the Internet of Things and the Evolution of the 6G Network
by Georgios Gkagkas, Dimitrios J. Vergados, Angelos Michalas and Michael Dossis
Sensors 2024, 24(8), 2455; https://doi.org/10.3390/s24082455 - 11 Apr 2024
Cited by 27 | Viewed by 6564
Abstract
The Internet of Things (IoT) is one of the great breakthroughs enabled by the 5G network. Although 5G can support several Internet of Everything (IoE) services, 6G is the network that will fully support them. This paper is a survey presenting 5G and IoT technology and the challenges ahead, with the 6G network as the emerging alternative that addresses the issues and limitations faced with 5G. Control Plane and User Plane Separation (CUPS) is discussed together with IPv4 and IPv6 addressing, which forms the foundation of network slicing for the 5G core network. In comparison to other related papers, we provide in-depth information on how the IoT will affect our lives and how this technology is handled as the IoE in the 6G network. Finally, a full overview of the 6G network is given, with its challenges compared to those of 5G.
(This article belongs to the Special Issue Advanced Technologies in 5G/6G-Enabled IoT Environments and Beyond)

27 pages, 3597 KB  
Article
A Blockchain-Assisted Security Protocol for Group Handover of MTC Devices in 5G Wireless Networks
by Ronghao Ma, Jianhong Zhou and Maode Ma
Sensors 2024, 24(7), 2331; https://doi.org/10.3390/s24072331 - 6 Apr 2024
Cited by 6 | Viewed by 3188
Abstract
Fifth-generation (5G) wireless cellular networks, renowned for their dense connectivity, facilitate a myriad of Internet of Things (IoT) applications, supported by massive machine-type communication (MTC) as a fundamental communication framework. In some scenarios, a large number of machine-type communication devices (MTCDs) may simultaneously enter the communication coverage of a target base station. However, the current handover mechanism specified by 3rd Generation Partnership Project (3GPP) Release 16 incurs high signaling overhead within the access and core networks, which may negatively impact network efficiency. Additionally, other existing solutions are vulnerable to malicious attacks such as Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks, and to the failure of Key Forward Secrecy (KFS). To address this challenge, this paper proposes an efficient and secure handover authentication protocol for a group of MTCDs, supported by blockchain technology. The protocol leverages the decentralized nature of blockchain and combines it with certificateless aggregate signatures to mutually authenticate a base station and a group of MTCDs. This approach reduces signaling overhead and avoids key escrow while significantly lowering the risk associated with single points of failure. Additionally, the protocol protects device anonymity by encrypting device identities with temporary anonymous identity markers derived via Elliptic Curve Diffie–Hellman (ECDH), abandoning serial numbers to prevent linkage attacks. The resilience of the proposed protocol against predominant malicious attacks has been rigorously validated through the application of BAN logic and the Scyther tool, underscoring its robust security attributes. Furthermore, compared to existing solutions, the proposed protocol significantly reduces the authentication cost for a group of MTCDs during handover while ensuring security, demonstrating commendable efficiency.
(This article belongs to the Special Issue Feature Papers in Communications Section 2023)
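One ingredient the abstract mentions, deriving a temporary anonymous identity marker from an ECDH shared secret so a fixed serial number never appears on the air interface, can be sketched conceptually. X25519 and HKDF below are stand-ins for whatever primitives the paper actually uses; this is not the proposed protocol:

```python
# Conceptual sketch: derive a temporary anonymous ID from an ECDH shared
# secret. X25519/HKDF are illustrative stand-ins, not the paper's design.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

device_priv = X25519PrivateKey.generate()   # MTCD ephemeral key
gnb_priv = X25519PrivateKey.generate()      # base station ephemeral key

# Both sides compute the same shared secret from their own private key and
# the peer's public key.
shared = device_priv.exchange(gnb_priv.public_key())
temp_id = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
               info=b"temporary-anonymous-id").derive(shared)
print("temporary ID:", temp_id.hex())
```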

24 pages, 967 KB  
Article
Effective Energy Efficiency under Delay–Outage Probability Constraints and F-Composite Fading
by Fahad Qasmi, Irfan Muhammad, Hirley Alves and Matti Latva-aho
Sensors 2024, 24(7), 2328; https://doi.org/10.3390/s24072328 - 6 Apr 2024
Viewed by 1687
Abstract
The paradigm of the next-generation cellular network (6G) and beyond is machine-type communications (MTCs), where numerous Internet of Things (IoT) devices operate autonomously, without human intervention, over wireless channels. IoT's autonomous and energy-intensive characteristics highlight effective energy efficiency (EEE) as a crucial key performance indicator (KPI) of 6G. However, there is a lack of investigation into the EEE of random arrival traffic, which is the underlying platform for MTCs. In this work, we explore the distinct characteristics of F-composite fading channels, which capture the combined impact of multipath fading and shadowing. Furthermore, we evaluate the EEE over such fading under a finite blocklength regime and QoS constraints, where IoT applications generate constant and sporadic traffic. We consider a point-to-point buffer-aided communication system model, where (1) an uplink transmission under a finite blocklength regime is examined; (2) perfect channel state information (CSI) is assumed available at the receiver, and the channel is characterized by the F-composite fading model; and (3) owing to their effectiveness and tractability, Markovian source models are used to compute the average arrival rate of the application data. To this end, we derive an exact closed-form expression for the outage probability and the effective rate, which provides an accurate approximation for our analysis. Moreover, we determine the arrival and required service rates that satisfy the QoS constraints by applying effective bandwidth and effective capacity theories. The EEE is shown to be quasiconcave, with a trade-off between the transmit power and the rate for maximising the EEE. Measuring the impact of transmission power or rate individually is quite complex, and this complexity is further intensified when both variables are considered simultaneously. Thus, we formulate power allocation (PA) and rate allocation (RA) optimisation problems, individually and jointly, to maximise the EEE under a QoS constraint, and solve them numerically through a particle swarm optimization (PSO) algorithm. Finally, we examine the EEE performance in the context of line-of-sight and shadowing parameters.
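For readers unfamiliar with the effective-capacity machinery the abstract invokes, the commonly used definitions are sketched below; this is the standard notation from the effective bandwidth/capacity literature and may differ from the paper's exact formulation:

```latex
% Effective capacity: the maximum constant arrival rate supportable under a
% statistical delay (QoS) exponent \theta, where r is the per-block service
% rate in bits. Larger \theta means a stricter delay-outage constraint.
E_C(\theta) = -\frac{1}{\theta}\,\ln \mathbb{E}\!\left[e^{-\theta r}\right]

% A common effective energy efficiency (EEE) metric: effectively served bits
% per joule, with transmit power P_{\mathrm{tx}} and circuit power P_c.
\mathrm{EEE} = \frac{E_C(\theta)}{P_{\mathrm{tx}} + P_c}
```

The quasiconcavity claim in the abstract matches the usual intuition for such ratios: raising transmit power first increases $E_C(\theta)$ faster than the denominator grows, then more slowly, yielding a single interior maximum.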

22 pages, 7068 KB  
Article
Field-Programmable Gate Array-Based Implementation of Zero-Trust Stream Data Encryption for Enabling 6G-Narrowband Internet of Things Massive Device Access
by Wen-Chung Tsai
Sensors 2024, 24(3), 853; https://doi.org/10.3390/s24030853 - 28 Jan 2024
Cited by 6 | Viewed by 2882
Abstract
With the advent of 6G Narrowband IoT (NB-IoT) technology, IoT security faces inevitable challenges due to the application requirements of Massive Machine-Type Communications (mMTCs). In response, a 6G base station (gNB) and User Equipment (UE) require increased capacity to handle a larger number of connections while maintaining reasonable performance during operation. To address this developmental trend and overcome the associated technological hurdles, this paper proposes a hardware-accelerated, software co-designed mechanism to support streaming data transmissions and secure zero-trust inter-endpoint communications. The proposed implementation aims to offload processing effort from micro-processors and enhance overall system performance through hardware and software co-design in endpoint communications. Experimental results demonstrate that the proposed secure mechanism, based on the use of non-repeating keys and implemented in FPGA, can save 85.61%, 99.71%, and 95.68% of the micro-processor's processing time in key block generation, non-repeating checks, and data block transfers, respectively.
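A software analogue of the non-repeating-key idea is sketched below: derive each key block from a monotonically increasing counter and verify that no block repeats. The paper implements this (much faster) in FPGA hardware; the construction here is only a conceptual stand-in with a placeholder secret:

```python
# Software analogue of non-repeating key-block generation with a repetition
# check. The counter-plus-hash construction is illustrative, not the paper's
# hardware design.
import hashlib

master_secret = b"\x00" * 32   # placeholder secret for illustration
seen = set()

def next_key_block(counter: int) -> bytes:
    block = hashlib.sha256(master_secret + counter.to_bytes(8, "big")).digest()
    if block in seen:              # the "non-repeating check"
        raise RuntimeError("repeated key block")
    seen.add(block)
    return block

keys = [next_key_block(i) for i in range(4)]
print(keys[0].hex()[:16], "...", len(seen), "unique blocks")
```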

27 pages, 2174 KB  
Article
Market and Sharing Alternatives for the Provision of Massive Machine-Type and Ultra-Reliable Low-Latency Communications Services over a 5G Network
by Edison Moreno-Cardenas and Luis Guijarro
Electronics 2023, 12(24), 4994; https://doi.org/10.3390/electronics12244994 - 13 Dec 2023
Cited by 1 | Viewed by 2009
Abstract
The objective of this paper is to analyze economic alternatives for the provision of ultra-reliable low-latency communication (URLLC) and massive machine-type communication (mMTC) services over a fifth-generation (5G) network. Two business models, a monopoly and a duopoly, are studied, and two 5G network scenarios are analyzed: a 5G network where the network resources are shared between the two services without service priority, and a 5G network with network slicing that allows URLLC traffic to have a higher priority. Microeconomics is used to model the behavior of users and operators, and game theory is used to model the strategic interaction between users and operators. The results show that a monopoly over a 5G network with network slicing is the most efficient way to provide both URLLC and mMTC services.

20 pages, 1373 KB  
Article
An E2E Network Slicing Framework for Slice Creation and Deployment Using Machine Learning
by Sujitha Venkatapathy, Thiruvenkadam Srinivasan, Han-Gue Jo and In-Ho Ra
Sensors 2023, 23(23), 9608; https://doi.org/10.3390/s23239608 - 4 Dec 2023
Cited by 8 | Viewed by 3957
Abstract
Network slicing shows promise as a means to endow 5G networks with flexible and dynamic features. Network function virtualization (NFV) and software-defined networking (SDN) are the key methods for deploying network slicing, which will enable end-to-end (E2E) isolation services, permitting each slice to be customized depending on service requirements. The goal of this investigation is to construct network slices through a machine learning algorithm and to allocate resources for the newly created slices efficiently using dynamic programming. A substrate network is constructed with a list of key performance indicators (KPIs) such as CPU capacity, bandwidth, delay, link capacity, and security level. After that, network slices are produced by employing a multi-layer perceptron (MLP) with the adaptive moment estimation (ADAM) optimization algorithm. For each requested service, the network slices are categorized as massive machine-type communications (mMTC), enhanced mobile broadband (eMBB), or ultra-reliable low-latency communications (uRLLC). After network slicing, resources are provided to the requested services. In order to maximize the total user access rate and resource efficiency, Dijkstra's algorithm is adopted for resource allocation, determining the shortest path between nodes in the substrate network. The simulation output shows that the present model allocates optimal slices to the requested services with high resource efficiency and reduced total bandwidth utilization.
(This article belongs to the Section Sensor Networks)
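The Dijkstra step of the pipeline is standard and easy to sketch: find the lowest-delay path between two substrate nodes. The graph topology and its delay weights below are illustrative, not the paper's substrate network:

```python
# Minimal Dijkstra sketch for the lowest-delay path between substrate nodes.
# The graph and its delay weights are illustrative.
import heapq

graph = {  # node -> list of (neighbor, delay in ms)
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("A", 2.0), ("C", 1.0), ("D", 4.0)],
    "C": [("A", 5.0), ("B", 1.0), ("D", 1.5)],
    "D": [("B", 4.0), ("C", 1.5)],
}

def dijkstra(src, dst):
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:                    # walk back through predecessors
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

print(dijkstra("A", "D"))   # -> (['A', 'B', 'C', 'D'], 4.5)
```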
