Search Results (39)

Search Parameters:
Keywords = ultra-dense wireless networks

18 pages, 960 KiB  
Article
Hybrid Algorithm via Reciprocal-Argument Transformation for Efficient Gauss Hypergeometric Evaluation in Wireless Networks
by Jianping Cai and Zuobin Ying
Mathematics 2025, 13(15), 2354; https://doi.org/10.3390/math13152354 - 23 Jul 2025
Viewed by 127
Abstract
The rapid densification of wireless networks demands efficient evaluation of the special functions underpinning system-level performance metrics. To facilitate research, we introduce a computational framework tailored for the zero-balanced Gauss hypergeometric function Ψ(x, y) ≜ ₂F₁(1, x; 1 + x; −y), a fundamental mathematical kernel emerging in Signal-to-Interference-plus-Noise Ratio (SINR) coverage analysis of non-uniform cellular deployments. Specifically, we propose a novel Reciprocal-Argument Transformation Algorithm (RTA), derived rigorously from a Mellin–Barnes reciprocal-argument identity, achieving geometric convergence with rate O(1/y). By integrating RTA with a Pfaff-series solver into a hybrid algorithm guided by a golden-ratio switching criterion, our approach ensures optimal efficiency and numerical stability. Comprehensive validation demonstrates that the hybrid algorithm reliably attains machine-precision accuracy (≈10⁻¹⁶) within 1 μs per evaluation, dramatically accelerating calculations in realistic scenarios from hours to fractions of a second. Consequently, our method significantly enhances the feasibility of tractable optimization in ultra-dense non-uniform cellular networks, bridging the computational gap in large-scale wireless performance modeling. Full article
(This article belongs to the Special Issue Advances in High-Performance Computing, Optimization and Simulation)
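The zero-balanced kernel named in this abstract can be evaluated with off-the-shelf tools; the sketch below is a plain reference evaluation (not the paper's RTA or hybrid algorithm), assuming the kernel Ψ(x, y) = ₂F₁(1, x; 1 + x; −y) and cross-checking SciPy's direct evaluation against the classical Pfaff transformation, which maps the argument −y into [0, 1) and improves conditioning for large y.

```python
# Sketch: evaluating the zero-balanced kernel Psi(x, y) = 2F1(1, x; 1+x; -y)
# with SciPy, and checking it against the Pfaff transformation. This is a
# plain reference evaluation, not the paper's RTA/hybrid algorithm.
import math

from scipy.special import hyp2f1

def psi(x: float, y: float) -> float:
    """Zero-balanced Gauss hypergeometric kernel 2F1(1, x; 1+x; -y)."""
    return hyp2f1(1.0, x, 1.0 + x, -y)

def psi_pfaff(x: float, y: float) -> float:
    """Same kernel via the Pfaff transformation
    2F1(a, b; c; z) = (1-z)^(-a) * 2F1(a, c-b; c; z/(z-1)).
    With a = 1, b = x, c = 1+x the argument z = -y maps to
    y/(y+1) in [0, 1), which is better conditioned for large y."""
    z = -y
    return (1.0 - z) ** -1.0 * hyp2f1(1.0, 1.0, 1.0 + x, z / (z - 1.0))

if __name__ == "__main__":
    # Known closed form at x = 1: 2F1(1, 1; 2; -y) = ln(1+y)/y.
    assert abs(psi(1.0, 1.0) - math.log(2.0)) < 1e-9
    for x, y in [(0.5, 0.1), (2.0 / 3.0, 5.0), (0.9, 50.0)]:
        a, b = psi(x, y), psi_pfaff(x, y)
        print(f"x={x:.3f} y={y:5.1f}  direct={a:.12f}  pfaff={b:.12f}")
        assert abs(a - b) < 1e-8
```

The Pfaff identity gives a quick sanity check that any faster evaluation scheme (such as the paper's reciprocal-argument transformation) is returning the right values.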

23 pages, 2363 KiB  
Review
Handover Decisions for Ultra-Dense Networks in Smart Cities: A Survey
by Akzhibek Amirova, Ibraheem Shayea, Didar Yedilkhan, Laura Aldasheva and Alma Zakirova
Technologies 2025, 13(8), 313; https://doi.org/10.3390/technologies13080313 - 23 Jul 2025
Viewed by 526
Abstract
Handover (HO) management plays a key role in ensuring uninterrupted connectivity across evolving wireless networks. While previous generations such as 4G and 5G have introduced several HO strategies, these techniques are insufficient to meet the rigorous demands of sixth-generation (6G) networks in ultra-dense, heterogeneous smart city environments. Existing studies often fail to provide integrated HO solutions that consider key concerns such as energy efficiency, security vulnerabilities, and interoperability across diverse network domains, including terrestrial, aerial, and satellite systems. Moreover, the dynamic and high-mobility nature of smart city ecosystems further complicates real-time HO decision-making. This survey aims to highlight these critical gaps by systematically categorizing state-of-the-art HO approaches into AI-based, fuzzy logic-based, and hybrid frameworks, while evaluating their performance against emerging 6G requirements. Future research directions are also outlined, emphasizing the development of lightweight AI–fuzzy hybrid models for real-time decision-making, the implementation of decentralized security mechanisms using blockchain, and the need for global standardization to enable seamless handovers across multi-domain networks. The key outcome of this review is a structured and in-depth synthesis of current advancements, which serves as a foundational reference for researchers and engineers aiming to design intelligent, scalable, and secure HO mechanisms that can support the operational complexity of next-generation smart cities. Full article
(This article belongs to the Section Information and Communication Technologies)

21 pages, 1476 KiB  
Article
AI-Driven Handover Management and Load Balancing Optimization in Ultra-Dense 5G/6G Cellular Networks
by Chaima Chabira, Ibraheem Shayea, Gulsaya Nurzhaubayeva, Laura Aldasheva, Didar Yedilkhan and Saule Amanzholova
Technologies 2025, 13(7), 276; https://doi.org/10.3390/technologies13070276 - 1 Jul 2025
Cited by 1 | Viewed by 1187
Abstract
This paper presents a comprehensive review of handover management and load balancing optimization (LBO) in ultra-dense 5G and emerging 6G cellular networks. With the increasing deployment of small cells and the rapid growth of data traffic, these networks face significant challenges in ensuring seamless mobility and efficient resource allocation. Traditional handover and load balancing techniques, primarily designed for 4G systems, are no longer sufficient to address the complexity of heterogeneous network environments that incorporate millimeter-wave communication, Internet of Things (IoT) devices, and unmanned aerial vehicles (UAVs). The review focuses on how recent advances in artificial intelligence (AI), particularly machine learning (ML) and deep learning (DL), are being applied to improve predictive handover decisions and enable real-time, adaptive load distribution. AI-driven solutions can significantly reduce handover failures, latency, and network congestion, while improving overall user experience and quality of service (QoS). This paper surveys state-of-the-art research on these techniques, categorizing them according to their application domains and evaluating their performance benefits and limitations. Furthermore, the paper discusses the integration of intelligent handover and load balancing methods in smart city scenarios, where ultra-dense networks must support diverse services with high reliability and low latency. Key research gaps are also identified, including the need for standardized datasets, energy-efficient AI models, and context-aware mobility strategies. Overall, this review aims to guide future research and development in designing robust, AI-assisted mobility and resource management frameworks for next-generation wireless systems. Full article

19 pages, 8477 KiB  
Article
Wideband Dual-Polarized PRGW Antenna Array with High Isolation for Millimeter-Wave IoT Applications
by Zahra Mousavirazi, Mohamed Mamdouh M. Ali, Abdel R. Sebak and Tayeb A. Denidni
Sensors 2025, 25(11), 3387; https://doi.org/10.3390/s25113387 - 28 May 2025
Viewed by 663
Abstract
This work presents a novel dual-polarized antenna array tailored for Internet of Things (IoT) applications, specifically designed to operate in the millimeter-wave (mm-wave) spectrum within the frequency range of 30–60 GHz. Leveraging printed ridge gap waveguide (PRGW) technology, the antenna ensures robust performance by eliminating parasitic radiation from the feed network, thus significantly enhancing the reliability and efficiency required by IoT communication systems, particularly for smart cities, autonomous vehicles, and high-speed sensor networks. The proposed antenna achieves superior radiation characteristics through a cross-shaped magneto-electric (ME) dipole backed by an artificial magnetic conductor (AMC) cavity and electromagnetic bandgap (EBG) structures. These features suppress surface waves, reduce edge diffraction, and minimize back-lobe emissions, enabling stable, high-quality IoT connectivity. The antenna demonstrates a wide impedance bandwidth of 24% centered at 30 GHz and exceptional isolation exceeding 40 dB, ensuring interference-free dual-polarized operation crucial for densely populated IoT environments. Fabrication and testing validate the design, consistently achieving a gain of approximately 13.88 dBi across the operational bandwidth. The antenna’s performance effectively addresses the critical requirements of emerging IoT systems, including ultra-high data throughput, reduced latency, and robust wireless connectivity, essential for real-time applications such as healthcare monitoring, vehicular communication, and smart infrastructure. Full article
(This article belongs to the Special Issue Design and Measurement of Millimeter-Wave Antennas)

23 pages, 787 KiB  
Article
Computation Offloading and Resource Allocation for Energy-Harvested MEC in an Ultra-Dense Network
by Dedi Triyanto, I Wayan Mustika and Widyawan
Sensors 2025, 25(6), 1722; https://doi.org/10.3390/s25061722 - 10 Mar 2025
Viewed by 956
Abstract
Mobile edge computing (MEC) is a modern technique that has led to substantial progress in wireless networks. To address the challenge of efficient task implementation in resource-limited environments, this work strengthens system performance through resource allocation based on fairness and energy efficiency. Integration of energy-harvesting (EH) technology with MEC improves sustainability by optimizing the power consumption of mobile devices, which is crucial to the efficiency of task execution. The combination of MEC and an ultra-dense network (UDN) is essential in fifth-generation networks to fulfill the computing requirements of ultra-low-latency applications. In this study, issues related to computation offloading and resource allocation are addressed using the Lyapunov mixed-integer linear programming (MILP)-based optimal cost (LYMOC) technique. The optimization problem is solved using the Lyapunov drift-plus-penalty method. Subsequently, the MILP approach is employed to select the optimal offloading option while ensuring fairness-oriented resource allocation among users to improve overall system performance and user satisfaction. Unlike conventional approaches, which often overlook fairness in dense networks, the proposed method prioritizes fairness-oriented resource allocation, preventing service degradation and enhancing network efficiency. Overall, the results of simulation studies demonstrate that the LYMOC algorithm may considerably decrease the overall cost of system execution when compared with the Lyapunov–MILP-based short-distance complete local execution algorithm and the full offloading-computation method. Full article
(This article belongs to the Special Issue Advanced Management of Fog/Edge Networks and IoT Sensors Devices)
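The Lyapunov drift-plus-penalty idea behind schemes like the one in this abstract can be sketched generically: each time slot, a virtual backlog queue Q is maintained, and the controller picks the action minimizing V·cost + Q·(arrival − service), so a growing backlog gradually overrides the energy penalty. The toy below is an illustration of that principle only, not the paper's LYMOC scheme; all rates and costs are invented.

```python
# Toy drift-plus-penalty offloading decision (illustrative; not the LYMOC
# scheme from the paper). Each slot we choose between local execution and
# offloading by minimizing  V * energy_cost + Q * (arrival - service),
# where Q is a virtual task-backlog queue and V trades energy against delay.
import random

def drift_plus_penalty_step(Q, arrival_bits, V,
                            local_rate, local_energy,
                            offload_rate, offload_energy):
    """Return (choice, new_Q): pick the action with the smaller
    drift-plus-penalty score, then update the backlog queue."""
    scores = {
        "local":   V * local_energy   + Q * (arrival_bits - local_rate),
        "offload": V * offload_energy + Q * (arrival_bits - offload_rate),
    }
    choice = min(scores, key=scores.get)
    served = local_rate if choice == "local" else offload_rate
    new_Q = max(Q + arrival_bits - served, 0.0)
    return choice, new_Q

if __name__ == "__main__":
    random.seed(0)
    Q = 0.0
    for slot in range(20):
        arrival = random.uniform(0.5, 2.0)           # Mbits arriving this slot
        choice, Q = drift_plus_penalty_step(
            Q, arrival, V=2.0,
            local_rate=1.0, local_energy=0.2,        # slow but cheap
            offload_rate=3.0, offload_energy=1.0)    # fast but costly
        print(f"slot {slot:2d}: Q={Q:5.2f}  -> {choice}")
```

With these invented numbers the controller runs tasks locally while the backlog is small and switches to offloading once Q grows, which is exactly the stability-versus-cost trade-off the drift-plus-penalty method formalizes.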

31 pages, 3954 KiB  
Review
A Review on Congestion Mitigation Techniques in Ultra-Dense Wireless Sensor Networks: State-of-the-Art Future Emerging Artificial Intelligence-Based Solutions
by Abdullah Umar, Zubair Khalid, Mohammed Ali, Mohammed Abazeed, Ali Alqahtani, Rahat Ullah and Hashim Safdar
Appl. Sci. 2023, 13(22), 12384; https://doi.org/10.3390/app132212384 - 16 Nov 2023
Cited by 4 | Viewed by 2899
Abstract
The Internet of Things (IoT) and wireless sensor networks (WSNs) have evolved rapidly due to technological breakthroughs. WSNs generate high traffic due to the growing number of sensor nodes. Congestion is one of several problems caused by the huge amount of data in WSNs. When wireless network resources are limited and IoT devices require more and more resources, congestion occurs in extremely dense WSN-based IoT networks. Reduced throughput, reduced network capacity, and reduced energy efficiency within WSNs are all effects of congestion. These consequences eventually lead to network outages due to underutilized network resources, increased network operating costs, and significantly degraded quality of service (QoS). Therefore, it is critical to deal with congestion in WSN-based IoT networks. Researchers have developed a number of approaches to address this problem, with new solutions based on artificial intelligence (AI) standing out. This research examines how new AI-based algorithms contribute to congestion mitigation in WSN-based IoT networks and the various congestion mitigation strategies that have helped reduce congestion. This study also highlights the limitations of AI-based solutions, including where and why they are used in WSNs, and provides a comparative review of the current literature, which makes this study novel. The study concludes with a discussion of its significance and potential future research topics. The survey of congestion reduction in ultra-dense WSN-based IoT networks, covering the current state of the art and emerging future solutions, demonstrates the significant potential of these approaches for reducing WSN congestion. These solutions contribute to network optimization, throughput enhancement, quality of service improvement, network capacity expansion, and overall WSN efficiency improvement. Full article

40 pages, 3570 KiB  
Review
Emerging Technologies for 6G Communication Networks: Machine Learning Approaches
by Annisa Anggun Puspitasari, To Truong An, Mohammed H. Alsharif and Byung Moo Lee
Sensors 2023, 23(18), 7709; https://doi.org/10.3390/s23187709 - 6 Sep 2023
Cited by 48 | Viewed by 10592
Abstract
The fifth generation achieved tremendous success, which brings high hopes for the next generation, as evidenced by the sixth generation (6G) key performance indicators, which include ultra-reliable low latency communication (URLLC), extremely high data rate, high energy and spectral efficiency, ultra-dense connectivity, integrated sensing and communication, and secure communication. Emerging technologies such as intelligent reflecting surface (IRS), unmanned aerial vehicles (UAVs), non-orthogonal multiple access (NOMA), and others have the ability to provide communications for massive numbers of users, but at the cost of high overhead and computational complexity. This will address concerns over the stringent 6G requirements. However, optimizing system functionality with these new technologies is difficult for conventional mathematical solutions. Therefore, using ML algorithms and their derivatives could be the right solution. The present study aims to offer a thorough and organized overview of the various machine learning (ML), deep learning (DL), and reinforcement learning (RL) algorithms concerning the emerging 6G technologies. This study is motivated by the fact that there is a lack of research on the significance of these algorithms in this specific context. This study examines the potential of ML algorithms and their derivatives in optimizing emerging technologies to align with the visions and requirements of the 6G network, which is crucial to ushering in a new era of communication marked by substantial advancements. This study highlights potential challenges for wireless communications in 6G networks and suggests insights into possible ML algorithms and their derivatives as possible solutions. Finally, the survey concludes that integrating ML algorithms and emerging technologies will play a vital role in developing 6G networks. Full article
(This article belongs to the Special Issue Communication, Sensing and Localization in 6G Systems)

26 pages, 11234 KiB  
Article
Algorithm for Topology Search Using Dilution of Precision Criterion in Ultra-Dense Network Positioning Service Area
by Grigoriy Fokin and Andrey Koucheryavy
Mathematics 2023, 11(10), 2227; https://doi.org/10.3390/math11102227 - 9 May 2023
Cited by 1 | Viewed by 1923
Abstract
User equipment (UE) location estimation in emerging 5G/B5G/6G Ultra-Dense Networks (UDNs) is a breakthrough technology in future wireless info-communication ecosystems. Apart from communication aspects, network infrastructure densification promises significant improvement in UE positioning accuracy. Unlike networks of previous generations, an increased number of gNodeBs (gNBs) per unit area and/or volume in UDNs makes it possible to perform measurements for UE positioning only with those base stations whose topologies are most suitable from the geometric point of view. Quantitative measures of gNB topology suitability include the horizontal (HDOP), vertical (VDOP), and position (PDOP) dilution of precision (DOP) criteria on the plane, in height, and in space, respectively. In the current work, we formalize a set of methods for gNB topology search using time of arrival (TOA), time difference of arrival (TDOA), angle of arrival (AOA), and combined TOA–AOA and TDOA–AOA measurements. The background of the topology search using DOP criteria is a significantly increased number of gNBs per unit volume in UDNs. Based on a simulation, we propose a novel approach for a topology search in a positioning service area, resulting in a PDOP less than one for the Gazprom Arena with only five gNBs. The contribution of the current research includes an algorithm and software for an iterative search over all possible gNB and UE locations in space, minimizing UE geometric DOP. The practical application of the algorithm is the gNB topology substantiation for the given positioning scenarios in 5G/B5G/6G UDNs. Full article
(This article belongs to the Section E: Applied Mathematics)
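The HDOP/VDOP/PDOP criteria in this abstract come straight from the positioning geometry matrix: stack the unit line-of-sight vectors from the UE to each gNB into H, form Q = (HᵀH)⁻¹, and read the DOPs off Q's diagonal. A minimal TOA-style sketch (coordinates are illustrative, not the paper's Gazprom Arena scenario):

```python
# Sketch: horizontal/vertical/position dilution of precision (HDOP/VDOP/PDOP)
# for a candidate gNB topology, TOA-style. Rows of H are unit line-of-sight
# vectors from the UE to each gNB; Q = (H^T H)^-1 shapes the error covariance.
# Coordinates below are illustrative, not the paper's Gazprom Arena scenario.
import numpy as np

def dops(ue, gnbs):
    """Return (HDOP, VDOP, PDOP) for UE position `ue` (3-vector)
    and gNB positions `gnbs` (n x 3 array-like), TOA geometry."""
    diff = np.asarray(gnbs, float) - np.asarray(ue, float)
    H = diff / np.linalg.norm(diff, axis=1, keepdims=True)  # unit LOS vectors
    Q = np.linalg.inv(H.T @ H)
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])   # plane
    vdop = np.sqrt(Q[2, 2])             # height
    pdop = np.sqrt(np.trace(Q))         # space; PDOP^2 = HDOP^2 + VDOP^2
    return hdop, vdop, pdop

if __name__ == "__main__":
    ue = [0.0, 0.0, 1.5]
    # Four rooftop gNBs around the UE plus one overhead: good 3-D geometry.
    gnbs = [[100, 0, 30], [-100, 0, 30], [0, 100, 30], [0, -100, 30], [0, 0, 80]]
    h, v, p = dops(ue, gnbs)
    print(f"HDOP={h:.2f}  VDOP={v:.2f}  PDOP={p:.2f}")
```

A topology search of the kind the paper describes amounts to evaluating `dops` over candidate gNB placements and keeping the configuration that minimizes PDOP over the service area.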

20 pages, 1232 KiB  
Article
Backhaul Capacity-Limited Joint User Association and Power Allocation Scheme in Ultra-Dense Millimeter-Wave Networks
by Zhiwei Si, Gang Chuai, Kaisa Zhang, Weidong Gao, Xiangyu Chen and Xuewen Liu
Entropy 2023, 25(3), 409; https://doi.org/10.3390/e25030409 - 23 Feb 2023
Cited by 4 | Viewed by 1958
Abstract
Millimeter-wave (mmWave) communication is considered a promising technology for fifth-generation (5G) wireless communications systems since it can greatly improve system throughput. Unfortunately, because of its extremely high frequency, mmWave transmission suffers from the signal blocking problem, which leads to the deterioration of transmission performance. In this paper, we solve this problem by combining an ultra-dense network (UDN) with a user-centric virtual cell architecture. The deployment of dense small base stations (SBSs) in a UDN can reduce the transmission distance of signals. The user-centric virtual cell architecture mitigates and exploits interference to improve throughput by using coordinated multipoint (CoMP) transmission technology. Nonetheless, the backhaul burden is heavy and interbeam interference is still severe. Therefore, we propose a novel iterative backhaul capacity-limited joint user association and power allocation (JUAPA) scheme in ultra-dense mmWave networks under a user-centric virtual cell architecture. To mitigate interference and satisfy the quality of service (QoS) requirements of users, a nonconvex system throughput optimization problem is formulated. To solve this intractable optimization problem, we divide it into two alternating optimization subproblems, i.e., user association and power allocation. During each iteration, a many-to-many matching algorithm is designed to solve user association. Subsequently, we perform power allocation optimization using a successive convex approximation (SCA) algorithm. The results confirm that the performance of the proposed scheme is close to that of the exhaustive searching scheme while greatly reducing complexity, and is clearly superior to that of traditional schemes in improving system throughput and satisfying QoS requirements. Full article

20 pages, 4649 KiB  
Article
A Reinforcement Learning Handover Parameter Adaptation Method Based on LSTM-Aided Digital Twin for UDN
by Jiao He, Tianqi Xiang, Yixin Wang, Huiyuan Ruan and Xin Zhang
Sensors 2023, 23(4), 2191; https://doi.org/10.3390/s23042191 - 15 Feb 2023
Cited by 8 | Viewed by 2944
Abstract
Adaptation of handover parameters in ultra-dense networks has always been one of the key issues in optimizing network performance. Aiming at the optimization goal of effective handover ratio, this paper proposes a deep Q-learning (DQN) method that dynamically selects handover parameters according to wireless signal fading conditions. This approach seeks good backward compatibility. In order to enhance the efficiency and performance of the DQN method, Long Short Term Memory (LSTM) is used to build a digital twin and assist the DQN algorithm to achieve a more efficient search. Simulation experiments prove that the enhanced method has a faster convergence speed than the ordinary DQN method, and at the same time, achieves an average effective handover ratio increase of 2.7%. Moreover, in different wireless signal fading intervals, the method proposed in this paper has achieved better performance. Full article

15 pages, 5508 KiB  
Article
Adaptive Handover Decision Using Fuzzy Logic for 5G Ultra-Dense Networks
by Wen-Shyang Hwang, Teng-Yu Cheng, Yan-Jing Wu and Ming-Hua Cheng
Electronics 2022, 11(20), 3278; https://doi.org/10.3390/electronics11203278 - 12 Oct 2022
Cited by 31 | Viewed by 3137
Abstract
With the explosive increase in traffic volume in fifth-generation (5G) mobile wireless networks, an ultra-dense network (UDN) architecture, composed of highly concentrated millimeter-wave base stations within the fourth-generation (4G) system, has been developed. User equipment (UE) may encounter more frequent handover opportunities when moving in a UDN. Conventional handover schemes are too simple to adapt to the diverse handover scenarios encountered in 5G UDNs because they consider only UE signal strength. Unnecessary handovers aggravate the ping-pong effect and degrade the quality of service of cellular networks. Fuzzy logic (FL) is considered the best technique to unravel the handover problem in high-density small-cell scenarios for 4G/5G networks. In this paper, we propose an FL-based handover scheme to dynamically adjust the values of two handover parameters, namely handover margin (HOM) and time to trigger (TTT), with respect to each UE. The proposed scheme, abbreviated as FLDHDT, dynamically adjusts TTT in addition to HOM, using the signal-to-interference-plus-noise ratio and horizontal moving speed of the UE as inputs to the FL controller. To demonstrate the effectiveness and superiority of FLDHDT, we perform simulations using the well-known ns-3 simulator. The performance measures include the number of handovers, overall system throughput, and ping-pong ratio. The simulation results demonstrate that FLDHDT improves the handover performance of 5G UDNs in terms of the number of handovers, ping-pong ratio, and overall system throughput compared to a conventional handover scheme, namely Event A3, and an FL-based handover scheme with dynamic adjustment of only HOM. Full article
(This article belongs to the Special Issue Advances in Millimeter-Wave Cellular Networks)
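The FL-controller idea in this abstract can be sketched with a tiny Mamdani-style pipeline: fuzzify SINR and UE speed with triangular memberships, fire a small rule base, and defuzzify to HOM and TTT values. Everything below (membership breakpoints, rule outputs) is invented for illustration and is not the FLDHDT design itself; the qualitative policy is only that fast, low-SINR UEs get small margins and short triggers while slow, high-SINR UEs get the opposite, to suppress ping-pong.

```python
# Illustrative fuzzy-logic handover-parameter controller in the spirit of an
# FL-based HOM/TTT scheme. All membership breakpoints and rule outputs are
# invented for illustration; they are not the paper's FLDHDT parameters.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_hom_ttt(sinr_db, speed_kmh):
    """Return (HOM in dB, TTT in ms) via min-AND rules and a weighted-average
    (centroid-style) defuzzification."""
    sinr = {"low": tri(sinr_db, -10, 0, 10),
            "mid": tri(sinr_db, 0, 10, 20),
            "high": tri(sinr_db, 10, 20, 30)}
    speed = {"slow": tri(speed_kmh, -1, 0, 30),
             "fast": tri(speed_kmh, 20, 60, 120)}
    # Rule -> (HOM dB, TTT ms): fast/low-SINR UEs hand over promptly;
    # slow/high-SINR UEs get large margins and long triggers.
    rules = [
        (min(sinr["low"], speed["fast"]), 1.0, 40),
        (min(sinr["low"], speed["slow"]), 2.0, 100),
        (min(sinr["mid"], speed["fast"]), 2.0, 80),
        (min(sinr["mid"], speed["slow"]), 3.0, 160),
        (min(sinr["high"], speed["fast"]), 4.0, 160),
        (min(sinr["high"], speed["slow"]), 6.0, 320),
    ]
    w = sum(r[0] for r in rules) or 1.0
    hom = sum(r[0] * r[1] for r in rules) / w
    ttt = sum(r[0] * r[2] for r in rules) / w
    return hom, ttt

if __name__ == "__main__":
    for sinr_db, v in [(2, 90), (15, 5)]:
        hom, ttt = fuzzy_hom_ttt(sinr_db, v)
        print(f"SINR={sinr_db:3d} dB, speed={v:3d} km/h -> "
              f"HOM={hom:.1f} dB, TTT={ttt:.0f} ms")
```

Running it shows the intended monotonicity: a fast UE at poor SINR gets a smaller HOM and shorter TTT than a slow UE at good SINR.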

13 pages, 817 KiB  
Article
Self-Optimizing Traffic Steering for 5G mmWave Heterogeneous Networks
by Jun Zeng, Hao Wang and Wei Luo
Sensors 2022, 22(19), 7112; https://doi.org/10.3390/s22197112 - 20 Sep 2022
Cited by 1 | Viewed by 2472
Abstract
Driven by growing mobile traffic, millimeter wave (mmWave) communications have recently been developed to enhance wireless network capacity. Due to insufficient coverage and the lack of support for mobility, mmWave is often deployed in the ultra-dense small cells of the 5G heterogeneous network. In this article, we first summarize the characteristics of the 5G heterogeneous network from the viewpoints of devices, spectra, and networks. We then propose a triple-band network structure which incorporates licensed bands, sub-6GHz unlicensed bands, and mmWave bands to support various types of mobile users. Based on the novel network structure, we further propose a self-optimizing traffic steering strategy which can intelligently steer traffic to specific networks and spectra according to the dynamic network and traffic environments. Several use cases are also discussed to facilitate the implementation of our proposals. Finally, we present numerical results to demonstrate that the proposed network structure and strategy can effectively enhance the system throughput and energy efficiency. Full article
(This article belongs to the Special Issue Advanced Technologies in 6G Heterogeneous Networks)

23 pages, 4174 KiB  
Article
Novel Channel/QoS Aware Downlink Scheduler for Next-Generation Cellular Networks
by Dalia H. Y. Taha, Huseyin Haci and Ali Serener
Electronics 2022, 11(18), 2895; https://doi.org/10.3390/electronics11182895 - 13 Sep 2022
Cited by 6 | Viewed by 2302
Abstract
Downlink schedulers play a vital part in current and next-generation wireless networks. A next-generation downlink scheduler should satisfy the demand for different requirements, such as dealing with ultra-dense networks and the need to run real-time (RT) and non-real-time (nRT) applications, with a high quality of service (QoS). Many researchers have developed various schedulers for these, but none have introduced one scheduler to target them all. This paper introduces a novel channel/QoS aware downlink scheduler algorithm, called the Advanced Fair Throughput Optimized Scheduler (AFTOS), for ultra-dense networks. AFTOS is a multi-QoS scheduler that aims to maximize system spectrum efficiency and user throughput with enhanced fairness, delay, and packet loss ratio (PLR). It is capable of handling RT and nRT traffic. We developed two new policies, called Adjusted Largest Weighted Delay First (ALWDF) and Fair Throughput Optimized Scheduler (FTOS), for RT and nRT traffic, and then joined them to form our novel downlink scheduler, AFTOS. To evaluate the suggested algorithm, we undertook experiments to determine the ideal parameter values for the proposed approaches and compared the proposed solution to current best practices. The findings prove that the AFTOS algorithm can achieve its objectives, outperforming the alternative techniques. Full article
(This article belongs to the Section Microwave and Wireless Communications)
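The ALWDF/FTOS policies are the paper's own constructions; as a generic illustration of the channel/QoS-aware scheduling idea they build on, the classic largest-weighted-delay-first metric picks, per resource block, the user maximizing wᵢ = aᵢ · Dᵢ(t) · rᵢ(t)/R̄ᵢ (QoS weight times head-of-line delay times instantaneous-to-average rate ratio). The sketch below implements that textbook metric only; the user names and numbers are invented.

```python
# Generic channel/QoS-aware downlink scheduling metric in the spirit of
# largest-weighted-delay-first (M-LWDF). This is a textbook illustration,
# not the paper's ALWDF/FTOS/AFTOS policies; all numbers are invented.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    hol_delay_ms: float   # head-of-line packet delay D_i(t)
    inst_rate: float      # achievable rate this TTI, r_i(t) (Mbps)
    avg_rate: float       # long-term served rate, R_i (Mbps)
    qos_weight: float     # a_i: larger for tighter delay budgets

def mlwdf_metric(u: User) -> float:
    """a_i * D_i(t) * r_i(t) / R_i: favors delayed users on good channels."""
    return u.qos_weight * u.hol_delay_ms * u.inst_rate / max(u.avg_rate, 1e-9)

def schedule(users):
    """Pick the user with the largest metric for the next resource block."""
    return max(users, key=mlwdf_metric)

if __name__ == "__main__":
    users = [
        User("video_late", hol_delay_ms=80, inst_rate=10, avg_rate=5, qos_weight=1.5),
        User("web_fresh",  hol_delay_ms=5,  inst_rate=20, avg_rate=4, qos_weight=1.0),
        User("voip_edge",  hol_delay_ms=40, inst_rate=2,  avg_rate=1, qos_weight=2.0),
    ]
    for u in users:
        print(f"{u.name:10s} metric={mlwdf_metric(u):7.1f}")
    print("scheduled:", schedule(users).name)
```

The delay term supplies the RT deadline pressure and the rate ratio supplies the proportional-fairness pressure; a multi-QoS scheduler like AFTOS can be read as tuning how these two pressures are combined for RT versus nRT flows.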

49 pages, 6741 KiB  
Review
Interference Challenges and Management in B5G Network Design: A Comprehensive Review
by Osamah Thamer Hassan Alzubaidi, MHD Nour Hindia, Kaharudin Dimyati, Kamarul Ariffin Noordin, Amelia Natasya Abdul Wahab, Faizan Qamar and Rosilah Hassan
Electronics 2022, 11(18), 2842; https://doi.org/10.3390/electronics11182842 - 8 Sep 2022
Cited by 51 | Viewed by 8352
Abstract
Beyond Fifth Generation (B5G) networks are expected to be the most efficient cellular wireless networks with greater capacity, lower latency, and higher speed than the current networks. Key enabling technologies, such as millimeter-wave (mm-wave), beamforming, Massive Multiple-Input Multiple-Output (M-MIMO), Device-to-Device (D2D), Relay Node (RN), and Heterogeneous Networks (HetNets) are essential to enable the new network to keep growing. In the forthcoming wireless networks with massive random deployment, frequency re-use strategies and multiple low power nodes, severe interference issues will impact the system. Consequently, interference management represents the main challenge for future wireless networks, commonly referred to as B5G. This paper provides an overview of the interference issues relating to the B5G networks from the perspective of HetNets, D2D, Ultra-Dense Networks (UDNs), and Unmanned Aerial Vehicles (UAVs). Furthermore, the existing interference mitigation techniques are discussed by reviewing the latest relevant studies with a focus on their methods, advantages, limitations, and future directions. Moreover, the open issues and future directions to reduce the effects of interference are also presented. The findings of this work can act as a guide to better understand the current and developing methodologies to mitigate the interference issues in B5G networks. Full article
(This article belongs to the Special Issue New Challenges in 5G Networks Design)

38 pages, 2665 KiB  
Review
A Review of Energy Efficiency and Power Control Schemes in Ultra-Dense Cell-Free Massive MIMO Systems for Sustainable 6G Wireless Communication
by Agbotiname Lucky Imoize, Hope Ikoghene Obakhena, Francis Ifeanyi Anyasi and Samarendra Nath Sur
Sustainability 2022, 14(17), 11100; https://doi.org/10.3390/su141711100 - 5 Sep 2022
Cited by 54 | Viewed by 7345
Abstract
The traditional multiple input multiple output (MIMO) systems cannot provide very high Spectral Efficiency (SE), Energy Efficiency (EE), and link reliability, which are critical to guaranteeing the desired Quality of Experience (QoE) in 5G and beyond 5G wireless networks. To bridge this gap, ultra-dense cell-free massive MIMO (UD CF-mMIMO) systems are exploited to boost cell-edge performance and provide ultra-low latency in emerging wireless communication systems. This paper attempts to provide critical insights on high EE operation and power control schemes for maximizing the performance of UD CF-mMIMO systems. First, the recent advances in UD CF-mMIMO systems and the associated models are elaborated. The power consumption model, power consumption parts, and energy maximization techniques are discussed extensively. Further, the various power control optimization techniques are discussed comprehensively. Key findings from this study indicate an unprecedented growth in high-rate demands, leading to a significant increase in energy consumption. Additionally, substantial gains in EE require efficient utilization of optimal energy maximization techniques, green design, and dense deployment of massive antenna arrays. Overall, this review provides an elaborate discussion of the research gaps and proposes several research directions, critical challenges, and useful recommendations for future works in wireless communication systems. Full article
