Search Results (1,308)

Search Parameters:
Keywords = 5G wireless networks

21 pages, 4405 KB  
Article
Performance Benchmarking of 5G SA and NSA Networks for Wireless Data Transfer
by Miha Pipan, Marko Šimic and Niko Herakovič
J. Sens. Actuator Netw. 2026, 15(1), 18; https://doi.org/10.3390/jsan15010018 - 2 Feb 2026
Abstract
This paper presents test results of a performance comparison of 5G standalone (SA) and non-standalone (NSA) networks in the context of gathering data from remote sensors and machines. The study evaluates key network characteristics such as latency, throughput, jitter and packet loss (for the UDP protocol only) using standardized tests to gain insights into the impact of these factors on real-time and data-intensive communication. In addition, a range of communication protocols including OPC UA, Modbus, MQTT, AMQP, CoAP, EtherCAT and gRPC were tested to assess their efficiency, scalability and suitability across different transmitted data sizes. By conducting experiments in a controlled hardware environment, we analyzed the impact of the 5G architecture on protocol behavior and measured transmission performance at different data sizes and connection configurations. Particular attention is paid to protocol overhead, data transfer rates and responsiveness, which are crucial for industrial automation and IoT deployments. The results show that SA networks consistently offer lower latency and more stable performance, making them preferable where robust and low-latency data transfer is essential. In contrast, lightweight IoT protocols such as MQTT and CoAP demonstrate reliable operation in both SA and NSA environments due to their low overhead and adaptability. These insights are equally important for time-critical industrial protocols such as EtherCAT and OPC UA, where stability and responsiveness are crucial for automation and control. The study highlights current limitations of 5G networks in supporting both remote sensing and industrial use cases, while providing guidance for selecting the most suitable communication protocols depending on network infrastructure and application requirements. Moreover, the results indicate directions for configuring and optimizing future 5G networks to better meet the demands of remote sensing systems and Industry 4.0 environments. Full article
(This article belongs to the Section Communications and Networking)
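
The latency, jitter and packet-loss figures described above come from standardized round-trip tests. As a minimal sketch of such a probe, the Python snippet below (standard library only) sends timestamped UDP packets to an echo server and reports mean RTT, inter-packet jitter and loss; the echo-server address, packet count and payload size are assumptions for illustration, not values from the paper.

# Minimal UDP round-trip probe: latency, jitter and loss over a 5G link.
# The echo endpoint below is a placeholder, not part of the paper's setup.
import socket
import statistics
import time

ECHO_HOST, ECHO_PORT = "192.0.2.10", 9000      # hypothetical echo server
N_PACKETS, PAYLOAD, TIMEOUT_S = 200, b"x" * 256, 0.5

def probe():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(TIMEOUT_S)
    rtts = []
    for seq in range(N_PACKETS):
        t0 = time.perf_counter()
        sock.sendto(seq.to_bytes(4, "big") + PAYLOAD, (ECHO_HOST, ECHO_PORT))
        try:
            sock.recvfrom(2048)
            rtts.append((time.perf_counter() - t0) * 1000.0)   # RTT in ms
        except socket.timeout:
            pass                                               # counted as loss
    if not rtts:
        print("all packets lost")
        return
    loss = 100.0 * (N_PACKETS - len(rtts)) / N_PACKETS
    jitter = statistics.mean(abs(a - b) for a, b in zip(rtts, rtts[1:])) if len(rtts) > 1 else 0.0
    print(f"mean RTT {statistics.mean(rtts):.2f} ms, jitter {jitter:.2f} ms, loss {loss:.1f}%")

if __name__ == "__main__":
    probe()

Running the same probe over an SA and an NSA attach point gives the kind of side-by-side comparison the paper reports.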

22 pages, 4725 KB  
Article
Design of Multi-Source Fusion Wireless Acquisition System for Grid-Forming SVG Device Valve Hall
by Liqian Liao, Yuanwei Zhou, Guangyu Tang, Jiayi Ding, Ping Wang, Bo Yin, Liangbo Xie, Jie Zhang and Hongxin Zhong
Electronics 2026, 15(3), 641; https://doi.org/10.3390/electronics15030641 - 2 Feb 2026
Abstract
With the increasing deployment of grid-forming static var generators (GFM-SVG) in modern power systems, the reliability of the valve hall that houses the core power modules has become a critical concern. To overcome the limitations of conventional wired monitoring systems—complex cabling, poor scalability, and incomplete state perception—this paper proposes and implements a multi-source fusion wireless data acquisition system specifically designed for GFM-SVG valve halls. The system integrates acoustic, visual, and infrared sensing nodes into a wireless sensor network (WSN) to cooperatively capture thermal, acoustic, and visual multi-physics information about key components. A dual-mode communication scheme, using Wireless Fidelity (Wi-Fi) as the primary link and Fourth-Generation Mobile Communication Network (4G) as a backup channel, is adopted together with data encryption, automatic reconnection, and retransmission-checking mechanisms to ensure reliable operation in strong electromagnetic interference environments. The main innovation lies in a multi-source information fusion algorithm based on an improved Dempster–Shafer (D–S) evidence theory, which is combined with the object detection capability of the You Only Look Once, Version 8 (YOLOv8) model to effectively handle the uncertainty and conflict of heterogeneous data sources. This enables accurate identification and early warning of multiple types of faults, including local overheating, abnormal acoustic signatures, and coolant leakage. Experimental results demonstrate that the proposed system achieves a fault-diagnosis accuracy of 98.5%, significantly outperforming single-sensor approaches, and thus provides an efficient and intelligent operation-and-maintenance solution for ensuring the safe and stable operation of GFM-SVG equipment. Full article
(This article belongs to the Section Industrial Electronics)
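
For readers unfamiliar with D–S fusion, the sketch below applies the classical Dempster combination rule to two made-up evidence sources (a thermal and an acoustic node) over a small fault-hypothesis set; the paper's improved rule and its coupling to YOLOv8 detections are not reproduced here, and the hypothesis names and mass values are purely illustrative.

# Classical Dempster-Shafer combination of two mass functions
# (baseline rule only; the paper proposes an improved variant).
from itertools import product

def combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Illustrative masses over the hypotheses {overheat, leak, normal}.
H = frozenset({"overheat", "leak", "normal"})
thermal  = {frozenset({"overheat"}): 0.7, H: 0.3}
acoustic = {frozenset({"overheat", "leak"}): 0.6, frozenset({"normal"}): 0.1, H: 0.3}
print(combine(thermal, acoustic))   # mass concentrates on "overheat"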

20 pages, 2121 KB  
Article
Reconfigurable Wireless Channel Optimization and Low-Complexity Control Methods Driven by Intelligent Metasurfaces 2.0
by Xiaoguang Hu, Junpeng Cui, Rui Zhang and Quanrong Fang
Telecom 2026, 7(1), 15; https://doi.org/10.3390/telecom7010015 - 2 Feb 2026
Abstract
With the evolution of Reconfigurable Intelligent Surface (RIS) technology, its potential for dynamically optimizing wireless channels has garnered significant attention. However, existing methods still face challenges in real-time control in complex environments due to high computational complexity. To address this, this paper proposes a reconfigurable wireless channel optimization framework based on Intelligent Metasurfaces 2.0 and designs a low-complexity control strategy. The strategy integrates an adaptive adjustment mechanism and multi-dimensional feedback, aiming to reduce system computational load. Experimental results show that compared to traditional methods (such as MRC and MMSE), the proposed method improves signal transmission quality (SNR improvement of 3.8 dB) and system stability (exponential increase to 0.92). When compared to advanced deep reinforcement learning (DRL) and graph neural network (GNN) methods, it achieves similar signal quality while reducing computational overhead by 20.0% and energy consumption by approximately 32.4%. Ablation experiments further verify the effectiveness and synergistic role of the proposed core modules. This study provides a feasible approach toward high-efficiency, low-complexity dynamic channel optimization in 5G and future communication networks. Full article

15 pages, 13322 KB  
Article
A Cross-Layer Framework Integrating RF and OWC with Dynamic Modulation Scheme Selection for 6G Networks
by Ahmed Waheed, Borja Genoves Guzman, Somayeh Mohammady and Maite Brandt-Pearce
Sensors 2026, 26(3), 926; https://doi.org/10.3390/s26030926 - 1 Feb 2026
Viewed by 84
Abstract
With the rapid evolution of wireless networks, the need to explore novel technologies to meet the demands of future systems, particularly 6G, has become a significant challenge. One promising solution is integrating radio frequency (RF) and optical wireless communication (OWC) technologies to leverage their unique strengths. This paper introduces a novel model for integrating RF and OWC technologies within the framework of emerging 6G. The main objective of this approach is the dynamic technology selection (TS) and modulation scheme selection (MSS), which play a pivotal role in optimizing network efficiency and adapting to diverse 6G requirements. The proposed cross-layer architecture integrates the application layer, network layer based on a software-defined network (SDN), and physical layer consisting of a hybrid cell and software-defined radio with optical functionality (SDR-O). This approach facilitates real-time decision-making based on environmental factors and application requirements. Full article
(This article belongs to the Special Issue Recent Advances in Optical Wireless Communications)
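
As a rough illustration of the TS/MSS step, the sketch below makes a rule-based technology and modulation choice from an instantaneous link state; the thresholds, field names and modulation set are assumptions that stand in for the paper's SDN-driven decision logic rather than reproduce it.

# Toy technology-selection (TS) and modulation-scheme-selection (MSS) rule.
from dataclasses import dataclass

@dataclass
class LinkState:
    owc_snr_db: float        # optical link SNR (drops sharply when LOS is blocked)
    rf_snr_db: float         # radio link SNR
    latency_critical: bool   # application requirement from the upper layers

def select(link: LinkState) -> tuple[str, str]:
    """Pick (technology, modulation) from the instantaneous link state."""
    tech = "OWC" if link.owc_snr_db >= 15.0 else "RF"
    snr = link.owc_snr_db if tech == "OWC" else link.rf_snr_db
    if snr >= 25.0 and not link.latency_critical:
        mod = "64-QAM"       # favour throughput when the margin allows it
    elif snr >= 15.0:
        mod = "16-QAM"
    else:
        mod = "QPSK"         # fall back to the most robust scheme
    return tech, mod

print(select(LinkState(owc_snr_db=4.0, rf_snr_db=22.0, latency_critical=True)))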

17 pages, 858 KB  
Article
Large AI Model-Enhanced Digital Twin-Driven 6G Healthcare IoE
by Haoyuan Hu, Ziyi Song and Wenzao Shi
Electronics 2026, 15(3), 619; https://doi.org/10.3390/electronics15030619 - 31 Jan 2026
Viewed by 87
Abstract
The convergence of the Internet of Everything (IoE) and healthcare requires ultra-reliable, low-latency, and intelligent communication systems. Sixth-generation (6G) wireless networks, coupled with digital twin (DT) models and large AI models (LAMs), are envisioned to promise substantial and practically meaningful improvements in smart healthcare by enabling real-time monitoring, diagnosis, and personalized treatment. In this article, we propose an LAM-enhanced DT-driven network slicing framework for healthcare applications. The framework leverages large models to provide predictive insights and adaptive orchestration by creating virtual replicas of patients and medical devices that guide dynamic slice allocation. Reinforcement learning (RL) techniques are employed to optimize slice orchestration under uncertain traffic conditions, with LAMs augmenting decision-making through cognitive-level reasoning. Numerical results show that the proposed LAM–DT–RL framework reduces service-level agreement (SLA) violations by approximately 42–43% compared to a reinforcement-learning-only slicing strategy, while improving spectral efficiency and fairness among heterogeneous healthcare services. Finally, we outline open challenges and future research opportunities in integrating LAMs, DTs, and 6G for resilient healthcare IoE systems. Full article

41 pages, 1830 KB  
Article
Cross Layer Optimization Using AI/ML-Assisted Federated Edge Learning in 6G Networks
by Spyridon Louvros, AnupKumar Pandey, Brijesh Shah and Yashesh Buch
Future Internet 2026, 18(2), 71; https://doi.org/10.3390/fi18020071 - 30 Jan 2026
Viewed by 102
Abstract
This paper introduces a novel methodology that integrates 6G wireless Federated Edge Learning (FEEL) frameworks with MAC-to-PHY cross-layer optimization strategies. In the context of mobile edge computing, ensuring robust channel estimation across 6G network use cases typically presents critical challenges, particularly in managing data retransmissions. Inaccurate updates from distributed 6G devices can undermine the reliability of federated learning, affecting its overall performance. To address this, rather than relying on direct evaluations of the objective function, we propose an AI/ML-assisted global optimization algorithm based on a radial basis function (RBF) decision-making process to assess learned preference options. Full article
(This article belongs to the Special Issue Toward 6G Networks: Challenges and Technologies)
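
A generic flavour of RBF-surrogate-assisted global optimization, the family of methods the abstract points to, is sketched below with NumPy: an RBF interpolant is fitted to the objective values sampled so far and the next candidate is the minimizer of that surrogate. The toy objective, Gaussian kernel and sampling budget are assumptions and do not reproduce the FEEL-specific formulation.

# Sketch of RBF-surrogate-assisted global optimization (toy objective).
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Fit Gaussian-RBF interpolation weights to samples (X, y)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)
    return np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)

def rbf_predict(X, w, Xq, eps=1.0):
    """Evaluate the fitted surrogate at query points Xq."""
    d = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ w

f = lambda x: np.sum((x - 0.3) ** 2, axis=-1)   # stand-in for the expensive objective

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(10, 2))             # initial design points
y = f(X)
for _ in range(20):                             # surrogate-guided search loop
    w = rbf_fit(X, y)
    cand = rng.uniform(0, 1, size=(200, 2))     # random candidate pool
    x_next = cand[np.argmin(rbf_predict(X, w, cand))]
    X, y = np.vstack([X, x_next]), np.append(y, f(x_next[None, :]))
print("best point:", X[np.argmin(y)], "value:", y.min())
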
17 pages, 941 KB  
Article
AI-Enabled Autoencoder-Based Physical Layer Design for 6G Communication Systems
by Andreani Christopoulou, Dimitrios Kosmanos, Apostolos Xenakis and Costas Chaikalis
Electronics 2026, 15(3), 538; https://doi.org/10.3390/electronics15030538 - 26 Jan 2026
Viewed by 242
Abstract
Next-generation (6G) wireless communication systems are expected to operate under diverse channel conditions and structures, requiring flexible and data-driven communication schemes. As traditional techniques face limitations in complex and dynamic environments, trained communication architectures have emerged as promising alternatives. In this paper, we present a thorough study of deep-learning-trained physical layer components, focusing on autoencoder-based transceivers and neural network modules that enhance the receiver’s intelligence. We further investigate two essential deep learning capabilities for modern receivers—modulation classification using neural architectures and generative data synthesis for channel estimation training. Moreover, the proposed models and simulation framework provide insight into how deep learning can be systematically integrated into the physical layer to improve adaptability, robustness, and efficiency. Full article
(This article belongs to the Special Issue Advances in AI for 6G Signal Processing)
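
The autoencoder-transceiver idea the paper studies can be illustrated in a few lines of PyTorch: an encoder maps each message to n channel uses, an AWGN layer perturbs them, and a decoder classifies the received vector. The message-set size, SNR and training budget below are illustrative assumptions, not the paper's configuration.

# End-to-end autoencoder over an AWGN channel (learned-PHY sketch).
import torch
import torch.nn as nn

M, n, snr_db = 16, 7, 7.0                        # 16 messages over 7 channel uses
enc = nn.Sequential(nn.Linear(M, 32), nn.ReLU(), nn.Linear(32, n))
dec = nn.Sequential(nn.Linear(n, 32), nn.ReLU(), nn.Linear(32, M))
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
noise_std = 10 ** (-snr_db / 20)                 # unit average power per dimension

for step in range(2000):
    msgs = torch.randint(0, M, (256,))
    x = enc(nn.functional.one_hot(msgs, M).float())
    x = x / x.norm(dim=1, keepdim=True) * n ** 0.5   # average power constraint
    y = x + noise_std * torch.randn_like(x)          # AWGN channel
    loss = nn.functional.cross_entropy(dec(y), msgs)
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():                                # quick block-error check
    msgs = torch.randint(0, M, (10000,))
    x = enc(nn.functional.one_hot(msgs, M).float())
    x = x / x.norm(dim=1, keepdim=True) * n ** 0.5
    y = x + noise_std * torch.randn_like(x)
    bler = (dec(y).argmax(dim=1) != msgs).float().mean()
    print(f"block error rate = {bler:.4f}")
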
47 pages, 2599 KB  
Review
The Role of Artificial Intelligence in Next-Generation Handover Decision Techniques for UAVs over 6G Networks
by Mohammed Zaid, Rosdiadee Nordin and Ibraheem Shayea
Drones 2026, 10(2), 85; https://doi.org/10.3390/drones10020085 - 26 Jan 2026
Viewed by 180
Abstract
The rapid integration of unmanned aerial vehicles (UAVs) into next-generation wireless systems demands seamless and reliable handover (HO) mechanisms to ensure continuous connectivity. However, frequent topology changes, high mobility, and dynamic channel variations make traditional HO schemes inadequate for UAV-assisted 6G networks. This paper presents a comprehensive review of existing HO optimization studies, emphasizing artificial intelligence (AI) and machine learning (ML) approaches as enablers of intelligent mobility management. The surveyed works are categorized into three main scenarios: non-UAV HOs, UAVs acting as aerial base stations, and UAVs operating as user equipment, each examined under traditional rule-based and AI/ML-based paradigms. Comparative insights reveal that while conventional methods remain effective for static or low-mobility environments, AI- and ML-driven approaches significantly enhance adaptability, prediction accuracy, and overall network robustness. Emerging techniques such as deep reinforcement learning and federated learning (FL) demonstrate strong potential for proactive, scalable, and energy-efficient HO decisions in future 6G ecosystems. The paper concludes by outlining key open issues and identifying future directions toward hybrid, distributed, and context-aware learning frameworks for resilient UAV-enabled HO management. Full article
26 pages, 2167 KB  
Article
AI-Powered Service Robots for Smart Airport Operations: Real-World Implementation and Performance Analysis in Passenger Flow Management
by Eleni Giannopoulou, Panagiotis Demestichas, Panagiotis Katrakazas, Sophia Saliverou and Nikos Papagiannopoulos
Sensors 2026, 26(3), 806; https://doi.org/10.3390/s26030806 - 25 Jan 2026
Viewed by 318
Abstract
The proliferation of air travel demand necessitates innovative solutions to enhance passenger experience while optimizing airport operational efficiency. This paper presents the pilot-scale implementation and evaluation of an AI-powered service robot ecosystem integrated with thermal cameras and 5G wireless connectivity at Athens International Airport. The system addresses critical challenges in passenger flow management through real-time crowd analytics, congestion detection, and personalized robotic assistance. Eight strategically deployed thermal cameras monitor passenger movements across check-in areas, security zones, and departure entrances while employing privacy-by-design principles through thermal imaging technology that reduces personally identifiable information capture. A humanoid service robot, equipped with Robot Operating System navigation capabilities and natural language processing interfaces, provides real-time passenger assistance including flight information, wayfinding guidance, and congestion avoidance recommendations. The wi.move platform serves as the central intelligence hub, processing video streams through advanced computer vision algorithms to generate actionable insights including passenger count statistics, flow rate analysis, queue length monitoring, and anomaly detection. Formal trial evaluation conducted on 10 April 2025, with extended operational monitoring from April to June 2025, demonstrated strong technical performance with application round-trip latency achieving 42.9 milliseconds, perfect service reliability and availability ratings of one hundred percent, and comprehensive passenger satisfaction scores exceeding 4.3/5 across all evaluated dimensions. Results indicate promising potential for scalable deployment across major international airports, with identified requirements for sixth-generation network capabilities to support enhanced multi-robot coordination and advanced predictive analytics functionalities in future implementations. Full article
(This article belongs to the Section Sensors and Robotics)

54 pages, 3083 KB  
Review
A Survey on Green Wireless Sensing: Energy-Efficient Sensing via WiFi CSI and Lightweight Learning
by Rod Koo, Xihao Liang, Deepak Mishra and Aruna Seneviratne
Energies 2026, 19(2), 573; https://doi.org/10.3390/en19020573 - 22 Jan 2026
Viewed by 169
Abstract
Conventional sensing expends energy at three stages: powering dedicated sensors, transmitting measurements, and executing computationally intensive inference. Wireless sensing re-purposes WiFi channel state information (CSI) inherent in every packet, eliminating extra sensors and uplink traffic, though reliance on deep neural networks (DNNs), often trained and run on graphics processing units (GPUs), can negate these gains. This review highlights two core energy efficiency levers in CSI-based wireless sensing. First, ambient CSI harvesting cuts power use by an order of magnitude compared to radar and active Internet of Things (IoT) sensors. Second, integrated sensing and communication (ISAC) embeds sensing functionality into existing WiFi links, thereby reducing device count, battery waste, and carbon impact. We review conventional handcrafted and accuracy-first methods to set the stage for surveying green learning strategies and lightweight learning techniques, including compact hybrid neural architectures, pruning, knowledge distillation, quantisation, and semi-supervised training that preserve accuracy while reducing model size and memory footprint. We also discuss hardware co-design from low-power microcontrollers to edge application-specific integrated circuits (ASICs) and WiFi firmware extensions that align computation with platform constraints. Finally, we identify open challenges in domain-robust compression, multi-antenna calibration, energy-proportionate model scaling, and standardised joules-per-inference metrics. Our aim is a practical, battery-friendly wireless sensing stack ready for smart-home and 6G-era deployments. Full article
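
As one concrete example of the lightweight-learning levers surveyed here, the NumPy sketch below applies symmetric per-tensor int8 post-training quantisation to a single dense layer and reports the memory saving and the resulting activation error; the layer shape and data are made up for illustration.

# Toy post-training int8 quantisation of one dense layer's weights.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(0, 0.05, size=(128, 64)).astype(np.float32)    # FP32 weights

scale = np.abs(W).max() / 127.0                  # symmetric per-tensor scale
W_q = np.clip(np.round(W / scale), -127, 127).astype(np.int8)
W_deq = W_q.astype(np.float32) * scale           # dequantised view for inference

x = rng.normal(size=(1, 128)).astype(np.float32)
err = np.abs(x @ W - x @ W_deq).max()
print(f"memory: {W.nbytes} B -> {W_q.nbytes} B, max activation error {err:.5f}")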

45 pages, 5287 KB  
Systematic Review
Cybersecurity in Radio Frequency Technologies: A Scientometric and Systematic Review with Implications for IoT and Wireless Applications
by Patrícia Rodrigues de Araújo, José Antônio Moreira de Rezende, Décio Rennó de Mendonça Faria and Otávio de Souza Martins Gomes
Sensors 2026, 26(2), 747; https://doi.org/10.3390/s26020747 - 22 Jan 2026
Viewed by 177
Abstract
Cybersecurity in radio frequency (RF) technologies has become a critical concern, driven by the expansion of connected systems in urban and industrial environments. Although research on wireless networks and the Internet of Things (IoT) has advanced, comprehensive studies that provide a global and integrated view of cybersecurity development in this field remain limited. This work presents a scientometric and systematic review of international publications from 2009 to 2025, integrating the PRISMA protocol with semantic screening supported by a Large Language Model to enhance classification accuracy and reproducibility. The analysis identified two interdependent axes: one focusing on signal integrity and authentication in GNSS systems and cellular networks; the other addressing the resilience of IoT networks, both strongly associated with spoofing and jamming, as well as replay, relay, eavesdropping, and man-in-the-middle (MitM) attacks. The results highlight the relevance of RF cybersecurity in securing communication infrastructures and expose gaps in widely adopted technologies such as RFID, NFC, BLE, ZigBee, LoRa, Wi-Fi, and unlicensed ISM bands, as well as in emerging areas like terahertz and 6G. These gaps directly affect the reliability and availability of IoT and wireless communication systems, increasing security risks in large-scale deployments such as smart cities and cyber–physical infrastructures. Full article
(This article belongs to the Special Issue Cyber Security and Privacy in Internet of Things (IoT))

37 pages, 2717 KB  
Review
Synthetizing 6G KPIs for Diverse Future Use Cases: A Comprehensive Review of Emerging Standards, Technologies, and Societal Needs
by Shujat Ali, Asma Abu-Samah, Mohammed H. Alsharif, Rosdiadee Nordin, Nauman Saqib, Mohammed Sani Adam, Umawathy Techanamurthy, Manzareen Mustafa and Nor Fadzilah Abdullah
Future Internet 2026, 18(1), 63; https://doi.org/10.3390/fi18010063 - 21 Jan 2026
Viewed by 318
Abstract
The anticipated transition from 5G to 6G is driven not by incremental performance demands but by a widening mismatch between emerging application requirements and the capabilities of existing cellular systems. Despite rapid progress across 3GPP Releases 15–20, the current literature lacks a unified analysis that connects these standardization milestones to the concrete technical gaps that 6G must resolve. This study addresses this omission through a cross-release, application-driven review that traces how the evolution from enhanced mobile broadband to intelligent, sensing-integrated networks lays the foundation for three core 6G service pillars: immersive communication (IC), everything connected (EC), and high-precision positioning. By examining use cases such as holographic telepresence, cooperative drone swarms, and large-scale Extended Reality (XR) ecosystems, this study exposes the limitations of today’s spectrum strategies, network architectures, and device capabilities and identifies the performance thresholds of Tbps-level throughput, sub-10 cm localization, sub-ms latency, and 10 M/km2 device density that next-generation systems must achieve. The novelty of this review lies in its synthesis of 3GPP advancements in XR, the non-terrestrial network (NTN), RedCap, ambient Internet of Things (IoT), and sustainability considerations into a cohesive key performance indicator (KPI) framework that links future services to the required architectural and protocol innovations, including AI-native design and sub-THz operation. Positioned against global initiatives such as Hexa-X and the Next G Alliance, this paper argues that 6G represents a fundamental redesign of wireless communication rather than an incremental advancement of 5G, driven by intelligence, adaptability, and long-term energy efficiency to satisfy diverse use cases and requirements. Full article

22 pages, 2909 KB  
Article
Study on Quality of AI Service Guarantee in Digital Twin Networks for XR Scenarios
by Jinfei Zhou, Yuehong Gao, Xinyao Wang, Yiran Li and Ziqi Zhao
Electronics 2026, 15(2), 344; https://doi.org/10.3390/electronics15020344 - 13 Jan 2026
Viewed by 174
Abstract
In line with the trend of “native intelligence”, artificial intelligence (AI) will be more deeply integrated into communication networks in the future. Quality of AI service (QoAIS) will become an important factor in measuring the performance of native AI wireless networks. Networks should reasonably allocate multi-dimensional resources to ensure QoAIS for users. Extended Reality (XR) is one of the important application scenarios for future 6G networks. To ensure both the accuracy and latency requirements of users for AI services are met, this paper proposes a resource allocation algorithm called Asynchronous Multi-Agent Deep Deterministic Policy Gradient with Independent State and Action (A-MADDPG-ISA). The proposed algorithm supports agents to use different dimensional state spaces and action spaces; therefore, it enables agents to address different strategy issues separately and makes the algorithm design more flexible. The actions of different agents are executed asynchronously, enabling actions outputted earlier to be transmitted as additional information to other agents. The simulation results show that the proposed algorithm has a 10.41% improvement compared to MADDPG (Multi-Agent Deep Deterministic Policy Gradient). Furthermore, to overcome the limitations of directly applying AI or manual rule-based schemes to real networks, this research establishes a digital twin network (DTN) system and designs pre-validation functionality. The DTN system contributes to better ensuring users’ QoAIS. Full article

28 pages, 2487 KB  
Article
Optimal Resource Allocation via Unified Closed-Form Solutions for SWIPT Multi-Hop DF Relay Networks
by Yang Yu, Xiaoqing Tang and Guihui Xie
Sensors 2026, 26(2), 512; https://doi.org/10.3390/s26020512 - 12 Jan 2026
Viewed by 265
Abstract
Multi-hop relaying can overcome the problems of limited single-hop communication distance, poor signal quality, or the inability to communicate directly by “relaying” data through multiple intermediate nodes. It serves as the cornerstone for building large-scale, highly reliable, and self-adapting wireless networks, especially for the Internet of Things (IoT) and future 6G. This paper focuses on a decode-and-forward (DF) multi-hop relay network that employs simultaneous wireless information and power transfer (SWIPT) technology, with relays operating in a passive state. We first investigate the optimization of the power splitting (PS) ratio at each relay, given the source node transmit power, to maximize end-to-end network throughput. Subsequently, we jointly optimize the PS ratios and the source transmit power to minimize the source transmit power while satisfying the system’s minimum quality of service (QoS) requirement. Although both problems are non-convex, they can be reformulated as convex optimization problems. Closed-form optimal solutions are then derived based on the Karush–Kuhn–Tucker (KKT) conditions and a recursive method, respectively. Moreover, we find that the closed-form optimal solutions for the PS ratios corresponding to the two problems are identical. Through simulations, we validate that the performance of the two proposed schemes based on the closed-form solutions is optimal, while also demonstrating their extremely fast execution, thereby confirming the deployment value of both schemes in practical communication scenarios. Full article
(This article belongs to the Special Issue Wireless Communication and Networking for IoT)
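
For orientation, a common way to write the per-hop model behind such PS-based SWIPT DF systems is sketched below, where rho_i is the power-splitting ratio at relay i, eta the energy-conversion efficiency, P_{i-1} the transmit power of the preceding node, h_i the hop channel, sigma^2 the noise power, T the block duration and K the number of hops; the paper's exact system model, constraints and closed-form expressions may differ. Because the end-to-end DF rate is the minimum hop rate, maximizing it drives the optimal rho_i toward values that balance the bottleneck hops, which is what makes a KKT-based closed form tractable.

\[
  E_i = \eta\,\rho_i\,P_{i-1}\,|h_i|^{2}\,T, \qquad
  R_i = \frac{1}{K}\,\log_2\!\left(1 + \frac{(1-\rho_i)\,P_{i-1}\,|h_i|^{2}}{\sigma^{2}}\right), \qquad
  R_{\mathrm{e2e}} = \min_{1\le i\le K} R_i .
\]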

33 pages, 729 KB  
Review
A Comprehensive Review of Energy Efficiency in 5G Networks: Past Strategies, Present Advances, and Future Research Directions
by Narjes Lassoued and Noureddine Boujnah
Computers 2026, 15(1), 50; https://doi.org/10.3390/computers15010050 - 12 Jan 2026
Viewed by 412
Abstract
The rapid evolution of wireless communication toward Fifth Generation (5G) networks has enabled unprecedented performance improvements in terms of data rate, latency, reliability, sustainability, and connectivity. Recent years have witnessed extensive deployment of new 5G networks worldwide. This deployment has led to an exponential growth in traffic flow and a massive number of connected devices, requiring a new generation of energy-hungry base stations (BSs). This results in increased power consumption, higher operational costs, and greater environmental impact, making energy efficiency (EE) a critical research challenge. This paper presents a comprehensive survey of EE optimization strategies in 5G networks. It reviews the transition from traditional methods such as resource allocation, energy harvesting, BS sleep modes, and power control to modern artificial intelligence (AI)-driven solutions employing machine learning, deep reinforcement learning, and self-organizing networks (SON). Comparative analyses highlight the trade-offs between energy savings, network performance, and implementation complexity. Finally, the paper outlines key open issues and future directions toward sustainable 5G and beyond-5G (B5G/Sixth Generation (6G)) systems, emphasizing explainable AI, zero-energy communications, and holistic green network design. Full article