Search Results (322)

Search Parameters:
Keywords = 5G mobile wireless networks

20 pages, 2352 KiB  
Article
Three-Dimensional Physics-Based Channel Modeling for Fluid Antenna System-Assisted Air–Ground Communications by Reconfigurable Intelligent Surfaces
by Yuran Jiang and Xiao Chen
Electronics 2025, 14(15), 2990; https://doi.org/10.3390/electronics14152990 - 27 Jul 2025
Viewed by 214
Abstract
Reconfigurable intelligent surfaces (RISs), recognized as one of the most promising key technologies for sixth-generation (6G) mobile communications, are characterized by their minimal energy expenditure, cost-effectiveness, and straightforward implementation. In this study, we develop a novel communication channel model that integrates RIS-enabled base stations with unmanned ground vehicles. To enhance the system’s adaptability, we implement a fluid antenna system (FAS) at the unmanned ground vehicle (UGV) terminal. This innovative model demonstrates exceptional versatility across various wireless communication scenarios through the strategic adjustment of active ports. The inherent dynamic reconfigurability of the FAS provides superior flexibility and adaptability in air-to-ground communication environments. In this paper, we derive and study key performance characteristics such as the autocorrelation function (ACF), validating the model’s effectiveness. The results demonstrate that the RIS-FAS collaborative scheme significantly enhances channel reliability while effectively addressing critical challenges in 6G networks, including signal blockage and spatial constraints in mobile terminals.
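As an illustration of the kind of statistic the abstract refers to, the sketch below estimates a normalized temporal autocorrelation function (ACF) from a simulated fading gain. It uses a generic sum-of-sinusoids process, not the paper's RIS-FAS channel model; the Doppler spread, sampling grid, and number of paths are assumptions.

```python
import numpy as np

# Illustrative only: a generic sum-of-sinusoids fading process, not the paper's
# RIS-FAS channel model. f_d (max Doppler) and the sampling grid are assumptions.
rng = np.random.default_rng(0)
f_d = 50.0                      # max Doppler shift in Hz (assumed)
t = np.arange(0, 0.2, 1e-4)     # 0.2 s observed at 10 kHz (assumed)
N = 64                          # number of scattered paths (assumed)
theta = rng.uniform(0, 2 * np.pi, N)   # angles of arrival
phi = rng.uniform(0, 2 * np.pi, N)     # random initial phases
# Complex channel gain as a superposition of Doppler-shifted paths
h = np.sum(np.exp(1j * (2 * np.pi * f_d * np.cos(theta)[:, None] * t + phi[:, None])), axis=0) / np.sqrt(N)

def temporal_acf(x, max_lag):
    """Normalized real part of the temporal autocorrelation at lags 0..max_lag-1."""
    x = x - x.mean()
    denom = np.vdot(x, x).real
    return np.array([np.vdot(x[:len(x) - k], x[k:]).real / denom for k in range(max_lag)])

acf = temporal_acf(h, max_lag=200)
print("ACF at lag 0 and lag 5 ms:", acf[0], acf[50])
```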

23 pages, 2363 KiB  
Review
Handover Decisions for Ultra-Dense Networks in Smart Cities: A Survey
by Akzhibek Amirova, Ibraheem Shayea, Didar Yedilkhan, Laura Aldasheva and Alma Zakirova
Technologies 2025, 13(8), 313; https://doi.org/10.3390/technologies13080313 - 23 Jul 2025
Viewed by 526
Abstract
Handover (HO) management plays a key role in ensuring uninterrupted connectivity across evolving wireless networks. While previous generations such as 4G and 5G have introduced several HO strategies, these techniques are insufficient to meet the rigorous demands of sixth-generation (6G) networks in ultra-dense, heterogeneous smart city environments. Existing studies often fail to provide integrated HO solutions that consider key concerns such as energy efficiency, security vulnerabilities, and interoperability across diverse network domains, including terrestrial, aerial, and satellite systems. Moreover, the dynamic and high-mobility nature of smart city ecosystems further complicates real-time HO decision-making. This survey highlights these critical gaps by systematically categorizing state-of-the-art HO approaches into AI-based, fuzzy logic-based, and hybrid frameworks, while evaluating their performance against emerging 6G requirements. Future research directions are also outlined, emphasizing the development of lightweight AI–fuzzy hybrid models for real-time decision-making, the implementation of decentralized security mechanisms using blockchain, and the need for global standardization to enable seamless handovers across multi-domain networks. The key outcome of this review is a structured and in-depth synthesis of current advancements, which serves as a foundational reference for researchers and engineers aiming to design intelligent, scalable, and secure HO mechanisms that can support the operational complexity of next-generation smart cities.
(This article belongs to the Section Information and Communication Technologies)

41 pages, 2392 KiB  
Review
How Beyond-5G and 6G Makes IIoT and the Smart Grid Green—A Survey
by Pal Varga, Áron István Jászberényi, Dániel Pásztor, Balazs Nagy, Muhammad Nasar and David Raisz
Sensors 2025, 25(13), 4222; https://doi.org/10.3390/s25134222 - 6 Jul 2025
Viewed by 732
Abstract
The convergence of next-generation wireless communication technologies and modern energy infrastructure presents a promising path toward sustainable and intelligent systems. This survey explores how beyond-5G and 6G communication technologies can support the greening of Industrial Internet of Things (IIoT) systems and smart grids. It highlights the critical challenges in achieving energy efficiency, interoperability, and real-time responsiveness across different domains. The paper reviews key enablers such as LPWAN, wake-up radios, mobile edge computing, and energy harvesting techniques for green IoT, as well as optimization strategies for 5G/6G networks and data center operations. Furthermore, it examines the role of 5G in enabling reliable, ultra-low-latency data communication for advanced smart grid applications, such as distributed generation, precise load control, and intelligent feeder automation. Through a structured analysis of recent advances and open research problems, the paper aims to identify essential directions for future research and development in building energy-efficient, resilient, and scalable smart infrastructures powered by intelligent wireless networks.
(This article belongs to the Special Issue Feature Papers in the Internet of Things Section 2025)

32 pages, 1277 KiB  
Article
Distributed Prediction-Enhanced Beamforming Using LR/SVR Fusion and MUSIC Refinement in 5G O-RAN Systems
by Mustafa Mayyahi, Jordi Mongay Batalla, Jerzy Żurek and Piotr Krawiec
Appl. Sci. 2025, 15(13), 7428; https://doi.org/10.3390/app15137428 - 2 Jul 2025
Viewed by 394
Abstract
Low-latency and robust beamforming are vital for sustaining signal quality and spectral efficiency in emerging high-mobility 5G and future 6G wireless networks. Conventional beam management approaches, which rely on periodic Channel State Information feedback and static codebooks, as outlined in 3GPP standards, are insufficient in rapidly varying propagation environments. In this work, we propose a Dominance-Enforced Adaptive Clustered Sliding Window Regression (DE-ACSW-R) framework for predictive beamforming in O-RAN Split 7-2x architectures. DE-ACSW-R leverages a sliding window of recent angle of arrival (AoA) estimates, applying in-window change-point detection to segment user trajectories and performing both Linear Regression (LR) and curvature-adaptive Support Vector Regression (SVR) for short-term and non-linear prediction. A confidence-weighted fusion mechanism adaptively blends LR and SVR outputs, incorporating robust outlier detection and a dominance-enforced selection regime to address strong disagreements. The Open Radio Unit (O-RU) autonomously triggers localised MUSIC scans when prediction confidence degrades, minimising unnecessary full-spectrum searches and reducing delay. Simulation results demonstrate that the proposed DE-ACSW-R approach significantly enhances AoA tracking accuracy, beamforming gain, and adaptability under realistic high-mobility conditions, surpassing conventional LR/SVR baselines. This AI-native modular pipeline aligns with O-RAN architectural principles, enabling scalable and real-time beam management for next-generation wireless deployments.
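To make the fusion idea concrete, here is a minimal sketch that blends sliding-window LR and SVR one-step-ahead predictions using confidence weights derived from in-window fitting error. It is not the paper's DE-ACSW-R pipeline (no change-point detection, outlier handling, or dominance rule); the window contents, sampling interval, and SVR hyperparameters are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

def predict_next_aoa(aoa_window, dt=0.01):
    """Blend LR and SVR one-step-ahead AoA predictions by inverse in-window residual error.

    aoa_window : recent AoA estimates (degrees), oldest first.
    dt         : sampling interval in seconds (assumed).
    """
    t = (np.arange(len(aoa_window)) * dt).reshape(-1, 1)
    y = np.asarray(aoa_window)
    t_next = np.array([[len(aoa_window) * dt]])

    lr = LinearRegression().fit(t, y)
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(t, y)  # hyperparameters assumed

    # Confidence = inverse of in-window mean squared fitting error (with a small floor).
    err_lr = np.mean((lr.predict(t) - y) ** 2) + 1e-9
    err_svr = np.mean((svr.predict(t) - y) ** 2) + 1e-9
    w_lr, w_svr = 1.0 / err_lr, 1.0 / err_svr

    return (w_lr * lr.predict(t_next)[0] + w_svr * svr.predict(t_next)[0]) / (w_lr + w_svr)

# Example: a user turning, producing a slightly curved AoA trajectory.
window = [10.0, 10.8, 11.7, 12.9, 14.3, 15.9]
print(f"Predicted next AoA: {predict_next_aoa(window):.2f} degrees")
```

When the trajectory is nearly linear, the LR error shrinks and dominates the blend; when curvature appears, the SVR term takes over, which is the intuition behind confidence-weighted fusion.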

40 pages, 5045 KiB  
Review
RF Energy-Harvesting Techniques: Applications, Recent Developments, Challenges, and Future Opportunities
by Stella N. Arinze, Emenike Raymond Obi, Solomon H. Ebenuwa and Augustine O. Nwajana
Telecom 2025, 6(3), 45; https://doi.org/10.3390/telecom6030045 - 1 Jul 2025
Viewed by 1281
Abstract
The increasing demand for sustainable and renewable energy solutions has made radio frequency energy harvesting (RFEH) a promising technique for powering low-power electronic devices. RFEH captures ambient RF signals from wireless communication systems, such as mobile networks, Wi-Fi, and broadcasting stations, and converts them into usable electrical energy. This approach offers a viable alternative for battery-dependent and hard-to-recharge applications, including streetlights, outdoor night/security lighting, wireless sensor networks, and biomedical body sensor networks. This article provides a comprehensive review of the RFEH techniques, including state-of-the-art rectenna designs, energy conversion efficiency improvements, and multi-band harvesting systems. We present a detailed analysis of recent advancements in RFEH circuits, impedance matching techniques, and integration with emerging technologies such as the Internet of Things (IoT), 5G, and wireless power transfer (WPT). Additionally, this review identifies existing challenges, including low conversion efficiency, unpredictable energy availability, and design limitations for small-scale and embedded systems. A critical assessment of current research gaps is provided, highlighting areas where further development is required to enhance performance and scalability. Finally, constructive recommendations for future opportunities in RFEH are discussed, focusing on advanced materials, AI-driven adaptive harvesting systems, hybrid energy-harvesting techniques, and novel antenna–rectifier architectures. The insights from this study will serve as a valuable resource for researchers and engineers working towards the realization of self-sustaining, battery-free electronic systems.
(This article belongs to the Special Issue Advances in Wireless Communication: Applications and Developments)
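For a rough sense of the power budgets involved, the sketch below computes a first-order harvested DC power estimate from a free-space (Friis) link budget and an assumed RF-to-DC conversion efficiency. The frequency, distance, gains, and efficiency are illustrative assumptions, not figures from the article.

```python
import math

def harvested_dc_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m, rectifier_eff):
    """First-order estimate: Friis free-space received power times RF-to-DC efficiency."""
    wavelength = 3e8 / freq_hz
    fspl_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)  # free-space path loss
    p_rx_dbm = p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db           # received RF power
    p_dc_mw = (10 ** (p_rx_dbm / 10)) * rectifier_eff             # DC output after rectification
    return 10 * math.log10(p_dc_mw)

# Assumed scenario: a 2.45 GHz source transmitting 43 dBm, 50 m away, 6 dBi rectenna, 40% efficiency.
print(f"{harvested_dc_power_dbm(43, 0, 6, 2.45e9, 50, 0.40):.1f} dBm DC")
```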

21 pages, 1476 KiB  
Article
AI-Driven Handover Management and Load Balancing Optimization in Ultra-Dense 5G/6G Cellular Networks
by Chaima Chabira, Ibraheem Shayea, Gulsaya Nurzhaubayeva, Laura Aldasheva, Didar Yedilkhan and Saule Amanzholova
Technologies 2025, 13(7), 276; https://doi.org/10.3390/technologies13070276 - 1 Jul 2025
Cited by 1 | Viewed by 1187
Abstract
This paper presents a comprehensive review of handover management and load balancing optimization (LBO) in ultra-dense 5G and emerging 6G cellular networks. With the increasing deployment of small cells and the rapid growth of data traffic, these networks face significant challenges in ensuring seamless mobility and efficient resource allocation. Traditional handover and load balancing techniques, primarily designed for 4G systems, are no longer sufficient to address the complexity of heterogeneous network environments that incorporate millimeter-wave communication, Internet of Things (IoT) devices, and unmanned aerial vehicles (UAVs). The review focuses on how recent advances in artificial intelligence (AI), particularly machine learning (ML) and deep learning (DL), are being applied to improve predictive handover decisions and enable real-time, adaptive load distribution. AI-driven solutions can significantly reduce handover failures, latency, and network congestion, while improving overall user experience and quality of service (QoS). This paper surveys state-of-the-art research on these techniques, categorizing them according to their application domains and evaluating their performance benefits and limitations. Furthermore, the paper discusses the integration of intelligent handover and load balancing methods in smart city scenarios, where ultra-dense networks must support diverse services with high reliability and low latency. Key research gaps are also identified, including the need for standardized datasets, energy-efficient AI models, and context-aware mobility strategies. Overall, this review aims to guide future research and development in designing robust, AI-assisted mobility and resource management frameworks for next-generation wireless systems.

25 pages, 7855 KiB  
Article
Latency-Sensitive Wireless Communication in Dynamically Moving Robots for Urban Mobility Applications
by Jakub Krejčí, Marek Babiuch, Jiří Suder, Václav Krys and Zdenko Bobovský
Smart Cities 2025, 8(4), 105; https://doi.org/10.3390/smartcities8040105 - 25 Jun 2025
Viewed by 741
Abstract
Reliable wireless communication is essential for mobile robotic systems operating in dynamic environments, particularly in the context of smart mobility and cloud-integrated urban infrastructures. This article presents an experimental study analyzing the impact of robot motion dynamics on wireless network performance, contributing to the broader discussion on data reliability and communication efficiency in intelligent transportation systems. Measurements were conducted using a quadruped robot equipped with an onboard edge computing device, navigating predefined trajectories in a laboratory setting designed to emulate real-world variability. Key wireless parameters, including signal strength (RSSI), latency, and packet loss, were continuously monitored alongside robot kinematic data such as speed, orientation (roll, pitch, yaw), and movement patterns. The results show a significant correlation between dynamic motion—especially high forward velocities and rotational maneuvers—and degradations in network performance. Increased robot speeds and frequent orientation changes were associated with elevated latency and greater packet loss, while static or low-motion periods exhibited more stable communication. These findings highlight critical challenges for real-time data transmission in mobile IoRT (Internet of Robotic Things) systems, and emphasize the role of network-aware robotic behavior, interoperable communication protocols, and edge-to-cloud data integration in ensuring robust wireless performance within smart city environments.
(This article belongs to the Special Issue Smart Mobility: Linking Research, Regulation, Innovation and Practice)
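As a small illustration of the motion-versus-network analysis described, the snippet below computes a Pearson correlation between robot speed and round-trip latency over logged samples. The data are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical logged samples (not the study's data): robot speed (m/s) and round-trip latency (ms).
speed_mps  = np.array([0.0, 0.2, 0.5, 0.9, 1.3, 1.6, 2.0, 2.4])
latency_ms = np.array([12., 13., 15., 19., 24., 28., 35., 41.])

# Pearson correlation between motion intensity and latency degradation.
r = np.corrcoef(speed_mps, latency_ms)[0, 1]
print(f"Pearson r between speed and latency: {r:.2f}")
```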

21 pages, 1329 KiB  
Article
DDPG-Based UAV-RIS Framework for Optimizing Mobility in Future Wireless Communication Networks
by Yasir Ullah, Idris Olalekan Adeoye, Mardeni Roslee, Mohd Azmi Ismail, Farman Ali, Shabeer Ahmad, Anwar Faizd Osman and Fatimah Zaharah Ali
Drones 2025, 9(6), 437; https://doi.org/10.3390/drones9060437 - 15 Jun 2025
Viewed by 512
Abstract
The development of beyond 5G (B5G) future wireless communication networks (FWCN) requires novel solutions to support high-speed, reliable, and low-latency communication. Unmanned aerial vehicles (UAVs) and reconfigurable intelligent surfaces (RISs) are promising techniques that can enhance wireless connectivity in urban environments where tall buildings block line-of-sight (LoS) links. However, existing UAV-assisted communication strategies do not fully address key challenges such as mobility management, handover failures (HOFs), and path disorders in dense urban environments. This paper introduces a deep deterministic policy gradient (DDPG)-based UAV-RIS framework to overcome these limitations. The proposed framework jointly optimizes UAV trajectories and RIS phase shifts to improve throughput, energy efficiency (EE), and LoS probability while reducing outage probability (OP) and HOF. A modified K-means clustering algorithm is used to efficiently partition the ground users (GUs), accounting for newly added GUs as well. The DDPG algorithm, based on reinforcement learning (RL), adapts UAV positioning and RIS configurations in a continuous action space. Simulation results show that the proposed approach significantly reduces HOF and OP, increases EE, enhances network throughput, and improves LoS probability compared with UAV-only, RIS-only, and no-UAV-RIS deployments. Additionally, by dynamically adjusting UAV locations and RIS phase shifts based on GU mobility patterns, the framework further enhances connectivity and reliability. The findings highlight its potential to transform urban wireless communication by mitigating LoS blockages and ensuring uninterrupted connectivity in dense environments.
(This article belongs to the Special Issue UAV-Assisted Mobile Wireless Networks and Applications)
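For the user-partitioning step, here is a minimal sketch that clusters ground-user positions into UAV-served groups. It is ordinary Lloyd's K-means, not the paper's modified variant that handles newly added GUs; the deployment area, user count, and number of UAVs are assumptions.

```python
import numpy as np

def cluster_ground_users(gu_xy, n_uavs, iters=50, seed=0):
    """Plain Lloyd's K-means over GU positions; returns labels and centroids
    (candidate UAV hover points). Ordinary K-means, not the paper's modified variant."""
    rng = np.random.default_rng(seed)
    centroids = gu_xy[rng.choice(len(gu_xy), n_uavs, replace=False)]
    for _ in range(iters):
        # Assign each GU to the nearest centroid, then recompute centroids.
        d = np.linalg.norm(gu_xy[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_uavs):
            if np.any(labels == k):
                centroids[k] = gu_xy[labels == k].mean(axis=0)
    return labels, centroids

# Assumed scenario: 200 GUs scattered over a 1 km x 1 km area, served by 4 UAVs.
gus = np.random.default_rng(1).uniform(0, 1000, size=(200, 2))
labels, uav_positions = cluster_ground_users(gus, n_uavs=4)
print("Candidate UAV hover points (m):\n", uav_positions.round(1))
```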

59 pages, 4517 KiB  
Review
Artificial Intelligence Empowering Dynamic Spectrum Access in Advanced Wireless Communications: A Comprehensive Overview
by Abiodun Gbenga-Ilori, Agbotiname Lucky Imoize, Kinzah Noor and Paul Oluwadara Adebolu-Ololade
AI 2025, 6(6), 126; https://doi.org/10.3390/ai6060126 - 13 Jun 2025
Viewed by 1932
Abstract
This review paper examines the integration of artificial intelligence (AI) in wireless communication, focusing on cognitive radio (CR), spectrum sensing, and dynamic spectrum access (DSA). As the demand for spectrum continues to rise with the expansion of mobile users and connected devices, cognitive radio networks (CRNs), leveraging AI-driven spectrum sensing and dynamic access, provide a promising solution to improve spectrum utilization. The paper reviews various deep learning (DL)-based spectrum-sensing methods, highlighting their advantages and challenges. It also explores the use of multi-agent reinforcement learning (MARL) for distributed DSA networks, where agents autonomously optimize power allocation (PA) to minimize interference and enhance quality of service. Additionally, the paper discusses the role of machine learning (ML) in predicting spectrum requirements, which is crucial for efficient frequency management in fifth-generation (5G) networks and beyond. Case studies show how ML can help networks self-optimize, reducing energy consumption while improving performance. The review also introduces the potential of generative AI (GenAI) for demand-planning and network optimization, enhancing spectrum efficiency and energy conservation in wireless networks (WNs). Finally, the paper highlights future research directions, including improving AI-driven network resilience, refining predictive models, and addressing ethical considerations. Overall, AI is poised to transform wireless communication, offering innovative solutions for spectrum management (SM), security, and network performance.
(This article belongs to the Special Issue Artificial Intelligence for Network Management)

26 pages, 2568 KiB  
Article
Unified Framework for RIS-Enhanced Wireless Communication and Ambient RF Energy Harvesting: Performance and Sustainability Analysis
by Sunday Enahoro, Sunday Ekpo, Yasir Al-Yasir, Mfonobong Uko, Fanuel Elias, Rahul Unnikrishnan and Stephen Alabi
Technologies 2025, 13(6), 244; https://doi.org/10.3390/technologies13060244 - 12 Jun 2025
Viewed by 553
Abstract
The increasing demand for high-capacity, energy-efficient wireless networks poses significant challenges in maintaining spectral efficiency, minimizing interference, and ensuring sustainability. Traditional direct-link communication suffers from signal degradation due to path loss, multipath fading, and interference, limiting overall performance. To mitigate these challenges, this paper proposes a unified RIS framework that integrates passive and active Reconfigurable Intelligent Surfaces (RISs) for enhanced communication and ambient RF energy harvesting. Our methodology optimizes RIS-assisted beamforming using successive convex approximation (SCA) and adaptive phase shift tuning, maximizing desired signal reception while reducing interference. Passive RIS efficiently reflects signals without external power, whereas active RIS employs amplification-assisted reflection for superior performance. Evaluations using realistic urban macrocell and mmWave channel models reveal that, compared to direct links, passive RIS boosts SNR from 3.0 dB to 7.1 dB, and throughput from 2.6 Gbps to 4.6 Gbps, while active RIS further enhances the SNR to 10.0 dB and throughput to 6.8 Gbps. Energy efficiency increases from 0.44 to 0.67 (passive) and 0.82 (active), with latency reduced from 80 ms to 35 ms. These performance metrics validate the proposed approach and highlight its potential applications in urban 5G networks, IoT systems, high-mobility scenarios, and other next-generation wireless environments.
(This article belongs to the Special Issue Microwave/Millimeter-Wave Future Trends and Technologies)
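To show the mechanism behind the reported SNR gains, the sketch below compares the effective channel gain of a narrowband RIS link under random versus co-phased element settings. It is a generic single-user model, not the paper's SCA-based beamformer; the element count, link gains, and noise power are assumptions.

```python
import numpy as np

# Generic narrowband RIS link: effective channel = direct path + sum of reflected paths.
# Co-phasing sets each element's phase so the reflected paths add in phase with the
# direct path. This illustrates the mechanism only, not the paper's SCA optimizer.
rng = np.random.default_rng(0)
n_elements = 64
h_direct = (rng.normal() + 1j * rng.normal()) / np.sqrt(2) * 0.1                          # weak direct link (assumed)
g = (rng.normal(size=n_elements) + 1j * rng.normal(size=n_elements)) / np.sqrt(2) * 0.05  # BS -> RIS
v = (rng.normal(size=n_elements) + 1j * rng.normal(size=n_elements)) / np.sqrt(2) * 0.05  # RIS -> user

noise_power = 1e-4
random_phases = np.exp(1j * rng.uniform(0, 2 * np.pi, n_elements))
aligned_phases = np.exp(1j * (np.angle(h_direct) - np.angle(g * v)))  # co-phase with direct path

for name, phase in [("random RIS phases", random_phases), ("aligned RIS phases", aligned_phases)]:
    h_eff = h_direct + np.sum(g * phase * v)
    snr_db = 10 * np.log10(np.abs(h_eff) ** 2 / noise_power)
    print(f"{name}: {snr_db:.1f} dB")
```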

30 pages, 8363 KiB  
Article
Integrating Reinforcement Learning into M/M/1/K Retry Queueing Models for 6G Applications
by Djamila Talbi and Zoltan Gal
Sensors 2025, 25(12), 3621; https://doi.org/10.3390/s25123621 - 9 Jun 2025
Viewed by 635
Abstract
The ever-growing demand for sustainable, efficient, and fair allocation in the next generation of wireless network applications is a serious challenge, especially in the context of high-speed communication networks that operate on Terahertz frequencies. This research work presents a novel approach to enhance queue management in 6G networks by integrating reinforcement learning, specifically Deep Q-Networks (DQN). We introduce an intelligent 6G Retrial Queueing System (RQS) that dynamically adjusts to varying traffic conditions, minimizes delays, reduces energy consumption, and guarantees equitable access to network resources. The system’s performance is examined under extensive simulations, taking into account multiple arrival rates, queue sizes, and reward scaling factors. The results show that integrating RL into the 6G-RQS model successfully enhances queue management while maintaining high system performance, increasing the number of mobile terminals served even under varying and heavier traffic demands. Furthermore, singular value decomposition analysis reveals clusters and structured patterns, indicating the effective learning process and adaptation performed by the agent. Our research findings demonstrate that RL-based queue management is a promising solution for overcoming the challenges facing 6G, particularly in the context of high-speed communication networks.
(This article belongs to the Special Issue Future Horizons in Networking: Exploring the Potential of 6G)
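To illustrate the control loop in miniature, the sketch below uses tabular epsilon-greedy Q-learning for an admit-or-orbit decision in a retrial queue. It is a simplified stand-in for the paper's DQN-based 6G-RQS controller; the state encoding, reward shaping, capacity, and learning hyperparameters are assumptions.

```python
import random
from collections import defaultdict

# Simplified tabular Q-learning stand-in for a DQN-based queue controller.
# State: (queue_length, orbit_size); action: 0 = admit arrival, 1 = send to retrial orbit.
# Reward shaping, capacity K, and hyperparameters are assumptions for illustration only.
K, ALPHA, GAMMA, EPS = 10, 0.1, 0.95, 0.1
Q = defaultdict(lambda: [0.0, 0.0])

def choose_action(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPS:
        return random.randrange(2)
    return max((0, 1), key=lambda a: Q[state][a])

def update(state, action, reward, next_state):
    """One-step Q-learning update: Q <- Q + alpha * (r + gamma * max_a' Q' - Q)."""
    target = reward + GAMMA * max(Q[next_state])
    Q[state][action] += ALPHA * (target - Q[state][action])

# Toy interaction: admitting into a nearly full queue is penalized, otherwise rewarded.
s = (9, 3)
a = choose_action(s)
r = -1.0 if (a == 0 and s[0] >= K - 1) else 0.5
update(s, a, r, (min(s[0] + (a == 0), K), s[1] + (a == 1)))
print("Q[(9, 3)] =", Q[(9, 3)])
```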

35 pages, 3885 KiB  
Review
Supporting Global Communications of 6G Networks Using AI, Digital Twin, Hybrid and Integrated Networks, and Cloud: Features, Challenges, and Recommendations
by Shaymaa Ayad Mohammed, Sallar S. Murad, Havot J. Albeyboni, Mohammad Dehghani Soltani, Reham A. Ahmed, Rozin Badeel and Ping Chen
Telecom 2025, 6(2), 35; https://doi.org/10.3390/telecom6020035 - 27 May 2025
Viewed by 1318
Abstract
The commercial deployment of fifth generation (5G) mobile communication networks has begun, bringing with it novel offerings, improved user experiences, and a variety of opportunities for different types of organizations. However, there still exist several challenges to implementing 5G technology. Sixth generation (6G) wireless communication technology development has begun on a worldwide scale in response to these challenges. Even though there have been many discussions on this topic in the past, many questions remain unanswered in the present literature. The article provides a comprehensive overview of 6G, including the common understanding of the concept, as well as its technical requirements and potential applications. A comprehensive analysis of the 6G network design, potential uses, and key elements is provided. This research article delineates future study topics and unresolved challenges to stimulate an ongoing global discourse. The analysis in this study supports the use of different applications and services that will benefit the community in the near future through 6G technology. Subsequently, recommendations for each problem are provided, offering solutions to unresolved difficulties where functionalities are anticipated to improve, hence enhancing the overall user experience.
(This article belongs to the Special Issue Advances in Wireless Communication: Applications and Developments)

22 pages, 1034 KiB  
Article
A Novel Crowdsourcing-Assisted 5G Wireless Signal Ranging Technique in MEC Architecture
by Rui Lu, Lei Shi, Yinlong Liu and Zhongkai Dang
Future Internet 2025, 17(5), 220; https://doi.org/10.3390/fi17050220 - 14 May 2025
Viewed by 457
Abstract
In complex indoor and outdoor scenarios, traditional GPS-based ranging technology faces limitations in availability due to signal occlusion and user privacy issues. Wireless signal ranging technology based on 5G base stations has emerged as a potential alternative. However, existing methods are limited by low efficiency in constructing static signal databases, poor environmental adaptability, and high resource overhead, restricting their practical application. This paper proposes a 5G wireless signal ranging framework that integrates mobile edge computing (MEC) and crowdsourced intelligence to systematically address the aforementioned issues. This study designs a progressive solution by (1) building a crowdsourced data collection network, using mobile terminals equipped with GPS technology to automatically collect device signal features, replacing inefficient manual drive tests; (2) developing a progressive signal update algorithm that integrates real-time crowdsourced data and historical signals to optimize the signal fingerprint database in dynamic environments; (3) establishing an edge service architecture to offload signal matching and trajectory estimation tasks to MEC nodes, using lightweight computing engines to reduce the load on the core network. Experimental results demonstrate a mean positioning error of 5 m, with 95% of devices achieving errors within 10 m, as well as building and floor prediction error rates of 0.5% and 1%, respectively. The proposed framework outperforms traditional static methods by 3× in ranging accuracy while maintaining computational efficiency, achieving significant improvements in environmental adaptability and service scalability.
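As a sketch of the progressive-update idea in step (2), the snippet below blends a fresh crowdsourced RSRP report into a historical fingerprint entry with an exponential moving average. The grid keying, the gNB identifiers, and the blending weight are illustrative assumptions, not the paper's algorithm.

```python
from collections import defaultdict

# Generic progressive fingerprint update (exponential moving average): fresh crowdsourced
# measurements are blended with historical values. Not the paper's algorithm; the grid
# keying, cell identifiers, and blending weight ALPHA are assumptions.
ALPHA = 0.2                       # weight of new crowdsourced data (assumed)
fingerprints = defaultdict(dict)  # grid cell -> {cell_id: average RSRP in dBm}

def update_fingerprint(grid_cell, measurements):
    """Merge one crowdsourced report {cell_id: rsrp_dbm} into the fingerprint database."""
    entry = fingerprints[grid_cell]
    for cell_id, rsrp in measurements.items():
        if cell_id in entry:
            entry[cell_id] = (1 - ALPHA) * entry[cell_id] + ALPHA * rsrp
        else:
            entry[cell_id] = rsrp  # first observation seeds the entry

update_fingerprint((116_397, 39_908), {"gNB-121": -92.0, "gNB-087": -101.5})
update_fingerprint((116_397, 39_908), {"gNB-121": -88.0})
print(fingerprints[(116_397, 39_908)])
```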

38 pages, 4091 KiB  
Article
Mitigating the Impact of Satellite Vibrations on the Acquisition of Satellite Laser Links Through Optimized Scan Path and Parameters
by Muhammad Khalid, Wu Ji, Deng Li and Li Kun
Photonics 2025, 12(5), 444; https://doi.org/10.3390/photonics12050444 - 4 May 2025
Viewed by 775
Abstract
In the past two decades, there has been a tremendous increase in demand for services requiring a high bandwidth, a low latency, and high data rates, such as broadband internet services, video streaming, cloud computing, IoT devices, and mobile data services (5G and beyond). Optical wireless communication (OWC) technology, which is also envisioned for next-generation satellite networks using laser links, offers a promising solution to meet these demands. Establishing a line-of-sight (LOS) link and initiating communication in laser links is a challenging task. This process is managed by the acquisition, pointing, and tracking (APT) system, which must deal with the narrow beam divergence and the presence of satellite platform vibrations. These factors increase acquisition time and decrease acquisition probability. This study presents a framework for evaluating the acquisition time of four different scanning methods: spiral, raster, square spiral, and hexagonal, using a probabilistic approach. A satellite platform vibration model is used, and an algorithm for estimating its power spectral density is applied. Maximum likelihood estimation is employed to estimate key parameters from satellite vibrations to optimize scan parameters, such as the overlap factor and beam divergence. The simulation results show that selecting the scan path, overlap factor, and beam divergence based on an accurate estimation of satellite vibrations can prevent multiple scans of the uncertainty region, improve target satellite detection, and increase acquisition probability, given that the satellite vibration amplitudes are within the constraints imposed by the scan parameters. This study contributes to improving the acquisition process, which can, in turn, enhance the pointing and tracking phases of the APT system in laser links.
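To make the scan-parameter trade-off concrete, the sketch below generates dwell points for an Archimedean spiral scan whose ring spacing is set by the beam divergence and an overlap factor. It is a common textbook construction under assumed numbers, not the optimized path or parameters derived in the paper.

```python
import numpy as np

def spiral_scan_points(uncertainty_radius_urad, beam_div_urad, overlap=0.1):
    """Generate beam-pointing offsets for an Archimedean spiral scan of the uncertainty cone.

    Successive spiral rings are spaced by (1 - overlap) * beam divergence so the beam
    footprints overlap, the usual hedge against missing a vibrating target. Angular
    offsets are in microradians; a generic construction, not the paper's optimized values.
    """
    step = (1.0 - overlap) * beam_div_urad  # radial spacing between rings
    points, r, theta = [], 0.0, 0.0
    while r <= uncertainty_radius_urad:
        points.append((r * np.cos(theta), r * np.sin(theta)))
        # Advance along the arc by roughly one step so dwell points are evenly spaced.
        theta += step / max(r, step)
        r = step * theta / (2 * np.pi)
    return np.array(points)

# Assumed example: 500 urad uncertainty half-angle, 20 urad beam divergence, 10% overlap.
pts = spiral_scan_points(500.0, 20.0, overlap=0.1)
print(f"{len(pts)} dwell points; outermost radius ~ {np.hypot(*pts[-1]):.0f} urad")
```

A larger overlap factor shrinks the ring spacing, increasing the number of dwell points (and acquisition time) in exchange for robustness to vibration-induced pointing error, which is the trade-off the paper optimizes.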

19 pages, 4692 KiB  
Article
Scalable Semantic Adaptive Communication for Task Requirements in WSNs
by Hong Yang, Xiaoqing Zhu, Jia Yang, Ji Li, Linbo Qing, Xiaohai He and Pingyu Wang
Sensors 2025, 25(9), 2823; https://doi.org/10.3390/s25092823 - 30 Apr 2025
Viewed by 476
Abstract
Wireless Sensor Networks (WSNs) have emerged as an efficient solution for numerous real-time applications, attributable to their compactness, cost effectiveness, and ease of deployment. The rapid advancement of the Internet of Things (IoT), Artificial Intelligence (AI), sixth-generation mobile communication technology (6G), and Mobile Edge Computing (MEC) in recent years has catalyzed the transition towards large-scale deployment of WSN devices, and has shifted image sensing and understanding toward novel modes (such as machine-to-machine or human-to-machine interactions). However, the resulting data proliferation and the dynamics of communication environments introduce new challenges for WSN communication: (1) ensuring robust communication in adverse environments and (2) effectively alleviating bandwidth pressure from massive data transmission. To address these issues, this paper proposes a Scalable Semantic Adaptive Communication (SSAC) framework for task requirements. Firstly, we design an Attention Mechanism-based Joint Source Channel Coding (AMJSCC) scheme to fully exploit the correlation among semantic features, channel conditions, and tasks. Then, a Prediction Scalable Semantic Generator (PSSG) is constructed to implement scalable semantics, allowing for flexible adjustments to achieve channel adaptation. The experimental results show that the proposed SSAC is more robust than traditional and other semantic communication algorithms in image classification tasks, and achieves scalable compression rates without sacrificing classification performance, while improving the bandwidth utilization of the communication system.
(This article belongs to the Special Issue 6G Communication and Edge Intelligence in Wireless Sensor Networks)
