
Search Results (474)

Search Parameters:
Keywords = heterogeneous wireless networks

18 pages, 404 KiB  
Article
Deterministic Scheduling for Asymmetric Flows in Future Wireless Networks
by Haie Dou, Taojie Zhu, Fei Li, Chen Liu and Lei Wang
Symmetry 2025, 17(8), 1246; https://doi.org/10.3390/sym17081246 - 6 Aug 2025
Abstract
In the era of Industry 5.0, future wireless networks are increasingly shifting from traditional symmetric architectures toward heterogeneous and asymmetric paradigms, driven by the demand for diversified and dynamic services. This architectural evolution gives rise to complex and asymmetric flows, such as the coexistence of periodic and burst flows with varying latency, jitter, and deadline constraints, posing new challenges for deterministic transmission. Traditional time-sensitive networking (TSN) is well-suited for periodic flows but lacks the flexibility to effectively handle dynamic, asymmetric traffic. To address this limitation, we propose a two-stage asymmetric flow scheduling framework with dynamic deadline control, termed A-TSN. In the first stage, we design a Deep Q-Network-based Dynamic Injection Time Slot algorithm (DQN-DITS) to optimize slot allocation for periodic flows under varying network loads. In the second stage, we introduce the Dynamic Deadline Online (DDO) scheduling algorithm, which enables real-time scheduling for asymmetric flows while satisfying flow deadlines and capacity constraints. Simulation results demonstrate that our approach significantly reduces end-to-end latency, improves scheduling efficiency, and enhances adaptability to high-volume asymmetric traffic, offering a scalable solution for future deterministic wireless networks. Full article
(This article belongs to the Special Issue Symmetry/Asymmetry in Future Wireless Networks)
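The DDO stage admits flows online subject to deadlines and per-slot capacity. The paper's algorithm is not reproduced here; the following is a minimal sketch of deadline-constrained slot admission using a greedy earliest-deadline-first rule (the `Flow` fields and the unit-capacity model are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Flow:
    name: str
    arrival: int    # earliest slot the flow can use
    deadline: int   # last slot that still meets its latency bound
    demand: int     # number of slots the flow needs

def edf_schedule(flows, n_slots, capacity_per_slot=1):
    """Greedy earliest-deadline-first slot assignment.

    Returns {flow name: [assigned slots]} if every flow fits, or None
    if some flow cannot meet its deadline under the capacity limit.
    """
    load = [0] * n_slots
    schedule = {}
    # Serve the most urgent flows first.
    for f in sorted(flows, key=lambda f: f.deadline):
        slots = []
        for t in range(f.arrival, min(f.deadline + 1, n_slots)):
            if load[t] < capacity_per_slot:
                load[t] += 1
                slots.append(t)
                if len(slots) == f.demand:
                    break
        if len(slots) < f.demand:
            return None  # deadline miss: admission control rejects the set
        schedule[f.name] = slots
    return schedule

flows = [Flow("periodic", 0, 7, 2), Flow("burst", 1, 3, 2)]
print(edf_schedule(flows, 8))
```

The burst flow is packed into its tight deadline window first, and the periodic flow takes the remaining free slots.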

24 pages, 1530 KiB  
Article
A Lightweight Robust Training Method for Defending Model Poisoning Attacks in Federated Learning Assisted UAV Networks
by Lucheng Chen, Weiwei Zhai, Xiangfeng Bu, Ming Sun and Chenglin Zhu
Drones 2025, 9(8), 528; https://doi.org/10.3390/drones9080528 - 28 Jul 2025
Abstract
The integration of unmanned aerial vehicles (UAVs) into next-generation wireless networks greatly enhances the flexibility and efficiency of communication and distributed computation for ground mobile devices. Federated learning (FL) provides a privacy-preserving paradigm for device collaboration but remains highly vulnerable to poisoning attacks and is further challenged by the resource constraints and heterogeneous data common to UAV-assisted systems. Existing robust aggregation and anomaly detection methods often degrade in efficiency and reliability under these realistic adversarial and non-IID settings. To bridge these gaps, we propose FedULite, a lightweight and robust federated learning framework specifically designed for UAV-assisted environments. FedULite features unsupervised local representation learning optimized for unlabeled, non-IID data. Moreover, FedULite leverages a robust, adaptive server-side aggregation strategy that uses cosine similarity-based update filtering and dimension-wise adaptive learning rates to neutralize sophisticated data and model poisoning attacks. Extensive experiments across diverse datasets and adversarial scenarios demonstrate that FedULite reduces the attack success rate (ASR) from over 90% in undefended scenarios to below 5%, while maintaining the main task accuracy loss within 2%. Moreover, it introduces negligible computational overhead compared to standard FedAvg, with approximately 7% additional training time. Full article
(This article belongs to the Special Issue IoT-Enabled UAV Networks for Secure Communication)
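The cosine-similarity filtering idea can be sketched in a few lines. This is not FedULite's actual aggregation rule, only a minimal illustration under the assumption that client updates pointing away from a trusted reference direction (e.g., the previous global update) are discarded before averaging:

```python
import math

def cosine(u, v):
    """Cosine similarity between two flat parameter-update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def filtered_mean(updates, reference, threshold=0.0):
    """Average only the updates whose cosine similarity to the
    reference exceeds the threshold; sign-flipped (poisoned)
    updates are discarded before aggregation."""
    kept = [u for u in updates if cosine(u, reference) > threshold]
    if not kept:
        return [0.0] * len(reference)
    return [sum(col) / len(kept) for col in zip(*kept)]

reference = [1.0, 1.0]
updates = [[0.9, 1.1],    # benign
           [1.2, 0.8],    # benign
           [-1.0, -1.0]]  # sign-flipped (poisoned)
print(filtered_mean(updates, reference))
```

The poisoned update has negative similarity and is excluded, so the aggregate stays near the benign direction.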

23 pages, 2363 KiB  
Review
Handover Decisions for Ultra-Dense Networks in Smart Cities: A Survey
by Akzhibek Amirova, Ibraheem Shayea, Didar Yedilkhan, Laura Aldasheva and Alma Zakirova
Technologies 2025, 13(8), 313; https://doi.org/10.3390/technologies13080313 - 23 Jul 2025
Abstract
Handover (HO) management plays a key role in ensuring uninterrupted connectivity across evolving wireless networks. While previous generations such as 4G and 5G have introduced several HO strategies, these techniques are insufficient to meet the rigorous demands of sixth-generation (6G) networks in ultra-dense, heterogeneous smart city environments. Existing studies often fail to provide integrated HO solutions that consider key concerns such as energy efficiency, security vulnerabilities, and interoperability across diverse network domains, including terrestrial, aerial, and satellite systems. Moreover, the dynamic and high-mobility nature of smart city ecosystems further complicates real-time HO decision-making. This survey aims to highlight these critical gaps by systematically categorizing state-of-the-art HO approaches into AI-based, fuzzy logic-based, and hybrid frameworks, while evaluating their performance against emerging 6G requirements. Future research directions are also outlined, emphasizing the development of lightweight AI–fuzzy hybrid models for real-time decision-making, the implementation of decentralized security mechanisms using blockchain, and the need for global standardization to enable seamless handovers across multi-domain networks. The key outcome of this review is a structured and in-depth synthesis of current advancements, which serves as a foundational reference for researchers and engineers aiming to design intelligent, scalable, and secure HO mechanisms that can support the operational complexity of next-generation smart cities. Full article
(This article belongs to the Section Information and Communication Technologies)
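As a concrete instance of the HO decisions such surveys categorize, the classic 3GPP-style A3 event compares neighbor and serving signal strength with a hysteresis margin and a time-to-trigger window. A minimal sketch (sample-indexed rather than timer-based, which is a simplification of the actual procedure):

```python
def a3_handover(serving_rsrp, neighbor_rsrp, hysteresis_db=2.0, ttt=3):
    """Return the first measurement index at which a handover fires:
    the neighbor must exceed the serving cell by the hysteresis margin
    for `ttt` consecutive samples (a time-to-trigger window)."""
    streak = 0
    for i, (s, n) in enumerate(zip(serving_rsrp, neighbor_rsrp)):
        streak = streak + 1 if n > s + hysteresis_db else 0
        if streak >= ttt:
            return i
    return None  # condition never held long enough

serving  = [-90, -91, -92, -93, -94, -95]  # RSRP in dBm
neighbor = [-93, -90, -89, -90, -91, -90]
print(a3_handover(serving, neighbor))
```

The hysteresis and time-to-trigger suppress ping-pong handovers from momentary signal fluctuations, the baseline that AI-based and fuzzy approaches aim to improve on.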

21 pages, 2065 KiB  
Article
Enhancing Security in 5G and Future 6G Networks: Machine Learning Approaches for Adaptive Intrusion Detection and Prevention
by Konstantinos Kalodanis, Charalampos Papapavlou and Georgios Feretzakis
Future Internet 2025, 17(7), 312; https://doi.org/10.3390/fi17070312 - 18 Jul 2025
Abstract
The evolution from 4G to 5G—and eventually to the forthcoming 6G networks—has revolutionized wireless communications by enabling high-speed, low-latency services that support a wide range of applications, including the Internet of Things (IoT), smart cities, and critical infrastructures. However, the unique characteristics of these networks—extensive connectivity, device heterogeneity, and architectural flexibility—impose significant security challenges. This paper introduces a comprehensive framework for enhancing the security of current and emerging wireless networks by integrating state-of-the-art machine learning (ML) techniques into intrusion detection and prevention systems. It also thoroughly explores the key aspects of wireless network security, including architectural vulnerabilities in both 5G and future 6G networks, novel ML algorithms tailored to address evolving threats, privacy-preserving mechanisms, and regulatory compliance with the EU AI Act. Finally, a Wireless Intrusion Detection Algorithm (WIDA) is proposed, demonstrating promising results in improving wireless network security. Full article
(This article belongs to the Special Issue Advanced 5G and Beyond Networks)
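The WIDA algorithm itself is not detailed in the abstract. As a generic illustration of the statistical baseline that ML-based intrusion detection builds on, the sketch below profiles benign traffic features and flags large deviations (the feature choice and z-score threshold are assumptions for illustration):

```python
import statistics

def train_profile(normal_samples):
    """Per-feature (mean, stdev) profile learned from benign traffic."""
    cols = list(zip(*normal_samples))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def is_intrusion(sample, profile, z_threshold=3.0):
    """Flag a traffic sample in which any feature deviates more than
    z_threshold standard deviations from the benign profile."""
    for x, (mu, sigma) in zip(sample, profile):
        if sigma > 0 and abs(x - mu) / sigma > z_threshold:
            return True
    return False

# features per sample: [packets/s, mean packet size in bytes]
benign = [[100, 500], [110, 520], [90, 480], [105, 510], [95, 490]]
profile = train_profile(benign)
print(is_intrusion([102, 505], profile))   # resembles benign traffic
print(is_intrusion([900, 60], profile))    # flood-like burst
```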

23 pages, 1585 KiB  
Article
Binary Secretary Bird Optimization Clustering by Novel Fitness Function Based on Voronoi Diagram in Wireless Sensor Networks
by Mohammed Abdulkareem, Hadi S. Aghdasi, Pedram Salehpour and Mina Zolfy
Sensors 2025, 25(14), 4339; https://doi.org/10.3390/s25144339 - 11 Jul 2025
Abstract
Minimizing energy consumption remains a critical challenge in wireless sensor networks (WSNs) because of their reliance on nonrechargeable batteries. Clustering-based hierarchical communication has been widely adopted to address this issue by improving residual energy and balancing the network load. In this architecture, cluster heads (CHs) are responsible for data collection, aggregation, and forwarding, making their optimal selection essential for prolonging network lifetime. The effectiveness of CH selection is highly dependent on the choice of metaheuristic optimization method and the design of the fitness function. Although numerous studies have applied metaheuristic algorithms with suitably designed fitness functions to tackle the CH selection problem, many existing approaches fail to fully capture both the spatial distribution of nodes and dynamic energy conditions. To address these limitations, we propose the binary secretary bird optimization clustering (BSBOC) method. BSBOC introduces a binary variant of the secretary bird optimization algorithm (SBOA) to handle the discrete nature of CH selection. Additionally, it defines a novel multiobjective fitness function that, for the first time, considers the Voronoi diagram of CHs as an optimization objective, besides other well-known objectives. BSBOC was thoroughly assessed via comprehensive simulation experiments, benchmarked against two advanced methods (MOBGWO and WAOA), under both homogeneous and heterogeneous network models across two deployment scenarios. Findings from these simulations demonstrated that BSBOC notably decreased energy usage and prolonged network lifetime, highlighting its effectiveness as a reliable method for energy-aware clustering in WSNs. Full article
(This article belongs to the Section Sensor Networks)
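The paper's novel fitness function uses the Voronoi diagram of cluster heads. The sketch below is a simplified stand-in: it scores a binary CH selection by residual energy at the heads plus the mean member-to-nearest-head distance, a crude proxy for the Voronoi coverage objective; the weights and the combination are illustrative assumptions, not BSBOC's actual formula:

```python
import math

def fitness(ch_mask, positions, energies, w_energy=0.5, w_cover=0.5):
    """Toy multiobjective fitness for a candidate cluster-head set.

    ch_mask   -- binary list, 1 marks a node chosen as cluster head
    positions -- (x, y) coordinates of each node
    energies  -- residual energy of each node

    Rewards high residual energy at the chosen heads and a short mean
    distance from member nodes to their nearest head.
    """
    heads = [i for i, b in enumerate(ch_mask) if b]
    if not heads:
        return 0.0
    energy_term = sum(energies[i] for i in heads) / len(heads)
    dists = [min(math.dist(p, positions[h]) for h in heads)
             for i, p in enumerate(positions) if i not in heads]
    mean_d = sum(dists) / len(dists) if dists else 0.0
    cover_term = 1.0 / (1.0 + mean_d)
    return w_energy * energy_term + w_cover * cover_term

positions = [(0, 0), (0, 10), (10, 0), (10, 10)]
energies = [1.0, 0.2, 0.9, 0.3]
# Heads with high residual energy score better than low-energy heads.
print(fitness([1, 0, 1, 0], positions, energies) >
      fitness([0, 1, 0, 1], positions, energies))
```

A binary metaheuristic such as the paper's binary SBOA would search over `ch_mask` vectors to maximize such a function.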

21 pages, 1476 KiB  
Article
AI-Driven Handover Management and Load Balancing Optimization in Ultra-Dense 5G/6G Cellular Networks
by Chaima Chabira, Ibraheem Shayea, Gulsaya Nurzhaubayeva, Laura Aldasheva, Didar Yedilkhan and Saule Amanzholova
Technologies 2025, 13(7), 276; https://doi.org/10.3390/technologies13070276 - 1 Jul 2025
Abstract
This paper presents a comprehensive review of handover management and load balancing optimization (LBO) in ultra-dense 5G and emerging 6G cellular networks. With the increasing deployment of small cells and the rapid growth of data traffic, these networks face significant challenges in ensuring seamless mobility and efficient resource allocation. Traditional handover and load balancing techniques, primarily designed for 4G systems, are no longer sufficient to address the complexity of heterogeneous network environments that incorporate millimeter-wave communication, Internet of Things (IoT) devices, and unmanned aerial vehicles (UAVs). The review focuses on how recent advances in artificial intelligence (AI), particularly machine learning (ML) and deep learning (DL), are being applied to improve predictive handover decisions and enable real-time, adaptive load distribution. AI-driven solutions can significantly reduce handover failures, latency, and network congestion, while improving overall user experience and quality of service (QoS). This paper surveys state-of-the-art research on these techniques, categorizing them according to their application domains and evaluating their performance benefits and limitations. Furthermore, the paper discusses the integration of intelligent handover and load balancing methods in smart city scenarios, where ultra-dense networks must support diverse services with high reliability and low latency. Key research gaps are also identified, including the need for standardized datasets, energy-efficient AI models, and context-aware mobility strategies. Overall, this review aims to guide future research and development in designing robust, AI-assisted mobility and resource management frameworks for next-generation wireless systems. Full article

31 pages, 3093 KiB  
Review
A Comprehensive Review of IoT Standards: The Role of IEEE 1451 in Smart Cities and Smart Buildings
by José Rita, José Salvado, Helbert da Rocha and António Espírito-Santo
Smart Cities 2025, 8(4), 108; https://doi.org/10.3390/smartcities8040108 - 30 Jun 2025
Abstract
The increasing demand for IoT solutions in smart cities, coupled with the growing use of sensors, actuators, and automation in these environments, has highlighted the need for efficient communication between Internet of Things (IoT) devices. The success of such systems relies on interactions between devices that are governed by communication protocols which define how information is exchanged. However, the heterogeneity of sensor networks (wired and wireless) often leads to incompatibility issues, hindering the seamless integration of diverse devices. To address these challenges, standardisation is essential to promote scalability and interoperability across IoT systems. The IEEE 1451 standard provides a solution by defining a common interface that enables plug-and-play integration and enhances flexibility across diverse IoT devices. This standard enables seamless communication between devices from different manufacturers, irrespective of their characteristics, and ensures compatibility via the Transducer Electronic Data Sheet (TEDS) and the Network Capable Application Processor (NCAP). By reducing system costs and promoting adaptability, the standard mitigates the complexities posed by heterogeneity in IoT systems, fostering scalable, interoperable, and cost-effective solutions. The IEEE 1451 standard addresses key barriers to system integration, enabling the full potential of IoT technologies. This paper aims to provide a comprehensive review of the challenges that transducer networks face in IoT applications, with a focus on smart cities. This review underscores the significance and potential of the IEEE 1451 standard in establishing a framework that enables the harmonisation of IoT applications. The primary contribution of this work lies in emphasising the importance of adopting the standard for the development of harmonised and flexible systems. Full article
(This article belongs to the Section Internet of Things)
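To make the plug-and-play idea concrete, the sketch below models an NCAP-style registry that discovers transducers by reading a TEDS-like self-description record. The field names and registry API are illustrative simplifications, not the standard's binary TEDS layout:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MetaTEDS:
    """Simplified stand-in for an IEEE 1451 Meta-TEDS record: enough
    self-description for an NCAP to identify a transducer at plug-in.
    (Fields are illustrative, not the standard's wire format.)"""
    manufacturer: str
    model: str
    serial: str
    channels: int

class NCAP:
    """Minimal sketch of a Network Capable Application Processor
    registry: it learns about transducers by reading their TEDS,
    with no manufacturer-specific configuration."""
    def __init__(self):
        self._registry = {}

    def plug_in(self, teds: MetaTEDS):
        key = (teds.manufacturer, teds.model, teds.serial)
        self._registry[key] = teds  # plug-and-play registration

    def discovered(self):
        return sorted(self._registry)

ncap = NCAP()
ncap.plug_in(MetaTEDS("AcmeSensors", "T-100", "0001", channels=2))
ncap.plug_in(MetaTEDS("AcmeSensors", "T-200", "0042", channels=1))
print(ncap.discovered())
```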

21 pages, 1469 KiB  
Article
Enhanced Distributed Energy-Efficient Clustering (DEEC) Protocol for Wireless Sensor Networks: A Modular Implementation and Performance Analysis
by Abdulla Juwaied, Lidia Jackowska-Strumillo and Michal Majchrowicz
Sensors 2025, 25(13), 4015; https://doi.org/10.3390/s25134015 - 27 Jun 2025
Abstract
Wireless Sensor Networks (WSNs) are a key component of various applications, including environmental monitoring and the Internet of Things (IoT). Energy efficiency is one of the significant issues in WSNs, since sensor nodes are usually battery-powered and have limited energy resources. The Distributed Energy-Efficient Clustering (DEEC) protocol is one of the most common methods for improving energy efficiency and network lifespan by selecting cluster heads according to residual energy. Nevertheless, standard DEEC methods are limited in dynamic environments because of their fixed nature. This paper presents a novel modular implementation of the DEEC protocol for Wireless Sensor Networks, addressing the limitations of the standard DEEC in dynamic and heterogeneous environments. Unlike the typical DEEC protocol, the proposed approach incorporates realistic energy models, supports heterogeneous nodes, implements load balancing, and enables dynamic cluster head selection. Numerical simulations in MATLAB® demonstrate that the improved DEEC protocol achieves a 133% longer stability period (first node death at 1166 rounds vs. 472 rounds), nearly doubles the network lifetime (4000 rounds vs. 2111 rounds), and significantly enhances energy efficiency compared to the standard DEEC. These results verify the effectiveness of the proposed enhancements, making the protocol a robust solution for modern WSN and IoT applications. Full article
(This article belongs to the Section Sensor Networks)
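The residual-energy-based election that DEEC builds on can be sketched from its well-known formulas: each node's cluster-head probability is p_i = p_opt · E_i / E_avg, combined with a LEACH-style rotation threshold. This is a textbook sketch of the baseline, not the paper's modular implementation:

```python
def deec_ch_probability(residual, residuals, p_opt=0.1):
    """DEEC cluster-head election probability: nodes with more residual
    energy than the network average are proportionally more likely to
    become cluster heads (p_i = p_opt * E_i / E_avg)."""
    e_avg = sum(residuals) / len(residuals)
    return p_opt * residual / e_avg

def election_threshold(p_i, r):
    """LEACH-style rotation threshold T(s) for round r: an eligible
    node becomes cluster head if a uniform random draw in [0, 1)
    falls below this value."""
    epoch = round(1 / p_i)
    return p_i / (1 - p_i * (r % epoch))

residuals = [0.5, 1.0, 1.5]  # joules remaining at each node
probs = [deec_ch_probability(e, residuals) for e in residuals]
print(probs)  # higher residual energy -> higher election probability
```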

22 pages, 891 KiB  
Article
Federated Learning-Based Location Similarity Model for Location Privacy Preserving Recommendation
by Liang Zhu, Jingzhe Mu, Liping Yu, Yanpei Liu, Fubao Zhu and Jingzhong Gu
Electronics 2025, 14(13), 2578; https://doi.org/10.3390/electronics14132578 - 26 Jun 2025
Abstract
With the proliferation of mobile devices and wireless communications, Location-Based Social Networks (LBSNs) have seen tremendous growth. Location recommendation, as an important service in LBSNs, can provide users with locations of interest by analyzing their complex check-in information. Currently, most location recommendations use centralized learning strategies, which carry the risk of user privacy breaches. As an emerging learning strategy, federated learning is widely applied in the field of location recommendation to address privacy concerns. We propose a Federated Learning-Based Location Similarity Model for Location Privacy Preserving Recommendation (FedLSM-LPR) scheme. First, the location-based similarity model is used to capture the differences between locations and make location recommendations. Second, the penalty term is added to the loss function to constrain the distance between the local model parameters and the global model parameters. Finally, we use the REPAgg method, which is based on clustering for client selection, to perform global model aggregation to address data heterogeneity issues. Extensive experiments demonstrate that the proposed FedLSM-LPR scheme not only delivers superior performance but also effectively protects the privacy of users. Full article
(This article belongs to the Special Issue Big Data Security and Privacy)
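The penalty term constraining the distance between local and global parameters resembles a proximal regularizer. Assuming that form (the abstract does not give the exact loss), a minimal sketch:

```python
def local_loss(w_local, w_global, data_loss, mu=0.1):
    """Local objective with a proximal penalty: the task loss plus
    (mu/2) * ||w_local - w_global||^2, which keeps client updates
    from drifting far from the global model under non-IID data."""
    prox = 0.5 * mu * sum((a - b) ** 2 for a, b in zip(w_local, w_global))
    return data_loss + prox

w_global = [0.0, 0.0]
# Same task loss, but drifted parameters pay a larger penalty.
near = local_loss([0.1, 0.1], w_global, data_loss=1.0)
far = local_loss([2.0, 2.0], w_global, data_loss=1.0)
print(near, far)
```

The coefficient `mu` trades off local fit against global consistency; larger values pull clients more strongly toward the shared model.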

28 pages, 3828 KiB  
Article
Hybrid VLC-RF Channel Estimation for GFDM Wireless Sensor Networks Using Tree-Based Regressor
by Azam Isam Aladwani, Tarik Adnan Almohamad, Abdullah Talha Sözer and İsmail Rakıp Karaş
Sensors 2025, 25(13), 3906; https://doi.org/10.3390/s25133906 - 23 Jun 2025
Abstract
This paper proposes a tree-based regression model for hybrid channel estimation in generalized frequency division multiplexing (GFDM) wireless sensor networks (WSNs) over both visible light communication (VLC) and radio frequency (RF) links. The hybrid channel incorporates both additive white Gaussian noise (AWGN) and Rayleigh fading to mimic realistic environments. Traditional estimators, such as MMSE and LMMSE, often underperform in such heterogeneous and nonlinear conditions due to their analytical rigidity. To overcome these limitations, we introduce a data-driven approach using a decision tree regressor trained on 18,000 signal samples across 36 SNR levels. Simulation results show that support vector machine (SVM) achieved 91.34% accuracy and a BER of 0.0866 at 10 dB, as well as 96.77% accuracy with a BER of 0.0323 at 30 dB. Random forest achieved 91.01% accuracy and a BER of 0.0899 at 10 dB, as well as 97.88% accuracy with a BER of 0.0212 at 30 dB. The proposed tree model attained 90.83% and 97.63% accuracy with BERs of 0.0917 and 0.0237, respectively, at the corresponding SNR values. The distinguishing advantage of the tree model lies in its inference efficiency. It completes predictions on the test dataset in just 45.53 s, making it over three times faster than random forest (140.09 s) and more than four times faster than SVM (189.35 s). This significant reduction in inference time makes the proposed tree model particularly well suited for real-time and resource-constrained WSN scenarios, where fast and efficient estimation is often more critical than marginal gains in accuracy. The results also highlight a trade-off, where the tree model provides sub-optimal predictive performance while significantly reducing computational overhead, making it an attractive choice for low-power and latency-sensitive wireless systems. Full article
(This article belongs to the Section Sensor Networks)
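A decision tree regressor predicts by recursively splitting the feature space and averaging within leaves, which is what makes its inference so cheap. The sketch below is a tiny one-feature CART-style fit, not the paper's trained estimator; the SNR-to-quality data are invented for illustration:

```python
def fit_tree(xs, ys, depth=3, min_leaf=2):
    """Tiny CART-style regression tree on one feature: each split
    minimises the summed squared error of the two child means."""
    if depth == 0 or len(ys) < 2 * min_leaf:
        return sum(ys) / len(ys)  # leaf: predict the sample mean
    best = None
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    for k in range(min_leaf, len(xs) - min_leaf + 1):
        left = [ys[i] for i in order[:k]]
        right = [ys[i] for i in order[k:]]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - ml) ** 2 for y in left) +
               sum((y - mr) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, xs[order[k - 1]],
                    ([xs[i] for i in order[:k]], left),
                    ([xs[i] for i in order[k:]], right))
    _, thr, (lx, ly), (rx, ry) = best
    return (thr, fit_tree(lx, ly, depth - 1, min_leaf),
                 fit_tree(rx, ry, depth - 1, min_leaf))

def predict(tree, x):
    """Walk the tree: inference cost is just a few comparisons."""
    while isinstance(tree, tuple):
        thr, left, right = tree
        tree = left if x <= thr else right
    return tree

# toy mapping from SNR (dB) to a channel-quality target
xs = [0, 5, 10, 15, 20, 25, 30, 35]
ys = [0.1, 0.1, 0.4, 0.4, 0.7, 0.7, 0.9, 0.9]
tree = fit_tree(xs, ys)
print(predict(tree, 12), predict(tree, 33))
```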

20 pages, 3416 KiB  
Article
Deflection Prediction of Highway Bridges Using Wireless Sensor Networks and Enhanced iTransformer Model
by Cong Mu, Chen Chang, Jiuyuan Huo and Jiguang Yang
Buildings 2025, 15(13), 2176; https://doi.org/10.3390/buildings15132176 - 22 Jun 2025
Abstract
As an important part of national transportation infrastructure, the operation status of bridges is directly related to transportation safety and social stability. Structural deflection, which reflects the deformation behavior of bridge systems, serves as a key indicator for identifying stiffness degradation and the progression of localized damage. The accurate modeling and forecasting of deflection are thus essential for effective bridge health monitoring and intelligent maintenance. To address the limitations of traditional methods in handling multi-source data fusion and nonlinear temporal dependencies, this study proposes an enhanced iTransformer-based prediction model, termed LDAiT (LSTM Differential Attention iTransformer), which integrates Long Short-Term Memory (LSTM) networks and a differential attention mechanism for high-fidelity deflection prediction under complex working conditions. Firstly, a multi-source heterogeneous time series dataset is constructed based on wireless sensor network (WSN) technology, enabling the real-time acquisition and fusion of key structural response parameters such as deflection, strain, and temperature across critical bridge sections. Secondly, LDAiT enhances long-term dependency modeling through the introduction of LSTM and combines it with the differential attention mechanism to improve responsiveness to local dynamic disturbances. Finally, experimental validation is carried out based on the measured data of Xintian Yellow River Bridge, and the results show that LDAiT outperforms existing mainstream models on the R², RMSE, MAE, and MAPE indexes and has good accuracy, stability, and generalization ability. The proposed approach offers a novel and effective framework for deflection forecasting in complex bridge systems and holds significant potential for practical deployment in structural health monitoring and intelligent decision-making applications. Full article
(This article belongs to the Section Construction Management, and Computers & Digitization)

15 pages, 4886 KiB  
Article
A Research Study on an Entropy-Weighted Multi-View Fusion Approach for Agricultural WSN Data Based on Fuzzy Clustering
by Xun Wang and Xiaohu You
Electronics 2025, 14(12), 2424; https://doi.org/10.3390/electronics14122424 - 13 Jun 2025
Abstract
This study proposes an entropy-weighted multi-view collaborative fusion algorithm to address key challenges in agricultural Wireless Sensor Network (WSN) monitoring systems, including high redundancy in multi-modal data, low energy efficiency, and poor cross-parameter adaptability of traditional fusion methods. A fuzzy clustering framework based on principal property selection is established to enable dynamic compression of multi-source heterogeneous data at cluster head nodes. The algorithm innovatively distinguishes between principal and secondary properties based on their contributions to clustering. Clustering is performed using principal properties, allowing data from nodes with similar values to be fused into unified categories, thereby enhancing the reliability of environmental information. Experimental results show that, compared to existing agricultural WSN data fusion algorithms, the proposed method reduces fusion error by an average of 5.6%, lowers the computational complexity of the original multi-view algorithm, and is more suitable for small-sized, low-capacity sensor nodes. Moreover, it has better adaptability to multiple agricultural parameters, reduces network energy consumption, and improves computational accuracy. Full article
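Entropy weighting, in its standard formulation, assigns larger weights to properties whose values vary more across nodes, since near-constant readings carry little discriminating information. A minimal sketch of that textbook computation (the paper's exact weighting scheme may differ):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method over a nodes-by-properties matrix of
    positive readings: the lower a column's normalized entropy,
    the more information it carries and the larger its weight."""
    n = len(matrix)
    raw = []
    for col in zip(*matrix):
        total = sum(col)
        probs = [v / total for v in col]
        entropy = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n)
        raw.append(1.0 - entropy)
    s = sum(raw)
    return [w / s for w in raw]

# rows: sensor nodes; columns: temperature, humidity readings
readings = [[20.0, 55.0],
            [21.0, 10.0],
            [20.5, 90.0]]
w = entropy_weights(readings)
print(w)  # the widely varying humidity column dominates
```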

32 pages, 3240 KiB  
Review
From 6G to SeaX-G: Integrated 6G TN/NTN for AI-Assisted Maritime Communications—Architecture, Enablers, and Optimization Problems
by Anastasios Giannopoulos, Panagiotis Gkonis, Alexandros Kalafatelis, Nikolaos Nomikos, Sotirios Spantideas, Panagiotis Trakadas and Theodoros Syriopoulos
J. Mar. Sci. Eng. 2025, 13(6), 1103; https://doi.org/10.3390/jmse13061103 - 30 May 2025
Abstract
The rapid evolution of wireless communications has introduced new possibilities for the digital transformation of maritime operations. As 5G begins to take shape in selected nearshore and port environments, the forthcoming 6G promises to unlock transformative capabilities across the entire maritime domain, integrating Terrestrial/Non-Terrestrial Networks (TN/NTN) to form a space-air-ground-sea-underwater system. This paper presents a comprehensive review of how 6G-enabling technologies can be adapted to address the unique challenges of Maritime Communication Networks (MCNs). We begin by outlining a reference architecture for heterogeneous MCNs and reviewing the limitations of existing 5G deployments at sea. We then explore the key technical advancements introduced by 6G and map them to maritime use cases such as fleet coordination, just-in-time port logistics, and low-latency emergency response. Furthermore, the critical Artificial Intelligence/Machine Learning (AI/ML) concepts and algorithms are described to highlight their potential in optimizing maritime functionalities. Finally, we propose a set of resource optimization scenarios, including dynamic spectrum allocation, energy-efficient communications and edge offloading in MCNs, and discuss how AI/ML and learning-based methods can offer scalable, adaptive solutions. By bridging the gap between emerging 6G capabilities and practical maritime requirements, this paper highlights the role of intelligent, resilient, and globally connected networks in shaping the future of maritime communications. Full article
(This article belongs to the Section Ocean Engineering)

31 pages, 2799 KiB  
Article
A Cluster Head Selection Algorithm for Extending Last Node Lifetime in Wireless Sensor Networks
by Marcin Lewandowski and Bartłomiej Płaczek
Sensors 2025, 25(11), 3466; https://doi.org/10.3390/s25113466 - 30 May 2025
Abstract
This paper introduces a new cluster head selection algorithm for wireless sensor networks (WSNs) to maximize the time until the last sensor node depletes its energy. The algorithm is based on a formal analysis in which network lifetime is modeled as a function of node energy consumption. In contrast to existing energy-balancing strategies, this analytical foundation leads to a distinctive selection rule that prioritizes the node with the highest transmission probability and the lowest initial energy as the initial cluster head. The algorithm employs distributed per-cluster computation, enabling scalability without increasing complexity relative to network size. Unlike traditional approaches that rotate cluster heads based on time or equal energy use, our method adapts to heterogeneous energy consumption patterns and enforces a cluster head rotation order that maximizes the lifetime of the final active node. To validate the effectiveness of the proposed approach, we implement it on a real-world LoRaWAN-based sensor network prototype. Experimental results demonstrate that our method significantly extends the lifetime of the last active node compared to representative state-of-the-art algorithms. This research provides a practical and robust solution for energy-efficient WSN operation in real deployment scenarios by considering realistic and application-driven communication behavior along with hardware-level energy consumption. Full article
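The distinctive selection rule can be stated compactly: start the rotation at the node with the highest transmission probability, breaking ties toward the lowest initial energy, so the node that would otherwise die first serves earliest. A sketch with an illustrative node-tuple format (not the paper's data model):

```python
def initial_cluster_head(nodes):
    """Selection rule sketched from the paper's idea: within a
    cluster, start the rotation at the node with the highest
    transmission probability and, among ties, the lowest initial
    energy, so the cheapest-to-exhaust, busiest node serves first.

    Each node: (name, transmission_probability, initial_energy)."""
    return max(nodes, key=lambda n: (n[1], -n[2]))[0]

cluster = [("A", 0.2, 5.0),
           ("B", 0.8, 3.0),   # busiest and lowest-energy -> serves first
           ("C", 0.8, 4.0)]
print(initial_cluster_head(cluster))
```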

50 pages, 2715 KiB  
Review
Interference Mitigation Strategies in Beyond 5G Wireless Systems: A Review
by Osamah Thamer Hassan Alzubaidi, Salah Alheejawi, Mhd Nour Hindia, Kaharudin Dimyati and Kamarul Ariffin Noordin
Electronics 2025, 14(11), 2237; https://doi.org/10.3390/electronics14112237 - 30 May 2025
Abstract
Over the past few years, wireless communication has grown dramatically, and the consumer demand for wireless services has seen a significant jump. One of the main challenges for beyond fifth generation (B5G) networks is the increased capacity of the network. The continuously increasing number of network users and the limited radio spectrum in wireless technologies have led to severe congestion in communication channels. This issue leads to traffic congestion at base stations and introduces interference in the network, thereby degrading system capability and quality of service. Interference reduction has thus become a major design challenge in wireless communication systems. This review paper comprehensively explores interference management (IM) strategies in B5G networks. We critically analyze and summarize existing research on interference issues related to device-to-device communication, heterogeneous networks, inter-cell interference, and artificial intelligence (AI)-based frameworks. The paper reviews a wide range of methodologies, highlights the strengths and limitations of state-of-the-art approaches, and discusses standardized techniques such as power control, resource allocation, spectrum separation and mode selection, carrier aggregation, load balancing and cell range expansion, enhanced inter-cell interference coordination, coordinated scheduling and beamforming, coordinated multipoint, and AI-based interference prediction methods. A structured taxonomy and comparative summary are introduced to help categorize these techniques. Several related works based on their methodologies, shortcomings, and future directions have been critically reviewed. In addition, the paper identifies open research challenges and outlines key trends that are shaping future B5G IM systems. A comparative visualization is also provided to highlight dominant and underexplored optimization objectives across IM domains. This review serves as a valuable reference for researchers aiming to understand and evaluate current and emerging solutions for interference mitigation in B5G wireless systems. Full article
(This article belongs to the Special Issue Next-Generation Industrial Wireless Communication)
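Among the standardized techniques listed, power control has a classic distributed formulation (Foschini–Miljanic): each link scales its transmit power by the ratio of target to measured SINR, converging to the minimal feasible powers. A minimal sketch of that textbook iteration, not taken from the review:

```python
def foschini_miljanic(gains, noise, target_sinr, iters=200):
    """Classic distributed power control: each link scales its power by
    target_SINR / measured_SINR; when the target is feasible, this
    converges to the minimal powers that meet it on every link.
    gains[i][j] = channel gain from transmitter j to receiver i."""
    n = len(gains)
    p = [1.0] * n
    for _ in range(iters):
        new_p = []
        for i in range(n):
            interference = sum(gains[i][j] * p[j] for j in range(n) if j != i)
            sinr = gains[i][i] * p[i] / (interference + noise)
            new_p.append(p[i] * target_sinr / sinr)
        p = new_p
    return p

gains = [[1.0, 0.1],
         [0.2, 1.0]]
p = foschini_miljanic(gains, noise=0.1, target_sinr=2.0)
print(p)  # both links converge to powers meeting the 2.0 SINR target
```

The update needs only each link's locally measured SINR, which is why it is fully distributed.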