Search Results (187)

Search Parameters:
Keywords = LEO satellite network

31 pages, 3480 KiB  
Article
The First Step of AI in LEO SOPs: DRL-Driven Epoch Credibility Evaluation to Enhance Opportunistic Positioning Accuracy
by Jiaqi Yin, Feilong Li, Ruidan Luo, Xiao Chen, Linhui Zhao, Hong Yuan and Guang Yang
Remote Sens. 2025, 17(15), 2692; https://doi.org/10.3390/rs17152692 - 3 Aug 2025
Abstract
Low Earth orbit (LEO) signal of opportunity (SOP) positioning relies on the accumulation of epochs obtained through prolonged observation periods. The contribution of a single LEO satellite epoch to positioning accuracy is influenced by multi-level characteristics that are challenging for traditional models to capture. To address this limitation, we propose an Agent-Weighted Recursive Least Squares (RLS) Positioning Framework (AWR-PF). This framework employs an agent to comprehensively analyze individual epoch characteristics, assess their credibility, and convert them into adaptive weights for RLS iterations. We developed a novel Markov Decision Process (MDP) model to cast the epoch weighting problem for the agent and trained the agent with the Double Deep Q-Network (DDQN) algorithm on 107 h of Iridium signal data. Experimental validation on a separate 28 h Iridium signal test set, spanning 97 positioning trials, demonstrated that AWR-PF achieves higher average positioning accuracy than both standard RLS and randomly weighted RLS throughout nearly the entire iterative process. In a single positioning trial, AWR-PF improves positioning accuracy by up to 45.15% over standard RLS. To the best of our knowledge, this work is the first to use an AI algorithm as the core decision-maker in LEO SOP positioning, establishing a new paradigm for future research.
(This article belongs to the Special Issue LEO-Augmented PNT Service)
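The numerical core described above is a credibility-weighted RLS iteration. As a rough illustration, the NumPy sketch below scales an epoch's influence by inflating its measurement-noise covariance with the agent-assigned weight; the noise-inflation scheme, shapes, and names are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def weighted_rls_step(x, P, H, z, R, w):
    """One credibility-weighted RLS iteration (illustrative sketch).

    x: state estimate (n,); P: covariance (n, n); H: measurement matrix (m, n);
    z: epoch measurement (m,); R: nominal noise covariance (m, m);
    w: agent-assigned credibility in (0, 1].
    """
    R_eff = R / max(w, 1e-6)                 # low credibility -> inflated noise
    S = H @ P @ H.T + R_eff                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # weighted gain
    x_new = x + K @ (z - H @ x)              # state update
    P_new = (np.eye(len(x)) - K @ H) @ P     # covariance update
    return x_new, P_new
```

An epoch the agent judges unreliable (w near 0) then barely moves the position fix, while a fully trusted epoch (w = 1) reduces to the standard RLS update.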

16 pages, 2357 KiB  
Article
Joint Traffic Prediction and Handover Design for LEO Satellite Networks with LSTM and Attention-Enhanced Rainbow DQN
by Dinghe Fan, Shilei Zhou, Jihao Luo, Zijian Yang and Ming Zeng
Electronics 2025, 14(15), 3040; https://doi.org/10.3390/electronics14153040 - 30 Jul 2025
Abstract
With the increasing scale of low Earth orbit (LEO) satellite networks, leveraging non-terrestrial networks (NTNs) to complement terrestrial networks (TNs) has become a critical issue. In this paper, we investigate handover satellite selection for multiple terrestrial terminal groups (TTGs). To support effective handover decision-making, we propose a long short-term memory (LSTM)-network-based traffic prediction mechanism that learns from historical traffic data. Building on these predictions, we formulate the handover strategy as a Markov Decision Process (MDP) and propose an attention-enhanced Rainbow-DQN-based joint traffic prediction and handover design framework (ARTHF) that jointly considers satellite switching frequency, communication quality, and satellite load. Simulation results demonstrate that our approach significantly outperforms existing methods in terms of handover efficiency, service quality, and load balancing across satellites.
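A minimal sketch of the prediction stage only, assuming a one-step horizon and arbitrary layer sizes; the Rainbow-DQN handover stage is not shown.

```python
import torch
import torch.nn as nn

class TrafficLSTM(nn.Module):
    """Next-step traffic predictor for terrestrial terminal groups (sketch)."""
    def __init__(self, n_groups, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_groups, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_groups)

    def forward(self, x):                 # x: (batch, window, n_groups)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # predicted traffic per group
```

In a setup like the paper's, these predictions would feed the MDP state on which the handover agent acts.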

21 pages, 4738 KiB  
Article
Research on Computation Offloading and Resource Allocation Strategy Based on MADDPG for Integrated Space–Air–Marine Network
by Haixiang Gao
Entropy 2025, 27(8), 803; https://doi.org/10.3390/e27080803 - 28 Jul 2025
Abstract
This paper investigates computation offloading and resource allocation in an integrated space–air–sea network based on unmanned aerial vehicles (UAVs) and low Earth orbit (LEO) satellites supporting Maritime Internet of Things (M-IoT) devices. In the complex, dynamic environment comprising M-IoT devices, UAVs, and LEO satellites, traditional optimization methods face significant limitations due to non-convexity and the combinatorial explosion of possible solutions. A multi-agent deep deterministic policy gradient (MADDPG)-based optimization algorithm is proposed to address these challenges. The algorithm minimizes the total system cost, balancing energy consumption and latency through partial task offloading within a cloud–edge–device collaborative mobile edge computing (MEC) system. A comprehensive system model is developed, with the problem formulated as a partially observable Markov decision process (POMDP) that integrates association control, power control, computing resource allocation, and task distribution. Each M-IoT device and UAV acts as an intelligent agent, collaboratively learning optimal offloading strategies through the centralized training and decentralized execution framework inherent in MADDPG. Numerical simulations validate the effectiveness of the proposed approach, which converges rapidly, significantly outperforms baseline methods, and reduces the total system cost by 15–60%.
(This article belongs to the Special Issue Space-Air-Ground-Sea Integrated Communication Networks)
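The defining structural feature of MADDPG is the split between decentralized actors and a centralized critic. A minimal PyTorch sketch of that split follows; observation/action dimensions and layer sizes are assumptions, and the replay buffer and training loop are omitted.

```python
import torch
import torch.nn as nn

class Actor(nn.Module):
    """Decentralized policy: one per M-IoT device or UAV, sees only local state."""
    def __init__(self, obs_dim, act_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                 nn.Linear(128, act_dim), nn.Tanh())

    def forward(self, obs):
        return self.net(obs)            # e.g., offload ratio, transmit power

class CentralCritic(nn.Module):
    """Centralized Q-function: during training it sees every agent's
    observation and action, which is what makes the scheme MADDPG."""
    def __init__(self, n_agents, obs_dim, act_dim):
        super().__init__()
        joint = n_agents * (obs_dim + act_dim)
        self.net = nn.Sequential(nn.Linear(joint, 256), nn.ReLU(),
                                 nn.Linear(256, 1))

    def forward(self, all_obs, all_act):   # flattened joint tensors
        return self.net(torch.cat([all_obs, all_act], dim=-1))
```

At execution time only the actors run, so each node decides from local observations alone.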

20 pages, 5343 KiB  
Article
System-Level Assessment of Ka-Band Digital Beamforming Receivers and Transmitters Implementing Large Thinned Antenna Array for Low Earth Orbit Satellite Communications
by Giovanni Lasagni, Alessandro Calcaterra, Monica Righini, Giovanni Gasparro, Stefano Maddio, Vincenzo Pascale, Alessandro Cidronali and Stefano Selleri
Sensors 2025, 25(15), 4645; https://doi.org/10.3390/s25154645 - 26 Jul 2025
Abstract
In this paper, we present a system-level model of a digital multibeam antenna designed for Low Earth Orbit satellite communications operating in the Ka-band. We first develop a suitable array topology based on a thinned lattice, then adopt it as the foundation for evaluating performance within a digital beamforming architecture. This architecture is implemented in a system-level simulator to evaluate the performance of the transmitter and receiver chains. The study advances the analysis of digital antennas by incorporating the non-idealities of both the RF front-end and the digital sections into a digital-twin framework. This approach lets the designer optimize the system holistically and provides insight into how various impairments affect transmitter and receiver performance, identifying the subsystems' parameter limits. To this end, we analyze several subsystem parameters and impairments, assessing their effects on both the antenna radiation and the quality of the transmitted and received signals in a realistic application context. The results reveal the system's sensitivity to the impairments and suggest strategies for trading them off, emphasizing the importance of selecting appropriate subsystem features to optimize overall system performance.
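To make the thinned-lattice idea concrete, here is a small NumPy experiment that thins a half-wavelength planar lattice at random and evaluates a broadside azimuth cut of the array factor; the 32x32 size and 70% fill ratio are arbitrary stand-ins, not the paper's layout.

```python
import numpy as np

# Broadside azimuth cut of a randomly thinned half-wavelength lattice.
rng = np.random.default_rng(42)
N, fill = 32, 0.70
mask = rng.random((N, N)) < fill                 # thinning mask
ix, _ = np.nonzero(mask)                         # element x-indices
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)  # scan angle, rad
kd = np.pi                                       # k * d for d = lambda / 2
af = np.abs(np.exp(1j * kd * np.sin(theta)[:, None] * ix).sum(axis=1))
af_db = 20 * np.log10(af / af.max())             # normalized pattern, dB
print(f"{mask.sum()} of {N * N} elements kept; peak sidelobe "
      f"{af_db[np.abs(theta) > 0.1].max():.1f} dB")
```

Thinning trades aperture efficiency and sidelobe level against hardware count, which is why the paper evaluates the lattice inside the full transmit/receive chain rather than in isolation.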

18 pages, 1411 KiB  
Article
A Framework for Joint Beam Scheduling and Resource Allocation in Beam-Hopping-Based Satellite Systems
by Jinfeng Zhang, Wei Li, Yong Li, Haomin Wang and Shilin Li
Electronics 2025, 14(14), 2887; https://doi.org/10.3390/electronics14142887 - 18 Jul 2025
Abstract
With the rapid development of heterogeneous satellite networks integrating geostationary Earth orbit (GEO) and low Earth orbit (LEO) satellite systems, and with the significant growth in the number of satellite users, it is essential to consider frequency compatibility and coexistence between GEO and LEO systems and to design resource allocation strategies that use system resources efficiently. However, existing beam-hopping (BH) resource allocation algorithms in LEO systems primarily focus on beam scheduling within a single time slot, lacking unified beam management across the entire BH cycle, which results in low beam-resource utilization. Moreover, existing algorithms often iterate optimization across multiple resource dimensions, leading to high computational complexity and stringent requirements on satellite on-board processing capabilities. In this paper, we propose a BH-based beam scheduling and resource allocation framework. The framework first employs geographic isolation to protect the GEO system from LEO-system interference and then optimizes beam partitioning over the entire BH cycle, time-slot beam scheduling, and frequency and power allocation for users within the LEO system. The proposed scheme achieves frequency coexistence between the GEO and LEO satellite systems and jointly optimizes system resources across four dimensions (time, space, frequency, and power) with reduced complexity in a progressive optimization framework. Simulation results demonstrate that the framework effectively suppresses both intra-system and inter-system interference via geographic isolation while enabling globally efficient, dynamic beam scheduling across the entire BH cycle. Furthermore, by integrating the user-level frequency and power allocation algorithm, the scheme significantly enhances total system throughput. The progressive optimization framework offers a promising direction for globally optimal and computationally tractable resource management in future satellite networks.
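As a toy illustration of slot-by-slot beam scheduling under spatial isolation, the sketch below greedily lights the highest-residual-demand cells each slot while enforcing a minimum inter-beam distance; unit per-slot capacity and the Euclidean-distance rule are simplifying assumptions, not the paper's scheme.

```python
import numpy as np

def greedy_bh_plan(demand, cell_pos, n_slots, k_beams, min_sep):
    """Toy beam-hopping plan over one BH cycle (illustrative only)."""
    residual = np.asarray(demand, dtype=float).copy()
    plan = []
    for _ in range(n_slots):
        lit = []
        for c in np.argsort(-residual):          # most demand first
            if len(lit) == k_beams:
                break
            # spatial isolation: skip cells too close to an already-lit beam
            if all(np.linalg.norm(cell_pos[c] - cell_pos[b]) >= min_sep
                   for b in lit):
                lit.append(int(c))
        residual[lit] -= 1.0                     # unit capacity per slot
        residual = residual.clip(min=0.0)
        plan.append(lit)
    return plan
```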

25 pages, 732 KiB  
Article
Accuracy-Aware MLLM Task Offloading and Resource Allocation in UAV-Assisted Satellite Edge Computing
by Huabing Yan, Hualong Huang, Zijia Zhao, Zhi Wang and Zitian Zhao
Drones 2025, 9(7), 500; https://doi.org/10.3390/drones9070500 - 16 Jul 2025
Abstract
This paper presents a novel framework for optimizing multimodal large language model (MLLM) inference through task offloading and resource allocation in UAV-assisted satellite edge computing (SEC) networks. MLLMs leverage transformer architectures to integrate heterogeneous data modalities for IoT applications, particularly real-time monitoring in remote areas. However, dependency on cloud computing introduces latency, bandwidth, and privacy challenges, while the limitations of IoT devices (IoTDs) call for efficient distributed computing solutions. SEC, built on low Earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs), extends mobile edge computing to provide ubiquitous computational resources for remote IoTDs. We formulate the joint optimization of MLLM task offloading and resource allocation as a mixed-integer nonlinear programming (MINLP) problem, minimizing latency and energy consumption while optimizing offloading decisions, power allocation, and UAV trajectories. To handle the dynamic SEC environment created by satellite mobility, we propose an action-decoupled soft actor–critic (AD-SAC) algorithm with a discrete–continuous hybrid action space. Simulation results demonstrate that our approach significantly outperforms conventional deep reinforcement learning baselines in both convergence and system cost reduction.
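The hybrid action space can be pictured as two decoupled policy heads, one categorical (where to offload) and one Gaussian (how much power). The sketch below shows that structure only; the three-way target set, layer sizes, and names are assumptions, and the SAC losses are not shown.

```python
import torch
import torch.nn as nn

class DecoupledHead(nn.Module):
    """Discrete-continuous hybrid action head, in the spirit of action
    decoupling (illustrative sketch, not the paper's architecture)."""
    def __init__(self, feat_dim, n_targets=3):
        super().__init__()
        self.logits = nn.Linear(feat_dim, n_targets)  # local / UAV / satellite
        self.mu = nn.Linear(feat_dim, 1)              # transmit-power mean
        self.log_std = nn.Linear(feat_dim, 1)         # transmit-power spread

    def forward(self, feat):
        target_dist = torch.distributions.Categorical(logits=self.logits(feat))
        std = self.log_std(feat).clamp(-5.0, 2.0).exp()
        power_dist = torch.distributions.Normal(self.mu(feat), std)
        return target_dist, power_dist
```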

15 pages, 2538 KiB  
Article
Parallel Eclipse-Aware Routing on FPGA for SpaceWire-Based OBC in LEO Satellite Networks
by Jin Hyung Park, Heoncheol Lee and Myonghun Han
J. Sens. Actuator Netw. 2025, 14(4), 73; https://doi.org/10.3390/jsan14040073 - 15 Jul 2025
Abstract
Low Earth orbit (LEO) satellite networks deliver superior real-time performance and responsiveness compared to conventional satellite networks, despite technical and economic challenges such as high deployment costs and operational complexity. Nevertheless, rapid topology changes and severe energy constraints make real-time routing on LEO satellites a persistent challenge. In this paper, we employ field-programmable gate arrays (FPGAs) to overcome the resource limitations of on-board computers (OBCs), manage energy consumption with the Eclipse-Aware Routing (EAR) algorithm, and implement the K-Shortest Paths (KSP) algorithm directly on the FPGA. Our method first generates multiple routes from source to destination using KSP, then selects the optimal path based on the energy consumption rate, eclipse duration, and estimated transmission load evaluated by EAR. In large-scale LEO networks, the computational burden of KSP grows substantially as connectivity data become more voluminous and complex. To enhance performance, we accelerate the complex computations in the programmable logic (PL) via pipelining and design a collaborative architecture between the processing system (PS) and the PL, achieving approximately a 3.83× speedup over a PS-only implementation. We validate the feasibility of the proposed approach by successfully performing remote routing-table updates on the SpaceWire-based SpaceWire Brick MK4 network system.
(This article belongs to the Section Communications and Networking)
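Functionally, KSP-then-EAR amounts to generating candidate paths and scoring them on energy, eclipse, and load terms. A software sketch using networkx's Yen-style path generator follows; the node attributes ('energy_rate', 'eclipse_frac', 'load') and cost weights are assumptions, and the paper's PS/PL hardware split is of course not modeled.

```python
import itertools
import networkx as nx

def eclipse_aware_route(G, src, dst, k=5, wts=(0.5, 0.3, 0.2)):
    """Generate k shortest candidate paths, then pick the one with the
    lowest eclipse-aware cost (illustrative sketch)."""
    cands = itertools.islice(
        nx.shortest_simple_paths(G, src, dst, weight="delay"), k)

    def ear_cost(path):
        a, b, c = wts
        return sum(a * G.nodes[n]["energy_rate"]
                   + b * G.nodes[n]["eclipse_frac"]
                   + c * G.nodes[n]["load"] for n in path)

    return min(cands, key=ear_cost)
```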

17 pages, 1184 KiB  
Article
A Biologically Inspired Cost-Efficient Zero-Trust Security Approach for Attacker Detection and Classification in Inter-Satellite Communication Networks
by Sridhar Varadala and Hao Xu
Future Internet 2025, 17(7), 304; https://doi.org/10.3390/fi17070304 - 13 Jul 2025
Abstract
In next-generation Low-Earth-Orbit (LEO) satellite networks, securing inter-satellite communication links (ISLs) through strong authentication is essential because of the network's dynamic and distributed structure. Traditional authentication systems often struggle in these environments, leading to the adoption of Zero-Trust Security (ZTS) models. However, current ZTS protocols typically introduce high computational overhead, especially as the number of satellite nodes grows, which can degrade both security and network performance. To overcome these challenges, a new bio-inspired ZTS framework, Manta Ray Foraging Cost-Optimized Zero-Trust Security (MRFCO-ZTS), is introduced that uses data-driven learning to enhance security across satellite communications. It continuously evaluates access requests by applying a cost function that accounts for risk level, likelihood of attack, and computational delay. The Manta Ray Foraging Optimization (MRFO) algorithm minimizes this cost, enabling effective classification of nodes as trusted or malicious based on historical authentication records and real-time behavior. MRFCO-ZTS improves the accuracy of attacker detection while maintaining secure data exchange between authenticated satellites. Its effectiveness is evaluated through numerical simulations under different satellite traffic conditions, with performance measured in terms of security accuracy, latency, and operational efficiency.
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems, 2nd Edition)
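For readers unfamiliar with MRFO, the sketch below shows a simplified version of the optimizer itself (chain foraging plus the somersault step; cyclone foraging omitted for brevity), minimizing a generic cost function. Population size, iteration count, and bounds are arbitrary; the paper's actual risk/attack-likelihood/delay cost is not reproduced here.

```python
import numpy as np

def mrfo_minimize(cost, dim, n=20, iters=200, lo=0.0, hi=1.0, seed=0):
    """Simplified Manta Ray Foraging Optimization (sketch)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n, dim))
    fit = np.apply_along_axis(cost, 1, X)
    best, best_f = X[fit.argmin()].copy(), fit.min()
    for _ in range(iters):
        for i in range(n):                    # chain foraging toward the leader
            r = rng.random(dim)
            alpha = 2.0 * r * np.sqrt(np.abs(np.log(rng.random())))
            prev = X[i - 1] if i > 0 else best
            X[i] = X[i] + r * (prev - X[i]) + alpha * (best - X[i])
        r2, r3 = rng.random((n, dim)), rng.random((n, dim))
        X = np.clip(X + 2.0 * (r2 * best - r3 * X), lo, hi)  # somersault
        fit = np.apply_along_axis(cost, 1, X)
        if fit.min() < best_f:
            best, best_f = X[fit.argmin()].copy(), fit.min()
    return best, best_f
```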

17 pages, 2103 KiB  
Article
Optimizing Time-Sensitive Traffic Scheduling in Low-Earth-Orbit Satellite Networks
by Wei Liu, Nan Xiao, Bo Liu, Yuxian Zhang and Taoyong Li
Sensors 2025, 25(14), 4327; https://doi.org/10.3390/s25144327 - 10 Jul 2025
Abstract
In contrast to terrestrial networks, the rapid movement of low-earth-orbit (LEO) satellites causes frequent changes in the topology of intersatellite links (ISLs), resulting in dynamic shifts in transmission paths and fluctuations in multi-hop latency. Moreover, limited onboard resources, such as buffer capacity, and competition for bandwidth contribute to the instability of these links. As a result, providing reliable quality of service (QoS) for time-sensitive flows (TSFs) in LEO satellite networks is challenging. Traditional terrestrial time-sensitive networking methods, which depend on fixed paths and static priority scheduling, are ill-equipped to handle the dynamic nature and resource constraints of satellite environments; the result is congestion, packet loss, and excessive latency, especially for high-priority TSFs. This study addresses these challenges and introduces a software-defined networking (SDN)-based management framework tailored for LEO satellites. An advanced queue management and scheduling scheme, influenced by terrestrial time-sensitive networking approaches, is developed. By incorporating differentiated forwarding strategies and priority-based classification, the proposed method improves the efficiency of time-sensitive traffic transmission at multiple levels. Simulations under various workloads show that the scheme significantly boosts network throughput, reduces packet loss, and maintains low latency, optimizing the performance of time-sensitive traffic in LEO satellite networks.
(This article belongs to the Section Communications)
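The classification-plus-priority idea reduces, at its simplest, to per-class egress queues drained in strict priority order. A minimal sketch, assuming three classes with class 0 for time-sensitive flows; the SDN control plane and per-hop shaping are not modeled.

```python
from collections import deque

class StrictPriorityScheduler:
    """Strict-priority egress queueing: class 0 always drains first."""
    def __init__(self, n_classes=3):
        self.queues = [deque() for _ in range(n_classes)]

    def enqueue(self, pkt, cls):
        self.queues[cls].append(pkt)

    def dequeue(self):
        for q in self.queues:        # highest priority first
            if q:
                return q.popleft()
        return None                  # link idle
```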

23 pages, 2431 KiB  
Article
SatScope: A Data-Driven Simulator for Low-Earth-Orbit Satellite Internet
by Qichen Wang, Guozheng Yang, Yongyu Liang, Chiyu Chen, Qingsong Zhao and Sugai Chen
Future Internet 2025, 17(7), 278; https://doi.org/10.3390/fi17070278 - 24 Jun 2025
Abstract
The rapid development of low-Earth-orbit (LEO) satellite constellations has not only provided global users with low-latency, unrestricted high-speed data services but also confronted researchers with the challenge of understanding dynamic changes in global network behavior. Unlike geostationary satellites and terrestrial internet infrastructure, LEO satellites move at roughly 7.6 km/s relative to the ground, leading to frequent changes in their connectivity with ground stations. Given the complexity of the space environment, current research on LEO satellite internet relies primarily on modeling and simulation, yet existing LEO satellite network simulators often overlook the global network characteristics of these systems. We present SatScope, a data-driven simulator for LEO satellite internet. SatScope consists of three main components: space segment modeling, ground segment modeling, and network simulation configuration, with an interface through which researchers interact with these models. Using both the space and ground segment models, SatScope can configure various network topology models, routing algorithms, and load balancing schemes, enabling the evaluation of optimization algorithms for LEO satellite communication systems. We also compare SatScope's fidelity, lightweight design, scalability, and openness against other simulators. Based on simulation results from SatScope, we propose two metrics, ground node IP coverage rate and the number of satellite service IPs, to assess the service performance of single-layer satellite networks. Our findings reveal that during each network handover, on average, 38.94% of nodes and 83.66% of links change.
(This article belongs to the Section Internet of Things)
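A per-handover link-change statistic of the kind quoted above can be computed from consecutive topology snapshots. A minimal sketch, assuming undirected links stored as frozensets of node IDs (the node IDs here are placeholders):

```python
def change_fraction(prev_links, next_links):
    """Fraction of links that appear or disappear between two snapshots."""
    union = prev_links | next_links
    changed = union - (prev_links & next_links)
    return len(changed) / len(union) if union else 0.0

# Two snapshots one handover apart:
t0 = {frozenset(p) for p in [("s1", "s2"), ("s2", "s3"), ("s3", "g1")]}
t1 = {frozenset(p) for p in [("s1", "s2"), ("s2", "s4"), ("s4", "g1")]}
print(f"{change_fraction(t0, t1):.0%} of links changed")   # 80%
```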

18 pages, 2521 KiB  
Article
A Doppler Frequency-Offset Estimation Method Based on the Beam Pointing of LEO Satellites
by Yanjun Song, Jun Xu, Chenhua Sun, Xudong Li and Shaoyi An
Electronics 2025, 14(13), 2539; https://doi.org/10.3390/electronics14132539 - 23 Jun 2025
Abstract
With the advancement of 5G-Advanced Non-Terrestrial Network (5G-A NTN) mobile communication technologies, direct satellite connectivity for mobile devices has been increasingly adopted. In the highly dynamic environment of low-Earth-orbit (LEO) satellite communications, synchronizing satellite–ground signals remains a critical challenge. In this study, a Doppler frequency-shift estimation method for high-mobility LEO scenarios is proposed that does not rely on the Global Navigation Satellite System (GNSS), enabling mobile devices to access satellite systems rapidly without additional time–frequency synchronization infrastructure. The generation mechanism of satellite–ground Doppler frequency shifts is analyzed, and a relationship between satellite velocity and beam-pointing direction is established. Based on this relationship, a Doppler frequency-shift estimation method, DFS-BP (Doppler frequency-shift estimation using beam pointing), is developed, and the effects of the Earth's latitude and the satellite orbital inclination are systematically investigated and optimized. Simulations evaluate the estimation performance under varying minimum satellite elevation angles and terminal geographic locations. The algorithm may provide a novel solution for Doppler frequency-shift compensation in Non-Terrestrial Networks (NTNs).
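The geometric relation underlying a beam-pointing estimate is that the Doppler shift is set by the projection of the satellite velocity onto the beam direction. A worked example, assuming an S-band carrier and typical LEO orbital speed (both placeholder values, not the paper's parameters):

```python
import numpy as np

C = 299_792_458.0      # speed of light, m/s
FC = 2.0e9             # carrier frequency; an assumed S-band value
V_SAT = 7_560.0        # typical LEO orbital speed, m/s

def doppler_from_beam(beam_dir, vel_dir):
    """Doppler shift along a beam: project the satellite velocity onto the
    beam-pointing unit vector (both arguments are unit vectors)."""
    return FC * V_SAT * float(np.dot(vel_dir, beam_dir)) / C

vel = np.array([1.0, 0.0, 0.0])                           # along-track
beam = np.array([np.cos(np.radians(60)), np.sin(np.radians(60)), 0.0])
print(f"{doppler_from_beam(beam, vel) / 1e3:+.1f} kHz")   # about +25.2 kHz
```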

25 pages, 4360 KiB  
Article
Positioning-Based Uplink Synchronization Method for NB-IoT in LEO Satellite Networks
by Qiang Qi, Tao Hong and Gengxin Zhang
Symmetry 2025, 17(7), 984; https://doi.org/10.3390/sym17070984 - 21 Jun 2025
Abstract
With the growth of Internet of Things (IoT) business demands, NB-IoT integrated with low Earth orbit (LEO) satellite communication systems is considered a crucial component for achieving global IoT coverage in the future. However, the long propagation delay and large Doppler frequency shift of the satellite-to-ground link pose substantial challenges to uplink and downlink synchronization in LEO satellite-based NB-IoT networks. To address this challenge, we first propose a Multiple Segment Auto-correlation (MSA) algorithm, tailored to the large Doppler frequency shifts of LEO satellites, to detect the downlink Narrowband Primary Synchronization Signal (NPSS). After detection, downlink synchronization is achieved by determining the arrival time and frequency of the NPSS. Then, to complete uplink synchronization, we propose a position-based scheme that obtains the Timing Advance (TA) values and the pre-compensated Doppler shift value. In this scheme, we formulate a time difference of arrival (TDOA) equation using the arrival times of NPSSs from different satellites, or from the same satellite at different times, as observations. After solving the TDOA equation with the Chan method, uplink synchronization is completed by deriving the TA values and pre-compensated Doppler shift value from the terminal position combined with satellite ephemeris. Finally, the feasibility of the proposed scheme is verified on an Iridium satellite constellation. Compared to conventional GNSS-assisted methods, the proposed approach reduces terminal power consumption by 15–40% and achieves an uplink synchronization success rate of over 98% under negative SNR conditions.
(This article belongs to the Special Issue Symmetry/Asymmetry in Future Wireless Networks)
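The TDOA step can be sketched as follows: differencing arrival times against one reference satellite cancels the unknown transmit epoch, leaving range-difference equations in the terminal position. The paper solves these in closed form with the Chan method; the iterative least-squares solver below is a stand-in, with all inputs assumed.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0   # m/s

def tdoa_position(sat_pos, toa, x0):
    """Terminal position from NPSS arrival times (illustrative sketch).

    sat_pos: (m, 3) satellite positions at the NPSS arrival instants;
    toa: (m,) arrival times in seconds; x0: initial position guess.
    """
    sat_pos, toa = np.asarray(sat_pos), np.asarray(toa)

    def residual(x):
        d = np.linalg.norm(sat_pos - x, axis=1)     # geometric ranges
        return (d - d[0]) - C * (toa - toa[0])      # range-difference misfit

    return least_squares(residual, x0).x
```

With the position fixed, the TA and Doppler pre-compensation follow directly from the satellite ephemeris, as the abstract describes.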

17 pages, 474 KiB  
Article
User Experience-Oriented Content Caching for Low Earth Orbit Satellite-Enabled Mobile Edge Computing Networks
by Jianhua He, Youhan Zhao, Yonghua Ma and Qiang Wang
Electronics 2025, 14(12), 2413; https://doi.org/10.3390/electronics14122413 - 13 Jun 2025
Abstract
In this paper, we investigate a low Earth orbit (LEO) satellite-enabled mobile edge computing (MEC) network in which multiple cache-enabled LEO satellites are deployed to serve heterogeneous content requests from ground users. To evaluate the network's capability to meet user demands, we adopt the users' average quality of experience (QoE) as the performance metric, defined from the effective transmission rate under communication interference. Our analysis reveals that the average QoE is determined by the content caching decisions at the satellites, allowing us to formulate an average QoE maximization problem subject to practical constraints on satellite caching capacity. To tackle this NP-hard problem, we design a two-stage content caching algorithm that combines divide-and-conquer and greedy policies for an efficient solution. Numerical results validate the effectiveness of the proposed approach: compared with several benchmark schemes, our algorithm achieves notable improvements in average QoE while significantly reducing caching costs, particularly under resource-constrained satellite settings.
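The greedy stage of such a placement can be pictured as ranking contents by marginal QoE gain per unit size and filling the cache in that order. A minimal sketch, assuming per-content sizes and gains are given as dicts (a stand-in for the paper's two-stage algorithm, not its exact policy):

```python
def greedy_cache(sizes, qoe_gain, capacity):
    """Cache items in order of QoE gain per unit size until full."""
    cached, used = [], 0
    for cid in sorted(sizes, key=lambda c: qoe_gain[c] / sizes[c],
                      reverse=True):
        if used + sizes[cid] <= capacity:
            cached.append(cid)
            used += sizes[cid]
    return cached
```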

19 pages, 1821 KiB  
Article
Mitigating DDoS Attacks in LEO Satellite Networks Through Bottleneck Minimize Routing
by Fangzhou Meng, Xiaodan Yan, Yuanjian Zhang, Jian Yang, Ang Cao, Ruiqi Liu and Yongli Zhao
Electronics 2025, 14(12), 2376; https://doi.org/10.3390/electronics14122376 - 10 Jun 2025
Abstract
In this paper, we focus on defending against distributed denial-of-service (DDoS) attacks in a low-earth-orbit (LEO) satellite network (LSN). To enhance the security of LSNs, we propose the K-Bottleneck Minimize routing method. The algorithm ensures path diversity while avoiding vulnerable bottleneck paths, which significantly raises the cost to attackers and makes them easier to detect. The results show that the algorithm avoids bottleneck paths that are vulnerable to attack, increases the attacker's cost by about 13.1% on average and 16.6% in median, and improves the detectability of attackers by 48.5% on average and 45.4% in median. By generating multiple non-overlapping inter-satellite paths, the algorithm prevents the exploitation of bottleneck paths and ensures better robustness and attack resistance.
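One simple way to obtain non-overlapping inter-satellite paths is to remove each chosen path's links before searching for the next, so candidates cannot share a bottleneck ISL. The networkx sketch below is a stand-in for K-Bottleneck Minimize routing, not the paper's algorithm; the 'delay' edge weight is an assumption.

```python
import networkx as nx

def non_overlapping_paths(G, src, dst, k=3):
    """Up to k edge-disjoint paths between src and dst (sketch)."""
    H = G.copy()
    paths = []
    for _ in range(k):
        try:
            p = nx.shortest_path(H, src, dst, weight="delay")
        except nx.NetworkXNoPath:
            break
        paths.append(p)
        H.remove_edges_from(zip(p, p[1:]))   # forbid reuse of these links
    return paths
```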

21 pages, 676 KiB  
Article
Service-Driven Dynamic Beam Hopping with Resource Allocation for LEO Satellites
by Huaixiu Xu, Lilan Liu and Zhizhong Zhang
Electronics 2025, 14(12), 2367; https://doi.org/10.3390/electronics14122367 - 10 Jun 2025
Abstract
Given the uneven distribution and strong time variability of ground service demands and the low utilization of on-board resources in Low-Earth-Orbit (LEO) satellite communication systems, efficiently using limited beam resources to serve ground users flexibly and dynamically has become a research hotspot. This paper studies dynamic resource allocation and interference suppression strategies for beam-hopping satellite communication systems. Specifically, in the full-frequency-reuse scenario, we adopt spatial isolation to avoid co-channel interference between beams and construct a multi-objective optimization problem with weight coefficients, aiming to simultaneously maximize user satisfaction and minimize transmission delay. We model this optimization problem as a Markov decision process and apply a value decomposition network (VDN) algorithm based on cooperative multi-agent reinforcement learning (MARL-VDN) to reduce computational complexity. In this framework, each beam acts as an agent that independently decides its hopping pattern and power allocation, while multi-agent cooperative optimization is achieved by sharing global states and a joint reward. Simulation results show that the algorithm effectively enhances user satisfaction, reduces delay, and maintains high resource utilization under dynamic service demands. Additionally, the offline-trained MARL-VDN model can be deployed on LEO satellites in a distributed mode to perform real-time, on-demand on-board resource allocation.
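The defining VDN trick is that the team Q-value is the plain sum of per-agent Q-values, so each beam agent can act greedily on its own head at execution time. A minimal PyTorch sketch of that decomposition; network sizes are assumptions and the training loop is not shown.

```python
import torch
import torch.nn as nn

class VDNMixer(nn.Module):
    """Value decomposition over beam agents: Q_tot = sum of per-agent Qs."""
    def __init__(self, n_agents, obs_dim, n_actions):
        super().__init__()
        self.agents = nn.ModuleList([
            nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                          nn.Linear(64, n_actions))
            for _ in range(n_agents)])

    def forward(self, obs, actions):
        # obs: (batch, n_agents, obs_dim); actions: (batch, n_agents) int64
        qs = torch.stack([net(obs[:, i])
                          for i, net in enumerate(self.agents)], dim=1)
        chosen = qs.gather(2, actions.unsqueeze(-1)).squeeze(-1)
        return chosen.sum(dim=1)     # joint Q_tot for the TD loss
```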
