Search Results (673)

Search Parameters:
Keywords = low-earth-orbit satellite

22 pages, 3217 KiB  
Article
A Deep Reinforcement Learning Approach for Energy Management in Low Earth Orbit Satellite Electrical Power Systems
by Silvio Baccari, Elisa Mostacciuolo, Massimo Tipaldi and Valerio Mariani
Electronics 2025, 14(15), 3110; https://doi.org/10.3390/electronics14153110 - 5 Aug 2025
Abstract
Effective energy management in Low Earth Orbit satellites is critical, as inefficient energy management can significantly affect mission objectives. The dynamic and harsh space environment further complicates the development of effective energy management strategies. To address these challenges, we propose a Deep Reinforcement Learning approach using a Deep Q-Network to develop an adaptive energy management framework for Low Earth Orbit satellites. Compared to traditional techniques, the proposed solution autonomously learns from environmental interaction, offering robustness to uncertainty and online adaptability. It adjusts to changing conditions without manual retraining, making it well-suited for handling modeling uncertainties and non-stationary dynamics typical of space operations. Training is conducted using a realistic satellite electric power system model with accurate component parameters and single-orbit power profiles derived from real space missions. Numerical simulations validate the controller performance across diverse scenarios, including multi-orbit settings, demonstrating superior adaptability and efficiency compared to conventional Maximum Power Point Tracking methods.
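The abstract above builds on the standard Deep Q-Network update rule. A minimal, self-contained illustration of that rule follows — a tabular toy with invented states and rewards (e.g. discretized battery state-of-charge vs. load-switching actions), not the paper's satellite power model:

```python
import numpy as np

# Minimal sketch of the DQN temporal-difference update with a frozen target
# network. States, actions, and the reward are hypothetical placeholders.
rng = np.random.default_rng(0)
n_states, n_actions = 8, 3
q = np.zeros((n_states, n_actions))   # online Q estimates
q_target = q.copy()                   # periodically synced target network
gamma, alpha = 0.95, 0.1              # discount factor, learning rate

def dqn_update(s, a, r, s_next):
    """One temporal-difference step toward the frozen target value."""
    td_target = r + gamma * q_target[s_next].max()
    q[s, a] += alpha * (td_target - q[s, a])

# Train on random transitions of a toy reward (action 0 is best in every state).
for step in range(5000):
    s = rng.integers(n_states)
    a = rng.integers(n_actions)
    r = 1.0 if a == 0 else 0.0
    s_next = rng.integers(n_states)
    dqn_update(s, a, r, s_next)
    if step % 100 == 0:               # periodic target-network sync
        q_target[:] = q

greedy = q.argmax(axis=1)             # learned greedy policy per state
```

The frozen target network is what distinguishes DQN from plain Q-learning: bootstrapping against a slowly updated copy stabilizes training.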

31 pages, 3480 KiB  
Article
The First Step of AI in LEO SOPs: DRL-Driven Epoch Credibility Evaluation to Enhance Opportunistic Positioning Accuracy
by Jiaqi Yin, Feilong Li, Ruidan Luo, Xiao Chen, Linhui Zhao, Hong Yuan and Guang Yang
Remote Sens. 2025, 17(15), 2692; https://doi.org/10.3390/rs17152692 - 3 Aug 2025
Abstract
Low Earth orbit (LEO) signal of opportunity (SOP) positioning relies on the accumulation of epochs obtained through prolonged observation periods. The contribution of a single LEO satellite epoch to positioning accuracy is influenced by multi-level characteristics that are challenging for traditional models to capture. To address this limitation, we propose an Agent-Weighted Recursive Least Squares (RLS) Positioning Framework (AWR-PF). This framework employs an agent to comprehensively analyze individual epoch characteristics, assess their credibility, and convert them into adaptive weights for RLS iterations. We developed a novel Markov Decision Process (MDP) model to assist the agent in addressing the epoch weighting problem and trained the agent utilizing the Double Deep Q-Network (DDQN) algorithm on 107 h of Iridium signal data. Experimental validation on a separate 28 h Iridium signal test set through 97 positioning trials demonstrated that AWR-PF achieves superior average positioning accuracy compared to both standard RLS and randomly weighted RLS throughout nearly the entire iterative process. In a single positioning trial, AWR-PF improves positioning accuracy by up to 45.15% over standard RLS. To the best of our knowledge, this work represents the first instance where an AI algorithm is used as the core decision-maker in LEO SOP positioning, establishing a groundbreaking paradigm for future research.
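The numeric core being adapted here is a weighted recursive least-squares update, where a per-epoch credibility weight inflates or deflates the effective measurement variance. A sketch follows; the weights come from a made-up good/bad flag rather than the paper's DDQN agent, and the constant 2-state problem is purely illustrative:

```python
import numpy as np

# Weighted RLS sketch: epochs flagged unreliable get a small weight w,
# which inflates the effective measurement variance sigma2 / w.
def rls_step(x, P, H, z, w, sigma2=1.0):
    """One RLS update of state x with scalar measurement z = H @ x_true + noise."""
    R = sigma2 / max(w, 1e-6)
    S = H @ P @ H.T + R              # innovation variance
    K = (P @ P.T.dot(np.eye(len(x))) @ H.T) / S if False else (P @ H.T) / S
    x = x + K.flatten() * (z - H @ x)
    P = P - np.outer(K, H @ P)
    return x, P

rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0])
x = np.zeros(2)
P = np.eye(2) * 100.0                # diffuse prior
for _ in range(200):
    H = rng.normal(size=(1, 2))
    good = rng.random() < 0.8        # stand-in for the agent's credibility call
    noise = rng.normal(scale=0.1 if good else 5.0)
    z = H @ x_true + noise
    x, P = rls_step(x, P, H, z, w=1.0 if good else 0.05)
```

Down-weighting the noisy epochs keeps the estimate near the truth even though 20% of the measurements are badly corrupted.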
(This article belongs to the Special Issue LEO-Augmented PNT Service)

28 pages, 2841 KiB  
Article
A Multi-Constraint Co-Optimization LQG Frequency Steering Method for LEO Satellite Oscillators
by Dongdong Wang, Wenhe Liao, Bin Liu and Qianghua Yu
Sensors 2025, 25(15), 4733; https://doi.org/10.3390/s25154733 - 31 Jul 2025
Abstract
High-precision time–frequency systems are essential for low Earth orbit (LEO) navigation satellites to achieve real-time (RT) centimeter-level positioning services. However, subject to stringent size, power, and cost constraints, LEO satellites are typically equipped with oven-controlled crystal oscillators (OCXOs) as the system clock. The limited long-term stability of OCXOs leads to rapid clock error accumulation, severely degrading positioning accuracy. To simultaneously balance multi-dimensional requirements such as clock bias accuracy, frequency stability, and phase continuity, this study proposes a linear quadratic Gaussian (LQG) frequency precision steering method that integrates a four-dimensional constraint integrated (FDCI) model and hierarchical weight optimization. An improved system error model is derived to quantify the covariance components (Σ₁₁, Σ₂₂) of the LQG closed-loop control system. Then, based on the FDCI model that explicitly incorporates quantization noise, frequency adjustment, frequency stability, and clock bias variance, a priority-driven collaborative optimization mechanism systematically determines the weight matrices, ensuring a robust tradeoff among multiple performance criteria. Experiments on OCXO payload products, with micro-step actuation, demonstrate that the proposed method reduces the clock error RMS to 0.14 ns and achieves multi-timescale stability enhancement. The short-to-long-term frequency stability reaches 9.38 × 10⁻¹³ at 100 s, and long-term frequency stability is 4.22 × 10⁻¹⁴ at 10,000 s, an improvement of three orders of magnitude over a free-running OCXO. Compared to conventional PID control (clock bias RMS 0.38 ns) and pure Kalman filtering (stability 6.1 × 10⁻¹³ at 10,000 s), the proposed method reduces clock bias by 37% and improves stability by 93%. The impact of quantization noise on short-term stability (1–40 s) is contained within 13%. The principal novelty arises from the systematic integration of theoretical constraints and performance optimization within a unified framework. This approach comprehensively enhances the time–frequency performance of OCXOs, providing a low-cost, high-precision timing–frequency reference solution for LEO satellites.
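The steering half of an LQG loop reduces to a discrete-time Riccati recursion for the feedback gain, with the weight matrices encoding the tradeoff the abstract describes (clock bias vs. adjustment size). The two-state clock model and weights below are illustrative placeholders, not the paper's FDCI-derived values:

```python
import numpy as np

# LQR gain for a toy two-state clock model: x1 = phase (clock bias),
# x2 = frequency offset; control steers frequency. Weights are invented.
A = np.array([[1.0, 1.0],   # phase accumulates frequency each step
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])       # actuation acts on frequency only
Q = np.diag([10.0, 1.0])    # penalize clock bias more than frequency offset
R = np.array([[0.5]])       # penalty on adjustment size (phase-continuity proxy)

# Iterate the discrete-time Riccati recursion to a steady-state gain K.
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ A - A.T @ P @ B @ K

# Closed loop x+ = (A - B K) x should be stable: spectral radius < 1.
eigs = np.linalg.eigvals(A - B @ K)
```

Raising the bias weight in Q drives faster clock-error correction at the cost of larger frequency adjustments — the same lever the paper's priority-driven weight selection tunes systematically.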
(This article belongs to the Section Remote Sensors)

16 pages, 2357 KiB  
Article
Joint Traffic Prediction and Handover Design for LEO Satellite Networks with LSTM and Attention-Enhanced Rainbow DQN
by Dinghe Fan, Shilei Zhou, Jihao Luo, Zijian Yang and Ming Zeng
Electronics 2025, 14(15), 3040; https://doi.org/10.3390/electronics14153040 - 30 Jul 2025
Abstract
With the increasing scale of low Earth orbit (LEO) satellite networks, leveraging non-terrestrial networks (NTNs) to complement terrestrial networks (TNs) has become a critical issue. In this paper, we investigate the issue of handover satellite selection between multiple terrestrial terminal groups (TTGs). To support effective handover decision-making, we propose a long short-term memory (LSTM)-network-based traffic prediction mechanism that learns from historical traffic data. Building on these predictions, we formulate the handover strategy as a Markov Decision Process (MDP) and propose an attention-enhanced Rainbow-DQN-based joint traffic prediction and handover design framework (ARTHF) that jointly considers the satellite switching frequency, communication quality, and satellite load. Simulation results demonstrate that our approach significantly outperforms existing methods in terms of handover efficiency, service quality, and load balancing across satellites.

28 pages, 7048 KiB  
Article
Enhanced Conjunction Assessment in LEO: A Hybrid Monte Carlo and Spline-Based Method Using TLE Data
by Shafeeq Koheal Tealib, Ahmed Magdy Abdelaziz, Igor E. Molotov, Xu Yang, Jian Sun and Jing Liu
Aerospace 2025, 12(8), 674; https://doi.org/10.3390/aerospace12080674 - 28 Jul 2025
Abstract
The growing density of space objects in low Earth orbit (LEO), driven by the deployment of large satellite constellations, has elevated the risk of orbital collisions and the need for high-precision conjunction analysis. Traditional methods based on Two-Line Element (TLE) data suffer from limited accuracy and insufficient uncertainty modeling. This study proposes a hybrid collision assessment framework that combines Monte Carlo simulation, spline-based refinement of the time of closest approach (TCA), and a multi-stage deterministic refinement process. The methodology begins with probabilistic sampling of TLE uncertainties, followed by a coarse search for TCA using the SGP4 propagator. A cubic spline interpolation then enhances temporal resolution, and a hierarchical multi-stage refinement computes the final TCA and minimum distance with sub-second and sub-kilometer accuracy. The framework was validated using real-world TLE data from over 2600 debris objects and active satellites. Results demonstrated a reduction in average TCA error to 0.081 s and distance estimation error to 0.688 km. The approach is computationally efficient, with average processing times below one minute per conjunction event using standard hardware. Its compatibility with operational space situational awareness (SSA) systems and scalability for high-volume screening make it suitable for integration into real-time space traffic management workflows.
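The coarse-search-then-refine idea is easy to demonstrate: sample the inter-object distance on a coarse grid (a synthetic range profile stands in for SGP4 propagation here), then fit a cubic polynomial around the coarse minimum to localize the TCA well below the grid step. This is a sketch of the principle, not the paper's full multi-stage pipeline:

```python
import numpy as np

# Synthetic conjunction: true TCA at t_true, distance in km vs. time in s.
t_true = 537.3
dist = lambda t: np.sqrt(25.0 + 0.004 * (t - t_true) ** 2)  # toy range profile

# Stage 1: coarse search on a 10 s grid (stand-in for SGP4 sampling).
t_coarse = np.arange(0.0, 1000.0, 10.0)
i = int(np.argmin(dist(t_coarse)))

# Stage 2: cubic fit on the 4 samples bracketing the coarse minimum,
# centered at the coarse minimum for numerical conditioning.
t_win = t_coarse[i - 2 : i + 2]
tau = t_win - t_coarse[i]
c = np.polyfit(tau, dist(t_win), 3)
stationary = np.roots(np.polyder(c))            # stationary points of the cubic
stationary = stationary[np.isreal(stationary)].real
tau_min = stationary[np.argmin(np.polyval(c, stationary))]
t_refined = t_coarse[i] + tau_min               # sub-step TCA estimate
```

On this toy profile the coarse grid can only locate the TCA to within half a step (5 s), while the polynomial refinement recovers it to sub-second accuracy.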
(This article belongs to the Section Astronautics & Space Science)

21 pages, 4738 KiB  
Article
Research on Computation Offloading and Resource Allocation Strategy Based on MADDPG for Integrated Space–Air–Marine Network
by Haixiang Gao
Entropy 2025, 27(8), 803; https://doi.org/10.3390/e27080803 - 28 Jul 2025
Abstract
This paper investigates the problem of computation offloading and resource allocation in an integrated space–air–sea network based on unmanned aerial vehicles (UAVs) and low Earth orbit (LEO) satellites supporting Maritime Internet of Things (M-IoT) devices. Considering the complex, dynamic environment comprising M-IoT devices, UAVs, and LEO satellites, traditional optimization methods encounter significant limitations due to non-convexity and the combinatorial explosion in possible solutions. A multi-agent deep deterministic policy gradient (MADDPG)-based optimization algorithm is proposed to address these challenges. This algorithm is designed to minimize the total system cost, balancing energy consumption and latency through partial task offloading within a cloud–edge–device collaborative mobile edge computing (MEC) system. A comprehensive system model is proposed, with the problem formulated as a partially observable Markov decision process (POMDP) that integrates association control, power control, computing resource allocation, and task distribution. Each M-IoT device and UAV acts as an intelligent agent, collaboratively learning optimal offloading strategies through the centralized-training, decentralized-execution framework inherent in MADDPG. Numerical simulations validate the effectiveness of the proposed MADDPG-based approach, which converges rapidly and significantly outperforms baseline methods, reducing the total system cost by 15–60%.
(This article belongs to the Special Issue Space-Air-Ground-Sea Integrated Communication Networks)

20 pages, 5343 KiB  
Article
System-Level Assessment of Ka-Band Digital Beamforming Receivers and Transmitters Implementing Large Thinned Antenna Array for Low Earth Orbit Satellite Communications
by Giovanni Lasagni, Alessandro Calcaterra, Monica Righini, Giovanni Gasparro, Stefano Maddio, Vincenzo Pascale, Alessandro Cidronali and Stefano Selleri
Sensors 2025, 25(15), 4645; https://doi.org/10.3390/s25154645 - 26 Jul 2025
Abstract
In this paper, we present a system-level model of a digital multibeam antenna designed for Low Earth Orbit satellite communications operating in the Ka-band. We initially develop a suitable array topology, based on a thinned lattice, then adopt it as the foundation for evaluating its performance within a digital beamforming architecture. This architecture is implemented in a system-level simulator to evaluate the performance of the transmitter and receiver chains. This study advances the analysis of digital antennas by incorporating the non-idealities of both the RF front-end and digital sections into a digital-twin framework. This approach enhances the designer's ability to optimize the system holistically and provides insights into how various impairments affect transmitter and receiver performance, identifying the subsystems' parameter limits. To achieve this, we analyze several subsystem parameters and impairments, assessing their effects on both the antenna radiation and the quality of the transmitted and received signals in a realistic application context. The results reveal the sensitivity of the system to these impairments and suggest strategies to trade them off, emphasizing the importance of selecting appropriate subsystem features to optimize overall system performance.

22 pages, 3073 KiB  
Article
Research on Sliding-Window Batch Processing Orbit Determination Algorithm for Satellite-to-Satellite Tracking
by Yingjie Xu, Xuan Feng, Shuanglin Li, Jinghui Pu, Shixu Chen and Wenbin Wang
Aerospace 2025, 12(8), 662; https://doi.org/10.3390/aerospace12080662 - 25 Jul 2025
Abstract
In response to the increasing demand for high-precision navigation of satellites operating in cislunar space, this study introduces an onboard orbit determination algorithm considering both convergence and computational efficiency, referred to as the Sliding-Window Batch Processing (SWBP) algorithm. The algorithm combines the strengths of batch processing and sequential processing, utilizing measurement data from multiple historical epochs together with the current epoch to update the orbit state of the current epoch. This facilitates rapid convergence in orbit determination, even in instances where the initial orbit error is large. The SWBP algorithm has been used to evaluate the navigation performance in the Distant Retrograde Orbit (DRO) and the Earth–Moon transfer orbit. The scenario involves a low-Earth-orbit (LEO) satellite establishing satellite-to-satellite tracking (SST) links with both a DRO satellite and an Earth–Moon transfer satellite. The LEO satellite can determine its orbit accurately by receiving GNSS signals. The experiments show that the DRO satellite achieves an orbit determination accuracy of 100 m within 100 h under an initial position error of 500 km, and the transfer orbit satellite reaches an orbit determination accuracy of 600 m within 3.5 h under an initial position error of 100 km. When the Earth–Moon transfer satellite exhibits a large initial orbital error (on the order of hundreds of kilometers) or the LEO satellite's positional accuracy is degraded, the SWBP algorithm demonstrates superior convergence speed and precision in orbit determination compared to the Extended Kalman Filter (EKF). This confirms the proposed algorithm's capability to handle complex orbit determination scenarios effectively.
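The sliding-window batch idea — re-solving a least-squares problem jointly over the last W epochs instead of folding in one epoch at a time — can be illustrated on a toy linear problem. A constant 3-vector stands in for the orbit state; real orbit determination would relinearize the dynamics at each epoch:

```python
import numpy as np
from collections import deque

# Toy sliding-window batch estimator: keep the last W epochs of linear
# measurements and re-solve the joint least-squares problem each time.
rng = np.random.default_rng(2)
x_true = np.array([7000.0, 0.5, -3.0])   # hypothetical constant "orbit state"
W = 25
window = deque(maxlen=W)                 # (H, z) pairs of recent epochs

def swbp_solve(window):
    """Batch least squares over all epochs currently in the window."""
    H = np.vstack([h for h, _ in window])
    z = np.concatenate([zz for _, zz in window])
    x, *_ = np.linalg.lstsq(H, z, rcond=None)
    return x

for epoch in range(200):
    H_k = rng.normal(size=(2, 3))        # two scalar measurements per epoch
    z_k = H_k @ x_true + rng.normal(scale=0.01, size=2)
    window.append((H_k, z_k))            # deque drops the oldest epoch itself

x_hat = swbp_solve(window)
```

Because every solve uses the whole window, a bad prior never enters the estimate — which mirrors why the batch form tolerates large initial orbit errors better than a sequential filter seeded with them.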
(This article belongs to the Section Astronautics & Space Science)

37 pages, 11546 KiB  
Review
Advances in Interferometric Synthetic Aperture Radar Technology and Systems and Recent Advances in Chinese SAR Missions
by Qingjun Zhang, Huangjiang Fan, Yuxiao Qin and Yashi Zhou
Sensors 2025, 25(15), 4616; https://doi.org/10.3390/s25154616 - 25 Jul 2025
Abstract
With advancements in radar sensors, communications, and computer technologies, alongside an increasing number of ground observation tasks, Synthetic Aperture Radar (SAR) remote sensing is transitioning from being theory- and technology-driven to being application-demand-driven. Since the late 1960s, Interferometric Synthetic Aperture Radar (InSAR) theories and techniques have continued to develop and have been applied in many fields, such as the generation of global topography maps, monitoring of ground deformation, marine observation, and disaster reduction. This article classifies InSAR into repeat-pass interferometry and single-pass interferometry. Repeat-pass interferometry mainly includes D-InSAR, PS-InSAR, and SBAS-InSAR; single-pass interferometry mainly includes CT-InSAR and AT-InSAR. Recently, China has made significant progress in the field of SAR satellite development, successfully launching several satellites equipped with interferometric measurement capabilities. These advancements have driven the evolution of spaceborne InSAR systems from single-frequency to multi-frequency, from low Earth orbit to higher orbits, and from single-platform to multi-platform configurations. They have supported high-precision, high-temporal-resolution land observation and promoted the broader application of InSAR technology in disaster early warning, ecological monitoring, and infrastructure safety.

26 pages, 795 KiB  
Review
New Space Engineering Design: Characterization of Key Drivers
by Daniele Ferrara, Paolo Cicconi, Angelo Minotti, Michele Trovato and Antonio Casimiro Caputo
Appl. Sci. 2025, 15(15), 8138; https://doi.org/10.3390/app15158138 - 22 Jul 2025
Abstract
The recent evolution of the space industry, commonly referred to as New Space, has changed the way space missions are conceived, developed, and executed. In contrast to traditional approaches, the current paradigm emphasizes accessibility, commercial competitiveness, and rapid and sustainable innovation. This study proposes a research methodology for selecting relevant literature to identify the key design drivers and associated enablers that characterize the New Space context from an engineering design perspective. These elements are then organized into three categories: the evolution of traditional drivers, emerging manufacturing and integration practices, and sustainability and technology independence. This categorization highlights their role and relevance, providing a baseline for the development of systems for New Space missions. The results are further contextualized within three major application domains, namely Low Earth Orbit (LEO) small satellite constellations, operations and servicing in space, and space exploration, to illustrate their practical role in engineering space systems. By linking high-level industry trends to concrete design choices, this work aims to support the early design phases of New Space innovative systems and promote a more integrated approach between strategic objectives and technical development.

18 pages, 1411 KiB  
Article
A Framework for Joint Beam Scheduling and Resource Allocation in Beam-Hopping-Based Satellite Systems
by Jinfeng Zhang, Wei Li, Yong Li, Haomin Wang and Shilin Li
Electronics 2025, 14(14), 2887; https://doi.org/10.3390/electronics14142887 - 18 Jul 2025
Abstract
With the rapid development of heterogeneous satellite networks integrating geostationary earth orbit (GEO) and low earth orbit (LEO) satellite systems, along with the significant growth in the number of satellite users, it is essential to consider frequency compatibility and coexistence between GEO and LEO systems, as well as to design effective system resource allocation strategies to achieve efficient utilization of system resources. However, existing beam-hopping (BH) resource allocation algorithms in LEO systems primarily focus on beam scheduling within a single time slot, lacking unified beam management across the entire BH cycle, resulting in low beam-resource utilization. Moreover, existing algorithms often employ iterative optimization across multiple resource dimensions, leading to high computational complexity and imposing stringent requirements on satellite on-board processing capabilities. In this paper, we propose a BH-based beam scheduling and resource allocation framework. The proposed framework first employs geographic isolation to protect the GEO system from the interference of the LEO system and subsequently optimizes beam partitioning over the entire BH cycle, time-slot beam scheduling, and frequency and power resource allocation for users within the LEO system. The proposed scheme achieves frequency coexistence between the GEO and LEO satellite systems and performs joint optimization of system resources across four dimensions—time, space, frequency, and power—with reduced complexity and a progressive optimization framework. Simulation results demonstrate that the proposed framework achieves effective suppression of both intra-system and inter-system interference via geographic isolation, while enabling globally efficient and dynamic beam scheduling across the entire BH cycle. Furthermore, by integrating the user-level frequency and power allocation algorithm, the scheme significantly enhances the total system throughput. The proposed progressive optimization framework offers a promising direction for achieving globally optimal and computationally tractable resource management in future satellite networks.
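The cycle-level scheduling idea — budgeting illumination slots across the whole beam-hopping cycle in proportion to demand, rather than deciding each slot in isolation — can be sketched with a toy allocator. Demands, capacities, and the proportional rule are invented for illustration; the paper additionally handles GEO protection, frequency, and power:

```python
import numpy as np

# Toy cycle-level beam-hopping schedule. Five beam cells with queued demand (Mb);
# two beams may be lit per slot; each lit slot serves a fixed capacity.
demand = np.array([40.0, 10.0, 25.0, 5.0, 20.0])
cycle_slots = 20
active_per_slot = 2
per_slot_capacity = 5.0                     # Mb served per lit slot

# Budget slots over the full BH cycle in proportion to demand.
# (Here the floored budgets happen to sum exactly to the slot supply;
# in general leftover slots would need a redistribution step.)
total_lit = cycle_slots * active_per_slot
slots = np.floor(total_lit * demand / demand.sum()).astype(int)

# Greedy per-slot schedule honoring each cell's cycle budget:
# always light the cells with the most unserved budget remaining.
schedule = []
remaining = slots.astype(float).copy()
for _ in range(cycle_slots):
    lit = np.argsort(-remaining)[:active_per_slot]
    schedule.append(sorted(int(c) for c in lit))
    remaining[lit] -= 1

served = np.minimum(demand, slots * per_slot_capacity)
```

Because the budget is set over the entire cycle, heavily loaded cells automatically receive more slots — the per-slot greedy pass only decides *when*, not *how much*.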

15 pages, 547 KiB  
Article
Improvements in PPP by Integrating GNSS with LEO Satellites: A Geometric Simulation
by Marianna Alghisi, Nikolina Zallemi and Ludovico Biagi
Sensors 2025, 25(14), 4427; https://doi.org/10.3390/s25144427 - 16 Jul 2025
Abstract
The precise point positioning (PPP) method in GNSS is based on the processing of undifferenced phase observations. For long static sessions, this method provides results with accuracies better than one centimeter, and it has become standard practice in the processing of data from geodetic permanent stations. However, a drawback of the PPP method is its slow convergence, which results from the necessity of jointly estimating the coordinates and the initial phase ambiguities. This poses a challenge for very short sessions or kinematic applications. The introduction of new satellites in Low Earth Orbit (LEO) that provide phase observations for positioning, such as those currently provided by GNSS constellations, has the potential to radically improve this scenario. In this work, a preliminary case study is discussed. For a given day, two configurations are analyzed: the first considers only the GNSS satellites currently in operation, while the second includes a simulated constellation of LEO satellites. For both configurations, the geometric quality of a PPP solution is evaluated over different session lengths throughout the day. The adopted quality index is the trace of the cofactor matrix of the estimated coordinates, commonly referred to as the position dilution of precision (PDOP). The simulated LEO constellation demonstrates the capability to enhance positioning performance, particularly under conditions of good sky visibility, where the time needed to obtain a reliable solution decreases significantly. Furthermore, even in scenarios with limited satellite visibility, the inclusion of LEO satellites helps to reduce PDOP values and overall convergence time.
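The quality index used here is straightforward to compute from satellite geometry alone. A sketch with invented line-of-sight directions (not a real constellation) shows the mechanism: extra LEO rows in the design matrix can only shrink the cofactor matrix, so PDOP improves:

```python
import numpy as np

# PDOP from unit receiver-to-satellite vectors: each design-matrix row is
# [u_x, u_y, u_z, 1] (position terms plus a receiver-clock term), and PDOP is
# the root-trace of the position block of the cofactor matrix inv(A^T A).
def pdop(unit_vectors):
    A = np.hstack([np.asarray(unit_vectors), np.ones((len(unit_vectors), 1))])
    Q = np.linalg.inv(A.T @ A)          # cofactor matrix
    return float(np.sqrt(np.trace(Q[:3, :3])))

def unit(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

# Four high-elevation GNSS-like directions (poor vertical/clock separation)...
gnss = [unit(v) for v in ([1, 0, 2], [-1, 0, 3], [0, 1, 2], [0, -1, 4])]
# ...plus two low-elevation, geometry-diversifying LEO-like directions.
leo = [unit(v) for v in ([3, 3, 1], [-3, 2, 1])]

p_gnss = pdop(gnss)
p_aug = pdop(gnss + leo)
```

The improvement is guaranteed by the algebra: appending rows grows A^T A in the positive-semidefinite order, so its inverse (and hence PDOP) cannot increase.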
(This article belongs to the Special Issue Advances in GNSS Signal Processing and Navigation)

25 pages, 732 KiB  
Article
Accuracy-Aware MLLM Task Offloading and Resource Allocation in UAV-Assisted Satellite Edge Computing
by Huabing Yan, Hualong Huang, Zijia Zhao, Zhi Wang and Zitian Zhao
Drones 2025, 9(7), 500; https://doi.org/10.3390/drones9070500 - 16 Jul 2025
Abstract
This paper presents a novel framework for optimizing multimodal large language model (MLLM) inference through task offloading and resource allocation in UAV-assisted satellite edge computing (SEC) networks. MLLMs leverage transformer architectures to integrate heterogeneous data modalities for IoT applications, particularly real-time monitoring in remote areas. However, cloud computing dependency introduces latency, bandwidth, and privacy challenges, while IoT device limitations require efficient distributed computing solutions. SEC, utilizing low-Earth-orbit (LEO) satellites and unmanned aerial vehicles (UAVs), extends mobile edge computing to provide ubiquitous computational resources for remote IoT devices (IoTDs). We formulate the joint optimization of MLLM task offloading and resource allocation as a mixed-integer nonlinear programming (MINLP) problem, minimizing latency and energy consumption while optimizing offloading decisions, power allocation, and UAV trajectories. To address the dynamic SEC environment characterized by satellite mobility, we propose an action-decoupled soft actor–critic (AD-SAC) algorithm with discrete–continuous hybrid action spaces. The simulation results demonstrate that our approach significantly outperforms conventional deep reinforcement learning baselines in both convergence and system cost reduction.

15 pages, 2538 KiB  
Article
Parallel Eclipse-Aware Routing on FPGA for SpaceWire-Based OBC in LEO Satellite Networks
by Jin Hyung Park, Heoncheol Lee and Myonghun Han
J. Sens. Actuator Netw. 2025, 14(4), 73; https://doi.org/10.3390/jsan14040073 - 15 Jul 2025
Abstract
Low Earth orbit (LEO) satellite networks deliver superior real-time performance and responsiveness compared to conventional satellite networks, despite technical and economic challenges such as high deployment costs and operational complexity. Nevertheless, rapid topology changes and severe energy constraints of LEO satellites make real-time routing a persistent challenge. In this paper, we employ field-programmable gate arrays (FPGAs) to overcome the resource limitations of on-board computers (OBCs) and to manage energy consumption effectively using the Eclipse-Aware Routing (EAR) algorithm, and we implement the K-Shortest Paths (KSP) algorithm directly on the FPGA. Our method first generates multiple routes from the source to the destination using KSP, then selects the optimal path based on energy consumption rate, eclipse duration, and estimated transmission load as evaluated by EAR. In large-scale LEO networks, the computational burden of KSP grows substantially as connectivity data become more voluminous and complex. To enhance performance, we accelerate complex computations in the programmable logic (PL) via pipelining and design a collaborative architecture between the processing system (PS) and PL, achieving approximately a 3.83× speedup compared to a PS-only implementation. We validate the feasibility of the proposed approach by successfully performing remote routing-table updates on the SpaceWire-based SpaceWire Brick MK4 network system.
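The two-stage route selection — generate K shortest candidate paths, then rescore them with an eclipse-aware cost — can be sketched on a toy graph. The topology, link delays, and per-node eclipse penalties below are invented; the paper runs KSP in FPGA fabric and EAR on measured energy and eclipse durations, and a scalable KSP would use Yen's algorithm rather than brute-force enumeration:

```python
import heapq
import itertools

# Toy inter-satellite-link graph: node -> {neighbor: delay}.
graph = {
    "S": {"A": 1, "B": 2},
    "A": {"C": 2, "D": 3},
    "B": {"C": 1, "D": 4},
    "C": {"T": 2},
    "D": {"T": 1},
    "T": {},
}
# Invented per-node penalty standing in for eclipse/energy state.
eclipse_penalty = {"S": 0, "A": 5, "B": 0, "C": 0, "D": 1, "T": 0}

def k_shortest_paths(graph, src, dst, k):
    """Brute-force K shortest simple paths (fine for toy graphs only)."""
    found = []
    mids_pool = [v for v in graph if v not in (src, dst)]
    for n in range(len(graph)):
        for mids in itertools.permutations(mids_pool, n):
            path = (src, *mids, dst)
            try:
                delay = sum(graph[a][b] for a, b in zip(path, path[1:]))
            except KeyError:        # a hop in this permutation doesn't exist
                continue
            found.append((delay, list(path)))
    return heapq.nsmallest(k, found)

def ear_select(candidates):
    """Pick the candidate minimizing delay plus eclipse/energy penalty."""
    return min(candidates, key=lambda c: c[0] + sum(eclipse_penalty[v] for v in c[1]))

paths = k_shortest_paths(graph, "S", "T", k=3)
best_delay, best_path = paths[0]      # delay-only winner
ear_delay, ear_path = ear_select(paths)
```

Note the point of the second stage: the pure-delay winner routes through node "A", which carries a heavy eclipse penalty, so EAR prefers an equal-delay path that avoids it.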
(This article belongs to the Section Communications and Networking)

23 pages, 3056 KiB  
Article
Methodology for Evaluating Collision Avoidance Maneuvers Using Aerodynamic Control
by Desiree González Rodríguez, Pedro Orgeira-Crespo, Jose M. Nuñez-Ortuño and Fernando Aguado-Agelet
Remote Sens. 2025, 17(14), 2437; https://doi.org/10.3390/rs17142437 - 14 Jul 2025
Abstract
The increasing congestion of low Earth orbit (LEO) has raised the need for efficient collision avoidance strategies, especially for CubeSats without propulsion systems. This study proposes a methodology for evaluating passive collision avoidance maneuvers using aerodynamic control via a satellite's Attitude Determination and Control System (ADCS). By adjusting orientation, the satellite modifies its exposed surface area, altering atmospheric drag and lift forces to shift its orbit. This new approach integrates atmospheric modeling (NRLMSISE-00), aerodynamic coefficient estimation using the ADBSat panel method, and orbital simulations in Systems Tool Kit (STK). The LUME-1 CubeSat mission is used as a reference case, with simulations at three altitudes (500, 460, and 420 km). Results show that attitude-induced drag modulation can generate significant orbital displacements—measured by Horizontal and Vertical Distance Differences (HDD and VDD)—sufficient to reduce collision risk. Compared to constant-drag models, the panel method offers more accurate, orientation-dependent predictions. While lift forces are minor, their inclusion enhances modeling fidelity. This methodology supports the development of low-resource, autonomous collision avoidance systems for future CubeSat missions, particularly in remote sensing applications where orbital precision is essential.
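A back-of-the-envelope version of the drag-modulation idea: switching a CubeSat between minimum- and maximum-area attitudes changes the drag deceleration, and that difference integrates into an along-track separation. The density, areas, and mass below are rough order-of-magnitude values for ~500 km (not the paper's NRLMSISE-00/ADBSat outputs), and the double integration is only a first-order estimate that ignores orbital-dynamics coupling:

```python
# First-order estimate of attitude-induced along-track displacement.
# All parameter values are illustrative assumptions, not mission data.
rho = 1e-13                 # kg/m^3, thermospheric density (order of magnitude)
v = 7600.0                  # m/s, LEO orbital speed
cd = 2.2                    # drag coefficient
m = 4.0                     # kg, small-CubeSat-class mass
a_min, a_max = 0.01, 0.03   # m^2, min vs. max exposed cross-section

def drag_decel(area):
    """Drag deceleration a = 0.5 * rho * Cd * (A/m) * v^2."""
    return 0.5 * rho * cd * (area / m) * v ** 2

dt = 3 * 86400.0            # three days of sustained attitude offset
delta_a = drag_decel(a_max) - drag_decel(a_min)
separation = 0.5 * delta_a * dt ** 2   # kinematic double integration, meters
```

Even with these tiny accelerations (tens of nanometers per second squared), the quadratic growth in time yields an along-track shift on the order of a kilometer within a few days — the scale that makes propellantless avoidance maneuvers plausible.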
(This article belongs to the Special Issue Advances in CubeSat Missions and Applications in Remote Sensing)