Search Results (513)

Search Parameters:
Keywords = low Earth orbit (LEO) satellite

21 pages, 3270 KB  
Article
Reliability Case Study of COTS Storage on the Jilin-1 KF Satellite: On-Board Operations, Failure Analysis, and Closed-Loop Management
by Chunjuan Zhao, Jianan Pan, Hongwei Sun, Xiaoming Li, Kai Xu, Yang Zhao and Lei Zhang
Aerospace 2026, 13(2), 116; https://doi.org/10.3390/aerospace13020116 - 24 Jan 2026
Abstract
In recent years, the rapid development of commercial satellite projects, such as low-Earth orbit (LEO) communication and remote sensing constellations, has driven the satellite industry toward low-cost, rapid development, and large-scale deployment. Commercial off-the-shelf (COTS) components have been widely adopted across various commercial satellite platforms due to their advantages of low cost, high performance, and plug-and-play availability. However, the space environment is complex and hostile. COTS components were not originally designed for such conditions, and they often lack systematically flight-verified protective frameworks, making their reliability issues a core bottleneck limiting their extensive application in critical missions. This paper focuses on COTS solid-state drives (SSDs) onboard the Jilin-1 KF satellite and presents a full-lifecycle reliability practice covering component selection, system design, on-orbit operation, and failure feedback. The core contribution lies in proposing a full-lifecycle methodology that integrates proactive design—including multi-module redundancy architecture and targeted environmental stress screening—with on-orbit data monitoring and failure cause analysis. Through fault tree analysis, on-orbit data mining, and statistical analysis, it was found that SSD failures show a significant correlation with high-energy particle radiation in the South Atlantic Anomaly region. Building on this key spatial correlation, the on-orbit failure mode was successfully reproduced via proton irradiation experiments, confirming the mechanism of radiation-induced SSD damage and providing a basis for subsequent model development and management decisions. The study demonstrates that although individual COTS SSDs exhibit a certain failure rate, reasonable design, protection, and testing can enhance the on-orbit survivability of storage systems using COTS components. 
More broadly, by providing a validated closed-loop paradigm—encompassing design, flight verification and feedback, and iterative improvement—we enable the reliable use of COTS components in future cost-sensitive, high-performance satellite missions, adopting system-level solutions to balance cost and reliability without being confined to expensive radiation-hardened products. Full article
(This article belongs to the Section Astronautics & Space Science)
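The multi-module redundancy architecture described in the abstract above can be made concrete with a standard k-of-n reliability model. This is a hedged sketch: the module survival probabilities and the 2-of-4 configuration are illustrative assumptions, not figures from the paper.

```python
from math import comb

def k_of_n_reliability(n, k, p_module):
    """P(at least k of n independent modules survive), each surviving with p."""
    return sum(comb(n, m) * p_module**m * (1 - p_module)**(n - m)
               for m in range(k, n + 1))

# A lone COTS SSD vs. a 2-of-4 redundant array (p = 0.90 is assumed):
single = k_of_n_reliability(1, 1, 0.90)
array = k_of_n_reliability(4, 2, 0.90)
print(f"{single:.4f} -> {array:.4f}")  # 0.9000 -> 0.9963
```

The point of the sketch is the abstract's system-level argument: even with a noticeable per-device failure rate, modest redundancy pushes storage-system survivability well above that of a single radiation-hardened unit's cost-equivalent.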
17 pages, 2940 KB  
Article
Loss-Driven Design Methodology for MHz-Class GaN QSW Buck Converters with a PCB Air-Core Inductor in SWaP-Constrained Aerospace Applications
by Jinshu Lin, Hui Li, Shan Yin, Xi Liu, Chen Song, Honglang Zhang and Minghai Dong
Aerospace 2026, 13(1), 105; https://doi.org/10.3390/aerospace13010105 - 21 Jan 2026
Abstract
Aerospace power systems, including satellites in low earth orbit (LEO) and geostationary earth orbit (GEO), face stringent thermal constraints to minimize size, weight, and power (SWaP). Gallium nitride (GaN) devices offer superior radiation hardness—critical for the harsh space environment—and MHz-level switching capabilities. This high-frequency operation minimizes passive components, particularly magnetics, thereby reducing the overall volume. However, above 10 MHz, magnetic cores become impractical due to material limitations. To address these issues, this article proposes a design methodology for a GaN-based quasi-square-wave (QSW) buck converter integrated with a PCB air-core inductor. First, the impact of the switching frequency and dead time on the zero-voltage switching (ZVS) condition is elaborated. Then, a power loss model accounting for various loss mechanisms is presented. To overcome high GaN body diode reverse conduction loss, an auxiliary diode is employed. Based on the model, a design procedure is developed to optimize the inductor for 10 MHz operation while maximizing efficiency. To validate the design, a 28 V/12 V, 18 W prototype was built and tested. Experimental results demonstrate a peak efficiency of 86.5% at 10 MHz. The auxiliary diode improves efficiency by 4%, verifying reverse conduction loss mitigation. Thermal analysis confirms a full-load case temperature of 62.2 °C, providing a 47.8 °C safety margin compliant with aerospace derating standards. These findings validate the solution for high-frequency, space-constrained aerospace applications. Full article
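The quasi-square-wave (QSW) operating point described above requires the inductor current to swing negative each cycle so the switch node can resonate to zero before turn-on. A minimal sizing sketch, using the prototype's ratings from the abstract (28 V/12 V, 18 W, 10 MHz) but with a hypothetical ZVS current margin, not the paper's loss-driven procedure:

```python
def qsw_buck_inductance(vin, vout, pout, fsw, zvs_margin=0.2):
    """Return (L, i_valley): inductance whose ripple drives the valley
    current negative by zvs_margin * I_out, enabling ZVS (QSW mode)."""
    d = vout / vin                      # ideal buck duty cycle
    i_out = pout / vout                 # average inductor current
    i_valley = -zvs_margin * i_out      # assumed ZVS margin (hypothetical)
    delta_i = 2 * (i_out - i_valley)    # peak-to-peak ripple current
    l = vout * (1 - d) / (fsw * delta_i)
    return l, i_valley

L, i_valley = qsw_buck_inductance(28.0, 12.0, 18.0, 10e6)
print(f"L = {L*1e9:.0f} nH, valley current = {i_valley:.2f} A")
```

The resulting inductance lands in the hundreds of nanohenries, which is why a PCB air-core winding becomes feasible at 10 MHz where magnetic cores are impractical.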

24 pages, 5472 KB  
Article
GRACE-FO Real-Time Precise Orbit Determination Using Onboard GPS and Inter-Satellite Ranging Measurements with Quality Control Strategy
by Shengjian Zhong, Xiaoya Wang, Min Li, Jungang Wang, Peng Luo, Yabo Li and Houxiang Zhou
Remote Sens. 2026, 18(2), 351; https://doi.org/10.3390/rs18020351 - 20 Jan 2026
Abstract
Real-Time Precise Orbit Determination (RTPOD) of Low Earth Orbit (LEO) satellites relies primarily on onboard GNSS observations and may suffer from degraded performance when observation geometry weakens or tracking conditions deteriorate within satellite formations. To enhance the robustness and accuracy of RTPOD under such conditions, a cooperative Extended Kalman Filter (EKF) framework that fuses onboard GNSS and inter-satellite link (ISL) range measurements is established, integrated with an iterative Detection, Identification, and Adaptation (DIA) quality control algorithm. By introducing high-precision ISL range measurements, the strategy increases observation redundancy, improves the effective observation geometry, and provides strong relative position constraints among LEO satellites. This constraint strengthens solution stability and convergence, while simultaneously enhancing the sensitivity of the DIA-based quality control to observation outliers. The proposed strategy is validated in a simulated real-time environment using Centre National d’Etudes Spatiales (CNES) real-time products and onboard observations of the GRACE-FO mission. The results demonstrate comprehensive performance enhancements for both satellites over the experimental period. For the GRACE-D satellite, which suffers from about 17% data loss and a cycle slip ratio several times higher than that of GRACE-C, the mean orbit accuracy improves by 39% (from 13.1 cm to 8.0 cm), and the average convergence time is shortened by 44.3%. In comparison, the GRACE-C satellite achieves a 4.2% mean accuracy refinement and a 1.3% reduction in convergence time. These findings reveal a cooperative stabilization mechanism, where the high-precision spatiotemporal reference is transferred from the robust node to the degraded node via inter-satellite range measurements. 
This study demonstrates the effectiveness of the proposed method in enhancing the robustness and stability of formation orbit determination and provides algorithmic validation for future RTPOD of LEO satellite formations or large-scale constellations. Full article
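The core fusion step described above can be sketched as an EKF measurement update with one inter-satellite range observation. This is a minimal illustration, not the authors' implementation: the state carries only the two satellites' positions, and the GNSS update, dynamics, and DIA quality control are assumed to happen elsewhere.

```python
import numpy as np

def isl_range_update(x, P, rho_meas, sigma_rho):
    """EKF update with one ISL range measurement rho = ||r_C - r_D||.

    x: stacked positions [r_C (3), r_D (3)]; P: 6x6 covariance.
    """
    d = x[:3] - x[3:]
    rho_pred = np.linalg.norm(d)
    u = d / rho_pred                       # line-of-sight unit vector
    H = np.concatenate([u, -u])[None, :]   # 1x6 measurement Jacobian
    S = H @ P @ H.T + sigma_rho**2         # innovation variance
    K = P @ H.T / S                        # Kalman gain, 6x1
    x_new = x + (K * (rho_meas - rho_pred)).ravel()
    P_new = (np.eye(6) - K @ H) @ P
    return x_new, P_new

# Toy numbers: satellites ~220 km apart, 1 m position uncertainty, 1 cm ranging.
x = np.array([7000e3, 0.0, 0.0, 7000e3, 220e3, 0.0])
P = np.eye(6)
x_new, P_new = isl_range_update(x, P, rho_meas=220_000.5, sigma_rho=0.01)
print(np.trace(P_new) < np.trace(P))  # True: the range constrains the relative state
```

The update only constrains the relative geometry along the line of sight, which matches the abstract's "cooperative stabilization" reading: accuracy transfers from the robust node to the degraded one rather than appearing from nowhere.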

40 pages, 3201 KB  
Article
Scalable Satellite-Assisted Adaptive Federated Learning for Robust Precision Farming
by Sai Puppala and Koushik Sinha
Agronomy 2026, 16(2), 229; https://doi.org/10.3390/agronomy16020229 - 18 Jan 2026
Abstract
Dynamic network conditions in precision agriculture motivate a scalable, privacy-preserving federated learning architecture that tightly integrates ground-based edge intelligence with a space-assisted hierarchical aggregation layer. In Phase 1, heterogeneous tractors act as intelligent farm nodes that train local models, form capability- and task-aware clusters, and employ Network Quality Index (NQI)-driven scheduling, similarity-based checkpointing, and compressed transmissions to cope with highly variable 3G/4G/5G connectivity. In Phase 2, cluster drivers synchronize with Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO) satellites that perform regional and global aggregation using staleness- and fairness-aware weighting, while end-to-end Salsa20 + MAC encryption preserves the confidentiality and integrity of all model updates. Across two representative tasks—nutrient prediction and crop health assessment—our full hierarchical system matches or exceeds centralized performance (e.g., AUC 0.92 vs. 0.91 for crop health) while reducing uplink traffic by ∼90% relative to vanilla FedAvg and cutting the communication energy proxy by more than 4×. The proposed fairness-aware GEO aggregation substantially narrows regional performance gaps (standard deviation of AUC across regions reduced from 0.058 to 0.017) and delivers the largest gains in low-connectivity areas (AUC 0.74 → 0.88). These results demonstrate that coupling on-farm intelligence with multi-orbit federated aggregation enables near-centralized model quality, strong privacy guarantees, and communication efficiency suitable for large-scale, connectivity-challenged agricultural deployments. Full article
(This article belongs to the Collection AI, Sensors and Robotics for Smart Agriculture)
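The staleness-aware weighting mentioned in the abstract can be sketched as a discounted FedAvg aggregation. The exponential staleness decay and the sample-size weighting below are illustrative assumptions; the paper's exact weighting scheme is not reproduced here.

```python
import numpy as np

def aggregate(updates, staleness, n_samples, decay=0.5):
    """Weighted average of model updates from cluster drivers.

    staleness: rounds elapsed since each update was computed (older = discounted)
    n_samples: local dataset sizes, FedAvg-style
    """
    w = np.array(n_samples, float) * np.exp(-decay * np.array(staleness, float))
    w /= w.sum()  # normalize to a convex combination
    return sum(wi * ui for wi, ui in zip(w, updates))

fresh = np.array([1.0, 1.0])
stale = np.array([3.0, 3.0])  # an update that is 4 rounds old
g = aggregate([fresh, stale], staleness=[0, 4], n_samples=[100, 100])
print(g)  # pulled toward the fresh update, unlike plain averaging (2.0)
```

Discounting stale updates is what lets the GEO layer aggregate over regions with very different connectivity without letting long-delayed gradients drag the global model backwards.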

16 pages, 1725 KB  
Article
A Reinforcement Learning-Based Link State Optimization for Handover and Link Duration Performance Enhancement in Low Earth Orbit Satellite Networks
by Sihwa Jin, Doyeon Park, Sieun Kim, Jinho Lee and Inwhee Joe
Electronics 2026, 15(2), 398; https://doi.org/10.3390/electronics15020398 - 16 Jan 2026
Abstract
This study proposes a reinforcement learning-based link selection method for Low Earth Orbit satellite networks, aiming to reduce handover frequency while extending link duration under highly dynamic orbital environments. The proposed approach relies solely on basic satellite positional information, namely latitude, longitude, and altitude, to construct compact state representations without requiring complex sensing or prediction mechanisms. Using relative satellite and terminal geometry, each state is represented as a vector consisting of azimuth, elevation, range, and direction difference. To validate the feasibility of policy learning under realistic conditions, a total of 871,105 orbit based data samples were generated through simulations of 300 LEO satellite orbits. The reinforcement learning environment was implemented using the OpenAI Gym framework, in which an agent selects an optimal communication target from a prefiltered set of candidate satellites at each time step. Three reinforcement learning algorithms, namely SARSA, Q-Learning, and Deep Q-Network, were evaluated under identical experimental conditions. Performance was assessed in terms of smoothed total reward per episode, average handover count, and average link duration. The results show that the Deep Q-Network-based approach achieves approximately 77.4% fewer handovers than SARSA and 49.9% fewer than Q-Learning, while providing the longest average link duration. These findings demonstrate that effective handover control can be achieved using lightweight state information and indicate the potential of deep reinforcement learning for future LEO satellite communication systems. Full article
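The state vector described above (azimuth, elevation, range from relative satellite/terminal geometry) can be computed from positional information alone. A minimal sketch under a spherical-Earth assumption (the paper's exact coordinate handling and the direction-difference term are not reproduced):

```python
import math

R_E = 6371e3  # mean Earth radius; spherical model is a simplifying assumption

def ecef(lat, lon, alt):
    """Latitude/longitude/altitude to Earth-centered Cartesian coordinates."""
    clat, slat = math.cos(math.radians(lat)), math.sin(math.radians(lat))
    clon, slon = math.cos(math.radians(lon)), math.sin(math.radians(lon))
    r = R_E + alt
    return (r * clat * clon, r * clat * slon, r * slat)

def az_el_range(term_lat, term_lon, sat_lat, sat_lon, sat_alt):
    """Azimuth [deg], elevation [deg], and range [m] from terminal to satellite."""
    tx, ty, tz = ecef(term_lat, term_lon, 0.0)
    sx, sy, sz = ecef(sat_lat, sat_lon, sat_alt)
    dx, dy, dz = sx - tx, sy - ty, sz - tz
    # Rotate the difference vector into the terminal's East-North-Up frame.
    lat, lon = math.radians(term_lat), math.radians(term_lon)
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy + math.sin(lat) * dz)
    rng = math.sqrt(dx * dx + dy * dy + dz * dz)
    az = math.degrees(math.atan2(east, north)) % 360.0
    el = math.degrees(math.asin(up / rng))
    return az, el, rng

# Satellite directly overhead at 550 km: elevation ~90 deg, range ~550 km.
az, el, rng = az_el_range(0.0, 0.0, 0.0, 0.0, 550e3)
print(round(el), round(rng / 1e3))
```

Feeding such compact geometric states to the agent is what makes the approach lightweight: no link-quality sensing or trajectory prediction is required at decision time.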

36 pages, 2298 KB  
Review
Onboard Deployment of Remote Sensing Foundation Models: A Comprehensive Review of Architecture, Optimization, and Hardware
by Hanbo Sang, Limeng Zhang, Tianrui Chen, Weiwei Guo and Zenghui Zhang
Remote Sens. 2026, 18(2), 298; https://doi.org/10.3390/rs18020298 - 16 Jan 2026
Abstract
With the rapid growth of multimodal remote sensing (RS) data, there is an increasing demand for intelligent onboard computing to alleviate the transmission and latency bottlenecks of traditional orbit-to-ground downlinking workflows. While many lightweight AI algorithms have been widely developed and deployed for onboard inference, their limited generalization capability restricts performance under the diverse and dynamic conditions of advanced Earth observation. Recent advances in remote sensing foundation models (RSFMs) offer a promising solution by providing pretrained representations with strong adaptability across diverse tasks and modalities. However, the deployment of RSFMs onboard resource-constrained devices such as nano satellites remains a significant challenge due to strict limitations in memory, energy, computation, and radiation tolerance. To this end, this review proposes the first comprehensive survey of onboard RSFMs deployment, where a unified deployment pipeline including RSFMs development, model compression techniques, and hardware optimization is introduced and surveyed in detail. Available hardware platforms are also discussed and compared, based on which some typical case studies for low Earth orbit (LEO) CubeSats are presented to analyze the feasibility of onboard RSFMs’ deployment. To conclude, this review aims to serve as a practical roadmap for future research on the deployment of RSFMs on edge devices, bridging the gap between the large-scale RSFMs and the resource constraints of spaceborne platforms for onboard computing. Full article

13 pages, 3377 KB  
Article
Clock Synchronization with Kuramoto Oscillators for Space Systems
by Nathaniel Ristoff, Hunter Kettering and James Camparo
Time Space 2026, 2(1), 1; https://doi.org/10.3390/timespace2010001 - 15 Jan 2026
Abstract
As space systems evolve towards cis-lunar missions and beyond, the demand for precise yet low-size, -weight, and -power (SWaP) clocks and synchronization methods becomes increasingly critical. We introduce a novel clock synchronization approach based on the Kuramoto oscillator model that facilitates the creation of an ensemble timescale for satellite constellations. Unlike traditional ensembling algorithms, the proposed Kuramoto method leverages nearest-neighbor interactions to achieve collective synchronization. This method simplifies the communication architecture and data-sharing requirements, making it well suited for dynamically connected networks such as proliferated low Earth orbit (pLEO) and lunar or Martian constellations, where intersatellite links may frequently change. Through simulations incorporating realistic noise models for small-scale atomic clocks, we demonstrate that the Kuramoto ensemble can yield an improvement in stability on the order of 1/√N, while mitigating the impact of constellation fragmentation and defragmentation. The results indicate that the Kuramoto oscillator-based algorithm can potentially deliver performance comparable to established techniques like Equal Weights Frequency Averaging (EWFA), yet with enhanced scalability and resource efficiency critical for future spaceborne PNT and communication systems. Full article
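The nearest-neighbor synchronization idea above can be sketched with the classic Kuramoto phase model on a ring. The coupling strength, step size, and frequency spread below are illustrative choices, not the paper's clock-noise models.

```python
import math

def kuramoto_step(phases, freqs, K, dt):
    """One Euler step of Kuramoto dynamics with ring (nearest-neighbor) coupling."""
    n = len(phases)
    new = []
    for i in range(n):
        coupling = sum(math.sin(phases[j] - phases[i])
                       for j in ((i - 1) % n, (i + 1) % n))
        new.append(phases[i] + dt * (freqs[i] + K * coupling))
    return new

n = 8
phases = [0.1 * i for i in range(n)]                   # initial phase offsets
freqs = [1.0 + 0.001 * (i - n / 2) for i in range(n)]  # slight frequency spread
for _ in range(5000):
    phases = kuramoto_step(phases, freqs, K=0.5, dt=0.01)

# Common drift is irrelevant for an ensemble timescale; the relative
# phase spread is what the synchronization collapses.
spread = max(phases) - min(phases)
print(spread < 0.1 * (n - 1))
```

Because each node only exchanges phase information with its current neighbors, the scheme tolerates the link churn of a pLEO constellation: fragmentation just changes who the neighbors are.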

10 pages, 233 KB  
Proceeding Paper
Artificial Intelligence in Satellite Network Defense: Architectures, Threats, and Security Protocols
by Rumen Doynov, Maksim Sharabov, Georgi Tsochev and Samiha Ayed
Eng. Proc. 2026, 121(1), 7; https://doi.org/10.3390/engproc2025121007 - 13 Jan 2026
Abstract
This paper examines the application of Artificial Intelligence (AI) to protect satellite communication networks, focusing on the identification and prevention of cyber threats. With the rapid development of the commercial space sector, the importance of effective cyber defense has grown due to the increasing dependence of global infrastructure on satellite technologies. The study applies a structured comparative analysis of AI methods across three main satellite architectures: geostationary (GEO), low Earth orbit (LEO), and hybrid systems. The methodology is based on a guiding research question and evaluates representative AI algorithms in the context of specific threat scenarios, including jamming, spoofing, DDoS attacks, and signal interception. Real-world cases such as the KA-SAT AcidRain attack and reported Starlink jamming in Ukraine, as well as experimental demonstrations of RL-based anti-jamming and GNN/DQN routing, are used to provide evidence of practical applicability. The results highlight both the potential and limitations of AI solutions, showing measurable improvements in detection accuracy, throughput, latency reduction, and resilience under interference. Architectural approaches for integrating AI into satellite security are presented, and their effectiveness, trade-offs, and deployment feasibility are discussed. Full article
18 pages, 964 KB  
Article
Stacked Intelligent Metasurfaces: Key Technologies, Scenario Adaptation, and Future Directions
by Jiayi Liu and Jiacheng Kong
Electronics 2026, 15(2), 274; https://doi.org/10.3390/electronics15020274 - 7 Jan 2026
Abstract
The advent of sixth-generation (6G) networks imposes stringent demands on wireless systems, while traditional 2D rigid reconfigurable intelligent surfaces (RISs) face bottlenecks in regulatory freedom and scenario adaptability. To address this, stacked intelligent metasurfaces (SIMs) have emerged. This paper presents a systematic review of SIM technology. It first elaborates on the SIM multi-layer stacked architecture and wave-domain signal-processing principles, which overcome the spatial constraints of conventional RISs. Then, it analyzes challenges, including beamforming and channel estimation for SIM, and explores its application prospects in key 6G scenarios such as integrated sensing and communication (ISAC), low Earth orbit (LEO) satellite communication, semantic communication, and UAV communication, as well as future trends like integration with machine learning and nonlinear devices. Finally, it summarizes the open challenges in low-complexity design, modeling and optimization, and performance evaluation, aiming to provide insights to promote the large-scale adoption of SIM in next-generation wireless communications. Full article

25 pages, 4082 KB  
Article
Statistical CSI-Based Downlink Precoding for Multi-Beam LEO Satellite Communications
by Feng Zhu, Yunfei Wang, Ziyu Xiang and Xiqi Gao
Aerospace 2026, 13(1), 60; https://doi.org/10.3390/aerospace13010060 - 7 Jan 2026
Abstract
With the rapid development of low-Earth-orbit (LEO) satellite communications, multi-beam precoding has emerged as a key technology for improving spectrum efficiency. However, the long propagation delay and large Doppler frequency offset pose significant challenges to existing precoding techniques. To address this issue, this paper investigates downlink precoding design for multi-beam LEO satellite communications. First, the downlink channel and signal models are established. Then, we reveal that traditional zero-forcing (ZF), regularized zero-forcing (RZF), and minimum mean square error (MMSE) precoding schemes all require the satellite transmitter to acquire the instantaneous channel state information (iCSI) of all users, which is challenging to obtain in satellite communication systems. Subsequently, we propose a downlink precoding design based on statistical channel state information (sCSI) and derive closed-form solutions for statistical-ZF, statistical-RZF, and statistical-MMSE precoding. Furthermore, we propose that sCSI can be computed using the positions of the satellite and users, which reduces the system overhead and complexity of sCSI acquisition. Monte Carlo simulations under the 3GPP non-terrestrial network (NTN) channel model are employed to verify the performance of the proposed method. The simulation results show that the proposed method achieves sum-rate performance comparable to that of iCSI-based schemes and the optimal transmission performance based on sum-rate maximization. In addition, the proposed method significantly reduces the computational complexity of the satellite payload and the system feedback overhead. Full article
(This article belongs to the Section Astronautics & Space Science)
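The zero-forcing (ZF) baseline that the abstract starts from has a well-known closed form. A minimal sketch of iCSI-based ZF (the statistical variant would replace H with position-derived average CSI; the random channel below is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 4, 8  # K users, N beams/antennas (illustrative sizes)
# Rayleigh-fading channel matrix, K x N (a stand-in for the NTN channel model).
H = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)

# ZF precoder: W = H^H (H H^H)^{-1}, then normalize each user's column.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W /= np.linalg.norm(W, axis=0, keepdims=True)

# The effective channel H @ W is diagonal: no inter-user interference.
eff = H @ W
off_diag = eff - np.diag(np.diag(eff))
print(np.max(np.abs(off_diag)) < 1e-10)  # True
```

The catch, as the abstract notes, is that H here is instantaneous CSI, which a LEO transmitter cannot realistically track under long delays and large Doppler; substituting statistical CSI keeps the same algebraic structure while removing the feedback burden.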

21 pages, 11735 KB  
Article
Low-Thrust Transfer Method for Full Orbital Element Convergence Using J2 Precession
by Zhengqing Fang, Roberto Armellin and Yingkai Cai
Astronautics 2026, 1(1), 4; https://doi.org/10.3390/astronautics1010004 - 5 Jan 2026
Abstract
Low-thrust propulsion systems have become mainstream for Low Earth Orbit (LEO) satellites due to their superior propellant efficiency, yet conventional low-thrust transfer strategies suffer from high computational costs and failure to achieve full orbital element convergence. To address these drawbacks, this paper proposes a novel semi-analytical three-phase low-thrust transfer strategy that leverages J2 gravitational precession to realize convergence of all orbital elements for circular orbits. The core of the method lies in the design of two symmetric thrust arcs and an intermediate coasting period that utilizes J2 precession. By solving the resulting polynomial equation, the strategy achieves simultaneous controlled convergence of the Right Ascension of the Ascending Node (RAAN) and the argument of latitude (AOL). Simulation results demonstrate that the proposed method achieves significant fuel savings compared to direct transfer strategies, while simultaneously achieving superior computational speed. Extensive validation via 100,000 Monte Carlo simulations confirms the method’s scope of applicability, and the sufficient conditions for the existence of a solution are provided. It is further found that the proposed method is particularly well-suited for missions involving medium-to-high inclination orbits and large RAAN gaps, such as constellation deployment. In conclusion, this strategy provides a fuel-efficient and computationally fast solution for low-thrust transfer, establishing the basis for the operational management of future large-scale space systems equipped with low-thrust propulsion. Full article
(This article belongs to the Special Issue Feature Papers on Spacecraft Dynamics and Control)
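The coasting phase above exploits the J2 secular drift of the ascending node. A hedged sketch of the textbook nodal precession rate (the standard first-order formula, not the paper's full semi-analytical model):

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
R_E = 6378137.0       # Earth's equatorial radius [m]
J2 = 1.08262668e-3    # Earth's second zonal harmonic

def raan_rate_deg_per_day(a, e, inc_deg):
    """Secular RAAN drift due to J2: OmegaDot = -1.5 J2 n (Re/p)^2 cos(i)."""
    n = math.sqrt(MU / a**3)      # mean motion [rad/s]
    p = a * (1 - e**2)            # semi-latus rectum [m]
    omega_dot = -1.5 * J2 * n * (R_E / p)**2 * math.cos(math.radians(inc_deg))
    return math.degrees(omega_dot) * 86400

# A ~700 km, sun-synchronous-like orbit precesses close to +1 deg/day.
rate = raan_rate_deg_per_day(a=R_E + 700e3, e=0.0, inc_deg=98.19)
print(f"{rate:+.2f} deg/day")
```

Because the drift rate depends on semi-major axis and inclination, a temporary altitude offset during the coasting arc accumulates the RAAN difference "for free", which is the fuel-saving lever the paper's three-phase strategy pulls.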

19 pages, 2392 KB  
Review
Low Internet Penetration in Sub-Saharan Africa and the Role of LEO Satellites in Addressing the Issue
by Olabisi Falowo and Samuel Falowo
Telecom 2026, 7(1), 7; https://doi.org/10.3390/telecom7010007 - 5 Jan 2026
Abstract
Sub-Saharan Africa (SSA), with an estimated population of 1.243 billion people as of December 2024, had the lowest mobile Internet penetration in the world at 29%, significantly below the global average of 58%. Moreover, SSA also had the lowest mobile data traffic per active smartphone, averaging 5 GB per month—about a quarter of the global average of 19 GB per month in 2024. This paper analyses the factors responsible for the low Internet penetration in SSA, which include limited Internet service availability, Internet device and service affordability, digital ability, government regulation and policy, and deficit of network-supporting infrastructure. The paper then discusses the popular Internet access networks in SSA and their limitations. It presents low Earth orbit (LEO) satellites as a possible access network for enhancing Internet penetration in SSA, giving examples of LEO network service deployment in some SSA countries. The paper discusses the feasible business models for LEO satellite Internet services in SSA, the challenges to LEO satellite service penetration, and possible solutions. Full article

18 pages, 3518 KB  
Article
A Scalable Solution for Node Mobility Problems in NDN-Based Massive LEO Constellations
by Miguel Rodríguez Pérez, Sergio Herrería Alonso, José Carlos López Ardao and Andrés Suárez González
Sensors 2026, 26(1), 309; https://doi.org/10.3390/s26010309 - 3 Jan 2026
Abstract
In recent years, there has been increasing investment in the deployment of massive commercial Low Earth Orbit (LEO) constellations to provide global Internet connectivity. These constellations, now equipped with inter-satellite links, can serve as low-latency Internet backbones, requiring LEO satellites to act not only as access nodes for ground stations, but also as in-orbit core routers. Due to their high velocity and the resulting frequent handovers of ground gateways, LEO networks highly stress mobility procedures at both the sender and receiver endpoints. On the other hand, a growing trend in networking is the use of technologies based on the Information Centric Networking (ICN) paradigm for servicing IoT networks and sensor networks in general, as its addressing, storage, and security mechanisms are usually a good match for IoT needs. Furthermore, ICN networks possess additional characteristics that are beneficial for the massive LEO scenario. For instance, the mobility of the receiver is helped by the inherent data-forwarding procedures in their architectures. However, the mobility of the senders remains an open problem. This paper proposes a comprehensive solution to the mobility problem for massive LEO constellations using the Named-Data Networking (NDN) architecture, as it is probably the most mature ICN proposal. Our solution includes a scalable method to relate content to ground gateways and a way to address traffic to the gateway that does not require cooperation from the network routing algorithm. Moreover, our solution works without requiring modifications to the actual NDN protocol itself, so it is easy to test and deploy. Our results indicate that, for long enough handover lengths, traffic losses are negligible even for ground stations with just one satellite in sight. Full article
(This article belongs to the Special Issue Future Wireless Communication Networks: 3rd Edition)

27 pages, 3061 KB  
Article
LEO Satellite and UAV-Assisted Maritime Internet of Things: Modeling and Performance Analysis for Data Acquisition
by Xu Hu, Bin Lin, Ping Wang and Xiao Lu
Future Internet 2026, 18(1), 24; https://doi.org/10.3390/fi18010024 - 1 Jan 2026
Abstract
The integration of low Earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs) into the maritime Internet of Things (MIoT) offers an effective solution to overcoming the limitations of connectivity and transmission reliability in conventional MIoT, thereby supporting marine data acquisition. However, the highly dynamic ocean environment necessitates a theoretical framework for system-level performance evaluation before practical deployment. In this article, we consider a LEO satellite and UAV-assisted MIoT (LSU-MIoT) network and develop an analytical framework to evaluate its transmission performance. Specifically, marine devices and relaying buoys are modeled as a Matérn cluster process on the sea surface, UAVs as a homogeneous Poisson point process, and LEO satellites as a spherical Poisson point process. Signal transmissions over marine, aerial, and space links are characterized by Nakagami-m, Rician, and shadowed Rician fading, respectively, with the two-ray path loss model applied to sea and air links for accurately capturing propagation characteristics. By leveraging stochastic geometry, we derive analytical expressions for transmission success probability and end-to-end delay of regular and emergency data under the time division multiple access and non-orthogonal multiple access schemes. Simulation results validate the accuracy of derived expressions and reveal the impact of key parameters on the performance of LSU-MIoT networks. Full article
(This article belongs to the Special Issue Wireless Sensor Networks and Internet of Things)
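The Matérn cluster process used above to model marine devices around relaying buoys can be sampled in a few lines. This sketch works on a flat patch of sea rather than the paper's sphere, and the intensities and cluster radius are illustrative assumptions.

```python
import math
import random

def poisson(mean, rng):
    """Poisson sample via Knuth's method (fine for modest means)."""
    l, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def matern_cluster(lam_parent, mean_daughters, radius, side, rng):
    """Parents ~ PPP(lam_parent) on [0, side]^2 (buoys); daughters (devices)
    uniform in a disk of given radius around each parent."""
    points = []
    for _ in range(poisson(lam_parent * side * side, rng)):
        px, py = rng.uniform(0, side), rng.uniform(0, side)
        for _ in range(poisson(mean_daughters, rng)):
            r = radius * math.sqrt(rng.random())  # uniform over the disk area
            th = rng.uniform(0, 2 * math.pi)
            points.append((px + r * math.cos(th), py + r * math.sin(th)))
    return points

rng = random.Random(42)
pts = matern_cluster(lam_parent=5e-7, mean_daughters=10,
                     radius=500.0, side=10_000.0, rng=rng)
print(len(pts))  # roughly lam_parent * side^2 * mean_daughters = 500 on average
```

Monte Carlo runs over such point patterns are exactly how stochastic-geometry expressions like the paper's transmission success probability are validated against simulation.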

21 pages, 1330 KB  
Article
A Clustering and Reinforcement Learning-Based Handover Strategy for LEO Satellite Networks in Power IoT Scenarios
by Jin Shao, Weidong Gao, Kuixing Liu, Rantong Qiao, Haizhi Yu, Kaisa Zhang, Xu Zhao and Junbao Duan
Electronics 2026, 15(1), 174; https://doi.org/10.3390/electronics15010174 - 30 Dec 2025
Abstract
Communication infrastructure in remote areas struggles to deliver stable, high-quality services for power systems. Low Earth Orbit (LEO) satellite networks offer an effective solution through their low latency and extensive coverage. Nevertheless, the high orbital velocity of LEO satellites combined with massive user access frequently leads to signaling congestion and degradation of service quality. To address these challenges, this paper proposes a LEO satellite handover strategy based on Quality of Service (QoS)-constrained K-Means clustering and Deep Q-Network (DQN) learning. The proposed framework first partitions users into groups via the K-Means algorithm and then imposes an intra-group QoS fairness constraint to refine clustering and designate a cluster head for each group. These cluster heads act as proxies that execute unified DQN-driven handover decisions on behalf of all group members, thereby enabling coordinated multi-user handover. Simulation results demonstrate that, compared with conventional handover schemes, the proposed strategy achieves an optimal balance between performance and signaling overhead, significantly enhances system scalability while ensuring long-term QoS gains, and provides an efficient solution for mobility management in future large-scale LEO satellite networks. Full article
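The clustering stage above can be sketched with plain K-Means over user positions, taking the member nearest each centroid as the cluster head. This is an illustrative reduction: the paper's intra-group QoS fairness refinement and the DQN handover policy the heads execute are omitted.

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mean(g):
    return (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))

def kmeans_with_heads(points, k, iters, rng):
    """K-Means clustering; returns the groups and one head per non-empty group."""
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: dist(p, centroids[j]))].append(p)
        centroids = [mean(g) if g else centroids[j] for j, g in enumerate(groups)]
    # Cluster head = member closest to its centroid (acts as handover proxy).
    heads = [min(g, key=lambda p: dist(p, c))
             for g, c in zip(groups, centroids) if g]
    return groups, heads

rng = random.Random(1)
users = ([(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(30)]
         + [(rng.gauss(10, 1), rng.gauss(10, 1)) for _ in range(30)])
groups, heads = kmeans_with_heads(users, k=2, iters=10, rng=rng)
print(len(heads))  # 2 heads make handover decisions for all group members
```

Delegating the handover decision to one head per cluster is what collapses the per-user signaling load that the abstract identifies as the bottleneck under massive access.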