Search Results (60)

Search Parameters:
Keywords = LEO satellite internet

21 pages, 2309 KiB  
Review
A Comprehensive Review of Satellite Orbital Placement and Coverage Optimization for Low Earth Orbit Satellite Networks: Challenges and Solutions
by Adel A. Ahmed
Network 2025, 5(3), 32; https://doi.org/10.3390/network5030032 - 20 Aug 2025
Abstract
Nowadays, internet connectivity suffers from instability and slowness due to attacks on the optical fiber cables crossing the seas and oceans. The optimal solution to this problem is the Low Earth Orbit (LEO) satellite network, which can resolve the problems of internet connectivity and reachability and can bring real-time, reliable, low-latency, high-bandwidth, cost-effective internet access to urban and rural areas in any region of the Earth. However, satellite orbital placement (SOP) and navigation must be carefully designed to reduce signal impairments. The challenges of orbital satellite placement for LEO include constellation development, satellite parameter optimization, bandwidth optimization, consideration of signal impairment, and coverage optimization. This paper presents a comprehensive review of SOP and coverage optimization, examines prevalent issues affecting LEO internet connectivity, evaluates existing solutions, and proposes novel solutions to address these challenges. Furthermore, it recommends a machine learning approach to coverage optimization and SOP that can efficiently enhance internet reliability and reachability for LEO satellite networks. This survey opens the way toward an optimal solution for global internet connectivity and reachability.

18 pages, 6788 KiB  
Review
Weather Forecasting Satellites—Past, Present, & Future
by Etai Nardi, Ohad Cohen, Yosef Pinhasi, Motti Haridim and Jacob Gavan
Information 2025, 16(8), 677; https://doi.org/10.3390/info16080677 - 8 Aug 2025
Abstract
Climate change has made weather more erratic and unpredictable. As a result, a growing need for more reliable short-term weather prediction models paved the way for a new era in satellite instrumentation technology, in which radar systems for meteorological applications became critically important. This paper presents a comprehensive review of the evolution of weather forecasting satellites. We trace the technological development from the early weather and climate monitoring systems of the 1960s, which used stabilized TV camera platforms to capture cloud-cover data and store it on magnetic tape for later readout and transmission to ground stations. In the following decades, satellite sensor technologies took great strides, incorporating advances in image and signal processing into satellite imagery methodologies. As innovative as they were, these technologies still lacked the capabilities needed for practical use cases beyond scientific research. The paper further examines how the next phase of satellite platforms addresses this technological gap by leveraging low Earth orbit (LEO) satellite constellation deployments for near-real-time tracking of atmospheric hydrometeors and precipitation profiles through innovative methods. These methods combine the collected data into big-data lakes on internet cloud platforms and construct AI-based multi-layered weather prediction models specifically tailored to remote sensing. Finally, we discuss how these recent advancements form the basis for new applications in aviation, severe weather readiness, energy, agriculture, and beyond.
(This article belongs to the Special Issue Sensing and Wireless Communications)

21 pages, 4738 KiB  
Article
Research on Computation Offloading and Resource Allocation Strategy Based on MADDPG for Integrated Space–Air–Marine Network
by Haixiang Gao
Entropy 2025, 27(8), 803; https://doi.org/10.3390/e27080803 - 28 Jul 2025
Abstract
This paper investigates computation offloading and resource allocation in an integrated space–air–sea network based on unmanned aerial vehicles (UAVs) and low Earth orbit (LEO) satellites supporting Maritime Internet of Things (M-IoT) devices. In the complex, dynamic environment comprising M-IoT devices, UAVs, and LEO satellites, traditional optimization methods encounter significant limitations due to non-convexity and the combinatorial explosion of possible solutions. A multi-agent deep deterministic policy gradient (MADDPG)-based optimization algorithm is proposed to address these challenges. The algorithm minimizes total system cost, balancing energy consumption and latency through partial task offloading within a cloud–edge–device collaborative mobile edge computing (MEC) system. A comprehensive system model is developed, with the problem formulated as a partially observable Markov decision process (POMDP) that integrates association control, power control, computing resource allocation, and task distribution. Each M-IoT device and UAV acts as an intelligent agent, collaboratively learning optimal offloading strategies through the centralized-training, decentralized-execution framework inherent in MADDPG. Numerical simulations validate the effectiveness of the proposed approach, which converges rapidly and significantly outperforms baseline methods, reducing total system cost by 15–60%.
(This article belongs to the Special Issue Space-Air-Ground-Sea Integrated Communication Networks)
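The weighted energy/latency objective described in this abstract can be illustrated with a cost function for a single agent's partial-offloading decision. This is a minimal sketch, not the paper's model: the cycle density, transmit power, effective-capacitance constant `kappa`, and the cost weights are all assumed values.

```python
def offload_cost(task_bits, split, f_local, f_edge, rate_bps,
                 kappa=1e-27, w_energy=0.5, w_latency=0.5):
    """Weighted energy/latency cost of partially offloading one task.

    split: fraction of the task executed locally (0..1).
    All constants below are illustrative assumptions.
    """
    cycles_per_bit = 1000          # assumed computational density
    local_bits = split * task_bits
    off_bits = (1 - split) * task_bits
    # Local computing: latency = cycles / f, energy = kappa * f^2 * cycles
    t_local = local_bits * cycles_per_bit / f_local
    e_local = kappa * f_local ** 2 * local_bits * cycles_per_bit
    # Offloaded part: transmit over the satellite link, then run at the edge
    tx_power = 0.5                 # watts, assumed
    t_tx = off_bits / rate_bps
    e_tx = tx_power * t_tx
    t_edge = off_bits * cycles_per_bit / f_edge
    # Local and offloaded branches execute in parallel
    latency = max(t_local, t_tx + t_edge)
    energy = e_local + e_tx
    return w_energy * energy + w_latency * latency
```

In a MADDPG setup each agent's reward would be the negative of such a cost, with `split`, power, and association chosen by the learned policy.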

20 pages, 5343 KiB  
Article
System-Level Assessment of Ka-Band Digital Beamforming Receivers and Transmitters Implementing Large Thinned Antenna Array for Low Earth Orbit Satellite Communications
by Giovanni Lasagni, Alessandro Calcaterra, Monica Righini, Giovanni Gasparro, Stefano Maddio, Vincenzo Pascale, Alessandro Cidronali and Stefano Selleri
Sensors 2025, 25(15), 4645; https://doi.org/10.3390/s25154645 - 26 Jul 2025
Abstract
In this paper, we present a system-level model of a digital multibeam antenna designed for Low Earth Orbit satellite communications in the Ka-band. We first develop a suitable array topology based on a thinned lattice, then adopt it as the foundation for evaluating performance within a digital beamforming architecture. This architecture is implemented in a system-level simulator to evaluate the transmitter and receiver chains. The study advances the analysis of digital antennas by incorporating the non-idealities of both the RF front-end and the digital sections into a digital-twin framework. This approach lets the designer optimize the system holistically and provides insight into how various impairments affect transmitter and receiver performance, identifying the subsystems' parameter limits. To achieve this, we analyze several subsystem parameters and impairments, assessing their effects on both the antenna radiation pattern and the quality of the transmitted and received signals in a realistic application context. The results reveal the system's sensitivity to these impairments and suggest strategies for trading them off, emphasizing the importance of selecting appropriate subsystem features to optimize overall system performance.
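Thinning removes elements from a regular lattice while preserving the aperture, and its effect on the beam can be illustrated with a one-dimensional array factor. A minimal sketch, assuming a uniform linear lattice of isotropic elements (the paper's array is a 2-D thinned lattice; this stand-in only shows the mechanics):

```python
import cmath
import math

def array_factor_db(element_on, d_lambda, theta_deg):
    """Normalized array factor (dB) of a thinned uniform linear array.

    element_on: booleans marking which lattice positions are populated
    (the thinning mask); d_lambda: element spacing in wavelengths;
    theta_deg: angle from boresight. Illustrative 1-D stand-in.
    """
    theta = math.radians(theta_deg)
    psi = 2 * math.pi * d_lambda * math.sin(theta)
    af = sum(cmath.exp(1j * n * psi) for n, on in enumerate(element_on) if on)
    n_on = sum(element_on)
    mag = abs(af) / n_on          # normalize to 0 dB at boresight
    return 20 * math.log10(max(mag, 1e-12))
```

Sweeping `theta_deg` with different masks shows how thinning trades aperture efficiency against sidelobe structure, which is what the system-level simulator evaluates end to end.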

21 pages, 6801 KiB  
Article
Performance Evaluation of a High-Gain Axisymmetric Minkowski Fractal Reflectarray for Ku-Band Satellite Internet Communication
by Prabhat Kumar Patnaik, Harish Chandra Mohanta, Dhruba Charan Panda, Ribhu Abhusan Panda, Malijeddi Murali and Heba G. Mohamed
Fractal Fract. 2025, 9(7), 421; https://doi.org/10.3390/fractalfract9070421 - 27 Jun 2025
Abstract
In this article, a high-gain axisymmetric Minkowski fractal reflectarray is designed and fabricated for Ku-band satellite internet communications. High gain is achieved by carefully optimising the number of unit cells, their shape modifier, focal length, feed position, and scan angle. The space-filling properties of Minkowski fractals help miniaturise the unit cells. The scan angle of the reflectarray is varied by adjusting the fractal scaling factor for each unit cell in the array. The reflectarray is symmetric along the X-axis in its design and configuration. Initially, a Minkowski fractal unit cell is designed at iteration 1 in the simulation software. Its design parameters are then optimised to achieve high gain, a narrow beam, and beam-scan capability. The sensitivity of each design parameter is examined individually using the array synthesis method, which establishes the practical range of the design and performance parameters. The proposed reflectarray resonates at 12 GHz, achieving a gain of over 20 dB and a beamwidth of less than 15 degrees. Finally, the designed fractal reflectarray is tested in real-time simulation environments using MATLAB R2023b, and its performance is evaluated in an interference scenario involving LEO and MEO satellites and a ground station under various time conditions. For real-world applicability, it is necessary to identify, analyse, and mitigate unwanted interference signals that degrade the desired satellite signal. With its performance characteristics and beam-scanning capability, the proposed reflectarray is an excellent candidate for Ku-band satellite internet communications.
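The space-filling property behind the miniaturisation can be quantified by how the loop perimeter grows per fractal iteration while the footprint stays fixed. A toy sketch, assuming a square loop and a first-order Minkowski generator with indentation ratio `w` (both assumptions for illustration, not the paper's exact geometry):

```python
def minkowski_perimeter(l0, iteration, w=1/3):
    """Perimeter of a square Minkowski-fractal loop after n iterations.

    Each iteration replaces every segment of length L with a notched
    path of length L * (1 + 2*w), where w = notch depth / segment
    length (the 'shape modifier'). Longer electrical length in the
    same footprint lowers the resonant frequency, i.e. miniaturises
    the unit cell. Illustrative generator geometry.
    """
    return 4 * l0 * (1 + 2 * w) ** iteration
```

Holding `l0` fixed while increasing `iteration` shows why a fractal cell resonates where a plain square loop of the same size would not.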

23 pages, 2431 KiB  
Article
SatScope: A Data-Driven Simulator for Low-Earth-Orbit Satellite Internet
by Qichen Wang, Guozheng Yang, Yongyu Liang, Chiyu Chen, Qingsong Zhao and Sugai Chen
Future Internet 2025, 17(7), 278; https://doi.org/10.3390/fi17070278 - 24 Jun 2025
Abstract
The rapid development of low-Earth-orbit (LEO) satellite constellations has not only provided global users with low-latency, unrestricted high-speed data services but has also presented researchers with the challenge of understanding dynamic changes in global network behavior. Unlike geostationary satellites and terrestrial internet infrastructure, LEO satellites move at a relative velocity of 7.6 km/s, leading to frequent changes in their connectivity with ground stations. Given the complexity of the space environment, current research on LEO satellite internet relies primarily on modeling and simulation; however, existing LEO satellite network simulators often overlook the global network characteristics of these systems. We present SatScope, a data-driven simulator for LEO satellite internet. SatScope consists of three main components: space segment modeling, ground segment modeling, and network simulation configuration, which together provide researchers with an interface to these models. Using both the space and ground segment models, SatScope can configure various network topology models, routing algorithms, and load-balancing schemes, enabling the evaluation of optimization algorithms for LEO satellite communication systems. We also compare SatScope's fidelity, lightweight design, scalability, and openness against other simulators. Based on simulation results obtained with SatScope, we propose two metrics, ground-node IP coverage rate and the number of satellite service IPs, to assess the service performance of single-layer satellite networks. Our findings reveal that during each network handover, on average, 38.94% of nodes and 83.66% of links change.
(This article belongs to the Section Internet of Things)
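Per-handover churn figures like the ones reported here can be computed from two topology snapshots with a simple symmetric-difference metric. A sketch under the assumption that a snapshot is just a set of undirected links (node churn works the same way over node sets):

```python
def topology_churn(before, after):
    """Fraction of links that change across a handover.

    before/after: sets of undirected links, each link a frozenset of
    two node ids. A link 'changes' if it appears in exactly one
    snapshot. Illustrative metric in the spirit of the paper's
    link-churn figure, not its exact definition.
    """
    changed = before ^ after       # symmetric difference
    union = before | after
    return len(changed) / len(union) if union else 0.0
```

Averaging this over every consecutive snapshot pair in a simulated constellation pass yields a churn statistic comparable to the paper's 83.66% link figure.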

25 pages, 4360 KiB  
Article
Positioning-Based Uplink Synchronization Method for NB-IoT in LEO Satellite Networks
by Qiang Qi, Tao Hong and Gengxin Zhang
Symmetry 2025, 17(7), 984; https://doi.org/10.3390/sym17070984 - 21 Jun 2025
Abstract
With the growth of Internet of Things (IoT) business demands, NB-IoT integrated with low Earth orbit (LEO) satellite communication systems is considered a crucial component for achieving global IoT coverage in the future. However, the long propagation delay and large Doppler frequency shift of the satellite-to-ground link pose substantial challenges to uplink and downlink synchronization in LEO satellite-based NB-IoT networks. To address this challenge, we first propose a Multiple Segment Auto-correlation (MSA) algorithm to detect the downlink Narrowband Primary Synchronization Signal (NPSS), tailored specifically to the large Doppler shifts of LEO satellites. After detection, downlink synchronization is achieved by determining the arrival time and frequency of the NPSS. Then, to complete uplink synchronization, we propose a position-based scheme to obtain the Timing Advance (TA) values and the pre-compensated Doppler shift. In this scheme, we formulate a time difference of arrival (TDOA) equation using the arrival times of NPSSs from different satellites, or from the same satellite at different times, as observations. After solving the TDOA equation with the Chan method, uplink synchronization is completed by deriving the TA values and pre-compensated Doppler shift from the terminal position combined with satellite ephemeris. Finally, the feasibility of the proposed scheme is verified in an Iridium satellite constellation. Compared to conventional GNSS-assisted methods, the proposed approach reduces terminal power consumption by 15–40% and achieves an uplink synchronization success rate of over 98% under negative-SNR conditions.
(This article belongs to the Special Issue Symmetry/Asymmetry in Future Wireless Networks)
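The TDOA formulation pairs each satellite's NPSS arrival time against a reference satellite. The paper solves the resulting equations in closed form with the Chan method; as a simpler stand-in, the sketch below only evaluates the squared TDOA residual at a candidate terminal position, which a grid or least-squares search could minimize. The geometry is illustrative.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tdoa_residual(ground_pos, sat_positions, tdoa_obs):
    """Sum of squared TDOA residuals for a candidate terminal position.

    Observation i pairs satellite i with reference satellite 0:
        tdoa_obs[i-1] ~ (|p - s_i| - |p - s_0|) / C
    ground_pos and sat_positions are (x, y, z) tuples in meters.
    """
    d0 = math.dist(ground_pos, sat_positions[0])
    res = 0.0
    for i, obs in enumerate(tdoa_obs, start=1):
        pred = (math.dist(ground_pos, sat_positions[i]) - d0) / C
        res += (pred - obs) ** 2
    return res
```

Once the minimizing position is found, TA and Doppler pre-compensation follow from the position plus satellite ephemeris, as the abstract describes.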

24 pages, 3748 KiB  
Article
Adaptive Resource Optimization for LoRa-Enabled LEO Satellite IoT System in High-Dynamic Environments
by Chen Zhang, Haoyou Peng, Yonghua Ji, Tao Hong and Gengxin Zhang
Sensors 2025, 25(11), 3318; https://doi.org/10.3390/s25113318 - 25 May 2025
Abstract
The integration of Low Earth Orbit (LEO) satellites with Long Range Radio (LoRa)-based Internet of Things (IoT) systems for wide-area coverage has gained traction in academia and industry, challenging traditional terrestrial resource optimization designed for semi-static, single-base-station environments. This paper addresses LEO's high dynamics and satellite–ground channel variability by introducing a beacon-triggered framework for LoRa-LEO IoT systems as a foundation for resource optimization. Then, to decouple the intertwined objectives of optimizing energy efficiency and maximizing the data extraction rate, an adaptive spreading factor (SF) allocation algorithm is proposed to mitigate collisions and resource waste, followed by a practical dynamic power control mechanism that optimizes LoRa device power usage. Simulations validate that the proposed adaptive resource optimization outperforms conventional methods in dynamic, resource-constrained LEO environments, offering a robust solution for satellite IoT applications. In terms of energy efficiency and data extraction rate, the proposed algorithm outperforms the comparative algorithms: when the number of users reaches 3000, energy efficiency improves by at least 119% and the data extraction rate by at least 48%.
(This article belongs to the Section Internet of Things)
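Adaptive SF allocation trades airtime (and hence energy and collision probability) against link robustness: each step up in SF doubles the symbol time but tolerates roughly 2.5 dB lower SNR. A minimal threshold-based sketch; the SNR thresholds are typical LoRa datasheet values and the margin is an assumed parameter, not the paper's algorithm:

```python
# Typical LoRa demodulation SNR thresholds (dB) per spreading factor
SNR_THRESH = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def pick_sf(link_snr_db, margin_db=1.0):
    """Smallest (fastest, most energy-efficient) SF whose demodulation
    threshold the link SNR clears with a margin; fall back to SF12."""
    for sf in sorted(SNR_THRESH):
        if link_snr_db >= SNR_THRESH[sf] + margin_db:
            return sf
    return 12

def symbol_time(sf, bw_hz=125_000):
    """LoRa symbol duration in seconds: 2**SF / BW."""
    return (2 ** sf) / bw_hz
```

In a LEO pass the link SNR changes continuously with elevation, so re-running `pick_sf` per beacon interval is one simple way to adapt SF as the satellite moves.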

29 pages, 4136 KiB  
Article
IoT-NTN with VLEO and LEO Satellite Constellations and LPWAN: A Comparative Study of LoRa, NB-IoT, and Mioty
by Changmin Lee, Taekhyun Kim, Chanhee Jung and Zizung Yoon
Electronics 2025, 14(9), 1798; https://doi.org/10.3390/electronics14091798 - 28 Apr 2025
Abstract
This study investigates the optimization of satellite constellations for Low-Power Wide-Area Network (LPWAN)-based Internet of Things (IoT) communications in Very Low Earth Orbit (VLEO) at 200 km and 300 km altitudes and in Low Earth Orbit (LEO) at 600 km, using a Genetic Algorithm (GA). Focusing on three LPWAN technologies, LoRa, Narrowband IoT (NB-IoT), and Mioty, we evaluate their performance in terms of revisit time, data transmission volume, and economic efficiency. Results indicate that a 300 km VLEO constellation with LoRa achieves the shortest average revisit time and requires the fewest satellites, offering notable cost benefits. NB-IoT provides the highest data transmission volume, while Mioty demonstrates strong scalability but requires a larger satellite count. These findings highlight the potential of VLEO satellites, particularly at 300 km, combined with LPWAN solutions for efficient and scalable IoT Non-Terrestrial Network (IoT-NTN) applications. Future work will explore multi-altitude simulations and hybrid LPWAN integration for further optimization.
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
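A GA over constellation parameters follows the usual select-crossover-mutate loop. The sketch below is a deliberately toy version: the objective mixing a revisit-time proxy with a satellite-count cost, the drag penalty below 500 km, and all GA hyperparameters are invented for illustration and bear no relation to the paper's fitness function.

```python
import random

def revisit_proxy(n_sats, altitude_km):
    """Toy objective: revisit time shrinks with more satellites and
    lower altitude; cost grows with satellite count and with a drag
    penalty at very low altitudes. Entirely illustrative."""
    revisit_s = 86_400 / (n_sats * (600 / altitude_km))
    cost = n_sats * (1.3 if altitude_km < 500 else 1.0)
    return revisit_s + 50 * cost

def ga_optimize(generations=60, pop_size=20, seed=1):
    """Elitist GA over (satellite count, altitude) pairs."""
    rng = random.Random(seed)
    pop = [(rng.randint(10, 200), rng.choice([200, 300, 600]))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: revisit_proxy(*ind))
        survivors = pop[: pop_size // 2]          # elitism
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)       # uniform crossover
            child = (a[0] if rng.random() < 0.5 else b[0],
                     a[1] if rng.random() < 0.5 else b[1])
            if rng.random() < 0.2:                # mutate satellite count
                child = (max(10, child[0] + rng.randint(-10, 10)), child[1])
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda ind: revisit_proxy(*ind))
```

The paper's real objective would swap `revisit_proxy` for orbit-propagated revisit times and per-technology link budgets.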

17 pages, 810 KiB  
Article
Fast Reroute Mechanism for Satellite Networks Based on Segment Routing and Dual Timers Switching
by Jinyan Du, Ran Zhang, Jiangbo Hu, Tian Xia and Jiang Liu
Aerospace 2025, 12(3), 233; https://doi.org/10.3390/aerospace12030233 - 13 Mar 2025
Abstract
Low-Earth-Orbit (LEO) satellite networks offer global internet coverage and low latency, and they have enjoyed great success in the past few years. In LEO satellite networks, laser-based inter-satellite links (ISLs) are widely employed for on-board data relay and for providing high-capacity backhaul worldwide. However, ISLs are prone to breaking due to outages of the ISL acquisition, tracking, and pointing systems, and breaks from different causes can last from milliseconds to hours. Such hybrid ISL faults cause the on-board routing protocol to flap frequently, producing high routing overhead, slow convergence, and degraded service consistency. In this work, we propose a hybrid fault detection mechanism to distinguish transient from long-term ISL outages. For transient link outages, a segment-routing-based loop-free backup path provides real-time transmission recovery, while precise global route convergence restores long-term routing failures. To handle the inconsistent routing-table switch during the transition from a transient to a long-term fault, we propose a dual-timer mechanism that ensures the path is switched smoothly without micro-loops. Simulation results validate the feasibility and efficiency of the proposed scheme.
(This article belongs to the Section Astronautics & Space Science)
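The dual-timer idea can be sketched as a small per-link state machine: a short timer arms the precomputed segment-routing detour, and a long timer promotes the fault to long-term and installs the globally reconverged route. The timer values, state names, and tick-based interface are assumptions for illustration, not the paper's protocol.

```python
class DualTimerFailover:
    """Sketch of dual-timer switching between a segment-routing backup
    path (transient faults) and a globally reconverged route
    (long-term faults). Timer values are illustrative."""

    def __init__(self, transient_timeout=0.05, longterm_timeout=5.0):
        self.transient_timeout = transient_timeout  # arm the SR detour
        self.longterm_timeout = longterm_timeout    # trigger reconvergence
        self.fault_since = None
        self.state = "PRIMARY"

    def on_tick(self, now, link_up):
        """Advance the state machine given the current ISL status."""
        if link_up:
            self.fault_since = None
            self.state = "PRIMARY"
        else:
            if self.fault_since is None:
                self.fault_since = now
            down_for = now - self.fault_since
            if down_for >= self.longterm_timeout:
                self.state = "RECONVERGED"  # install recomputed global route
            elif down_for >= self.transient_timeout:
                self.state = "BACKUP"       # loop-free SR detour
        return self.state
```

Because the backup path stays in use until the long timer fires, the routing table is switched exactly once per long-term fault, which is what prevents micro-loops during the transition.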

22 pages, 3393 KiB  
Article
A Dynamic Spatio-Temporal Traffic Prediction Model Applicable to Low Earth Orbit Satellite Constellations
by Kexuan Liu, Yasheng Zhang and Shan Lu
Electronics 2025, 14(5), 1052; https://doi.org/10.3390/electronics14051052 - 6 Mar 2025
Abstract
Low Earth Orbit (LEO) constellations carry a wide range of communication services and are widely applied in global Internet access, the Internet of Things, remote-sensing monitoring, and emergency communication. With surging traffic volumes, the quality of user services faces unprecedented challenges. Accurate LEO constellation network traffic prediction can optimize resource allocation, enhance network performance, reduce unnecessary operational costs, and let the system adapt to future services. Ground networks often adopt machine learning (support vector machine, SVM) or deep learning (convolutional neural network, CNN; generative adversarial network, GAN) methods to predict short- and long-term traffic, aiming to optimize network performance and ensure service quality. However, these methods do not capture the high dynamics of LEO satellites and are not directly applicable to LEO constellations. Designing an intelligent traffic prediction model that accurately predicts multi-service scenarios in LEO constellations therefore remains an open challenge. In this paper, in light of the high dynamics and high-frequency data streams of LEO constellation traffic, the authors propose DST-LEO, a dynamic spatio-temporal LEO satellite traffic prediction model. The model captures implicit features among satellite nodes through multiple attention modules and processes inter-satellite-link traffic volumes and link connection/disconnection data via a multi-source data separation and fusion strategy. After splicing and fusing at a specific scale, the model predicts via the attention mechanism.
The model achieved a short-term prediction RMSE of 0.0028 and an MAE of 0.0018 on the Abilene dataset; for long-term prediction on Abilene, the RMSE was 0.0054 and the MAE 0.0039. On a dataset generated by an internal LEO constellation business simulation system, short-term prediction yielded an RMSE of 0.0034 and an MAE of 0.0026, while long-term prediction reached an RMSE of 0.0029 and an MAE of 0.0022. Compared with other time-series prediction models, the mean squared error decreased by 22.3% and the mean absolute error by 18.0%. The authors validated the function of each module through ablation experiments and further analyzed the model's effectiveness for LEO constellation network traffic prediction.
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
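The error metrics reported above are the standard RMSE and MAE over paired predicted/observed series; for reference, a minimal implementation:

```python
import math

def rmse(pred, true):
    """Root-mean-square error over paired sequences."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

def mae(pred, true):
    """Mean absolute error over paired sequences."""
    return sum(abs(p - t) for p, t in zip(pred, true)) / len(true)
```

RMSE penalizes large deviations more heavily than MAE, so reporting both (as the paper does) separates occasional large misses from the average error level.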

42 pages, 1602 KiB  
Article
Hierarchical Resource Management for Mega-LEO Satellite Constellation
by Liang Gou, Dongming Bian, Yulei Nie, Gengxin Zhang, Hongwei Zhou, Yulin Shi and Lei Zhang
Sensors 2025, 25(3), 902; https://doi.org/10.3390/s25030902 - 2 Feb 2025
Abstract
The mega-low Earth orbit (LEO) satellite constellation is pivotal for the future of satellite Internet and 6G networks. In the mega-LEO satellite constellation system (MLSCS), the spatial distribution of satellites, global users, and their services, together with the utilization of global spectrum resources, significantly impacts resource allocation and scheduling. This paper addresses the challenge of allocating system resources effectively according to service and resource distribution, particularly in hotspot areas where user demand is concentrated, to enhance resource utilization efficiency. We propose a novel three-layer management architecture designed to implement scheduling strategies and relieve the processing burden on the terrestrial Network Control Center (NCC), while providing real-time scheduling capabilities that adapt to rapid changes in network topology, resource distribution, and service requirements. The three layers of the resource management architecture, NCC, space base station (SBS), and user terminal (UT), are discussed in detail, along with the functions and responsibilities of each layer. Additionally, we explore various resource scheduling strategies, approaches, and algorithms, including spectrum cognition, interference coordination, beam scheduling, multi-satellite collaboration, and random access. Simulations demonstrate the effectiveness of the proposed approaches and algorithms, indicating significant improvements in resource management in the MLSCS.
(This article belongs to the Section Remote Sensors)

19 pages, 5172 KiB  
Article
Towards Digital-Twin Assisted Software-Defined Quantum Satellite Networks
by Francesco Chiti, Tommaso Pecorella, Roberto Picchi and Laura Pierucci
Sensors 2025, 25(3), 889; https://doi.org/10.3390/s25030889 - 31 Jan 2025
Abstract
The Quantum Internet (QI) necessitates a complete revision of the classical protocol stack and its technologies, since its operating principles depend on the physical laws of quantum mechanics. Recent experiments demonstrate that Optical Fibers (OFs) allow quantum connections only within urban areas. Therefore, a novel Quantum Satellite Backbone (QSB) composed of a considerable number of Quantum Satellite Repeaters (QSRs) deployed in Low Earth Orbit (LEO) would overcome the typical attenuation problems of OFs. Nevertheless, the dynamic nature of the scenario poses a challenge for novel satellite networks, complicating their design and management. We have therefore designed an ad hoc QSB that exploits the interaction between Digital Twin (DT) and Software-Defined Networking (SDN). In addition to defining the system architecture, we present a DT monitoring protocol that allows efficient status recovery for the creation of multiple End-to-End (E2E) entanglement states. Moreover, we evaluate system performance by assessing the path monitoring and configuration time, the time required to establish E2E entanglement, and the fidelity between a pair of Ground Stations (GSs) interconnected through the QSB, together with a deep analysis of the created temporal paths.
(This article belongs to the Special Issue Quantum Technologies for Communications and Networks Security)

18 pages, 3653 KiB  
Article
Intelligent Beam-Hopping-Based Grant-Free Random Access in Secure IoT-Oriented Satellite Networks
by Zhongliang Deng and Yicheng Liao
Sensors 2025, 25(1), 199; https://doi.org/10.3390/s25010199 - 1 Jan 2025
Abstract
This research presents an intelligent beam-hopping-based grant-free random access (GFRA) architecture designed for secure Internet of Things (IoT) communications in Low Earth Orbit (LEO) satellite networks. Given the difficulty of supporting extensive device connectivity while ensuring low latency and high reliability, we present a beam-hopping GFRA (BH-GFRA) scheme that enhances access efficiency and reduces resource collisions. Three resource-hopping schemes, random hopping, group hopping, and orthogonal group hopping, are examined within the framework. The technique uses orthogonal resource allocation algorithms to enable efficient resource sharing, effectively handling irregular and dynamic traffic. An activity mechanism based on the constraints of the spatio-temporal distribution of devices is also proposed. We assess the system's performance through a thorough mathematical analysis, deriving the access delay and success rate to evaluate its capability to serve a substantial number of IoT devices under satellite–terrestrial delay and the interference of massive connections. The proposed method demonstrably improves connectivity, stability, and access efficiency in 6G IoT satellite networks, meeting the rigorous demands of next-generation IoT applications.
(This article belongs to the Special Issue Advances in Security for Emerging Intelligent Systems)
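Orthogonal group hopping assigns each group a hopping sequence such that, in every slot, distinct groups occupy disjoint channels. A minimal sketch of one such pattern; the specific construction (block rotation plus intra-block stepping) is illustrative, not the sequence analyzed in the paper:

```python
def orthogonal_group_hop(group_id, slot, n_channels, n_groups):
    """Channel used by a group in a given slot.

    Channels are split into n_groups contiguous blocks; in each slot
    the groups rotate over the blocks and step within their block, so
    no two groups ever share a channel in the same slot. Illustrative
    construction; requires n_channels divisible by n_groups.
    """
    assert n_channels % n_groups == 0
    block = n_channels // n_groups
    base = ((group_id + slot) % n_groups) * block  # rotate blocks per slot
    return base + (slot % block)                   # step inside the block
```

With this pattern, collisions can only occur between devices of the same group, which is the property that lets grant-free access scale without per-transmission coordination.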

14 pages, 1311 KiB  
Article
Decision Transformer-Based Efficient Data Offloading in LEO-IoT
by Pengcheng Xia, Mengfei Zang, Jie Zhao, Ting Ma, Jie Zhang, Changxu Ni, Jun Li and Yiyang Ni
Entropy 2024, 26(10), 846; https://doi.org/10.3390/e26100846 - 7 Oct 2024
Abstract
Recently, the Internet of Things (IoT) has witnessed rapid development, but the scarcity of computing resources on the ground constrains its application scenarios. Low Earth Orbit (LEO) satellites have drawn attention for their broad coverage and short transmission delay: they can offload more IoT computing tasks to mobile edge computing (MEC) servers with lower latency, addressing the scarcity of ground computing resources. Nevertheless, sharing bandwidth and power among multiple IoT devices and LEO satellites is highly challenging. In this paper, we explore an efficient data offloading mechanism for LEO satellite-based IoT (LEO-IoT), where LEO satellites forward data from terrestrial devices to MEC servers. Specifically, by optimally selecting the forwarding LEO satellite for each IoT task and allocating communication resources, we aim to minimize data offloading latency and energy consumption. In particular, we employ the state-of-the-art Decision Transformer (DT) to solve this optimization problem. We first obtain a pre-trained DT by training on a specific task; the pre-trained DT is then fine-tuned with a small quantity of data from a new task, enabling rapid convergence with less training time and superior performance. Numerical simulation results demonstrate that, compared with a classical reinforcement learning approach (Proximal Policy Optimization), DT converges up to three times faster and performs up to 30% better.
(This article belongs to the Section Information Theory, Probability and Statistics)
