Search Results (142)

Search Parameters:
Keywords = low Earth orbit (LEO) satellite communications

16 pages, 2357 KiB  
Article
Joint Traffic Prediction and Handover Design for LEO Satellite Networks with LSTM and Attention-Enhanced Rainbow DQN
by Dinghe Fan, Shilei Zhou, Jihao Luo, Zijian Yang and Ming Zeng
Electronics 2025, 14(15), 3040; https://doi.org/10.3390/electronics14153040 - 30 Jul 2025
Viewed by 380
Abstract
With the increasing scale of low Earth orbit (LEO) satellite networks, leveraging non-terrestrial networks (NTNs) to complement terrestrial networks (TNs) has become a critical issue. In this paper, we investigate handover satellite selection among multiple terrestrial terminal groups (TTGs). To support effective handover decision-making, we propose a long short-term memory (LSTM)-network-based traffic prediction mechanism driven by historical traffic data. Building on these predictions, we formulate the handover strategy as a Markov Decision Process (MDP) and propose an attention-enhanced Rainbow-DQN-based joint traffic prediction and handover design framework (ARTHF) that jointly considers the satellite switching frequency, communication quality, and satellite load. Simulation results demonstrate that our approach significantly outperforms existing methods in terms of the handover efficiency, service quality, and load balancing across satellites. Full article
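The handover-as-MDP idea can be illustrated with a toy tabular sketch. This is not the paper's ARTHF: there is no LSTM or attention here, and the single-state rewards below, combining link quality, load, and a switching penalty, are made-up numbers.

```python
import random

# Toy handover target selection among three candidate satellites, treated as
# a single-state MDP (a bandit) solved with tabular Q-learning.
random.seed(0)

quality = [0.90, 0.70, 0.85]   # assumed per-satellite link quality
load    = [0.90, 0.10, 0.30]   # assumed per-satellite load
serving = 0                    # currently connected satellite

def reward(a):
    switch_cost = 0.2 if a != serving else 0.0
    return quality[a] - 0.5 * load[a] - switch_cost

q = [0.0, 0.0, 0.0]
alpha, eps = 0.1, 0.3
for _ in range(3000):
    a = random.randrange(3) if random.random() < eps else q.index(max(q))
    q[a] += alpha * (reward(a) - q[a])   # one-state MDP: no bootstrapped term

best = q.index(max(q))                   # chosen handover target
```

With these (hypothetical) numbers, staying on the loaded serving satellite is beaten by handing over to satellite 2 despite the switching penalty.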

20 pages, 5343 KiB  
Article
System-Level Assessment of Ka-Band Digital Beamforming Receivers and Transmitters Implementing Large Thinned Antenna Array for Low Earth Orbit Satellite Communications
by Giovanni Lasagni, Alessandro Calcaterra, Monica Righini, Giovanni Gasparro, Stefano Maddio, Vincenzo Pascale, Alessandro Cidronali and Stefano Selleri
Sensors 2025, 25(15), 4645; https://doi.org/10.3390/s25154645 - 26 Jul 2025
Viewed by 455
Abstract
In this paper, we present a system-level model of a digital multibeam antenna designed for Low Earth Orbit satellite communications operating in the Ka-band. We initially develop a suitable array topology, which is based on a thinned lattice, then adopt it as the foundation for evaluating its performance within a digital beamforming architecture. This architecture is implemented in a system-level simulator to evaluate the performance of the transmitter and receiver chains. This study advances the analysis of the digital antennas by incorporating both the RF front-end and digital sections non-idealities into a digital-twin framework. This approach enhances the designer’s ability to optimize the system with a holistic approach and provides insights into how various impairments affect the transmitter and receiver performance, identifying the subsystems’ parameter limits. To achieve this, we analyze several subsystems’ parameters and impairments, assessing their effects on both the antenna radiation and quality of the transmitted and received signals in a real applicative context. The results of this study reveal the sensitivity of the system to the impairments and suggest strategies to trade them off, emphasizing the importance of selecting appropriate subsystem features to optimize overall system performance. Full article
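The thinning idea itself is easy to demonstrate: remove elements from a regular lattice and evaluate the resulting array factor. The half-wavelength pitch, the Ka-band wavelength, and the on/off mask below are assumptions for illustration, not the paper's topology.

```python
import cmath
import math

# Array factor of a thinned uniform linear array: elements sit on a
# half-wavelength grid, and the mask switches some of them off ("thinning").
wavelength = 0.0107      # m, roughly a 28 GHz Ka-band carrier (assumed)
d = wavelength / 2       # grid pitch
mask = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1]  # hypothetical thinning

def array_factor(theta_deg):
    """|AF| toward angle theta off broadside, summing only active elements."""
    k = 2 * math.pi / wavelength
    psi = k * d * math.sin(math.radians(theta_deg))
    af = sum(cmath.exp(1j * n * psi) for n, on in enumerate(mask) if on)
    return abs(af)

active = sum(mask)
peak = array_factor(0.0)   # broadside: all active elements add in phase
```

At broadside the active elements add coherently, so the peak equals the active-element count; off-axis the thinned lattice trades gain for fewer chains.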

17 pages, 1184 KiB  
Article
A Biologically Inspired Cost-Efficient Zero-Trust Security Approach for Attacker Detection and Classification in Inter-Satellite Communication Networks
by Sridhar Varadala and Hao Xu
Future Internet 2025, 17(7), 304; https://doi.org/10.3390/fi17070304 - 13 Jul 2025
Viewed by 294
Abstract
In next-generation Low-Earth-Orbit (LEO) satellite networks, securing inter-satellite communication links (ISLs) through strong authentication is essential due to the network’s dynamic and distributed structure. Traditional authentication systems often struggle in these environments, leading to the adoption of Zero-Trust Security (ZTS) models. However, current ZTS protocols typically introduce high computational overhead, especially as the number of satellite nodes grows, which can impact both security and network performance. To overcome these challenges, a new bio-inspired ZTS framework called Manta Ray Foraging Cost-Optimized Zero-Trust Security (MRFCO-ZTS) has been introduced. This approach uses data-driven learning methods to enhance security across satellite communications. It continuously evaluates access requests by applying a cost function that accounts for risk level, likelihood of attack, and computational delay. The Manta Ray Foraging Optimization (MRFO) algorithm is used to minimize this cost, enabling effective classification of nodes as either trusted or malicious based on historical authentication records and real-time behavior. MRFCO-ZTS improves the accuracy of attacker detection while maintaining secure data exchange between authenticated satellites. Its effectiveness has been tested through numerical simulations under different satellite traffic conditions, with performance measured in terms of security accuracy, latency, and operational efficiency. Full article
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems, 2nd Edition)
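The cost-based access decision can be sketched as a weighted score over the three factors the abstract names (risk level, attack likelihood, computational delay); the weights and threshold are hypothetical, and no MRFO search is performed here.

```python
# Illustrative zero-trust access scoring in the spirit of the paper's cost
# function. W_RISK, W_ATTACK, W_DELAY and THRESHOLD are made-up values; the
# real scheme tunes the cost via Manta Ray Foraging Optimization.
W_RISK, W_ATTACK, W_DELAY = 0.5, 0.4, 0.1
THRESHOLD = 0.5

def access_cost(risk, attack_prob, delay_norm):
    """Weighted cost of granting an access request (all inputs in [0, 1])."""
    return W_RISK * risk + W_ATTACK * attack_prob + W_DELAY * delay_norm

def classify(node):
    return "malicious" if access_cost(*node) > THRESHOLD else "trusted"

benign = (0.2, 0.1, 0.3)    # low risk, low attack likelihood (assumed)
hostile = (0.9, 0.8, 0.5)   # high risk, high attack likelihood (assumed)
```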

23 pages, 2431 KiB  
Article
SatScope: A Data-Driven Simulator for Low-Earth-Orbit Satellite Internet
by Qichen Wang, Guozheng Yang, Yongyu Liang, Chiyu Chen, Qingsong Zhao and Sugai Chen
Future Internet 2025, 17(7), 278; https://doi.org/10.3390/fi17070278 - 24 Jun 2025
Viewed by 535
Abstract
The rapid development of low-Earth-orbit (LEO) satellite constellations has not only provided global users with low-latency and unrestricted high-speed data services but also presented researchers with the challenge of understanding dynamic changes in global network behavior. Unlike geostationary satellites and terrestrial internet infrastructure, LEO satellites move at a relative velocity of 7.6 km/s, leading to frequent alterations in their connectivity status with ground stations. Given the complexity of the space environment, current research on LEO satellite internet primarily focuses on modeling and simulation. However, existing LEO satellite network simulators often overlook the global network characteristics of these systems. We present SatScope, a data-driven simulator for LEO satellite internet. SatScope consists of three main components: space segment modeling, ground segment modeling, and network simulation configuration, providing researchers with an interface to interact with these models. Utilizing both space and ground segment models, SatScope can configure various network topology models, routing algorithms, and load balancing schemes, thereby enabling the evaluation of optimization algorithms for LEO satellite communication systems. We also compare SatScope’s fidelity, lightweight design, scalability, and openness against those of other simulators. Based on our simulation results using SatScope, we propose two metrics—ground node IP coverage rate and the number of satellite service IPs—to assess the service performance of single-layer satellite networks. Our findings reveal that during each network handover, on average, 38.94% of nodes and 83.66% of links change. Full article
(This article belongs to the Section Internet of Things)
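The quoted ~7.6 km/s follows directly from the circular-orbit speed v = sqrt(mu/r). A quick check, assuming a 550 km shell (the altitude is our assumption, not SatScope's):

```python
import math

# Circular orbital speed for a LEO shell: v = sqrt(mu / (R_E + h)).
MU_EARTH = 398600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6371.0         # km, mean Earth radius
altitude = 550.0         # km, assumed (typical of current broadband shells)

v = math.sqrt(MU_EARTH / (R_EARTH + altitude))   # km/s, ~7.6 as quoted
```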

18 pages, 2521 KiB  
Article
A Doppler Frequency-Offset Estimation Method Based on the Beam Pointing of LEO Satellites
by Yanjun Song, Jun Xu, Chenhua Sun, Xudong Li and Shaoyi An
Electronics 2025, 14(13), 2539; https://doi.org/10.3390/electronics14132539 - 23 Jun 2025
Viewed by 427
Abstract
With the advancement of 5G-Advanced Non-Terrestrial Network (5G-A NTN) mobile communication technologies, direct satellite connectivity for mobile devices has been increasingly adopted. In the highly dynamic environment of low-Earth-orbit (LEO) satellite communications, the synchronization of satellite–ground signals remains a critical challenge. In this study, a Doppler frequency-shift estimation method applicable to high-mobility LEO scenarios is proposed, without reliance on the Global Navigation Satellite System (GNSS). Rapid access to satellite systems by mobile devices is enabled without the need for additional time–frequency synchronization infrastructure. The generation mechanism of satellite–ground Doppler frequency shifts is analyzed, and a relationship between satellite velocity and beam-pointing direction is established. Based on this relationship, a Doppler frequency-shift estimation method, referred to as DFS-BP (Doppler frequency-shift estimation using beam pointing), is developed. The effects of Earth’s latitude and satellite orbital inclination are systematically investigated and optimized. Through simulation, the estimation performance under varying minimum satellite elevation angles and terminal geographic locations is evaluated. The algorithm may provide a novel solution for Doppler frequency-shift compensation in Non-Terrestrial Networks (NTNs). Full article
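The geometric relation underlying such beam-pointing methods is the classic Doppler formula f_d = (v/c) · f_c · cos(α), with α the angle between the satellite velocity vector and the beam-pointing (line-of-sight) direction. The carrier frequency and orbital speed below are assumed values, not the paper's:

```python
import math

# Doppler shift as a function of the velocity/beam-pointing angle.
C = 299_792_458.0   # m/s, speed of light
f_c = 2.0e9         # Hz, assumed S-band carrier
v_sat = 7_560.0     # m/s, typical LEO orbital speed (assumed)

def doppler(alpha_deg):
    """Doppler shift when the beam points alpha degrees off the velocity."""
    return (v_sat / C) * f_c * math.cos(math.radians(alpha_deg))

worst_case = doppler(0.0)   # beam aligned with velocity: maximum shift
```

At these assumed values the worst case is roughly 50 kHz, which is why GNSS-free estimation of the shift matters for rapid access.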

25 pages, 4360 KiB  
Article
Positioning-Based Uplink Synchronization Method for NB-IoT in LEO Satellite Networks
by Qiang Qi, Tao Hong and Gengxin Zhang
Symmetry 2025, 17(7), 984; https://doi.org/10.3390/sym17070984 - 21 Jun 2025
Viewed by 734
Abstract
With the growth of Internet of Things (IoT) business demands, NB-IoT integrated with low Earth orbit (LEO) satellite communication systems is considered a crucial component for achieving global coverage of IoT networks in the future. However, the long propagation delay and significant Doppler frequency shift of the satellite-to-ground link pose substantial challenges to the uplink and downlink synchronization in LEO satellite-based NB-IoT networks. To address this challenge, we first propose a Multiple Segment Auto-correlation (MSA) algorithm to detect the downlink Narrow-band Primary Synchronization Signal (NPSS), specifically tailored to the large Doppler frequency shift of LEO satellites. After detection, downlink synchronization can be realized by determining the arrival time and frequency of the NPSS. Then, to complete the uplink synchronization, we propose a position-based scheme to obtain the Timing Advance (TA) values and the pre-compensated Doppler shift value. In this scheme, we formulate a time difference of arrival (TDOA) equation using the arrival times of NPSSs from different satellites or at different times as observations. After solving the TDOA equation using the Chan method, the uplink synchronization is completed by obtaining the TA values and pre-compensated Doppler shift value from the terminal position combined with satellite ephemeris. Finally, the feasibility of the proposed scheme is verified in an Iridium satellite constellation. Compared to conventional GNSS-assisted methods, the approach proposed in this paper reduces terminal power consumption by 15–40%. Moreover, it achieves an uplink synchronization success rate of over 98% under negative SNR conditions. Full article
(This article belongs to the Special Issue Symmetry/Asymmetry in Future Wireless Networks)
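Once a terminal position is available, the TA value reduces to round-trip propagation time over the slant range. A minimal sketch, assuming a 600 km orbit and a spherical Earth (this is the final bookkeeping step, not the paper's Chan-method TDOA solver):

```python
import math

# Timing Advance from geometry: round-trip time over the slant range to the
# satellite, as a function of elevation angle. Altitude is assumed.
C = 299_792_458.0   # m/s
R_E = 6_371_000.0   # m, mean Earth radius
h = 600_000.0       # m, assumed orbital altitude

def slant_range(elev_deg):
    """Ground-to-satellite distance (law-of-cosines solution)."""
    e = math.radians(elev_deg)
    return math.sqrt((R_E + h)**2 - (R_E * math.cos(e))**2) - R_E * math.sin(e)

def timing_advance(elev_deg):
    return 2.0 * slant_range(elev_deg) / C   # seconds, round trip

ta_zenith = timing_advance(90.0)   # satellite directly overhead: ~4 ms
ta_low = timing_advance(10.0)      # near the horizon: much longer path
```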

17 pages, 474 KiB  
Article
User Experience-Oriented Content Caching for Low Earth Orbit Satellite-Enabled Mobile Edge Computing Networks
by Jianhua He, Youhan Zhao, Yonghua Ma and Qiang Wang
Electronics 2025, 14(12), 2413; https://doi.org/10.3390/electronics14122413 - 13 Jun 2025
Viewed by 340
Abstract
In this paper, we investigate a low Earth orbit (LEO) satellite-enabled mobile edge computing (MEC) network, where multiple cache-enabled LEO satellites are deployed to address heterogeneous content requests from ground users. To evaluate the network’s capability in meeting user demands, we adopt the average quality of experience (QoE) of the users as the performance metric, defined based on the effective transmission rate under communication interference. Our analysis reveals that the average QoE is determined by the content caching decisions at the satellites, thereby allowing us to formulate an average QoE maximization problem, subject to practical constraints on the satellite caching capacity. To tackle this NP-hard problem, we design a two-stage content caching algorithm that combines divide-and-conquer and greedy policies for efficient solution. The numerical results validate the effectiveness of the proposed approach. Compared with several benchmark schemes, our algorithm achieves notable improvements in terms of the average QoE while significantly reducing caching costs, particularly under resource-constrained satellite settings. Full article
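The greedy stage of such a caching scheme can be sketched as density-ordered filling of a capacity-limited cache; the contents, popularities, sizes, and capacity below are hypothetical, and the paper's full two-stage QoE-driven algorithm is more involved.

```python
# Greedy cache placement: rank contents by popularity per unit size and fill
# the satellite cache until capacity runs out. All numbers are made up.
contents = {            # name: (request popularity, size in cache units)
    "A": (0.50, 4),
    "B": (0.25, 1),
    "C": (0.15, 2),
    "D": (0.10, 3),
}
capacity = 5            # satellite cache capacity, assumed

cached, used = [], 0
for name, (pop, size) in sorted(contents.items(),
                                key=lambda kv: kv[1][0] / kv[1][1],
                                reverse=True):
    if used + size <= capacity:
        cached.append(name)
        used += size
```

Density ordering caches "B" (0.25/unit) before the more popular but bulkier "A" (0.125/unit), then stops when the next item would overflow the cache.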

21 pages, 676 KiB  
Article
Service-Driven Dynamic Beam Hopping with Resource Allocation for LEO Satellites
by Huaixiu Xu, Lilan Liu and Zhizhong Zhang
Electronics 2025, 14(12), 2367; https://doi.org/10.3390/electronics14122367 - 10 Jun 2025
Viewed by 838
Abstract
Given the problems of uneven distribution, strong time variability of ground service demands, and low utilization rate of on-board resources in Low-Earth-Orbit (LEO) satellite communication systems, how to efficiently utilize limited beam resources to flexibly and dynamically serve ground users has become a research hotspot. This paper studies the dynamic resource allocation and interference suppression strategies for beam hopping satellite communication systems. Specifically, in the full-frequency-reuse scenario, we adopt spatial isolation techniques to avoid co-channel interference between beams and construct a multi-objective optimization problem by introducing weight coefficients, aiming to maximize user satisfaction and minimize transmission delay simultaneously. We model this optimization problem as a Markov decision process and apply a value decomposition network (VDN) algorithm based on cooperative multi-agent reinforcement learning (MARL-VDN) to reduce computational complexity. In this algorithm framework, each beam acts as an agent, making independent decisions on hopping patterns and power allocation strategies, while achieving multi-agent cooperative optimization through sharing global states and joint reward mechanisms. Simulation results show that the applied algorithm can effectively enhance user satisfaction, reduce delay, and maintain high resource utilization in dynamic service demand scenarios. Additionally, the offline-trained MARL-VDN model can be deployed on LEO satellites in a distributed mode to achieve real-time on-board resource allocation on demand. Full article
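The property that makes VDN attractive for distributed on-board deployment is that, with Q_tot = ΣQ_i, each agent's independent greedy choice reproduces the centralized argmax over joint actions. A two-agent toy check with made-up Q-values:

```python
# Value decomposition network (VDN) property: if the joint value is a sum of
# per-agent values, decentralized greedy action selection equals centralized
# greedy selection. Q-values below are hypothetical.
q1 = {"beamA": 0.2, "beamB": 0.7}        # agent 1: beam-hopping choices
q2 = {"low_pwr": 0.4, "high_pwr": 0.1}   # agent 2: power-allocation choices

# Decentralized: each agent argmaxes its own table independently.
decentralized = (max(q1, key=q1.get), max(q2, key=q2.get))

# Centralized: argmax of the summed value over all joint actions.
joint = {(a1, a2): q1[a1] + q2[a2] for a1 in q1 for a2 in q2}
centralized = max(joint, key=joint.get)
```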

10 pages, 4421 KiB  
Proceeding Paper
Geometric Analysis of LEO-Based Monitoring of GNSS Constellations
by Can Oezmaden, Omar García Crespillo, Michael Niestroj, Marius Brachvogel and Michael Meurer
Eng. Proc. 2025, 88(1), 57; https://doi.org/10.3390/engproc2025088057 - 19 May 2025
Viewed by 621
Abstract
The last decade has seen a surge in the development and deployment of low Earth orbit (LEO) constellations primarily serving broadband communication applications. These developments have also spurred interest in providing positioning, navigation, and timing (PNT) services from LEO. Potential services include new ranging signals from LEO, augmentation of global navigation satellite systems (GNSS), and monitoring of GNSS. The latter promises an advantage over existing ground-based monitoring due to the reception of observables with reduced atmospheric error contributions and the potential for lower costs. In this paper, we investigate the influence of LEO constellation design on the line-of-sight visibility conditions for GNSS monitoring. We simulate a series of Walker constellations in LEO with a varying number of total satellites, orbital planes, and orbital heights. From the simulated data, we gather statistics on the number of visible GNSS and LEO satellites, the durations of visibility periods, and the quality of this visibility quantified by the dilution of precision (DOP) metric. Our findings indicate that increasing the total number of LEO satellites results in diminishing returns. We find that constellations with relatively few total satellites can still yield adequate monitoring capability. We also identify orbital geometric constraints resulting in suboptimal performance and discuss optimization strategies. Full article
(This article belongs to the Proceedings of European Navigation Conference 2024)
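One building block of such a visibility study is the elevation-mask test. A minimal sketch, assuming a spherical Earth, a 600 km orbit, and a 10° mask (not the paper's Walker-constellation grid):

```python
import math

# Is a LEO satellite visible from a ground station? Visible when its
# elevation exceeds the station's mask angle. phi is the Earth-central angle
# between the station and the satellite's sub-point.
R_E = 6371.0   # km, mean Earth radius
h = 600.0      # km, assumed orbital height

def elevation_deg(phi_deg):
    """Elevation of a satellite whose sub-point is phi degrees away."""
    phi = math.radians(phi_deg)
    return math.degrees(math.atan2(math.cos(phi) - R_E / (R_E + h),
                                   math.sin(phi)))

def visible(phi_deg, mask_deg=10.0):
    return elevation_deg(phi_deg) >= mask_deg

overhead = elevation_deg(0.0)   # sub-point coincides with the station: 90 deg
```

Sweeping phi over a simulated constellation's sub-points yields the visibility counts and durations that the DOP statistics are built on.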

24 pages, 5736 KiB  
Article
Joint Task Offloading and Power Allocation for Satellite Edge Computing Networks
by Yuxuan Li, Shibing Zhu, Ting Xiong, Yuwei Li, Qi Su and Jianmei Dai
Sensors 2025, 25(9), 2892; https://doi.org/10.3390/s25092892 - 3 May 2025
Viewed by 631
Abstract
Low Earth orbit (LEO) satellite networks have shown extensive application in the fields of navigation, communication services in remote areas, and disaster early warning. Inspired by multi-access edge computing (MEC) technology, satellite edge computing (SEC) technology emerges, which deploys mobile edge computing on satellites to achieve lower service latency by leveraging the advantage of satellites being closer to users. However, due to the limitations in the size and power of LEO satellites, processing computationally intensive tasks with a single satellite may overload it, reducing its lifespan and resulting in high service latency. In this paper, we consider a scenario of multi-satellite collaborative offloading. We mainly focus on computation offloading in the satellite edge computing network (SECN) by jointly considering the transmission power and task assignment ratios. A maximum delay minimization problem under the power and energy constraints is formulated, and a distributed balance increasing penalty dual decomposition (DB-IPDD) algorithm is proposed, utilizing the triple-layer computing structure that can leverage the computing resources of multiple LEO satellites. Simulation results demonstrate the advantage of the proposed solution over several baseline schemes. Full article
(This article belongs to the Section Communications)
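The min-max-delay intuition can be seen in a stripped-down case with processing delay only: splitting the task in proportion to compute rates equalizes, and hence minimizes the maximum of, the per-satellite delays. The workload and rates are made up; the paper's DB-IPDD additionally handles transmission power and energy constraints.

```python
# Collaborative offloading toy: split a task of W CPU cycles across satellites
# in proportion to their compute rates so all finish at the same time.
W = 12.0e9                      # total task size in CPU cycles (assumed)
rates = [2.0e9, 3.0e9, 1.0e9]   # per-satellite CPU cycles/s (assumed)

total = sum(rates)
ratios = [f / total for f in rates]               # task assignment ratios
delays = [r * W / f for r, f in zip(ratios, rates)]
max_delay = max(delays)                           # the min-max objective
```

Any deviation from proportional splitting makes some satellite finish later than W/Σf, so the equalized point is the min-max optimum for this simplified model.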

10 pages, 1265 KiB  
Proceeding Paper
Indoor Signal Strength Evaluation of the Orbcomm Low Earth Orbit Satellite Constellation
by Wout Van Uytsel, Thomas Janssen, Maarten Weyn and Rafael Berkvens
Eng. Proc. 2025, 88(1), 39; https://doi.org/10.3390/engproc2025088039 - 29 Apr 2025
Viewed by 590
Abstract
In this connected world, communication in all kinds of complex environments is crucial. As a result, indoor satellite communication could enable many new applications and use cases. In this study, we explore the potential of Low Earth Orbit (LEO) satellites to provide indoor coverage. This is done by evaluating the signal strength of Orbcomm LEO satellite signals in multiple indoor environments within a suburban home. Starting from IQ samples, we developed an algorithm to calculate the Carrier-to-Noise Density Ratio (C/N0) as a key performance metric to compare environments when the Carrier-to-Noise Ratio (CNR) is above 0 dB. By utilizing a Software Defined Radio (SDR) in combination with this algorithm, we were able to evaluate the signal strength differences between environments. We found that the LEO satellite signals penetrated every environment, including the basement. The signals were even received with high signal strength in the attic, reaching values above 55 dB-Hz. Moreover, the signals were well received in every above-ground environment. Unsurprisingly, the satellite signals were received the weakest in the basement and only for a short duration of time. Full article
(This article belongs to the Proceedings of European Navigation Conference 2024)
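The C/N0 bookkeeping behind such comparisons: the carrier-to-noise density ratio in dB-Hz is the CNR in dB plus 10·log10 of the noise bandwidth. The CNR values and the bandwidth below are assumptions for illustration, not the paper's measurements.

```python
import math

# C/N0 (dB-Hz) from a CNR measured in a known noise bandwidth:
#   C/N0 = CNR_dB + 10*log10(B)
def cn0_dbhz(cnr_db, bandwidth_hz):
    return cnr_db + 10.0 * math.log10(bandwidth_hz)

B = 4800.0                       # Hz, assumed measurement bandwidth
attic = cn0_dbhz(20.0, B)        # strong indoor reception (assumed CNR)
basement = cn0_dbhz(-3.0, B)     # weak reception (assumed CNR)
```

With these assumed numbers the attic case lands above the 55 dB-Hz mark mentioned in the abstract, while a below-0 dB CNR (as in the basement) would fall outside the paper's comparison criterion.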

29 pages, 4136 KiB  
Article
IoT-NTN with VLEO and LEO Satellite Constellations and LPWAN: A Comparative Study of LoRa, NB-IoT, and Mioty
by Changmin Lee, Taekhyun Kim, Chanhee Jung and Zizung Yoon
Electronics 2025, 14(9), 1798; https://doi.org/10.3390/electronics14091798 - 28 Apr 2025
Viewed by 1214
Abstract
This study investigates the optimization of satellite constellations for Low-Power, Wide-Area Network (LPWAN)-based Internet of Things (IoT) communications in Very Low Earth Orbit (VLEO) at 200 km and 300 km altitudes and Low Earth Orbit (LEO) at 600 km using a Genetic Algorithm (GA). Focusing on three LPWAN technologies—LoRa, Narrowband IoT (NB-IoT), and Mioty—we evaluate their performance in terms of revisit time, data transmission volume, and economic efficiency. Results indicate that a 300 km VLEO constellation with LoRa achieves the shortest average revisit time and requires the fewest satellites, offering notable cost benefits. NB-IoT provides the highest data transmission volume. Mioty demonstrates strong scalability but necessitates a larger satellite count. These findings highlight the potential of VLEO satellites, particularly at 300 km, combined with LPWAN solutions for efficient and scalable IoT Non-Terrestrial Network (IoT-NTN) applications. Future work will explore multi-altitude simulations and hybrid LPWAN integration for further optimization. Full article
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
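A bare-bones GA loop of the kind used for constellation search might look like the sketch below. The bitstring fitness is a deliberate placeholder; a real objective would decode each genome into a candidate constellation and score revisit time, data volume, and cost.

```python
import random

# Minimal genetic algorithm: truncation selection with elitism, one-point
# crossover, bit-flip mutation. Fitness is a stand-in objective.
random.seed(42)
GENES, POP, GENERATIONS = 16, 20, 30

def fitness(bits):
    return sum(bits)   # placeholder: reward active "slots" in the genome

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - g if random.random() < rate else g for g in bits]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
initial_best = max(fitness(p) for p in pop)
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                 # keep the better half
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children
final_best = max(fitness(p) for p in pop)
```

Elitism guarantees the best fitness never regresses between generations, which is why such loops are a convenient wrapper around an expensive constellation simulator.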

24 pages, 718 KiB  
Article
An Accelerated Maximum Flow Algorithm with Prediction Enhancement in Dynamic LEO Networks
by Jiayin Sheng, Xinjie Guan, Fuliang Yang and Xili Wan
Sensors 2025, 25(8), 2555; https://doi.org/10.3390/s25082555 - 17 Apr 2025
Viewed by 595
Abstract
Efficient data transmission in low Earth orbit (LEO) satellite networks is critical for supporting real-time global communication, Earth observation, and numerous data-intensive space missions. A fundamental challenge in these networks involves solving the maximum flow problem, which determines the optimal data throughput across highly dynamic topologies with limited onboard energy and data processing capability. Traditional algorithms often fall short in these environments due to their high computational costs and inability to adapt to frequent topological changes or fluctuating link capacities. This paper introduces an accelerated maximum flow algorithm specifically designed for dynamic LEO networks, leveraging a prediction-enhanced approach to improve both speed and adaptability. The proposed algorithm integrates a novel energy-time expanded graph (e-TEG) framework, which jointly models satellite-specific constraints including time-varying inter-satellite visibility, limited onboard processing capacities, and dynamic link capacities. In addition, a learning-augmented warm-start strategy is introduced to enhance the Ford–Fulkerson algorithm. It generates near-optimal initial flows based on historical network states, which reduces the number of augmentation steps required and accelerates computation under dynamic conditions. Theoretical analyses confirm the correctness and time efficiency of the proposed approach. Evaluation results validate that the prediction-enhanced approach achieves up to a 32.2% reduction in computation time compared to conventional methods, particularly under varying storage capacity and network topologies. These results demonstrate the algorithm’s potential to support high-throughput, efficient data transmission in future satellite communication systems. Full article
(This article belongs to the Section Navigation and Positioning)
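The warm-start idea can be shown on plain Edmonds–Karp (BFS Ford–Fulkerson): seed the residual graph with a feasible initial flow, then augment only the remainder. Here the initial flow is hand-written rather than predicted from historical states, and the 5-node graph is hypothetical.

```python
from collections import deque

def max_flow(cap, s, t, init_flow=()):
    """BFS Ford-Fulkerson with an optional warm-start flow.

    cap: dict of dicts of edge capacities; init_flow: iterable of
    (u, v, f) triples forming a feasible s-t flow.
    """
    res = {u: dict(vs) for u, vs in cap.items()}     # residual capacities
    for u, v, f in init_flow:                        # apply warm-start flow
        res[u][v] -= f
        res.setdefault(v, {})[u] = res.get(v, {}).get(u, 0) + f
    total = sum(f for u, v, f in init_flow if v == t)
    while True:
        parent = {s: None}                           # BFS for augmenting path
        dq = deque([s])
        while dq and t not in parent:
            u = dq.popleft()
            for v, c in res.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    dq.append(v)
        if t not in parent:
            return total
        path, v = [], t                              # walk back to the source
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(res[u][v] for u, v in path)       # bottleneck capacity
        for u, v in path:
            res[u][v] -= push
            res.setdefault(v, {})[u] = res.get(v, {}).get(u, 0) + push
        total += push

capacities = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2},
              "b": {"t": 3}, "t": {}}
cold = max_flow(capacities, "s", "t")
warm = max_flow(capacities, "s", "t", init_flow=[("s", "a", 2), ("a", "t", 2)])
```

Both runs reach the same maximum flow; the warm start just spends fewer augmentation steps getting there, which is the saving a learned initial flow buys in a fast-changing LEO topology.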

23 pages, 6257 KiB  
Article
LEO Satellite Navigation Signal Multi-Dimensional Interference Optimisation Method Based on Hybrid Game Theory
by Chengkai Tang, Xunbin Zhou, Lingling Zhang, Yangyang Liu and Zesheng Dan
Remote Sens. 2025, 17(8), 1444; https://doi.org/10.3390/rs17081444 - 17 Apr 2025
Viewed by 609
Abstract
Low Earth Orbit (LEO) satellite communication is gradually becoming the main carrier for satellite communication by virtue of its advantages, such as high landing power, narrow beam, large transmission bandwidth, and small time delay. In the military field, interference with LEO satellites has become a core element in combat, but the existing interference and confrontation methods cannot meet the needs of LEO satellite interference. Aiming at the above problems, this paper proposes an LEO satellite navigation signal multi-dimensional interference optimisation method based on hybrid game theory. Firstly, the method achieves a dynamic classification of jammers within the airspace. Then, an interference effectiveness evaluation function is established, which reflects the time, frequency, and power domain losses, as well as the strategy gains. With the help of hybrid game theory, the optimal resource allocation under Nash equilibrium is achieved, and the distributed interference optimisation problem is effectively solved. The experiment uses a large microwave anechoic chamber as an interference verification scenario. The results indicate that the interference bit error rate (BER) of the algorithm proposed in this paper is on the order of 10⁻², under the premise of guaranteeing full coverage of the area to be jammed. The value of the multidimensional interference utility function, including the power, time, and frequency domains, is improved by at least 0.4993 times compared to other algorithms. Full article

23 pages, 678 KiB  
Article
Monitoring High-Dynamic Wideband Orthogonal Frequency Division Multiplexing Signal Under Weak Prior Knowledge
by Chaoqun Hou, Linan Wang, Yuqing Wang, Xiangni Zou, Jianbo Liu, Teng Hou, Zehui Zhang and Jianxiong Pan
Electronics 2025, 14(8), 1620; https://doi.org/10.3390/electronics14081620 - 17 Apr 2025
Viewed by 329
Abstract
In the context of the escalating requirement for high-throughput multimedia services, Orthogonal Frequency Division Multiplexing (OFDM) signal systems have become increasingly prevalent in Low Earth Orbit (LEO) satellite communications. In order to promote the judicious, efficient, and cost-effective deployment of satellite spectrum resources, there is a critical need to augment the capabilities of spectrum monitoring technology for non-cooperative LEO satellite OFDM signals. Addressing the complexities inherent in the broad bandwidth, substantial dynamic range, and weak prior knowledge associated with LEO satellite OFDM signal monitoring, this study introduces an innovative methodology. This approach harnesses a broadband parallel flexible filtering variable sampling technique to facilitate the real-time observation of satellite OFDM signals across a wide bandwidth spectrum. Moreover, the research presents a dynamic compensation technique, which utilizes ephemeris information, to mitigate frequency offset and amplitude fading issues within the monitored signals. After compensation, the study conducts signal identification and parameter analysis, utilizing the intrinsic features of OFDM signals. This technique empowers real-time monitoring and the accurate analysis of satellite broadband OFDM signals, ensuring robust performance in scenarios characterized by weak prior knowledge and significant dynamic variations. Full article
