Search Results (52)

Search Parameters:
Keywords = terrestrial constellation

41 pages, 1710 KiB  
Article
Toward Integrated Satellite Operations and Network Management: A Review and Novel Framework
by Arnau Singla, Franco Criscola, David Canales, Juan A. Fraire, Anna Calveras and Joan A. Ruiz-de-Azua
Technologies 2025, 13(8), 312; https://doi.org/10.3390/technologies13080312 - 22 Jul 2025
Viewed by 402
Abstract
Achieving global coverage and performance goals for 6G requires seamless integration of satellite and terrestrial networks, yet current operational frameworks lack common standards for managing these heterogeneous infrastructures. This paper addresses the critical need for unified satellite-terrestrial network operations by proposing the CMS framework, a novel task-scheduling-based approach that bridges the operational gap between satellite operations (SatOps) and network operations (NetOps). The framework integrates satellite-specific constraints with network service requirements and QoS metrics through constraint satisfaction programming and multi-objective optimization. Three novel architectures are introduced: integrated operations (embedding NetOps within SatOps), coordinated operations (unified control with separate execution channels), and adaptive operations (mutual adaptation through intelligent interfaces). Each architecture addresses different connectivity scenarios and integration requirements for both sporadic and persistent satellite constellations. The proposed architectures are evaluated against challenges spanning infrastructure and architecture, interoperability and standardization, integrated management, operational dynamics, and technology maturation and deployment. Validation through simulation demonstrates significant performance improvements, with task completion rates improving by 17.87% to 44.02% and data throughput gains of 25.09% to 93.62% compared to traditional approaches. The CMS framework establishes a resilient operational standard for future 6G networks, offering practical solutions to bridge the current divide between satellite and terrestrial network operations. Full article
(This article belongs to the Section Information and Communication Technologies)
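As an illustration of the task-scheduling idea the abstract describes (coupling satellite contact constraints with service requirements), the following minimal Python sketch places tasks into contact windows under a per-window time budget and reports two objectives, completion rate and delivered data. It is a stand-in for a real constraint-satisfaction/multi-objective solver, not the authors' CMS implementation; all task and window values are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    name: str
    duration_s: float   # required contact time
    data_mb: float      # data returned if completed
    priority: int       # higher = more important

@dataclass
class Window:
    start_s: float
    end_s: float
    remaining_s: float = None

    def __post_init__(self):
        self.remaining_s = self.end_s - self.start_s

def schedule(tasks: List[Task], windows: List[Window]):
    """Greedy placement: each task goes into the first contact window with
    enough remaining time (a toy stand-in for constraint satisfaction plus
    multi-objective scoring)."""
    plan, completed_data = [], 0.0
    for task in sorted(tasks, key=lambda t: -t.priority):
        for w in windows:
            if w.remaining_s >= task.duration_s:
                w.remaining_s -= task.duration_s
                plan.append((task.name, w.start_s))
                completed_data += task.data_mb
                break
    completion_rate = len(plan) / len(tasks)
    return plan, completion_rate, completed_data

tasks = [Task("telemetry", 120, 15, 2), Task("imaging-dl", 600, 800, 3),
         Task("sw-update", 300, 50, 1)]
windows = [Window(0, 480), Window(5400, 6300)]
print(schedule(tasks, windows))
```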

23 pages, 5644 KiB  
Article
Exploring the Performance of Transparent 5G NTN Architectures Based on Operational Mega-Constellations
by Oscar Baselga, Anna Calveras and Joan Adrià Ruiz-de-Azua
Network 2025, 5(3), 25; https://doi.org/10.3390/network5030025 - 18 Jul 2025
Viewed by 299
Abstract
The evolution of 3GPP non-terrestrial networks (NTNs) is enabling new avenues for broadband connectivity via satellite, especially within the scope of 5G. The parallel rise in satellite mega-constellations has further fueled efforts toward ubiquitous global Internet access. This convergence has fostered collaboration between mobile network operators and satellite providers, allowing the former to leverage mature space infrastructure and the latter to integrate with terrestrial mobile standards. However, integrating these technologies presents significant architectural challenges. This study investigates 5G NTN architectures using satellite mega-constellations, focusing on transparent architectures where Starlink is employed to relay the backhaul, midhaul, and new radio (NR) links. The performance of these architectures is assessed through a testbed utilizing OpenAirInterface (OAI) and Open5GS, which collects key user-experience metrics such as round-trip time (RTT) and jitter when pinging the User Plane Function (UPF) in the 5G core (5GC). Results show that backhaul and midhaul relays maintain delays of 50–60 ms, while NR relays incur delays exceeding one second due to traffic overload introduced by the RFSimulator tool, which is indispensable for transmitting the NR signal over Starlink. These findings suggest that while transparent architectures provide valuable insights and utility, regenerative architectures are essential for addressing the observed latency issues and fully realizing the capabilities of space-based broadband services. Full article
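Since the testbed's headline metrics are RTT and jitter obtained by pinging the UPF, a brief sketch of how such metrics are typically derived from raw ping samples may help; this is illustrative post-processing, not part of the authors' OAI/Open5GS setup, and the sample values are invented.

```python
import statistics

def summarize_rtt(rtt_ms):
    """Summarize ping round-trip times: mean/min/max RTT plus two common
    jitter estimates (standard deviation, and mean absolute difference of
    consecutive samples in the spirit of RFC 3550)."""
    consecutive_diffs = [abs(b - a) for a, b in zip(rtt_ms, rtt_ms[1:])]
    return {
        "mean_rtt_ms": statistics.fmean(rtt_ms),
        "min_rtt_ms": min(rtt_ms),
        "max_rtt_ms": max(rtt_ms),
        "jitter_stdev_ms": statistics.stdev(rtt_ms),
        "jitter_mean_delta_ms": statistics.fmean(consecutive_diffs),
    }

# Hypothetical RTT samples (ms) from pinging the UPF over a satellite relay.
samples = [52.1, 57.4, 54.9, 61.0, 55.3, 58.7]
print(summarize_rtt(samples))
```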

23 pages, 2431 KiB  
Article
SatScope: A Data-Driven Simulator for Low-Earth-Orbit Satellite Internet
by Qichen Wang, Guozheng Yang, Yongyu Liang, Chiyu Chen, Qingsong Zhao and Sugai Chen
Future Internet 2025, 17(7), 278; https://doi.org/10.3390/fi17070278 - 24 Jun 2025
Viewed by 403
Abstract
The rapid development of low-Earth-orbit (LEO) satellite constellations has not only provided global users with low-latency and unrestricted high-speed data services but also presented researchers with the challenge of understanding dynamic changes in global network behavior. Unlike geostationary satellites and terrestrial internet infrastructure, LEO satellites move at a relative velocity of 7.6 km/s, leading to frequent alterations in their connectivity status with ground stations. Given the complexity of the space environment, current research on LEO satellite internet primarily focuses on modeling and simulation. However, existing LEO satellite network simulators often overlook the global network characteristics of these systems. We present SatScope, a data-driven simulator for LEO satellite internet. SatScope consists of three main components, space segment modeling, ground segment modeling, and network simulation configuration, providing researchers with an interface to interact with these models. Utilizing both space and ground segment models, SatScope can configure various network topology models, routing algorithms, and load balancing schemes, thereby enabling the evaluation of optimization algorithms for LEO satellite communication systems. We also compare SatScope’s fidelity, lightweight design, scalability, and openness against other simulators. Based on our simulation results using SatScope, we propose two metrics—ground node IP coverage rate and the number of satellite service IPs—to assess the service performance of single-layer satellite networks. Our findings reveal that during each network handover, on average, 38.94% of nodes and 83.66% of links change. Full article
(This article belongs to the Section Internet of Things)
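The reported per-handover churn (38.94% of nodes, 83.66% of links) is a set-difference measure over successive topology snapshots. The sketch below shows one plausible way to compute such fractions; the snapshots are hypothetical and this is not SatScope code.

```python
def change_fraction(before: set, after: set) -> float:
    """Fraction of elements appearing in exactly one of two snapshots,
    relative to their union (one plausible churn definition)."""
    union = before | after
    changed = before ^ after          # symmetric difference
    return len(changed) / len(union) if union else 0.0

# Hypothetical snapshots: nodes are satellite IDs, links are (sat, ground) pairs.
nodes_t0 = {"S1", "S2", "S3", "S4"}
nodes_t1 = {"S2", "S3", "S4", "S5"}
links_t0 = {("S1", "G1"), ("S2", "G1"), ("S3", "G2")}
links_t1 = {("S2", "G2"), ("S4", "G1"), ("S3", "G2")}

print("node churn:", change_fraction(nodes_t0, nodes_t1))
print("link churn:", change_fraction(links_t0, links_t1))
```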

29 pages, 2193 KiB  
Article
Evaluation of TOPSIS Algorithm for Multi-Criteria Handover in LEO Satellite Networks: A Sensitivity Analysis
by Pascal Buhinyori Ngango, Marie-Line Lufua Binda, Michel Matalatala Tamasala, Pierre Sedi Nzakuna, Vincenzo Paciello and Angelo Kuti Lusala
Network 2025, 5(2), 15; https://doi.org/10.3390/network5020015 - 2 May 2025
Viewed by 961
Abstract
The Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is widely recognized as an effective multi-criteria decision-making algorithm for handover management in terrestrial cellular networks, especially in scenarios involving dynamic and multi-faceted criteria. While TOPSIS is widely adopted in terrestrial cellular networks for handover management, its application in satellite networks, particularly in Low Earth Orbit (LEO) constellations, remains limited and underexplored. In this work, the performance of three TOPSIS algorithms is evaluated for handover management in LEO satellite networks, where efficient handover management is crucial due to rapid changes in satellite positions and network conditions. Sensitivity analysis is conducted on Standard Deviation TOPSIS (SD-TOPSIS), Entropy-TOPSIS, and Importance-TOPSIS in the context of LEO satellite networks, assessing their responsiveness to small variations in key performance metrics such as upload speed, download speed, ping, and packet loss. This study uses real-world data from “Starlink-on-the-road-Dataset”. Results show that SD-TOPSIS effectively optimizes handover management in dynamic LEO satellite networks thanks to its lower standard deviation scores and reduced score variation rate, thus demonstrating superior stability and lower sensitivity to small variations in performance metrics values compared to both Entropy-TOPSIS and Importance-TOPSIS. This ensures more consistent decision-making, avoidance of unnecessary handovers, and enhanced robustness in rapidly-changing network conditions, making it particularly suitable for real-time services that require stable, low-latency, and reliable connectivity. Full article
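For readers unfamiliar with the method, the numpy sketch below implements a generic TOPSIS ranking with weights derived from the standard deviation of the normalized criteria (one common reading of the SD-TOPSIS variant named in the abstract). The candidate matrix and the benefit/cost designations are illustrative, not the paper's dataset.

```python
import numpy as np

def sd_topsis(matrix, benefit):
    """Rank alternatives (rows) over criteria (columns) with TOPSIS,
    deriving criterion weights from the standard deviation of each
    normalized column."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)              # vector normalization
    weights = norm.std(axis=0)
    weights /= weights.sum()                          # SD-based weights
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                    # closeness in [0, 1]

# Columns: upload Mbps, download Mbps, ping ms, packet loss % (last two are costs).
candidates = [[12.0, 95.0, 45.0, 0.8],
              [10.5, 110.0, 60.0, 0.3],
              [14.0, 80.0, 38.0, 1.2]]
benefit = np.array([True, True, False, False])
scores = sd_topsis(candidates, benefit)
print("handover target:", int(np.argmax(scores)), scores.round(3))
```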

29 pages, 4136 KiB  
Article
IoT-NTN with VLEO and LEO Satellite Constellations and LPWAN: A Comparative Study of LoRa, NB-IoT, and Mioty
by Changmin Lee, Taekhyun Kim, Chanhee Jung and Zizung Yoon
Electronics 2025, 14(9), 1798; https://doi.org/10.3390/electronics14091798 - 28 Apr 2025
Viewed by 1016
Abstract
This study investigates the optimization of satellite constellations for Low-Power, Wide-Area Network (LPWAN)-based Internet of Things (IoT) communications in Very Low Earth Orbit (VLEO) at 200 km and 300 km altitudes and Low Earth Orbit (LEO) at 600 km using a Genetic Algorithm (GA). Focusing on three LPWAN technologies—LoRa, Narrowband IoT (NB-IoT), and Mioty—we evaluate their performance in terms of revisit time, data transmission volume, and economic efficiency. Results indicate that a 300 km VLEO constellation with LoRa achieves the shortest average revisit time and requires the fewest satellites, offering notable cost benefits. NB-IoT provides the highest data transmission volume. Mioty demonstrates strong scalability but necessitates a larger satellite count. These findings highlight the potential of VLEO satellites, particularly at 300 km, combined with LPWAN solutions for efficient and scalable IoT Non-Terrestrial Network (IoT-NTN) applications. Future work will explore multi-altitude simulations and hybrid LPWAN integration for further optimization. Full article
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
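The abstract mentions a Genetic Algorithm over constellation parameters. The skeleton below shows the kind of generic GA loop assumed here (tournament selection, one-point crossover, mutation) with a placeholder fitness; in a real study the fitness would come from an orbit-propagation/revisit-time simulation, and every parameter in this sketch is hypothetical rather than taken from the paper.

```python
import random

# Hypothetical genome: (altitude_km, num_planes, sats_per_plane)
ALTITUDES = [200, 300, 600]

def random_genome():
    return (random.choice(ALTITUDES), random.randint(3, 12), random.randint(4, 20))

def fitness(genome):
    """Placeholder fitness: a real study would call an orbit/coverage simulator
    (revisit time, data volume, cost). Here we merely reward more satellites
    and lower altitude while penalizing constellation size as a cost proxy."""
    alt, planes, per_plane = genome
    n_sats = planes * per_plane
    revisit_proxy = alt / n_sats          # smaller is better
    cost_proxy = n_sats                   # smaller is better
    return -(revisit_proxy + 0.05 * cost_proxy)

def evolve(pop_size=20, generations=30, mutation_rate=0.2):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection of parents
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(pop_size)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randint(1, 2)                # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:       # mutation
                child = random_genome()
            children.extend([child, a])
        pop = children[:pop_size]
    return max(pop, key=fitness)

print("best constellation (alt_km, planes, sats/plane):", evolve())
```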

32 pages, 1019 KiB  
Article
Time Scale in Alternative Positioning, Navigation, and Timing: New Dynamic Radio Resource Assignments and Clock Steering Strategies
by Khanh Pham
Information 2025, 16(3), 210; https://doi.org/10.3390/info16030210 - 9 Mar 2025
Viewed by 892
Abstract
Terrestrial and satellite communications, tactical data links, positioning, navigation, and timing (PNT), as well as distributed sensing will continue to require precise timing and the ability to synchronize and disseminate time effectively. However, the supply of space-qualified clocks that meet Global Navigation Satellite Systems (GNSS)-level performance standards is limited. As the awareness of potential disruptions to GNSS due to adversarial actions grows, the current reliance on GNSS-level timing appears costly and outdated. This is especially relevant given the benefits of developing robust and stable time scale references in orbit, especially as various alternatives to GNSS are being explored. The onboard realization of clock ensembles is particularly promising for applications such as those providing the on-demand dissemination of a reference time scale for navigation services via a proliferated Low-Earth Orbit (pLEO) constellation. This article investigates potential inter-satellite network architectures for coordinating time and frequency across pLEO platforms. These architectures dynamically allocate radio resources for clock data transport based on the requirements for pLEO time scale formations. Additionally, this work proposes a model-based control system for wireless networked timekeeping systems. It envisions the optimal placement of critical information concerning the implicit ensemble mean (IEM) estimation across a multi-platform clock ensemble, which can offer better stability than relying on any single ensemble member. This approach aims to reduce data traffic flexibly. By making the IEM estimation sensor more intelligent and running it on the anchor platform while also optimizing the steering of remote frequency standards on participating platforms, the networked control system can better predict the future behavior of local reference clocks paired with low-noise oscillators. This system would then send precise IEM estimation information at critical moments to ensure a common pLEO time scale is realized across all participating platforms. Clock steering is essential for establishing these time scales, and the effectiveness of the realization depends on the selected control intervals and steering techniques. To enhance performance reliability beyond what the existing Linear Quadratic Gaussian (LQG) control technique can provide, the minimal-cost-variance (MCV) control theory is proposed for clock steering operations. The steering process enabled by the MCV control technique significantly impacts the overall performance reliability of the time scale, which is generated by the onboard ensemble of compact, lightweight, and low-power clocks. This is achieved by minimizing the variance of the chi-squared random performance of LQG control while maintaining a constraint on its mean. Full article
(This article belongs to the Special Issue Sensing and Wireless Communications)
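As a schematic of the quantities discussed (not the paper's exact formulation), a weighted implicit ensemble mean over $N$ member clocks and the contrast between LQG and MCV steering objectives can be written as follows, where $x_i$ is clock $i$'s deviation, $\hat{x}_i$ its prediction, $w_i$ its weight, and $J$ the usual quadratic cost:

$$\hat{x}_E(t) \;=\; \sum_{i=1}^{N} w_i\,\bigl[x_i(t) - \hat{x}_i(t)\bigr], \qquad \sum_{i=1}^{N} w_i = 1,$$

$$J \;=\; \sum_{k} \bigl(x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k\bigr), \qquad \text{LQG: } \min_u \, \mathbb{E}[J], \qquad \text{MCV: } \min_u \, \operatorname{Var}[J] \ \text{s.t.}\ \mathbb{E}[J] \le \bar{J}.$$

The MCV line mirrors the abstract's description: minimize the variance of the random quadratic performance while constraining its mean.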

17 pages, 725 KiB  
Article
Polar Code BP Decoding Optimization for Green 6G Satellite Communication: A Geometry Perspective
by Chuanji Zhu, Yuanzhi He and Zheng Dou
Axioms 2025, 14(3), 174; https://doi.org/10.3390/axioms14030174 - 27 Feb 2025
Cited by 1 | Viewed by 555
Abstract
The rapid evolution of mega-constellation networks and 6G satellite communication systems has ushered in an era of ubiquitous connectivity, yet their sustainability is threatened by the energy-computation dilemma inherent in high-throughput data transmission. Polar codes, as a coding scheme capable of achieving Shannon’s limit, have emerged as one of the key candidate coding technologies for 6G networks. Despite the high parallelism and excellent performance of their Belief Propagation (BP) decoding algorithm, its drawbacks of numerous iterations and slow convergence can lead to higher energy consumption, impacting system energy efficiency and sustainability. Therefore, research on efficient early termination algorithms has become an important direction in polar code research. In this paper, based on information geometry theory, we propose a novel geometric framework for BP decoding of polar codes and design two early termination algorithms under this framework: an early termination algorithm based on Riemannian distance and an early termination algorithm based on divergence. These algorithms improve convergence speed by geometrically analyzing the changes in soft information during the BP decoding process. Simulation results indicate that, when Eb/N0 is between 1.5 dB and 2.5 dB, compared to three classical early termination algorithms, the two early termination algorithms proposed in this paper reduce the number of iterations by 4.7–11% and 8.8–15.9%, respectively. Crucially, while this work is motivated by the unique demands of satellite networks, the geometric characterization of polar code BP decoding transcends specific applications. The proposed framework is inherently adaptable to any communication system requiring energy-efficient channel coding, including 6G terrestrial networks, Internet of Things (IoT) edge devices, and unmanned aerial vehicle (UAV) swarms, thereby bridging theoretical coding advances with real-world scalability challenges. Full article
(This article belongs to the Special Issue Mathematical Modeling, Simulations and Applications)
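To make the early-termination idea concrete, here is an illustrative check that stops BP iterations once the Kullback–Leibler divergence between bit posteriors (derived from LLRs) in consecutive iterations falls below a threshold. This mirrors the divergence-based criterion in spirit only; it is not the authors' algorithm, and the LLR values are invented.

```python
import numpy as np

def llr_to_prob(llr):
    """Convert LLRs to P(bit = 0) with a numerically stable sigmoid."""
    return 1.0 / (1.0 + np.exp(-np.clip(llr, -30, 30)))

def kl_divergence(p, q, eps=1e-12):
    """Mean per-bit KL divergence between two Bernoulli posteriors."""
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return np.mean(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q)))

def should_stop(llr_prev, llr_curr, threshold=1e-4):
    """Early-termination test: stop once the soft information has converged."""
    return kl_divergence(llr_to_prob(llr_prev), llr_to_prob(llr_curr)) < threshold

# Hypothetical LLRs from two consecutive BP iterations.
prev_iter = np.array([2.1, -3.4, 0.8, -1.2])
curr_iter = np.array([2.2, -3.5, 0.9, -1.2])
print("terminate:", should_stop(prev_iter, curr_iter))
```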

42 pages, 1602 KiB  
Article
Hierarchical Resource Management for Mega-LEO Satellite Constellation
by Liang Gou, Dongming Bian, Yulei Nie, Gengxin Zhang, Hongwei Zhou, Yulin Shi and Lei Zhang
Sensors 2025, 25(3), 902; https://doi.org/10.3390/s25030902 - 2 Feb 2025
Viewed by 1544
Abstract
The mega-low Earth orbit (LEO) satellite constellation is pivotal for the future of satellite Internet and 6G networks. In the mega-LEO satellite constellation system (MLSCS), the spatial distribution of satellites, global users, and their services, along with the utilization of global spectrum resources, significantly impacts resource allocation and scheduling. This paper addresses the challenge of effectively allocating system resources based on service and resource distribution, particularly in hotspot areas where user demand is concentrated, to enhance resource utilization efficiency. We propose a novel three-layer management architecture designed to implement scheduling strategies and alleviate the processing burden on the terrestrial Network Control Center (NCC), while providing real-time scheduling capabilities to adapt to rapid changes in network topology, resource distribution, and service requirements. The three layers of the resource management architecture—NCC, space base station (SBS), and user terminal (UT)—are discussed in detail, along with the functions and responsibilities of each layer. Additionally, we explore various resource scheduling strategies, approaches, and algorithms, including spectrum cognition, interference coordination, beam scheduling, multi-satellite collaboration, and random access. Simulations demonstrate the effectiveness of the proposed approaches and algorithms, indicating significant improvements in resource management in the MLSCS. Full article
(This article belongs to the Section Remote Sensors)
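As a toy illustration of the three-layer split (again, not the paper's algorithms), the sketch below has a hypothetical NCC hand each space base station a spectrum budget in proportion to reported demand, after which each SBS schedules its user terminals locally within that budget; all names and numbers are assumptions.

```python
def ncc_allocate(total_mhz, sbs_demands):
    """NCC layer: split spectrum across space base stations in proportion
    to their aggregate reported demand (coarse, slow-timescale decision)."""
    total_demand = sum(sbs_demands.values())
    return {sbs: total_mhz * d / total_demand for sbs, d in sbs_demands.items()}

def sbs_schedule(budget_mhz, ut_requests):
    """SBS layer: fast, local scheduling of user terminals within the budget,
    serving the largest requests first."""
    grants, remaining = {}, budget_mhz
    for ut, req in sorted(ut_requests.items(), key=lambda kv: -kv[1]):
        grants[ut] = min(req, remaining)
        remaining -= grants[ut]
    return grants

# Hypothetical demands (MHz) reported upward through the hierarchy.
sbs_demands = {"SBS-1": 80, "SBS-2": 40}
budgets = ncc_allocate(100, sbs_demands)
print(budgets)
print(sbs_schedule(budgets["SBS-1"], {"UT-a": 30, "UT-b": 50, "UT-c": 20}))
```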

37 pages, 1824 KiB  
Article
Carrier Frequency Offset Impact on Universal Filtered Multicarrier/Non-Uniform Constellations Performance: A Digital Video Broadcasting—Terrestrial, Second Generation Case Study
by Sonia Zannou, Anne-Carole Honfoga, Michel Dossou and Véronique Moeyaert
Telecom 2024, 5(4), 1205-1241; https://doi.org/10.3390/telecom5040061 - 4 Dec 2024
Cited by 1 | Viewed by 1056
Abstract
Digital terrestrial television is now implemented in many countries worldwide and has reached maturity. Digital Video Broadcasting-Terrestrial, second generation (DVB-T2) is the European standard adopted or deployed by European and African countries, which uses Orthogonal Frequency-Division Multiplexing (OFDM) modulation to achieve good throughput performance. However, its main particularity is the number of subcarriers operated for OFDM modulation, which varies from 1,024 to 32,768 subcarriers. Also, mobile reception is planned in DVB-T2 in addition to the rooftop antenna and portable receptions planned in DVB-T. However, the main challenge of DVB-T2 for mobile reception is the presence of a carrier frequency offset (CFO), which degrades the system performance by inducing Intercarrier Interference (ICI) on the DVB-T2 signal. This paper evaluates the system performance in the presence of the CFO when Gaussian noise and a TU6 channel are applied. Universal Filtered Multicarrier (UFMC) and non-uniform constellations (NUCs) have previously demonstrated good performance in comparison with OFDM and Quadrature Amplitude Modulation (QAM) in DVB-T2. The impact of CFO on the UFMC- and NUC-based DVB-T2 system is additionally investigated in this work. The results demonstrate that the penalties induced by CFO insertion in UFMC- and NUC-based DVB-T2 are highly reduced in comparison to those for the native DVB-T2. At a bit error rate (BER) of 10⁻³, the CFO penalties induced by the native DVB-T2 are 0.96 dB and 4 dB, respectively, when only Additive White Gaussian Noise (AWGN) is used and when TU6 is additionally considered. The penalties are equal to 0.84 dB and 0.2 dB for UFMC/NUC-based DVB-T2. Full article
(This article belongs to the Topic Advances in Wireless and Mobile Networking)
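For context on why CFO degrades multicarrier reception, the standard OFDM model with a frequency offset $\varepsilon$ normalized to the subcarrier spacing attenuates the useful symbol and leaks energy from the other subcarriers (the mechanism that UFMC's subband filtering helps contain); the notation below is the usual textbook form rather than the paper's:

$$Y_k = X_k\,S_0 + \underbrace{\sum_{l \ne k} X_l\, S_{l-k}}_{\text{ICI}} + n_k, \qquad S_l = \frac{\sin\bigl(\pi(l+\varepsilon)\bigr)}{N \sin\bigl(\pi(l+\varepsilon)/N\bigr)}\, e^{\,j\pi(l+\varepsilon)\frac{N-1}{N}},$$

where $N$ is the number of subcarriers; for $\varepsilon = 0$ this reduces to $S_0 = 1$ and $S_{l \neq 0} = 0$, i.e., no interference.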

27 pages, 4239 KiB  
Article
Code-Based Differential GNSS Ranging for Lunar Orbiters: Theoretical Review and Application to the NaviMoon Observables
by Anaïs Delépaut, Alex Minetto and Fabio Dovis
Remote Sens. 2024, 16(15), 2755; https://doi.org/10.3390/rs16152755 - 28 Jul 2024
Cited by 3 | Viewed by 1822
Abstract
In the near future, international space agencies plan to achieve significant milestones in investigating the utilization of Global Navigation Satellite Systems (GNSS) within and beyond the current space service volume, up to their application to lunar missions. These initiatives aim to demonstrate the feasibility of GNSS navigation at lunar altitudes. Based on the outcomes of such demonstrations, dozens of lunar missions will likely be equipped with a GNSS receiver to support autonomous navigation in the lunar proximity. Relying on non-invasive, consolidated differential techniques, GNSS will enable baseline estimation, thus supporting a number of potential applications to lunar orbiters such as collaborative navigation, formation flight, orbital maneuvers, remote sensing, augmentation systems and beyond. Unfortunately, the large dynamics and the geometry of such differential GNSS scenarios set them apart from current terrestrial and low-Earth-orbit use cases. These characteristics result in an increased sensitivity to measurement time misalignment among orbiters. Hence, this paper offers a review of baseline estimation methods and characterizes the divergences and limitations with respect to terrestrial applications. The study showcases the estimation of the baseline length between a lunar CubeSat mission, VMMO, and the communication relay Lunar Pathfinder mission. Notably, real GNSS measurements generated by an Engineering Model of the NaviMoon receiver in the European Space Agency (ESA/ESTEC) Radio Navigation Laboratory are utilized. A radio-frequency constellation simulator is used to generate the GNSS signals in these hardware-in-the-loop tests. The performed analyses showed the invalidity of common terrestrial differential GNSS ranging techniques for space scenarios due to the introduction of significant biases. Improved ranging algorithms were proposed and their potential to cancel ranging errors common to both receivers involved was confirmed. Full article
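As background, a schematic single-difference code observable between two orbiters A and B tracking the same GNSS satellite $s$ is shown below (textbook form, not the NaviMoon-specific algorithm). The common satellite clock error cancels, while the term the abstract highlights, the time misalignment of the two measurements, does not:

$$\Delta\rho_{AB}^{\,s} \;=\; \rho_A^{\,s}(t_A) - \rho_B^{\,s}(t_B) \;\approx\; \bigl(r_A^{\,s} - r_B^{\,s}\bigr) + c\,\bigl(\delta t_A - \delta t_B\bigr) + \underbrace{\dot{\rho}^{\,s}\,(t_A - t_B)}_{\text{time-misalignment bias}} + \varepsilon,$$

where $r_X^{\,s}$ are geometric ranges, $\delta t_X$ receiver clock errors, $\dot{\rho}^{\,s}$ the range rate, and $\varepsilon$ residual noise and multipath. With the large relative dynamics of lunar orbiters, the misalignment term grows quickly, which is why terrestrial short-baseline assumptions break down.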

21 pages, 4981 KiB  
Article
A Segmented Sliding Window Reference Signal Reconstruction Method Based on Fuzzy C-Means
by Haobo Liang, Yuan Feng, Yushi Zhang, Xingshuai Qiao, Zhi Wang and Tao Shan
Remote Sens. 2024, 16(10), 1813; https://doi.org/10.3390/rs16101813 - 20 May 2024
Cited by 2 | Viewed by 1490
Abstract
Reference signal reconstruction serves as a crucial technique for suppressing multipath interference and noise in the reference channel of passive radar. Aiming at the challenge of detecting Low-Slow-Small (LSS) targets using Digital Terrestrial Multimedia Broadcasting (DTMB) signals, this article proposes a novel segmented sliding window reference signal reconstruction method based on Fuzzy C-Means (FCM). By partitioning the reference signals based on the structure of DTMB signal frames, this approach compensates for frequency offset and sample rate deviation individually for each segment. Additionally, FCM clustering is utilized for symbol mapping reconstruction. Both simulation and experimental results show that the proposed method significantly suppresses constellation diagram divergence and phase rotation, increases the adaptive cancellation gain and signal-to-noise ratio (SNR), and in the meantime reduces the computation cost. Full article
(This article belongs to the Topic Radar Signal and Data Processing with Applications)
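Because the reconstruction step relies on fuzzy C-means to re-map noisy constellation points onto ideal symbols, a compact numpy implementation of the standard FCM updates is sketched below. This is generic FCM applied to complex samples; the cluster count, fuzzifier, and data are illustrative, not the paper's DTMB configuration.

```python
import numpy as np

def fcm(points, n_clusters, m=2.0, iters=50, seed=0):
    """Standard fuzzy C-means on complex samples (treated as 2-D points).
    Returns cluster centers and hard assignments used for symbol re-mapping."""
    rng = np.random.default_rng(seed)
    x = np.column_stack([points.real, points.imag])          # (N, 2)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                        # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]       # (C, 2) centers
        dist = np.linalg.norm(x[:, None, :] - centers[None], axis=2) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)) *
                   np.sum(dist ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers[:, 0] + 1j * centers[:, 1], u.argmax(axis=1)

# Hypothetical noisy QPSK-like samples; re-map each to its cluster center.
rng = np.random.default_rng(1)
ideal = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
rx = np.repeat(ideal, 50) + 0.2 * (rng.standard_normal(200) + 1j * rng.standard_normal(200))
centers, labels = fcm(rx, n_clusters=4)
reconstructed = centers[labels]                              # cleaned reference signal
print(np.round(centers, 2))
```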

24 pages, 16989 KiB  
Article
3D Galileo Reference Antenna Pattern for Space Service Volume Applications
by Francesco Menzione and Matteo Paonni
Sensors 2024, 24(7), 2220; https://doi.org/10.3390/s24072220 - 30 Mar 2024
Cited by 3 | Viewed by 1373
Abstract
There is an increasing demand for navigation capability for space vehicles. The exploitation of the so-called Space Service Volume (SSV), and hence the extension of the Global Navigation Satellite System (GNSS) from terrestrial to space users, is currently considered a fundamental step. Knowledge of the constellation antenna pattern, including the side lobe signals, is the main input for assessing the expected GNSS signal availability and navigation performance, especially for high orbits. The best way to define and share this information with the final GNSS user is still an open question. This paper proposes a novel methodology for the definition of a high-fidelity and easy-to-use statistical model to represent GNSS constellation antenna patterns. The reconstruction procedure, based on antenna characterization techniques and statistical learning, is presented here through its successful implementation for the “Galileo Reference Antenna Pattern (GRAP)” model, which has been proposed as the reference model for the Galileo programme. The GRAP represents the expected Equivalent Isotropic Radiated Power (EIRP) variation for the Galileo FOC satellites, and it is obtained by processing the measurements retrieved during the characterization campaign performed on the Galileo FOC antennas. The mathematical background of the model is analyzed in depth in order to better assess the GRAP with respect to different objectives such as improved resolution, smoothness and proper representation of the antenna pattern statistical distribution. The analysis confirms the enhanced GRAP properties and envisages the possibility of extending the approach to other GNSSs. The discussion is complemented by a preliminary use case characterization of the Galileo performance in SSV. The accessibility, a novel indicator, is defined in order to represent in a quick and compact manner, the expected Galileo SSV quality for different altitudes and target mission requirements. The SSV characterization is performed to demonstrate how simply and effectively the GRAP model can be inserted into user analysis. The work creates the basis for an improved capability for assessing Galileo-based navigation in SSV according to the current knowledge of the antenna pattern. Full article

23 pages, 7467 KiB  
Article
Architectural Framework and Feasibility of Internet of Things-Driven Mars Exploration via Satellite Constellations
by Oscar Ledesma, Paula Lamo, Juan A. Fraire, María Ruiz and Miguel A. Sánchez
Electronics 2024, 13(7), 1289; https://doi.org/10.3390/electronics13071289 - 30 Mar 2024
Cited by 7 | Viewed by 2503
Abstract
This study outlines a technical framework for Internet of Things (IoT) communications on Mars, leveraging Long Range (LoRa) technology to connect Martian surface sensors and orbiting satellites. The designed architecture adapts terrestrial satellite constellation models to Martian environments and the specific needs of interplanetary communication with Earth. It incorporates multiple layers, including Martian IoT nodes, satellite linkage, constellation configuration, and Earth communication, emphasizing potential Martian IoT applications. The analysis covers four critical feasibility aspects: the maximum communication range between surface IoT nodes and orbiting satellites, the satellite constellation’s message processing capacity to determine IoT node volume support, the communication frequency and visibility of IoT nodes based on the satellite constellation arrangement, and the interplanetary data transmission capabilities of LoRa-based IoT devices. The findings affirm LoRa’s suitability for Martian IoT communication, demonstrating extensive coverage, sufficient satellite processing capacity for anticipated IoT node volumes, and effective data transmission in challenging interplanetary conditions. This establishes the framework’s viability for advancing Mars exploration and IoT in space exploration contexts. Full article
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
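One of the feasibility aspects, the maximum surface-node-to-satellite range, reduces to a link-budget calculation. The snippet below shows the generic free-space form with hypothetical numbers (LoRa-typical transmit power, antenna gains, frequency, and sensitivity); these are not the values used in the paper.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def link_margin_db(distance_km, freq_mhz=433.0, tx_dbm=14.0,
                   gains_dbi=5.0, sensitivity_dbm=-137.0):
    """Received power minus receiver sensitivity; positive means the link closes.
    All parameter values here are illustrative assumptions."""
    rx_dbm = tx_dbm + gains_dbi - fspl_db(distance_km * 1e3, freq_mhz * 1e6)
    return rx_dbm - sensitivity_dbm

# Example: surface node to a satellite at two assumed slant ranges.
print(f"margin at 400 km:  {link_margin_db(400):.1f} dB")
print(f"margin at 1500 km: {link_margin_db(1500):.1f} dB")
```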

25 pages, 5087 KiB  
Review
Multi-Layered Satellite Communications Systems for Ultra-High Availability and Resilience
by Marko Höyhtyä, Antti Anttonen, Mikko Majanen, Anastasia Yastrebova-Castillo, Mihaly Varga, Luca Lodigiani, Marius Corici and Hemant Zope
Electronics 2024, 13(7), 1269; https://doi.org/10.3390/electronics13071269 - 29 Mar 2024
Cited by 10 | Viewed by 5307
Abstract
Satellite communications systems provide a means to connect people and devices in hard-to-reach locations. Traditional geostationary orbit (GEO) satellite systems and low Earth orbit (LEO) constellations, having their own strengths and weaknesses, have been used as separate systems serving different markets and customers. In this article, we analyze how satellite systems in different orbits could be integrated and used as a multi-layer satellite system (MLSS) to improve communication services. The optimization combines the strengths of the different layers: coverage area grows with each step up in altitude, while propagation delay shrinks with each step down. We review the current literature and market estimates and use the information to provide a thorough assessment of the economic, regulatory, and technological enablers of the MLSS. We define the MLSS concept and the architecture and describe our testbed and the simulation tools used as a comprehensive engineering proof-of-concept. The validation results confirm that the MLSS approach can intelligently exploit the smaller jitter of GEO and shorter delay of LEO connections, and it can increase the availability and resilience of communication services. As a main conclusion, multi-layered networks and the integration of satellite and terrestrial segments appear to be very promising candidates for future 6G systems. Full article
(This article belongs to the Special Issue Future Generation Non-Terrestrial Networks)
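The coverage-versus-delay trade-off this abstract builds on can be made concrete with a back-of-the-envelope propagation-delay comparison across orbital layers; the altitudes below are nominal examples, and only straight-line one-way delay to the sub-satellite point is computed.

```python
C_KM_S = 299_792.458  # speed of light, km/s

layers_km = {"LEO": 550, "MEO": 8_000, "GEO": 35_786}

for name, altitude in layers_km.items():
    one_way_ms = altitude / C_KM_S * 1e3
    print(f"{name:>3}: one-way propagation delay ≈ {one_way_ms:6.1f} ms "
          f"(round trip ≈ {2 * one_way_ms:6.1f} ms)")
```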

19 pages, 1304 KiB  
Article
Cooperative Caching and Resource Allocation in Integrated Satellite–Terrestrial Networks
by Xiangqiang Gao, Yingzhao Shao, Yuanle Wang, Hangyu Zhang and Yang Liu
Electronics 2024, 13(7), 1216; https://doi.org/10.3390/electronics13071216 - 26 Mar 2024
Cited by 2 | Viewed by 1762
Abstract
Due to the rapid development of low Earth orbit satellite constellations such as Starlink and OneWeb, integrated satellite–terrestrial networks have been viewed as a promising paradigm to globally provide satellite internet services for users. However, when the contents from ground data centers are provided for users by satellite networks, there will be high costs in terms of communication delay and bandwidth usage. To this end, in this paper, a cooperative-caching and resource-allocation problem is investigated in integrated satellite–terrestrial networks. Popular contents, which are cached on satellites and ground data centers, can be accessed via inter-satellite and satellite–terrestrial networks in a cooperative way. The optimization problem is formulated to jointly minimize the deployment costs of storage resource usage and network bandwidth consumption. A cooperative caching and resource allocation (CCRA) algorithm based on a neighborhood search is proposed to address the problem. The simulation results demonstrate that the proposed CCRA algorithm outperforms the Greedy and BFS baselines in reducing deployment costs. Full article
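To illustrate the neighborhood-search idea behind the CCRA algorithm (in spirit only; the objective, capacity, and demand data below are invented), the sketch starts from a random cache placement and repeatedly applies single-item swaps, keeping any neighbor that lowers a combined storage-plus-backhaul cost.

```python
import random

def cost(placement, demand, storage_cost=1.0, backhaul_cost=5.0):
    """Combined cost: storage for cached items plus backhaul for misses.
    `placement` is a set of cached content IDs, `demand` maps ID -> requests."""
    storage = storage_cost * len(placement)
    misses = sum(reqs for cid, reqs in demand.items() if cid not in placement)
    return storage + backhaul_cost * misses

def neighborhood_search(demand, capacity, iters=200, seed=0):
    rng = random.Random(seed)
    contents = list(demand)
    placement = set(rng.sample(contents, capacity))       # random initial placement
    best = cost(placement, demand)
    for _ in range(iters):
        drop = rng.choice(sorted(placement))
        add = rng.choice([c for c in contents if c not in placement])
        neighbor = (placement - {drop}) | {add}           # single-swap neighbor
        c = cost(neighbor, demand)
        if c < best:                                      # accept improving moves
            placement, best = neighbor, c
    return placement, best

# Hypothetical request counts for six contents; the satellite can cache three.
demand = {"c1": 40, "c2": 5, "c3": 22, "c4": 1, "c5": 17, "c6": 9}
print(neighborhood_search(demand, capacity=3))
```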
