A Relative Orbital Motion-Guided Framework for Generating Multimodal Visual Data of Spacecraft
Highlights
- A novel, physically grounded framework is proposed to generate large-scale synthetic visual datasets for non-cooperative spacecraft. It incorporates relative orbital motion simulation and end-to-end imaging degradation to produce realistic multimodal data (RGB, mask, depth, normal) with precise 6-DoF pose annotations for four representative spacecraft types.
- The generated dataset, comprising 8000 high-fidelity samples, effectively bridges the domain gap between synthetic and real on-orbit imagery. It specifically addresses the critical data scarcity issue in spacecraft visual perception, which arises from limited and inaccessible real-world data.
- This work provides a crucial, open-access data foundation and simulation tool for the development, benchmarking, and validation of data-driven algorithms (e.g., for pose estimation, segmentation, tracking) in on-orbit servicing and space debris removal, potentially accelerating related research and technology maturation.
- The proposed pipeline establishes a new paradigm for generating physically accurate, task-specific synthetic data in remote sensing by combining motion physics, multimodal ground truth, and sensor degradation. The paradigm can be adapted to other space and Earth observation applications facing similar data scarcity challenges.
Abstract
1. Introduction
- (1) Relative motion modeling with orbital constraints: Orbital dynamics equations are integrated into the data generation pipeline to construct physically plausible relative motion trajectories, ensuring the kinematic realism of simulated scenarios.
- (2) Multi-source spacecraft ground-truth data generation: RGB images, instance segmentation masks, depth maps, and surface normal maps are rendered synchronously, accompanied by precise 6-DoF pose ground truth, establishing a benchmark for related algorithm research.
- (3) Physics-driven image degradation model: An end-to-end imaging chain model incorporating optical degradation and sensor noise is developed, effectively narrowing the domain gap between synthetic and real satellite imagery, thereby enhancing the generalization capability of models trained on synthetic data for real-world tasks.
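Close-range relative motion about a target in a near-circular orbit is commonly modeled with the Clohessy-Wiltshire (Hill) equations; the paper does not state its exact dynamics model, so the closed-form propagation below is a minimal sketch under that assumption, with the mean motion `n` and initial state chosen for illustration only:

```python
import numpy as np

def cw_position(t, r0, v0, n):
    """Closed-form Clohessy-Wiltshire relative position at time t.

    Frame: x radial (outward), y along-track, z cross-track, centered on the
    target spacecraft; n is the target's orbital mean motion (rad/s).
    """
    x0, y0, z0 = r0
    vx0, vy0, vz0 = v0
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + (vx0 / n) * s + (2 * vy0 / n) * (1 - c)
    y = (y0 + 6 * x0 * (s - n * t) + (2 * vx0 / n) * (c - 1)
         + (vy0 / n) * (4 * s - 3 * n * t))
    z = z0 * c + (vz0 / n) * s
    return np.array([x, y, z])

n = 0.0011                          # rad/s, roughly a 500 km LEO (illustrative)
r0 = np.array([100.0, 0.0, 0.0])    # 100 m radial offset
v0 = np.array([0.0, -2 * n * r0[0], 0.0])  # drift-cancelling along-track velocity
T = 2 * np.pi / n                   # one orbital period
```

With the drift-cancelling condition vy0 = -2*n*x0 the secular along-track term vanishes and the observer traces a closed ellipse around the target; other initial states produce the spiral and drift motion families classified in Section 3.1.3.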
2. Related Works
3. Method
3.1. Relative Orbital Motion Pattern Modeling
3.1.1. Scene Description
3.1.2. Feature Parameterization Formulation
3.1.3. Typical Relative Motion Classification
3.2. Imaging Geometry Principles
3.2.1. Camera Imaging Model
3.2.2. Attitude Representation
- (1) Euler Angles
- (2) Quaternions
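Quaternions are generally preferred over Euler angles for pose annotation because they avoid gimbal lock and interpolate smoothly. The sketch below shows the standard Hamilton-convention (w, x, y, z) quaternion-to-rotation-matrix conversion; it illustrates the general technique, not this paper's specific implementation:

```python
import numpy as np

def quat_to_rotmat(q):
    """Unit quaternion (w, x, y, z), Hamilton convention -> 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)  # normalize defensively
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])  # 90 deg about z
R = quat_to_rotmat(q)
```

A quick sanity check: rotating the x-axis unit vector by this quaternion should give the y-axis, and the result must be a proper rotation (orthogonal, determinant +1).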
3.3. Visibility Analysis
3.3.1. Geometric Visibility
3.3.2. Optical Visibility
3.4. Multimodal Ground Truth Rendering and Imaging Simulation
3.4.1. Simulation Environment and Configuration
3.4.2. Multimodal Image Rendering Pipeline
3.4.3. Image Physical Degradation Simulation
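An end-to-end imaging chain of the kind described in contribution (3) typically applies optical blur followed by sensor noise and quantization. The numpy-only sketch below is a simplified illustration (Gaussian PSF, Poisson shot noise, Gaussian read noise, 8-bit ADC); all parameter values are placeholders, not the paper's settings:

```python
import numpy as np

def _gaussian_kernel(sigma, radius=4):
    """1D normalized Gaussian kernel used as a separable PSF approximation."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def degrade(img, psf_sigma=1.2, full_well_photons=2000, read_noise_e=5.0, seed=0):
    """Simplified imaging-chain degradation for a float image in [0, 1].

    Steps: Gaussian PSF blur (optical degradation), Poisson shot noise,
    Gaussian read noise, then 8-bit quantization. Parameters are illustrative.
    """
    rng = np.random.default_rng(seed)
    k = _gaussian_kernel(psf_sigma)
    # Separable convolution: blur rows, then columns.
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    electrons = rng.poisson(np.clip(blurred, 0, 1) * full_well_photons).astype(float)
    electrons += rng.normal(0.0, read_noise_e, size=img.shape)
    signal = np.clip(electrons / full_well_photons, 0.0, 1.0)
    return np.round(signal * 255).astype(np.uint8)
```

Applying this to a clean render darkens and blurs crisp edges while injecting intensity-dependent noise, which is the effect that narrows the synthetic-to-real domain gap.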
4. Results
4.1. Synthetic Image Results
4.2. Ablation Study
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
| Trajectory Classification | Range | Dynamic Characteristics |
|---|---|---|
| Elliptical trajectory | = 0 | The trajectory forms a closed ellipse; the observer spacecraft moves periodically around the target spacecraft. |
| Spiral trajectory | | Drift and periodic motion combine into a smooth, non-intersecting spiral, drifting along-track away from or toward the target. |
| Drip-drop trajectory | | The trajectory shows periodic loops in the drift direction, resembling a water droplet, with two subcategories based on -axis crossing. |
| Drift trajectory | | With small periodic oscillation, the trajectory is nearly linear or gently curved, moving unidirectionally away from the target along-track; this is the simplest relative motion. |
| Scene | Position Range (m) | Average Angular Variation (°) | Total Illumination Energy (W/m²) | Ambient Light Intensity (W/m²) |
|---|---|---|---|---|
| Tiangong | X ∈ [−19.80, 19.80] Y ∈ [−25.38, 25.38] Z ∈ [4.26, 4.26] | 0.70 | 264.06 | 1.40 |
| Spacedragon | X ∈ [−13.91, 13.15] Y ∈ [−13.08, 12.67] Z ∈ [7.39, 7.39] | 0.73 | 40.08 | 1.00 |
| ICESat | X ∈ [−41.26, 41.51] Y ∈ [−39.08, 20.47] Z ∈ [−19.05, 40.19] | 1.52 | 100.54 | 0.80 |
| Cassini | X ∈ [−30.06, 29.98] Y ∈ [−38.97, −24.30] Z ∈ [13.28, 13.29] | 0.20 | 357.29 | 1.00 |
| Data Modality | Description | Samples per Object | Total Samples | Primary Use |
|---|---|---|---|---|
| RGB Image | 24-bit true color image | 500 | 2000 | Network input |
| Segmentation Mask | Binary pixel-level annotation | 500 | 2000 | Segmentation tasks |
| Depth Map | Z-depth in camera space | 500 | 2000 | 3D reconstruction; depth estimation |
| Normal Map | Surface normal vectors | 500 | 2000 | Normal estimation; lighting analysis |
| Experiment Groups | PSNR (dB) ↑ | SSIM ↑ | LPIPS ↓ |
|---|---|---|---|
| D1 | 29.60 ± 2.59 | 0.90 ± 0.06 | 0.13 ± 0.05 |
| D2 | 25.24 ± 2.60 | 0.58 ± 0.19 | 0.54 ± 0.26 |
| D3 | 21.85 ± 0.60 | 0.11 ± 0.05 | 0.89 ± 0.08 |
| D4 | 20.13 ± 1.23 | 0.06 ± 0.02 | 0.95 ± 0.05 |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Li, W.; Huo, Y.; Zhu, Q.; Lu, Y.; Fang, Y.; Zhang, Y. A Relative Orbital Motion-Guided Framework for Generating Multimodal Visual Data of Spacecraft. Remote Sens. 2026, 18, 1177. https://doi.org/10.3390/rs18081177

