Balance-URSONet: A Real-Time Efficient Spacecraft Pose Estimation Network
Abstract
1. Introduction
- (1) We propose Balance-URSONet, a spacecraft pose estimation network that balances accuracy against parameter count. Innovatively adopting RepVGG as the feature extraction backbone strengthens the model's feature extraction capability, and structural reparameterization at the inference stage significantly improves inference speed while preserving accuracy (a sketch of the branch fusion follows this list).
- (2) We propose a feature excitation unit (FEU) consisting of an encoder and a decoder. The encoder compresses spatial redundancy, reinforces spatial features, and improves spatial semantic expression; the decoder further enhances channel feature expression, and a skip connection between the encoder and decoder improves the model's ability to capture the spacecraft's detailed contours. At inference, dual-domain feature interaction requires only a 1 × 1 convolution, which greatly improves pose estimation accuracy without significantly increasing the parameter count (see the illustrative sketch after this list).
- (3) We conduct evaluation experiments on the Spacecraft Pose Estimation Dataset (SPEED). The results show that Balance-URSONet performs excellently on the spacecraft pose estimation task, achieving an ESA score of 0.13 with 13 times fewer parameters than URSONet (the scoring metric is sketched below).
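As referenced in contribution (1), the following is a minimal PyTorch sketch of RepVGG-style structural reparameterization (Ding et al.), fusing the training-time 3 × 3, 1 × 1, and identity branches of one block into a single 3 × 3 convolution. It follows the published RepVGG fusion procedure, not the authors' released code, and the helper names are ours.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fuse_conv_bn(weight, bn):
    """Fold BatchNorm statistics into the preceding convolution's weight and bias."""
    std = (bn.running_var + bn.eps).sqrt()
    scale = (bn.weight / std).reshape(-1, 1, 1, 1)
    return weight * scale, bn.bias - bn.running_mean * bn.weight / std

@torch.no_grad()
def reparameterize(conv3, bn3, conv1, bn1, bn_id=None):
    """Collapse the 3x3, 1x1, and (optional) identity branches into one 3x3 conv."""
    k3, b3 = fuse_conv_bn(conv3.weight, bn3)
    k1, b1 = fuse_conv_bn(conv1.weight, bn1)
    k, b = k3 + F.pad(k1, [1, 1, 1, 1]), b3 + b1  # embed 1x1 kernel at the 3x3 center
    if bn_id is not None:  # identity branch exists only for equal in/out channels, stride 1
        c = conv3.out_channels
        kid = torch.zeros_like(k3)
        kid[torch.arange(c), torch.arange(c), 1, 1] = 1.0  # per-channel identity kernel
        ki, bi = fuse_conv_bn(kid, bn_id)
        k, b = k + ki, b + bi
    fused = nn.Conv2d(conv3.in_channels, conv3.out_channels, 3,
                      stride=conv3.stride, padding=1, bias=True)
    fused.weight.copy_(k)
    fused.bias.copy_(b)
    return fused
```

After training, `fused` replaces the three-branch block with identical outputs up to floating-point error, which is what makes the backbone fast at inference without retraining.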
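Contribution (2) describes the FEU only at a high level (the exact design is given in Section 3.3); purely as an illustration, here is a hypothetical encoder-decoder excitation module consistent with that description. The reduction ratio, layer choices, and skip placement are our assumptions, and the statement that inference needs only a 1 × 1 convolution suggests a fused inference form that this training-time sketch does not show.

```python
import torch
import torch.nn as nn

class FEU(nn.Module):
    """Hypothetical feature excitation unit: spatial encoder, channel decoder, skip."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Encoder: compress spatial redundancy and reinforce spatial semantics.
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
        )
        # Decoder: restore channels and produce per-channel excitation weights.
        self.decoder = nn.Sequential(
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Skip connection between the encoder input and the decoder output.
        self.skip = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.encoder(x)          # compressed spatial representation
        w = self.decoder(z)          # channel excitation weights
        return x * w + self.skip(x)  # dual-domain excitation plus skip path
```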
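The ESA score in contribution (3) is the Kelvins challenge metric [15,33]: for each image, the quaternion geodesic orientation error (in radians) plus the translation error normalized by the ground-truth range, averaged over the test set. A minimal NumPy sketch of the per-image score:

```python
import numpy as np

def esa_pose_score(q_est, q_gt, t_est, t_gt):
    """Per-image Kelvins/ESA score: orientation error (rad) + normalized position error."""
    q_est = np.asarray(q_est, dtype=float)
    q_gt = np.asarray(q_gt, dtype=float)
    q_est = q_est / np.linalg.norm(q_est)  # enforce unit quaternions
    q_gt = q_gt / np.linalg.norm(q_gt)
    e_ori = 2.0 * np.arccos(np.clip(abs(np.dot(q_est, q_gt)), 0.0, 1.0))
    e_pos = np.linalg.norm(np.asarray(t_est) - np.asarray(t_gt)) / np.linalg.norm(t_gt)
    return e_ori + e_pos
```

The reported ESA score is the mean of this quantity over all test images; the 0.13 quoted above corresponds to the 16-bin model's synthetic test-set score in the tables below.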
2. Related Work
2.1. End-to-End Pose Estimation Methods
2.2. Lightweighting of Attitude Estimation Networks
3. Proposed Method
3.1. Overall Architectural Design of Balance-URSONet
3.2. Feature Extraction Network
3.3. Feature Excitation Unit
3.4. Loss Function
4. Experimentation and Analysis
4.1. Dataset
4.2. Implementation Details
4.3. Evaluation Indicators
4.4. Comparative Experiment
4.5. Computational Efficiency and Real-Time Performance
4.6. Genuine Performance
4.7. Ablation Study
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zhang, P.; Wu, D.; Baoyin, H. Real-time hybrid method for maneuver detection and estimation of non-cooperative space targets. Astrodynamics 2024, 8, 437–453.
- Bechini, M.; Lavagna, M. Robust and efficient single-CNN-based spacecraft relative pose estimation from monocular images. Acta Astronaut. 2025, 233, 198–217.
- Nanjangud, A.; Blacker, P.C.; Bandyopadhyay, S.; Gao, Y. Robotics and AI-enabled on-orbit operations with future generation of small satellites. Proc. IEEE 2018, 106, 429–439.
- Aglietti, G.S.; Taylor, B.; Fellowes, S.; Salmon, T.; Retat, I.; Hall, A.; Chabot, T.; Pisseloup, A.; Cox, C.; Mafficini, A.; et al. The active space debris removal mission RemoveDebris. Part 2: In orbit operations. Acta Astronaut. 2020, 168, 310–322.
- Chen, C.; Jing, Z.; Pan, H.; Dun, X.; Huang, J.; Wu, H.; Cao, S. SESPnet: A lightweight network with attention mechanism for spacecraft pose estimation. Aerosp. Syst. 2024, 7, 1–10.
- Li, K.; Zhang, H.; Hu, C. Learning-based pose estimation of non-cooperative spacecrafts with uncertainty prediction. Aerospace 2022, 9, 592.
- Liu, Y.; Xie, Z.; Zhang, Q.; Zhao, X.; Liu, H. A new approach for the estimation of non-cooperative satellites based on circular feature extraction. Robot. Auton. Syst. 2020, 129, 103532.
- Gao, X.; Liang, B.; Xu, W. Attitude determination of large non-cooperative spacecrafts in final approach. In Proceedings of the 2010 11th International Conference on Control Automation Robotics & Vision, Singapore, 7–10 December 2010; pp. 1571–1576.
- Meng, C.; Li, Z.; Sun, H.; Yuan, D.; Bai, X.; Zhou, F. Satellite pose estimation via single perspective circle and line. IEEE Trans. Aerosp. Electron. Syst. 2018, 54, 3084–3095.
- Long, C.; Hu, Q. Monocular-vision-based relative pose estimation of noncooperative spacecraft using multicircular features. IEEE/ASME Trans. Mechatron. 2022, 27, 5403–5414.
- Cassinis, L.P.; Fonod, R.; Gill, E. Review of the robustness and applicability of monocular pose estimation systems for relative navigation with an uncooperative spacecraft. Prog. Aerosp. Sci. 2019, 110, 100548.
- Yang, H.; Xiao, X.; Yao, M.; Xiong, Y.; Cui, H.; Fu, Y. PVSPE: A pyramid vision multitask transformer network for spacecraft pose estimation. Adv. Space Res. 2024, 74, 1327–1342.
- Posso, J.; Bois, G.; Savaria, Y. Mobile-URSONet: An embeddable neural network for onboard spacecraft pose estimation. In Proceedings of the 2022 IEEE International Symposium on Circuits and Systems (ISCAS), Austin, TX, USA, 27 May–1 June 2022; pp. 794–798.
- Proença, P.F.; Gao, Y. Deep learning for spacecraft pose estimation from photorealistic rendering. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 6007–6013.
- Kisantal, M.; Sharma, S.; Park, T.H.; Izzo, D.; Märtens, M.; D’Amico, S. Satellite pose estimation challenge: Dataset, competition design, and results. IEEE Trans. Aerosp. Electron. Syst. 2020, 56, 4083–4098.
- Park, T.H.; Märtens, M.; Lecuyer, G.; Izzo, D.; D’Amico, S. SPEED+: Next-generation dataset for spacecraft pose estimation across domain gap. In Proceedings of the 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, USA, 5–12 March 2022; pp. 1–15.
- Wang, Z.; Zhang, Z.; Sun, X.; Li, Z.; Yu, Q. Revisiting monocular satellite pose estimation with transformer. IEEE Trans. Aerosp. Electron. Syst. 2022, 58, 4279–4294.
- Chu, S.; Duan, Y.; Schilling, K.; Wu, S. Monocular 6-DoF pose estimation for non-cooperative spacecrafts using Riemannian regression network. In Proceedings of the European Conference on Computer Vision, Tel Aviv, Israel, 23–27 October 2022; pp. 186–198.
- Wang, S.; Wang, S.; Jiao, B.; Yang, D.; Su, L.; Zhai, P.; Chen, C.; Zhang, L. CA-SpaceNet: Counterfactual analysis for 6D pose estimation in space. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 10627–10634.
- Park, T.H.; Sharma, S.; D’Amico, S. Towards robust learning-based pose estimation of noncooperative spacecraft. arXiv 2019, arXiv:1909.00392.
- Chen, B.; Cao, J.; Parra, A.; Chin, T.J. Satellite pose estimation with deep landmark regression and nonlinear pose refinement. In Proceedings of the IEEE/CVF International Conference on Computer Vision Workshops, Seoul, Republic of Korea, 27 October–2 November 2019.
- Legrand, A.; Detry, R.; De Vleeschouwer, C. End-to-end neural estimation of spacecraft pose with intermediate detection of keypoints. In Proceedings of the European Conference on Computer Vision, Tel Aviv, Israel, 23–27 October 2022; pp. 154–169.
- Pérez-Villar, J.I.B.; García-Martín, Á.; Bescós, J.; SanMiguel, J.C. Test-time adaptation for keypoint-based spacecraft pose estimation based on predicted-view synthesis. IEEE Trans. Aerosp. Electron. Syst. 2024, 60, 6752–6764.
- Sharma, S.; Beierle, C.; D’Amico, S. Pose estimation for non-cooperative spacecraft rendezvous using convolutional neural networks. In Proceedings of the 2018 IEEE Aerospace Conference, Big Sky, MT, USA, 3–10 March 2018; pp. 1–12.
- Sharma, S.; D’Amico, S. Neural network-based pose estimation for noncooperative spacecraft rendezvous. IEEE Trans. Aerosp. Electron. Syst. 2020, 56, 4638–4658.
- Huang, H.; Zhao, G.; Gu, D.; Bo, Y. Non-model-based monocular pose estimation network for uncooperative spacecraft using convolutional neural network. IEEE Sens. J. 2021, 21, 24579–24590.
- Liu, J.; Zhang, Y.; Zhang, W. Lightweight network-based end-to-end pose estimation for noncooperative targets. Laser Optoelectron. Prog. 2024, 61, 1437010.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems 25 (NIPS 2012), Lake Tahoe, NV, USA, 3–6 December 2012; Volume 25.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 8–10 June 2015; pp. 1–9.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778.
- Li, J.; Wen, Y.; He, L. SCConv: Spatial and channel reconstruction convolution for feature redundancy. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 6153–6162.
- Markley, F.L.; Cheng, Y.; Crassidis, J.L.; Oshman, Y. Averaging quaternions. J. Guid. Control Dyn. 2007, 30, 1193–1197.
- Kelvins Pose Estimation Challenge 2019. Available online: https://kelvins.esa.int/satellite-pose-estimation-challenge/ (accessed on 2 August 2025).
- Phisannupawong, T.; Kamsing, P.; Torteeka, P.; Channumsin, S.; Sawangwit, U.; Hematulin, W.; Jarawan, T.; Somjit, T.; Yooyen, S.; Delahaye, D.; et al. Vision-based spacecraft pose estimation via a deep convolutional neural network for noncooperative docking operations. Aerospace 2020, 7, 126.
- Black, K.; Shankar, S.; Fonseka, D.; Deutsch, J.; Dhir, A.; Akella, M.R. Real-time, flight-ready, non-cooperative spacecraft pose estimation using monocular imagery. arXiv 2021, arXiv:2101.09553.
- Sharma, S.; D’Amico, S. Pose estimation for non-cooperative rendezvous using neural networks. arXiv 2019, arXiv:1906.09868.
Method | Feature Map | Params (M) | Ori. err. (°) ↓ | Pos. err. (m) ↓ |
---|---|---|---|---|
URSONet [14] | 8 | 40 | 10.2 | 0.720 |
URSONet [14] | 128 | 80 | 7.8 | 0.540 |
URSONet [14] | 512 | 240 | 7.2 | 0.480 |
Ours | 1280 | 18.6 | 5.61 | 0.339 |
Method | Params (M) | Ori. err. (°) ↓ | Pos. err. (m) ↓ |
---|---|---|---|
LSPENet (8 bins) [27] | 6.1 | 10.01 | - |
LSPENet (12 bins) [27] | 7.6 | 6.75 | - |
LSPENet (16 bins) [27] | 10.6 | 5.78 | 0.540 |
Torteeka [34] | 45.5 | 13.70 | 1.190 |
Ours (8 bins) | 11.2 | 8.69 | 0.459 |
Ours (12 bins) | 14.3 | 7.06 | 0.395 |
Ours (16 bins) | 18.6 | 5.61 | 0.339 |
Soft Classification | Method | Params | Ori. err. (°) ↓ | Pos. err. (m) ↓ | ESA Score (Real) ↓ | ESA Score (Synthetic) ↓ | Real/Synthetic Ratio ↓ |
---|---|---|---|---|---|---|---|
8 bins | Mobile-URSONet [13] | 2.8 M | 11.24 | 0.537 | 0.414 | 0.251 | 1.65 |
8 bins | SESPnet [5] | 2.57 M | 10.40 | 0.346 | 0.3840 | 0.210 | 1.82 |
8 bins | Ours | 11.2 M | 8.69 | 0.459 | 0.2309 | 0.176 | 1.31 |
12 bins | Mobile-URSONet [13] | 4.4 M | 7.32 | 0.501 | 0.89 | 0.205 | 4.34 |
12 bins | SESPnet [5] | 3.4 M | 8.76 | 0.397 | 0.307 | 0.191 | 1.60 |
12 bins | Ours | 14.3 M | 7.06 | 0.395 | 0.35 | 0.151 | 2.31 |
16 bins | Mobile-URSONet [13] | 7.4 M | 6.58 | 0.570 | 1.52 | 0.201 | 7.56 |
16 bins | SESPnet [5] | 5.09 M | 8.71 | 0.464 | 0.914 | 0.185 | 4.94 |
16 bins | Ours | 18.6 M | 5.61 | 0.339 | 0.34 | 0.129 | 2.67 |
Participants | ESA Score (Synthetic) ↓ | ESA Score (Real) ↓ | Real/Synthetic Ratio ↓ |
---|---|---|---|
Chen et al. [21] | 0.0094 | 0.3752 | 39.9 |
EPFL_cvlab [13] | 0.0215 | 0.1140 | 5.3 |
Black et al. [35] | 0.0409 | 0.2918 | 7.13 |
Proença et al. [14] | 0.0571 | 0.1555 | 2.7 |
Posso et al. (8 bins) [13] | 0.2513 | 0.4138 | 1.65 |
Posso et al. (12 bins) [13] | 0.2054 | 0.8900 | 4.34 |
Posso et al. (16 bins) [13] | 0.2011 | 1.5201 | 7.56 |
Chen et al. (8 bins) [5] | 0.2100 | 0.3840 | 1.82 |
Chen et al. (12 bins) [5] | 0.1910 | 0.3070 | 1.60 |
Chen et al. (16 bins) [5] | 0.1850 | 0.9140 | 4.94 |
Sharma et al. [36] | 0.0626 | 0.3951 | 6.3 |
Ours (8 bins) | 0.1760 | 0.2309 | 1.31 |
Ours (12 bins) | 0.1511 | 0.3501 | 2.31 |
Ours (16 bins) | 0.1291 | 0.3441 | 2.67 |
Method | Soft Classification | FLOPs | Inference Time (ms) |
---|---|---|---|
Ours (8 bins) | 8 bins | 4.094 G | 30.53 |
Ours (12 bins) | 12 bins | 4.094 G | 31.46 |
Ours (16 bins) | 16 bins | 4.094 G | 32.71 |
Network | RepVGG | FEU | Smooth L1 | Params | Ori. err. (°) ↓ | Pos. err. (m) ↓ | ESA Score ↓ |
---|---|---|---|---|---|---|---|
Net.1 | ✓ | - | - | 8.30 M | 10.97 | 0.697 | 1.23 |
Net.2 | ✓ | ✓ | - | 11.2 M | 8.73 | 0.670 | 0.61 |
Net.3 | ✓ | - | ✓ | 8.30 M | 10.28 | 0.471 | 1.03 |
Net.4 | ✓ | ✓ | ✓ | 11.2 M | 8.69 | 0.459 | 0.23 |
Method | Params (M) | Ori. err. (°, mean ± std) ↓ | Pos. err. (m, mean ± std) ↓ |
---|---|---|---|
Ours (8 bins) | 11.2 | 8.69 ± 0.3 | 0.459 ± 0.05 |
Ours (12 bins) | 14.3 | 7.06 ± 0.4 | 0.395 ± 0.02 |
Ours (16 bins) | 18.6 | 5.61 ± 0.1 | 0.339 ± 0.03 |