An Enhanced Method for Optical Imaging Computation of Space Objects Integrating an Improved Phong Model and Higher-Order Spherical Harmonics
Highlights
- We propose a computational method for optical imaging of space objects that integrates an enhanced Phong model with high-order spherical harmonics, accompanied by a comprehensive simulation workflow, specific parameter settings, pose labels, and surface material property analysis.
- We describe two distinct sampling strategies for optical simulation imagery of space objects and provide a diverse set of satellite simulation datasets for downstream use.
- Built on a unified geometric and radiative rendering pipeline, the method addresses the scarcity of datasets in space-based optical vision and enables rapid rendering of optical images under various operational modes.
- The improved model systematically enhances the quality and diversity of optical image simulation datasets for space objects from multiple perspectives, including orbit, attitude, and lighting rendering, laying a solid foundation for subsequent algorithmic research.
Abstract
1. Introduction
- The simulation process covers two space operational modes, fly-around and flyby, which represent the motion states during the hovering and approaching phases. Images are generated at varying relative distances using two corresponding sampling methods, and each image is accompanied by a 6D pose label (see the sketch after this list).
- The optical calculations for space-target surface materials draw on measured spectral reflectance data. Where direct measurements are unavailable, combining semi-physical ground experiments with physical simulation is an effective way to model the optical imaging computation of space objects.
- The integration of the enhanced Phong model with HOSH allows for effective simulation of the multiple scattering phenomenon. This integration establishes a unified model that characterizes complex optical behaviors in space, thereby enhancing the visual clarity of the optical calculation process.
- The remainder of this study is organized as follows. Section 2 details the overall research concept and workflow. Section 3 describes in depth the methods for computing surface materials and light sources in optical imaging. Section 4 presents the experimental parameter settings and a comparative analysis of the results. Section 5 discusses the two imaging modes, and Section 6 concludes the study.
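As noted in the first bullet above, every rendered image carries a 6D pose label: a unit attitude quaternion plus a relative translation (compare the quaternion listings in the tables at the end of this paper). The snippet below is a minimal sketch of how such a label could be assembled from the observer and target states; the function name, frame conventions, and SciPy quaternion ordering are our illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def relative_pose_label(R_target, t_target, R_chaser, t_chaser):
    """Return a 6D pose label (unit quaternion + translation) of the target
    expressed in the chaser camera frame. Frame conventions are illustrative."""
    R_rel = R_chaser.T @ R_target                 # target attitude seen from the chaser
    t_rel = R_chaser.T @ (t_target - t_chaser)    # target position in the chaser frame, metres
    q = Rotation.from_matrix(R_rel).as_quat()     # SciPy returns (x, y, z, w) ordering
    return q / np.linalg.norm(q), t_rel

# Example: chaser placed 10 km behind the target along the flight direction
R_t, R_c = np.eye(3), np.eye(3)
q, t = relative_pose_label(R_t, np.zeros(3), R_c, np.array([-10_000.0, 0.0, 0.0]))
print(q, t)   # identity quaternion and a 10 km along-track offset
```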
2. Methodology
2.1. Space-Based Visual Model
2.2. Geometric and Radiative Rendering Pipeline
2.3. Fitting the Light Field Distribution Using HOSH
1. Sunlight as Parallel Light:
2. Light from Distant Celestial Bodies:
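Both light sources above, the parallel sunlight and the faint radiance from distant celestial bodies, can be folded into a single spherical harmonic expansion of the incident light field. The sketch below shows a generic least-squares fit of real SH coefficients up to fourth order over sampled directions; the placeholder radiance model and helper names are our assumptions, not the paper's exact fitting procedure.

```python
import numpy as np
from scipy.special import sph_harm

def real_sh(l, m, theta, phi):
    """Real spherical harmonic Y_{l,m}; theta = azimuth, phi = polar angle."""
    if m > 0:
        return np.sqrt(2.0) * (-1.0) ** m * sph_harm(m, l, theta, phi).real
    if m < 0:
        return np.sqrt(2.0) * (-1.0) ** m * sph_harm(-m, l, theta, phi).imag
    return sph_harm(0, l, theta, phi).real

def fit_sh_coeffs(theta, phi, radiance, order=4):
    """Least-squares fit of SH coefficients up to `order` (4th order = 25 terms)."""
    basis = np.column_stack([real_sh(l, m, theta, phi)
                             for l in range(order + 1)
                             for m in range(-l, l + 1)])
    coeffs, *_ = np.linalg.lstsq(basis, radiance, rcond=None)
    return coeffs

# Placeholder radiance: a hard directional sun plus a weak isotropic background.
rng = np.random.default_rng(0)
n = 5000
theta = rng.uniform(0.0, 2.0 * np.pi, n)          # azimuth
phi = np.arccos(rng.uniform(-1.0, 1.0, n))        # polar angle, uniform over the sphere
dirs = np.stack([np.sin(phi) * np.cos(theta),
                 np.sin(phi) * np.sin(theta),
                 np.cos(phi)], axis=1)
sun_dir = np.array([0.0, 0.0, 1.0])
radiance = 0.05 + 5.0 * np.clip(dirs @ sun_dir, 0.0, None) ** 32   # arbitrary example values
coeffs = fit_sh_coeffs(theta, phi, radiance, order=4)
print(coeffs.shape)   # (25,) coefficients for a 4th-order expansion
```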
3. Optical Scattering Model of Space Objects
3.1. Experiment of Spectral Reflectance
3.2. An Improved Phong Model
3.3. Satellite Surface Materials
1. Metallic:
2. Roughness:
3. IOR:
4. Alpha:
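These four parameters (M, R, IOR, Alpha) are exactly the tuple listed per material in the surface-material table later in the paper. As a hedged illustration of how such a tuple can drive a Phong-type BRDF with a Schlick Fresnel term, consider the following sketch; the roughness-to-exponent mapping and the metallic blending are our own assumptions and are not claimed to be the authors' improved model.

```python
import numpy as np

def schlick_f0(ior):
    """Normal-incidence reflectance from the index of refraction (air interface)."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2

def phong_style_brdf(n, l, v, albedo, metallic, roughness, ior, alpha):
    """Phong-style BRDF with a Schlick Fresnel factor and an energy-normalised lobe.
    All vectors are unit length; this is an illustrative variant only."""
    n_dot_l = max(float(n @ l), 0.0)
    if n_dot_l == 0.0:
        return np.zeros(3)
    r = 2.0 * n_dot_l * n - l                            # mirror reflection of the light direction
    shininess = 2.0 / max(roughness, 1e-3) ** 4 - 2.0    # assumed roughness-to-exponent mapping
    spec_lobe = (shininess + 2.0) / (2.0 * np.pi) * max(float(r @ v), 0.0) ** shininess
    f0 = schlick_f0(ior)
    fresnel = f0 + (1.0 - f0) * (1.0 - max(float(n @ v), 0.0)) ** 5
    diffuse = (1.0 - metallic) * albedo / np.pi
    specular = (metallic + (1.0 - metallic) * fresnel) * spec_lobe
    return alpha * (diffuse + specular)

# Titanium-alloy fuel tank entry from the material table: (M, R, IOR, Alpha) = (1, 0.3, 1.45, 1)
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.0, np.sin(np.radians(30)), np.cos(np.radians(30))])
v = np.array([0.0, -np.sin(np.radians(30)), np.cos(np.radians(30))])
print(phong_style_brdf(n, l, v, albedo=np.array([0.6, 0.6, 0.6]),
                       metallic=1.0, roughness=0.3, ior=1.45, alpha=1.0))
```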
4. Simulation Results
4.1. Simulation Scenarios
4.2. Experimental Parameters
1. Camera Parameters:
2. Satellite Surface Material Parameters:
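Given the camera parameters in the optical-system table (focal length, pixel pitch, and pixel count), a pinhole intrinsic matrix for the simulated camera can be assembled as below; placing the principal point at the sensor centre and ignoring distortion are simplifying assumptions on our part.

```python
import numpy as np

def pinhole_intrinsics(focal_length_m, pixel_pitch_m, width_px, height_px):
    """Pinhole intrinsic matrix K in pixel units, principal point at the image centre."""
    f_px = focal_length_m / pixel_pitch_m
    return np.array([[f_px, 0.0, width_px / 2.0],
                     [0.0, f_px, height_px / 2.0],
                     [0.0, 0.0, 1.0]])

# High-resolution mode: f = 3 m, 15 um pixels, 4096 x 4112 detector
K = pinhole_intrinsics(3.0, 15e-6, 4096, 4112)
print(K[0, 0])   # 200000 pixels of focal length
```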
4.3. Texture Features
4.4. Comparative Analysis
5. Discussion
1. High-Resolution Mode:
2. Wide-FOV Mode:
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zhu, W.S.; Mu, J.J.; Li, S.; Han, F. Research progress on spacecraft pose estimation based on deep learning. J. Astronaut. 2023, 44, 1633–1644. [Google Scholar]
- Sun, C.M.; Yuan, Y.; Lv, Q.B. Modeling and verification of space-based optical scattering characteristics of space objects. Acta Opt. Sin. 2019, 39, 1129001. [Google Scholar] [CrossRef]
- Peng, X.D.; Liu, B.; Meng, X.; Xie, W.M. Imaging simulation modeling of space-borne visible light camera. Acta Photon. Sin. 2011, 40, 1106–1111. [Google Scholar]
- Hu, J. Research on Imaging Effect Simulation of Space Camera Based on Monte Carlo Ray Tracing. Ph.D. Thesis, University of Chinese Academy of Sciences, Beijing, China, 2013. [Google Scholar]
- Huang, J.M.; Liu, L.J.; Wang, Y.; Wei, X.Q. Research on space target imaging simulation based on target visible light scattering characteristics. Aerosp. Shanghai 2015, 32, 39–43. [Google Scholar]
- Han, Y.; Chen, M.; Sun, H.Y.; Zhang, Y. Imaging simulation method of Tiangong-2 companion satellite visible light camera. Infrared Laser Eng. 2017, 46, 258–264. [Google Scholar]
- Xu, X.X. Imaging Simulation of Space-Borne Visible Light Camera for Space Targets. Ph.D. Thesis, University of Chinese Academy of Sciences, Beijing, China, 2020. [Google Scholar]
- Yang, J.S.; Li, T.J. Research on imaging simulation of space-based space target scene. Laser Optoelectron. Prog. 2022, 59, 253–260. [Google Scholar]
- Hong, J.; Hnatyshyn, R.; Santos, E.A.D.; Maciejewski, R.; Isenberg, T. A survey of designs for combined 2D+3D visual representations. IEEE Trans. Vis. Comput. Graph. 2024, 30, 2888–2902. [Google Scholar] [CrossRef]
- Musallam, M.A.; Gaudilliere, V.; Ghorbel, E.; Al Ismaeil, K.; Perez, M.D.; Poucet, M. Spacecraft recognition leveraging knowledge of space environment: Simulator, dataset, competition design and analysis. In Proceedings of the 2021 IEEE International Conference on Image Processing Challenges (ICIPC), Anchorage, AK, USA, 19–22 September 2021; pp. 11–15. [Google Scholar]
- Al Homssi, B.; Al-Hourani, A.; Wang, K.; Conder, P.; Kandeepan, S.; Choi, J. Next generation mega satellite networks for access equality: Opportunities, challenges, and performance. IEEE Commun. Mag. 2022, 60, 18–24. [Google Scholar] [CrossRef]
- Hu, Y.L.; Speierer, S.; Jakob, W.; Fua, P.; Salzmann, M. Wide-depth-range 6D object pose estimation in space. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 19–25 June 2021. [Google Scholar]
- Price, A.; Yoshida, K. A monocular pose estimation case study: The Hayabusa2 Minerva-II2 deployment. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Nashville, TN, USA, 19–25 June 2021; pp. 1992–2001. [Google Scholar]
- Cao, Y.; Mu, J.Z.; Cheng, X.H.; Liu, F.Y. Spacecraft-DS: A spacecraft dataset for key components detection and segmentation via hardware-in-the-loop capture. IEEE Sens. J. 2024, 24, 5347–5358. [Google Scholar] [CrossRef]
- Park, T.H.; D’Amico, S. Adaptive neural network-based unscented Kalman filter for robust pose tracking of noncooperative spacecraft. J. Guid. Control Dyn. 2023, 46, 1671–1688. [Google Scholar] [CrossRef]
- Li, T.J. Research on Monitoring Method and Imaging System Simulation of Space-Based Target Visible Light Camera. Ph.D. Thesis, Tianjin University, Tianjin, China, 2021. [Google Scholar]
- Yu, K. Research on Key Technologies of Space Target Photoelectric Detection Scene Simulation and Visual Navigation. Ph.D. Thesis, Harbin Institute of Technology, Harbin, China, 2020. [Google Scholar]
- Alexa, M. Super-Fibonacci spirals: Fast, low-discrepancy sampling of SO(3). In Proceedings of the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, LA, USA, 18–24 June 2022; pp. 8281–8290. [Google Scholar]
- Bechini, M.; Lavagna, M.; Longhi, P. Dataset generation and validation for spacecraft pose estimation via monocular images processing. Acta Astronaut. 2023, 204, 358–369. [Google Scholar] [CrossRef]
- Ji, X.; Zhan, F.; Lu, S.; Huang, S.-S.; Huang, H. MixLight: Borrowing the best of both spherical harmonics and Gaussian models. arXiv 2024, arXiv:2404.12768. [Google Scholar] [CrossRef]
- Price, M.A.; McEwen, J.D. Differentiable and accelerated spherical harmonic and Wigner transforms. J. Comput. Phys. 2024, 510, 113109. [Google Scholar] [CrossRef]
- Li, P.; Li, Z.; Xu, C.; Fang, Y. Research on scattering spectrum of third-order gallium arsenide battery based on thin-film interference theory. Spectrosc. Spectr. Anal. 2020, 40, 3092–3097. [Google Scholar]
- Synopsys.org. Optical Scattering Measurements & Equipment. Available online: https://www.keysight.com/us/en/products/software/optical-solutions-software.html (accessed on 21 March 2025).
- Cao, Y.; Cao, Y.; Wu, Z.; Yang, K. A calculation method for the hyperspectral imaging of targets utilizing a ray-tracing algorithm. Remote Sens. 2024, 16, 1779. [Google Scholar] [CrossRef]
- Liu, Y.; Dai, J.; Zhao, S.; Zhang, J.; Li, T.; Shang, W.; Zheng, Y. A bidirectional reflectance distribution function model of space targets in visible spectrum based on GA-BP network. Appl. Phys. 2020, 126, 114. [Google Scholar] [CrossRef]
- Li, Z.; Xu, C.; Huo, Y.; Fang, Y. Principles and Applications of Optical Characteristics of Space Targets; Tsinghua University Press: Beijing, China, 2025. [Google Scholar]
- Pharr, M.; Humphreys, G. Physically Based Rendering: From Theory to Implementation, 2nd ed.; Morgan Kaufmann, Elsevier: Burlington, MA, USA, 2010. [Google Scholar]
- Walter, B.; Marschner, S.R.; Li, H.; Torrance, K.E. Microfacet models for refraction through rough surfaces. In Proceedings of the 18th Eurographics Conference on Rendering Techniques (EGSR ’07), Grenoble, France, 25–27 June 2007; Eurographics Association: Goslar, Germany, 2007; pp. 223–231. [Google Scholar]
- Schlick, C. An inexpensive BRDF model for physically-based rendering. Comput. Graph. Forum 1994, 13, 233–246. [Google Scholar] [CrossRef]
- Karis, B. Real Shading in Unreal Engine 4; Beijing Institute of Technology Press: Beijing, China, 2013. [Google Scholar]
- Ning, Q.; Wang, H.; Yan, Z.; Liu, X.; Lu, Y. Space-based THz radar fly-around imaging simulation for space targets based on improved path tracing. Remote Sens. 2023, 15, 4010. [Google Scholar] [CrossRef]
- Hell, M.C.; Horvat, C. A method for constructing directional surface wave spectra from ICESat-2 altimetry. Cryosphere 2024, 18, 341–361. [Google Scholar] [CrossRef]
- Proença, P.F.; Gao, Y. Deep learning for spacecraft pose estimation from photorealistic rendering. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 1234–1238. [Google Scholar]
| Orbital Parameters | Initial OE for Space Objects | Initial OE for Observing Satellite | Terminal OE for Observing Satellite (Fly-By) | Terminal OE for Observing Satellite (Fly-Around) |
|---|---|---|---|---|
| Semi-major axis/km | 6878.137 | 6878.137 | 6873.136077 | 6861.380356 |
| Eccentricity | 0.0000 | 0.0000 | 0.000515 | 0.001317 |
| Inclination/° | 98.5 | 98.5 | 98.506 | 98.512 |
| RAAN/° | 360 | 360 | 1.115 | 0.545 |
| Argument of perigee/° | 0 | 0 | 179.268 | 292.901 |
| True anomaly/° | 0 | 30 | 147.547 | 172.686 |
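For scene setup, the classical elements in the table above can be converted to an inertial position with the standard two-body relations; the sketch below is textbook astrodynamics rather than code from the paper, and it assumes the usual (a, e, i, Ω, ω, ν) ordering of the table rows.

```python
import numpy as np

def oe_to_eci_position(a_km, e, i_deg, raan_deg, argp_deg, nu_deg):
    """Classical orbital elements -> ECI position vector (km), two-body model."""
    i, raan, argp, nu = np.radians([i_deg, raan_deg, argp_deg, nu_deg])
    r = a_km * (1.0 - e**2) / (1.0 + e * np.cos(nu))        # orbit radius
    r_pf = r * np.array([np.cos(nu), np.sin(nu), 0.0])      # position in the perifocal frame
    cO, sO = np.cos(raan), np.sin(raan)
    ci, si = np.cos(i), np.sin(i)
    cw, sw = np.cos(argp), np.sin(argp)
    # Perifocal-to-ECI rotation R3(-RAAN) R1(-i) R3(-argp)
    R = np.array([[cO*cw - sO*sw*ci, -cO*sw - sO*cw*ci,  sO*si],
                  [sO*cw + cO*sw*ci, -sO*sw + cO*cw*ci, -cO*si],
                  [sw*si,             cw*si,             ci]])
    return R @ r_pf

# Initial elements of the space object from the table: a = 6878.137 km, e = 0, i = 98.5 deg
print(oe_to_eci_position(6878.137, 0.0, 98.5, 360.0, 0.0, 0.0))
```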
| Optical System Design Index | | Value |
|---|---|---|
| Sensor size/mm × mm | | 61.44 × 61.68 |
| Wavelength/nm | | 450–800 |
| Number of pixels | | 4096 × 4112 |
| Pixel size/μm × μm | | 15 × 15 |
| Entrance pupil/mm | | 280 |
| Diameter of Target/m | | 5 |
| Orbital Altitude of Target/km | | 500 |
| FOV/° | High-resolution Mode | 2 |
| | Wide-FOV Mode | 10 |
| Focal Length/m | High-resolution Mode | 3 |
| | Wide-FOV Mode | 0.7 |
| Material | Component Information | Typical Materials | (M, R, IOR, Alpha) |
|---|---|---|---|
| Alloy | Satellite Main Body Structure | Brushed Metal Panel | (1, 0.3, 1.45, 1) |
| | Fuel Tank | Titanium Alloy | (1, 0.3, 1.45, 1) |
| | Connection Components | Black Beryllium Alloy | (1, 0.5, 1.45, 1) |
| | Hinge | Nickel-based High-Temperature Alloy | (1, 1, 1.45, 1) |
| Cladding Materials | Multilayer Insulation Material (MLI) | PI Film (gold, silver, black) | (1, 0.3, 1.45, 1) |
| | Support Structures | Black Carbon Fiber Composite Substrate | (0, 0.5, 1.45, 1) |
| Thermal Control Coatings | Sprayed Thermal Insulation Material | Organic Paints (organic white paint, black paint, epoxy paint, titanium blue paint) | (1, 1, 1.45, 1) |
| | Radiation-Texture Coating | Silica-based Coating | (0, 0.5, 1.5, 1) |
| | Black Wrinkle Paint | Epoxy Resin | (0, 0.6, 1.45, 1) |
| Solar Sail Panels | Solar Sail Panels | GaAs/Si | (1, 0.3, 1.45, 1) |
| Replaceable Payloads | Optical Lenses | OSR | (0.1, 0.85, 1.5, 1) |
| Model | | | Angular Error/° |
|---|---|---|---|
| Phong + 3rd SH | 32.5 | 0.53 | 6.5 |
| Improved Phong + 3rd SH | 42.8 | 1.65 | 2.8 |
| Improved Phong + 4th SH | 45.2 | 2.86 | 1.2 |
| No. | | | | |
|---|---|---|---|---|
| (a) | 0.8281 | 0.4632 | 0.1545 | 0.2768 |
| (b) | 0.5836 | 0.7054 | −0.3100 | −0.2564 |
| (c) | 0.2465 | 0.7432 | 0.5904 | 0.1958 |
| (d) | −0.3380 | −0.6189 | 0.6223 | 0.3398 |
| (e) | 0.4988 | 0.5466 | −0.4968 | −0.4534 |
| (f) | 0.0177 | 0.0123 | 0.5691 | 0.8203 |
| (g) | 0.5991 | −0.7997 | 0.0323 | 0.0242 |
| (h) | 0.6641 | −0.7452 | 0.0454 | 0.0405 |
| (i) | 0.6947 | −0.7158 | 0.0507 | 0.0492 |
| (j) | 0.7239 | −0.6852 | 0.0553 | 0.0584 |
| (k) | 0.7518 | −0.6532 | 0.0592 | 0.0681 |
| (l) | 0.8028 | −0.5859 | 0.0653 | 0.0895 |
| Dataset Types | SPARK [10] | Speed+ [11] | URSO [33] | Swisscube [12] | Minerva-II2 [13] | Ours |
|---|---|---|---|---|---|---|
| Total Number of Images | 150 k | 15 k | 15 k | 21 k | 20 k | 100 k |
| Category | 15 | 1 | 2 | 1 | 1 | 5 |
| Color | √ | × | √ | √ | × | √ |
| Depth | × | × | × | × | × | √ |
| Mask | √ | × | × | × | √ | √ |
| 6D Pose | √ | √ | √ | √ | √ | √ |
| 3D Model | √ | × | × | √ | √ | √ |
| Surface Texture Fidelity | × | √ | × | × | × | √ |
| Render | simulated | simulated + lab | simulated + lab | simulated | simulated + lab | simulated + lab |
| Observation Distance | Operational Mode | Spatial Resolution | Satellite Size in Pixels | Recognition Capability |
|---|---|---|---|---|
| 5 km | High-resolution Mode | 0.025 m | 200 | Texture-Level Identification |
| 10 km | High-resolution Mode | 0.05 m | 100 | Component-Level Identification |
| 20 km | High-resolution Mode | 0.1 m | 50 | Component-Level Identification |
| 30 km | Wide-FOV Mode | 0.643 m | 7.8 | Shape Recognition |
| 50 km | Wide-FOV Mode | 1.071 m | 4.7 | Wide-Area Detection and Tracking |
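The spatial resolutions and pixel footprints in the table above follow from the pinhole relation GSD = R·p/f, with pixel pitch p = 15 µm and the mode-dependent focal length f from the optical-system table; the short check below reproduces the listed values.

```python
# Quick check of the spatial-resolution column: GSD = range * pixel_pitch / focal_length.
PIXEL_PITCH = 15e-6          # m
TARGET_SIZE = 5.0            # m, target diameter from the optical-system table
FOCAL = {"High-resolution": 3.0, "Wide-FOV": 0.7}   # m

for range_km, mode in [(5, "High-resolution"), (10, "High-resolution"),
                       (20, "High-resolution"), (30, "Wide-FOV"), (50, "Wide-FOV")]:
    gsd = range_km * 1e3 * PIXEL_PITCH / FOCAL[mode]
    print(f"{range_km:>2} km, {mode}: GSD = {gsd:.3f} m, target ≈ {TARGET_SIZE / gsd:.1f} px")
```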