On-Orbit Calibration for Spaceborne Line Array Camera and LiDAR
Abstract
1. Introduction
2. Calibration Model and the Method of Calculation
2.1. Line Array Camera Imaging Model
2.1.1. Definition of Line Array Camera Coordinate System
2.1.2. Line Array Camera Imaging Model
2.2. LiDAR Observation Model
2.2.1. Definition of LiDAR Coordinate System
2.2.2. LiDAR Observation Model
2.3. Line Array Camera and LiDAR Joint Calibration Model
2.4. Method of Calculation
2.4.1. Solving the Imaging Scale Factor
2.4.2. Solving the Relative Extrinsic Parameters
3. Simulation Results and Analysis
3.1. Numerical Validation
3.1.1. Influence of Observation Error
- The fixed number of point pairs is 100;
- The fixed number of LiDAR beams is 127;
- The on-orbit attitude shifting Euler angle of the line array camera is ;
- The on-orbit attitude shifting Euler angle of the LiDAR is .
3.1.2. Influence of On-Orbit Shifting Angles of Sensors
- The fixed number of point pairs is 100;
- The fixed number of LiDAR beams is 127;
- The camera image coordinate errors and LiDAR image coordinate errors are both normally distributed, with a standard deviation of 0.2 pixels;
- The laser ranging errors are normally distributed, with a standard deviation of 10 m (a minimal sketch of this noise injection follows this list).
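The following is a minimal sketch (ours, not the paper's code; the array shapes, detector extent, and range magnitudes are assumptions) of how such normally distributed observation errors can be injected into simulated measurements for a Monte Carlo trial:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def perturb_observations(cam_uv, lidar_uv, ranges,
                         sigma_cam_px=0.2, sigma_lidar_px=0.2,
                         sigma_range_m=10.0):
    """Add zero-mean Gaussian noise to simulated observations.

    cam_uv, lidar_uv: (N, 2) image coordinates in pixels.
    ranges: (N,) laser ranging values in metres.
    The default sigmas match the fixed settings listed above.
    """
    cam_noisy = cam_uv + rng.normal(0.0, sigma_cam_px, cam_uv.shape)
    lidar_noisy = lidar_uv + rng.normal(0.0, sigma_lidar_px, lidar_uv.shape)
    ranges_noisy = ranges + rng.normal(0.0, sigma_range_m, ranges.shape)
    return cam_noisy, lidar_noisy, ranges_noisy

# Example: 100 point pairs, as in the fixed simulation settings.
cam_uv = rng.uniform(0, 12000, size=(100, 2))   # hypothetical detector extent
lidar_uv = rng.uniform(0, 127, size=(100, 2))   # 127-beam LiDAR image, assumed layout
ranges = rng.uniform(4.9e5, 5.1e5, size=100)    # ~500 km ranges, assumed
cam_n, lidar_n, ranges_n = perturb_observations(cam_uv, lidar_uv, ranges)
```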
3.1.3. Influence of the Number of LiDAR Beams
- The fixed on-orbit shifting Euler angle of the line array camera is ;
- The fixed on-orbit shifting Euler angle of the LiDAR is ;
- The fixed number of point pairs is 100;
- The camera image coordinate errors and LiDAR image coordinate errors are both normally distributed, with a standard deviation of 0.2 pixels;
- The laser ranging errors are normally distributed, with a standard deviation of 10 m.
3.1.4. Influence of the Number of Point Pairs
- The fixed number of LiDAR beams is 127;
- The fixed on-orbit shifting Euler angle of the line array camera is ;
- The fixed on-orbit shifting Euler angle of the LiDAR is ;
- The camera image coordinate errors and LiDAR image coordinate errors are both normally distributed, with a standard deviation of 0.2 pixels;
- The laser ranging errors are normally distributed, with a standard deviation of 10 m.
3.2. Simulation Experiment
3.2.1. Scheme of Hardware-in-Loop Simulation Experiment
3.2.2. Simulation Experiment Results
- The proposed method achieves higher matching accuracy between the camera image and the LiDAR data than manual matching.
- The proposed method is more tolerant of errors in the distance measurements of ground points.
- The EPnP method is based on point coordinates, and its error decreases as the number of points increases. In the hardware-in-loop experiment, however, it is hard to increase the number of points for EPnP: the method requires the 3D coordinates of the points, and measuring those coordinates is inefficient, unlike the proposed method (a hedged sketch of the EPnP baseline follows this list).
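For context, the EPnP baseline referred to above can be reproduced with OpenCV's general PnP solver. The sketch below is ours, not the experiment's code; the point arrays and pinhole intrinsics are placeholders, and a real line array camera would need its own imaging model:

```python
import numpy as np
import cv2

# Hypothetical inputs: N >= 4 ground points with measured 3D coordinates
# (object_pts, metres) and their observed image projections (image_pts, pixels).
object_pts = np.array([[0.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0],
                       [1.0, 1.0, 1.0]], dtype=np.float64)
image_pts = np.array([[320.0, 240.0],
                      [420.0, 238.0],
                      [322.0, 140.0],
                      [430.0, 130.0]], dtype=np.float64)

# Placeholder pinhole intrinsics; lens distortion is assumed to be zero.
K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist,
                              flags=cv2.SOLVEPNP_EPNP)
R, _ = cv2.Rodrigues(rvec)  # rotation matrix of the estimated pose
print(ok, R, tvec.ravel())
```

Note that every point fed to the solver needs a measured 3D coordinate, which is exactly what limits the number of points for EPnP in the hardware-in-loop setting.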
4. Discussion
- In this paper, the relative installation parameters of the line array camera and LiDAR on the satellite are calibrated using ground features, so the proposed method needs no additional ground control points.
- The calibration procedure requires no satellite maneuvers, which simplifies the calibration steps.
- Feature-point searching across the multi-source images is avoided by matching the camera image and the LiDAR image with mutual information (a minimal matching sketch follows this list).
- The difficulty of solving the rank-deficient equation is overcome by the alternating iterative method, which converges quickly; the constraint that the on-orbit shifting angles are all small is introduced in the calculation step.
- The results of the numerical validation and hardware-in-loop experiment show that the proposed joint calibration method is effective for the spaceborne line array camera and LiDAR in the cases considered in Section 3.1 and in the indoor simulated scene.
- The EPnP method is one of the most popular pose estimation methods for area array cameras, with representative results. Because the proposed method differs from other calibration methods for spaceborne sensors, a direct comparison with them is difficult. Instead, the EPnP method is applied to the data from the hardware-in-loop experiment, and its results are compared with those of our approach in Section 3.2.2. The comparison shows that the proposed method achieves higher geo-positioning accuracy than the EPnP method in the indoor simulated scene.
- The influence of the satellite attitude error is not considered in the calibration model; it is a key factor in calibration performance and the focus of our next research.
- The mean horizontal positioning error of the camera and LiDAR is used as the accuracy criterion for the proposed method because it is an important factor in fusing optical images with LiDAR data. The vertical positioning error is not a representative criterion: the vertical positions of both the camera and the LiDAR are determined mainly by the laser ranging data, so the vertical positioning error computed this way is always small and does not reflect the actual situation.
- The numerical validation shows that the horizontal positioning error stays below 0.8 m when the parameters are well set and the measurement errors are within a reasonable range. However, the hardware-in-loop experiment yields a calculation error of about 48 m when projected to real satellite conditions, for two reasons:
- (1) To focus on the calibration performance of the proposed method for the relative pose of the camera and LiDAR, the numerical validation is performed under ideal conditions; measurement errors of parameters and observations are not considered except for the aspects listed in Section 3.1. The calibration errors in the numerical validation are therefore smaller than in the real situation.
- (2) The hardware-in-loop simulation is a scaled-down experiment. When the solution is scaled up to normal size, the measurement errors of parameters and observations, and their effects, are amplified beyond reasonable ranges, so the calculated errors become much larger than in the real situation.
- Since almost no operational satellite is simultaneously equipped with a line array camera and LiDAR, and relevant real data are lacking, the proposed method is verified by numerical validation and simulation experiments rather than by real remote sensing data. Once such a satellite is in operation, its data will be used to further verify the accuracy and reliability of the proposed method, and the method will be refined accordingly.
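As a minimal sketch of the mutual-information matching idea mentioned above (a generic histogram-based formulation with our own assumptions about binning and a brute-force offset search, not the paper's implementation):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Histogram-based mutual information between two equally sized images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of img_b
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def best_offset(cam_img, lidar_img, search=10):
    """Brute-force MI matching: cam_img must exceed lidar_img by
    2 * search pixels in each dimension."""
    h, w = lidar_img.shape
    best_mi, best_uv = -np.inf, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            patch = cam_img[search + du:search + du + h,
                            search + dv:search + dv + w]
            mi = mutual_information(patch, lidar_img)
            if mi > best_mi:
                best_mi, best_uv = mi, (du, dv)
    return best_uv, best_mi
```

Maximizing MI over candidate alignments, rather than extracting and matching feature points, is what lets the multi-source images be paired directly.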
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Doyle, T.B.; Woodroffe, C.D. The application of LiDAR to investigate foredune morphology and vegetation. Geomorphology 2018, 303, 106–121.
- Eagleston, H.; Marion, J.L. Application of airborne LiDAR and GIS in modeling trail erosion along the Appalachian Trail in New Hampshire, USA. Landsc. Urban Plan. 2020, 198, 103765.
- Zhang, G.; Chen, W.; Xie, H. Tibetan Plateau’s lake level and volume changes from NASA’s ICESat/ICESat-2 and Landsat Missions. Geophys. Res. Lett. 2019, 46, 13107–13118.
- Farrell, S.; Duncan, K.; Buckley, E.; Richter-Menge, J.; Li, R. Mapping sea ice surface topography in high fidelity with ICESat-2. Geophys. Res. Lett. 2020, 47, e2020GL090708.
- Neuenschwander, A.; Guenther, E.; White, J.C.; Duncanson, L.; Montesano, P. Validation of ICESat-2 terrain and canopy heights in boreal forests. Remote Sens. Environ. 2020, 251, 112110.
- Li, W.; Niu, Z.; Shang, R.; Qin, Y.; Wang, L.; Chen, H. High-resolution mapping of forest canopy height using machine learning by coupling ICESat-2 LiDAR with Sentinel-1, Sentinel-2 and Landsat-8 data. Int. J. Appl. Earth Obs. Geoinf. 2020, 92, 102163.
- Lin, X.; Xu, M.; Cao, C.; Dang, Y.; Bashir, B.; Xie, B.; Huang, Z. Estimates of Forest Canopy Height Using a Combination of ICESat-2/ATLAS Data and Stereo-Photogrammetry. Remote Sens. 2020, 12, 3649.
- Ma, Y.; Xu, N.; Sun, J.; Wang, X.H.; Yang, F.; Li, S. Estimating water levels and volumes of lakes dated back to the 1980s using Landsat imagery and photon-counting lidar datasets. Remote Sens. Environ. 2019, 232, 111287.
- Ma, Y.; Xu, N.; Liu, Z.; Yang, B.; Yang, F.; Wang, X.H.; Li, S. Satellite-derived bathymetry using the ICESat-2 lidar and Sentinel-2 imagery datasets. Remote Sens. Environ. 2020, 250, 112047.
- Zhang, H.; Zhao, X.; Mei, Q.; Wang, Y.; Song, S.; Yu, F. On-orbit thermal deformation prediction for a high-resolution satellite camera. Appl. Therm. Eng. 2021, 195, 117152.
- Wang, M.; Yang, B.; Hu, F.; Zang, X. On-orbit geometric calibration model and its applications for high-resolution optical satellite imagery. Remote Sens. 2014, 6, 4391–4408.
- Meng, W.; Zhu, S.; Cao, W.; Cao, B.; Gao, X. High Accuracy On-Orbit Geometric Calibration of Linear Push-broom Cameras. Geomat. Inf. Sci. Wuhan Univ. 2015, 40, 1392–1399.
- Pi, Y.; Xie, B.; Yang, B.; Zhang, Y.; Li, X.; Wang, M. On-orbit Geometric Calibration of Linear Push-broom Optical Satellite Based on Sparse GCPs. J. Geod. Geoinf. Sci. 2020, 3, 64–75.
- Pi, Y. On-orbit Internal Calibration Based on the Cross Image Pairs for an Agile Optical Satellite Under the Condition without Use of Calibration Site. Master’s Thesis, Wuhan University, Wuhan, China, 2017.
- Wang, J.; Wang, R. EFP multi-functional bundle adjustment of Mapping Satellite-1 without ground control points. J. Remote Sens. 2012, 1, 112–115.
- Yang, B.; Pi, Y.; Li, X.; Yang, Y. Integrated geometric self-calibration of stereo cameras onboard the ZiYuan-3 satellite. ISPRS J. Photogramm. Remote Sens. 2020, 162, 173–183.
- Luthcke, S.; Rowlands, D.D.; McCarthy, J.J.; Pavlis, D.E.; Stoneking, E. Spaceborne laser-altimeter-pointing bias calibration from range residual analysis. J. Spacecr. Rocket. 2000, 37, 374–384.
- Hong, Y.; Song, L.; Yue, M.; Shi, G. On-orbit calibration of satellite laser altimeters based on footprint detection. Acta Phys. Sin. 2017, 66, 126–135.
- Guo, Y.; Xie, H.; Xu, Q.; Liu, X.; Wang, X.; Li, B.; Tong, X. A satellite photon-counting laser altimeter calibration algorithm using CCRs and indirect adjustment. In Proceedings of the Sixteenth National Conference on Laser Technology and Optoelectronics, Shanghai, China, 3–6 June 2021; Volume 11907, p. 1190724.
- Yi, H.; Li, S.; Weng, Y.; Ma, Y. On-orbit calibration of spaceborne laser altimeter using natural surface range residuals. J. Huazhong Univ. Sci. Technol. (Nat. Sci. Ed.) 2016, 44, 58–61.
- Tang, X.; Xie, J.; Gao, X.; Mo, F.; Feng, W.; Liu, R. The in-orbit calibration method based on terrain matching with pyramid-search for the spaceborne laser altimeter. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 1053–1062.
- Pusztai, Z.; Eichhardt, I.; Hajder, L. Accurate calibration of multi-lidar-multi-camera systems. Sensors 2018, 18, 2139.
- Zhou, L.; Li, Z.; Kaess, M. Automatic extrinsic calibration of a camera and a 3D LiDAR using line and plane correspondences. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 5562–5569.
- Verma, S.; Berrio, J.S.; Worrall, S.; Nebot, E. Automatic extrinsic calibration between a camera and a 3D Lidar using 3D point and plane correspondences. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 3906–3912.
- Tóth, T.; Pusztai, Z.; Hajder, L. Automatic LiDAR-camera calibration of extrinsic parameters using a spherical target. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 8580–8586.
- Hsu, C.M.; Wang, H.T.; Tsai, A.; Lee, C.Y. Online Recalibration of a Camera and Lidar System. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 4053–4058.
- Nagy, B.; Kovács, L.; Benedek, C. Online targetless end-to-end camera-LiDAR self-calibration. In Proceedings of the 2019 16th International Conference on Machine Vision Applications (MVA), Tokyo, Japan, 27–31 May 2019; pp. 1–6.
- Huang, Y.; Yuan, B. An Algorithm of Motion Estimation Based on Unit Quaternion Decomposition of the Rotation Matrix. J. Electron. 1996, 18, 337–343.
- Guo, Z.; Chen, Q.; Wu, G.; Xu, Y.; Shibasaki, R.; Shao, X. Village building identification based on ensemble convolutional neural networks. Sensors 2017, 17, 2487.
- Alshehhi, R.; Marpu, P.R.; Woon, W.L.; Dalla Mura, M. Simultaneous extraction of roads and buildings in remote sensing imagery with convolutional neural networks. ISPRS J. Photogramm. Remote Sens. 2017, 130, 139–149.
- Lepetit, V.; Moreno-Noguer, F.; Fua, P. EPnP: An accurate O(n) solution to the PnP problem. Int. J. Comput. Vis. 2009, 81, 155–166.
- Liu, L.; Xie, J.; Tang, X.; Ren, C.; Chen, J.; Liu, R. Coarse-to-Fine Image Matching-Based Footprint Camera Calibration of the GF-7 Satellite. Sensors 2021, 21, 2297.
Table: Horizontal positioning errors (m) under different observation errors. CC: standard deviation of the camera image coordinate error (pixel); LC: standard deviation of the LiDAR image coordinate error (pixel); LR: standard deviation of the laser ranging error (m).

| CC (pixel) | LC (pixel) | LR (m) | X Min | X Max | X Mean | Y Min | Y Max | Y Mean |
|---|---|---|---|---|---|---|---|---|
| 0.2 | 0 | 0 | 0.0009 | 1.4169 | 0.3335 | 0.0030 | 1.3077 | 0.3727 |
| 0.5 | 0 | 0 | 0.0025 | 3.5425 | 0.8338 | 0.0005 | 3.0639 | 0.8698 |
| 0 | 0.2 | 0 | 0.0124 | 1.0598 | 0.3120 | 0.0037 | 0.3106 | 0.1432 |
| 0 | 0.5 | 0 | 0.0307 | 2.6498 | 0.7801 | 0.0041 | 0.3103 | 0.1431 |
| 0 | 0 | 10 | 0.0059 | 1.7430 | 0.5113 | 0.0048 | 0.3106 | 0.1433 |
| 0 | 0 | 50 | 0.0284 | 8.7158 | 2.5567 | 0.0044 | 0.3092 | 0.1437 |
| 0.1 | 0.1 | 10 | 0.0156 | 1.8940 | 0.5843 | 0.0014 | 0.7228 | 0.2249 |
| 0.2 | 0.2 | 10 | 0.0035 | 2.0974 | 0.7265 | 0.0033 | 1.3083 | 0.3728 |
| 0.5 | 0.5 | 10 | 0.0201 | 3.9419 | 1.3341 | 0.0009 | 3.0648 | 0.8698 |
| 1 | 1 | 50 | 0.0170 | 10.4874 | 3.6323 | 0.0706 | 5.9933 | 1.7229 |
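One hedged reading of the Min/Max/Mean columns in the tables above and below (our assumption, not stated explicitly in this section: the statistics are taken over repeated trials of the absolute positioning error in each horizontal direction):

$$\bar{e}_x = \frac{1}{N}\sum_{i=1}^{N}\left|\hat{x}_i - x_i\right|, \qquad \bar{e}_y = \frac{1}{N}\sum_{i=1}^{N}\left|\hat{y}_i - y_i\right|$$

where $\hat{x}_i, \hat{y}_i$ denote the calibrated horizontal positions and $x_i, y_i$ the ground truth.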
Table: Horizontal positioning errors (m) under different on-orbit shifting angles (x/y/z Euler angles, degrees).

| Camera Shift (x, y, z) | LiDAR Shift (x, y, z) | X Min | X Max | X Mean | Y Min | Y Max | Y Mean |
|---|---|---|---|---|---|---|---|
| (−0.0500, 0.0300, −0.0400) | (0.0100, −0.0300, 0.0100) | 0.0035 | 2.0974 | 0.7265 | 0.0033 | 1.3083 | 0.3728 |
| (0.0700, 0.0500, −0.0400) | (−0.0200, 0.0500, 0.0300) | 0.0071 | 2.1185 | 0.7328 | 0.0189 | 1.3376 | 0.3915 |
| (0.1000, 0.3000, −0.2000) | (−0.3000, 0.2000, 0.1000) | 0.0037 | 2.0867 | 0.7222 | 0.0250 | 2.6545 | 0.9250 |
| (1.0000, 1.0000, −2.0000) | (−1.0000, 1.0000, 1.0000) | 0.0196 | 2.1164 | 0.7293 | 0.0086 | 2.3571 | 0.7361 |
Table: Horizontal positioning errors (m) under different numbers of LiDAR beams.

| Number of LiDAR Beams | X Min | X Max | X Mean | Y Min | Y Max | Y Mean |
|---|---|---|---|---|---|---|
| 4 | 0.0123 | 2.1072 | 0.6310 | 0.0129 | 5.8226 | 2.0472 |
| 7 | 0.0114 | 2.0402 | 0.7025 | 0.0346 | 4.4871 | 1.9170 |
| 15 | 0.0017 | 2.0794 | 0.7235 | 0.0025 | 2.9571 | 1.2076 |
| 31 | 0.0028 | 2.0951 | 0.7263 | 0.0012 | 1.8963 | 0.6721 |
| 63 | 0.0033 | 2.0974 | 0.7265 | 0.0010 | 1.4485 | 0.4510 |
| 127 | 0.0035 | 2.0974 | 0.7265 | 0.0033 | 1.3083 | 0.3728 |
| 255 | 0.0037 | 2.0971 | 0.7264 | 0.0018 | 1.2387 | 0.3502 |
Table: Horizontal positioning errors (m) under different numbers of point pairs.

| Number of Points | X Min | X Max | X Mean | Y Min | Y Max | Y Mean |
|---|---|---|---|---|---|---|
| 10 | 0.0124 | 1.5103 | 0.5351 | 0.0144 | 1.0556 | 0.4617 |
| 50 | 0.0175 | 2.4454 | 0.7416 | 0.0052 | 1.0953 | 0.4035 |
| 100 | 0.0035 | 2.0974 | 0.7265 | 0.0033 | 1.3083 | 0.3728 |
| 1000 | 0.0003 | 2.8153 | 0.7196 | 0.0007 | 1.5610 | 0.3382 |
| 10,000 | 0.0005 | 3.7995 | 0.6994 | 0.0007 | 1.6772 | 0.3411 |
Table: Positioning errors in the hardware-in-loop experiment (indoor scaled scene).

| Method | X Min | X Max | X Mean | Y Min | Y Max | Y Mean |
|---|---|---|---|---|---|---|
| Before Calibration | 0.0155 | 0.0236 | 0.0197 | 0.0470 | 0.0529 | 0.0495 |
| Proposed Method | 0.0000 | 0.0052 | 0.0021 | 0.0000 | 0.0029 | 0.0013 |
| EPnP Method | 0.0001 | 0.0188 | 0.0091 | 0.0052 | 0.1039 | 0.0440 |