RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs
Abstract
1. Introduction
1.1. Background
1.2. Related Works
- IMU excitation is not required for initialization, in contrast to [7].
- Online relocalization combined with loop closure and pose-graph optimization has been developed, yielding odometry and mapping that are more accurate than in [9].
- In contrast to the odometry and mapping algorithm in [11], the developed RTLIO provides high-frequency odometry for the UAV while constructing maps synchronously.
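The high-frequency odometry in the last point is commonly obtained by propagating the most recent optimized state with raw IMU measurements between the 10 Hz LiDAR updates. The sketch below illustrates one such propagation step; it is not the authors' implementation, and the function name and interface are purely illustrative assumptions.

```python
import numpy as np

def propagate_imu(p, v, R, acc, gyro, g, dt):
    """One Euler integration step of the IMU state.

    p, v : position and velocity in the world frame, shape (3,)
    R    : body-to-world rotation matrix, shape (3, 3)
    acc, gyro : bias-corrected IMU measurements in the body frame, shape (3,)
    g    : gravity in the world frame, e.g. [0, 0, -9.81]
    dt   : IMU sampling interval in seconds
    """
    # Rotate the specific force into the world frame and add gravity.
    a_w = R @ acc + g
    p_new = p + v * dt + 0.5 * a_w * dt ** 2
    v_new = v + a_w * dt

    # Rotation update via the Rodrigues formula (SO(3) exponential map).
    theta = gyro * dt
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        dR = np.eye(3)
    else:
        axis = theta / angle
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return p_new, v_new, R @ dR
```

Running this step at the IMU rate (e.g. 200 Hz) from the last sliding-window estimate produces odometry far faster than the LiDAR scan rate, at the cost of drift that the next optimization corrects.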
1.3. Overview
2. Methodology
2.1. Measurement Preprocessing
2.1.1. Time Alignment
2.1.2. IMU Preintegration
2.1.3. Correction of Preintegration
2.1.4. LiDAR Feature Extraction and Distortion Compensation
2.1.5. LiDAR Odometry
2.2. Estimator Initialization
2.2.1. Rotational Alignment
2.2.2. Linear Alignment
2.3. Front-End: Tightly Coupled LIO and Mapping
2.3.1. IMU Measurement Model
2.3.2. LiDAR Measurement Model
2.3.3. Residuals for the Edge Features
2.3.4. Residuals for the Plane Features
2.3.5. Marginalization
2.4. Back-End: Loop Closure and Pose-Graph Optimization
2.4.1. Loop Closure
Algorithm 1: Loop closure algorithm (input from the sliding window).
2.4.2. Tightly Coupled Relocalization
2.4.3. Global Pose-Graph Optimization
2.4.4. Sequential Edge
2.4.5. Pose-Graph Optimization for Four Degrees of Freedom
3. Experiment Results and Discussions
3.1. Indoor Flight Test
3.1.1. System Setup
3.1.2. Precision and Time Cost
3.1.3. Indoor Flights
3.1.4. Indoor Flight with an Obstacle
3.2. KITTI Dataset Evaluation
3.2.1. Front-End Performance
3.2.2. Full Closed-Loop Performance
3.3. Time Consumption
4. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Lippitt, C.D.; Zhang, S. The impact of small unmanned airborne platforms on passive optical remote sensing: A conceptual perspective. Int. J. Remote Sens. 2018, 39, 4852–4868. [Google Scholar] [CrossRef]
- Lin, Y.; Gao, F.; Qin, T.; Gao, W.; Liu, T.; Wu, W.; Yang, Z.; Shen, S. Autonomous aerial navigation using monocular visual-inertial fusion. J. Field Robot. 2018, 35, 23–51. [Google Scholar] [CrossRef]
- Weiss, S.; Achtelik, M.W.; Lynen, S.; Chli, M.; Siegwart, R. Real-time onboard visual-inertial state estimation and self-calibration of mavs in unknown environments. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 957–964. [Google Scholar]
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual-inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334. [Google Scholar] [CrossRef] [Green Version]
- Mourikis, A.I.; Roumeliotis, S.I. A multi-state constraint kalman filter for vision-aided inertial navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3565–3572. [Google Scholar]
- Sun, K.; Mohta, K.; Pfrommer, B.; Watterson, M.; Liu, S.; Mulgaonkar, Y.; Taylor, C.J.; Kumar, V. Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robot. Autom. Lett. 2018, 3, 965–972. [Google Scholar] [CrossRef] [Green Version]
- Qin, T.; Li, P.; Shen, S. Vins-mono: A robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 2018, 34, 1004–1020. [Google Scholar] [CrossRef] [Green Version]
- Qin, C.; Ye, H.; Pranata, C.E.; Han, J.; Liu, M. LINS: A lidar-inertial state estimator for robust and fast navigation. arXiv 2019, arXiv:1907.02233. [Google Scholar]
- Ye, H.; Chen, Y.; Liu, M. Tightly coupled 3d lidar inertial odometry and mapping. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 3144–3150. [Google Scholar]
- Shan, T.; Englot, B. Lego-loam: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765. [Google Scholar]
- Zhang, J.; Singh, S. Loam: Lidar odometry and mapping in real-time. In Proceedings of the Robotics: Science and Systems Conference, Berkeley, CA, USA, 12–16 July 2014. [Google Scholar]
- Geneva, P.; Eckenhoff, K.; Yang, Y.; Huang, G. Lips: Lidar-inertial 3d plane slam. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 123–130. [Google Scholar]
- Lin, J.; Zhang, F. Loam_livox: A fast, robust, high-precision lidar odometry and mapping package for lidars of small fov. arXiv 2019, arXiv:1909.06700. [Google Scholar]
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2d lidar slam. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278. [Google Scholar]
- Dube, R.; Dugas, D.; Stumm, E.; Nieto, J.; Siegwart, R.; Cadena, C. Segmatch: Segment based place recognition in 3d point clouds. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017. [Google Scholar]
- Geiger, A.; Lenz, P.; Urtasun, R. Are we ready for autonomous driving? the KITTI vision benchmark suite. In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), Providence, RI, USA, 16–21 June 2012. [Google Scholar]
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. IJRR 2013, 32, 1231–1237. [Google Scholar] [CrossRef] [Green Version]
- Lupton, T.; Sukkarieh, S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions. IEEE Trans. Robot. 2012, 28, 61–76. [Google Scholar] [CrossRef]
- Shen, S.; Michael, N.; Kumar, V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft mavs. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 5303–5310. [Google Scholar]
- Solà, J. Quaternion kinematics for the error-state kalman filter. arXiv 2017, arXiv:1711.02508. [Google Scholar]
- Agarwal, S.; Mierle, K. Ceres Solver. Available online: http://ceres-solver.org (accessed on 3 April 2021).
- Sibley, G.; Matthies, L.; Sukhatme, G. Sliding window filter with application to planetary landing. J. Field Robot. 2010, 27, 587–608. [Google Scholar] [CrossRef]
- Dong-Si, T.; Mourikis, A.I. Motion tracking with fixed-lag smoothing: Algorithm and consistency analysis. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 5655–5662. [Google Scholar]
Index | Note
---|---
 | position
 | velocity
q | quaternion
 | Euler angle
 | rotation matrix
 | transformation matrix
 | angular velocity
 | linear acceleration
 | gravity
 | acceleration and gyroscope bias
 | acceleration and gyroscope noise
P | point cloud
 | a point in P
b | body frame
w | world frame
l | LiDAR frame
 | state representation in th frame
 | state at time t
 | nominal state
 | cardinality of the denoted argument
 | number of frames in the sliding window
Method | Number of Frames | Translation (m) | Rotation (deg) | Computation Time (ms) |
---|---|---|---|---|
LOAM (10 Hz) | 1203 | 0.0599 | 1.4218 | 67.5977 |
ALOAM (10 Hz) | 1224 | 0.0078 | 0.3955 | 61.1810 |
RTLIO (10 Hz) | 1224 | 0.0066 | 0.1881 | 96.3577 |
Sequence | RTLIO Translation (m) | RTLIO Rotation (deg) | With Back-End Translation (m) | With Back-End Rotation (deg)
---|---|---|---|---
00 | 9.4542 | 2.5884 | 1.8196 | 0.7324 |
* 01 | 27.5966 | 8.0052 | 31.3346 | 9.2077 |
02 | 10.3673 | 1.5718 | 5.8435 | 1.3680 |
* 04 | 1.8050 | 1.2320 | 1.0295 | 1.3538 |
05 | 3.5576 | 1.7812 | 0.9164 | 0.4610 |
06 | 5.7340 | 2.9552 | 1.4797 | 0.6996 |
07 | 1.2983 | 0.7238 | 0.9850 | 0.6497 |
08 | 22.4302 | 4.0389 | 10.2060 | 2.1994 |
09 | 20.9436 | 5.8630 | 3.4717 | 2.3290 |
* 10 | 2.3719 | 1.2684 | 2.3041 | 1.2050 |
Thread | Module | Indoor Time (ms) | KITTI Time (ms) | Rate (Hz)
---|---|---|---|---
1 | feature extraction | 6 | 25 | 10
2 | frame-to-frame odometry | 15 | 65 | 10
3 | sliding window optimization | 65 | 350 | 10
4 | loop closure | 130 | 200 | X
4 | pose-graph optimization | 10 | 120 | X
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, J.-C.; Lin, C.-J.; You, B.-Y.; Yan, Y.-L.; Cheng, T.-H. RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs. Sensors 2021, 21, 3955. https://doi.org/10.3390/s21123955
Yang J-C, Lin C-J, You B-Y, Yan Y-L, Cheng T-H. RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs. Sensors. 2021; 21(12):3955. https://doi.org/10.3390/s21123955
Chicago/Turabian Style: Yang, Jung-Cheng, Chun-Jung Lin, Bing-Yuan You, Yin-Long Yan, and Teng-Hu Cheng. 2021. "RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs" Sensors 21, no. 12: 3955. https://doi.org/10.3390/s21123955
APA Style: Yang, J.-C., Lin, C.-J., You, B.-Y., Yan, Y.-L., & Cheng, T.-H. (2021). RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs. Sensors, 21(12), 3955. https://doi.org/10.3390/s21123955