GNSS/LiDAR/IMU Fusion Odometry Based on Tightly-Coupled Nonlinear Observer in Orchard
Abstract
1. Introduction
- (1)
- We employ a health-check mechanism to filter GNSS data and integrate it with the IMU, thereby facilitating the robot’s pose estimation in orchards characterized by sparse geometric features and bumpy terrain;
- (2)
- We propose a novel approach that enhances alignment stability in orchards by integrating IMU and GNSS information into scan-to-map registration using the point-to-point ICP algorithm, complemented with adaptive thresholding;
- (3)
- We use GNSS as absolute observational constraints to improve the robustness of LiDAR-based odometry, thereby compensating for the influence of initial values on the nonlinear estimator;
- (4)
- Extensive experiments conducted on various public datasets and our self-made orchard dataset demonstrate the robustness of our approach in maintaining precise and consistent odometry, even in challenging agricultural environments.
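The scan-to-map alignment named in contribution (2) builds on point-to-point ICP with an adaptive correspondence threshold. As a rough illustration of that building block only — not the authors’ implementation — a minimal ICP step in NumPy might look like the following; the brute-force nearest-neighbour search and the fixed iteration count are simplifying assumptions:

```python
import numpy as np

def icp_point_to_point(source, target, max_dist, iters=20):
    """Align a source scan to a target map with point-to-point ICP.

    Correspondences are gated by a distance threshold (max_dist), which in
    adaptive-threshold variants is tuned online from recent registration error.
    Returns the accumulated 4x4 homogeneous transform applied to the source.
    """
    T = np.eye(4)
    src = source.copy()
    for _ in range(iters):
        # Brute-force nearest neighbour in the target for every source point.
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        nn = d.argmin(axis=1)
        dist = d[np.arange(len(src)), nn]
        mask = dist < max_dist          # threshold rejects outlier matches
        if mask.sum() < 3:
            break
        p, q = src[mask], target[nn[mask]]
        # Closed-form rigid update via SVD (Kabsch/Umeyama).
        mp, mq = p.mean(0), q.mean(0)
        H = (p - mp).T @ (q - mq)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mq - R @ mp
        src = src @ R.T + t
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T
    return T
```

In practice a k-d tree replaces the brute-force search, and the threshold is adapted from the recent motion and registration residuals, in the spirit of KISS-ICP [28].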
2. System Description
Algorithm 1. GNSS-LiDAR-Inertial Odometry.
2.1. Initialization
- (1)
- Point Cloud Preprocessing. The dense 3D point cloud is collected by a mechanical LiDAR, such as an RS-LiDAR or Velodyne (10–20 Hz). To minimize the loss of information, we employ a 1 m³ box filter at the origin of the point cloud to remove points that potentially originate from the robot itself.
- (2)
- IMU Initial Processing. The IMU captures the robot’s linear acceleration and angular velocity at a frequency ranging from 100 Hz to 500 Hz. The mathematical model of the measurements $\hat{\boldsymbol{\omega}}_m$ and $\hat{\mathbf{a}}_m$ obtained from the IMU [29] is as follows: $$\hat{\boldsymbol{\omega}}_m = \boldsymbol{\omega}_m + \mathbf{b}^{\omega} + \mathbf{n}^{\omega}, \qquad \hat{\mathbf{a}}_m = \mathbf{R}_m^{\top}\left(\mathbf{a}_m - \mathbf{g}\right) + \mathbf{b}^{a} + \mathbf{n}^{a},$$ where the index $m$ denotes the $M$ measurements within the time stamps $t_k$ and $t_{k+1}$, $\mathbf{b}$ denotes the sensor bias, $\mathbf{n}$ represents the sensor white noise, and $\mathbf{g}$ is the rotated gravity vector. In a stationary state, the IMU continues to generate nonzero output values, which significantly impact the accuracy of the sensor and of the GLIO system. Hence, online estimation and compensation of the IMU bias errors are pivotal preprocessing steps. Online estimation of the IMU bias essentially entails averaging the data obtained by the sensor during a 3 s static period and subsequently subtracting this average from subsequent IMU data to rectify the sensor output.
- (3)
- GNSS Status Check and Online Transformation Correction. When the number of received satellites is less than four, the sensor state is unreliable, so such data are discarded. The GNSS position at time $k-1$ is represented as a vector $\mathbf{p}_{k-1} = [\text{lon},\ \text{lat},\ \text{alt}]^{\top}$ encompassing longitude, latitude, and altitude measurements. The orientation of the GNSS is determined by the IMU. Thus, the GNSS transform matrix can be defined as the following: $$\mathbf{T}^{g}_{k-1} = \begin{bmatrix} \mathbf{R}^{imu}_{k-1} & \mathbf{t}_{k-1} \\ \mathbf{0}^{\top} & 1 \end{bmatrix},$$ where $\mathbf{t}_{k-1}$ is the translation vector between the origin of the local world frame and the GNSS, and $\mathbf{p}_{k-1}$ is the point aligned with the local world coordinate frame. The initial transformation between the GNSS and the local coordinate frame is assumed to be represented as $\mathbf{T}_{0}$. Subsequently, the transformation matrix at time $k$ in the local world frame is as follows: $$\mathbf{T}_{k} = \mathbf{T}_{0}\,\mathbf{T}^{g}_{k}.$$
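The static-bias compensation described in step (2) — average a short stationary window, then subtract — can be sketched as follows. The function names and the gravity handling are illustrative assumptions, not the authors’ code:

```python
import numpy as np

def estimate_static_bias(gyro, accel, gravity=9.81):
    """Estimate gyro/accel biases from a static window (e.g. 3 s of samples).

    gyro, accel: (N, 3) arrays collected while the robot is stationary.
    At rest the true angular rate is zero, so the gyro bias is the sample
    mean; for the accelerometer, gravity is removed from the mean specific
    force before treating the remainder as bias.
    """
    b_gyro = gyro.mean(axis=0)
    a_mean = accel.mean(axis=0)
    # Assume the mean specific force points along gravity; subtract it.
    b_accel = a_mean - gravity * a_mean / np.linalg.norm(a_mean)
    return b_gyro, b_accel

def correct(sample, bias):
    """Subtract the estimated bias from a subsequent raw measurement."""
    return sample - bias
```

Subsequent raw samples are then debiased with `correct(sample, bias)` before being fed to the propagation step.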
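Step (3)’s satellite-count gate, and the mapping of longitude/latitude/altitude fixes into a local metric frame, can be illustrated with a small sketch. The equirectangular conversion below is an assumption for illustration — adequate at orchard scale — and not the paper’s exact transformation:

```python
import math

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius, metres

def gnss_usable(num_satellites, min_sats=4):
    """Discard fixes computed from fewer than four satellites."""
    return num_satellites >= min_sats

def lla_to_local(lon, lat, alt, origin):
    """Approximate local ENU offset of (lon, lat, alt) from an origin fix.

    Small-area equirectangular approximation: fine over a few hundred
    metres, not a substitute for a full geodetic library.
    """
    lon0, lat0, alt0 = origin
    east = math.radians(lon - lon0) * EARTH_R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * EARTH_R
    up = alt - alt0
    return east, north, up
```

With the IMU supplying orientation, the resulting ENU offset becomes the translation part of the GNSS transform matrix defined above.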
2.2. GNSS Healthy Check Mechanism
2.3. Point Cloud De-Skewing and Optimization Prior
2.4. Scan-to-Map Registration
2.5. Nonlinear Observer
3. Experimental Results
3.1. Experiment Setup
3.2. Performance on OrchardSet
- (1)
- We first evaluated the performance on individual fruit rows, selecting the rows in which the start and end points of the trajectory were located. The row containing the start point was labeled 01, and the row containing the end point was labeled 02;
- (2)
- We also assessed the odometry performance for complete rows of orchards, specifically focusing on trajectory 03.
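The tables in this section report absolute trajectory error (ATE), as produced by trajectory-evaluation tooling such as evo. A minimal sketch of the translational ATE — RMSE over timestamp-associated positions already expressed in a common frame; the alignment step is omitted here — is:

```python
import numpy as np

def ate_translation(est_xyz, gt_xyz):
    """Root-mean-square absolute trajectory error over matched positions.

    est_xyz, gt_xyz: (N, 3) position sequences associated by timestamp
    and aligned to a common frame beforehand.
    """
    err = np.linalg.norm(est_xyz - gt_xyz, axis=1)  # per-pose error norm
    return float(np.sqrt(np.mean(err ** 2)))        # RMSE
```

The rotational ATE is computed analogously from the per-pose relative rotation angles.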
3.3. Comparison to State-of-the-Art Systems on Public Datasets
3.4. Computation Efficiency
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Zhu, J.; Li, H.; Zhang, T. Camera, LiDAR, and IMU Based Multi-Sensor Fusion SLAM: A Survey. Tsinghua Sci. Technol. 2024, 29, 415–429. [Google Scholar] [CrossRef]
- Yin, H.; Xu, X.; Lu, S.; Chen, X.; Xiong, R.; Shen, S.; Stachniss, C.; Wang, Y. A Survey on Global LiDAR Localization: Challenges, Advances and Open Problems. Int. J. Comput. Vis. 2023, 132, 3139–3171. [Google Scholar] [CrossRef]
- Nilchan, N.; Supnithi, P.; Phakphisut, W. Improvement of Kalman Filter for GNSS/IMU Data Fusion with Measurement Bias Compensation. In Proceedings of the 2020 35th International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC), Nagoya, Japan, 3–6 July 2020. [Google Scholar]
- Hidayatullah, F.H.; Abdurohman, M.; Putrada, A.G. Accident Detection System for Bicycle Athletes Using GPS/IMU Integration and Kalman Filtered AHRS Method. In Proceedings of the 2021 International Conference Advancement in Data Science, E-learning and Information Systems (ICADEIS), Bali, Indonesia, 13 October 2021; pp. 1–6. [Google Scholar]
- De Miguel, G.; Goya, J.; Uranga, J.; Alvarado, U.; Adin, I.; Mendizabal, J. GNSS Complementary Positioning System Performance in Railway Domain. In Proceedings of the 2017 15th International Conference on ITS Telecommunications (ITST), Warsaw, Poland, 29–31 May 2017; pp. 1–7. [Google Scholar]
- Jouybari, A.; Ardalan, A.A.; Rezvani, M.-H. Experimental Comparison between Mahoney and Complementary Sensor Fusion Algorithm for Attitude Determination by Raw Sensor Data of Xsens IMU on Buoy. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2017, XLII-4/W4, 497–502. [Google Scholar] [CrossRef]
- Berkane, S.; Tayebi, A.; De Marco, S. A Nonlinear Navigation Observer Using IMU and Generic Position Information. Automatica 2021, 127, 109513. [Google Scholar] [CrossRef]
- Hashim, H.A.; Eltoukhy, A.E.E.; Vamvoudakis, K.G.; Abouheaf, M.I. Nonlinear Deterministic Observer for Inertial Navigation Using Ultra-Wideband and IMU Sensor Fusion. In Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, USA, 1 October 2023; pp. 3085–3090. [Google Scholar]
- Suzuki, T. Attitude-Estimation-Free GNSS and IMU Integration. IEEE Robot. Autom. Lett. 2024, 9, 1090–1097. [Google Scholar] [CrossRef]
- Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef]
- Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-Time. In Proceedings of the Robotics: Science and Systems X; Robotics: Science and Systems Foundation, Los Angeles, CA, USA, 12 July 2014. [Google Scholar]
- Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765. [Google Scholar]
- Gao, L.; Xia, X.; Zheng, Z.; Ma, J. GNSS/IMU/LiDAR Fusion for Vehicle Localization in Urban Driving Environments within a Consensus Framework. Mech. Syst. Signal Process. 2023, 205, 110862. [Google Scholar] [CrossRef]
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-Time Loop Closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278. [Google Scholar]
- Palieri, M.; Morrell, B.; Thakur, A.; Ebadi, K.; Nash, J.; Chatterjee, A.; Kanellakis, C.; Carlone, L.; Guaragnella, C.; Agha-mohammadi, A. LOCUS: A Multi-Sensor Lidar-Centric Solution for High-Precision Odometry and 3D Mapping in Real-Time. IEEE Robot. Autom. Lett. 2021, 6, 421–428. [Google Scholar] [CrossRef]
- Reinke, A.; Palieri, M.; Morrell, B.; Chang, Y.; Ebadi, K.; Carlone, L.; Agha-Mohammadi, A.-A. LOCUS 2.0: Robust and Computationally Efficient Lidar Odometry for Real-Time 3D Mapping. IEEE Robot. Autom. Lett. 2022, 7, 9043–9050. [Google Scholar] [CrossRef]
- Chen, K.; Lopez, B.T.; Agha-Mohammadi, A.-A.; Mehta, A. Direct LiDAR Odometry: Fast Localization with Dense Point Clouds. IEEE Robot. Autom. Lett. 2022, 7, 2000–2007. [Google Scholar] [CrossRef]
- Xu, W.; Zhang, F. FAST-LIO: A Fast, Robust LiDAR-Inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter. IEEE Robot. Autom. Lett. 2021, 6, 3317–3324. [Google Scholar] [CrossRef]
- Bai, C.; Xiao, T.; Chen, Y.; Wang, H.; Zhang, F.; Gao, X. Faster-LIO: Lightweight Tightly Coupled Lidar-Inertial Odometry Using Parallel Sparse Incremental Voxels. IEEE Robot. Autom. Lett. 2022, 7, 4861–4868. [Google Scholar] [CrossRef]
- Wu, Y.; Guadagnino, T.; Wiesmann, L.; Klingbeil, L.; Stachniss, C.; Kuhlmann, H. LIO-EKF: High Frequency LiDAR-Inertial Odometry Using Extended Kalman Filters. arXiv 2024, arXiv:2311.09887. [Google Scholar]
- Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-Coupled Lidar Inertial Odometry via Smoothing and Mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020; pp. 5135–5142. [Google Scholar]
- Xiao, H.; Han, Y.; Zhao, J.; Cui, J.; Xiong, L.; Yu, Z. LIO-Vehicle: A Tightly-Coupled Vehicle Dynamics Extension of LiDAR Inertial Odometry. IEEE Robot. Autom. Lett. 2022, 7, 446–453. [Google Scholar] [CrossRef]
- Park, C.; Moghadam, P.; Kim, S.; Elfes, A.; Fookes, C.; Sridharan, S. Elastic LiDAR Fusion: Dense Map-Centric Continuous-Time SLAM. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1206–1213. [Google Scholar]
- Ramezani, M.; Khosoussi, K.; Catt, G.; Moghadam, P.; Williams, J.; Kottege, N. Wildcat: Online Continuous-Time 3D Lidar-Inertial SLAM. arXiv 2022, arXiv:2205.12595. [Google Scholar]
- Knights, J.; Vidanapathirana, K.; Ramezani, M.; Sridharan, S.; Fookes, C.; Moghadam, P. Wild-Places: A Large-Scale Dataset for Lidar Place Recognition in Unstructured Natural Environments. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023. [Google Scholar]
- Dellenbach, P.; Deschaud, J.-E.; Jacquet, B.; Goulette, F. CT-ICP: Real-Time Elastic LiDAR Odometry with Loop Closure. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022. [Google Scholar]
- Park, C.; Moghadam, P.; Williams, J.; Kim, S.; Sridharan, S.; Fookes, C. Elasticity Meets Continuous-Time: Map-Centric Dense 3D LiDAR SLAM. IEEE Trans. Robot. 2022, 38, 978–997. [Google Scholar] [CrossRef]
- Vizzo, I.; Guadagnino, T.; Mersch, B.; Wiesmann, L.; Behley, J.; Stachniss, C. KISS-ICP: In Defense of Point-to-Point ICP—Simple, Accurate, and Robust Registration If Done the Right Way. IEEE Robot. Autom. Lett. 2023, 8, 1029–1036. [Google Scholar] [CrossRef]
- Chen, K.; Nemiroff, R.; Lopez, B.T. Direct LiDAR-Inertial Odometry: Lightweight LIO with Continuous-Time Motion Correction. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May 2023; pp. 3983–3989. [Google Scholar]
- Gao, Y.; Liu, S.; Atia, M.; Noureldin, A. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm. Sensors 2015, 15, 23286–23302. [Google Scholar] [CrossRef]
- Chiang, K.W.; Tsai, G.J.; Chang, H.W.; Joly, C.; El-Sheimy, N. Seamless Navigation and Mapping Using an INS/GNSS/Grid-Based SLAM Semi-Tightly Coupled Integration Scheme. Inf. Fusion 2019, 50, 181–196. [Google Scholar] [CrossRef]
- Liu, X.; Wen, W.; Hsu, L.-T. GLIO: Tightly-Coupled GNSS/LiDAR/IMU Integration for Continuous and Drift-Free State Estimation of Intelligent Vehicles in Urban Areas. IEEE Trans. Intell. Veh. 2023, 9, 1412–1422. [Google Scholar] [CrossRef]
- Li, S.; Li, X.; Wang, H.; Zhou, Y.; Shen, Z. Multi-GNSS PPP/INS/Vision/LiDAR Tightly Integrated System for Precise Navigation in Urban Environments. Inf. Fusion 2023, 90, 218–232. [Google Scholar] [CrossRef]
- Wu, W.; Zhong, X.; Wu, D.; Chen, B.; Zhong, X.; Liu, Q. LIO-Fusion: Reinforced LiDAR Inertial Odometry by Effective Fusion With GNSS/Relocalization and Wheel Odometry. IEEE Robot. Autom. Lett. 2023, 8, 1571–1578. [Google Scholar] [CrossRef]
- Tan, H.; Zhao, X.; Zhai, C. Design and experiments with a SLAM system for low-density canopy environments in greenhouses based on an improved Cartographer framework. Front. Plant Sci. 2024, 15, 1276799. [Google Scholar] [CrossRef]
- Tang, B.; Guo, Z.; Huang, C. A Fruit-Tree Mapping System for Semi-Structured Orchards Based on Multi-Sensor-Fusion SLAM. IEEE Access 2024. [Google Scholar] [CrossRef]
- Zhao, Z.; Zhang, Y.; Shi, J. Efficient and Adaptive LiDAR-Visual-Inertial Odometry for Agricultural Unmanned Ground Vehicle. Int. J. Adv. Robot. Syst. 2022, 19, 2. [Google Scholar] [CrossRef]
- Lopez, B.T. A Contracting Hierarchical Observer for Pose-Inertial Fusion. arXiv 2023, arXiv:2303.02777. [Google Scholar]
- Grupp, M. Evo: Python Package for the Evaluation of Odometry and SLAM. 2017. Available online: https://github.com/MichaelGrupp/evo (accessed on 1 June 2017).
- Hsu, L.-T.; Huang, F.; Ng, H.-F.; Zhang, G.; Zhong, Y.; Bai, X.; Wen, W. Hong Kong UrbanNav: An Open-Source Multisensory Dataset for Benchmarking Urban Navigation Algorithms. Navi 2023, 70, navi.602. [Google Scholar] [CrossRef]
- Yin, J.; Li, A.; Li, T.; Yu, W.; Zou, D. M2DGR: A Multi-Sensor and Multi-Scenario SLAM Dataset for Ground Robots. IEEE Robot. Autom. Lett. 2022, 7, 2266–2273. [Google Scholar] [CrossRef]
- Carlevaris-Bianco, N.; Ushani, A.K.; Eustice, R.M. University of Michigan North Campus Long-Term Vision and Lidar Dataset. Int. J. Robot. Res. 2016, 35, 1023–1035. [Google Scholar] [CrossRef]
- Geiger, A.; Lenz, P.; Urtasun, R. Are We Ready for Autonomous Driving? The KITTI Vision Benchmark Suite. In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), Providence, RI, USA, 16–21 June 2012. [Google Scholar]
| Sensor | X (m) | Y (m) | Z (m) | Roll | Pitch | Yaw |
|---|---|---|---|---|---|---|
| LiDAR | 0.0 | 0.0 | 0.0 | 0.0° | 0.0° | 0.0° |
| IMU | 0.280 | 0.0 | 0.623 | 180.0° | 0.0° | 180.0° |
| INS | 0.154 | 0.020 | 0.623 | 180.0° | 0.0° | 180.0° |
| Sequence | Method | Avg. tra. | Avg. rot. | ATE tra. | ATE rot. |
|---|---|---|---|---|---|
| **Experiment I** | | | | | |
| 01 | DLIO | 0.066 | 0.041 | 0.070 | 0.302 |
| 01 | KISS-ICP | 0.083 | 0.041 | 0.088 | 0.263 |
| 01 | Ours | 0.030 | 0.041 | 0.027 | 0.218 |
| 02 | DLIO | 0.074 | 0.041 | 0.099 | 0.213 |
| 02 | KISS-ICP | 0.087 | 0.040 | 0.095 | 0.171 |
| 02 | Ours | 0.031 | 0.046 | 0.033 | 0.201 |
| **Experiment II** | | | | | |
| 03 | DLIO | 0.074 | 0.501 | 0.213 | 0.834 |
| 03 | KISS-ICP | 0.092 | 0.502 | 0.266 | 0.834 |
| 03 | Ours | 0.037 | 0.050 | 0.068 | 0.856 |
| Method | Type | UrbanNav ATE tra. * | UrbanNav ATE rot. * | NCLT ATE tra. | NCLT ATE rot. | M2DGR ATE tra. | M2DGR ATE rot. |
|---|---|---|---|---|---|---|---|
| DLO [16] | LO | 0.873 | 2.023 | 2.680 | 2.828 | 0.192 | 2.134 |
| DLIO [29] | LIO | 1.788 | 2.000 | 6.983 | 2.800 | 122.973 | 2.426 |
| KISS-ICP [28] | LO | 7.936 | 2.469 | 3.717 | 2.828 | 6.983 | 2.155 |
| LIO-EKF [20] | LIO | 5.179 | 2.014 | 10.561 | 2.827 | 0.565 | 2.116 |
| Ours | GLIO | 2.202 | 1.967 | 1.978 | 2.828 | 0.160 | 2.123 |
Share and Cite
Sun, N.; Qiu, Q.; Li, T.; Ru, M.; Ji, C.; Feng, Q.; Zhao, C. GNSS/LiDAR/IMU Fusion Odometry Based on Tightly-Coupled Nonlinear Observer in Orchard. Remote Sens. 2024, 16, 2907. https://doi.org/10.3390/rs16162907