Visual-Inertial Odometry Using High Flying Altitude Drone Datasets
Abstract
1. Introduction
2. VINS-Fusion—A Stereo-Visual-Inertial Odometry Algorithm
2.1. Data Collection and Preprocessing
2.2. Initialisation
2.3. Estimation
2.4. Relocalisation
2.5. Global Pose Estimation
3. Platform Development
3.1. Hardware
3.2. Software
3.3. Calibration
4. Experiments
Algorithm 1 Experiment.
4.1. Drone Data Collection
4.2. Data Processing
4.3. Reference Trajectory
4.4. Error Metrics
5. Results and Discussion
5.1. Experiment Results
5.2. Assessment, Contribution and Future Research
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
API | application programming interface
ATE | absolute trajectory error
BRIEF | binary robust independent elementary features
BVLOS | beyond visual line-of-sight
FPS | frames per second
GNSS | global navigation satellite system
GPIO | general-purpose input/output
IMU | inertial measurement unit
LIDAR | light detection and ranging
PnP | perspective-n-point
RE | relative error
ROS | robot operating system
RTK | real-time kinematic
SfM | structure from motion
SLAM | simultaneous localisation and mapping
USB | universal serial bus
VIO | visual-inertial odometry
VISLAM | visual-inertial SLAM
VO | visual odometry
VSLAM | visual SLAM
References
- Davies, L.; Bolam, R.C.; Vagapov, Y.; Anuchin, A. Review of unmanned aircraft system technologies to enable beyond visual line-of-sight (BVLOS) operations. In Proceedings of the IEEE 2018 X International Conference on Electrical Power Drive Systems (ICEPDS), Novocherkassk, Russia, 3–6 October 2018; pp. 1–6.
- Poddar, S.; Kottath, R.; Karar, V. Motion Estimation Made Easy: Evolution and Trends in Visual Odometry. In Recent Advances in Computer Vision: Theories and Applications; Springer International Publishing: Cham, Switzerland, 2019; pp. 305–331.
- El-Mowafy, A.; Fashir, H.; Al Habbai, A.; Al Marzooqi, Y.; Babiker, T. Real-time determination of orthometric heights accurate to the centimeter level using a single GPS receiver: Case study. J. Surv. Eng. 2006, 132, 1–6.
- Uzodinma, V.; Nwafor, U. Degradation of GNSS Accuracy by Multipath and Tree Canopy Distortions in a School Environment. Asian J. Appl. Sci. 2018, 6.
- National Land Survey of Finland. New Steps in Nordic Collaboration against GNSS Interference. Available online: https://www.maanmittauslaitos.fi/en/topical_issues/new-steps-nordic-collaboration-against-gnss-interference (accessed on 28 August 2021).
- Morales-Ferre, R.; Richter, P.; Falletti, E.; de la Fuente, A.; Lohan, E.S. A survey on coping with intentional interference in satellite navigation for manned and unmanned aircraft. IEEE Commun. Surv. Tutor. 2019, 22, 249–291.
- Nex, F.; Armenakis, C.; Cramer, M.; Cucci, D.A.; Gerke, M.; Honkavaara, E.; Kukko, A.; Persello, C.; Skaloud, J. UAV in the advent of the twenties: Where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 2022, 184, 215–242.
- Shen, S.; Mulgaonkar, Y.; Michael, N.; Kumar, V. Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft MAV. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 4974–4981.
- Scaramuzza, D.; Fraundorfer, F. Visual Odometry [Tutorial]. IEEE Robot. Autom. Mag. 2011, 18, 80–92.
- Caballero, F.; Merino, L.; Ferruz, J.; Ollero, A. Vision-Based Odometry and SLAM for Medium and High Altitude Flying UAVs. J. Intell. Robot. Syst. 2008, 54, 137–161.
- Romero, H.; Salazar, S.; Santos, O.; Lozano, R. Visual odometry for autonomous outdoor flight of a quadrotor UAV. In Proceedings of the IEEE 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; pp. 678–684.
- Warren, M.; Corke, P.; Upcroft, B. Long-range stereo visual odometry for extended altitude flight of unmanned aerial vehicles. Int. J. Robot. Res. 2015, 35, 381–403.
- Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast semi-direct monocular visual odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 15–22.
- Forster, C.; Zhang, Z.; Gassner, M.; Werlberger, M.; Scaramuzza, D. SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems. IEEE Trans. Robot. 2017, 33, 249–265.
- Wang, R.; Schworer, M.; Cremers, D. Stereo DSO: Large-scale direct sparse visual odometry with stereo cameras. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 3903–3911.
- Pire, T.; Fischer, T.; Castro, G.; De Cristóforis, P.; Civera, J.; Jacobo Berlles, J. S-PTAM: Stereo Parallel Tracking and Mapping. Robot. Auton. Syst. 2017, 93, 27–42.
- Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Bailey, T.; Durrant-Whyte, H. Simultaneous localization and mapping (SLAM): Part II. IEEE Robot. Autom. Mag. 2006, 13, 108–117.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Qin, T.; Shen, S. Online Temporal Calibration for Monocular Visual-Inertial Systems. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 3662–3669.
- Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.; Tardós, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890.
- Chen, S.; Wen, C.Y.; Zou, Y.; Chen, W. Stereo visual inertial pose estimation based on feedforward-feedback loops. arXiv 2020, arXiv:2007.02250.
- Nguyen, T.M.; Cao, M.; Yuan, S.; Lyu, Y.; Nguyen, T.H.; Xie, L. Viral-fusion: A visual-inertial-ranging-lidar sensor fusion approach. IEEE Trans. Robot. 2021, 38, 958–977.
- Zhang, T.; Liu, C.; Li, J.; Pang, M.; Wang, M. A New Visual Inertial Simultaneous Localization and Mapping (SLAM) Algorithm Based on Point and Line Features. Drones 2022, 6.
- Song, S.; Lim, H.; Lee, A.J.; Myung, H. DynaVINS: A Visual-Inertial SLAM for Dynamic Environments. IEEE Robot. Autom. Lett. 2022, 7, 11523–11530.
- Steenbeek, A.; Nex, F. CNN-Based Dense Monocular Visual SLAM for Real-Time UAV Exploration in Emergency Conditions. Drones 2022, 6, 79.
- Delmerico, J.; Scaramuzza, D. A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 2502–2509.
- Lin, Y.; Gao, F.; Qin, T.; Gao, W.; Liu, T.; Wu, W.; Yang, Z.; Shen, S. Autonomous aerial navigation using monocular visual-inertial fusion. J. Field Robot. 2018, 35, 23–51.
- Gao, F.; Lin, Y.; Shen, S. Gradient-based online safe trajectory generation for quadrotor flight in complex environments. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 3681–3688.
- Gao, F.; Wang, L.; Zhou, B.; Zhou, X.; Pan, J.; Shen, S. Teach-Repeat-Replan: A Complete and Robust System for Aggressive Flight in Complex Environments. IEEE Trans. Robot. 2020, 36, 1526–1545.
- Khattar, F.; Luthon, F.; Larroque, B.; Dornaika, F. Visual localization and servoing for drone use in indoor remote laboratory environment. Mach. Vis. Appl. 2021, 32, 1–13.
- Luo, Y.; Li, Y.; Li, Z.; Shuang, F. MS-SLAM: Motion State Decision of Keyframes for UAV-Based Vision Localization. IEEE Access 2021, 9, 67667–67679.
- Slowak, P.; Kaniewski, P. Stratified Particle Filter Monocular SLAM. Remote Sens. 2021, 13, 3233.
- Zhan, Z.; Jian, W.; Li, Y.; Yue, Y. A SLAM map restoration algorithm based on submaps and an undirected connected graph. IEEE Access 2021, 9, 12657–12674.
- Couturier, A.; Akhloufi, M.A. A review on absolute visual localization for UAV. Robot. Auton. Syst. 2021, 135, 103666.
- George, A. Analysis of Visual-Inertial Odometry Algorithms for Outdoor Drone Applications. Master’s Thesis, Aalto University, Espoo, Finland, 2021. Available online: http://urn.fi/URN:NBN:fi:aalto-2021121910926 (accessed on 1 February 2022).
- Qin, T.; Pan, J.; Cao, S.; Shen, S. A general optimization-based framework for local odometry estimation with multiple sensors. arXiv 2019, arXiv:1901.03638.
- Tomasi, C.; Kanade, T. Detection and tracking of point features. Int. J. Comput. Vis. 1991, 9, 137–154.
- Agarwal, S.; Mierle, K.; The Ceres Solver Team. Ceres Solver. Available online: http://ceres-solver.org (accessed on 29 August 2021).
- Gálvez-López, D.; Tardós, J.D. Bags of Binary Words for Fast Place Recognition in Image Sequences. IEEE Trans. Robot. 2012, 28, 1188–1197.
- Calonder, M.; Lepetit, V.; Strecha, C.; Fua, P. BRIEF: Binary robust independent elementary features. In Computer Vision. ECCV 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 778–792.
- Intel Corporation. IntelRealSense Tracking Camera. Available online: https://www.intelrealsense.com/wp-content/uploads/2019/09/Intel_RealSense_Tracking_Camera_Datasheet_Rev004_release.pdf (accessed on 21 April 2021).
- Nerian Vision GmbH. Karmin3 Stereo Camera User Manual. Available online: https://nerian.com/nerian-content/downloads/manuals/karmin3/karmin3_manual_v1_1.pdf (accessed on 21 April 2021).
- Nerian Vision GmbH. SceneScan / SceneScan Pro User Manual. Available online: https://nerian.com/nerian-content/downloads/manuals/scenescan/scenescan_manual_v1_14.pdf (accessed on 21 April 2021).
- ROS. About ROS. Available online: https://www.ros.org/about-ros/ (accessed on 12 August 2021).
- Basler. acA2440-75uc. Available online: https://docs.baslerweb.com/aca2440-75uc (accessed on 21 June 2021).
- Xsens. MTi 600-Series User Manual. Available online: https://mtidocs.xsens.com/mti-600-series-user-manual (accessed on 21 June 2021).
- GIGA-BYTE Technology Co., Ltd. GB-BSi5H-6200-B2-IW (Rev. 1.0). Available online: https://www.gigabyte.com/Mini-PcSystem/GB-BSi5H-6200-B2-IW-rev-10#ov (accessed on 28 July 2021).
- FUJIFILM Corporation. HF-XA-5M Series. Available online: https://www.fujifilm.com/us/en/business/optical-devices/optical-devices/machine-vision-lens/hf-xa-5m-series#HF01 (accessed on 28 July 2021).
- Intel Corporation. USB 3.0* Radio Frequency Interference Impact on 2.4 GHz Wireless Devices. Available online: https://www.usb.org/sites/default/files/327216.pdf (accessed on 21 June 2021).
- Basler. pylon-ROS-camera. Available online: https://github.com/basler/pylon-ros-camera (accessed on 8 March 2021).
- Xsens. Xsens MTI ROS node. Available online: https://github.com/xsens/xsens_mti_ros_node (accessed on 8 March 2021).
- Koivula, H.; Laaksonen, A.; Lahtinen, S.; Kuokkanen, J.; Marila, S. Finnish permanent GNSS network, FinnRef. In Proceedings of the FIG Working Week, Helsinki, Finland, 29 May–2 June 2017.
- ETHZ ASL. The Kalibr visual-inertial calibration toolbox. Available online: https://github.com/ethz-asl/kalibr (accessed on 8 March 2021).
- Gaowenliang. imu_utils. Available online: https://github.com/gaowenliang/imu_utils (accessed on 8 March 2021).
- Furgale, P.; Rehder, J.; Siegwart, R. Unified temporal and spatial calibration for multi-sensor systems. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1280–1286.
- Rehder, J.; Nikolic, J.; Schneider, T.; Hinzmann, T.; Siegwart, R. Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 4304–4311.
- Woodman, O.J. An Introduction to Inertial Navigation; Technical Report UCAM-CL-TR-696; University of Cambridge, Computer Laboratory: Cambridge, UK, 2007.
- Olson, E. AprilTag: A robust and flexible visual fiducial system. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 3400–3407.
- SPH Engineering. Ground Station Software|UgCS PC Mission Planing. Available online: https://www.ugcs.com/ (accessed on 29 August 2021).
- Agisoft. Agisoft Metashape User Manual, Professional Edition, Version 1.7; Available online: https://www.agisoft.com/pdf/metashape-pro_1_7_en.pdf (accessed on 28 July 2021).
- Elkhrachy, I. Accuracy Assessment of Low-Cost Unmanned Aerial Vehicle (UAV) Photogrammetry. Alex. Eng. J. 2021, 60, 5579–5590.
- Topcon Positioning Systems, Inc. HiPer HR. Available online: https://www.topconpositioning.com/gnss/gnss-receivers/hiper-hr (accessed on 28 July 2021).
- Topcon Positioning Systems, Inc. FC-5000 Field Controller. Available online: https://www.topconpositioning.com/support/products/fc-5000-field-controller (accessed on 28 July 2021).
- Sturm, J.; Engelhard, N.; Endres, F.; Burgard, W.; Cremers, D. A benchmark for the evaluation of RGB-D SLAM systems. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 573–580.
- Geiger, A.; Lenz, P.; Urtasun, R. Are we ready for autonomous driving? The KITTI vision benchmark suite. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; pp. 3354–3361.
- Zhang, Z.; Scaramuzza, D. A tutorial on quantitative trajectory evaluation for visual (-inertial) odometry. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 7244–7251.
- Umeyama, S. Least-squares estimation of transformation parameters between two point patterns. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 376–380.
- Robotics and Perception Group. rpg_trajectory_evaluation—Toolbox for quantitative trajectory evaluation of VO/VIO. Available online: https://github.com/uzh-rpg/rpg_trajectory_evaluation (accessed on 18 August 2021).
- Warren, M.; Upcroft, B. High altitude stereo visual odometry. In Proceedings of the Robotics: Science and Systems IX, Berlin, Germany, 24–28 June 2013; pp. 1–8.
- Jeon, J.; Jung, S.; Lee, E.; Choi, D.; Myung, H. Run your visual-inertial odometry on NVIDIA Jetson: Benchmark tests on a micro aerial vehicle. IEEE Robot. Autom. Lett. 2021, 6, 5332–5339.
Dataset | Altitude (m) | Flight Speeds (m/s) | Exposure Time (µs) | External Conditions
---|---|---|---|---
Dataset 1 | 40 | 2, 3, 4 | 500 | gentle breeze
Dataset 2 | 60 | 2, 3, 4 | 1000 | moderate breeze
Dataset 3 | 80 | 2, 3, 4 | 1000 | cloudy, moderate breeze
Dataset 4 | 100 | 2, 3, 4 | 1200 | cloudy, strong breeze
Feature Tracking Parameters | Value | Optimisation Parameters | Value |
---|---|---|---|
Max number of tracked features | 150 | Max solver iteration time | 0.04 s |
Min distance between two features | 30 px | Max solver iterations | 8 |
RANSAC threshold | 1 px | Keyframe parallax threshold | 10 px |
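The parameters in the table above correspond to fields of the VINS-Fusion configuration file. The fragment below is an illustrative sketch, not the authors' actual file: the key names follow the public VINS-Fusion repository, the values are taken from the table, and the comments are ours.

```yaml
# feature tracker
max_cnt: 150             # max number of tracked features
min_dist: 30             # min distance between two features (px)
F_threshold: 1.0         # RANSAC threshold (px)

# sliding-window optimisation
max_solver_time: 0.04    # max solver iteration time
max_num_iterations: 8    # max solver iterations
keyframe_parallax: 10.0  # keyframe selection parallax (px)
```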
Speed | Estimation | ATE (40 m) | RE (40 m) | ATE (60 m) | RE (60 m) | ATE (80 m) | RE (80 m) | ATE (100 m) | RE (100 m)
---|---|---|---|---|---|---|---|---|---
2 m/s | stereo-VO | 9.944 | 2.291 | 15.668 | 3.651 | 62.415 | 14.276 | 101.584 | 9.006
2 m/s | mono-VIO | – | – | 55.814 | 10.394 | 12.093 | 1.634 | 18.409 | 2.582
2 m/s | stereo-VIO | 21.689 | 3.627 | 10.563 | 0.949 | 4.362 | 1.414 | 12.861 | 1.497
3 m/s | stereo-VO | 16.932 | 6.436 | 14.092 | 3.127 | 56.992 | 8.953 | 123.113 | 9.393
3 m/s | mono-VIO | 9.462 | 3.575 | 16.547 | 2.182 | 6.741 | 4.483 | 20.302 | 4.01
3 m/s | stereo-VIO | 2.554 | 2.638 | 4.744 | 0.862 | 7.397 | 4.034 | 12.459 | 4.069
4 m/s | stereo-VO | 16.183 | 2.454 | 41.240 | 5.731 | 71.668 | 3.651 | 113.007 | 3.388
4 m/s | mono-VIO | 9.957 | 2.323 | 15.495 | 4.883 | 20.256 | 2.27 | 17.909 | 6.164
4 m/s | stereo-VIO | 4.636 | 1.312 | 2.186 | 3.214 | 9.245 | 1.819 | 9.544 | 4.435
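The ATE values in the table are obtained after aligning each estimated trajectory to the reference with a least-squares similarity transform (Umeyama), following the evaluation methodology and toolbox cited in the references. The sketch below illustrates that computation; it is not the toolbox's API, and the function names are ours.

```python
import numpy as np

def umeyama_alignment(est, gt):
    """Least-squares similarity transform (s, R, t) mapping `est` onto `gt`
    (Umeyama, 1991). Inputs are (n, 3) arrays of matched positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    e_c, g_c = est - mu_e, gt - mu_g           # centred point sets
    n = est.shape[0]
    cov = g_c.T @ e_c / n                      # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                         # guard against a reflection
    R = U @ S @ Vt
    var_e = (e_c ** 2).sum() / n               # variance of the estimate
    s = np.trace(np.diag(D) @ S) / var_e
    t = mu_g - s * R @ mu_e
    return s, R, t

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE) after similarity alignment."""
    s, R, t = umeyama_alignment(est, gt)
    aligned = s * est @ R.T + t                # apply the recovered transform
    return float(np.sqrt(((aligned - gt) ** 2).sum(axis=1).mean()))
```

The relative error (RE) reported alongside ATE is computed analogously over sub-trajectories of fixed length rather than the full track, which makes it less sensitive to where along the flight the drift accumulates.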
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
George, A.; Koivumäki, N.; Hakala, T.; Suomalainen, J.; Honkavaara, E. Visual-Inertial Odometry Using High Flying Altitude Drone Datasets. Drones 2023, 7, 36. https://doi.org/10.3390/drones7010036