Multi-Sensor Fusion for Aerial Robots in Industrial GNSS-Denied Environments
Abstract
1. Introduction
2. Materials and Methods
2.1. Platform Description
2.2. MCL Algorithm
3. Results
3.1. Test Scenario
3.2. 3D Map
3.3. Public Dataset
3.4. Flight Test
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Sample Availability
Abbreviations
Abbreviation | Definition
---|---
2D | Two-dimensional
3D | Three-dimensional
AMCL | Adaptive Monte Carlo Localization
GNSS | Global Navigation Satellite System
IMU | Inertial Measurement Unit
LIDAR | Light Detection and Ranging
MCL | Monte Carlo Localization
RGB | Red, Green, Blue
RGB-D | Red, Green, Blue-Depth
ROS | Robot Operating System
UAV | Unmanned Aerial Vehicle
UWB | Ultra-Wideband
Sensor | Weight | Size | Range | Accuracy
---|---|---|---|---
Ouster LiDAR (https://ouster.com, accessed on 26 April 2021) | 447 g | Diameter: 85 mm; height: 73.5 mm | 105 m @ >90% prob.; 120 m @ >50% prob.; 40 m @ >90% prob.; 60 m @ >50% prob.; 0–0.25 m blockage detection | Lambertian targets: ±5 cm; retroreflectors: ±10 cm
Orbbec ASTRA Pro | 300 g | 165 × 30 × 40 mm | 0.6–8.0 m (optimal: 0.6–5.0 m) | ±1–3 mm @ 1 m; ±12.7 mm @ 2–3 m
100 Particles | X-Axis (m) | Y-Axis (m) | Z-Axis (m) | Yaw (degrees)
---|---|---|---|---
0 | 0.103 | 0.079 | 0.208 | 0.109
0.2 | 0.107 | 0.091 | 0.202 | 0.119
0.4 | 0.106 | 0.078 | 0.125 | 0.112
0.6 | 0.112 | 0.074 | 0.185 | 0.110
0.8 | 0.137 | 0.087 | 0.062 | 0.114
1 | 0.207 | 0.181 | 0.034 | 0.108
500 Particles | X-Axis (m) | Y-Axis (m) | Z-Axis (m) | Yaw (degrees)
---|---|---|---|---
0 | 0.138 | 0.092 | 0.088 | 0.133
0.2 | 0.131 | 0.101 | 0.041 | 0.152
0.4 | 0.158 | 0.095 | 0.033 | 0.133
0.6 | 0.177 | 0.121 | 0.058 | 0.142
0.8 | 0.197 | 0.126 | 0.042 | 0.136
1 | 0.249 | 0.157 | 0.054 | 0.119
Computational Load
Time (s) | 100 Particles (%) | 500 Particles (%) | Increase (%)
---|---|---|---
5 | 16.1875 | 18.9625 | 2.775 |
10 | 15.6125 | 18.1625 | 2.55 |
15 | 16.8625 | 18.1 | 1.2375 |
20 | 17.325 | 18.175 | 0.85 |
25 | 18.1625 | 19.1125 | 0.95 |
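The accuracy and computational-load tables above can be summarized with a short script (the values are copied verbatim from the tables; this is an illustrative sketch, not the authors' code):

```python
# Mean localization error and mean CPU load for 100 vs. 500 particles,
# using the values reported in the tables above.
from statistics import mean

# Per-axis errors (m) at each test point, copied from the two error tables.
err_100 = {"x": [0.103, 0.107, 0.106, 0.112, 0.137, 0.207],
           "y": [0.079, 0.091, 0.078, 0.074, 0.087, 0.181],
           "z": [0.208, 0.202, 0.125, 0.185, 0.062, 0.034]}
err_500 = {"x": [0.138, 0.131, 0.158, 0.177, 0.197, 0.249],
           "y": [0.092, 0.101, 0.095, 0.121, 0.126, 0.157],
           "z": [0.088, 0.041, 0.033, 0.058, 0.042, 0.054]}

# CPU load samples (%), copied from the computational-load table.
cpu_100 = [16.1875, 15.6125, 16.8625, 17.325, 18.1625]
cpu_500 = [18.9625, 18.1625, 18.1, 18.175, 19.1125]

for axis in "xyz":
    print(f"{axis}: 100 particles = {mean(err_100[axis]):.3f} m, "
          f"500 particles = {mean(err_500[axis]):.3f} m")
print(f"mean CPU increase: {mean(cpu_500) - mean(cpu_100):.2f} %")
```

On these numbers, raising the particle count from 100 to 500 reduces the mean Z-axis error substantially (from about 0.136 m to about 0.053 m) while X- and Y-axis errors stay in the same range, at the cost of roughly 1–3% additional CPU load.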
X-Axis (m) | Y-Axis (m) | Z-Axis (m) | Yaw (degrees)
---|---|---|---
0.36 | 0.39 | 0.17 | 10.313
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Carrasco, P.; Cuesta, F.; Caballero, R.; Perez-Grau, F.J.; Viguria, A. Multi-Sensor Fusion for Aerial Robots in Industrial GNSS-Denied Environments. Appl. Sci. 2021, 11, 3921. https://doi.org/10.3390/app11093921
Carrasco P, Cuesta F, Caballero R, Perez-Grau FJ, Viguria A. Multi-Sensor Fusion for Aerial Robots in Industrial GNSS-Denied Environments. Applied Sciences. 2021; 11(9):3921. https://doi.org/10.3390/app11093921
Chicago/Turabian Style: Carrasco, Paloma, Francisco Cuesta, Rafael Caballero, Francisco J. Perez-Grau, and Antidio Viguria. 2021. "Multi-Sensor Fusion for Aerial Robots in Industrial GNSS-Denied Environments" Applied Sciences 11, no. 9: 3921. https://doi.org/10.3390/app11093921
APA Style: Carrasco, P., Cuesta, F., Caballero, R., Perez-Grau, F. J., & Viguria, A. (2021). Multi-Sensor Fusion for Aerial Robots in Industrial GNSS-Denied Environments. Applied Sciences, 11(9), 3921. https://doi.org/10.3390/app11093921