A Novel Fuzzy Image-Based UAV Landing Using RGBD Data and Visual SLAM
Abstract
1. Introduction
- A novel fuzzy landing score map is proposed to reliably characterize suitable landing sites.
- To address the limited resolution of stereo range cameras, the monocular ORB-SLAM point cloud is exploited to identify the best subset of the point cloud for landing.
- A novel technique is suggested for adding synthetic world points that serve as corner features for IBVS.
- A new method is offered that integrates ORB-SLAM3 localization with image moment-based IBVS to perform the landing.
- Close-to-real-world ROS Gazebo simulations in unknown indoor and outdoor environments are conducted to verify the effectiveness of the landing pipeline.
2. Methodology
2.1. Dense Stereo Reconstruction
2.2. Visual Monocular ORB-SLAM3 System
2.3. Landing Zone Detection Algorithm
Algorithm 1: Obtaining the depth variance map. Input: depth image, window size. Output: DepthVarMap.
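The per-pixel depth variance in Algorithm 1 can be sketched with the identity Var[x] = E[x²] − (E[x])² applied over a sliding window. This is an illustrative reimplementation under that assumption, not the authors' code; the function name and window size are ours.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def depth_variance_map(depth: np.ndarray, window: int = 5) -> np.ndarray:
    """Per-pixel depth variance over a square window: Var = E[x^2] - E[x]^2."""
    depth = depth.astype(np.float64)
    mean = uniform_filter(depth, size=window)       # local E[x]
    mean_sq = uniform_filter(depth * depth, size=window)  # local E[x^2]
    return np.maximum(mean_sq - mean * mean, 0.0)   # clamp tiny negative round-off
```

Low values in this map indicate locally smooth depth, one ingredient of the landing score.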
Algorithm 2: Obtaining the flatness map. Input: depth image. Output: FlatnessMap.
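Algorithm 2's flatness map can be read as "distance to the nearest depth discontinuity", consistent with the Canny and distance-transform parameters listed later. The sketch below substitutes a simple gradient-magnitude threshold for the paper's Canny hysteresis step (an assumption on our part), and the threshold value is illustrative.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, sobel

def flatness_map(depth: np.ndarray, edge_thresh: float = 0.1) -> np.ndarray:
    """Flatness as normalized distance to the nearest depth edge.
    Gradient-threshold edges stand in for the paper's Canny detector."""
    d = depth.astype(np.float64)
    gx = sobel(d, axis=1)
    gy = sobel(d, axis=0)
    edges = np.hypot(gx, gy) > edge_thresh
    # distance_transform_edt measures distance to the nearest zero element,
    # so invert the mask: edge pixels become 0, smooth pixels accumulate distance.
    dist = distance_transform_edt(~edges)
    return dist / dist.max() if dist.max() > 0 else dist
```

Pixels far from any depth discontinuity score close to 1, marking them as flat candidates.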
Algorithm 3: Computing the inclination map. Input: point cloud patches. Output: InclinationMap.
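Per-patch inclination, as in Algorithm 3, is commonly obtained by fitting a least-squares plane to the patch and measuring the angle between its normal and the vertical. The helper below is a generic sketch of that idea, not the authors' implementation.

```python
import numpy as np

def patch_inclination(points: np.ndarray) -> float:
    """Angle (rad) between a patch's least-squares plane normal and vertical.
    points: (N, 3) array of x, y, z world points."""
    centered = points - points.mean(axis=0)
    # The plane normal is the right singular vector with the smallest
    # singular value (direction of least spread).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    cos_a = abs(normal[2]) / np.linalg.norm(normal)
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

A horizontal patch yields an angle near zero; steeper patches approach π/2.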
Algorithm 4 Procedure for obtaining the steepness map |
Input: Point Cloud Patches Output: SteepnessMap
|
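Algorithm 4 relies on a KNN radius search over the point cloud (the radius appears in the parameter table). One plausible realization, sketched here under our own definition of steepness as the height spread among neighbours within the search radius, uses a k-d tree over the horizontal coordinates:

```python
import numpy as np
from scipy.spatial import cKDTree

def steepness(points: np.ndarray, centers: np.ndarray, radius: float = 0.4) -> np.ndarray:
    """Per-center steepness: max height spread among neighbours within radius.
    A stand-in for the paper's KNN radius search; the radius value is illustrative."""
    tree = cKDTree(points[:, :2])  # search in the horizontal (x, y) plane
    out = np.zeros(len(centers))
    for i, c in enumerate(centers):
        idx = tree.query_ball_point(c[:2], r=radius)
        if idx:
            z = points[idx, 2]
            out[i] = z.max() - z.min()
    return out
```

A flat cloud gives zero steepness everywhere; isolated obstacles raise it locally.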
2.4. Fuzzy Landing Map
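A fuzzy landing map fuses the individual criterion maps (variance, flatness, inclination, steepness) into a single score. The paper's membership functions and rule base are not reproduced here; the sketch below shows only the generic idea of a t-norm aggregation (product) over normalized suitability maps, with all naming ours.

```python
import numpy as np

def fuzzy_landing_score(*criterion_maps: np.ndarray) -> np.ndarray:
    """Fuse suitability maps (1 = good, 0 = bad) with the product t-norm.
    Each map is min-max normalized before aggregation."""
    score = np.ones_like(criterion_maps[0], dtype=np.float64)
    for m in criterion_maps:
        rng = m.max() - m.min()
        norm = (m - m.min()) / rng if rng > 0 else np.ones_like(m, dtype=np.float64)
        score *= norm
    return score
```

Because the product t-norm is conservative, a site scoring poorly on any single criterion receives a low overall landing score.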
2.5. Finding the Best Points to Land
2.6. Reference Frame Attachment
2.7. Quadrotor’s Equations of Motion
2.8. Visual Data Dynamics
2.9. IBVS Control Design
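The paper builds its controller on image moments; without reproducing its specific feature set or gains, the underlying IBVS structure is the classical one, in which the camera velocity drives the feature error to zero through the interaction matrix:

```latex
% Generic IBVS structure (sketch; the paper's image-moment features,
% interaction matrix entries, and gains are not reproduced here):
\dot{\mathbf{s}} = \mathbf{L}_{\mathbf{s}}\,\mathbf{v}_c,
\qquad
\mathbf{e} = \mathbf{s}-\mathbf{s}^{*},
\qquad
\mathbf{v}_c = -\lambda\,\widehat{\mathbf{L}_{\mathbf{s}}}^{+}\,\mathbf{e}
```

Here \(\mathbf{s}\) is the visual feature vector, \(\mathbf{s}^{*}\) its desired value, \(\mathbf{L}_{\mathbf{s}}\) the interaction matrix (with \(\widehat{\cdot}^{+}\) the pseudo-inverse of its estimate), and \(\lambda>0\) a scalar gain.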
3. Results
3.1. The First Scenario
3.2. The Second Scenario
3.3. The Third Scenario
3.4. The Fourth Scenario
3.5. Sensitivity Analysis of Environmental Conditions
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Parameter | Value | Unit |
|---|---|---|
| UAV mass (with hull) | | kg |
| UAV moment of inertia | | |
| Color camera focal length | | pixel |
| Color camera principal point | | pixel |
| Radial/tangential distortion | ignored | |
| Experimental Indicator | Value | Unit |
|---|---|---|
| Average landing zone detection time | | ms |
| IBVS stabilization time | | s |
| Landing success rate | | Not Applicable |
| Distance error | | m |
| Parameter | Value | Unit |
|---|---|---|
| diag() | | dimensionless |
| diag() | | dimensionless |
| diag() | | dimensionless |
| | | dimensionless |
| | 2 | dimensionless |
| | | dimensionless |
| | 2 | dimensionless |
| | | dimensionless |
| Depth image variance kernel size in Algorithm 1 | | pixel |
| Canny lower and upper values in hysteresis thresholding in Algorithm 2 | | dimensionless |
| Distance transformation's mask size in Algorithm 2 | | pixel |
| KNN search radius in Algorithms 3 and 4 | | m |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sepahvand, S.; Amiri, N.; Masnavi, H.; Mantegh, I.; Janabi-Sharifi, F. A Novel Fuzzy Image-Based UAV Landing Using RGBD Data and Visual SLAM. Drones 2024, 8, 594. https://doi.org/10.3390/drones8100594