Autonomous UAV Landing and Collision Avoidance System for Unknown Terrain Utilizing Depth Camera with Actively Actuated Gimbal
Abstract
1. Introduction
- A new landing procedure compatible with a close-range imaging sensor in the form of an RGB-D camera. The algorithm is lightweight enough to run onboard an SBC with limited computing power, such as the Jetson Nano.
- Use of gimbal motion during the landing procedure to counter the camera's narrow field of view.
- Integration of landing-spot detection with an uncooperative collision-avoidance strategy. Constant flight-path monitoring enhances safety: the UAV can react to an obstacle at any step of the landing.
2. The State of the Art in Autonomous Landing
2.1. Survey Papers
2.2. Selected Individual Contributions
2.3. Synthesis and Open Questions
3. Proposed Procedure
3.1. Reasoning
3.2. Procedure
4. Materials and Methods
4.1. UAV Construction
1. Flight controller (green rectangle): Pixhawk 2.1—CubePilot Cube Orange on a carrier board [49].
2. Onboard computer (red rectangle): Nvidia Jetson Nano 4GB A02 [50].
3. Gimbal with depth camera: STorM32-based gimbal [51] carrying an Intel RealSense D435i depth camera [52].
4. GPS antenna (orange rectangle): HEX/ProfiCNC Here2 GPS [53]—now discontinued, with newer versions available.
4.2. Software Architecture
4.3. UAV State Module
- Limited camera range: Small angular errors do not result in significant distance errors.
- No SLAM or odometry: All calculations are performed in a single frame, limiting error accumulation. The UAV does not require the precision of a robotic arm used in manufacturing.
- Flight controller stability: The flight controller, equipped with redundant IMUs, compass, and barometer, performs pose estimation using EKF and provides a stable base pose. The rest of the UAV is treated as a rigid chain. Yaw drift was minimal and did not cause issues. If yaw drift occurs, it can be minimized through manufacturer-recommended calibration procedures, including thermal calibration of IMUs.
- Landing candidate tracking: The UAV points the camera at the landing-spot candidate to keep it centered in the image. Inaccurate chain transformations may cause misalignment, but as long as the rotation direction is correct, the drift settles at a constant offset. Small offsets are acceptable, since enough space around the landing spot remains visible.
- Gimbal movement threshold: The gimbal moves only when the landing spot is sufficiently far from the image center. This deadband limits unnecessary camera motion and reduces the impact of minor inaccuracies; a minimal sketch of this logic follows the list.
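A minimal sketch of this centering-with-deadband behavior, assuming a pinhole camera model and a pitch/yaw gimbal; the function name, gain, and threshold value are illustrative assumptions, not taken from the paper's implementation:

```python
import numpy as np

def gimbal_correction(spot_px, image_size, fov_deg, deadband_px=40, gain=0.5):
    """Return (pitch, yaw) increments in degrees that re-center the
    landing-spot candidate, or (0.0, 0.0) inside the deadband.

    spot_px    -- (u, v) pixel position of the candidate in the image
    image_size -- (width, height) of the image in pixels
    fov_deg    -- (horizontal, vertical) camera field of view in degrees
    """
    w, h = image_size
    # Pixel offset of the candidate from the image center.
    du = spot_px[0] - w / 2.0
    dv = spot_px[1] - h / 2.0
    # Move the gimbal only when the spot is sufficiently far from the
    # center; this suppresses jitter from small chain-transform errors.
    if np.hypot(du, dv) < deadband_px:
        return 0.0, 0.0
    # Small-angle approximation: pixels -> degrees via the per-pixel FOV.
    yaw = gain * du * fov_deg[0] / w     # positive du: rotate right
    pitch = -gain * dv * fov_deg[1] / h  # positive dv: tilt down (sign depends on mount)
    return pitch, yaw
```

Because corrections are proportional and residual offsets inside the deadband are tolerated, small chain-transformation errors settle into a constant offset rather than accumulating, matching the tracking behavior described above.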
4.4. Computer Vision Module
4.5. Anticollision Module
4.6. Debug Visualization Module
4.7. Autopilot
- INVALID—Pseudo-state indicating a request for an unknown state. Triggers an error log entry and keeps the current state unchanged (see the dispatch sketch after this list).
- SELF_DISCOVERY—The vehicle listens for flight controller traffic and updates its own state; sets the home position.
- IDLE—Passive waiting.
- TAKEOFF—Takes control of the vehicle, arms the motors, and climbs vertically.
- HOVER—Holds position at the current location.
- PATROL—Flies a circle of a specified radius around the current position.
- GIMBAL_SWEEP—Rotates the gimbal through a full sweep.
- GO_COORDS—Flies to specified coordinates.
- GO_RANDOM_COORDS—Flies to random (nearby) coordinates.
- GO_HOME—Returns to the initial (home) position.
- TRANSLATE—Moves by an offset relative to the current position.
- DESCEND—Performs a slow descent; used when landing outside the camera’s working range.
- LAND—Commands landing according to the flight controller’s internal procedure.
- LAND3D—Handles vision-guided landing; listens for messages from the execution app.
- GIMBAL_ON_LAND_SWEEP—Service/test function; performs the gimbal-sweep procedure used during live tests.
- WAIT_ON_LAND—Service/test function; ground-test equivalent of HOVER.
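The list above maps naturally onto an enum-driven dispatch loop. The following sketch shows one plausible structure, including the INVALID pseudo-state handling from the first item; the class layout, handler naming, and logging calls are assumptions for illustration, not the authors' code:

```python
from enum import Enum, auto
import logging

class State(Enum):
    INVALID = auto()  # pseudo-state: request for an unknown state
    SELF_DISCOVERY = auto()
    IDLE = auto()
    TAKEOFF = auto()
    HOVER = auto()
    PATROL = auto()
    GIMBAL_SWEEP = auto()
    GO_COORDS = auto()
    GO_RANDOM_COORDS = auto()
    GO_HOME = auto()
    TRANSLATE = auto()
    DESCEND = auto()
    LAND = auto()
    LAND3D = auto()
    GIMBAL_ON_LAND_SWEEP = auto()
    WAIT_ON_LAND = auto()

class Autopilot:
    def __init__(self) -> None:
        self.state = State.IDLE

    def request_state(self, name: str) -> None:
        """Resolve a state request by name; unknown names map to INVALID,
        which logs an error and leaves the current state unchanged."""
        requested = State.__members__.get(name, State.INVALID)
        if requested is State.INVALID:
            logging.error("Unknown state %r requested; staying in %s",
                          name, self.state.name)
            return
        self.state = requested

    def step(self) -> None:
        """Run one control-rate tick of the current state's handler."""
        handler = getattr(self, "on_" + self.state.name.lower(), None)
        if handler is not None:
            handler()

    def on_idle(self) -> None:
        pass  # passive waiting

    def on_hover(self) -> None:
        pass  # hold position at the current location
```

Requests from the execution app (e.g., during LAND3D) then reduce to request_state calls, so a malformed request can never leave the vehicle without a defined behavior.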
5. Testing Environment
5.1. Simulation Environment
5.2. Laboratory Environment
6. Simulation Experiments
6.1. Landing Accuracy Check
6.2. Repeated Landing with Static Obstacles
6.3. Intrusion Test
7. Laboratory Experiments
- Scenario Day/Night—Change in Lighting (Preliminary Test). This test checks whether the system can operate effectively in both day and night conditions. The system is considered illumination-invariant if the landing sites identified under both lighting conditions overlap. This test also serves as a repeatability check—demonstrating the system's tendency to prefer similar landing sites in similar environments, resulting in clustering. The persistence of this effect was verified on the real UAV (Figure 23b).
- Scenario 1—Single Obstacle. The approach point identified during the day/night/repeatability tests is now obstructed by a single obstacle. The vision algorithm must reject this point and select a new landing site (Figure 23c).
- Scenario 2—Two Obstacles. The new landing site chosen in Scenario 1 is now blocked by a second obstacle. The algorithm must relocate the landing site to a third, distinct position (Figure 23d).
- Scenario 3—Two Obstacles with Collision-Avoidance Trigger. Both previous obstacles remain, and a hanging marker is placed near (but not covering) the Scenario 2 landing site to trigger the anti-collision module. The system must select yet another, entirely new landing site (Figure 23e).
- Scenario 4—Forced Landing Zone. Obstacles are arranged from the beginning to constrain the free area, forcing the algorithm to find an admissible approach point within a narrow corridor, rather than reacting to obstacles added later (Figure 23f).
8. Results
8.1. Simulation—Landing Accuracy Check
8.2. Simulation—Static Obstacles
8.3. Simulation—Intrusions
8.4. Laboratory Results
9. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
UAV | Unmanned Aerial Vehicle
ENU | East-North-Up
NED | North-East-Down
DTM | Digital Terrain Model
DSM | Digital Surface Model
SBC | Single-Board Computer
GPS | Global Positioning System
GNSS | Global Navigation Satellite System
References
- Ansar, A.; Theodore, C.; Rowley, D.; Hubbard, D.; Matthies, L.; Goldberg, S. Flight Trials of a Rotorcraft Unmanned Aerial Vehicle Landing Autonomously at Unprepared Sites. Technical Report. 2006. Available online: https://www.researchgate.net/publication/252103988 (accessed on 11 June 2025).
- Kendoul, F. Survey of Advances in Guidance, Navigation, and Control of Unmanned Rotorcraft Systems. J. Field Robot. 2012, 29, 315–378. [Google Scholar] [CrossRef]
- Gautam, A.; Sujit, P.B.; Saripalli, S. A Survey of Autonomous Landing Techniques for UAVs. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1210–1218. [Google Scholar] [CrossRef]
- Kong, W.; Zhou, D.; Zhang, D.; Zhang, J. Vision-Based Autonomous Landing System for Unmanned Aerial Vehicle: A Survey. In Proceedings of the 2014 International Conference on Multisensor Fusion and Information Integration for Intelligent Systems (MFI), Beijing, China, 28–29 September 2014. [Google Scholar]
- Jin, S.; Zhang, J.; Shen, L.; Li, T. On-Board Vision Autonomous Landing Techniques for Quadrotor: A Survey. In Proceedings of the 2016 35th Chinese Control Conference (CCC), Chengdu, China, 27–29 July 2016. [Google Scholar]
- Loureiro, G.; Dias, A.; Martins, A. Survey of Approaches for Emergency Landing Spot Detection with Unmanned Aerial Vehicles. In Robots in Human Life—Proc. CLAWAR 2020; World Scientific: Singapore, 2020; pp. 129–136. [Google Scholar] [CrossRef]
- Loureiro, G.; Dias, A.; Martins, A.; Almeida, J. Emergency Landing Spot Detection Algorithm for Unmanned Aerial Vehicles. Remote Sens. 2021, 13, 1930. [Google Scholar] [CrossRef]
- Shah Alam, M.; Oluoch, J. A Survey of Safe Landing Zone Detection Techniques for Autonomous Unmanned Aerial Vehicles (UAVs). Expert Syst. Appl. 2021, 179, 115091. [Google Scholar] [CrossRef]
- Kakaletsis, E.; Symeonidis, C.; Tzelepi, M.; Mademlis, I.; Tefas, A.; Nikolaidis, N.; Pitas, I. Computer Vision for Autonomous UAV Flight Safety: An Overview and a Vision-Based Safe Landing Pipeline Example; Association for Computing Machinery: New York, NY, USA, 2021. [Google Scholar]
- Xin, L.; Tang, Z.; Gai, W.; Liu, H. Vision-Based Autonomous Landing for the UAV: A Review. Aerospace 2022, 9, 634. [Google Scholar] [CrossRef]
- Hubbard, D.; Morse, B.; Theodore, C.; Tischler, M.; McLain, T. Performance Evaluation of Vision-Based Navigation and Landing on a Rotorcraft Unmanned Aerial Vehicle. In Proceedings of the 2007 IEEE Workshop on Applications of Computer Vision (WACV ’07), Austin, TX, USA, 21–22 February 2007. [Google Scholar]
- Scherer, S.; Chamberlain, L.; Singh, S. Autonomous Landing at Unprepared Sites by a Full-Scale Helicopter. Robot. Auton. Syst. 2012, 60, 1545–1562. [Google Scholar] [CrossRef]
- Scherer, S.; Chamberlain, L.; Singh, S. Online Assessment of Landing Sites. In AIAA Infotech@Aerospace 2010; AIAA: Reston, VA, USA, 2010. [Google Scholar] [CrossRef]
- Yan, L.; Qi, J.; Wang, M.; Wu, C.; Xin, J. A Safe Landing Site Selection Method of UAVs Based on LiDAR Point Clouds. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020. [Google Scholar] [CrossRef]
- Yang, L.; Wang, C.; Wang, L. Autonomous UAVs Landing Site Selection from Point Cloud in Unknown Environments. ISA Trans. 2022, 130, 610–628. [Google Scholar] [CrossRef]
- Liu, F.; Shan, J.; Xiong, B.; Fang, Z. A Real-Time and Multi-Sensor-Based Landing Area Recognition System for UAVs. Drones 2022, 6, 118. [Google Scholar] [CrossRef]
- Chatzikalymnios, E.; Moustakas, K. Landing Site Detection for Autonomous Rotor Wing UAVs Using Visual and Structural Information. J. Intell. Robot. Syst. 2022, 104. [Google Scholar] [CrossRef]
- Mittal, M.; Mohan, R.; Burgard, W.; Valada, A. Vision-Based Autonomous UAV Navigation and Landing for Urban Search and Rescue. arXiv 2019, arXiv:1906.01304. [Google Scholar] [CrossRef]
- Mittal, M.; Valada, A.; Burgard, W. Vision-Based Autonomous Landing in Catastrophe-Struck Environments. arXiv 2018, arXiv:1809.05700. [Google Scholar]
- Mukadam, K.; Sinh, A.; Karani, R. Detection of Landing Areas for Unmanned Aerial Vehicles. In Proceedings of the 2nd International Conference on Computing, Communication, Control and Automation (ICCUBEA 2016), Pune, India, 12–13 August 2016; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar] [CrossRef]
- Kakaletsis, E.; Nikolaidis, N. Potential UAV Landing Sites Detection through Digital Elevation Models Analysis. arXiv 2021, arXiv:2107.06921. [Google Scholar] [CrossRef]
- Khattak, S.; Papachristos, C.; Alexis, K. Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments. In Robotics: Science and Systems 2018; Springer: Cham, Switzerland, 2018; pp. 529–540. [Google Scholar]
- Loureiro, G.d.S.M. Emergency Landing Spot Detection for Unmanned Aerial Vehicle; Springer: Cham, Switzerland, 2020. [Google Scholar]
- Cheng, H.-W.; Chen, T.-L.; Tien, C.-H. Motion Estimation by Hybrid Optical Flow Technology for UAV Landing in an Unvisited Area. Sensors 2019, 19, 1380. [Google Scholar] [CrossRef] [PubMed]
- Rosa, L.; Hamel, T.; Mahony, R.; Samson, C. Optical-Flow-Based Strategies for Landing VTOL UAVs in Cluttered Environments; Technical Report; Elsevier: Amsterdam, The Netherlands, 2014. [Google Scholar]
- Li, X. A Software Scheme for UAV’s Safe Landing Area Discovery. AASRI Procedia 2013, 4, 230–235. [Google Scholar] [CrossRef]
- Bektash, O.; Naundrup, J.J.; la Cour-Harbo, A. Analyzing Visual Imagery for Emergency Drone Landing on Unknown Environments. Int. J. Micro Air Veh. 2022, 14. [Google Scholar] [CrossRef]
- Kaljahi, M.A.; Shivakumara, P.; Idris, M.Y.I.; Anisi, M.H.; Lu, T.; Blumenstein, M.; Noor, N.M. An Automatic Zone Detection System for Safe Landing of UAVs. Expert Syst. Appl. 2019, 122, 319–333. [Google Scholar] [CrossRef]
- Hinzmann, T.; Stastny, T.; Cadena, C.; Siegwart, R.; Gilitschenski, I. Free LSD: Prior-Free Visual Landing Site Detection for Autonomous Planes; Technical Report; 2018. Available online: www.asl.ethz.ch (accessed on 15 June 2025).
- Kikumoto, C.; Harimoto, Y.; Isogaya, K.; Yoshida, T.; Urakubo, T. Landing Site Detection for UAVs Based on CNNs Classification and Optical Flow from Monocular Camera Images. J. Robot. Mechatron. 2021, 33, 292–300. [Google Scholar] [CrossRef]
- Rojas-Perez, L.O.; Munguia-Silva, R.; Martinez-Carranza, J. Real-Time Landing Zone Detection for UAVs Using Single Aerial Images; Technical Report; 2018. Available online: https://youtu.be/ (accessed on 22 June 2025).
- Mitroudas, T.; Balaska, V.; Psomoulis, A.; Gasteratos, A. Embedded Light-Weight Approach for Safe Landing in Populated Areas. arXiv 2023, arXiv:2302.14445. [Google Scholar]
- Wubben, J.; Fabra, F.; Calafate, C.T.; Krzeszowski, T.; Marquez-Barja, J.M.; Cano, J.C.; Manzoni, P. Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition. Electronics 2019, 8, 1532. [Google Scholar] [CrossRef]
- García-Pulido, J.A.; Pajares, G.; Dormido, S. UAV Landing Platform Recognition Using Cognitive Computation Combining Geometric Analysis and Computer Vision Techniques. Cogn. Comput. 2022, 15, 392–412. [Google Scholar] [CrossRef]
- Gautam, A.; Sujit, P.B.; Saripalli, S. Autonomous Quadrotor Landing Using Vision and Pursuit Guidance. IFAC-Pap. Online 2017, 50, 10501–10506. [Google Scholar] [CrossRef]
- Lin, J.; Wang, Y.; Miao, Z.; Zhong, H.; Fierro, R. Low-Complexity Control for Vision-Based Landing of Quadrotor UAV on Unknown Moving Platform. IEEE Trans. Ind. Inf. 2022, 18, 5348–5358. [Google Scholar] [CrossRef]
- Yang, T.; Ren, Q.; Zhang, F.; Xie, B.; Ren, H.; Li, J.; Zhang, Y. Hybrid Camera Array-Based UAV Auto-Landing on Moving UGV in GPS-Denied Environment. Remote Sens. 2018, 10, 1829. [Google Scholar] [CrossRef]
- Safadinho, D.; Ramos, J.; Ribeiro, R.; Filipe, V.; Barroso, J.; Pereira, A. UAV Landing Using Computer Vision Techniques for Human Detection. Sensors 2020, 20, 613. [Google Scholar] [CrossRef] [PubMed]
- Yang, S.; Scherer, S.A.; Schauwecker, K.; Zell, A. Autonomous Landing of MAVs on an Arbitrarily Textured Landing Site Using Onboard Monocular Vision. J. Intell. Robot. Syst. 2014, 74, 27–43. [Google Scholar] [CrossRef]
- Lin, S.; Jin, L.; Chen, Z. Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments. Sensors 2021, 21, 6226. [Google Scholar] [CrossRef]
- Nguyen, P.H.; Kim, K.W.; Lee, Y.W.; Park, K.R. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor. Sensors 2017, 17, 1987. [Google Scholar] [CrossRef] [PubMed]
- Kong, W.; Zhang, D.; Wang, X.; Xian, Z.; Zhang, J. Autonomous Landing of an UAV with a Ground-Based Actuated Infrared Stereo Vision System. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013. [Google Scholar]
- Yang, T.; Li, G.; Li, J.; Zhang, Y.; Zhang, X.; Zhang, Z.; Li, Z. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment. Sensors 2016, 16, 1393. [Google Scholar] [CrossRef] [PubMed]
- Ariante, G.; Ponte, S.; Papa, U.; Greco, A.; del Core, G. Ground Control System for UAS Safe Landing Area Determination (SLAD) in Urban Air Mobility Operations. Sensors 2022, 22, 9326. [Google Scholar] [CrossRef]
- Lorenzo, O.G.; Martínez, J.; Vilariño, D.L.; Pena, T.F.; Cabaleiro, J.C.; Rivera, F.F. Landing Sites Detection Using LiDAR Data on Manycore Systems. J. Supercomput. 2017, 73, 557–575. [Google Scholar] [CrossRef]
- Kim, I.; Kim, H.-G.; Kim, I.-Y.; Ohn, S.-Y.; Chi, S.-D. Event-Based Emergency Detection for Safe Drone. Appl. Sci. 2022, 12, 8501. [Google Scholar] [CrossRef]
- Guérin, J.; Delmas, K.; Guiochet, J. Certifying Emergency Landing for Safe Urban UAV. In Proceedings of the IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W 2021), Taipei, Taiwan, 21–24 June 2021; IEEE: Piscataway, NJ, USA, 2021. [Google Scholar] [CrossRef]
- DJI Innovations. Flame Wheel F550 User Manual, Version 2.0; DJI: Shenzhen, China, 2015; Available online: https://dl.djicdn.com/downloads/flamewheel/en/F550_User_Manual_v2.0_en.pdf (accessed on 21 June 2025).
- PX4 Autopilot. Flight Controller: CubePilot Cube Orange. PX4 User Guide, Main Branch, 2025. Available online: https://docs.px4.io/main/en/flight_controller/cubepilot_cube_orange.html (accessed on 21 June 2025).
- NVIDIA Corporation. Jetson Nano Developer Kit User Guide, rev. 1.3; NVIDIA: Santa Clara, CA, USA, 2020; Available online: https://developer.nvidia.com/embedded/dlc/jetson_nano_developer_kit_user_guide (accessed on 21 June 2025).
- OlliW. STorM32 Controller Boards—STorM32 v1.32; STorM32 Wiki: 2024. Available online: https://www.olliw.eu/storm32bgc-wiki/STorM32_Boards (accessed on 21 June 2025).
- Intel Corporation. Intel RealSense Depth Camera D435i. Available online: https://www.intelrealsense.com/depth-camera-d435i/ (accessed on 21 June 2025).
- PX4 Autopilot. GPS/Compass: Hex Here2. PX4 User Guide, Version 1.13; 2022. Available online: https://docs.px4.io/v1.13/en/gps_compass/gps_hex_here2.html (accessed on 21 June 2025).
- Rosin, P.L.; Lai, Y.-K.; Shao, L.; Liu, Y. (Eds.) RGB-D Image Analysis and Processing; Advances in Computer Vision and Pattern Recognition Series; Springer: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
- Teutsch, M.; Sappa, A.D.; Hammoud, R.I. Computer Vision in the Infrared Spectrum; Synthesis Lectures on Computer Vision Series; Springer: Cham, Switzerland, 2022. [Google Scholar] [CrossRef]
- PX4 Autopilot User Guide. Available online: https://docs.px4.io/v1.13/en/ (accessed on 30 November 2024).
- Meeussen, W. Coordinate Frames for Mobile Platforms. ROS REP-105. 27 October 2010. Available online: https://www.ros.org/reps/rep-0105.html (accessed on 28 November 2024).
- Foote, T.; Purvis, M. Standard Units of Measure and Coordinate Conventions. ROS REP-103. 7 October 2010. Available online: https://www.ros.org/reps/rep-0103.html (accessed on 28 November 2024).
- PX4. Flight Controller/Sensor Orientation. Available online: https://docs.px4.io/v1.13/en/config/flight_controller_orientation.html (accessed on 30 November 2024).
- PX4. Using the ECL EKF. Available online: https://docs.px4.io/v1.13/en/advanced_config/tuning_the_ecl_ekf.html (accessed on 30 November 2024).
- Open Robotics. Spherical Coordinates. Available online: https://gazebosim.org/api/sim/7/spherical_coordinates.html (accessed on 30 November 2024).
- PX4. Using Vision or Motion Capture Systems for Position Estimation. Available online: https://docs.px4.io/v1.13/en/ros/external_position_estimation.html (accessed on 30 November 2024).
- Ermakov, V. MAVROS. Available online: https://docs.ros.org/en/iron/p/mavros/ (accessed on 30 November 2024).
- Granosik, G.; Zubrycki, I.; Soghbatyan, T.; Zarychta, D.; Gawryszewski, M. Kube—Platforma Robotyczna dla Badań Naukowych i Prac Wdrożeniowych [Kube—A Robotic Platform for Research and Deployment Work]. In Problemy Robotyki—Materiały XIV KKR; Oficyna Wydawnicza PW: Warszawa, Poland, 2016; pp. 223–234. (In Polish) [Google Scholar]
| Symbol | Description | Value |
|---|---|---|
| a | Size of the KUKA KUBE platform plate | 250 cm |
| b | Distance from UAV centre to the side with the rail and KUKA robot | 110 cm |
| c | Distance from UAV centre to the rear edge of the platform | 68 cm |
| H | Height of UAV centre above the plate level | 138 cm |
| Z | Height of F1 flag centre above the plate level | 84 cm |
|  | Vehicle body coordinate frame axis-aligned with the testing bed |  |
|  | Local frame (PX4 position-estimator output) |  |
| Scenario | Attempts | Fails | Success Rate |
|---|---|---|---|
| East | 103 | 3 | 97.09% |
| North | 103 | 5 | 95.15% |
| West | 108 | 0 | 100% |
| South | 100 | 0 | 100% |
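The success rates follow directly from the attempt and failure counts; a quick check (an illustrative snippet, not part of the published toolchain):

```python
# Success rate = (attempts - fails) / attempts, rounded to two decimals.
runs = [("East", 103, 3), ("North", 103, 5), ("West", 108, 0), ("South", 100, 0)]
for scenario, attempts, fails in runs:
    print(f"{scenario}: {100 * (attempts - fails) / attempts:.2f}%")
# East: 97.09%, North: 95.15%, West: 100.00%, South: 100.00%
```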
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).