Development of a Collision Avoidance System Using Multiple Distance Sensors for Indoor Inspection Drone
Abstract
1. Introduction
2. Development of Sensor Module
2.1. ToF Distance Sensor Selection
2.2. Single-Sensor Performance Test
2.3. Multiple-Sensor Orientation and Coordinate System
- (1) Rotate around the z-axis of the drone body frame by the mounting angle of sensor i, represented by R_z(θ_i), and translate by the sensor's mounting offset, represented as T(t_i).
- (2) Rotate around the z-axis by the zone azimuth angle α, denoted as R_z(α), where α ranges from −22.5° to 22.5°.
- (3) Rotate around the y-axis by the zone elevation angle β, represented as R_y(β), where β also ranges from −22.5° to 22.5°.
- (4) Translate along the x-axis of the rotated coordinate system by a distance d, denoted as T_x(d). The parameter d corresponds to the measured distance from the center of the sensor to the surrounding object and ranges from 0 to 400 cm.
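The transformation chain above can be sketched in Python with NumPy. This is a minimal sketch, not the paper's implementation: the names `theta_i`, `t_i`, `alpha`, `beta`, and `d` are stand-ins for the mounting yaw, mounting offset, per-zone azimuth/elevation within the 45° × 45° FoV, and measured distance.

```python
import numpy as np

def rot_z(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    """Rotation matrix about the y-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def zone_to_body(theta_i, t_i, alpha, beta, d):
    """Map one zone measurement d (cm) of sensor i to a point in the body frame.

    theta_i, t_i : mounting yaw and offset of sensor i on the drone frame
    alpha, beta  : per-zone azimuth/elevation, each within +/-22.5 degrees
    """
    # Measurement expressed in the sensor frame: rotate the sensing
    # direction by the zone angles, then scale by the measured distance.
    p_sensor = rot_z(alpha) @ rot_y(beta) @ np.array([d, 0.0, 0.0])
    # Transform from the sensor frame into the drone body frame.
    return rot_z(theta_i) @ p_sensor + np.asarray(t_i, dtype=float)
```

With eight sensors mounted at 45° intervals, evaluating `zone_to_body` for every zone of every sensor yields the point cloud used for obstacle representation.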
3. Drone System
3.1. Sensor Module Integration
3.2. SBUS Communication Protocol
4. Collision Avoidance Algorithm
4.1. Obstacle Detection and Responsive Distance Modeling
4.2. Gaussian-Based Virtual Repulsive Force Model
4.3. Motion Adjustment via SBUS Control Signal Modification
- Figure 18a plots the measured distance between the drone and the obstacle, which decreases as the drone moves forward.
- Figure 18b displays the resulting repulsive force generated by the system using the Gaussian model, which increases as the obstacle comes closer.
- Figure 18c compares the original user command (blue line), representing continuous forward motion, with the modified control command (green line), whose deviation from the original grows as the repulsive force increases.
- Figure 18d overlays the three signals (distance, repulsive force, and control commands) on a single graph for direct comparison.
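The command modification described above can be sketched as a simple superposition followed by clipping. This is a hedged sketch: the 172–1811 channel range (with 992 as center) is the common Futaba SBUS convention, and `gain` is an illustrative parameter, not the paper's tuned value.

```python
def modify_command(user_cmd, repulsive_force, gain=1.0, lo=172, hi=1811):
    """Superimpose the scaled repulsive force on the pilot's stick command.

    user_cmd and the return value are raw 11-bit SBUS channel counts.
    The repulsive force is signed: negative pushes the command back
    toward (and past) center, away from the obstacle.
    """
    modified = user_cmd + gain * repulsive_force
    # Clip to the valid SBUS channel range before transmission.
    return int(min(hi, max(lo, round(modified))))
```

For example, a forward command of 1500 combined with a repulsive force of −400 (obstacle ahead) yields a modified command of 1100, reducing forward speed.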
5. Experimental Results
5.1. Test Setup and Environment
5.2. Collision Avoidance Performance
- (A) Front direction case: As the drone approaches an obstacle in the front direction (Figure 20a), the obstacle is represented by the 2D representative point cloud (Figure 20b), and an opposing virtual force toward the rear of the drone is generated by modifying the control command. A model of this force is shown in Figure 20c.
- (B) Left direction case: As the drone approaches an obstacle in the left direction (Figure 21a), the obstacle is represented by the 2D representative point cloud (Figure 21b), and an opposing virtual force toward the right of the drone is generated by modifying the control command. A model of this force is shown in Figure 21c.
- (C) Rear direction case: When an obstacle (a whiteboard placed behind the drone) is detected in the rear direction, the repulsive force is directed toward the front, as illustrated in Figure 22.
- (D) Right direction case: Similarly, when an obstacle is detected in the right direction of the drone, the repulsive force is directed toward the left, as illustrated in Figure 23.
- (a) Front–left direction case: When obstacles are detected in both the front and left directions, each generates a repulsive force away from its location. The vector sum of these forces yields a resultant force directed toward the rear–right, steering the drone away from both obstacles, as illustrated in Figure 25a.
- (b) Front–right direction case: When obstacles are present in the front and right directions, the resultant repulsive force is directed toward the rear–left, as shown in Figure 25b, moving the drone away from both detected threats.
- (c) Rear–left direction case: When obstacles are detected in both the rear and left directions, the resultant force is directed toward the front–right, guiding the drone away from the obstacles, as shown in Figure 25c.
- (d) Rear–right direction case: When obstacles are detected in both the rear and right directions, the resultant force is directed toward the front–left, guiding the drone away from the obstacles, as shown in Figure 25d.
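The combined-direction cases above follow from plain vector addition of the per-obstacle forces. A minimal sketch, assuming a body frame with x forward and y to the left (a convention chosen here for illustration, not stated by the paper):

```python
import math

def resultant_force(obstacle_forces):
    """Sum per-obstacle repulsive forces.

    obstacle_forces: list of (magnitude, obstacle_bearing_rad) pairs.
    Bearing convention (assumed): 0 = front, +pi/2 = left,
    pi = rear, -pi/2 = right. Each force points away from its
    obstacle, i.e. opposite to the bearing.
    """
    fx = sum(-m * math.cos(b) for m, b in obstacle_forces)
    fy = sum(-m * math.sin(b) for m, b in obstacle_forces)
    return fx, fy
```

With equal-magnitude obstacles ahead and to the left, the resultant is (−F, −F): a push toward the rear–right, matching case (a) above.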
5.3. Obstacle Avoidance Flight Experiments
Trajectory Visualization
- Eight reflective markers were mounted on the drone frame in symmetrical positions to accurately measure its position and orientation in 3D space.
- Additionally, eight to ten reflective markers were attached to each obstacle in every experimental setup. This allowed both drone and obstacle positions to be tracked in the same global coordinate frame.
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Feature | Detail
---|---
Model number | VL53L8CX
Manufacturer | STMicroelectronics
Package | Optical LGA16
Size | 23 × 13 × 1.75 mm
Ranging distance | 2–400 cm (per zone)
Operating voltage | 3.3 V (AVDD), 1.2/1.8 V (IOVDD)
Power consumption | ~46 nW
Operating temperature | −30 °C to 85 °C
Sample rate | Up to 60 Hz
FoV | 45° × 45°
Interfaces | I2C (up to 1 MHz), SPI (up to 3 MHz)
Feature | Detail
---|---
Frame type | X-type carbon fiber frame
Wheelbase | 500 mm (diagonal, motor to motor)
Frame weight | ~610 g
Propeller size | 10 × 4.5 inch (plastic)
Motors | 4 × 2216 brushless motors (880 KV)
ESC | 4 × 20 A ESC (BLHeli-S firmware, DShot capable)
Flight controller | Pixhawk 6C with PX4 open-source firmware
Power distribution board | Integrated (Holybro PM06 V2.1)
Recommended battery | 4S LiPo (14.8 V), typically 3000–5000 mAh
Payload capacity | Up to 1 kg (recommended for stable flight)
Byte | Data | Description
---|---|---
0 | Start byte (0x0F) | Indicates the beginning of an SBUS frame
1–22 | Channel data (22 bytes) | Encodes 16 channels (11 bits each)
23 | Flag byte | Failsafe and frame-lost flags
24 | End byte (0x00) | Indicates the end of the SBUS frame
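The frame layout in the table can be exercised with a short encoder/decoder. This sketch covers only the bit packing; the SBUS physical layer (100 kbit/s, 8E2, inverted UART) is outside its scope.

```python
def encode_sbus(channels, flags=0x00):
    """Pack 16 x 11-bit channel values into a 25-byte SBUS frame."""
    if len(channels) != 16:
        raise ValueError("SBUS carries exactly 16 channels")
    frame = bytearray(25)
    frame[0] = 0x0F                      # start byte
    bits, nbits, idx = 0, 0, 1
    for ch in channels:                  # LSB-first bit packing into bytes 1-22
        bits |= (ch & 0x7FF) << nbits
        nbits += 11
        while nbits >= 8:
            frame[idx] = bits & 0xFF
            bits >>= 8
            nbits -= 8
            idx += 1
    frame[23] = flags                    # failsafe / frame-lost flags
    frame[24] = 0x00                     # end byte
    return bytes(frame)

def decode_sbus(frame):
    """Recover the 16 channel values from a 25-byte SBUS frame."""
    data = int.from_bytes(frame[1:23], "little")
    return [(data >> (11 * i)) & 0x7FF for i in range(16)]
```

Since 16 × 11 = 176 bits = 22 bytes, the channel data fills bytes 1–22 exactly, which is why the flag and end bytes sit at fixed offsets 23 and 24.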
Pseudo-Code for Object Detection and Collision Avoidance Algorithm:

    Initialize sensorArray ← [sensor1, sensor2, …, sensor8]
    loop  # repeats periodically during flight
        # Get the original user command
        u_user ← get_user_control_signal()
        for each sensor in sensorArray do
            # Get raw distance measurements
            raw_data ← sensor.get_distance_data()
            # Filter and compute the minimum distance per angular segment
            d_min ← min(angular_segment_distances(raw_data))
            # Get the obstacle direction
            θ ← sensor.get_direction()
            # Compute the repulsive force using the Gaussian model,
            # directed opposite to the obstacle direction θ
            F_sensor ← A · exp(−d_min² / (2σ²))
        end for
        # Calculate the total repulsive force vector
        F_total ← Σ F_sensor
        # Modify the control signal to adjust flight direction and speed
        u_mod ← u_user + F_total
        apply_control_signal(u_mod)
    end loop
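One iteration of the loop above can be made runnable as follows. This is a sketch under stated assumptions: the Gaussian magnitude F = A·exp(−d²/(2σ²)) follows Section 4.2, but the constants `A` and `SIGMA` are illustrative placeholders, not the paper's tuned values.

```python
import math

A = 400.0      # peak repulsive force (assumed, in command units)
SIGMA = 60.0   # spread of the Gaussian (assumed, in cm)

def gaussian_force(d_min_cm):
    """Repulsive force magnitude; grows as the obstacle gets closer."""
    return A * math.exp(-(d_min_cm ** 2) / (2 * SIGMA ** 2))

def avoidance_step(user_cmd_xy, sensor_readings):
    """One iteration of the collision avoidance loop.

    sensor_readings: list of (min_distance_cm, obstacle_bearing_rad)
    pairs, one per sensor segment. Returns the modified (x, y) command.
    """
    fx = fy = 0.0
    for d_min, bearing in sensor_readings:
        f = gaussian_force(d_min)
        # The force points away from the obstacle (opposite its bearing).
        fx -= f * math.cos(bearing)
        fy -= f * math.sin(bearing)
    # Superimpose the total repulsive force on the user command.
    return user_cmd_xy[0] + fx, user_cmd_xy[1] + fy
```

An obstacle 10 cm ahead sharply reduces a forward command, while one at 400 cm contributes a negligible force, matching the responsive-distance behavior described in Section 4.1.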
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Gerencheal, T.A.; Lee, J.H. Development of a Collision Avoidance System Using Multiple Distance Sensors for Indoor Inspection Drone. Appl. Sci. 2025, 15, 8791. https://doi.org/10.3390/app15168791