Article

Development of a Collision Avoidance System Using Multiple Distance Sensors for Indoor Inspection Drone

by Teklay Asmelash Gerencheal and Jae Hoon Lee *
Graduate School of Science and Engineering, Ehime University, Matsuyama 790-8577, Japan
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(16), 8791; https://doi.org/10.3390/app15168791
Submission received: 11 July 2025 / Revised: 30 July 2025 / Accepted: 31 July 2025 / Published: 8 August 2025
(This article belongs to the Special Issue AI-Based Methods for Object Detection and Path Planning)

Abstract

Drones, particularly those designed for indoor inspections, are widely used across various industries for tasks such as infrastructure monitoring, maintenance, and security. This study focuses on developing a robust collision avoidance system for teleoperated indoor drones that ensures comprehensive 360-degree horizontal safety during flight. By integrating multiple cost-effective and compact Time-of-Flight (ToF) distance sensors, the system enhances real-time obstacle detection and collision prevention. A custom sensor module mounted on the drone continuously monitors the environment and provides real-time distance data to the collision avoidance algorithm. The system also receives control signals from the remote operator and forwards them to the flight controller. Upon detecting an obstacle, it immediately modifies these control signals to adjust the drone’s motion, thereby avoiding collisions and ensuring the safety of both the drone and its surroundings. The methodology involves initializing and calibrating multiple sensors, collecting and processing ranging data, and dynamically adjusting motion commands based on proximity alerts. This approach significantly improves the safety and operational efficiency of drones in complex indoor environments.

1. Introduction

Drones have transformed numerous industries by performing tasks in hazardous or inaccessible environments [1]. Indoor inspection drones, in particular, play a critical role by providing essential data through visual, thermal, and structural analyses, thereby supporting effective decision-making in industrial operations [2,3,4,5]. However, indoor environments present unique challenges such as limited space, the absence of GPS signals, and dynamic obstacles. These constraints necessitate advanced collision avoidance mechanisms to ensure safe and reliable navigation [6,7].
Existing drone technologies offer various obstacle detection and avoidance mechanisms. These typically involve the use of sensors such as LiDAR, ultrasonic sensors, and cameras, combined with algorithms that process environmental data to predict and prevent collisions [7,8]. Despite these developments, significant challenges remain in indoor applications. The dynamic nature of indoor environments, characterized by narrow spaces, moving objects, and unpredictable obstacles, renders many existing systems insufficient. Moreover, the confined spatial layout, lack of GPS-based positioning, and the need for rapid maneuvering further complicate safe navigation [9]. Addressing these issues is essential to ensure the safety of the drone, its environment, and personnel involved in inspection operations.
In a related study, Park and Yi [10] proposed a Time-of-Flight (ToF) distance sensor-based method for detecting construction equipment activity in dynamic site environments. Their method demonstrated the ability of ToF sensors to robustly detect the movements and interactions of machinery under challenging conditions. This further validates the effectiveness of ToF sensors in complex, cluttered spaces, highlighting their suitability not only for static obstacle detection but also for dynamic activity monitoring. The insights from their work underscore the adaptability and reliability of ToF technology, particularly in environments where conventional sensors may struggle due to lighting variability or spatial constraints.
Recent research [11] has placed a particular focus on Sense-and-Avoid (SAA) mechanisms, which aim to enable drones to autonomously detect and avoid obstacles in real time. Approaches such as the Fast Obstacle Avoidance Motion (FOAM) algorithm integrate data from sensors like monocular cameras and LiDAR to create a 2D probabilistic occupancy map that helps estimate free spaces for obstacle avoidance. While effective to a degree, these systems tend to be bulky, complex, and consume considerable power, making them less suitable for lightweight, energy-efficient indoor drones requiring real-time operation [12,13,14,15].
Tsuji and Kohama [16] introduced an omnidirectional proximity sensor system utilizing optical Time-of-Flight (ToF) sensors, which enabled accurate real-time obstacle detection. While their system provided efficient omnidirectional coverage, its primary focus was on obstacle detection rather than active avoidance. This study builds upon their work by proposing a more cost-effective and power-efficient solution suitable for indoor inspection drones. In addition to detecting potential obstacles, the system introduced here incorporates real-time collision avoidance mechanisms.
Manual flight with teleoperation is essential for indoor inspection tasks, where human judgment and flexibility play a critical role in navigating complex environments [17,18]. However, to enhance safety and operational efficiency during such flights, a reliable obstacle detection and avoidance system is necessary. This research addresses these challenges by proposing a novel solution: the integration of lightweight, low-power, and compact Time-of-Flight (ToF) distance sensors to enable full 360-degree horizontal obstacle detection and collision avoidance. Specifically, this study employs multiple small ToF sensors that offer key advantages, including compact design, high accuracy, a 45 × 45-degree field of view (FoV), and stable performance under various lighting conditions [10,16,19]. By leveraging these sensors, the system provides comprehensive environmental awareness around the drone. This significantly enhances its capability to autonomously avoid collisions even during manual operation, thereby supporting safer and more efficient teleoperated inspection missions.
Furthermore, integrating these sensors with advanced real-time processing algorithms enhances the performance of drones during inspection tasks in cluttered environments [20,21,22,23]. This approach was experimentally validated in realistic indoor settings, where various sensor configurations and avoidance strategies were tested. The findings contribute to the development of safer, more efficient indoor drones capable of handling the complexities of industrial inspection tasks.
The primary contributions of this study include the development of a low-cost, lightweight, and energy-efficient collision avoidance system that outperforms existing technologies in both efficiency and practical applicability for indoor environments. Designed to support new and inexperienced operators, the proposed system enhances both safety and autonomy. By optimizing sensor integration, processing algorithms, and control strategies, this research paves the way for the next generation of inspection drones with improved reliability and operational efficiency.

2. Development of Sensor Module

Time-of-Flight (ToF) sensors are optical distance-measuring devices that operate by emitting a light pulse, typically infrared, and calculating the time it takes for the light to reflect back from an object to the sensor [24,25,26], as illustrated in Figure 1. The measured distance is represented by Equation (1).
$$\text{Distance} = \frac{\text{Speed of Light} \times \text{Time of Flight}}{2} \quad (1)$$
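As a brief worked example (the timing value is chosen for illustration, not taken from the sensor datasheet), a reflected pulse arriving 10 ns after emission corresponds to a target roughly 1.5 m away:
$$D = \frac{(3 \times 10^{8}\ \text{m/s}) \times (10 \times 10^{-9}\ \text{s})}{2} = 1.5\ \text{m}$$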
This technology is widely utilized across various fields where accurate, reliable, and compact distance measurements are essential. These applications include robotics for navigation and interaction, automotive systems for parking assistance and collision avoidance, consumer electronics like smartphones (for facial recognition and augmented reality), and industrial automation for presence detection and volume measurement.
ToF sensors offer several inherent advantages that make them suitable for diverse applications. First, they provide highly precise and consistent distance measurements, largely unaffected by target surface properties such as texture, color, or ambient lighting conditions [27]. This robustness is critical for reliable performance in both indoor and outdoor environments. Second, their compact size and remarkably low power consumption make them ideal for integration into space-constrained and battery-sensitive applications, such as drones, where weight and energy efficiency are paramount.

2.1. ToF Distance Sensor Selection

In this study, for the specific application of obstacle detection in unfamiliar indoor environments, the VL53L8CX (STMicroelectronics, Bengaluru, Karnataka) was selected as the primary ToF distance sensor due to its multi-zone ranging capability, which addresses the limitations of conventional single-point ToF distance sensors. Unlike its predecessors (e.g., VL53L0X), the VL53L8CX supports up to 64 independent ranging zones (an 8 × 8 grid), enabling higher-resolution spatial mapping of obstacles within a 45 × 45-degree FoV and a fast response time (up to 60 Hz, corresponding to approximately 16.7 ms per frame) [26,27,28], characteristics that are crucial for real-time applications such as collision avoidance in dynamic and unpredictable environments. Rather than measuring distance in a single direction, the sensor constructs a simplified spatial representation of its surroundings. This multi-zone detection enhances spatial awareness, allowing the drone to recognize obstacles from various angles and make informed decisions to avoid collisions. The sensor’s combination of accuracy, reliability, and ease of integration, together with its low power consumption, wide field of view, and built-in signal processing, makes it an effective component for improving drone navigation and safety in diverse environments; Table 1 summarizes the sensor’s specifications.
One of the most advantageous features of this sensor is its wide field of view of 45 degrees both horizontally and vertically. The sensor can return either 16 distance values (in 4 × 4 resolution mode) or 64 distance values (in 8 × 8 resolution mode). Figure 2 illustrates how these modes capture depth information in a real-world environment. Each number in the grid represents the index of a zone in the measurement data. The VL53L8CX’s combination of multi-zone detection, wide field of view, high accuracy, low power consumption, and compact form factor makes it a suitable sensor for real-time obstacle detection and avoidance in confined indoor environments.
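To make the zone geometry concrete, the following minimal C++ sketch maps a zone index to approximate angular offsets within the 45 × 45-degree FoV. Row-major zone ordering and evenly spaced zones are assumptions made for illustration; the exact zone-to-angle mapping should be taken from the VL53L8CX documentation.
```cpp
#include <cstdio>

// Map a zone index (0-63 in 8 x 8 mode) to approximate angular offsets within
// the 45 x 45-degree field of view. Row-major ordering and evenly spaced zones
// are assumed for illustration only.
struct ZoneAngles {
    float beta_deg;   // horizontal offset from the sensor boresight (-22.5 to +22.5)
    float gamma_deg;  // vertical offset from the sensor boresight   (-22.5 to +22.5)
};

ZoneAngles zoneToAngles(int zoneIndex, int gridSize = 8) {
    const float fov = 45.0f;             // field of view per axis, degrees
    const float step = fov / gridSize;   // angular width of one zone (5.625 deg in 8 x 8 mode)
    const int row = zoneIndex / gridSize;
    const int col = zoneIndex % gridSize;
    ZoneAngles a;
    a.beta_deg  = -fov / 2.0f + (col + 0.5f) * step;
    a.gamma_deg = -fov / 2.0f + (row + 0.5f) * step;
    return a;
}

int main() {
    ZoneAngles z0  = zoneToAngles(0);    // corner zone
    ZoneAngles z63 = zoneToAngles(63);   // opposite corner zone
    std::printf("zone 0 : beta = %.2f deg, gamma = %.2f deg\n", z0.beta_deg, z0.gamma_deg);
    std::printf("zone 63: beta = %.2f deg, gamma = %.2f deg\n", z63.beta_deg, z63.gamma_deg);
    return 0;
}
```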

2.2. Single-Sensor Performance Test

A single Time-of-Flight (ToF) distance sensor can detect and map its surroundings by generating virtual representations based on its selected resolution mode. Figure 3 illustrates the sensor operating in 8 × 8 mode, where it captures 64 individual distance measurements within its field of view. The red points represent locations on the target object where distance measurements are taken, while the green lines indicate the corresponding distance from each measured point to the sensor. In this study, the Robot Operating System (ROS) was used to visualize sensor data in real-time via the RViz interface, using ROS-Serial for communication between the microcontroller and ROS.
In the example shown in Figure 4, a single sensor detects a box placed in front of a wall. The red dashed line shown in Figure 4a represents the sensor’s field of view (FoV). These measurements are visualized in 3D in Figure 4b, where the red points represent the box and the white points represent the wall. The corresponding raw distance measurements are shown in Figure 4c. This illustrates the sensor’s capability to distinguish objects at different depths within its field of view. The sensor includes an integrated noise filtering mechanism based on digital signal processing (DSP) [28], which enhances the accuracy and stability of distance measurements by reducing signal noise and fluctuations.

2.3. Multiple-Sensor Orientation and Coordinate System

Eight distance sensors are arranged in an octagonal configuration at 45-degree intervals (α = 45°) to achieve full 360-degree horizontal coverage around the drone, as illustrated in Figure 5. Each sensor provides a 45-degree horizontal field of view, ensuring seamless obstacle detection in all directions. The coordinate system defines the x-axis and y-axis for directional reference, with r representing the radius from the center of the module to each sensor. This layout was implemented using a 3D-printed sensor holder to maintain precise angular spacing, as shown in Figure 6a.
To manage this sensor configuration efficiently while maintaining a compact, lightweight, and low-power design, an ESP32-WROOM-32 microcontroller was employed to handle all data processing. When connecting multiple distance sensors to a single microcontroller, several key considerations must be addressed to ensure proper functionality. All sensors can share a common power supply and ground: the VCC pins are connected to a 3.3 V source, and the GND pins are connected to the microcontroller’s ground. For I2C communication [29], all sensors utilize the same SDA and SCL lines, connected to the corresponding pins on the microcontroller.
However, each sensor must be assigned a unique I2C address to avoid communication conflicts [29]. Since the microcontroller does not support dynamic address assignment, address management is achieved by controlling the power state of each sensor via GPIO pins, as illustrated in the wiring diagram shown in Figure 7. The microcontroller enables one sensor at a time by powering it on through a dedicated GPIO pin, allowing the sensor’s I2C address to be configured without interference from others. This process is repeated for all eight sensors, enabling effective multiplexed communication.
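The following Arduino-style C++ sketch outlines this sequential power-up scheme on the ESP32. The GPIO pin numbers, the reassigned addresses, and the commented driver calls are placeholders standing in for whatever ToF driver library is used; only the enable-pin sequencing reflects the scheme described above.
```cpp
#include <Arduino.h>
#include <Wire.h>

// Sequential power-up scheme for assigning unique I2C addresses to eight
// VL53L8CX sensors sharing one bus (Arduino framework on ESP32).
// GPIO numbers and addresses below are illustrative placeholders.
const int NUM_SENSORS = 8;
const int ENABLE_PINS[NUM_SENSORS] = {13, 12, 14, 27, 26, 25, 33, 32};
const uint8_t DEFAULT_ADDR = 0x29;   // factory-default address of each sensor
const uint8_t BASE_ADDR    = 0x30;   // first reassigned address

void setupSensors() {
    Wire.begin();   // shared SDA/SCL lines

    // Hold all sensors powered down so that none responds on the bus.
    for (int i = 0; i < NUM_SENSORS; ++i) {
        pinMode(ENABLE_PINS[i], OUTPUT);
        digitalWrite(ENABLE_PINS[i], LOW);
    }

    // Enable one sensor at a time and move it to its own address before
    // enabling the next, so only one device ever answers at 0x29.
    for (int i = 0; i < NUM_SENSORS; ++i) {
        digitalWrite(ENABLE_PINS[i], HIGH);
        delay(10);   // allow the sensor to boot
        // tof[i].begin(DEFAULT_ADDR);            // placeholder driver call
        // tof[i].setI2CAddress(BASE_ADDR + i);   // placeholder driver call
    }
    // All eight sensors now remain powered, each reachable at its own address.
}
```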
To describe how each sensor’s measurement data is interpreted, Figure 8 defines the local coordinate system for the sensor module. Each sensor detects distances, denoted by d_s, to objects within its field of view. These raw distance measurements are initially defined in each sensor’s local coordinate frame and must be transformed into the sensor module coordinate frame of the drone using rotation and translation matrices.
The distance data from each sensor (i_0 to i_7) are obtained using a coordinate transformation matrix M. Each sensor’s distance data d_s is transformed using the matrix M, which incorporates the following steps:
(1) Rotate around the z-axis by an angle α, represented by R_z(α), and translate along the x-axis by r, represented as T_x(r).
(2) Rotate around the z-axis by an angle β, denoted as R_z(β), where β ranges from −22.5° to 22.5°.
(3) Rotate around the y-axis by an angle γ, represented as R_y(γ), where γ also ranges from −22.5° to 22.5°.
(4) Translate along the x-axis of the rotated coordinate system by a distance d_s, denoted as T_x(d_s). The parameter d_s corresponds to the measured distance from the center of the sensor to the surrounding object, which ranges from 0 to 400 cm.
These transformations convert each sensor’s distance data into a unified 3D point cloud. The complete transformation is expressed in Equations (2)–(4), where each matrix converts the multi-zone array distance sensor outputs into the sensor module position frame relative to the drone:
$$M = R_z(\alpha)\, T_x(r)\, R_z(\beta)\, R_y(\gamma)\, T_x(d_s) \quad (2)$$
$$M =
\begin{bmatrix} \cos\alpha & -\sin\alpha & 0 & 0 \\ \sin\alpha & \cos\alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & r \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\beta & -\sin\beta & 0 & 0 \\ \sin\beta & \cos\beta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\gamma & 0 & \sin\gamma & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\gamma & 0 & \cos\gamma & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & d_s \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (3)$$
$$M =
\begin{bmatrix}
\cos(\alpha+\beta)\cos\gamma & -\sin(\alpha+\beta) & \cos(\alpha+\beta)\sin\gamma & d_s\cos(\alpha+\beta)\cos\gamma + r\cos\alpha \\
\sin(\alpha+\beta)\cos\gamma & \cos(\alpha+\beta) & \sin(\alpha+\beta)\sin\gamma & d_s\sin(\alpha+\beta)\cos\gamma + r\sin\alpha \\
-\sin\gamma & 0 & \cos\gamma & -d_s\sin\gamma \\
0 & 0 & 0 & 1
\end{bmatrix} \quad (4)$$
Using the final transformation matrix M, each distance measurement is projected into the sensor module coordinate frame, namely the drone’s frame. The result is a complete 3D point cloud of obstacles detected in the environment. Each sensor is assigned a specific angular index relative to the front of the drone (sensor i_0 at 0°), allowing for the accurate mapping of direction and distance. This allows the drone to localize and respond to obstacles from all directions in real time, enabling smooth and accurate collision avoidance.
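As a compact illustration of Equation (4), the following C++ sketch computes the 3D position of one zone measurement in the module (drone) frame. The function signature and the example values (sensor index, module radius) are illustrative assumptions; only the closed-form expressions come from the transformation above.
```cpp
#include <cmath>
#include <cstdio>

// Position of one zone measurement in the sensor-module (drone) frame, taken
// from the translation column of M = Rz(alpha) Tx(r) Rz(beta) Ry(gamma) Tx(d_s)
// in Equation (4). Angles are in radians; distances share one unit (cm here).
struct Point3D { float x, y, z; };

constexpr float kPi = 3.14159265f;

Point3D zoneToModuleFrame(int sensorIndex,  // 0..7, sensor i_0 faces the drone's front
                          float beta,       // horizontal zone offset within the FoV
                          float gamma,      // vertical zone offset within the FoV
                          float ds,         // measured distance (cm)
                          float r)          // module radius from center to sensor (cm)
{
    const float alpha = sensorIndex * (kPi / 4.0f);   // 45-degree sensor spacing
    Point3D p;
    p.x = ds * std::cos(alpha + beta) * std::cos(gamma) + r * std::cos(alpha);
    p.y = ds * std::sin(alpha + beta) * std::cos(gamma) + r * std::sin(alpha);
    p.z = -ds * std::sin(gamma);
    return p;
}

int main() {
    // Illustrative case: sensor i_2 (facing 90 degrees), boresight zone,
    // an obstacle at 150 cm, and an assumed module radius of 5 cm.
    Point3D p = zoneToModuleFrame(2, 0.0f, 0.0f, 150.0f, 5.0f);
    std::printf("x = %.1f cm, y = %.1f cm, z = %.1f cm\n", p.x, p.y, p.z);
    return 0;
}
```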

3. Drone System

Conventional drones primarily rely on manual control or GPS-assisted navigation, which limits their effectiveness in indoor environments where GPS signals are unavailable, and space is often constrained. In such scenarios, human operators, especially inexperienced ones, face challenges in maintaining stable and collision-free flight due to limited spatial awareness and reaction time. This becomes particularly critical during indoor inspection tasks, where obstacles such as walls, equipment, and structural elements are densely distributed.
To address these challenges, the custom-designed 360-degree sensor module proposed in this research was integrated into a PX4 Development Kit-X500 v2 drone platform [30,31], shown in Figure 9. The sensor module enhances the drone’s perception capabilities by providing real-time distance measurements using eight Time-of-Flight (ToF) distance sensors arranged around the drone. This configuration enables autonomous collision avoidance by detecting nearby obstacles and dynamically modifying the drone’s motion commands. The integration preserves manual flight capability while adding a layer of autonomous safety, ensuring efficient and secure operations in complex, GPS-denied environments. The drone used for experimentation and testing is a custom quadcopter equipped with standard commercial-grade components. Table 2 summarizes the drone’s main specifications. The total additional weight introduced by the sensor module and microcontroller was less than 100 g, which is well within the payload capability of the drone. During test flights, the drone demonstrated reliable performance with minimal impact on flight stability or power efficiency.

3.1. Sensor Module Integration

The custom sensor module was designed and mounted on the top center of the drone using a 3D-printed holder. Eight ToF distance sensors were arranged at 45-degree intervals around the module, providing 360-degree horizontal obstacle detection. Each sensor is capable of multi-zone distance measurement (up to 64 zones in 8 × 8 mode) within a 45 × 45-degree field of view. The sensors were mounted approximately 130 mm above the drone frame to minimize obstruction from the propellers and to provide a clear horizontal field of view for each sensor. This mounting position was determined experimentally to achieve optimal sensing coverage without interference from the drone’s structure. Once the module was fully assembled and wired, the drone was equipped for flight testing. The final drone system with the integrated sensor module is shown in Figure 10.
The sensors communicate with a microcontroller using the I2C protocol. To avoid address conflicts on the I2C bus [29], the microcontroller sequentially powers each sensor and reads data one at a time using GPIO-based address switching. This approach allows the microcontroller to manage all eight sensors using a shared communication bus. The microcontroller continuously collects environmental data and processes it in real time. When an obstacle is detected, the collision avoidance algorithm modifies the remote operator’s control signals accordingly. The microcontroller overrides manual commands only when necessary to prevent collisions, thereby maintaining smooth and safe flight behavior. Figure 11 shows the overall system configuration, where the microcontroller is positioned between the distance sensors and the flight controller. Instead of direct integration with the flight controller [32], the microcontroller modifies the user’s control signals from the receiver (Rx) based on obstacle proximity and transmits the adjusted commands. The computer unit is used solely during the development and testing phase when the drone is stationary for evaluating obstacle detection and collision avoidance functionality. It is not involved during actual flight operations.

3.2. SBUS Communication Protocol

The SBUS (Serial Bus) protocol is a widely used digital communication standard in radio-controlled systems. It enables efficient and reliable transmission of up to 16 control channels over a single serial line [33,34]. This reduces wiring complexity while ensuring high-speed and low-latency communication. In drone systems, SBUS typically operates over a 2.4 GHz wireless link between the remote controller and the onboard receiver. The receiver decodes the SBUS signal and forwards control commands such as throttle, pitch, roll, and yaw to the flight controller, which executes the user’s intended maneuvers.
In this system, a Futaba T10J transmitter [32] sends user commands to the onboard receiver. The receiver then forwards the raw SBUS signals to the microcontroller. The microcontroller intercepts these commands, dynamically adjusts them based on the measurements from the sensor module, and forwards the modified signals to the Pixhawk flight controller. The SBUS frame structure consists of 25 bytes: a start byte, 22 bytes encoding up to 16 channels (each with 11-bit resolution), a flag byte for fail-safe and frame-loss detection, and an end byte [33], as summarized in Table 3.
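A minimal C++ sketch of how the 16 channels can be unpacked from such a frame is shown below. It assumes the raw 25 bytes have already been received (e.g., by a UART configured for the inverted 100,000-baud, 8E2 SBUS line) and only checks the start byte; flag and end-byte handling are omitted for brevity.
```cpp
#include <cstdint>
#include <cstdio>

// Minimal SBUS frame decoder sketch: unpack the 16 channels (11 bits each)
// from the 22 data bytes of a 25-byte SBUS frame. Reception and line
// inversion are assumed to be handled elsewhere.
bool decodeSbusFrame(const uint8_t frame[25], uint16_t channels[16]) {
    if (frame[0] != 0x0F) return false;          // start byte check

    // Concatenate the 22 data bytes (frame[1]..frame[22]) least-significant
    // bit first and slice the resulting bit stream into 11-bit channels.
    uint32_t bitBuffer = 0;
    int bitsInBuffer = 0;
    int byteIndex = 1;
    for (int ch = 0; ch < 16; ++ch) {
        while (bitsInBuffer < 11) {
            bitBuffer |= static_cast<uint32_t>(frame[byteIndex++]) << bitsInBuffer;
            bitsInBuffer += 8;
        }
        channels[ch] = bitBuffer & 0x07FF;       // lowest 11 bits (0..2047)
        bitBuffer >>= 11;
        bitsInBuffer -= 11;
    }
    return true;
}
```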
In this research, four primary SBUS channels of throttle, pitch, roll, and yaw were used for manual drone control. The SBUS signal was extended beyond conventional usage to support autonomous collision avoidance. When an obstacle is detected, the system computes a virtual repulsive force and dynamically adjusts the pitch and roll signals to steer the drone away from the hazard. Since indoor inspections are conducted at low speeds in confined spaces, the SBUS signal range was intentionally constrained to promote safe and precise control. During these operations, SBUS channel values were maintained near the neutral midpoint to avoid abrupt maneuvers, as illustrated in Figure 12.
In this study, SBUS values from the Futaba T10J transmitter ranged from 360 to 1673. Through a series of controlled experiments in a 4 m × 6 m indoor test area, the drone was found to operate most safely and stably within the SBUS range of 687 to 1344, centered at 1016. This range supports fine motion control, minimizing abrupt directional changes or high-speed responses that could lead to collisions in cluttered indoor environments.
The effectiveness of this configuration was validated using the experimental drone system shown in Figure 10. By leveraging the SBUS protocol and real-time sensor feedback, the system achieved both responsive control and enhanced collision avoidance, making it well-suited for inspection tasks in complex indoor spaces.

4. Collision Avoidance Algorithm

The collision avoidance algorithm developed in this study enables the drone to detect and respond to nearby obstacles in real time by modifying its control signals. The approach combines distance data from multiple Time-of-Flight (ToF) distance sensors and computes virtual repulsive forces to guide the drone safely through cluttered indoor environments. The core concept is to maintain smooth and responsive manual flight while automatically overriding user inputs when potential collisions are detected.

4.1. Obstacle Detection and Responsive Distance Modeling

The object detection workflow begins by computing a minimum representative distance from each angular segment of the sensor module FoV. As described in Section 2.3, each distance sensor provides a matrix of 3D distance data across multiple zones in its field of view. To reduce computational load while preserving directional awareness, the system processes these transformed 3D measurements by selecting the minimum distance value in each column of the sensor’s zone matrix. Each column corresponds to a vertical slice (angular segment) of the FoV, and the minimum value in that column represents the closest obstacle point detected in that segment.
Specifically, for each sensor, the algorithm selects the minimum distance from each column of the matrix (multi-zone data) corresponding to the closest point along each angular segment. Let this minimum distance, measured to the nearest obstacle, be denoted as D_n, where D_n is the minimum measured distance in the n-th angular segment across all sensors. This scalar distance is then used as input to the Gaussian-based repulsive force model for real-time obstacle avoidance. Figure 13 illustrates the general overview of object detection and representation by the developed system. In this figure, no object is detected within the safety zone represented by the threshold R_th. Consequently, no repulsive forces are generated, the total repulsive force is zero (F_t = 0), and the drone continues following the user command.
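A short C++ sketch of this per-column reduction is given below. The grid size, the zero-reading convention for invalid zones, and the 400 cm "free space" default are assumptions made for illustration.
```cpp
#include <algorithm>
#include <array>

// Reduce one sensor's multi-zone matrix to a per-column minimum distance,
// i.e., the closest obstacle in each vertical slice (angular segment) of the
// sensor's field of view. Shown for 8 x 8 mode; invalid zones are assumed to
// be reported as 0 and are skipped.
constexpr int GRID = 8;
constexpr float NO_TARGET = 400.0f;   // maximum range in cm, treated as "free"

std::array<float, GRID> columnMinima(const float zones[GRID][GRID]) {
    std::array<float, GRID> minima;
    minima.fill(NO_TARGET);
    for (int col = 0; col < GRID; ++col) {
        for (int row = 0; row < GRID; ++row) {
            float d = zones[row][col];
            if (d > 0.0f) {                       // skip invalid (zero) readings
                minima[col] = std::min(minima[col], d);
            }
        }
    }
    return minima;   // minima[col] is D_n for that angular segment of this sensor
}
```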

4.2. Gaussian-Based Virtual Repulsive Force Model

This algorithm processes data from all eight ToF sensors mounted around the drone, enabling 360-degree horizontal detection. Based on the computed minimum distances in each direction, the system generates repulsive forces that adjust the drone’s flight direction and speed to prevent collisions. Each sensor provides distance measurements across multiple zones (e.g., 4 × 4 = 16 zones in low-resolution mode).
As illustrated in Figure 14, the geometric diagram presents the workflow of obstacle detection and repulsive force generation from each angular segment corresponding to detected obstacles. Red points represent detected obstacles, while green points indicate free or safe space for navigation. When obstacles O_1 and O_2 enter the safety zone, the system calculates the individual repulsive forces f_1 to f_4 for O_1 and f_5 to f_8 for O_2, using Equation (5) at their respective angular directions Θ_n. These forces are then summed vectorially to produce the intermediate repulsive forces F_1 and F_2, which represent the overall repulsive effects of obstacles O_1 and O_2, respectively. These intermediate forces are used solely to clarify how the total force is generated from multiple angular segments; they are not used directly in the collision avoidance algorithm.
The repulsive force model transforms each detected distance value into a virtual force vector that influences drone movement. The model uses a Gaussian decay function, which smoothly increases the force magnitude as the distance to the obstacle decreases.
For each detected point D_n at an angle Θ_n, the corresponding repulsive force f_n is computed as
$$\mathbf{f}_n = K \exp\!\left(-\frac{D_n^2}{2\sigma^2}\right)\begin{bmatrix}\cos(\Theta_n + \pi)\\ \sin(\Theta_n + \pi)\end{bmatrix} \quad (5)$$
where D_n is the minimum distance in the n-th angular segment of the 2D point cloud (the minimum value of the corresponding column of distance data), f_n is the virtual repulsive force due to distance D_n, Θ_n is the direction of the angular segment in which D_n is measured (the direction of the detected obstacle), K is a tuning gain constant that controls the maximum repulsion strength, and σ is the standard deviation of the Gaussian function, which controls the decay width of the repulsive force. A larger σ results in a broader influence range (a larger effective threshold), allowing the drone to respond to obstacles that are farther away with a gradual decrease in force. In contrast, a smaller σ produces a sharper, quicker drop-off, so the repulsive force acts strongly only when objects are in close proximity. This tighter response makes the system more responsive in constrained environments and helps prevent sudden or abrupt changes in control behavior. The offset of π reverses the direction of the virtual repulsive force so that it points directly away from the obstacle.
This Gaussian model offers a smooth and continuously differentiable repulsive force, which is beneficial for generating stable avoidance maneuvers. It also prevents overreaction to distant objects, reducing unnecessary control input and enabling the drone to maintain efficient flight paths in cluttered environments. This approach enables a flexible, responsive, and computationally efficient collision avoidance behavior suitable for indoor inspection tasks. The magnitude of the virtual repulsive force is scaled and mapped to the SBUS control range (Section 3.2) using the Gaussian model. The scaling constant K in the Gaussian equation defines the maximum repulsive influence. A higher value of K results in a stronger correction applied to the user command, enabling the drone to reverse or significantly adjust its path when an obstacle is very close. This parameter is critical in tuning the sensitivity of the avoidance system.
This model offers several advantages for real-time drone navigation. First, it provides a smooth and continuous force profile, ensuring stable control responses without abrupt fluctuations. The function produces a maximum repulsive force when the obstacle is very close (D_n → 0) and rapidly decays as the distance increases, which helps the drone prioritize immediate threats while ignoring distant, non-critical objects.
As shown in Figure 15, the detection space is categorized into three zones based on the distance to the obstacle and the corresponding repulsive force. In the first zone, when the obstacle is approximately 200 cm to 400 cm away (green area in Figure 15), the obstacle is detected but the generated repulsive force is negligible or zero, resulting in no modification to the control command. This prevents unnecessary drone reactions to distant objects. In the second zone, when the distance is less than 200 cm, a corresponding repulsive force is generated, and the control command is modified accordingly. As the obstacle approaches, the magnitude of the repulsive force increases, prompting the drone to slow down or reverse (yellow area in Figure 15). In the third zone, when the obstacle is very close, at approximately 70 cm or less, a strong repulsive force is generated, causing the drone to move quickly in the opposite direction to avoid collision (red area in Figure 15). The blue solid line represents the generated repulsive force corresponding to each distance across the defined zones, illustrating how the force varies within each region. Additionally, the vertical red and green dashed lines indicate the conceptual boundaries between adjacent zones. In this research, the values of σ and K were set to 75 and 1050, respectively.
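A minimal C++ sketch of Equation (5) with these parameter values is shown below; the unit conventions (distances in cm, force expressed directly in SBUS units) are assumptions made for illustration.
```cpp
#include <cmath>
#include <cstdio>

// Gaussian-based virtual repulsive force for one angular segment (Equation (5)).
// D_n is the closest obstacle distance in that segment (cm) and Theta_n the
// segment direction (rad); the force points in the opposite direction
// (Theta_n + pi). K and sigma follow the values reported above.
struct Force2D { float fx, fy; };

constexpr float K_GAIN = 1050.0f;   // maximum repulsion strength
constexpr float SIGMA  = 75.0f;     // decay width of the Gaussian (cm)
constexpr float PI_F   = 3.14159265f;

Force2D repulsiveForce(float distanceCm, float thetaRad) {
    float magnitude = K_GAIN * std::exp(-(distanceCm * distanceCm) / (2.0f * SIGMA * SIGMA));
    Force2D f;
    f.fx = magnitude * std::cos(thetaRad + PI_F);
    f.fy = magnitude * std::sin(thetaRad + PI_F);
    return f;
}

int main() {
    // Obstacle straight ahead (theta = 0) at 70 cm: the force points backwards.
    Force2D f = repulsiveForce(70.0f, 0.0f);
    std::printf("fx = %.1f, fy = %.1f\n", f.fx, f.fy);   // roughly fx = -680, fy = 0
    return 0;
}
```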
To ensure effective operation in confined indoor environments, the drone was flown at deliberately low speeds, aligned with the system’s processing capacity. Although the actual flight speed was not measured, slow movement allowed sufficient time for object detection, force computation, and control adjustment. This strategy contributed to stable and safe navigation in cluttered environments.

4.3. Motion Adjustment via SBUS Control Signal Modification

The algorithm modifies the drone’s motion control signals in real time to avoid detected obstacles. The user’s command is interpreted as a virtual motion vector. When an obstacle is detected, the system generates a repulsive force vector according to the sensor data and adjusts the control signal accordingly.
Repulsive Force Calculation: After obstacle detection, the system computes repulsive forces from each angular segment where a nearby obstacle is identified. Each segment corresponds to a specific direction around the drone, and the closest obstacle distance in that direction is denoted as D_n, where n is the index of the angular segment.
Using the Gaussian-based model introduced in Section 4.2, a virtual repulsive force f_n is generated for each angular segment in which an obstacle is detected. This repulsive force vector points directly away from the obstacle’s direction, and its magnitude is determined by Equation (5) with the corresponding distance D_n. The closer the obstacle (i.e., the smaller D_n), the stronger the repulsive force. Once all individual repulsive forces f_n are computed for the relevant angular segments, the system calculates the total repulsive force F_t as the vector mean (average) of these forces:
$$\mathbf{F}_t = \frac{1}{N}\sum_{n=1}^{N} \mathbf{f}_n \quad (6)$$
where f_n is the repulsive force vector generated from the n-th angular segment, N is the number of segments with valid obstacle detection (i.e., those within the safety threshold), and F_t is the average repulsive force vector used to adjust the drone’s control command.
Control Signal Modification:
The final modified control signal U_mod is computed as the vector sum of the original user command U_c and the total repulsive force F_t:
$$\mathbf{U}_{mod} = \mathbf{U}_c + \mathbf{F}_t \quad (7)$$
where U_mod is the modified control signal, U_c is the current control signal from the user, and F_t is the virtual repulsive force generated due to the detected obstacle, as illustrated graphically in Figure 16.
The drone’s control signal is continuously adjusted based on the computed repulsive force F_t. If there are no obstacles that generate a significant repulsive force, or no obstacles within the threshold (F_t = 0), the drone follows the original user command without modification. When obstacles are detected, the magnitude and direction of F_t are used to adjust the drone’s flight direction and speed. The proposed algorithm was implemented as a program and embedded in the microcontroller. Its pseudo-code is given in Table 4.
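For reference, a condensed C++ sketch of one iteration of this loop is given below (cf. the pseudo-code in Table 4). The segment count, the mapping of the force components onto the pitch and roll channels, and the clamping to the SBUS range from Section 3.2 are assumptions made for illustration rather than the exact embedded implementation.
```cpp
#include <cmath>

// One iteration of the collision avoidance loop: accumulate Gaussian repulsive
// forces from all angular segments with an obstacle inside the safety zone,
// average them (F_t), add F_t to the user command (U_mod = U_c + F_t), and
// clamp the result to the safe SBUS range used in this study.
constexpr int   NUM_SEGMENTS = 64;       // 8 sensors x 8 columns (8 x 8 mode)
constexpr float R_TH         = 200.0f;   // safety threshold (cm)
constexpr float K_GAIN       = 1050.0f;  // maximum repulsion strength (SBUS units)
constexpr float SIGMA        = 75.0f;    // Gaussian decay width (cm)
constexpr float PI_F         = 3.14159265f;
constexpr int   SBUS_MIN     = 687;
constexpr int   SBUS_MAX     = 1344;

int clampSbus(float v) {
    if (v < SBUS_MIN) return SBUS_MIN;
    if (v > SBUS_MAX) return SBUS_MAX;
    return static_cast<int>(v);
}

void collisionAvoidanceStep(const float segmentDistance[NUM_SEGMENTS],  // D_n per segment (cm)
                            const float segmentAngle[NUM_SEGMENTS],     // Theta_n per segment (rad)
                            int userPitch, int userRoll,                // raw SBUS from the receiver
                            int& pitchOut, int& rollOut)                // modified SBUS to the FC
{
    float fx = 0.0f, fy = 0.0f;
    int n = 0;
    for (int i = 0; i < NUM_SEGMENTS; ++i) {
        float d = segmentDistance[i];
        if (d < R_TH) {   // obstacle inside the safety zone
            float mag = K_GAIN * std::exp(-(d * d) / (2.0f * SIGMA * SIGMA));
            fx += mag * std::cos(segmentAngle[i] + PI_F);
            fy += mag * std::sin(segmentAngle[i] + PI_F);
            ++n;
        }
    }
    if (n > 0) { fx /= n; fy /= n; }   // vector mean F_t (zero if no obstacle)

    // U_mod = U_c + F_t, with the x component mapped to pitch (forward/backward)
    // and the y component to roll (left/right), then clamped to the safe range.
    pitchOut = clampSbus(static_cast<float>(userPitch) + fx);
    rollOut  = clampSbus(static_cast<float>(userRoll)  + fy);
}
```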
To demonstrate how user command is modified in response to obstacle detection, Figure 17 and Figure 18 present a representative experimental case. Figure 17 shows the physical setup, where the drone is placed on a movable cart and slowly pushed toward an obstacle located in front. Although the drone is not in flight, the sensor module actively detects the surrounding environment in the same manner as during actual flight. As the drone approaches the object, the onboard sensors detect decreasing distance values in the forward direction.
Figure 18 provides a set of corresponding graphs that visualize how the system processes this sensor feedback and modifies the control signals:
  • Figure 18a plots the measured distance between the drone and the obstacle, which decreases as the drone moves forward.
  • Figure 18b displays the resulting repulsive force generated by the system using the Gaussian model, which increases as the obstacle comes closer.
  • Figure 18c compares the original user command (blue line), representing continuous forward motion, with the modified control command (green line), which increases in magnitude as the repulsive force increases.
  • Figure 18d summarizes all three signals: distance, repulsive force, and control commands on a single graph for comprehensive comparison.
This example highlights how the system selectively intervenes to override manual input when necessary. Initially, when no obstacle is present, the modified command follows the user input. As the obstacle enters the effective zone range, the computed repulsive force causes the control signal to deviate smoothly, ensuring collision avoidance without abrupt maneuvers. For clarity, the graphs were generated using data from a single angular segment.

5. Experimental Results

To evaluate the effectiveness of the proposed real-time collision avoidance system, a series of indoor flight tests were conducted using the custom-built quadcopter described in Section 3. The experiments aimed to verify the drone’s ability to detect nearby obstacles, generate repulsive forces, and dynamically modify control signals to avoid collisions during teleoperated flight.

5.1. Test Setup and Environment

The drone was flown manually using a Futaba T10J remote controller inside an indoor laboratory environment. The test area was approximately 4 m wide and 6 m long, with several obstacles including boxes, walls, and furniture placed at different positions to simulate a cluttered inspection environment.
The sensor module was mounted on top of the drone, providing full 360-degree horizontal coverage. All eight ToF distance sensors were calibrated before the flight to ensure consistent range accuracy. The microcontroller handled sensor communication and SBUS signal modification in real time.
During the initial verification stage, the drone was mounted on a movable cart and manually pushed around the experimental space toward various types of obstacles, simulating real flight conditions. Although not airborne, the sensor module operated exactly as it would during flight, continuously detecting the surrounding environment. In each test, the drone was moved toward obstacles from different directions (front, side, and diagonal), while sensor data, SBUS control signals, and generated repulsive forces were processed and analyzed in real time.
The sensor module continuously monitored the drone’s horizontal surroundings and transmitted real-time data to the embedded microcontroller; the experimental setup is illustrated in Figure 19a. The 3D point cloud data shown in Figure 19b represent the drone’s environment, where the green points indicate that no obstacles were detected within the detection range, while the gray points represent detected obstacles that are outside the critical range and thus not considered threats by the system. This 3D point cloud was then converted into the 2D horizontal data shown as the blue points in Figure 19c. The conversion was achieved by selecting the nearest point in each angular segment (azimuth slice) of the 3D point cloud. Each segment represents a fixed directional slice of the drone’s surrounding horizontal plane. The closest distance reading within each segment is projected onto the 2D plane and used as a representative value for that direction. This simplification significantly reduces the computational load while preserving essential obstacle proximity information for the collision avoidance algorithm.

5.2. Collision Avoidance Performance

The drone successfully detected obstacles within the effective sensing range of the ToF sensors (up to 360 cm in 4 × 4 mode) and responded appropriately by adjusting its motion commands. The repulsive force-based control adjustment allowed the drone to avoid collisions without sudden or unstable movement.
Figure 20, Figure 21, Figure 22 and Figure 23 show obstacle detection and repulsive force generation from experiments involving obstacles located in the front, left, back, and right directions, respectively. Each figure includes the actual experimental scenario showing the drone and obstacle arrangement, a 2D RViz visualization [35] illustrating obstacle representation (blue points), and the corresponding virtual repulsive force (red arrow) generated in each case. These visualizations demonstrate how the drone’s behavior dynamically adjusts in response to obstacle proximity and direction.
(A) Front direction case: As the drone approaches an obstacle in the front direction, as shown in Figure 20a, the obstacle is represented by the 2D representative point cloud, as shown in Figure 20b; an opposite force towards the rear of the drone is then virtually generated by modifying the control command. A model of this force is shown in Figure 20c.
(B) Left direction case: As the drone approaches an obstacle in the left direction, as shown in Figure 21a, the obstacle is represented by the 2D representative point cloud, as shown in Figure 21b; an opposite force towards the right of the drone is then virtually generated by modifying the control command. A model of this force is shown in Figure 21c.
(C) Rear direction case: When an obstacle (a whiteboard placed on the rear side of the drone) is detected in the rear direction of the drone, the repulsive force points towards the front direction, as illustrated in Figure 22.
(D) Right direction case: Similar to the other directions, when an obstacle is detected in the right direction of the drone, the repulsive force points towards the left of it, as illustrated in Figure 23.
When an obstacle is detected in a diagonal direction outside of the four main directions discussed earlier, the resulting repulsive force is generated in the opposite direction of the detected obstacle. The angle of this force is calculated by reversing the angle of the representative obstacle direction by 180 degrees, while its magnitude is continuously determined based on the measured distance.
A representative experimental scenario for a rear–left obstacle detection case, along with its corresponding 2D RViz visualization, is shown in Figure 24. This scenario serves as a reference for understanding how the system responds to diagonal obstacles. The corresponding repulsive force outputs for other diagonal directions, front–left, front–right, and rear–right, are presented in Figure 25a–d. While the actual experimental setups for these cases are similar to that in Figure 24 (only the direction of the obstacles changed as explained in each case), only the repulsive force vector visualizations are shown in Figure 25 to illustrate the drone’s directional response behavior.
(a) Front–left direction case: When obstacles are detected in both the front and left directions, each generates a repulsive force away from its location. The vector summation of these forces results in a resultant force directed toward the rear–right direction, steering the drone away from the obstacles, as illustrated in Figure 25a.
(b) Front–right direction case: When obstacles are present in the front and right directions, the resulting repulsive force is directed towards the rear–left direction, as shown in Figure 25b. This response ensures the drone moves away from both detected threats.
(c) Rear–left direction case: When obstacles are detected in the rear and left directions, as shown in Figure 24, the corresponding repulsive forces combine to produce a resultant force toward the front–right direction, enabling the drone to avoid both obstacles, as illustrated in Figure 25c.
(d) Rear–right direction case: When obstacles are detected in both the rear and right directions, a resultant force is generated toward the front–left direction, guiding the drone away from the obstacles, as shown in Figure 25d.
Following the generation of virtual repulsive forces tested in various scenarios, the corresponding modification of user commands was also verified through a series of experiments. Figure 26 illustrates two different case scenarios in which the drone detects surrounding obstacles. Figure 26a,c show two distinct experimental setups with various obstacle arrangements surrounding the drone (positioned at the center), while Figure 26b,d display the corresponding 2D RViz visualizations. In both cases, the original user control command (blue arrow) is adjusted by the virtual repulsive force (red arrow), resulting in a modified control command (green arrow) that directs the drone along a safer path. In both scenarios, the initial user commands would direct the drone toward nearby obstacles; however, the system dynamically adjusts the control signal in real time to steer the drone away from potential collisions. This demonstrates the effectiveness of the repulsive force mechanism in modifying the drone’s trajectory for safe navigation, while still maintaining responsiveness to user input. The visualizations clearly illustrate the real-time adaptation of control commands based on obstacle proximity.

5.3. Obstacle Avoidance Flight Experiments

Finally, the experimental drone equipped with the developed system was tested to react to various objects within the experimental laboratory to evaluate its performance in responding to different obstacles. Figure 27 illustrates the experimental drone navigating a complex object setup. As the drone approached a shelf with complex object arrangements (Figure 27a,b), the distance between the drone and the objects decreased below a safety threshold (Figure 27c). In response, the drone began to reduce its speed and moved away from the obstacles (Figure 27d). Ultimately, it maintained a safe distance, preventing collisions by moving away from the obstacles, as shown in Figure 27e,f.
Similarly, Figure 28a–c illustrates a case in which the drone was manually operated within the experimental laboratory environment. During its flight, the drone detected an obstacle, and the collision avoidance system continuously measured the distance and generated a virtual repulsive force.
As the drone approached the obstacle, the repulsive force gradually increased, causing the drone’s speed to reduce to zero. However, due to inertia, the drone continued moving slightly in its previous direction. Once the repulsive force became strong enough to overcome this inertia, it reversed the drone’s motion, prompting it to move away from the obstacle. This dynamic response effectively demonstrated the system’s ability to prevent a potential collision by automatically adjusting the drone’s motion based on real-time distance feedback.

Trajectory Visualization

Although real-time control signal visualization was implemented using RViz (as described in the preceding subsections), the drone system developed in this study does not include an onboard localization or trajectory visualization module. In other words, the drone lacks autonomous position estimation capability (such as SLAM) and cannot internally generate or display its flight path during operation. While the RViz visualization system [35] supports real-time display of the drone’s user commands, repulsive forces, and modified control signals, it requires the embedded microcontroller to be connected to a computer. During flight, no wireless transmitter was used to send data from the drone to a base station for RViz display in this study. Therefore, the RViz demonstrations were conducted by either pushing the drone on a cart or holding it by hand, mimicking realistic flight behavior while maintaining a stable data connection for visualization.
To compensate for this limitation and to enable accurate post-flight analysis, an external motion capture system (OptiTrack) was introduced to track and record the drone’s position and orientation trajectory in real time [36]. The OptiTrack system uses a network of high-speed infrared cameras to detect retroreflective markers attached to tracked objects. To capture precise motion data,
  • Eight reflective markers were mounted on the drone frame in symmetrical positions to accurately measure its position and orientation in 3D space.
  • Additionally, eight to ten reflective markers were attached to each obstacle in every experimental setup. This allowed both drone and obstacle positions to be tracked in the same global coordinate frame.
During each experiment, the drone was manually flown toward static obstacles, and the OptiTrack system recorded the time-synchronized position, heading, and relative movement of both the drone and the obstacles. After each test, the recorded data were exported and visualized to reconstruct the drone’s trajectory and assess how it responded to obstacle detection and avoidance.
To evaluate the system under various conditions, four representative flight experiments were conducted under different obstacle conditions. In each case, the drone was manually piloted toward the obstacle, and its response was recorded and analyzed. Figure 29, Figure 30, Figure 31 and Figure 32 summarize the results of these experiments. In each figure, (a) shows the actual experimental scenario, and (b) presents the drone’s trajectory, captured using the OptiTrack motion capture system. In the trajectory plots with blue and red colors, point A indicates the starting position, point B marks the endpoint, and red arrows denote the direction of drone motion.
Figure 29: The drone was directed toward a flat whiteboard, representing a generic planar obstacle such as a wall; the obstacle is denoted by the light green line. Upon detecting the obstacle, the drone reversed its direction. The trajectory plot (b) shows the initial forward motion toward the wall, followed by a retreat in the opposite direction after obstacle detection.
Figure 30: The drone approached a tool shelf, representing a set of multiple complex obstacles with irregular shapes and varying depths; the obstacle is denoted by the purple lines. After detecting the shelf, the drone adjusted its path and moved away to avoid collision. The trajectory (b) reflects this adaptive behavior, showing a change in direction following detection.
Figure 31: The drone flew toward a box placed along its forward path; the obstacle is denoted by a box with purple dashed lines. Upon detecting the box, it adjusted its trajectory to avoid collision and continued its flight. The corresponding plot (b) demonstrates a smooth directional adjustment after obstacle detection.
Figure 32: The drone was flown between two consecutive obstacles labeled obstacle_1 (light blue) and obstacle_2 (light green). As it approached the first obstacle, it generated a repulsive response, altered its course, and then adjusted again to avoid the second. The resulting zigzag trajectory, shown in (b), confirms the drone’s ability to handle sequential obstacle avoidance using the proposed system.
These visualizations confirm that as soon as an obstacle entered the sensor’s detection range, a virtual repulsive force was generated based on proximity. The drone then deviated smoothly from its original path, validating the system’s ability to perform real-time collision avoidance under various environmental conditions.
This motion tracking approach provided a reliable and objective method for verifying the system’s real-time behavior and the performance of the repulsive force-based control modification. It served as essential ground-truth data, particularly in the absence of an onboard localization system, and confirmed that the drone adjusted its trajectory and heading based on the proposed collision avoidance algorithm.

6. Conclusions

This study presented the design and implementation of a real-time collision avoidance system for indoor drones using multiple Time-of-Flight (ToF) distance sensors. The proposed system enables 360-degree horizontal obstacle detection and automatic flight command modification without fully overriding manual control. A total of eight ToF distance sensors were integrated into a custom-designed sensor module and mounted on the drone to provide full directional coverage.
The core of the system is a Gaussian-based repulsive force model, which transforms the measured distances from each sensor into virtual force vectors. These forces are averaged and used to modify the SBUS control signals in real time, ensuring safe navigation without disrupting operator intent. The drone only overrides user commands when a potential collision is detected, maintaining responsive and intuitive manual control.
The entire system was implemented on a low-cost microcontroller, demonstrating that high-performance real-time obstacle avoidance can be achieved without relying on high computational resources. The algorithm was validated through a series of indoor flight experiments in a cluttered environment. Results showed that the system effectively prevented collisions by modifying control commands proportionally based on obstacle proximity.
To further evaluate the system’s behavior, both real-time visualizations using RViz and trajectory tracking using the OptiTrack motion capture system were conducted. The results confirmed that the drone successfully adjusted its path in response to obstacle detection, validating the effectiveness of the repulsive force model and control modification algorithm. Since the developed system does not include onboard localization or mapping, the OptiTrack system provided essential ground-truth data for analyzing drone motion and verifying performance.
This research demonstrates a practical, low-cost solution for enhancing indoor drone safety during manual operations, especially in environments where GPS is unavailable and sensor-based navigation is required. The modular nature of the system also allows for future upgrades, including integration with onboard mapping, SLAM, or autonomous waypoint navigation systems.
Future enhancements may focus on expanding sensor coverage and optimizing placement to detect smaller and more distant obstacles. Developing sophisticated algorithms, possibly incorporating machine learning and AI techniques, could further improve the drone’s navigation and collision avoidance capabilities [37]. Additionally, refining the positioning of sensors on the drone body can enhance measurement accuracy and system efficiency, reduce blind spots, and ensure consistent performance.

Author Contributions

Conceptualization, J.H.L.; methodology, T.A.G. and J.H.L.; software, T.A.G.; validation, T.A.G. and J.H.L.; formal analysis, T.A.G. and J.H.L.; investigation, T.A.G. and J.H.L.; resources, J.H.L.; data curation, T.A.G.; writing—original draft preparation, T.A.G.; writing—review and editing, J.H.L.; visualization, T.A.G.; supervision, J.H.L.; project administration, J.H.L.; funding acquisition, J.H.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Hammadi, M. A Comprehensive Analysis of Autonomous Drone Technology Across Multiple Sectors, HAL Open Science 2024. Available online: https://hal.science/hal-04412160 (accessed on 18 April 2025).
  2. Telli, K.; Kraa, O.; Himeur, Y.; Ouamane, A.; Boumehraz, M.; Atalla, S.; Mansoor, W. A comprehensive review of recent research trends on unmanned aerial vehicles (UAVs). Systems 2023, 11, 400. [Google Scholar] [CrossRef]
  3. Razafimahazo, E.; De Saqui-Sannes, P.; Vingerhoeds, R.A.; Baron, C.; Soulax, J.; Mège, R. Mastering complexity for indoor inspection drone development. In Proceedings of the 2021 IEEE International Symposium on Systems Engineering (ISSE), Vienna, Austria, 13 September–13 October 2021; pp. 1–8. [Google Scholar] [CrossRef]
  4. Karam, S.; Nex, F.; Chidura, B.T.; Kerle, N. Microdrone-based indoor mapping with Graph SLAM. Drones 2022, 6, 352. [Google Scholar] [CrossRef]
  5. Wang, S.; Wang, L.; He, X.; Cao, Y. A monocular vision collision avoidance method applied to indoor tracking robot. Drones 2021, 5, 105. [Google Scholar] [CrossRef]
  6. Yang, L.; Feng, X.; Zhang, J.; Shu, X. Multi-ray modeling of ultrasonic sensors and application for micro-UAV localization in indoor environments. Sensors 2019, 19, 1770. [Google Scholar] [CrossRef] [PubMed]
  7. Kumar, G.A.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU integrated indoor navigation system for UAVs and its application in real-time pipeline classification. Sensors 2017, 17, 1268. [Google Scholar] [CrossRef] [PubMed]
  8. Kazan, F.A.; Solak, H. Improvement of ultrasonic sensor-based obstacle avoidance system in drones. Int. J. Aeronaut. Astronaut. 2023, 4, 9–35. [Google Scholar] [CrossRef]
  9. Xiong, X.; Yan, Y.; Yang, Y. A survey of indoor UAV collision avoidance research. IEEE Access 2023, 11, 51861–51891. [Google Scholar] [CrossRef]
  10. Park, Y.-J.; Yi, C.-Y. Time-of-flight distance sensor–based construction equipment activity detection method. Sensors 2024, 14, 2859. [Google Scholar] [CrossRef]
  11. Gadde, C.S.; Gadde, M.S.; Mohanty, N.; Sundaram, S. Fast collision avoidance motion in small quadcopter operation in a cluttered environment. arXiv 2021. [Google Scholar] [CrossRef]
  12. Eresen, A.; Imamoǧlu, N.; Efe, M.Ö. Autonomous quadrotor flight with vision-based collision avoidance in virtual environment. Expert Syst. Appl. 2012, 39, 894–905. [Google Scholar] [CrossRef]
  13. Khaleel, M.; Emimi, M.; Alkrash, A. The current opportunities and challenges in drone technology. Int. J. Electr. Eng. Sustain. 2023, 1, 74–89. Available online: https://ijees.org/index.php/ijees/index (accessed on 2 July 2025).
  14. Falanga, D.; Kleber, K.; Scaramuzza, D. Dynamic obstacle avoidance for quadrotors with event cameras. Sci. Robot. 2020, 5, eaaz9712. Available online: https://www.science.org (accessed on 25 April 2025). [CrossRef] [PubMed]
  15. Devos, A.; Ebeid, E.; Manoonpong, P. Development of autonomous drones for adaptive collision avoidance in real-world environments. In Proceedings of the 2018 21st Euromicro Conference on Digital System Design (DSD), Prague, Czech Republic, 29–31 August 2018; pp. 707–710. [Google Scholar]
  16. Tsuji, S.; Kohama, T. Omnidirectional proximity sensor system for drones using optical time-of-flight sensors. IEEJ Trans. Electr. Electron. Eng. 2022, 17, 19–25. [Google Scholar] [CrossRef]
  17. Niculescu, V.; Polonelli, T.; Magno, M.; Benini, L. Nanoslam: Enabling fully onboard SLAM for tiny robots. IEEE Internet Things J. 2024, 11, 13584–13607. [Google Scholar] [CrossRef]
  18. Aldao, E.; Santos, L.M.G.-D.; Michinel, H.; González-Jorge, H. UAV collision avoidance algorithm to navigate in dynamic building environments. Drones 2022, 6, 16. [Google Scholar] [CrossRef]
  19. Gageik, N.; Benz, P.; Montenegro, S. Obstacle detection and collision avoidance for a UAV with complementary low-cost sensors. IEEE Access 2015, 3, 599–609. [Google Scholar] [CrossRef]
  20. Pourjabar, M.; Rusci, M.; Bompani, L.; Lamberti, L.; Niculescu, V.; Palossi, D.; Benini, L. Multi-sensory anti-collision design for autonomous nano-swarm exploration. arXiv 2023, arXiv:2312.13086. [Google Scholar] [CrossRef]
  21. Mattar, R.A.; Kalai, R. Development of a wall-sticking drone for non-destructive ultrasonic and corrosion testing. Drones 2018, 2, 8. [Google Scholar] [CrossRef]
  22. Rezaee, M.R.; Hamid, N.A.W.A.; Hussin, M.; Zukarnain, Z.A. Comprehensive review of drones collision avoidance schemes: Challenges and open issues. IEEE Trans. Intell. Transp. Syst. 2024, 5, 6397–6426. [Google Scholar] [CrossRef]
  23. Yasin, J.N.; Mohamed, S.A.S.; Haghbayan, M.-H.; Heikkonen, J.; Tenhunen, H.; Plosila, J. Unmanned aerial vehicles (UAVs): Collision avoidance systems and approaches. IEEE Access 2020, 8, 105139–105155. [Google Scholar] [CrossRef]
  24. SparkFun Electronics. Product Page: VL53L8CX Distance Sensor. Available online: https://www.sparkfun.com/products/19013 (accessed on 5 November 2024).
  25. STMicroelectronics. VL53L5CX-SATEL: Time-of-Flight 8×8 Multizone Ranging Sensor with Wide Field of View. Available online: https://www.st.com/en/imaging-and-photonics-solutions/vl53l5cx.html (accessed on 1 June 2025).
  26. Seeed Studio. What Is a Time-of-Flight Sensor and how Does It Work? Available online: https://www.seeedstudio.com/blog/2020/01/08/what-is-a-time-of-flight-sensor-and-how-does-a-tof-sensor-work/#sidr-nav (accessed on 5 November 2024).
  27. Pololu. VL53L8CX Carrier Dimensions. Available online: https://www.pololu.com/product/3419 (accessed on 11 May 2025).
  28. STMicroelectronics. VL53L8CX User Manual: UM3109 A Guide for Using the VL53L8CX Time-of-Flight Multizone Sensor. Available online: https://www.st.com (accessed on 11 May 2025).
  29. I2C-Bus Specification and User Manual. Available online: https://github.com/xioTechnologies/Dub-Siren (accessed on 15 December 2024).
  30. Isogai, K.; Inohara, R.; Nakano, H.; Okazaki, H. Modeling and simulation of motion of a quadcopter. In Proceedings of the International Symposium on Nonlinear Theory and its Applications, Yugawara, Japan, 27–30 November 2016. [Google Scholar] [CrossRef]
  31. Holybro. PX4 Development Kit X500 v2. Available online: https://holybro.com/products/px4-development-kit-x500-v2 (accessed on 1 July 2024).
  32. Futaba Corporation. Instruction Manual. Available online: https://www.futabausa.com/ (accessed on 13 June 2024).
  33. Futaba Corporation. S.BUS System, Futaba Official Website. Available online: https://www.rc.futaba.co.jp/support/tips/detail/41 (accessed on 1 July 2025).
  34. Bigazzi, L.; Basso, M.; Boni, E.; Innocenti, G.; Pieraccini, M. A multilevel architecture for autonomous UAVs. Drones 2021, 5, 55. [Google Scholar] [CrossRef]
  35. Open Robotics. RViz Visualization Tool. Available online: https://wiki.ros.org/rviz (accessed on 24 January 2025).
  36. OptiTrack. Motion Capture Systems by OptiTrack. Available online: https://optitrack.com (accessed on 10 June 2025).
  37. Xue, Z.; Gonsalves, T. Vision-based drone obstacle avoidance by deep reinforcement learning. AI 2021, 2, 366–380. [Google Scholar] [CrossRef]
Figure 1. ToF distance sensor’s ranging mechanism.
Figure 2. Zone mapping: (a) 4 × 4 resolution mode provides 16 distance measurements; (b) 8 × 8 resolution mode provides 64 distance measurements.
Figure 3. VL53L8CX ToF distance sensor object detection mechanism.
Figure 4. Sensor performance test: (a) detection of a box positioned near a wall using a single sensor; (b) 3D point cloud representation of the box and wall based on sensor output; (c) raw distance measurement in centimeters.
Figure 5. Top view of the eight-sensor orientation and coordinate system.
Figure 6. Sensor module implementation and coverage characteristics: (a) developed eight-sensor module mounted in a custom 3D-printed octagonal holder, designed to position each sensor at 45-degree intervals for complete horizontal coverage; (b) side-view illustration of the sensor coverage, showing the 360-degree horizontal field of view (FoV) achieved through the sensor arrangement and the individual 45-degree vertical FoV of each sensor.
Figure 7. Wiring diagram of multiple ToF distance sensors connected to the microcontroller.
Figure 8. Coordinate system for mapping detected points from the local to the sensor module coordinate frame.
Figure 9. Drone frame kit used for system development and experimentation.
Figure 10. Experimental drone system (drone equipped with developed sensor module system); (a) front view; (b) side view.
Figure 11. Block diagram of the proposed system configuration and its integration with the drone for real-time collision avoidance.
Figure 12. SBUS channel signal range and corresponding drone speed behavior.
Figure 13. General model of obstacle detection.
Figure 14. Obstacle detection (red points indicate detected obstacles and green points indicate safe space) and collision avoidance algorithm.
Figure 15. Relationship between obstacle distance D_n and the corresponding repulsive force magnitude f_n, modeled using a Gaussian function. The force reaches its maximum value when the obstacle is closest to the drone (i.e., D_n = 0) and decays exponentially as the distance increases, promoting smooth and reactive collision avoidance behavior.
Figure 16. Graphical representation of control signal modification: (a) original user command (U_c) and generated repulsive force (F_t); (b) resultant modified control signal vector (U_mod).
Figure 17. Front direction obstacle detection example.
Figure 18. Control command modification in response to front obstacle detection: (a) As the drone moves forward, it detects an obstacle at decreasing distances, from approximately 400 cm down to 60 cm. (b) Then a virtual repulsive force is computed according to Equation (6). (c) The original user command (blue) indicates continuous forward motion. Initially, the modified control command (green) matches the user’s original command since no obstacle is present. Once the obstacle is detected, the modified control signal is gradually diverted to avoid a collision. (d) Finally, the summary shows the relationship between distance, generated repulsive force, raw user command, and modified control command.
Figure 19. (a) Environment detection; (b) 3D representation; (c) 2D horizontal representation.
Figure 20. Experimental repulsive force response to front-direction obstacle detection: (a) front obstacle detection; (b) 3D representation; (c) backward repulsive force calculated from the object detection.
Figure 21. Experimental repulsive force response to left-direction obstacle detection: (a) drone moving towards the object on its left side; (b) 2D horizontal detection results; (c) repulsive force model towards the right direction.
Figure 22. Rear side object detection: (a) obstacle in the rear direction of the drone; (b) its 2D representation; (c) repulsive force model towards the front direction.
Figure 23. Right side object detection: (a) obstacle in the right direction of the drone; (b) its 2D representation; (c) repulsive force model towards the left direction.
Figure 24. Example of detection of objects in the rear and left sides: (a) back–left sides and (b) its 2D representation.
Figure 25. Summary of resultant repulsive force of different cases and directions: (a) for obstacle in the front and left side, repulsive force towards back–right; (b) for obstacle in the front and right side, repulsive force towards back–left; (c) for obstacle in the back and left side, repulsive force towards front–right; (d) for obstacle in the back and right side, repulsive force towards front–left.
Figure 26. User command modification based on virtual repulsive force for two different obstacle scenarios.
Figure 27. Example of drone reacting to a complex obstacle setup: (a) the drone starts from a safe zone and begins moving toward the nearby shelf; (b) the developed module detects the shelf and identifies it as a potential obstacle, resulting in a speed reduction; (c) upon reaching a critical point, the drone begins to move in the opposite direction of the obstacle; (d) the drone continues its motion under a relatively high repulsive force; (e) it keeps moving in the opposite direction until the safe zone is detected again, although the repulsive force is relatively low at this position, causing the drone to move slowly; (f) after re-entering the safe zone, the drone resumes motion based on the manual command.
Figure 28. Example of flight experiment: (a) drone moving towards the object on its left side; (b) the developed system detects the object as a potential obstacle and orders the drone to move against it; (c) drone moving towards the opposite direction to prevent collision.
Figure 29. Obstacle avoidance test with a flat and smooth surface (white board): (a) real flight scenario with the obstacle; (b) recorded flight trajectory showing avoidance behavior.
Figure 30. Obstacle avoidance test with a complex object (left-side shelf): (a) real flight scenario with the shelf obstacle; (b) recorded flight trajectory showing the drone’s avoidance response.
Figure 31. Obstacle avoidance test with a single major obstacle (right-side box): (a) real flight scenario with the box obstacle; (b) recorded flight trajectory demonstrating avoidance behavior.
Figure 32. Obstacle avoidance test with two consecutive obstacles: (a) real flight scenario showing the obstacle setup; (b) recorded flight trajectory illustrating the drone’s path adjustment in response to both obstacles.
Table 1. Sensor technical specifications [27].
Feature | Detail
Model number | VL53L8CX
Manufacturer | STMicroelectronics
Package | Optical LGA16
Size | 23 × 13 × 1.75 mm
Ranging | 2 cm–400 cm (per zone)
Operating voltage | 3.3 V (AVDD), 1.2/1.8 V (IOVDD)
Power consumption | ~46 nW
Operating temperature | −30 °C to 85 °C
Sample rate | Up to 60 Hz
FoV | 45 × 45 degrees
I2C and SPI interface | I2C (1 MHz), SPI (3 MHz)
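As an illustration of how the zone data summarized above can be interpreted, the following Python sketch maps a single zone measurement from the 8 × 8 grid to a 3D point in the sensor frame by converting the zone’s row/column index into angular offsets across the 45-degree FoV. The function name and the assumption that the reported distance lies along the zone’s central ray are illustrative simplifications, not the sensor’s exact optical model or driver API.

import math

FOV_DEG = 45.0   # square field of view of the sensor
GRID = 8         # 8 x 8 resolution mode

def zone_to_point(row, col, distance_mm):
    """Map one zone measurement to an (x, y, z) point in the sensor frame.

    Assumes x points along the optical axis and that the reported distance
    lies along the zone's central ray (a simplification of the real optics).
    """
    step = math.radians(FOV_DEG / GRID)      # angular width of one zone
    az = (col - (GRID - 1) / 2) * step       # horizontal offset from the optical axis
    el = (row - (GRID - 1) / 2) * step       # vertical offset from the optical axis
    x = distance_mm * math.cos(el) * math.cos(az)
    y = distance_mm * math.cos(el) * math.sin(az)
    z = distance_mm * math.sin(el)
    return x, y, z

# Example: a near-central zone (row 3, column 4) reporting 1500 mm
print(zone_to_point(3, 4, 1500.0))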
Table 2. PX4 development Kit-X500 v2 drone specifications [31].
Feature | Detail
Frame type | X-type carbon fiber frame
Wheelbase | 500 mm diagonal motor-to-motor
Frame weight | ~610 g
Propeller size | 10 × 4.5 inch (plastic)
Motors | 4 × 2216/880 KV brushless motors
ESC | 4 × 20A ESC (BLHeli-S firmware, DShot capable)
Flight controller | Pixhawk 6C with PX4 open-source firmware
Power distribution board | Integrated (Holybro PM06 V2.1)
Battery recommendation | 4S LiPo (14.8 V), typically 3000–5000 mAh
Payload capacity | Up to 1 kg (recommended for stable flight)
Table 3. SBUS frame structure.
Byte | Data | Description
0 | Start byte (0x0F) | Indicates the beginning of an SBUS frame
1–22 | Channel data (22 bytes) | Encodes 16 channels (11 bits each)
23 | Flag byte | Failsafe/frame lost
24 | End byte (0x00) | Indicates the end of the SBUS frame
Each channel value is transmitted as an 11-bit integer; on this system the usable range is approximately 360 to 1673, and the midpoint value, 1016, typically corresponds to a neutral or stationary control input.
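For reference, the sketch below shows one common way of unpacking the sixteen 11-bit channel values packed into bytes 1–22 of a 25-byte SBUS frame. It follows the widely documented SBUS bit layout rather than the firmware used in this study, and the function name is an illustrative placeholder.

def decode_sbus_channels(frame: bytes) -> list:
    """Unpack 16 x 11-bit channel values from a 25-byte SBUS frame."""
    if len(frame) != 25 or frame[0] != 0x0F:
        raise ValueError("not a valid SBUS frame")
    channels, buffer, bits = [], 0, 0
    for byte in frame[1:23]:                 # 22 bytes of packed channel data
        buffer |= byte << bits
        bits += 8
        while bits >= 11 and len(channels) < 16:
            channels.append(buffer & 0x7FF)  # 11-bit value in the range 0-2047
            buffer >>= 11
            bits -= 11
    return channels

On the transmitter described here, a neutral stick would then appear near the midpoint value of about 1016 noted above.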
Table 4. Pseudo-code of the proposed collision avoidance algorithm.
Pseudo-Code for Object Detection and Collision Avoidance Algorithm:
Initialize sensorArray [sensor1, sensor2, …, sensor8]
loop  # repeats periodically during flight
    # Get the original user control signal
    U_c = get_user_control_signal()
    for each sensor in sensorArray do:
        raw_data = sensor.get_distance_data()
        # Filter and take the minimum distance within the sensor's angular segment
        D_n = minimum of angular-segment distance data (raw_data)
        # Get the obstacle direction
        Θ_n = sensor.get_direction()
        # Compute the repulsive force using the Gaussian model
        f_n = K · exp(−D_n² / (2σ²)) · [cos(Θ_n + π), sin(Θ_n + π)]ᵀ
    end for
    # Calculate the total (averaged) repulsive force vector
    F_t = (1/N) · Σ_{n=1}^{N} f_n
    # Modify the control signal to adjust flight direction and speed
    U_mod = U_c + F_t
    apply_control_signal(U_mod)
end loop
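A minimal Python sketch of the loop in Table 4 is given below, assuming the ranging data have already been reduced to one minimum distance per sensor. The gain K, the spread σ, and the interface functions are hypothetical placeholders rather than the tuned parameters or firmware interfaces of the implemented system.

import math

K = 200.0     # hypothetical repulsive gain (in control-signal units)
SIGMA = 80.0  # hypothetical spread of the Gaussian decay, in cm
SENSOR_ANGLES = [n * math.pi / 4 for n in range(8)]  # eight sensors at 45-degree steps

def total_repulsive_force(min_distances_cm):
    """Average the Gaussian repulsive forces contributed by all eight sensors."""
    fx, fy = 0.0, 0.0
    for d, theta in zip(min_distances_cm, SENSOR_ANGLES):
        magnitude = K * math.exp(-(d ** 2) / (2 * SIGMA ** 2))
        # The force points away from the obstacle, opposite to the sensor direction
        fx += magnitude * math.cos(theta + math.pi)
        fy += magnitude * math.sin(theta + math.pi)
    n = len(min_distances_cm)
    return fx / n, fy / n

def modified_command(user_cmd, min_distances_cm):
    """Add the averaged repulsive force vector to the operator's (x, y) command."""
    fx, fy = total_repulsive_force(min_distances_cm)
    return user_cmd[0] + fx, user_cmd[1] + fy

# Example: an obstacle 60 cm ahead (sensor 0), all other directions clear (400 cm).
# A pure forward command is pushed back toward the safe direction.
print(modified_command((100.0, 0.0), [60.0] + [400.0] * 7))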
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
