Review

Review on Type of Sensors and Detection Method of Anti-Collision System of Unmanned Aerial Vehicle

by Navaneetha Krishna Chandran 1, Mohammed Thariq Hameed Sultan 1,2,3,*, Andrzej Łukaszewicz 4,*, Farah Syazwani Shahar 2, Andriy Holovatyy 5 and Wojciech Giernacki 6

1 Laboratory of Biocomposite Technology, Institute of Tropical Forestry and Forest Products (INTROP), University Putra Malaysia, Serdang 43400, Selangor Darul Ehsan, Malaysia
2 Department of Aerospace Engineering, Faculty of Engineering, University Putra Malaysia, Serdang 43400, Selangor Darul Ehsan, Malaysia
3 Aerospace Malaysia Innovation Centre (944751-A), Prime Minister’s Department, MIGHT Partnership Hub, Jalan Impact, Cyberjaya 63000, Selangor Darul Ehsan, Malaysia
4 Institute of Mechanical Engineering, Faculty of Mechanical Engineering, Bialystok University of Technology, 15-351 Bialystok, Poland
5 Department of Computer-Aided Design Systems, Lviv Polytechnic National University, 79013 Lviv, Ukraine
6 Institute of Robotics and Machine Intelligence, Faculty of Control, Robotics and Electrical Engineering, Poznan University of Technology, 60-965 Poznan, Poland
* Authors to whom correspondence should be addressed.
Sensors 2023, 23(15), 6810; https://doi.org/10.3390/s23156810
Submission received: 28 June 2023 / Revised: 18 July 2023 / Accepted: 27 July 2023 / Published: 30 July 2023
(This article belongs to the Special Issue New Methods and Applications for UAVs)

Abstract

Unmanned aerial vehicle (UAV) usage is increasing drastically worldwide, as UAVs are used across industries for applications such as inspection, logistics, and agriculture. Performing a task with a UAV makes the job more efficient and reduces the workload required. However, whether a UAV is operated manually or autonomously, it must be equipped with proper safety features. An anti-collision system is one of the most crucial and fundamental of these: it allows the UAV to maintain a safe distance from any obstacle, and anti-collision technologies are therefore essential to the survival and safety of UAVs. UAV anti-collision systems vary in the sensors they use and in their working principles. This article provides a comprehensive overview of anti-collision technologies for UAVs. It also presents drone safety laws and regulations that prevent collisions at the policy level. Anti-collision technology is studied from three aspects: obstacle detection, collision prediction, and collision avoidance. A detailed overview and comparison of the methods for each element, together with an analysis of their advantages and disadvantages, is provided. In addition, future trends in UAV anti-collision technology, from the viewpoints of fast obstacle detection and wireless networking, are presented.

1. Introduction

General Visual Inspection (GVI) is a typical approach for quality control, data collection, and analysis. It involves using basic human senses such as vision, hearing, touch, smell, and non-specialized inspection equipment. Unmanned aerial systems (UAS), also known as UAVs, are being developed for automated visual inspection and monitoring in various industrial applications [1]. These systems consist of UAVs outfitted with the appropriate payload and sensors for the job at hand [2].
Reliance on sensors and measurements is crucial for UAV operations and functionality, as they are indispensable to the safety and security of UAVs. Since UAVs operate autonomously without a pilot’s input, a series of sensors and systems is required for the UAVs to position themselves. Usually, UAVs use a global positioning system (GPS) for positioning. However, GPS input is not always accurate, especially at low altitudes, so the UAV also has to be equipped with sensors such as rangefinders, which are very useful when the UAV flies close to the ground. The investigation of the quadcopter control problem was at a standstill until relatively recently, since controlling four separate motor-based propulsion systems was nearly impossible without modern electronic equipment. These technologies have become increasingly sophisticated, versatile, fast, and affordable over the past several decades.
Due to the intricacy of the issue, controlling a quadcopter is a topic that is both intriguing and important. The fact that the system has just four inputs (the angular velocities of the propellers) despite having six degrees of freedom (three rotational axes and three translational axes) makes the system under-actuated [3]. Even designs with more than four inputs have the same axes to manipulate and remain under-actuated, because those inputs can only directly control the three rotational axes, not the translational ones [4].
Additionally, the dynamics of this form of UAV provide freedom of movement and robustness against propulsion problems, which makes it ideal for reconnaissance missions. As an illustration, control algorithms may be programmed so that a UAV keeps its stability even if fifty percent of the propellers controlling one axis of rotation stop working correctly. On the other hand, since it is an airborne vehicle, chassis friction is almost non-existent, and the control algorithm is responsible for providing the damping.
A UAV’s level of autonomy is defined by its ability to perform a set of activities without direct human intervention [5]. Different kinds of onboard sensors allow unmanned vehicles to make autonomous decisions in real time [6,7,8]. Demand for unmanned vehicles is rising fast because of the minimal danger to human life, enhanced durability for more extended missions, and accessibility in challenging terrains. Still, one of the most difficult problems to address is planning their course in unpredictable situations [9,10,11]. The necessity for an onboard system to prevent accidents with objects and other vehicles is apparent, given their autonomy and the distances they may travel from base stations or their operators [12,13].
Whether a vehicle is autonomous or not, it must include a collision avoidance system. Potential causes of collisions include operator/driver error, machinery failure, and adverse environmental factors. According to statistics provided by planecrashinfo.com, over 58% of fatal aviation crashes between January 1960 and December 2015 were due to human error [14]. To reduce the need for human input, the autopilot may be upgraded with features like object recognition, collision avoidance, and route planning. Intelligent autonomous collision avoidance methods have the potential to make aircraft even safer and to save lives.
The exponential growth of UAV use in public spaces has made the necessity of sophisticated and highly dependable collision avoidance systems evident and incontestable from the public safety perspective. UAVs can access risky or inaccessible locations without endangering human lives; therefore, UAVs should be built to operate independently and avoid crashing into anything while in flight [15]. Precision agriculture is one UAV application that has been growing rapidly worldwide, both in commercial products and in research and development. In order to correctly account for the spatial and temporal variations of crop and soil components, this revolutionary trend is redefining the crop management system and placing a higher focus on data collection and analysis, whether in real time or offline.
Figure 1 shows the basic architecture of an anti-collision system as implemented in a vehicle. Anti-collision systems consist of two major parts: input and output [15]. These parts can also be understood as perception and action. Any system designed to prevent accidents must begin with perception, or more specifically, obstacle detection [16]. At this stage, sensors gather information about the surrounding area and locate any hazards. The action part follows perception: once a threat has been detected, the situation is analyzed by the UAV’s control system, and the actuators implement the appropriate countermeasures to avoid the hazard [17].
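To make this perception–action split concrete, the following minimal Python sketch wires the stages of Figure 1 into a single loop. All names, thresholds, and the avoidance command are illustrative assumptions rather than details of any particular system reviewed here.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float   # range returned by the perception unit
    bearing_deg: float  # direction relative to the UAV heading

SAFE_DISTANCE_M = 5.0  # assumed safety margin

def perceive(raw_readings):
    """Perception stage: turn raw (range, bearing) pairs into obstacles."""
    return [Obstacle(distance_m=r, bearing_deg=b) for r, b in raw_readings]

def decide(obstacles):
    """Computation stage: flag the closest obstacle inside the margin."""
    threats = [o for o in obstacles if o.distance_m < SAFE_DISTANCE_M]
    return min(threats, key=lambda o: o.distance_m) if threats else None

def act(threat):
    """Action stage: a placeholder countermeasure for the actuators."""
    if threat is None:
        return "hold course"
    # Turn away from the threat's bearing; a real controller would replan.
    return f"yaw {'right' if threat.bearing_deg < 0 else 'left'}, climb"

readings = [(12.0, 30.0), (3.2, -10.0)]  # (range m, bearing deg)
print(act(decide(perceive(readings))))   # -> yaw right, climb
```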
Sensors come in a wide variety, but they may be broken down into two broad categories: active and passive. An active sensor carries its own source: it sends out a beam of light or a wave and measures the backscatter. Passive sensors, on the other hand, can only estimate the energy emitted by an object, such as sunlight reflected off it. Anti-collision systems use four different approaches to detecting hazards: geometric (using the UAV’s and obstacles’ positions and velocities to reformat nodes, typically via trajectory simulation), force-field (manipulating attractive and repulsive forces to avoid collisions), optimized (using the known parameters of obstacles to find the most efficient route), and sense-and-avoid (making avoidance decisions at runtime based on sensing the environment) [18,19].
The complexity of collision avoidance systems varies from simply alerting the vehicle’s pilot to wholly or partly taking control of the vehicle to prevent the accident [20]. For an unmanned vehicle to travel without direct human intervention, it must be equipped with several specialized systems that identify obstacles, prevent collisions, plan routes, determine its exact location, and implement the necessary controls [21]. Multiple UAVs provide substantial benefits over single UAVs. They are in high demand for a wide range of applications, including military and commercial usage, search and rescue, traffic monitoring, threat detection (particularly near borders), and atmospheric research [22,23,24]. UAVs may struggle to complete missions in demanding dynamic environments due to cargo restrictions, power constraints, poor visibility in bad weather, and difficulties in remote monitoring. To ensure unmanned vehicles’ success and safe navigation, the robotics community is working tirelessly to overcome these difficulties and deliver a technical level fit for challenging settings [25,26,27,28].
One of the most challenging problems for autonomous vehicles is detecting and avoiding collisions with objects, which becomes much more critical in dynamic situations with several UAVs and moving obstacles [29]. Sensing is the initial process, in which the system takes in data from its immediate environment. When an obstacle enters the system’s field of view, the detection stage performs a risk assessment. To prevent a possible collision, the collision avoidance module calculates how much of a detour has to be made from the original route. Once the system has completed its calculations, it executes the appropriate maneuver to escape the danger safely.

2. Obstacle Detection Sensors

The drone needs a perception model of its environment to avoid crashing into obstacles [30,31]. To build one, the UAV must have a perception unit consisting of one or more sensors [32]. Sensors, such as imaging sensors of varying resolutions, are crucial components of remote sensing systems and may be used in a wide variety of contexts. LiDAR, visible cameras, thermal or infrared cameras, and solid-state or mechanical devices are all examples of sensors that may be used for monitoring [27,33]. The sensors used for anti-collision systems fall into two major categories: active sensors and passive sensors. Figure 2 shows the categorization of anti-collision system sensors.

2.1. Active Sensors

Sensing with active sensors involves emitting radiation and then detecting the reflected radiation. All the necessary components, including the source and the detector, are built into an active sensor: a transmitter sends out a signal (light, electricity, sound) that is then reflected off whatever the sensor is being used to detect [34,35]. Most of these sensors operate in the microwave range of the spectrum, allowing them to penetrate the atmosphere under most circumstances. Such sensors can adequately return the metrics of interest for obstacles, such as distances and angles, since they have a short reaction time, need less processing power, can scan significant regions quickly, and are less affected by weather and lighting conditions. In [36], the authors use millimeter-wave (MMW) radar: objects are detected and tracked by observing radar echoes and estimating how far away they are from the vehicle, and performance is evaluated over different distances and weather conditions. Despite the allure, radar-based solutions are either too costly or too heavy to be practical on smaller robots, such as battery-powered UAVs [37,38].

2.1.1. Radar

A radar sensor transmits a radio wave that is reflected back to the sensor after hitting an object. The distance between the object and the radar is determined by timing how long the signal takes to return. Despite their high cost, airborne radar systems are often used for the precision of the data they provide. Both continuous-wave and pulsed-wave radars exist, with the former emitting a steady stream of linearly modulated (frequency-modulated) signals and the latter emitting intense but brief bursts; both types have blind spots [39]. As a bonus, radars can also track objects’ speeds and other motion data. For instance, a radar may determine an object’s velocity by measuring how much the frequency of its echo changes as the object approaches the radar [40].
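The two radar measurements described above reduce to simple formulas: range follows from the echo’s round-trip time, and radial speed from the Doppler shift. The sketch below illustrates both; the carrier frequency and timing values are arbitrary examples.

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_time_s):
    """Range = c * t / 2: the pulse travels out and back."""
    return C * round_trip_time_s / 2.0

def radial_speed(f_transmit_hz, f_echo_hz):
    """Doppler relation for a monostatic radar: v = c * df / (2 * f_tx)."""
    return C * (f_echo_hz - f_transmit_hz) / (2.0 * f_transmit_hz)

print(radar_range(1e-6))                      # ~150 m for a 1 us round trip
print(radial_speed(24.0e9, 24.0e9 + 3200.0))  # ~20 m/s closing speed
```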
Using a compact radar, the authors of [40] could get range data in real time, regardless of the weather. The system incorporates a compact radar sensor and an OCAS (obstacle collision avoidance system) computer. OCAS utilizes radar data such as obstacle velocity, azimuth angles, and range to determine avoidance criteria and provide orders to the flight controller to execute the appropriate maneuver to prevent collisions. The findings indicated that with the set safety margins, the likelihood of successfully avoiding a crash is more than 85%, even if there is an inaccuracy in the radar data.
The benefits of integrating radar sensors into UAVs, both for obstacle identification and for detecting and calculating additional aspects of an observed obstruction, such as its velocity and angular information from multichannel radars, are thoroughly explored by the authors of [41]. Experiments reveal that forward-looking radars with simultaneous multi-target ranging can identify targets across an extensive angular range of 60 degrees in azimuth. For their suggested autonomous collision avoidance system, the authors used Ultra-Wideband (UWB) collocated MIMO radar. A significant advantage of radar cognition is its capacity to adapt the waveform of ultra-wideband multiple-input multiple-output radar transmissions for better detection and, by extension, to steer the UAV by providing an estimate of the collision locations.

2.1.2. LiDAR

The operation of a light detection and ranging (LiDAR) sensor is comparable to that of a radar. One half of a LiDAR sensor fires laser pulses at the surface(s), while the other half detects their reflections and calculates distance from how long each pulse takes to return. LiDAR achieves rapid and precise data collection. LiDAR sensors have shrunk in size and weight over the years, making it possible to mount them on mini and small UAVs [42,43]. LiDAR-based systems are more cost-effective than radar systems, particularly those using 1D and 2D LiDAR sensors.
The authors of [44] successfully field-tested their system using a variety of laser scanners (laser radars ranging in three dimensions) installed on a vehicle. For 3D mapping and 3D obstacle detection, 3D LiDARs are the de facto standard [45,46], and only 3D LiDARs allow precise assessment of an object’s pose. However, since the LiDAR is constantly moving while ranging, the gathered data are prone to motion distortion, which makes these devices challenging to use. To get around this, as proposed by the authors of [45], additional sensors may be combined with the LiDAR.
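As a simple illustration of how a 2D LiDAR scan feeds an anti-collision system, the sketch below converts (angle, range) beams to Cartesian points and picks the nearest return inside a safety radius. The scan format and threshold are assumptions for demonstration only.

```python
import math

def scan_to_points(scan):
    """Convert (angle_rad, range_m) beams to x-y points in the sensor frame."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan if r > 0.0]

def nearest_obstacle(scan, max_safe_m=4.0):
    """Return the (angle, range) of the closest return inside the safety radius."""
    hits = [(a, r) for a, r in scan if 0.0 < r < max_safe_m]
    return min(hits, key=lambda ar: ar[1]) if hits else None

scan = [(math.radians(d), r) for d, r in [(-30, 9.8), (0, 3.1), (30, 12.4)]]
print(scan_to_points(scan)[1])  # the return dead ahead, as an x-y point
print(nearest_obstacle(scan))   # the 3.1 m return triggers avoidance
```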

2.1.3. Ultrasonic

To determine an object’s distance, ultrasonic sensors transmit sound waves and then analyze the echoes they receive [47]. The sound waves produced are above the range humans can hear (25 to 50 kHz) [48]. Compared to other types of range sensors, ultrasonic sensors are both more affordable and more widely accessible. Unlike LiDARs, which have trouble identifying transparent materials such as glass, ultrasonic sensors are unaffected by an object’s transparency or color. However, an ultrasonic sensor will not provide accurate readings if the object reflects the sound wave away from the receiver or if its material absorbs sound.
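Ultrasonic ranging reduces to the same time-of-flight arithmetic as radar and LiDAR, only with the speed of sound. A minimal sketch, assuming air at roughly 20 °C:

```python
SPEED_OF_SOUND_M_S = 343.0  # air at roughly 20 degrees C

def ultrasonic_distance(echo_time_s):
    """The burst travels to the object and back, so halve the path."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

print(ultrasonic_distance(0.01))  # ~1.7 m for a 10 ms echo
```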
Like radars and LiDARs, this method relies on emitting a wave, waiting for the reflected wave to return, and calculating the distance from the time difference between the two. Since each sensor in Table 1 has advantages and disadvantages relative to the others, more than one sensor may be employed to provide complete protection against collisions. Multiple sensors may be utilized to cover a greater area and eliminate blind spots, or different kinds of sensors can be fused into a super sensor whose components’ weaknesses cancel each other out.
According to Table 1, the LiDAR and ultrasonic sensors that can be used in a UAV’s anti-collision system are smaller than radar. This makes ultrasonic and LiDAR sensors the ideal obstacle-sensing methods for small UAVs, as they weigh less and thus reduce the UAV’s payload. In addition, the power consumption of ultrasonic and LiDAR sensors is low compared to radar. However, radar has the highest accuracy and range of the three, which makes it suitable for large UAVs that fly at high altitudes. Radar is also unaffected by weather conditions, whereas LiDAR is affected and ultrasonic sensors are slightly affected. Finally, the ultrasonic sensor costs the least of the three, making it the most affordable.
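One way to realize the “super sensor” fusion mentioned above is to discard any reading that falls outside its sensor’s trusted envelope and keep the most conservative remaining estimate. The envelopes below are illustrative assumptions, not datasheet values.

```python
# Assumed per-sensor envelopes of trustworthy ranges, in metres.
TRUSTED_RANGE_M = {"ultrasonic": (0.2, 4.0), "lidar": (0.5, 40.0), "radar": (1.0, 200.0)}

def fuse_ranges(readings):
    """readings: sensor name -> measured range in metres (None if no echo)."""
    valid = [r for name, r in readings.items()
             if r is not None and TRUSTED_RANGE_M[name][0] <= r <= TRUSTED_RANGE_M[name][1]]
    return min(valid) if valid else None  # conservative: trust the closest estimate

print(fuse_ranges({"ultrasonic": 3.6, "lidar": 3.8, "radar": None}))  # -> 3.6
```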

2.2. Passive Sensors

Passive sensors measure the energy given off by the observed objects or landscape. Optical cameras, infrared (IR) cameras, and spectrometers are the most common types of passive sensors now used in sensing applications [49]. A wide variety of cameras exists, each optimized for a specific wavelength. The authors of [50] offer a system for acoustic signal tracking and real-time vehicle identification: the result is obtained by isolating resilient spatial characteristics from the noisy input and then processing them with sequential state estimation. They provide empirical acoustic data to back up the suggested technique.
Thermal or infrared cameras operate in the infrared range, at wavelengths longer than visible light. The primary distinction between the two camera types is therefore that visual cameras use visible light to create a picture, while thermal cameras use infrared radiation. Ordinary cameras struggle when light levels are low, while IR cameras thrive [51]. Camera-based sensing takes more computational resources, since an additional algorithm is required to extract points of interest on top of the algorithm needed to calculate the range and other characteristics of the obstacles [52]. Vision cameras are also susceptible to environmental factors, including sunlight, fog, and rain, in addition to the field-of-view restrictions imposed by the sensor being employed [53,54].

2.2.1. Optical

Visual sensors and cameras work by taking pictures of the surrounding world and then extracting information from those pictures. There are three main types of optical cameras: monocular, stereo, and event-based [55,56,57]. Cameras have several advantages, including compact size, light weight, low power consumption, adaptability, and simple mounting. Their drawbacks include sensitivity to lighting and background color changes and a need for clear weather; when any of these conditions is unfavorable, the recorded image’s quality plummets, significantly affecting the final result.
According to [58], a monocular camera may be used to identify obstacles in the path of a ground robot. Coarse obstacle identification in the bottom third of the picture is achieved by an enhanced Inverse Perspective Mapping (IPM) with a vertical plane model; however, this method is only suitable for slow-moving robots. Using stereo cameras is one method proposed by the authors of [59]. In stereo cameras, absolute depth is determined by combining internal and external camera parameters, unlike in monocular cameras. The amount of processing power needed rises when stereo images are used. Because of the high processing cost, and to accommodate highly complex systems with six degrees of freedom such as drones, the authors address this problem by dividing the captured pictures into nine zones.
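For a calibrated stereo pair, absolute depth follows from the classic relation Z = fB/d (focal length in pixels, baseline, disparity). A minimal sketch with illustrative values:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a calibrated stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_m / disparity_px

print(stereo_depth(700.0, 0.12, 14.0))  # -> 6.0 m
```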

2.2.2. Infrared

Sensors operating in the infrared spectrum, such as those used in infrared (IR) cameras, are deployed when ambient light is scarce. They may also be used alongside visual cameras to compensate for the latter’s lackluster performance, particularly at night. Because a thermal camera’s output is hazy and distorted, with lower resolution than that of an RGB camera, its data may be analyzed by automatically determining the image’s orientation through the extraction of artificial control points [60].

3. Obstacle Detection Method

Both reactive and deliberative planning frameworks may be used for collision avoidance. Under reactive control, the UAV is equipped with onboard sensors to collect data about its immediate environment and behave accordingly, facilitating instantaneous responses to changing conditions. An alternative navigational strategy may be necessary if reactive control leads to a local minimum and becomes trapped there. The decision-making method used by autonomous commercial vehicles determines their level of safety and soundness. By dynamically connecting rear anti-collision elements, a driving decision network built on an actor-critic architecture has been developed to ensure safe driving. To interpret sensor data efficiently, this network considers the effects of different elements on collision prevention, such as rearward target detection, safety clearance, and vehicle roll stability. This is accomplished through an improved reward function that considers these factors within a multi-objective optimization framework. By thoroughly examining these parameters, the network attempts to improve collision avoidance skills and guarantee the safety and stability of the vehicle. The force-field, geometric, optimization-based, and sense-and-avoid techniques are the four main approaches to collision avoidance algorithms, as shown in Figure 3.
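The cited network’s actual reward function is not reproduced here; the sketch below only illustrates how such a multi-objective reward might combine safety clearance, roll stability, and a collision penalty. All weights and limits are assumptions.

```python
def driving_reward(rear_gap_m, min_safe_gap_m, roll_deg, max_roll_deg, collided):
    """Illustrative multi-objective reward: clearance + stability - collision."""
    if collided:
        return -100.0  # dominant penalty: collisions outweigh everything else
    clearance = min(rear_gap_m / min_safe_gap_m, 1.0)          # 1.0 once the gap is safe
    stability = 1.0 - min(abs(roll_deg) / max_roll_deg, 1.0)   # 1.0 when level
    return 0.6 * clearance + 0.4 * stability                   # assumed weighting

print(driving_reward(rear_gap_m=8.0, min_safe_gap_m=10.0,
                     roll_deg=3.0, max_roll_deg=15.0, collided=False))  # -> 0.8
```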

3.1. Force-Field Method

Using the idea of a repulsive or attractive force field, force-field techniques (also called potential field methods) may steer a UAV away from an obstruction or draw it toward a target [61,62]. Instead of using physical barriers, the authors of [63] propose surrounding a robot with a potential field. To determine the shortest route between two places, the authors of [64] suggest using an artificial potential field, in which obstacles and targets act as the sources of repulsive and attractive forces on the robot, respectively.
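A minimal 2D artificial potential field sketch follows: the goal contributes an attractive force and each obstacle a repulsive one inside an influence radius. The gains and the (1/d − 1/d0) repulsion form are illustrative of the classic formulation, not of any specific cited algorithm.

```python
import math

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=2.0, influence_m=5.0):
    """Resultant 2D force on the UAV: attraction to goal, repulsion from obstacles."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence_m:
            # Classic (1/d - 1/d0) repulsion, growing as the UAV nears the obstacle.
            mag = k_rep * (1.0 / d - 1.0 / influence_m) / d**2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy  # steer along this resultant force

print(apf_step(pos=(0.0, 0.0), goal=(10.0, 0.0), obstacles=[(3.0, 0.5)]))
```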
The authors of [65] suggested a new artificial potential field technique to generate optimum collision-free paths in dynamic environments with numerous obstacles, where other UAVs are also treated as moving obstacles. This method is dubbed an improved curl-free vector field. Although simulations confirmed the method’s viability, more validation in 3D settings with static and dynamic factors is required [66]. Regarding UAV navigation in 3D space, the authors of [67] describe an artificial potential field technique that has been improved to produce safe and smooth paths. By factoring in the behavior of other UAVs and their interactions, the proposed optimized artificial potential field (APF) algorithm improves the performance of standard APF algorithms. During route planning, the algorithm considers other UAVs to be moving obstacles.
A vehicle collision avoidance algorithm using synthetic potential fields is provided in [68]. The algorithm considers the relative velocities of the cars and the surrounding traffic to decide whether to slow down or speed up to pass another vehicle, based on the size and shape of the obstacles’ potential fields. Too large a time step may lead to collisions or unstable behavior, so choosing it correctly is essential. A 1D virtual force field approach is proposed for moving-obstacle detection in [69]; the authors argue that conventional obstacle force-field approaches lose efficiency because they cannot account for the obstacles’ mobility.

3.2. Sense and Avoid Method

Sense-and-avoid techniques focus on reducing the required computational power by simplifying the collision avoidance process to the individual detection and avoidance of obstacles; this makes it possible to control the flight path of each UAV in a swarm, with fast response times, without information about the plans of the other drones. The speed with which sense-and-avoid methods can respond makes them a good tool for complex contexts. A robot or agent is outfitted with one or more sensing technologies, such as LiDAR, sonar, or radar. Although radar cannot distinguish between different objects, it can quickly respond to anything that enters its field of view [69,70,71].
In [72], the authors suggest a technique for categorizing objects as static or dynamic using 2D LiDAR data; the program can additionally provide rough estimates of the speeds of the moving obstructions. In [73], the authors use a computer vision method to implement an animal detection and collision-avoidance system; the team trained its system on over 2200 photos and tested it with footage of animals in traffic. In [74], the authors implement a preset neural network module in MATLAB that operates on five ultrasonic (US) sensors to triangulate and determine objects’ exact locations and shapes, evaluating it on three distinct shapes. To accomplish object recognition and avoidance, the authors of [75] fused a US sensor with a binocular stereo-vision camera; using stereo vision as the primary method, a new route is constructed via an algorithm based on the Rapidly Exploring Random Tree (RRT) scheme.
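A toy reactive rule in the spirit of these sense-and-avoid systems: split the forward field of view into sectors and steer toward the one with the most free range. The sector layout and braking threshold are assumptions.

```python
def pick_heading(sector_ranges, headings_deg, stop_below_m=1.5):
    """sector_ranges[i] is the closest return in sector i (metres)."""
    if max(sector_ranges) < stop_below_m:
        return None  # nowhere safe to go: hover or brake
    best = max(range(len(sector_ranges)), key=lambda i: sector_ranges[i])
    return headings_deg[best]

print(pick_heading([2.0, 9.5, 6.0], [-45, 0, 45]))  # -> 0 (straight ahead is clear)
```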

3.3. Geometric Method

Geometric techniques rely on analyzing geometric features to ensure that predetermined minimum distances between agents, such as UAVs, are not violated. The UAVs’ separation distances and travel speeds are used to calculate the time remaining until a collision occurs. In [76], the authors provide an analytical method for resolving the planar instance of the aircraft collision problem: by analyzing the trajectories’ geometric properties, closed-form analytical solutions can be found for the best possible sets of commands to resolve the conflict.
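The time-to-collision reasoning behind geometric methods can be illustrated with the closest point of approach (CPA) between two agents, computed from their relative position and velocity. The 2D sketch below is a generic illustration, not the formulation of any cited paper.

```python
import math

def time_to_cpa(px, py, vx, vy):
    """Time at which the relative position p + v*t is closest to the origin."""
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        return 0.0  # no relative motion: the current distance is the CPA
    return max(0.0, -(px * vx + py * vy) / v2)

def cpa(px, py, vx, vy):
    t = time_to_cpa(px, py, vx, vy)
    return math.hypot(px + vx * t, py + vy * t), t

dist, t = cpa(px=100.0, py=20.0, vx=-10.0, vy=0.0)
print(dist, t)  # 20 m of separation in 10 s; compare against the minimum allowed
```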
In [77], conflict avoidance in a 3D environment is accomplished by using information such as the aircraft’s coordinates and velocities in conjunction with a mixed geometric and collision cone technique. However, the authors depend on numerical optimization techniques for the most common scenarios and only get analytical conclusions for specific circumstances. The paper [78] investigates UAV swarms that use geometry-based collision avoidance techniques. The suggested method integrates line-of-sight vectors with relative velocity vectors to consider a formation’s dynamic limitations. Each UAV may assess if the formation can be maintained while avoiding collisions by computing a collision envelope and using that information to determine the potential directions for avoiding collisions.
In [79], the authors combined geometric avoidance with the selection of a critical avoidance start time to provide a novel approach to collision avoidance based on kinematics, collision risk, and navigational constraints. Instead of trying to avoid all obstacles simultaneously, their fast geometric avoidance (FGA) method can prioritize which obstacles must be avoided first, depending on how much time remains before each can be safely passed. The authors of [80] developed a way to safely pilot UAVs from the beginning of a mission to its completion while ensuring that the vehicles stay on their intended course and avoid potential hazards. They offer a solution that tackles the system’s collision avoidance control and trajectory control individually and then merges them via a planned movement strategy.

3.4. Optimization Method

Optimization-based methods need geospatial data to formulate the avoidance trajectory. Probabilistic search algorithms aim to offer the most productive locations for conducting a search, given the level of uncertainty associated with the available information. Various optimization techniques, such as ant colony optimization, genetic algorithms, gradient descent-based approaches, particle swarm optimization, greedy methods, and local approximations, have been developed to handle the enormous computing demands of these algorithms.
For instance, to calculate optimal collision-free search paths for UAVs under communication-related constraints, the authors of [81] use a minimum-time search method with ant colony optimization. The authors of [82] provide a technique for predicting the next UAV coordinates from the set of probable instructions the UAV will execute in the near future; after considering the destination coordinates and the UAV’s current location, the algorithm generates a cost function for the best trajectory. Another approach uses particle swarm optimization for autonomous vehicle route planning in the wild: sensor data are used to assign different weights to various kinds of terrain, and those weights then score the possible paths across the landscape.
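Optimization-based avoidance ultimately scores candidate trajectories with a cost function. The sketch below uses an assumed cost that penalizes path length and proximity to obstacles, then picks the cheapest candidate; the weights and penalty form are illustrative, not those of any cited method.

```python
import math

def path_cost(path, obstacles, w_len=1.0, w_obs=50.0, clearance_m=3.0):
    """Assumed cost: path length plus a penalty for points inside the clearance."""
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    penalty = sum(max(0.0, clearance_m - math.dist(p, o))
                  for p in path for o in obstacles)
    return w_len * length + w_obs * penalty

candidates = [
    [(0, 0), (5, 0), (10, 0)],  # straight through the obstacle's clearance zone
    [(0, 0), (5, 4), (10, 0)],  # detour around it
]
obstacles = [(5.0, 0.5)]
print(min(candidates, key=lambda p: path_cost(p, obstacles)))  # picks the detour
```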

3.5. Summary of Object Detection Method

Table 2 summarizes previous research studies on detection and anti-collision systems. From Table 2, it can be concluded that the geometric detection and force-field methods are suitable for long-range UAVs, whereas the sense-and-avoid method is suitable for short-range UAVs. Real-time detection, supported by all four detection methods, allows UAVs to analyze and stay aware of their surroundings. The 3D compatibility of the geometric, optimization, and sense-and-avoid methods allows the system to generate a 3D map of the surroundings, enabling more precise maneuvering of the UAVs.
Beyond the obstacle detection methods reviewed above, many others are being developed around the world. One of them is neural network-based navigation, in which human decisions about such maneuvers are observed in various situations, including those with randomly generated obstacles and the pertinent environmental data [84]. Compared against human decision-making, simulation results showed that the method had a high estimation accuracy of almost 90%. In contrast to the artificial potential field (APF) method, the neural network methodology demonstrated its usefulness by successfully navigating around obstacles without running into the local minimum problem, emphasizing the strength of neural network decision-making.
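For illustration only, the toy network below maps three range readings to a steering class with fixed random weights; a real system of the kind cited would be trained on recorded human decisions rather than use stand-in weights.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)  # 3 range inputs -> 8 hidden units
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)  # hidden -> {left, straight, right}

def steer(ranges_m):
    """Forward pass only; the weights here are untrained stand-ins."""
    h = np.tanh(np.asarray(ranges_m) @ W1 + b1)
    return ["left", "straight", "right"][int(np.argmax(h @ W2 + b2))]

print(steer([2.0, 9.5, 6.0]))  # a decision from the illustrative weights
```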

4. Conclusions

From this short review of the sensor types and detection methods of UAV anti-collision systems, the selection of sensors and detection method depends mainly on the UAV type and the objective of the UAV’s mission. The table below presents the research gaps and the systematization of the reviewed studies identified through the literature review.
In this context, the recommended detection method for a UAV’s anti-collision system depends on the UAV’s specification and mission objective. Geometric obstacle detection methods are considered effective: they are capable of 3D projection, alternate route generation, and multiple-UAV compatibility. However, they cannot communicate with ground control. The geometric object detection method relies on GPS input to position the UAV. This detection method is suitable in urban areas, where GPS signals are strong. However, strong GPS signals may not be available in rural areas, especially in plantations, where agricultural UAV applications have increased rapidly. When the GPS signal strength is low, the UAV cannot position itself accurately; hence, optimization and sense-and-avoid methods are more suitable in this case than geometric object detection. More specifically, optimization and sense-and-avoid detection methods suit UAVs that fly at low altitudes, whereas the geometric method is ideal for high-altitude, long-range UAVs.
On the other hand, the force-field detection method is more suitable in environments with multiple UAVs, where each UAV can sense the electromagnetic emissions of the others. Although the force-field method, like the geometric method, suits long-range UAVs, it is not suitable for urban areas, where heavy electromagnetic interference would eventually degrade it. This literature review gives a better understanding of the anti-collision system within a UAV and allows anti-collision systems to be optimized for the UAV in which they will be implemented.

Author Contributions

Conceptualization, N.K.C. and A.H.; writing—original draft preparation, N.K.C. and A.H.; writing—review and editing, F.S.S., M.T.H.S. and A.Ł.; visualization, N.K.C. and A.H.; supervision, M.T.H.S., A.Ł. and W.G.; project administration, M.T.H.S., F.S.S., W.G. and A.Ł.; funding acquisition, A.Ł. and M.T.H.S. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to thank the Ministry of Higher Education, Malaysia, for financial support through the Higher Institution Centre of Excellence (HICoE) research grant (Vot number 6369119).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing does not apply to this article as no new data were created or analyzed in this study.

Acknowledgments

The authors would like to thank the Department of Aerospace Engineering, Faculty of Engineering, Universiti Putra Malaysia; the Laboratory of Biocomposite Technology, Institute of Tropical Forestry and Forest Products (INTROP–HICoE), Universiti Putra Malaysia; the Institute of Mechanical Engineering, Bialystok University of Technology, Poland; the Department of Computer-Aided Design Systems, Lviv Polytechnic National University, Ukraine; and the Institute of Robotics and Machine Intelligence, Faculty of Control, Robotics and Electrical Engineering, Poznan University of Technology, Poland, for their close collaboration in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zosimovych, N. Preliminary design of a VTOL unmanned aerial system for remote sensing of landscapes. Aeron Aero Open Access J. 2020, 4, 62–67. [Google Scholar] [CrossRef]
  2. Papa, U.; Ponte, S. Preliminary Design of an Unmanned Aircraft System for Aircraft General Visual Inspection. Electronics 2018, 7, 435. [Google Scholar] [CrossRef] [Green Version]
  3. Giernacki, W.; Gośliński, J.; Goślińska, J.; Espinoza-Fraire, T.; Rao, J. Mathematical Modeling of the Coaxial Quadrotor Dynamics for Its Attitude and Altitude Control. Energies 2021, 14, 1232. [Google Scholar] [CrossRef]
  4. Gheorghiţă, D.; Vîntu, I.; Mirea, L.; Brăescu, C. Quadcopter control system. In Proceedings of the 2015 19th International Conference on System Theory, Control and Computing (ICSTCC), Cheile Gradistei, Romania, 14–16 October 2015; pp. 421–426. [Google Scholar]
  5. Huang, H.-M. Autonomy levels for unmanned systems (ALFUS) framework: Safety and application issues. In Proceedings of the 2007 Workshop on Performance Metrics for Intelligent Systems, Washington, DC, USA, 28–30 August 2007. [Google Scholar]
  6. Zhang, W.; Zelinsky, G.; Samaras, D. Real-time accurate object detection using multiple resolutions. In Proceedings of the 2007 IEEE 11th International Conference on Computer Vision, Rio De Janeiro, Brazil, 14–21 October 2007. [Google Scholar]
  7. Holovatyy, A.; Teslyuk, V.; Lobur, M. VHDL-AMS model of delta-sigma modulator for A/D converter in MEMS interface circuit. Perspective Technologies and Methods In MEMS Design. In Proceedings of the MEMSTECH 2015—11th International Conference, Lviv, Ukraine, 2–6 September 2015; pp. 55–57. [Google Scholar] [CrossRef]
  8. Holovatyy, A.; Lobur, M.; Teslyuk, V. VHDL-AMS model of mechanical elements of MEMS tuning fork gyroscope for the schematic level of computer-aided design. Perspective Technologies and Methods. In Proceedings of the 2008 International Conference on Perspective Technologies and Methods in MEMS Design, Lviv, Ukraine, 21–24 May 2008; pp. 138–140. [Google Scholar] [CrossRef]
  9. Zhuge, C.; Cai, Y.; Tang, Z. A novel dynamic obstacle avoidance algorithm based on collision time histogram. Chin. J. Electron. 2017, 6, 522–529. [Google Scholar] [CrossRef]
  10. Puchalski, R.; Giernacki, W. UAV Fault Detection Methods, State-of-the-Art. Drones 2022, 6, 330. [Google Scholar] [CrossRef]
  11. Bondyra, A.; Kołodziejczak, M.; Kulikowski, R.; Giernacki, W. An Acoustic Fault Detection and Isolation System for Multirotor UAV. Energies 2022, 15, 3955. [Google Scholar] [CrossRef]
  12. Chao, H.; Cao, Y.; Chen, Y. Autopilots for small fixed-wing unmanned air vehicles: A survey. In Proceedings of the 2007 International Conference on Mechatronics and Automation, Harbin, China, 5–8 August 2007. [Google Scholar]
  13. Vijayavargiya, A.; Sharma, A.; Anirudh; Kumar, A.; Kumar, A.; Yadav, A.; Sharma, A.; Jangid, A.; Dubey, A. Unmanned aerial vehicle. Imp. J. Interdiscip. 2016, 2, 1747–1750. [Google Scholar]
  14. Shim, D.; Chung, H.; Kim, H.J.; Sastry, S. Autonomous exploration in unknown urban environments for unmanned aerial vehicles. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, San Francisco, CA, USA, 15–18 August 2005. [Google Scholar]
  15. Mikołajczyk, T.; Mikołajewski, D.; Kłodowski, A.; Łukaszewicz, A.; Mikołajewska, E.; Paczkowski, T.; Macko, M.; Skornia, M. Energy Sources of Mobile Robot Power Systems: A Systematic Review and Comparison of Efficiency. Appl. Sci. 2023, 13, 7547. [Google Scholar] [CrossRef]
  16. Zhang, A.; Zhou, D.; Yang, M.; Yang, P. Finite-time formation control for unmanned aerial vehicle swarm system with time-delay and input saturation. IEEE Access Pract. Innov. Open Solut. 2019, 7, 5853–5864. [Google Scholar] [CrossRef]
  17. Yasin, J.N.; Mohamed, S.A.S.; Haghbayan, M.-H.; Heikkonen, J.; Tenhunen, H.; Plosila, J. Unmanned aerial vehicles (UAVs): Collision avoidance systems and approaches. IEEE Access Pract. Innov. Open Solut. 2020, 8, 105139–105155. [Google Scholar] [CrossRef]
  18. Mircheski, I.; Łukaszewicz, A.; Szczebiot, R. Injection process design for manufacturing of bicycle plastic bottle holder using CAx tools. Procedia Manuf. 2019, 32, 68–73. [Google Scholar] [CrossRef]
  19. Kiefer, R.J.; Grimm, D.K.; Litkouhi, B.B.; Sadekar, V. Collision Avoidance System. U.S. Patent 7,245,231, filed 6 August 2004. [Google Scholar]
  20. Foka, A.; Trahanias, P. Real-time hierarchical POMDPs for autonomous robot navigation. Robot. Auton. Syst. 2020, 55, 561–571. [Google Scholar] [CrossRef] [Green Version]
  21. Murray, R.M. Recent research in cooperative control of multivehicle systems. J. Dyn. Syst. Meas. Control 2007, 129, 571–598. [Google Scholar] [CrossRef] [Green Version]
  22. Ladd, G.; Bland, G. Non-military applications for small UAS platforms. In Proceedings of the AIAA Infotech@Aerospace Conference, Washington, DC, USA, 6–9 April 2009. [Google Scholar]
  23. He, L.; Bai, P.; Liang, X.; Zhang, J.; Wang, W. Feedback formation control of UAV swarm with multiple implicit leaders. Aerosp. Sci. Technol. 2018, 72, 327–334. [Google Scholar] [CrossRef]
  24. Esfahlani, S.S. Mixed reality and remote sensing application of unmanned aerial vehicle in fire and smoke detection. J. Ind. Inf. Integr. 2019, 15, 42–49. [Google Scholar] [CrossRef]
  25. Valavanis, K.P. Unmanned Aircraft Systems: The Current State-of-the Art; Springer: Cham, Switzerland, 2016. [Google Scholar]
  26. Wargo, C.A.; Church, G.C.; Glaneueski, J.; Strout, M. Unmanned Aircraft Systems (UAS) research and future analysis. In Proceedings of the 2014 IEEE Aerospace Conference, Big Sky, MT, USA, 1–8 March 2014. [Google Scholar]
  27. Horla, D.; Giernacki, W.; Báča, T.; Spurny, V.; Saska, M. AL-TUNE: A Family of Methods to Effectively Tune UAV Controllers in In-flight Conditions. J. Intell. Robot Syst. 2021, 103, 5. [Google Scholar] [CrossRef]
  28. Wang, X.; Yadav, V.; Balakrishnan, S.N. Cooperative UAV formation flying with obstacle/collision avoidance. IEEE Trans. Control. Syst. Technol. A Publ. IEEE Control. Syst. Soc. 2007, 15, 672–679. [Google Scholar] [CrossRef]
  29. Łukaszewicz, A.; Panas, K.; Szczebiot, R. Design process of technological line to vegetables packaging using CAx tools. In Proceedings of the 17th International Scientific Conference on Engineering for Rural Development, Jelgava, Latvia, 23–25 May 2018; pp. 871–887. [Google Scholar] [CrossRef]
  30. Łukaszewicz, A.; Szafran, K.; Jóźwik, J. CAx techniques used in UAV design process. In Proceedings of the 2020 IEEE 7th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Pisa, Italy, 22–24 June 2020; pp. 95–98. [Google Scholar] [CrossRef]
  31. Active Sensors. (n.d.). Esa.int. Available online: https://www.esa.int/Education/7.ActiveSensors (accessed on 15 December 2022).
  32. What Is Active Sensor?—Definition. Available online: https://internetofthingsagenda.techtarget.com/defisensor (accessed on 13 March 2020).
  33. Blanc, C.; Aufrère, R.; Malaterre, L.; Gallice, J.; Alizon, J. Obstacle detection and tracking by millimeter wave RADAR. IFAC Proc. Vol. 2004, 37, 322–327. [Google Scholar] [CrossRef]
  34. Korn, B.; Edinger, C. UAS in civil airspace: Demonstrating “sense and avoid” capabilities in flight trials. In Proceedings of the 2008 IEEE/AIAA 27th Digital Avionics Systems Conference, St. Paul, MN, USA, 26–30 October 2008. [Google Scholar]
  35. Owen, M.P.; Duffy, S.M.; Edwards, M.W.M. Unmanned aircraft sense and avoid radar: Surrogate flight testing performance evaluation. In Proceedings of the 2014 IEEE Radar Conference, Cincinnati, OH, USA 19–23 May 2014. [Google Scholar]
  36. Quist, E.B.; Beard, R.W. Radar odometry on fixed-wing small unmanned aircraft. IEEE Trans. Aerosp. Electron. Syst. 2016, 52, 396–410. [Google Scholar] [CrossRef]
  37. Kwag, Y.K.; Chung, C.H. UAV based collision avoidance radar sensor. In Proceedings of the 2007 IEEE International Geoscience and Remote Sensing Symposium, Barcelona, Spain, 23–28 July 2007. [Google Scholar]
  38. Hugler, P.; Roos, F.; Schartel, M.; Geiger, M.; Waldschmidt, C. Radar taking off: New capabilities for UAVs. IEEE Microw. Mag. 2018, 19, 43–53. [Google Scholar] [CrossRef] [Green Version]
  39. Nijsure, Y.A.; Kaddoum, G.; Khaddaj Mallat, N.; Gagnon, G.; Gagnon, F. Cognitive chaotic UWB-MIMO detect-avoid radar for autonomous UAV navigation. IEEE Trans. Intell. Transp. Syst. A Publ. IEEE Intell. Transp. Syst. Counc. 2016, 17, 3121–3131. [Google Scholar] [CrossRef] [Green Version]
  40. Mohamed, S.A.S.; Haghbayan, M.-H.; Westerlund, T.; Heikkonen, J.; Tenhunen, H.; Plosila, J. A survey on odometry for autonomous navigation systems. IEEE Access Pract. Innov. Open Solut. 2019, 7, 97466–97486. [Google Scholar] [CrossRef]
  41. Nashashibi, F.; Bargeton, A. Laser-based vehicles tracking and classification using occlusion reasoning and confidence estimation. In Proceedings of the 2008 IEEE Intelligent Vehicles Symposium, Eindhoven, The Netherlands, 4–6 June 2008. [Google Scholar]
  42. Nüchter, A.; Lingemann, K.; Hertzberg, J.; Surmann, H. 6D SLAM-3D mapping outdoor environments: 6D SLAM-3D Mapping Outdoor Environments. J. Field Robot. 2007, 24, 699–722. [Google Scholar] [CrossRef] [Green Version]
  43. Zhang, J.; Singh, S. Visual-lidar odometry and mapping: Low-drift, robust, and fast. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015. [Google Scholar]
  44. Tahir, A.; Böling, J.; Haghbayan, M.-H.; Toivonen, H.T.; Plosila, J. Swarms of unmanned aerial vehicles—A survey. J. Ind. Inf. Integr. 2019, 16, 100106. [Google Scholar] [CrossRef]
  45. Armingol, J.M.; Alfonso, J.; Aliane, N.; Clavijo, M.; Campos-Cordobés, S.; de la Escalera, A.; del Ser, J.; Fernández, J.; García, F.; Jiménez, F.; et al. Environmental Perception for Intelligent Vehicles; Jiménez, F., Ed.; Butterworth-Heinemann: Oxford, UK, 2018; Volume 2, pp. 23–101. [Google Scholar]
  46. Wang, C.-C.R.; Lien, J.-J.J. Automatic vehicle detection using local features—A statistical approach. IEEE Trans. Intell. Transp. Syst. A Publ. IEEE Intell. Transp. Syst. Counc. 2008, 9, 83–96. [Google Scholar] [CrossRef]
  47. Mizumachi, M.; Kaminuma, A.; Ono, N.; Ando, S. Robust sensing of approaching vehicles relying on acoustic cues. Sensors 2014, 14, 9546–9561. [Google Scholar] [CrossRef] [Green Version]
  48. Kim, J.; Hong, S.; Baek, J.; Kim, E.; Lee, H. Autonomous vehicle detection system using visible and infrared camera. In Proceedings of the 2012 12th International Conference on Control, Automation and Systems, Jeju, Republic of Korea, 17–21 October 2012; pp. 630–634. [Google Scholar]
  49. Kota, F.; Zsedrovits, T.; Nagy, Z. Sense-and-avoid system development on an FPGA. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019. [Google Scholar]
  50. Mcfadyen, A.; Durand-Petiteville, A.; Mejias, L. Decision strategies for automated visual collision avoidance. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014. [Google Scholar]
  51. Saha, S.; Natraj, A.; Waharte, S. A real-time monocular vision-based frontal obstacle detection and avoidance for low cost UAVs in GPS denied environment. In Proceedings of the 2014 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, Yogyakarta, Indonesia, 13–14 November 2014. [Google Scholar]
  52. Mejias, L.; McNamara, S.; Lai, J.; Ford, J. Vision-based detection and tracking of aerial targets for UAV collision avoidance. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010. [Google Scholar]
  53. Mohamed, S.A.S.; Haghbayan, M.-H.; Heikkonen, J.; Tenhunen, H.; Plosila, J. Towards real-time edge detection for event cameras based on lifetime and dynamic slicing. In Advances in Intelligent Systems and Computing; Springer International Publishing: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
  54. Lee, T.-J.; Yi, D.-H.; Cho, D.-I.D. A monocular vision sensor-based obstacle detection algorithm for autonomous robots. Sensors 2016, 16, 311. [Google Scholar] [CrossRef] [Green Version]
  55. Haque, A.U.; Nejadpak, A. Obstacle avoidance using stereo camera. arXiv 2017, arXiv:1705.04114. [Google Scholar]
  56. Hartmann, W.; Tilch, S.; Eisenbeiss, H.; Schindler, K. Determination of the uav position by automatic processing of thermal images. In Proceedings of the ISPRS—International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences, Melbourne, Australia, 25 August–1 September 2012; pp. 111–116. [Google Scholar]
  57. Gupta, A.; Padhi, R. Reactive Collision Avoidance Using Nonlinear Geometric and Differential Geometric Guidance. J. Guid. Control Dyn. 2010, 34, 303–310. [Google Scholar]
  58. Khatib, O. Real-time obstacle avoidance for manipulators and mobile robots. In Proceedings of the 1985 IEEE International Conference on Robotics and Automation, St. Louis, MO, USA, 25–28 March 1985. [Google Scholar]
  59. Oroko, J.; Nyakoe, G. Obstacle avoidance and path planning schemes for autonomous navigation of a mobile robot: A review. In Proceedings of the Sustainable Research and Innovation Conference, Delft, The Netherlands, 9 May 2014; pp. 314–318. [Google Scholar]
  60. Choi, D.; Lee, K.; Kim, D. Enhanced Potential Field Based Collision Avoidance for Unmanned Aerial Vehicles in a Dynamic Environment. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020. [Google Scholar]
  61. Grodzki, W.; Łukaszewicz, A. Design and manufacture of unmanned aerial vehicles (UAV) wing structure using composite materials. Mater. Werkst. 2015, 46, 269–278. [Google Scholar] [CrossRef]
  62. Sun, J.; Tang, J.; Lao, S. Collision avoidance for cooperative UAVs with optimized artificial potential field algorithm. Sens. IEEE Access Pract. Innov. Open Solut. 2017, 5, 18382–18390. [Google Scholar] [CrossRef]
  63. Wolf, M.T.; Burdick, J.W. Artificial potential functions for highway driving with collision avoidance. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 19–23 May 2008. [Google Scholar]
  64. Kim, C.Y.; Kim, Y.H.; Ra, W.-S. Modified 1D virtual force field approach to moving obstacle avoidance for autonomous ground vehicles. J. Electr. Eng. Technol. 2019, 14, 1367–1374. [Google Scholar] [CrossRef]
  65. Yasin, J.N.; Mohamed, S.A.S.; Haghbayan, M.-H.; Heikkonen, J.; Tenhunen, H.; Plosila, J.M. Navigation of autonomous swarm of drones using translational coordinates. In Advances in Practical Applications of Agents, Multi-Agent Systems, and Trustworthiness; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; pp. 353–362. [Google Scholar]
  66. Yu, X.; Zhang, Y. Sense and avoid technologies with applications to unmanned aircraft systems: Review and prospects. Prog. Aerosp. Sci. 2015, 74, 152–166. [Google Scholar] [CrossRef]
  67. Wang, M.; Voos, H.; Su, D. Robust online obstacle detection and tracking for collision-free navigation of multirotor UAVs in complex environments. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Singapore, 18–21 November 2018. [Google Scholar]
  68. Sharma, S.U.; Shah, D.J. A practical animal detection and collision avoidance system using computer vision technique. IEEE Access Pract. Innov. Open Solut. 2017, 5, 347–358. [Google Scholar] [CrossRef]
  69. De Simone, M.; Rivera, Z.; Guida, D. Obstacle avoidance system for unmanned ground vehicles by using ultrasonic sensors. Machines 2018, 6, 18. [Google Scholar] [CrossRef] [Green Version]
  70. Yu, Y.; Tingting, W.; Long, C.; Weiwei, Z. Stereo vision based obstacle avoidance strategy for quadcopter UAV. In Proceedings of the 2018 Chinese Control and Decision Conference (CCDC), Shenyang, China, 9–11 June 2018. [Google Scholar]
  71. Bilimoria, K. A geometric optimization approach to aircraft conflict resolution. In Proceedings of the 18th Applied Aerodynamics Conference, Denver, CO, USA, 14–17 August 2000. [Google Scholar]
  72. Goss, J.; Rajvanshi, R.; Subbarao, K. Aircraft conflict detection and resolution using mixed geometric and collision cone approaches. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, San Francisco, CA, USA, 16–19 August 2004. [Google Scholar]
  73. Seo, J.; Kim, Y.; Kim, S.; Tsourdos, A. Collision avoidance strategies for unmanned aerial vehicles in formation flight. IEEE Trans. Aerosp. Electron. Syst. 2017, 53, 2718–2734. [Google Scholar] [CrossRef] [Green Version]
  74. Lin, Z.; Castano, L.; Mortimer, E.; Xu, H. Fast 3D collision avoidance algorithm for fixed wing UAS. J. Intell. Robot. Syst. 2020, 97, 577–604. [Google Scholar] [CrossRef]
  75. Ha, L.N.N.T.; Bui, D.H.P.; Hong, S.K. Nonlinear control for autonomous trajectory tracking while considering collision avoidance of UAVs based on geometric relations. Energies 2019, 12, 1551. [Google Scholar] [CrossRef] [Green Version]
  76. Pérez-Carabaza, S.; Scherer, J.; Rinner, B.; López-Orozco, J.A.; Besada-Portas, E. UAV trajectory optimization for Minimum Time Search with communication constraints and collision avoidance. Appl. Artif. Intell. 2019, 85, 357–371. [Google Scholar] [CrossRef]
  77. Boivin, E.; Desbiens, A.; Gagnon, E. UAV collision avoidance using cooperative predictive control. In Proceedings of the 2008 16th Mediterranean Conference on Control and Automation, Ajaccio, France, 25–27 June 2008. [Google Scholar]
  78. Biswas, S.; Anavatti, S.G.; Garratt, M.A. A particle swarm optimization based path planning method for autonomous systems in unknown terrain. In Proceedings of the 2019 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), Bali, Indonesia, 1–3 July 2019. [Google Scholar]
  79. van den Berg, J.; Wilkie, D.; Guy, S.J.; Niethammer, M.; Manocha, D. LQG-obstacles: Feedback control with collision avoidance for mobile robots with motion and sensing uncertainty. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, St. Paul, MN, USA, 14–18 May 2012. [Google Scholar]
  80. Zhu, H.; Alonso-Mora, J. Chance-constrained collision avoidance for MAVs in dynamic environments. IEEE Robot. Autom. Lett. 2019, 4, 776–783. [Google Scholar] [CrossRef]
  81. Balestrieri, E.; Daponte, P.; De Vito, L.; Picariello, F.; Tudosa, I. Sensors and Measurements for UAV Safety: An Overview. Sensors 2021, 21, 8253. [Google Scholar] [CrossRef] [PubMed]
  82. Pallottino, F.; Antonucci, F.; Costa, C.; Bisaglia, C.; Figorilli, S.; Menesatti, P. Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: A review. Comput. Electron. Agric. 2019, 62, 859–873. [Google Scholar] [CrossRef]
  83. Hu, W.; Li, X.; Hu, J.; Song, X.; Dong, X.; Kong, D.; Xu, Q.; Ren, C. A rear anti-collision decision-making methodology based on deep reinforcement learning for autonomous commercial vehicles. IEEE Sens. J. 2022, 22, 16370–16380. [Google Scholar] [CrossRef]
  84. Chen, Y.; Cheng, C.; Zhang, Y.; Li, X.; Sun, L. A Neural Network-Based Navigation Approach for Autonomous Mobile Robot Systems. Appl. Sci. 2022, 12, 7796. [Google Scholar] [CrossRef]
Figure 1. Anti-collision system general architecture.
Figure 2. Categorization of anti-collision system sensors.
Figure 3. The main approaches to collision avoidance algorithms.
Table 1. Comparison between the active sensors of the anti-collision system.

| Sensor     | Sensor Size | Power Required | Accuracy | Range  | Weather Condition | Light Sensitivity | Cost   |
|------------|-------------|----------------|----------|--------|-------------------|-------------------|--------|
| Radar      | Large       | High           | High     | Long   | Not Affected      | No                | High   |
| LiDAR      | Small       | Low            | Medium   | Medium | Affected          | No                | Medium |
| Ultrasonic | Small       | Low            | Low      | Short  | Slightly Affected | No                | Low    |
Table 2. Previous studies of detection and anti-collision systems.

| Method (study)       | Multiple UAV Compatibility | 3D Compatibility | Communication | Alternate Route Generation | Real-Time Detection |
|----------------------|----------------------------|------------------|---------------|----------------------------|---------------------|
| Geometric [78,79]    | /                          | /                | O             | /                          | /                   |
| Geometric [80]       | /                          | /                | /             | /                          | /                   |
| Sense and Avoid [83] | /                          | /                | /             | /                          | /                   |
| Sense and Avoid [72] | /                          | /                | /             | /                          | /                   |
| Sense and Avoid [74] | /                          | /                | /             | O                          | /                   |
| Force Field [69]     | /                          | O                | O             | /                          | /                   |
| Force Field [65]     | O                          | O                | O             | /                          | /                   |
| Optimization [82]    | /                          | /                | /             | /                          | /                   |

/—Available. O—Not Available.