Article

Review and Simulation of Counter-UAS Sensors for Unmanned Traffic Management

Information Processing and Telecommunications Center, Universidad Politécnica de Madrid, 28040 Madrid, Spain
* Author to whom correspondence should be addressed.
Sensors 2022, 22(1), 189; https://doi.org/10.3390/s22010189
Submission received: 30 November 2021 / Revised: 20 December 2021 / Accepted: 24 December 2021 / Published: 28 December 2021
(This article belongs to the Special Issue Sensors for Unmanned Traffic Management)

Abstract:
Noncollaborative surveillance of airborne UAS (Unmanned Aerial System) is a key enabler to the safe integration of UAS within a UTM (Unmanned Traffic Management) ecosystem. Thus, a wide variety of new sensors (known as Counter-UAS sensors) are being developed to provide real-time UAS tracking, ranging from radar, RF analysis and image-based detection to even sound-based sensors. This paper aims to discuss the current state-of-the-art technology in this wide variety of sensors (both academic and commercial) and to propose a set of simulation models for them. Thus, the review is focused on identifying the key parameters and processes that allow modeling their performance and operation, which reflect the variety of measurement processes. The resulting simulation models are designed to help evaluate how sensor performance affects UTM systems, and specifically the implications in their tracking and tactical services (i.e., tactical conflicts with uncontrolled drones). The simulation models cover probabilistic detection (i.e., false alarms and probability of detection) and measurement errors, considering equipment installation (i.e., monostatic vs. multistatic configurations, passive sensing, etc.). The models were integrated in a UTM simulation platform, and simulation results are included in the paper for active radars, passive radars, and acoustic sensors.

1. Introduction

The use of UAVs (Unmanned Aerial Vehicles), or as they are commonly known, drones, has increased in recent years. Initially, these aircraft were used as military technology, especially for security and monitoring purposes, but today, many companies and private users are using UAVs in their daily lives. These nonmilitary drones are used by citizens for recreational activities, such as video recording or taking high-resolution photos, and by companies for observation, transportation, field monitoring, traffic monitoring, fire protection and border patrol, among many other uses [1]. In addition to their widespread use for actions such as those described above, UAVs can be hacked and used to commit crimes, such as espionage, smuggling or even attacks.
For all these reasons, drone detection is necessary to check for their presence near critical areas or infrastructures and to verify whether a drone's behavior is appropriate and compatible with other air operations (of manned aircraft or other drones). There are many different technologies enabling drone detection, localization, and tracking, including cooperative and noncooperative sensors. This paper focuses on the second type of sensor.
Over the past five years, significant research efforts have been made to detect and counter UAVs, and the main physical operating principles of the different technologies being used are clearly described in [2]. Noncooperative sensors include active and passive radar detection techniques, detection through UAVs' radio frequency signals, detection by acoustic signals, image-based detection, and detection by merging these techniques through data fusion.
In this contribution, we go a little further in the analysis of these technologies. In addition to describing some of the most interesting literature proposals and commercial products in the state of the art, we define a collection of simulation models, covering some of those technologies and expandable to others, which can potentially be used for:
(a) Comparative assessment of potential systems deployment in a given position.
(b) Analysis of integrated sensing solutions/data fusion approaches for C-UAS.
(c) Analysis through simulation of the potential integration of the measurements from those sensors in UTM tactical chains, specifically to test the associated implications in their tracking and tactical services.
In any case, the paper focuses on modeling the sensing processes for the different technologies, which would be a prerequisite for any of the previously described analyses. Finally, there are plenty of models of radar, RF, vision, and acoustic sensors. Here, we try to select, parameterize, and summarize those of real applicability to the detection of small drones in civilian applications (for UTM).
The paper is structured as follows: In the second section of this paper, we describe in detail some of these sensing technologies, covering both the academic literature in the area and the fast-evolving commercial scenario. Meanwhile, the third section is devoted to deriving the simulation models of some of these sensors. These simulation models are to be incorporated in the UTM simulator described in [3]. The fourth section summarizes simulation results for some of the previous sensors, enabling a comparison of their main sensing features and performances, and finally, Section 5 concludes the paper, providing some insights on future work.

2. Review of the State-of-the-Art Technology

In this section, we summarize the different detection technologies. Section 2.1, Section 2.2, Section 2.3, Section 2.4, Section 2.5 and Section 2.6 describe solutions in the literature and some of the commercial solutions (if available). In the case of Section 2.6, it is important to note that it focuses on the use of fusion approaches making use of different sensing technologies. Therefore, quite often, a commercial solution will be described in several of the following sections, once per sensor type, and again when talking about integrated sensing and fusion C-UAS systems. Finally, Section 2.7 includes a comparative summary of technology requirements, expected performance and limitations.

2.1. Active Detection Radars

Radars have several advantages in detecting aircraft compared with other sensors in terms of weather independency, day and night operation capability, technology development, and capacity to measure range and velocity simultaneously. A big challenge with UAVs is that they have very small radar cross sections (RCS), and they fly at lower altitudes and lower speeds compared to larger aircraft [4]. Regular radar systems typically aim to detect air targets of medium and large size (RCS larger than 1 m2). In addition, due to their low speed, Doppler processing (Moving Target Indication/Detection) is not so effective. In the literature [5], there are several types of radar used for detection, tracking and classification of drones, such as mmWave Radar or Ultrawide-Band Radar, which can be classified into two main categories: active detection and passive detection radars. In this section, we focus on active detection radars, while the next section describes passive radars.
Conventionally, there are two possible ways to increase the distance and azimuth resolution of active radar detection systems in the case of UAVs operations: using higher frequency carriers or utilizing multiple input multiple output (MIMO) beamforming antennas.
To use a shorter wavelength, K-band, X-band and W-band frequency modulated continuous wave (FMCW) radars are specifically designed for UAV detection. The carrier frequency selected for UAV detection radar should be higher than 6 GHz (K-band), as in [6], where the ability of radars to detect small, slow, and low-flying targets is verified. There are two important factors to be considered for the use of radars to detect airborne threats: the target to be detected and the radar itself. When a radar is used to detect small and slow targets, the limiting factor is the RCS, so in that work, “mini-UAVs” were treated as a Medium of Airborne Attack (MoAA), and it was concluded that radars working in the K-band are the ones that best detect “mini-UAVs” due to their dimensions and radar sections. These radars offer optimal accuracy for measuring the coordinates of the targets being detected and small antenna dimensions.
Other approaches use multiple antennas following a MIMO approach. The advantage of this approach is in its applicability to a radar system with lower carrier frequencies, as in [7,8]. A Holographic RadarTM (HR) with a 2D antenna array and an appropriate signal processing is used in [7]. This signal processing can create a multibeam, 3D, wide-area, staring surveillance sensor, which is able to achieve high detection sensitivity and provide fine Doppler resolution, with update rates of fractions of a second. The ability to remain continuously on targets throughout the entire search volume enables the detection of small targets, such as UAVs, against a moving background. The system uses a 32-by-8 element L-Band receiver array. As the radar has a high detection sensitivity, it can detect small drones and other small moving targets such as birds. Thus, it is necessary to have a stage of processing to discriminate the UAV from other objects. In this case, a machine learning decision tree classifier is used to reject small objects while maintaining a high probability of detection for the drone. A similar study is presented in [8], where a ubiquitous frequency modulated continuous wave (FMCW) radar system working at 8.75 GHz (X-band) with a PC-based signal processor can detect a micro-UAV at a range of 2 km with an excellent range–speed resolution.
The advances in computation enable another type of radar described in the literature for this application, the software-defined radar (SDR) [9]. This radar is a multiband, multimode, software-defined radar that consists of a hardware-based platform and a software-based platform. It is multiband because the module allows the selection of S-, X- and K-bands, while it is multimode because of the capacity of selecting waveforms of CW, Pulse, FMCW and LFM Chirp. The detection results of this system show that the detection of the SDR platform is successfully performed in real-time operations, so it can be used for air safety applications by detecting and warning of threat UAVs.
An example of a mmWave Radar with a precise detection and 3D localization system for drones can be observed in [10]. The positions of drones are estimated from spatial heatmaps of the received radar signals, obtained by applying a super-resolution algorithm. These positions are improved by analyzing the micro-Doppler effect, which is generated by the rotating propellers. This radar presents a novel Gaussian Process Regression model to compensate for systematic biases in the radar data.
Finally, another way of detecting UAVs using radars is by means of the Multistatic Forward Scattering Radar (FSR) [11]. The most important principle of a FSR for target detection is the use of the shadow field. When the diffraction angle, which is the angle between the transmitter–target direction and the target–receiver direction, is approximately zero, the shadow field can be observed at the receiving point. This shadow field is considered in narrow regions where the diffraction angle is approximately zero, causing the forward-scatter radar cross section to increase considerably compared to monostatic radar cross sections, which occurs only when the size of the target is larger than the wavelength. In the multistatic configuration of a FSR, a certain number of transmitting and receiving positions in the air and receiving positions on the ground must be used. The altitude of the targets must always be equal to or less than the altitude of the airborne transmitter positions, which could be placed onboard UAVs or any other type of aircraft. This kind of airborne sensor network is described here for completeness, but we do not model it in the second part of the paper.
To summarize, the main disadvantage of active radars is the need for specially designed transmitters, which can be difficult to deploy.
Next, we detail some commercial active radars. The ART Midrange 3D [12] is a high-resolution C-UAS FMCW surveillance radar. This high-performance sensor is specifically designed to detect small unmanned aerial vehicles (C-UAS) and for its use in unmanned aircraft traffic management (UTM). The radar is composed of a 3D multibeam antenna system and a high-power amplification stage and is capable of detecting, tracking, and classifying micro quadcopters and micro fixed-wing UAVs, with extended elevation coverage. The main specifications of this solution can be seen in Table 1.
Another commercial solution, provided by Indra [13], called ARMS, includes another FMCW radar. Its main characteristics are detailed next, in Table 2.
German company HENSOLDT has developed a drone detection system called Xpeller Counter UAV solution [14]. This solution can detect the potential threat through a radar system whose specifications can be seen in Table 3 (two different radar systems may be integrated).
Meanwhile, Echodyne [15] has developed an alternative active radar solution based on an Electronic Scan Antenna capable of simultaneously tracking (with very high detection rate) and searching for additional targets in its coverage. Its specifications are detailed in Table 4.
An alternative solution is the Ranger R8SS-3D from Flir [16], whose specifications can be seen in Table 5.
The company RST has another radar solution to detect UAVs, called Doruk: UAV detection radar [17]. Its basic function is low-altitude moving target detection over land and sea. It provides detection, classification, azimuth and range measures, RCS, radial velocity, heading and width of the Doppler Frequency Spectrum of targets. Its main specifications can be seen in Table 6.

2.2. Passive Detection Radars

Passive radars do not require a specially designed transmitter. There are two types of passive radar, the single station passive radar, which exploits only one illumination source, and the distributed passive radar, which uses the existing telecommunications infrastructures as illumination sources to enhance the UAV detection. Typically, two different widespread signals are used: cellular systems and the digital video broadcasting systems.
Passive bistatic radars (PBR) have a challenging problem in the detection of UAVs due to their low RCS [18]. Range migration (RM) occurs in the coherent processing interval, which makes it difficult to increase coherent integration gain and improve radar detection ability, although there are techniques to alleviate this problem. An example of single-station passive radar is the investigation presented in [19], where it is possible to localize small UAVs in 3D by exploiting a passive radar based on Wi-Fi transmissions. A demonstration of the capability of the radar to estimate the position of the target from the ground by exploiting multiple surveillance antennas is performed.
In the case of distributed passive radar, a possible approach is the one proposed in [20], where the detection system uses reflected global system for mobile communications (GSM) signals to locate and track UAVs. Another example of distributed passive radar is the one presented in [21], where a fixed-wing micro-UAV using passive radar based on digital technology is detected using audio broadcasting signals up to a distance of 1.2 km. The experiment was achieved at a lower frequency of 189 MHz in the VHF band.
The major disadvantage of passive radar is that a large amount of postprocessing effort or multiple receivers are required to obtain acceptable detection accuracy.

2.3. Detection through UAS Radio Frequency Signals

UAVs usually have at least one RF communication data link to their remote controller to either receive control commands (typically at 2.4 GHz) or deliver aerial images. In this case, the spectral patterns of such transmission are used for the detection and localization of UAVs. In most cases, software-defined radio receivers are employed to intercept the RF channels.
To utilize the spectrum patterns of UAVs, three possible approaches are considered for drone detection in [22]. One of them is based on sniffing the communication between the drone and its controller. Another approach is the one explained in [23], where the frequency hopping spread spectrum signals from a UAV are extracted. According to these articles, it is possible to train a classifier for identifying unique RF transmission patterns from UAVs.
Data traffic patterns are also an important feature to classify and identify UAVs. In [24], a UAV detection and identification system using two receiver units to record the received signal strength resulting from the UAV was proposed. The system makes use of a novel machine learning-based approach for efficient identification and detection of UAVs. The system consists of four classifiers working in a hierarchical way. The first classifier checks whether the sample corresponds to a UAV, while the second classifier specifies the type of the detected UAV. The third and fourth classifiers handle specific vendors' drone types. The system detects UAVs flying within the area, and it can classify UAVs and the flight modes of the detected UAV with an accuracy of around 99%.
Another UAV detection and identification approach is based on Wi-Fi signals and radio fingerprints, as presented in [25]. Firstly, the system detects the presence of a UAV, and then Machine Learning and Principal Component Analysis-derived techniques are used to extract RF fingerprints from the received signal. The extracted UAV fingerprints are stored and used as training data and test data. The accuracy of this approach is above 95% in indoor scenarios and above 93% in outdoor scenarios.
Real scenarios are not controlled, so picking up the RF signals is not so easy, as there is interference in the environment. The following two studies carried out their experiments with interference in the radio frequency band. The proposed method in [26] relies on machine learning-based RF recognition and considers that the bandwidths of the video signal and Wi-Fi are identical. The process consists of extracting 31 features from the Wi-Fi signal and the UAV video signal and then introducing them to the classifier. It is demonstrated that the proposed method can accurately recognize the UAV video signal in the presence of Wi-Fi interference. The proposed method has a recognition rate greater than 95% in the 2 km outdoor experiment. On the other hand, a radio frequency-based drone detection and identification system under wireless interference (Wi-Fi and Bluetooth), using machine learning algorithms and a pretrained convolutional neural network-based algorithm called SqueezeNet as classifiers, is explained in [27]. Different categories of wavelet transforms are used to extract features from the signals. From these extracted features, different models have been built. The experiment consisted of studying the performance of these models under different signal-to-noise ratio levels. The results show a correct detection accuracy of 98.9% at a 10 dB signal-to-noise ratio level.
Next, we detail some commercial RF detection systems. DJI has created a system to detect their own drones. AeroScope [28] can identify them by monitoring and analyzing their electronic signal to gain critical information such as flight status, paths, and other information in real time. There are two types of AeroScope systems: stationary (designed for continuous protection of large-scale sites, up to 50 km range) and portable (designed for temporary events and mobile deployments, up to 5 km range).
Dedrone provides a complete airspace security system [29], including RF sensors, able to detect and localize drones by their RF signals. There are two types of these sensors: the DedroneSensor RF-160 forms the basis of the sensor network and is used in initial risk analysis, whereas the DedroneSensor RF-360 can locate and track drones. The main characteristics of these sensors can be seen in Table 7.
Finally, DroneShield provides the DroneSentry-X product [30], which is a portable device that is compatible with vehicles. It provides 360° awareness and protection using integrated sensors to detect and disrupt UAVs moving at any speed. It has a nominal UAV detection range greater than 2 km, and it detects UAV RF signals, operating on consumer and commercial industrial, scientific, and medical (ISM) frequencies.

2.4. Detection by Acoustic Signals

An array of acoustic sensors can be employed to capture the sound, detect, and estimate the direction of arrival of sounds from sources such as UAVs. These arrays are deployed around the restricted areas and record the audio signal periodically and deliver this signal to the ground stations. The ground stations extract the features of this audio signal to determine the direction of arrival of the UAV.
Conventionally, once the audio signal of the UAV is received, the power or frequency spectrum is analyzed to identify the UAV. An example implementation of this type of UAV detection is explained in [31]. This paper shows how to estimate and track the location of a target by triangulation with two or more microphone arrays, in addition to how the UAV model can be obtained by measuring the sound spectrum of the target. In this report, a small tetrahedral array of microphones was used. The results show that, at best, the detection algorithm achieves a 99.5% probability of detection and a 3% false alarm rate. On the other hand, the tracking algorithm often misses trajectories when other trajectories are present, and the elevation tracking is poor.
Another example of UAV detection using acoustic signals is shown in [32]. In this work, the data collection equipment is composed of two individual microphone arrays in 16-X and 4-L configurations where the microphones are placed on the ground and mounted on metal spikes, while the elevated sensors are placed on tripods. These microphones are covered by six-inch-thick foam shields to protect them and limit the effects of wind. Once the signals have been captured by the arrays, they must be processed and analyzed. The data processing developed, as well as the analysis of the acoustic sensor arrays, has been tested by being used to detect and track the trajectory of UAVs at low altitude and tactical distances. This process operates best under benign daytime conditions and is approximately five times better at detecting noisier, medium-sized, gasoline-powered UAVs than small, electric-powered UAVs.
In the literature, there are some machine learning (ML) approaches to classify the UAV from audio data. A support vector machine (SVM) is implemented to analyze the signal of a UAV engine and to build the signal fingerprint of the UAV. The results show that the classifier can precisely distinguish the UAVs in some scenarios [33]. Another example of using deep learning methods to detect UAVs with acoustic signals is shown in [34]. In this paper, there is a comparison among Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs) and Convolutional Recurrent Neural Networks (CRNNs) using melspectrogram features. Here, the CNNs show the best results, achieving the highest average accuracy of 94.7%. In summary, machine learning presents an ability to recognize and locate the UAV. However, the nature of acoustic approaches limits the deployment and detection of UAVs.
In [35], a detailed study was conducted on how drone detection is performed by using acoustic signals, and it characterized how the microphone array in charge of capturing the sound signal should be organized. The geometry of the microphone array depends on the application to be carried out, although, when the desired signal can come from any angle, the best geometry is the circular array. The possible geometries studied were uniform linear array (ULA), uniform circular array (UCA) and uniform rectangular array (URA). In the array, it is important to know the number of microphones, which usually ranges from 4 to 16 microphones (in steps of two), and the distance between sensors, which usually ranges from 0.3 to 0.6 meters in increments of 0.05 meters.
A commercial C-UAS solution from Dedrone enterprise is Dedrone DroneTracker [36], which is a multiple-sensor unit that may integrate an ultrasonic audio detector. Its specifications are shown in Table 8.

2.5. Detection through Video/Images

Vision-based UAV detection techniques mainly focus on image processing. Cameras and videos are used to capture the images of UAVs. Then, using artificial vision techniques, UAVs positions are estimated.
A vision-based UAV detection approach is presented in [37]. This approach consists of an online recognition system for the identification of 3D objects. The system uses a black-and-white television camera to provide a 2D image on a digital computer. After obtaining the image on the computer, the next step is to remove the clutter from the image by means of a preprocess that provides a clean silhouette as well as its boundaries. At the time of the calculations, certain characteristics are obtained and are used to identify the objects, the position they occupy and their orientation in space by means of a recognition algorithm. A similar system is the one developed in [38], which makes use of classical vision algorithms. This system starts by taking the first image, which is used for initialization of the background estimation. Then a loop is started where the trajectories are predicted at the capture time for each new image taken by the cameras. All those pixels that are different from the previously estimated background are detected and form one or more blobs related to the current targets. These blobs are extracted using trajectory predictions, edge detectors and motion detectors. With blobs and an association process, one or more blobs are associated with each target, and in addition, the blobs within the association are used to initialize the tracks. Finally, each track is updated with its corresponding blobs, and the not-updated tracks are deleted.
In contrast, nonconventional segmentation methods make use of neural networks to directly identify the appearance of UAVs. For example, in [39], the authors developed a system that is capable of detecting, recognizing, and tracking a UAV using a single camera automatically. For that purpose, a single Pan–Tilt–Zoom (PTZ) camera detects flying objects and obtains their tracks; once a track is identified as a UAV, it locks the PTZ control system to capture the detailed image of the target region. Afterward, the images can be classified into the UAV and interference classes (such as birds) by a convolution neural network classifier trained with an image dataset. The identification accuracy of track and image reaches 99.50% and 99.89%, respectively. This system could be applied in a complex environment where many birds and UAVs appear simultaneously.
It is possible to detect UAVs from the cameras of other UAVs. An approach for online detection of small UAVs and estimation of their positions and velocities in a 3D environment from a single moving (on-board) camera is presented in [40]. The methods used are computationally light, despite the complexity of computer vision algorithms, so they may be used on UAVs with limited payload. This approach incorporates fast object detection using an AdaBoost-based tracking algorithm. Real-time performance with accurate object detection and tracking is possible, enabling the tracker to extract the position and size of an aircraft from a video frame. The detections are given to a multitarget tracker to estimate the aircraft's position and velocity in 3D. The effectiveness of this method has been proven with an indoor experiment with three quadrotors. In [41], a general architecture for a highly accurate and computationally efficient UAV-to-UAV detection and tracking algorithm from a camera mounted on a moving UAV platform was developed. The system is composed of a moving target detector followed by a target tracker. The moving target detector accurately subtracts the background from subsequent frames by using a sparsely estimated global perspective transform. The target tracker consists of a Kalman tracker and was validated using public video data from multiple fixed-wing UAVs working in real time. Video surveillance has not yet been incorporated into our simulation models but is described here for completeness.
Next, we describe two commercial PTZ cameras used for drone detection and tracking. On the one hand, there is Axis Q6215-LE PTZ Network Camera from Axis Communications [42], which is a camera with normal range. Its specifications can be seen in Table 9.
On the other hand, there is Triton PT-Series HD Camera from FLIR Enterprise [43], which is a PTZ with very high range, whose specifications are detailed in Table 10.
Indra also has a camera/optronic sensor to be integrated in its ARMS system. Some details on it are described next, in Table 11.
Another company that markets this type of sensor is HGH USA, specifically with its product called Spynel Series [44]. Spynel is based on thermal imaging technology with a 360° thermal sensor, which works day and night. Spynel can track targets over a long range and wide area. The specifications of each sensor model that exists in this product series can be seen in Table 12.

2.6. Detection by Data Fusion

Detection using a collection of these techniques is the ultimate way to detect UAVs. Data fusion, which is the process of integrating multiple data sources to obtain more consistent, accurate and useful information than that provided by any of the individual techniques explained above, has the advantage of producing more informative and synthetic fused data than the original inputs. In the case of UAV detection, data fusion could be used to improve the performance of the UAV detection system by overcoming or alleviating the problems and disadvantages of the individual sensors.
However, data fusion should be conducted with great caution. The key problems to be solved can be referred to as data association, positional estimation, and temporal synchronization. Data association is a general method of combining data from different sensors by correlating one sensor observation with the other observations. This process should ensure that only measurements that refer to the same drone are associated. There are different ways to perform this process: one of them is by spatial synchronization, i.e., checking that a pair of measurements from different sensors have very similar position values. Coordinate changes, bias estimation and correction are sources of error to be considered in this process. Furthermore, before making any kind of association, it is necessary to perform a time synchronization so that all the measures refer to the same instant of time. The last problem faced by data fusion systems is filtering and prediction, for which they usually use common techniques such as Kalman filtering and Bayesian methods.
A low-cost, low-power methodology consisting of a fusion of technologies linking several sensors is presented in [45]. This technology includes a simple radar, an acoustic array of microphones and optical cameras that are used to detect, track, and discriminate potential airborne targets. The multimode sensor fusion algorithms employ the Kalman filter for target tracking, and an acoustic and visual recognition algorithm is implemented to classify targets. The first element of the multimode sensor network is the radar, which is responsible for detecting targets that are approaching the area of interest. The second component is the acoustic microphone array, whose main objectives are to provide target arrival direction and target identification and classification and to mitigate false alarms. The last sensor is the optical system composed of infrared detectors to improve the resolution of targets. Results show that this sensor fusion is useful for detecting, tracking, and discriminating small UAVs. Another set of heterogeneous sensors combined with a sensor data fusion is proposed in [46]. This system is composed of a Radio Frequency (RF) sensor to capture the uplink and downlink communications of the UAV, an acoustic sensor searching for the rotor noise, a passive radar system using the cellular network and a multihypothesis tracking (MHT) system for the fusion of sensor data. Finally, in the case explained in [47], the system is composed of different range acoustic, optical and radar sensors. There is a combination of sensors of long- and short-range detection, the passive RF receivers detect the UAV’s telemetry signals, and the camera and microphone sensors are used to increase the detection accuracy in the near field. Specifically, the system is composed of a 120-node acoustic array that uses acoustic signal to locate and track the UAV; 16 high-resolution optical cameras, which are used to detect the UAV in the middle distance; and MIMO radar (with three bands) to achieve remote detection in the long distance. The developed combination overcomes the drawbacks of each of the sensor types in UAV detection and maximizes the advantages of the sensors. At the same time, the system reduces the cost of large-scale sensor deployment.
In this paper, we focus on the simulation of individual sensors, so we do not simulate these integrated solutions, which remains an area for future research, especially for the cases in which some of the sensors are controlled by the outputs provided by others.
Regarding commercial solutions, some of them are based on integrating some of the previously described sensors. For instance, a commercial solution provided by Indra [13], called ARMS (Anti-RPAS Multisensor System), is a multilayer system ready to support the full C-UAS cycle, combining multiple types of sensors and countermeasures, ready to be deployed in different formats (fixed, mobile, portable) and designed to interact with complementary systems in order to provide defense against UAV threats. It is composed of a radar (described in Section 2.1), a jammer (to interfere with drone control or GPS navigation) and optronics (described in Section 2.5).
The HENSOLDT Xpeller Counter UAV solution [14] combines various types of sensors and effectors for protection against small drones. The sensors used to detect and identify are radars, electro-optics, rangefinders, and direction finders. Its radars were described in Section 2.1, and it also identifies the potential threats via visual confirmation with a multispectral camera.
Meanwhile, Dedrone provides a complete airspace security system [29]. Different types of sensors may be connected to the DedroneTracker software. The sensors provided by Dedrone are RF sensors, radars, and cameras. Depending on the application, Dedrone has different radars [48] with different performances in the Dedrone platform, such as the Counter-Drone Radar from Echodyne [15] and the Ranger R8SS-3D from Flir [16], whose specifications were analyzed in Section 2.1. The last sensors integrated by Dedrone are PTZ cameras [49]. The DedroneTracker system software has a video analysis capability, able to detect and locate UAVs in real time. Depending on the application, Dedrone can integrate one or more PTZ camera models with different performance levels. On the one hand, there is the Axis Q6215-LE PTZ Network Camera from Axis Communications [42]; on the other hand, there is the Triton PT-Series HD Camera from FLIR [43]. They were described in Section 2.5.
Another company to have its drone detection solutions analyzed in this paper is DroneShield [50]. It has a range of stand-alone portable products and rapidly deployable fixed site solutions. One of the most remarkable ones is the DroneSentry product [51], which is an autonomous fixed C-UAS system that integrates DroneShield’s suite of sensors and countermeasures into a unified responsive platform. This product has as its primary detection method the RadarZero product [52], which is a radar, and/or the RfOne RF detector [53]. It has secondary detection methods such as the WideAlert acoustic sensors and DroneOpt camera sensor [54]. The main specifications of DroneSentry can be seen in Table 13.

2.7. Comparative Analysis of UAV Sensing Technologies

To conclude this review, Table 14 summarizes the main properties of the technologies described in the previous sections.

3. Counter-UAS Sensors Modeling

Modeling and simulation tools are a useful alternative to test and assess the performance of complex systems in a cost-effective manner. Regarding the usage of such tools to evaluate UTM systems, the authors have already proposed in [3] a simulation platform that aims to replicate drone operations and complex scenarios. The objective of the platform is to easily perform system-level evaluations of UTM. To do so, the platform simulates the required input information for UTM systems both in the preflight phase (operation definition submission for authorization) and the in-flight phase (telemetry messages from drones or tracks from surveillance networks). Thus, starting from a user-defined simulation scenario (which might include the occurrence of unexpected events or contingencies), the platform is able to replicate the behavior of the actors involved in a drone operation. Then, it forwards the required data streams to the UTM system under evaluation and can retrieve the resulting output information to carry out tests and generate evaluation metrics. This operation is schematically represented in Figure 1.
The platform follows an agent modeling approach where the behavior of drones, ground control stations, surveillance networks and the communication networks linking all agents is individually modeled. The complete behavior of the overall scenario arises from the autonomous interaction of these individually modeled agents. The environment in which drones operate is also simulated, including terrain, weather, or airspace constraints. With this approach, the platform can currently simulate drone trajectories or effects such as navigation errors, communication disturbances (i.e., latencies, packet losses, etc.) and drone detection from sensors.
A model-agnostic, extendable microservices-based architecture has been used to implement the platform, as depicted in Figure 2. The architecture allows for defining multiple simulation models for each agent that can be easily implemented and simultaneously simulated. The simulation of each agent is isolated within a separate microservice so that modeling changes in each service do not affect the rest of the platform. It also provides utilities to define replicable simulation scenarios where the simulated agent’s specification can be defined together with the selected model to carry out their simulation.
A set of simple simulation models for each agent was initially provided, as described in [3]. Particularly, a simplistic technology-agnostic model for noncooperative sensors was already provided. This model just considered a maximum range for each sensor following a pass–not pass approach. It also included a constant additive Gaussian noise to model detection inaccuracies.
The models proposed in this section for different technologies aim to improve that simplistic model by designing more accurate models that are based on the inner operation of each sensor type. Measurement simulation models are proposed in this paper for the following sensors: active radars, passive radars, and microphone sensors.
By integrating these enhanced models into the existing platform (which can be easily done by modifying the preexisting surveillance network simulation service), it is possible to assess the performance of those sensors in realistic scenarios. Simulation scenarios defined for the platform not only consider the number of drones, their trajectories and the distribution of surveillance sensors; they also allow for simulating emergent effects from the interaction of sensors with other agents. For instance, the simulator is also able to simulate the network used by sensors to forward information to a UTM system and how it affects track reporting periodicity, latencies, etc. To summarize, the models proposed in this section will enhance the capabilities of the preexisting simulation tool, but they will also benefit from the integration in such platform for assessing the performance of surveillance sensors.
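As a purely illustrative sketch of this scenario-definition idea (the field names and model identifiers below are hypothetical and do not reproduce the platform's actual configuration schema), a scenario could pair each simulated agent with the model selected for it:

```python
# Hypothetical scenario description: every agent is declared together with the
# simulation model chosen for it. Names and fields are illustrative only.
scenario = {
    "drones": [
        {"id": "uas-01", "model": "waypoint_kinematics",
         "route": [(40.4168, -3.7038, 60.0), (40.4200, -3.7000, 80.0)]},
    ],
    "surveillance_network": [
        {"id": "radar-north", "model": "quasi_monostatic_radar",
         "position": (40.4190, -3.7050, 0.0)},
        {"id": "mic-array-1", "model": "microphone_sensor",
         "position": (40.4175, -3.7020, 0.0)},
    ],
    "communication_network": {"model": "latency_and_loss",
                              "mean_latency_s": 0.2, "loss_rate": 0.01},
}
```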

3.1. Active Radar

Two different types of active radars have been modeled, quasi-monostatic radars and MIMO radars.

3.1.1. Quasi-Monostatic Radars

This radar will be simulated using a power model based on the radar equation. It will be assumed that the separation between transmitter and receiver is small compared to the distance to the target. In this first approximation, it is assumed that the radar can eliminate the clutter by Doppler filtering. It is also assumed that the predominant noise is thermal. Its calculation now depends on the surrounding conditions and not only on the bandwidth. The main characteristics are:
  • Radar cross section dependent on target size [55].
  • CFAR detection.
  • Radar parameters adapted to drone detection (integration times of the order of tens of milliseconds and range resolutions on the order of meters).
  • Exploration times around a second.
  • Measurement error simulation.
  • Minimum scan time below a second.
  • False alarm simulation.
The basic parameters defining the model of a quasi-monostatic radar are:
  • Instrumental range ($R_{max}$).
  • Minimum and maximum azimuth of coverage.
  • Distance resolution in meters ($\Delta_{dis}$).
  • Bandwidth.
  • Transmitting array position.
  • Transmitted power in W.
  • Azimuthal width of the transmission pattern ($\theta_{3dB,azi,T}$).
  • Elevation width of the transmission pattern ($\theta_{3dB,ele,T}$).
  • Position of the receiving array.
  • Number of receiver array beams ($N_{beams}$).
  • Receiver array azimuth beamwidth ($\theta_{3dB,azi,R}$).
  • Receiver array elevation beamwidth ($\theta_{3dB,ele,R}$).
  • Dwell time.
  • Minimum time between scans.
  • Minimum and maximum frequency.
  • False alarm probability.
The typical expression for the radar equation of a quasi-monostatic radar is simpler than that of a microwave radar since the free-space propagation losses are included within the ground-wave propagation losses. The radar equation is:
$$\frac{S}{N} = \frac{P_{av}\, G_T\, G_R\, \sigma\, \lambda^2\, T_{int}}{(4\pi)^3\, R_{Tt}^2\, R_{tR}^2\, L_s\, N_0}$$
where:
  • $S/N$: signal-to-noise ratio at the detector
  • $P_{av}$: average power of the system
  • $G_T$: transmit antenna power gain
  • $G_R$: receiver antenna power gain
  • $\sigma$: cross section
  • $\lambda$: wavelength
  • $T_{int}$: integration time
  • $R_{Tt}$: transmitter–target distance
  • $R_{tR}$: target–receiver distance
  • $L_s$: power losses of the radar system
  • $N_0$: system noise
To obtain the elevation gain, the elevation width is considered, and to obtain the antenna gain, the azimuth shaping is considered. In this case and considering that the beamforming is conducted only in azimuth, the array gain is estimated approximately as 360° divided by the beamwidth.
$$G_T \approx \frac{4\pi}{\theta_{3dB,ele,T}\; \theta_{3dB,azi,T}}$$
$$G_R \approx \frac{4\pi}{\theta_{3dB,ele,R}\; \theta_{3dB,azi,R}}$$
$$\theta_{3dB,R} = \frac{\theta_{max} - \theta_{min}}{N_{beams}}$$
The system losses depend on many factors such as the antenna feed or the construction of the processing. A loss factor of around 4 dB has been given as a typical value. The cross section for these frequency bands depends on the target size, and it is modeled as constant for all angles (0.01–0.1 m2).
The reception noise is predominantly thermal noise due to the frequencies being used.
$$N_0 = k\, T_0\, 10^{F_a/10}$$
where $k$ is the Boltzmann constant, $T_0$ is the Earth temperature (typically 290 K) and $F_a$ is a noise factor with a typical value of 5 dB.
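As an illustration, the following Python sketch evaluates the antenna gain approximations, the thermal noise term and the radar equation above for a quasi-monostatic geometry. All numerical values are placeholders, not parameters of any particular radar:

```python
import numpy as np

BOLTZMANN = 1.380649e-23  # J/K
T0 = 290.0                # reference temperature (K)

def antenna_gain(elev_bw_rad, azim_bw_rad):
    """Approximate power gain from the 3 dB elevation/azimuth beamwidths."""
    return 4.0 * np.pi / (elev_bw_rad * azim_bw_rad)

def quasi_monostatic_snr(p_av, g_t, g_r, rcs, wavelength, t_int,
                         r_tx_target, r_target_rx, losses, fa_db):
    """S/N at the detector from the quasi-monostatic radar equation above."""
    n0 = BOLTZMANN * T0 * 10.0 ** (fa_db / 10.0)
    num = p_av * g_t * g_r * rcs * wavelength ** 2 * t_int
    den = (4.0 * np.pi) ** 3 * r_tx_target ** 2 * r_target_rx ** 2 * losses * n0
    return num / den

# Placeholder example: 0.05 m2 target at 2 km, 10 GHz carrier, 20 ms integration.
g_t = antenna_gain(np.deg2rad(20.0), np.deg2rad(90.0))   # wide transmit beam
g_r = antenna_gain(np.deg2rad(20.0), np.deg2rad(2.0))    # narrow receive beam
snr = quasi_monostatic_snr(p_av=10.0, g_t=g_t, g_r=g_r, rcs=0.05,
                           wavelength=0.03, t_int=0.02,
                           r_tx_target=2000.0, r_target_rx=2000.0,
                           losses=10 ** (4.0 / 10.0), fa_db=5.0)
print(f"S/N = {10 * np.log10(snr):.1f} dB")
```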
Once the SNR has been calculated, the detection, false alarms and measurement position must be generated. The detector is assumed to be a CA-CFAR so it is assumed that the target behaves as a Swerling I between scans and the noise residual has a Gaussian distribution [56]. The threshold of the CFAR is obtained with the following expression:
$$\alpha = \left(P_{FA}\right)^{-1/N} - 1$$
where α is the CFAR threshold factor, N is the number of CFAR cells and PFA is the false alarm probability.
The probability of detection (PD) is calculated according to the following expression corresponding to a CA-CFAR and a Swerling I target in Gaussian noise:
$$P_D = \left(1 + \frac{\alpha}{1 + S/N}\right)^{-N}$$
The generation of whether there is detection or not is completed by generating a uniform random variable and comparing it with the probability of detection:
$$\mathrm{Detection} = \left(\mathrm{rand}(0,1) \leq P_D\right)$$
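A minimal sketch of this detection step (CA-CFAR threshold, Swerling I detection probability and the uniform draw), assuming placeholder values for the SNR, false alarm probability and number of reference cells:

```python
import numpy as np

def ca_cfar_detection(snr, p_fa=1e-4, n_cells=10, rng=None):
    """Draw a detection decision for a CA-CFAR and a Swerling I target."""
    rng = np.random.default_rng() if rng is None else rng
    alpha = p_fa ** (-1.0 / n_cells) - 1.0            # CFAR threshold factor
    p_d = (1.0 + alpha / (1.0 + snr)) ** (-n_cells)   # probability of detection
    detected = rng.uniform() <= p_d                   # Bernoulli draw
    return p_d, detected

p_d, detected = ca_cfar_detection(snr=200.0)  # SNR of roughly 23 dB
print(f"Pd = {p_d:.3f}, detected = {detected}")
```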
On the other hand, several false alarms per lap will be generated and output at each scan of the space. The average number of alarms per lap is calculated with the following expression:
$$N_{alarms} = P_{FA}\, \left(\frac{R_{max}}{\Delta_{dis}}\right) N_{beams}$$
A binomial random variable with mean $N_{alarms}$ is generated for each scan. The positions corresponding to the $N$ false alarms are then generated uniformly in azimuth, elevation and distance. The position of each alarm is generated as:
$$\rho_i = R_{max}\, \mathrm{Rand}(0,1)$$
$$\theta_i = \theta_{min} + (\theta_{max} - \theta_{min})\, \mathrm{Rand}(0,1)$$
$$h_i = 20 + 100 \cdot \mathrm{Rand}(0,1)$$
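The false alarm generation for one scan can be sketched as follows (a binomial draw for the number of alarms, with positions distributed uniformly as in the expressions above; all parameter values are placeholders):

```python
import numpy as np

def generate_false_alarms(p_fa, r_max, delta_dis, n_beams,
                          theta_min, theta_max, rng=None):
    """Generate false-alarm plots (range, azimuth, height) for one scan."""
    rng = np.random.default_rng() if rng is None else rng
    n_cells = int(r_max / delta_dis) * n_beams      # resolution cells per scan
    n_alarms = rng.binomial(n_cells, p_fa)          # mean = PFA * number of cells
    rho = r_max * rng.uniform(size=n_alarms)                           # range
    theta = theta_min + (theta_max - theta_min) * rng.uniform(size=n_alarms)
    height = 20.0 + 100.0 * rng.uniform(size=n_alarms)                 # 20-120 m
    return np.column_stack([rho, theta, height])

alarms = generate_false_alarms(p_fa=1e-4, r_max=5000.0, delta_dis=5.0,
                               n_beams=16, theta_min=0.0, theta_max=2 * np.pi)
```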
If there has been detection, the measurement position is calculated assuming the quasi-monostatic configuration and adding to the true position of the aircraft errors in the radial direction and tangential to the direction of view from the receiver. It is assumed that the optimal distance, elevation, and azimuth estimators are being used. The expressions of their errors are given below.
$$\sigma_d = \frac{\Delta_{dis}}{1.63\, \sqrt{2\,(S/N)}}$$
$$\sigma_{azi} = \frac{\theta_{3dB,azi,R}}{\sqrt{2\,(S/N)}}$$
$$\sigma_{ele} = \frac{\theta_{3dB,ele,R}}{\sqrt{2\,(S/N)}}$$
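A sketch of the measurement generation for a detected target, applying the error standard deviations above to the true range, azimuth and elevation (all inputs are placeholders):

```python
import numpy as np

def noisy_measurement(r_true, az_true, el_true, snr,
                      delta_dis, az_bw_rad, el_bw_rad, rng=None):
    """Add range/azimuth/elevation errors with the deviations defined above."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_d = delta_dis / (1.63 * np.sqrt(2.0 * snr))
    sigma_az = az_bw_rad / np.sqrt(2.0 * snr)
    sigma_el = el_bw_rad / np.sqrt(2.0 * snr)
    return (r_true + sigma_d * rng.standard_normal(),
            az_true + sigma_az * rng.standard_normal(),
            el_true + sigma_el * rng.standard_normal())
```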

3.1.2. MIMO Radars

This simulator models a MIMO radar with spatially separated antennas at high frequencies (X, Ku, K or Ka). Each radar unit will have three transmitting antennas and one receiving antenna placed with the central transmitter. Since the antennas are widely separated and will view the target from different angles, echo coherence is not expected. Therefore, incoherent integration processing is performed, since coherent integration provides no gain. The main characteristics are:
  • Radar cross section dependent on drone size.
  • Several simultaneous transmitters.
  • CFAR detection.
  • Measurement error simulation.
  • Simultaneous space exploration system using simultaneous antenna beams (MIMO techniques).
  • Minimum scan time around one second.
  • False alarm simulation.
The basic parameters defining the model of a MIMO radar are:
  • Transmission frequency.
  • Position of each transmitting and receiving antenna.
  • Power transmitted by each transmitter.
  • Instrumental range.
  • Minimum azimuth of coverage.
  • Maximum azimuth of coverage.
  • Azimuthal width of the transmission pattern ($\theta_{3dB,azi,T}$).
  • Elevation width of the transmission pattern ($\theta_{3dB,ele,T}$).
  • Number of receiver array beams ($N_{beams}$).
  • Receiver array azimuth beamwidth ($\theta_{3dB,azi,R}$).
  • Receiver array elevation beamwidth ($\theta_{3dB,ele,R}$).
  • Integration time.
  • Bandwidth.
  • Distance resolution.
  • False alarm probability.
  • Scan period.
  • Antenna gain.
  • Secondary lobe level.
In this case, to obtain the target echo power at the receiver, the echo powers from each of the three transmitters are added. It is assumed that, in MIMO radar, before the incoherent integration of the signals from all transmitters, the possible clutter is coherently eliminated, but in this model, clutter is not considered. Therefore, the power received from a target will be obtained as the sum of the power received from each transmitter.
$$P_{target} = P_{Tx1} + P_{Tx2} + P_{Tx3}$$
$$P_{Txi} = \frac{P_{av\_i}\, G_{Txi}\, G_R\, T\, \lambda^2\, \sigma}{(4\pi)^3\, R_1^2\, R_2^2\, L_p} \cdot F_{p\_Txi\_Rx}$$
where:
  • $P_{av\_i}$: average power of transmitter $i$
  • $G_{Txi}$: power gain of transmitting antenna $i$
  • $G_R$: receiver antenna power gain
  • $\lambda$: wavelength
  • $\sigma$: cross section
  • $T$: integration time
  • $R_i$: distance in each path
  • $L_p$: power losses
  • $F_{p\_Txi\_Rx}$: propagation factor in the $i$th transmitter–target–receiver path
The average signal-to-noise ratio per echo is calculated by adding the target powers from each transmitter, dividing by the number of transmitters, and then dividing by the noise power.
$$\left(\frac{S}{N}\right) = \frac{P_{target}/N}{P_{N\_elec}}$$
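A brief sketch of this incoherent power summation (one bistatic radar-equation evaluation per transmitter, then the average S/N); function names and values are illustrative only:

```python
import numpy as np

def bistatic_echo_power(p_av, g_tx, g_rx, t_int, wavelength, rcs,
                        r_tx_target, r_target_rx, losses, f_prop=1.0):
    """Echo power received from one transmitter (bistatic radar equation)."""
    num = p_av * g_tx * g_rx * t_int * wavelength ** 2 * rcs * f_prop
    return num / ((4.0 * np.pi) ** 3 * r_tx_target ** 2 * r_target_rx ** 2 * losses)

def mimo_average_snr(per_tx_powers, noise_power):
    """Average S/N per echo: summed target power over N transmitters and noise."""
    return np.sum(per_tx_powers) / (len(per_tx_powers) * noise_power)
```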
Transmitting antennas are assumed to be uniformly patterned in coverage in the horizontal plane and to distribute their power to uniformly illuminate the scanned area from their respective positions. Transmitter gains are specified as a factor depending on the azimuth width of the scanned area. The gains are as follows:
$$G_T \approx \frac{4\pi}{\theta_{3dB,ele,T}\; \theta_{3dB,azi,T}}$$
$$G_R \approx \frac{4\pi}{\theta_{3dB,ele,R}\; \theta_{3dB,azi,R}}$$
The propagation factor, being free space, is assumed to be 1. The system losses are assumed to be 2 dB due to Doppler filtering envelopes and CFAR detection losses. Assuming that the images of the three transmitters are integrated incoherently, the CFAR reference can be taken as three times the reference of a single system. If a 10-cell reference is assumed for each radar, there will be 30 reference cells. For a $P_{FA}$ of $10^{-4}$, this means a loss of less than 1 dB [57]. The cross section for these radars depends on the target size, so it is specified by an equal constant for all angles (0.01–0.1 m2).
These radars operate at high microwave frequencies, and consequently, the predominant noise is thermal noise. Therefore, the noise power is obtained as:
$$P_{N\_elec} = k\, T_0\, f_{N\_total}$$
where $f_{N\_total}$ is the receiver and antenna noise figure (a typical value of 5 dB has been taken). The bandwidth is not shown since it is assumed that the receiver uses a matched filter, and for the calculation of the $(S/N)$ ratio, the integration time has already been included in the radar equation.
The detection will be calculated after integrating the echoes from the three transmitters in an incoherent way by making the appropriate corrections according to the relative positions of the transmitters. After filtering, the square of the envelope is found, and the one coming from the three transmitters is integrated. The expression of the detection threshold is obtained according to the following expression [58]:
$$P_{FA} = 1 - P(Y_b, N)$$
where $P(Y_b, N)$ represents the incomplete gamma function of order $N$ (the number of transmitters), and $Y_b$ is the detection threshold for the specified $P_{FA}$. The threshold is calculated by solving the above equation using the inverse of the incomplete gamma function. The probability of detection ($P_D$) is calculated according to the following expression, corresponding to the integration of the echo from all transmitters:
$$P_D = 1 - P\!\left(\frac{Y_b}{1 + (S/N)/2},\; N\right)$$
where $S/N$ is the average signal-to-noise ratio per transmitter, calculated with the radar equation. These expressions are for the integrator without CFAR; the effect of CFAR has been included through a term in the power losses. Whether or not there is a detection is decided by generating a uniform random variable and comparing it with the probability of detection, as in the quasi-monostatic radar.
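The threshold and detection probability above can be evaluated with the regularized incomplete gamma function; the sketch below uses SciPy's gammainc/gammaincinv as the function P and its inverse (the numerical values are placeholders):

```python
from scipy.special import gammainc, gammaincinv

def mimo_detection_probability(snr_avg, p_fa=1e-4, n_tx=3):
    """Pd after incoherent integration of N transmitter echoes (no CFAR term)."""
    y_b = gammaincinv(n_tx, 1.0 - p_fa)                  # detection threshold
    p_d = 1.0 - gammainc(n_tx, y_b / (1.0 + snr_avg / 2.0))
    return y_b, p_d

y_b, p_d = mimo_detection_probability(snr_avg=100.0)     # ~20 dB average S/N
print(f"threshold = {y_b:.2f}, Pd = {p_d:.3f}")
```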
On the other hand, a few false alarms per lap and their positions will be generated as in the quasi-monostatic radar. Finally, if there has been detection, the measurement position is also calculated as in the quasi-monostatic radar.

3.2. Passive Radar

Passive radars will be simulated using a power model from the radar equation, which includes a multipath propagation model. These radars are of multistatic type. The predominant noise will be the direct transmitter–receiver signal interference, which will be considered in the model. The main characteristics are:
  • Radar cross section dependent on drone size.
  • Possibility of several simultaneous opportunity transmitters.
  • Opportunity transmitters self-interference simulation.
  • Simulation of electrical noise dependent on the frequency of the transmitter (atmospheric, human and white noise).
  • CFAR detection.
  • Measurement error simulation.
  • Simultaneous space exploration system using multilateration position determination techniques (bistatic or multistatic radars).
  • Minimum scan time in the order of seconds (>1 s).
  • False alarm simulation.
The basic parameters defining the model of a passive radar are:
  • Passive radar position.
  • Instrumental range in kilometers.
  • Minimum and maximum azimuth of coverage in degrees.
  • Number of receiver antenna beams.
  • Number of opportunity transmitters.
  • Position of each opportunity transmitter.
  • Carrier frequency of each opportunity transmitter.
  • Transmitted power of each opportunity transmitter.
  • Antenna gain.
  • Gain of the receiving antenna of the direct signal.
  • Bandwidth of each opportunity transmitter.
  • Integration time in seconds.
  • Number of receiver array beams ($N_{beams}$).
  • Receiver array azimuth beamwidth ($\theta_{3dB,azi,R}$).
  • Receiver array elevation beamwidth ($\theta_{3dB,ele,R}$).
  • Receiving array sidelobe level.
  • Direct signal cancellation level.
  • False alarm probability.
  • Scan period in seconds.
The radar equation for a passive radar is implemented in several steps. The first is to calculate the received signal power of the target echo before the correlator for an opportunity transmitter.
$$P_{target} = \frac{P_{av}\, G_T\, G_R\, \lambda^2\, \sigma_{final}}{(4\pi)^3\, R_1^2\, R_2^2\, L_p} \cdot F_p$$
where:
  • $P_{av}$: average power of the transmitter
  • $G_T$: transmitting antenna power gain
  • $G_R$: receiver antenna power gain
  • $\lambda$: wavelength
  • $\sigma_{final}$: cross section
  • $R_i$: distance in each path
  • $L_p$: power losses
  • $F_p$: propagation factor
The $(S/N)$ ratio is calculated at the correlator output, where the signal will have a gain equal to the square of the product of bandwidth and integration time. The interference powers (noise and correlator side lobes of the different signals, including the target) have a gain equal to the product of bandwidth and integration time. The $S/N$ ratio to be used in the CFAR is computed with the following expression:
$$\left(\frac{S}{N}\right) = \frac{P_{target}\,(B\,T)^2}{P_{target}\,(B\,T) + P_{N\_clutter}\,(B\,T) + P_{N\_signal}\,(B\,T) + P_{N\_elec}\,(B\,T)}$$
where $P_{N\_clutter}$, $P_{N\_signal}$ and $P_{N\_elec}$ are the clutter, direct signal and electrical noise powers at the correlator input, respectively (these are calculated in the corresponding sections); $T$ is the integration interval and $B$ is the bandwidth (these parameters are specified for each opportunity signal).
Transmitting antennas are assumed to have uniform pattern in coverage in the horizontal plane. The transmit gain is specified as a parameter of the transmitter. This gain, if omnidirectional broadcasting is assumed, will be around that of a half-wave dipole (2.15 dBi). As the transmitter power is usually given in apparent radiated power, which considers the gain of the transmitting antenna over the half-wave dipole, 2.15 dBi is the gain over the isotropic that we will assume of the transmitting antenna. For the receiving antenna, a circular array is assumed that generates a number N of beams covering 360°.
$$G_T = 10^{G_{T,\mathrm{dB}}/10}$$
$$G_R \approx \frac{4\pi}{\theta_{3dB,ele,R}\; \theta_{3dB,azi,R}}$$
The above gain expression will be used for signals within the main beam. For signals entering through the secondary lobes (in the case of passive radars, it will consider the direct signal entering through the secondary lobes of the antenna), it is considered that the signal suffers a constant gain equal to the level of the secondary lobes of the antenna.
$$G_{R\_sidelobes} = \frac{4\pi}{\theta_{3dB,ele,R}\; \theta_{3dB,azi,R}} \cdot 10^{G_{r\_sidelobe}/10}$$
The propagation factor, being free space, is assumed to be 1. System losses, for passive radars, are 2 dB due to the correlator windowing to reduce the secondary lobes in distance and doppler and another 2 dB due to the clutter elimination system and direct signal. The cross section for these radars depends on the target size, so it is specified by an equal constant for all angles (0.01–0.1 m2).
The noise of a passive radar is composed of three main components: the radio noise; the self-interference of the signal itself due to the sidelobes of the cross-correlation function; and the residue of the sidelobes of the direct signal autocorrelation (ambiguity) function. In this model, clutter power is assumed to be zero.
The power of the direct signal arriving at the receiver is calculated using the propagation equation and applying the attenuation that a typical direct signal canceller can provide.
$P_{N\_signal} = \dfrac{P_{av}\,G_T\,G_{R\_sidelobe}\,\lambda^2}{(4\pi)^2\,R_b^2} \cdot L_{direct}$
After the correlator, the echo of the direct signal appears at zero distance and is eliminated. What remains is the residue of the secondary lobes of the ambiguity function spread over the entire Doppler–distance space.
The reception noise in this band is dominated by the antenna noise (man-made noise + galactic noise + atmospheric noise), which prevails over the thermal noise. Finally, the power of the radio noise can be obtained as:
$P_{N\_elec} = k\,T_0\,f_{N\_total}\,B$
$f_{N\_total} = 10^{F_a/10} - 1 + L_l\,L_a\,f_{Rx}$
where $F_a$ is calculated as in quasi-monostatic radars, $L_l$ represents the transmission line losses (typically 0.5 dB [57]), $L_a$ represents the resistive losses in the antenna (typically 0.5 dB [57]) and $f_{Rx}$ represents the receiver noise figure (typically 4 dB [57]). In this first approximation, it has been assumed that there is no clutter.
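A minimal sketch of this noise computation, using Boltzmann's constant $k$, $T_0 = 290$ K and the typical values quoted above, could look as follows; the external noise figure used in the example call is purely illustrative.

```python
K_BOLTZMANN = 1.380649e-23  # J/K
T0 = 290.0                  # reference temperature, K

def radio_noise_power(fa_db: float, bandwidth_hz: float,
                      line_loss_db: float = 0.5,
                      antenna_loss_db: float = 0.5,
                      rx_noise_figure_db: float = 4.0) -> float:
    """P_N_elec = k*T0*f_N_total*B with f_N_total = 10^(Fa/10) - 1 + Ll*La*fRx."""
    lin = lambda x_db: 10.0 ** (x_db / 10.0)
    f_n_total = (lin(fa_db) - 1.0
                 + lin(line_loss_db) * lin(antenna_loss_db) * lin(rx_noise_figure_db))
    return K_BOLTZMANN * T0 * f_n_total * bandwidth_hz

# Hypothetical external noise figure of 30 dB at UHF and a 7.6 MHz DVB-T channel.
p_noise = radio_noise_power(fa_db=30.0, bandwidth_hz=7.6e6)
print(f"P_N_elec = {p_noise:.3e} W")
```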
Once the signal-to-noise ratio is obtained, detections and false alarms are generated. This generation is the same as that of the quasi-monostatic radars, so the explanation of its procedure can be seen in that section, except for elevation, since these passive radars do not calculate target height. The measured position is generated by adding to the actual position a zero-mean Gaussian random variable whose covariance matches the measurement noise. The measurement accuracy is calculated in local radar coordinates centered on the receiver (y-axis north, x-axis east). First, the covariance matrix of the measurement is calculated in local Cartesian coordinates as [59]:
$P = \begin{bmatrix} \sigma_x^2 & \sigma_{xy} \\ \sigma_{yx} & \sigma_y^2 \end{bmatrix} = \left\{\dfrac{\partial \hat{\mathbf{x}}}{\partial \mathbf{r}_{R\theta}}\right\} \begin{bmatrix} \sigma_R^2 & 0 \\ 0 & \sigma_\theta^2 \end{bmatrix} \left\{\dfrac{\partial \hat{\mathbf{x}}}{\partial \mathbf{r}_{R\theta}}\right\}^T$
$\left\{\dfrac{\partial \hat{\mathbf{x}}}{\partial \mathbf{r}_{R\theta}}\right\} = \begin{bmatrix} \dfrac{\partial \hat{x}}{\partial R} & \dfrac{\partial \hat{x}}{\partial \theta} \\ \dfrac{\partial \hat{y}}{\partial R} & \dfrac{\partial \hat{y}}{\partial \theta} \end{bmatrix}$
$\hat{x} = \dfrac{\left((R+R_b)^2 - R_b^2\right)\sin(\theta)}{2\left(R + R_b - R_b\sin(\theta)\right)}$
$\hat{y} = \dfrac{\left((R+R_b)^2 - R_b^2\right)\cos(\theta)}{2\left(R + R_b - R_b\sin(\theta)\right)}$
where $\sigma_R$ and $\sigma_\theta$ are the standard deviations, calculated as in the quasi-monostatic radar; $R$ is the bistatic range ($R_1 + R_2 - R_b$) and $R_b$ is the baseline (transmitter–receiver) distance.
Finally, the output positions are found by generating a 2D Gaussian random variable correlated according to the previous covariance matrix:
$\begin{bmatrix} x_{measure} \\ y_{measure} \end{bmatrix} = \begin{bmatrix} x \\ y \end{bmatrix} + \left\{\dfrac{\partial \hat{\mathbf{x}}}{\partial \mathbf{r}_{R\theta}}\right\} \begin{bmatrix} \sigma_R \cdot \mathrm{randn}(1) \\ \sigma_\theta \cdot \mathrm{randn}(1) \end{bmatrix}$
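The measurement-generation procedure can be sketched as follows: the Jacobian of the bistatic-range/azimuth-to-Cartesian conversion maps independent $(R, \theta)$ errors into correlated $(x, y)$ errors. This NumPy sketch uses a numerical Jacobian instead of the analytic derivatives, which is a shortcut of ours; the function names are illustrative and not part of the simulator.

```python
import numpy as np

def bistatic_xy(r_bi, theta, r_b):
    """Receiver-centred Cartesian position (x east, y north) from bistatic range and azimuth."""
    r2 = ((r_bi + r_b) ** 2 - r_b ** 2) / (2.0 * (r_bi + r_b - r_b * np.sin(theta)))
    return r2 * np.sin(theta), r2 * np.cos(theta)

def bistatic_measurement(x_true, y_true, r_bi, theta, r_b, sigma_r, sigma_theta, rng=None):
    """Add correlated Gaussian noise to the true position via the (R, theta) Jacobian."""
    rng = np.random.default_rng() if rng is None else rng
    # Numerical Jacobian of the (R, theta) -> (x, y) conversion.
    dr, dth = 1e-3, 1e-6
    x0, y0 = bistatic_xy(r_bi, theta, r_b)
    xr, yr = bistatic_xy(r_bi + dr, theta, r_b)
    xt, yt = bistatic_xy(r_bi, theta + dth, r_b)
    jac = np.array([[(xr - x0) / dr, (xt - x0) / dth],
                    [(yr - y0) / dr, (yt - y0) / dth]])
    noise_rtheta = np.array([sigma_r * rng.standard_normal(),
                             sigma_theta * rng.standard_normal()])
    x_meas, y_meas = np.array([x_true, y_true]) + jac @ noise_rtheta
    cov = jac @ np.diag([sigma_r ** 2, sigma_theta ** 2]) @ jac.T  # measurement covariance
    return (x_meas, y_meas), cov
```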

3.3. Microphone Sensor and RF Sensor

Microphone sensors and RF sensors are modeled following an azimuth model: from the power of the signals emitted by the drones, either acoustic or radio frequency, the signal-to-noise ratio at the sensor input is calculated and, from the detection, the azimuth and distance to the sensor are estimated. These sensors have a certain sensitivity and, depending on the signal-to-noise ratio, there will or will not be a detection. The main characteristics are:
  • Surface propagation losses over land.
  • Parameters adapted to drone detection (integration times of the order of minutes).
  • Exploration times on the order of minutes.
  • Measurement error simulation.
  • False alarm simulation.
The basic parameters defining the model of a sensor are:
  • Sensor position.
  • Sensor sensitivity.
  • Distance resolution in kilometers.
  • Maximum range in meters.
  • Number of sensors.
  • Minimum and maximum azimuth in degrees.
  • Bandwidth.
  • Receiving beamwidth in degrees.
  • Direct signal cancellation level.
  • False alarm probability.
The first step is to calculate the received power of the target signal (the drone's acoustic or RF emission) before the correlator:
$P_{target} = \dfrac{P_{average}}{DC} \cdot Att \cdot L_p$
where $P_{average}$ is the noise power emitted by the drone, $DC$ is the directivity correction, $Att$ is the attenuation and $L_p$ is the system power loss.
The signal-to-noise ratio $(S/N)$ is calculated at the correlator output, where the interference powers are the noise and the correlator sidelobes of the different signals, including the target. The $S/N$ ratio is computed with the following expression:
$\left(\dfrac{S}{N}\right) = \dfrac{P_{target}}{P_{N\_clutter} + P_{N\_signal} + P_{N\_elec}}$
In this case, a multisource scenario is assumed, so the directivity correction factor is set to 3 dB. Once the source and its powers have been characterized, the effects that produce attenuation are calculated. The attenuation in real environments for the propagation of a wave is defined by the following equation:
$Att = A_{div} + A_{atm} + A_{gr} + A_{bar} + A_{misc}$
The waves emitted by a drone are those of an omnidirectional source: they propagate in all directions as spherical waves whose power level is the same at equal distances from the source. As this distance increases, the wave energy is distributed over an increasingly larger area, so that each time the distance is doubled the power level theoretically decreases by 6 dB. The geometric divergence attenuation is therefore:
$A_{div}(\mathrm{dB}) = 20 \cdot \log_{10}(R) + 11$
where R is the distance in meters between the drone and the sensor.
Atmospheric absorption is the attenuation due to nitrogen, oxygen and carbon dioxide as the wave propagates over a given distance to the receiver:
$A_{atm}(\mathrm{dB}) = \alpha\,(\mathrm{dB/km}) \cdot R\,(\mathrm{km})$
where $\alpha$ (dB/km) is the atmospheric attenuation coefficient, which depends on the frequency of the wave, the ambient temperature, the relative humidity of the air and the ambient pressure. Tabulated estimates of these values are available and, for the frequencies of interest, this coefficient takes values on the order of 1 × 10⁻³ to 1 × 10⁻².
Ground attenuation is mainly due to waves reflected by the ground surface interfering with the propagation of the main wave from the source to the receiver. This attenuation occurs when the source or the receiver is close to the ground surface. The model uses a simplified equation for the ground-effect attenuation, valid only for long distances over porous or mixed surfaces; as the source gets closer, this attenuation tends to disappear.
$A_{gr}(\mathrm{dB}) = 4.8 - \left(\dfrac{2 \cdot h_m}{R}\right) \cdot \left[17 + \left(\dfrac{300}{R}\right)\right]$
where $h_m$ is the average height of the propagation path above ground in meters, and $R$ represents the distance from the drone to the receiver, also in meters.
An object should be considered a shielding obstacle (barrier) if it has a surface density of at least 10 kg/m², a closed surface with no large cracks or gaps, and a horizontal dimension perpendicular to the transmitter–receiver line greater than the wavelength. Since the simulator is going to operate in real spaces filled with objects, barrier losses of 3 dB are assumed.
Finally, there may be other types of attenuation, such as those due to foliage or housing, for which losses of 3 dB are assumed.
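The attenuation terms can be combined as in the sketch below, which evaluates $A_{div}$, $A_{atm}$, $A_{gr}$ and the fixed barrier/miscellaneous losses for a given drone–sensor geometry. The default coefficient values are those quoted in the text, the clamping of negative ground attenuation is our assumption, and the function name is illustrative.

```python
import math

def acoustic_attenuation_db(distance_m: float, mean_height_m: float,
                            alpha_db_per_km: float = 5e-3,
                            barrier_db: float = 3.0,
                            misc_db: float = 3.0) -> float:
    """Total attenuation Att = A_div + A_atm + A_gr + A_bar + A_misc (all in dB)."""
    a_div = 20.0 * math.log10(distance_m) + 11.0
    a_atm = alpha_db_per_km * (distance_m / 1000.0)
    a_gr = 4.8 - (2.0 * mean_height_m / distance_m) * (17.0 + 300.0 / distance_m)
    a_gr = max(a_gr, 0.0)  # modeling assumption: negative ground attenuation is clamped to zero
    return a_div + a_atm + a_gr + barrier_db + misc_db

# Example: drone 500 m away, mean propagation-path height of 20 m.
print(f"Att = {acoustic_attenuation_db(500.0, 20.0):.1f} dB")
```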
The reception noise is assumed to be the microphone noise (man-made noise + atmospheric noise + natural interferences), which predominates over the thermal noise. Thus, the power of the audio noise is assumed to be a constant ($P_{N\_elec}$). Once the signal-to-noise ratio is obtained, the detection decision is generated. A logistic regression model is used for the probability of detection, so that detection becomes likely when the signal-to-noise ratio exceeds the sensitivity of the sensor:
$P_D = \dfrac{1}{1 + e^{-\left(\left(\frac{S}{N}\right) - Sensitivity\right)}}$
Whether there is a detection or not is then decided by generating a uniform random variable and comparing it with the probability of detection:
$Detection = \left(\mathrm{rand}(0,1) \leq P_D\right)$
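A direct, minimal transcription of the two expressions above (assuming both $S/N$ and sensitivity are expressed in dB, and using illustrative function names) is:

```python
import math
import random

def detection_probability(snr_db: float, sensitivity_db: float) -> float:
    """Logistic detection model: P_D = 1 / (1 + exp(-(S/N - Sensitivity)))."""
    return 1.0 / (1.0 + math.exp(-(snr_db - sensitivity_db)))

def is_detected(snr_db: float, sensitivity_db: float) -> bool:
    """Bernoulli draw against the detection probability."""
    return random.random() <= detection_probability(snr_db, sensitivity_db)

# A drone whose S/N is 3 dB above the sensor sensitivity is detected with ~95% probability.
print(detection_probability(35.0, 32.0))
```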
Later, false alarm generation and the generation of measurement positions are performed. These generations are the same as those of the quasi-monostatic radars, so the explanation of their procedures can be seen in that section. It should be noted that only azimuth measurements are obtained (through the measurement of the angle of arrival), as range measurement from acoustic signals would require multistatic/trilateration procedures.

4. Counter-UAS Simulation Results

The previous models have been implemented and integrated in the simulator described in [3]. This integration allows us to use the proposed models in realistic drone scenarios to assess and compare the performance of different sensors. In particular, a scenario is proposed in this section where an area of interest (e.g., a critical facility) is to be surveilled with different C-UAS sensors.
The simulated scenario is represented in Figure 3, where the area of interest to be protected is depicted in red. A radar-based surveillance solution is proposed, complemented with a microphone sensor that works at shorter distances. The proposed sensors (depicted as markers in Figure 3; an illustrative configuration sketch is given after this list) are:
  • Quasi-monostatic Radar (also named Active Radar in the figures): A quasi-monostatic radar has been installed in the middle of the critical infrastructure. The values used to model the sensor refer to some of the commercial radars detailed in the state-of-the-art section. The radar has an instrumented range of 10 km, a minimum azimuth of −180° and a maximum of 180°, 32 receiving beams and 10 m range resolution. The average transmitted power is 500 W. The minimum time between explorations is 0.06 s (the dwell time). The minimum frequency is 8 GHz, and the maximum frequency is 12 GHz. Finally, the reception beamwidth is 2° in azimuth and 6° in elevation.
  • Passive Radar: A passive radar is also installed in the center of the protected facility. It works in conjunction with a hypothetical transmitter of opportunity (e.g., a DVB-T transmitter) located around 20 km from the passive receiver. The transmitter has enough power to support an instrumented range of around 10 km. The minimum azimuth is −30°, and the maximum is 30°. The range resolution is about 20 m, and the azimuth resolution is 2°. A scan time of 1 s has been assumed, since the system explores its whole coverage simultaneously without mechanical antenna movement. The antenna has a gain of 2 dBi, a sidelobe level of 22 dB and a direct signal cancellation level of 60 dB. Finally, the carrier frequency is 600 MHz.
  • MIMO Radar: A MIMO radar is also installed in the critical infrastructure, with the following instrumental coverage: minimum azimuth −180°, maximum azimuth 180°, maximum range 10 km. It is a medium/short-range surveillance system with one receiver (located in the center) and three transmitters (located on the area perimeter); reception is performed simultaneously through 32 receiving beams. The transmitted power of each transmitter is 2 kW. The minimum scan period is 1 s. The range resolution is about 20 m, and the azimuth resolution is 3°. The antenna has a gain of 11.6 dBi, a sidelobe level of 13 dB and a direct signal cancellation level of 40 dB. Finally, the carrier frequency is 15 GHz.
  • Microphone sensor: A microphone sensor is also simulated in this scenario, located in the center of the critical area. It is an array composed of eight microphones separated by 0.5 m. The sensitivity of the array is 32 dB, and it has an instrumented range of 1 km. The minimum azimuth of the sensor is −180°, and the maximum azimuth is 180°.
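As referenced above, the sensor parameters of this scenario could be encoded, for instance, as plain dictionaries passed to the corresponding sensor models. The keys and structure below are hypothetical and shown only to illustrate how the scenario description maps onto the model parameters of Section 3; the values are those given in the list above.

```python
# Hypothetical configuration mirroring the scenario description (values from the text).
SCENARIO_SENSORS = {
    "quasi_monostatic_radar": {
        "range_km": 10, "azimuth_deg": (-180, 180), "beams": 32,
        "range_resolution_m": 10, "tx_power_w": 500, "dwell_time_s": 0.06,
        "freq_ghz": (8, 12), "beamwidth_deg": {"az": 2, "el": 6},
    },
    "passive_radar": {
        "tx_distance_km": 20, "range_km": 10, "azimuth_deg": (-30, 30),
        "range_resolution_m": 20, "azimuth_resolution_deg": 2, "scan_time_s": 1,
        "antenna_gain_dbi": 2, "sidelobe_level_db": 22,
        "cancellation_db": 60, "carrier_mhz": 600,
    },
    "mimo_radar": {
        "range_km": 10, "azimuth_deg": (-180, 180), "beams": 32, "transmitters": 3,
        "tx_power_kw": 2, "scan_time_s": 1, "range_resolution_m": 20,
        "azimuth_resolution_deg": 3, "antenna_gain_dbi": 11.6,
        "sidelobe_level_db": 13, "cancellation_db": 40, "carrier_ghz": 15,
    },
    "microphone_array": {
        "microphones": 8, "spacing_m": 0.5, "sensitivity_db": 32,
        "range_km": 1, "azimuth_deg": (-180, 180),
    },
}
```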
Although not frequent, violent and hostile acts against critical infrastructures have already occurred and are documented. The scenario is intended to assess the alert distance in case of an aerial attack and the positioning accuracy provided by each of the sensors of the described surveillance system. The simulated attack is conducted by a terrorist group that intends to infiltrate by air using an off-the-shelf, affordable, small drone such as a DJI Phantom 4 (estimated RCS of 0.01 m² and a noise level of around 80 dB). The departure place is located around 7 km away from the critical infrastructure. From there, the drone tries to make a direct approach at maximum speed following the direction depicted as a black arrow in Figure 3.
This scenario (i.e., drone trajectory, sensors’ location) has been easily represented and executed in real time using the simulation platform. After running it, the plots generated by each of the sensors were retrieved for analysis. These plots are shown in Figure 4, where the plots corresponding to actual drones and the false alarms are represented. To represent the plots from the microphone sensor (where only angular information is available), the actual distance of the drone is used.
False alarms are filtered out in Figure 5 and Figure 6 for each of the four sensors to facilitate the analysis. There, the alert distance provided by each of the sensors can be compared. As expected, radar-based sensors have a greater range than the microphone-based one, which only detects the drone in very close proximity. Within the radar-based sensors, active radar provides consistent detections from the beginning of the trajectory, whereas passive and MIMO sensors provide consistent results from a range of 3 and 4 km, respectively (as can be derived from plot density in Figure 7). It can also be checked that detection probability (related to the number of detections) increases as the distance to the sensor decreases. This result is the expected one, as detection probability increases with the SNR, which also increases as the distance to the sensor is reduced.
The positioning errors are depicted in Figure 7 for the angle error and in Figure 8 for the distance error (the microphone sensor is not included here). It can be observed that the magnitude of both types of errors decreases as the drone approaches the sensor receiving location. This can be explained, once again, by the dependency between the computed error and the SNR.
More detailed performance measures could be obtained, both in terms of detection and accuracy (i.e., PD vs. range, or angle/range error standard deviations vs. range). In the case of monostatic radars/RF sensors and acoustic sensors, it is possible to derive this relation by considering the PD/accuracy dependency on SNR, which, in turn, depends directly on range. However, for multistatic sensors, passive radars and distributed sensors in general, the relations are much more nonlinear, and the results are very scenario dependent. In these cases, the relative locations of the emitters, receivers, etc. have an important impact on the results.

5. Conclusions and Future Work

This paper reviews some of the current technologies used for the noncollaborative detection and tracking of UAVs and proposes a collection of simulation models, built by integrating preexisting radar and acoustic sensing models and adapting them to our application. These models allow for a lightweight simulation of the most important effects on the detection and estimation performance of C-UAS sensors and sensor networks.
There are some limitations on the current models, such as:
  • The radar models are not fully compatible with some newer electronically scanned antenna radars with adaptive track update rates.
  • The problem of target resolution has not been addressed, which could impose limitations on the simulation of nearby targets and drone swarms.
  • Multistatic sensors with several receivers have not been implemented.
  • Systematic errors (biases) related to sensor alignment, propagation, etc. have not been included in our models.
The presented simulation results show the capability to derive realistic measurements using these simulation models, following the expected behaviors regarding both detection and estimation accuracy performance. Finally, several future research lines related to this paper are identified:
  • Improvements of the models to alleviate the previous limitations.
  • Completion of the simulation with models adequate for RF UAS signal detection. The model is expected to be similar to that of the acoustic system, with specific modifications to define the noise power and the emitter signal power.
  • Completion of the simulation with models adequate for camera/vision systems.
  • Definition of simulation means to evaluate integrated deployments using different collaborating sensors and potentially managing the collection of sensors for specific tasks (long-range detection, short-term classification, clutter removal, etc.).
  • Integration with actual UTM systems and tracking systems.

Author Contributions

Conceptualization, J.A.B., I.C. and D.C.; Funding acquisition, J.A.B.; Methodology, J.A.B. and G.d.M.; Software, I.C., D.C. and L.B.; Supervision, J.A.B.; Validation, I.C. and G.d.M.; Writing—original draft, I.C.; Writing—review and editing, J.A.B., D.C., L.B. and G.d.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Spanish Ministry of Science and Innovation under Grant PID2020-118249RB-C21.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hassanalian, M.; Abdelkefi, A. Classifications, Applications, and Design Challenges of Drones: A Review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  2. Wang, J.; Liu, Y.; Song, H. Counter-Unmanned Aircraft System(s) (C-UAS): State of the Art, Challenges, and Future Trends. IEEE Aerosp. Electron. Syst. Mag. 2021, 36, 4–29. [Google Scholar] [CrossRef]
  3. Carramiñana, D.; Campaña, I.; Bergesio, L.; Bernardos, A.M.; Besada, J.A. Sensors and Communication Simulation for Unmanned Traffic Management. Sensors 2021, 21, 927. [Google Scholar] [CrossRef] [PubMed]
  4. Fioranelli, F.; Ritchie, M.; Griffiths, H.; Borrion, H. Classification of Loaded/Unloaded Micro-Drones Using Multistatic Radar. Electron. Lett. 2015, 51, 1813–1815. [Google Scholar] [CrossRef] [Green Version]
  5. Güvenç, İ.; Ozdemir, O.; Yapici, Y.; Mehrpouyan, H.; Matolak, D. Detection, Localization, and Tracking of Unauthorized UAS and Jammers. In Proceedings of the 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC), St. Petersburg, FL, USA, 17–21 September 2017; pp. 1–10. [Google Scholar]
  6. Krátký, M.; Fuxa, L. Mini UAVs Detection by Radar. In Proceedings of the International Conference on Military Technologies (ICMT) 2015, Brno, Czech Republic, 19–21 May 2015; pp. 1–5. [Google Scholar]
  7. Jahangir, M.; Baker, C. Persistence Surveillance of Difficult to Detect Micro-Drones with L-Band 3-D Holographic RadarTM. In Proceedings of the 2016 CIE International Conference on Radar (RADAR), Guangzhou, China, 10–13 October 2016; pp. 1–5. [Google Scholar]
  8. De Quevedo, Á.D.; Urzaiz, F.I.; Menoyo, J.G.; López, A.A. Drone Detection with X-Band Ubiquitous Radar. In Proceedings of the 2018 19th International Radar Symposium (IRS), Bonn, Germany, 20–22 June 2018; pp. 1–10. [Google Scholar]
  9. Kwag, Y.-K.; Woo, I.-S.; Kwak, H.-Y.; Jung, Y.-H. Multi-Mode SDR Radar Platform for Small Air-Vehicle Drone Detection. In Proceedings of the 2016 CIE International Conference on Radar (RADAR), Guangzhou, China, 10–13 October 2016; pp. 1–4. [Google Scholar]
  10. Paredes, J.A.; Álvarez, F.J.; Hansard, M.; Rajab, K.Z. A Gaussian Process Model for UAV Localization Using Millimetre Wave Radar. Expert Syst. Appl. 2021, 185, 115563. [Google Scholar] [CrossRef]
  11. Blyakhman, A.B.; Burov, V.N.; Myakinkov, A.V.; Ryndyk, A.G. Detection of Unmanned Aerial Vehicles via Multi-Static Forward Scattering Radar with Airborne Transmit Positions. In Proceedings of the 2014 International Radar Conference, Lille, France, 13–17 October 2014; pp. 1–5. [Google Scholar]
  12. ART Drone Detection Radar—ART Drone Sentinel—Anti Drone Radar System. Available online: https://www.advancedradartechnologies.com/products/art-drone-sentinel-2/ (accessed on 5 November 2021).
  13. INDRA ARMS Anti RPAS Multisensor System. Available online: https://counteruas.indracompany.com/es/home (accessed on 21 October 2021).
  14. Integrated Solutions|HENSOLDT. Available online: https://www.hensoldt.net/products/integrated-solutions/ (accessed on 24 November 2021).
  15. Counter-Drone Radar—Echodyne. Available online: https://echodyne.azurewebsites.net/security/counter-drone-radar/ (accessed on 5 November 2021).
  16. Radar de Panel de Vigilancia de Drones en Aire y en Tierra del Ranger R8SS-3D|Teledyne FLIR. Available online: https://www.flir.es/products/ranger-r8ss-3d/ (accessed on 5 November 2021).
  17. Doruk: UAV Detection Radar—RST. Available online: https://www.rstteknoloji.com.tr/en/solutions/doruk-uav-detection-radar/ (accessed on 25 October 2021).
  18. UAV Detection via Long-Time Coherent Integration for Passive Bistatic Radar|Elsevier Enhanced Reader. Available online: https://reader.elsevier.com/reader/sd/pii/S1051200421000361?token=67DAE35126030395C4842CC2AA2981FA4779B040C0079AB0B895F95C69806212205AABA4EA594CEB1A772FACE0476553&originRegion=eu-west-1&originCreation=20211104102124 (accessed on 4 November 2021).
  19. Martelli, T.; Murgia, F.; Colone, F.; Bongioanni, C.; Lombardo, P. Detection and 3D Localization of Ultralight Aircrafts and Drones with a WiFi-Based Passive Radar. In Proceedings of the International Conference on Radar Systems (Radar 2017), Belfast, Ireland, 23–26 October 2017; pp. 1–6. [Google Scholar]
  20. Knoedler, B.; Zemmari, R.; Koch, W. On the Detection of Small UAV Using a GSM Passive Coherent Location System. In Proceedings of the 2016 17th International Radar Symposium (IRS), Krakow, Poland, 10–12 May 2016; pp. 1–4. [Google Scholar]
  21. Schüpbach, C.; Patry, C.; Maasdorp, F.; Böniger, U.; Wellig, P. Micro-UAV Detection Using DAB-Based Passive Radar. In Proceedings of the 2017 IEEE Radar Conference (RadarConf), Seattle, WA, USA, 8–12 May 2017; pp. 1037–1040. [Google Scholar]
  22. Nguyen, P.; Ravindranatha, M.; Nguyen, A.; Han, R.; Vu, T. Investigating Cost-Effective RF-Based Detection of Drones. In DroNet’16: Proceedings of the 2nd Workshop on Micro Aerial Vehicle Networks, Systems, and Applications for Civilian Use; Association for Computing Machinery: New York, NY, USA, 2016; pp. 17–22. [Google Scholar]
  23. Shin, H.; Choi, K.; Park, Y.; Choi, J.; Kim, Y. Security Analysis of FHSS-Type Drone Controller. In Information Security Applications; Kim, H., Choi, D., Eds.; Springer: Cham, Switzerland, 2016; pp. 240–253. [Google Scholar]
  24. Nemer, I.; Sheltami, T.; Ahmad, I.; Yasar, A.U.-H.; Abdeen, M.A.R. RF-Based UAV Detection and Identification Using Hierarchical Learning Approach. Sensors 2021, 21, 1947. [Google Scholar] [CrossRef]
  25. Nie, W.; Han, Z.-C.; Zhou, M.; Xie, L.-B.; Jiang, Q. UAV Detection and Identification Based on WiFi Signal and RF Fingerprint. IEEE Sens. J. 2021, 21, 13540–13550. [Google Scholar] [CrossRef]
  26. Zuo, M.; Xie, S.; Zhang, X.; Yang, M. Recognition of UAV Video Signal Using RF Fingerprints in the Presence of WiFi Interference. IEEE Access 2021, 9, 88844–88851. [Google Scholar] [CrossRef]
  27. Medaiyese, O.; Ezuma, M.; Lauf, A.P.; Guvenc, I. Wavelet Transform Analytics for RF-Based UAV Detection and Identification System Using Machine Learning. arXiv 2021, arXiv:2102.11894. [Google Scholar]
  28. Aeroscope—DJI. Available online: https://www.dji.com/es/aeroscope (accessed on 25 October 2021).
  29. Counter-Drone Solution—Dedrone. Available online: https://www.dedrone.com/products/counter-drone-solution (accessed on 5 November 2021).
  30. DroneSentry-X. Available online: https://www.droneshield.com/sentry-x (accessed on 25 October 2021).
  31. Benyamin, M.; Goldman, G.H. Acoustic Detection and Tracking of a Class I UAS with a Small Tetrahedral Microphone Array; Army Research Lab: Adelphi, MD, USA, 2014. [Google Scholar]
  32. Pham, T.; Srour, N. TTCP AG-6: Acoustic Detection and Tracking of UAVs. In Unattended/Unmanned Ground, Ocean, and Air Sensor Technologies and Applications VI; SPIE: Orlando, FL, USA, 1 September 2004; Volume 5417, pp. 24–30. [Google Scholar]
  33. Bernardini, A.; Mangiatordi, F.; Pallotti, E.; Capodiferro, L. Drone Detection by Acoustic Signature Identification. Electron. Imaging 2017, 2017, 60–64. [Google Scholar] [CrossRef]
  34. Casabianca, P.; Zhang, Y. Acoustic-Based UAV Detection Using Late Fusion of Deep Neural Networks. Drones 2021, 5, 54. [Google Scholar] [CrossRef]
  35. Tejera Berengué, D. Análisis de Señales Acústicas para la Detección de UAVs. Available online: https://ebuah.uah.es/dspace/handle/10017/49529 (accessed on 8 November 2021).
  36. DeDrone DroneTracker: DroneBouncer. Available online: http://dronebouncer.com/en/dedrone-dronetracker (accessed on 5 November 2021).
  37. Dudani, S.A.; Breeding, K.J.; McGhee, R.B. Aircraft Identification by Moment Invariants. IEEE Trans. Comput. 1977, C-26, 39–46. [Google Scholar] [CrossRef]
  38. Campaña Ramos, I. Diseño de Un Prototipo de Un Sistema de Vigilancia Basado En Visión Artificial. Master’s Thesis, Universidad Politécnica de Madrid, Madrid, Spain, 2016. [Google Scholar]
  39. Liu, Y.; Liao, L.; Wu, H.; Qin, J.; He, L.; Yang, G.; Zhang, H.; Zhang, J. Trajectory and Image-Based Detection and Identification of UAV. Vis. Comput. 2021, 37, 1769–1780. [Google Scholar] [CrossRef]
  40. Sapkota, K.R.; Roelofsen, S.; Rozantsev, A.; Lepetit, V.; Gillet, D.; Fua, P.; Martinoli, A. Vision-Based Unmanned Aerial Vehicle Detection and Tracking for Sense and Avoid Systems. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 1556–1561. [Google Scholar]
  41. Li, J.; Ye, D.H.; Kolsch, M.; Wachs, J.P.; Bouman, C.A. Fast and Robust UAV to UAV Detection and Tracking from Video. In IEEE Transactions on Emerging Topics in Computing (Early Access); IEEE: Piscataway, NJ, USA, 2021; p. 1. [Google Scholar] [CrossRef]
  42. AXIS Q6215-LE PTZ Network Camera|Axis Communications. Available online: https://www.axis.com/products/axis-q6215-le (accessed on 5 November 2021).
  43. HD de la Serie Triton PT de FLIR|Teledyne FLIR. Available online: https://www.flir.es/products/pt-series-hd/ (accessed on 5 November 2021).
  44. Drone/UAV Detection and Tracking with SPYNEL. HGH Infrared USA. Available online: https://hgh-infrared-usa.com/drone-uav-detection-and-tracking/ (accessed on 27 December 2021).
  45. Thomas, C. Sensor Fusion: Foundation and Applications; BoD—Books on Demand: Rijeka, Croatia, 2011; ISBN 978-953-307-446-7. [Google Scholar]
  46. Jovanoska, S.; Knoedler, B.; Palanivelu, D.P.; Still, L.; Fiolka, T.; Oispuu, M.; Steffes, C.; Koch, W. Passive Sensor Processing and Data Fusion for Drone Detection. In Proceedings of the NATO STO Meeting Proceedings: MSG-SET-183 Specialists’ Meeting on Drone Detectability: Modelling the Relevant Signature, Prague, Czech Republic, 27 April 2021; p. 16. [Google Scholar]
  47. Böniger, U.; Ott, B.; Wellig, P.; Aulenbacher, U.; Klare, J.; Nussbaumer, T.; Leblebici, Y. Detection of Mini-UAVs in the Presence of Strong Topographic Relief: A Multisensor Perspective. In Target and Background Signatures II; SPIE: Edinburgh, UK, 2016; Volume 9997, pp. 13–20. [Google Scholar]
  48. Drone Detection Radar—Hardware—Dedrone. Available online: https://www.dedrone.com/products/hardware/extensions/drone-detection-radar (accessed on 25 October 2021).
  49. PTZ Cameras for the Detection of Drones—Dedrone. Available online: https://www.dedrone.com/products/hardware/extensions/ptz-cameras (accessed on 25 October 2021).
  50. C-UAS, Electronic Warfare and Command-and-Control Systems. Available online: https://www.droneshield.com (accessed on 8 November 2021).
  51. DroneSentry. Available online: https://www.droneshield.com/dronesentry (accessed on 25 October 2021).
  52. Radar. Available online: https://www.droneshield.com/radars (accessed on 8 November 2021).
  53. RfOne. Available online: https://www.droneshield.com/rfone (accessed on 25 October 2021).
  54. DroneOpt. Available online: https://www.droneshield.com/cameras (accessed on 8 November 2021).
  55. Farlik, J.; Kratky, M.; Casar, J.; Stary, V. Radar Cross Section and Detection of Small Unmanned Aerial Vehicles. In Proceedings of the 2016 17th International Conference on Mechatronics—Mechatronika (ME), Prague, Czech Republic, 7–9 December 2016; pp. 1–7. [Google Scholar]
  56. Skolnik, M.I. Radar Handbook, 3rd ed.; McGraw-Hill Education: New York, NY, USA, 2008; ISBN 978-0-07-148547-0. [Google Scholar]
  57. Barton, D.K. Radar System Analysis and Modeling; Artech House: Norwood, MA, USA, 2005. [Google Scholar]
  58. DiFranco, J.V.; Rubin, W.L. Radar Detection; Artech House: Norwood, MA, USA, 1980. [Google Scholar]
  59. Malanowski, M. Signal Processing for Passive Bistatic Radar; Artech House: Norwood, MA, USA, 2019. [Google Scholar]
Figure 1. Operation of the simulation architecture proposed in [3]. Extracted from [3], with permission.
Figure 2. Detail of the microservice-based, extendable simulation architecture proposed in [3]. Extracted from [3], with permission.
Figure 3. Graphical representation of the proposed scenario.
Figure 4. Plots generated by each of the sensors.
Figure 5. Plots generated by the active and passive radar sensors corresponding to the actual incoming drone.
Figure 6. Plots generated by MIMO and microphone-based sensors corresponding to the actual incoming drone.
Figure 7. Angle error of each of the plots as distance to each of sensors’ receivers increases.
Figure 8. Distance error of each of the plots as distance to each of sensors’ receivers increases.
Table 1. ART Midrange 3D specifications.
Specification | Value
Frequency Band | Ku-band
Bandwidth | 1 GHz
Elevation Control | +/−5 degrees
Instrumental Detection Range | 5000 m
Coverage Area | 78 km²
Azimuth Coverage | 360°
Scan Rate | 60 rpm (configurable)
Range Resolution | 1 m–0.2 m
Range Accuracy | 0.25 m–0.05 m
Communications | TCP/IP over Ethernet
Protocol | XML-based on NMEA0183
Table 2. ARMS radar specifications.
Radar
Ku-band, FMCW
Scan 360 degrees/second
Sectorized RF blanking
Doppler and Clutter Map techniques
True track report (position, course and speed) >2 km for smallest target of RCS = 0.1 m2, once per second
X-Band alternative for longer ranges
Table 3. HENSOLDT radar specifications.
Specification | Spexer 2000 3D MkII Radar | Spexer 2000 3D MkIII Radar
Maximum UAV detection range | 9 km | 9 km
Maximum small UAV detection range | 6 km | 6 km
Radar technology | Full coherent pulse Doppler Radar | Full coherent pulse Doppler Radar
Frequency range | X-band | X-band
Azimuth coverage | 120° | up to 360° (single antenna 120°)
Elevation coverage | 15° | up to 90°
Track while scan | >300 targets | >300 targets in 120°
Power consumption | <550 W | Antenna: 1700 W; Processing: 400 W
Table 4. Echodyne Counter-Drone Radar specifications.
Specification | Value
Detection range | 2.5 km
Frequency | 24.05–24.25 GHz
Field of view | 120° azimuth, 80° elevation
Angular resolution | 2° azimuth, 6° elevation
Search while track | object tracks are updated at ~10 Hz while continuously scanning entire field of view
Track acquisition rate | <1 s
Max tracks | ≤20 simultaneous tracks
Table 5. Ranger R8SS-3D specifications.
Specification | Value
Instrumented range | 7800 m
Micro-UAV detection range | 1200 m
Mini-UAV detection range | 2100 m
Small UAV detection range | 4000 m
Minimum detection range | 10 m
Scan sector | ±45° (fixed)–360° with pan/tilt mount
Vertical coverage | ≥40°
Number of simultaneously displayed tracks | Up to 512
Electronic scan rate | 2 Hz or 4 Hz
Minimum detection velocity | <0.1 m/s
Range accuracy | ±3 m
Angular accuracy (azimuth) | <0.8°
Angular accuracy (elevation) | <3°
Operating frequency | X-band
Connectivity | Ethernet
Table 6. Doruk radar specifications.
Specification | Value
Frequency band | X-band
Detection probability | 80%
Detection range | 6 km
Detection velocity | 0.2–100 m/s
Elevation beamwidth | 20°
Azimuth accuracy | ≤1° (RMS)
Azimuth resolution | ≤2°
Azimuth coverage | 360°
Range accuracy | ≤5 m
Range resolution | ≤15 m
Velocity accuracy | ≤0.2 m/s
Scanning rate | 90°/s
Target tracks | >200, Track While Scan
Clutter suppression | ≥45 dB
Table 7. DedroneSensor’s RF sensors’ specifications.
Specification | DedroneSensor RF-160 | DedroneSensor RF-360
Range | 1.6–5 km (depending on RF interference conditions) | 2–5 km (depending on RF interference conditions)
Radio frequency | Omnidirectional, passive detection and classification | Omnidirectional, passive detection, classification and direction-finding
Table 8. Dedrone DroneTracker audio detector specifications.
Specification | Value
Range | 500 m
Coverage azimuth angle (min–max) | 10°–90°
Audio spectrum | 0–96 kHz
Microphone range | 50–80 m
Table 9. Axis Q6215-LE PTZ specifications.
Specification | Value
Image sensor | CMOS
Image sensor size | 1/1.9 inches
Range | 1000 m
Night vision range | 400 m
Min illumination/light sensitivity (color) | 0.07 lux
Min illumination/light sensitivity (B/W) | 0 lux
Max video resolution | 1920 × 1080
Max frames per second | 50/60
Focal length | 6.7–201 mm
Horizontal field of view (min–max) | 2.2°–58.6°
Vertical field of view (min–max) | 1.2°–34.1°
Pan range | 360°
Tilt range | −90° to +90°
Optical zoom | 30
Digital zoom | 21
Table 10. Triton PT-Series HD camera specifications.
Specification | Value
Range | 2–4 km (depending on visibility conditions)
Min illumination/light sensitivity (color) | 0.01 lux
Max video resolution | 1920 × 1080
Focal length | 4.3–129 mm
Field of view (min–max) | 21° × 28° W, 1.5° × 2° N
Lens field of view (min–max) | 2.3°–63.7°
Pan range | 360°
Pan velocity | 0.1 to 60°/s
Tilt range | −90° to +90°
Tilt velocity | 0.1 to 30°/s
Optical zoom | 120
Digital zoom | 22
Table 11. ARMS camera specifications.
Optronic
Camera model (IR and CCD) selectable from a wide range
360° PTZ platform
Wide and narrow FoV continuous zoom
Tracking and 3D positioning
Table 12. Spynel series specification.
Specification | SPYNEL-C 1000 | SPYNEL-S 2000 | SPYNEL-X 3500
Horizontal field of view | 360° | 360° | 360°
Vertical field of view | 20° | 20° | 20°
Frame rate | Up to 2 Hz | Up to 2 Hz | Up to 2 Hz
Spectral band | LWIR (8–12 µm) | MWIR (3–5 µm) | MWIR (3–5 µm)
Image resolution | 3 Mpixel | 7 Mpixel | 30 Mpixel
Range | 400 m | 400 m | 400 m
Table 13. DroneSentry integrated system specifications.
Specification | Value
Radar detection range | 1.5 km
RF detection range | 1 km (urban), 5 km (rural)
Acoustic detection range | 200 m
Camera detection range | 600 m (small UAVs), 2 km (large UAVs)
RadarZero field of view | ≥90° azimuth × 80° elevation
RadarZero angle resolution | ±1° azimuth, ±3° elevation
RadarZero frequency | 24.45–24.65 GHz (multichannel)
RadarZero target detection | ≥20 targets simultaneously
DroneOpt pan rotation | 360° continuous
DroneOpt pan speed | 0.2°/s–120°/s
DroneOpt tilt range | −55°–+90°
DroneOpt tilt speed | 0.2°/s–90°/s
DroneOpt position accuracy | ±0.07°
DroneOpt zoom | 30× optical, 12× digital
DroneOpt field of view (min–max) | 2.3°–63.7°
DroneOpt resolution | 640 × 480
DroneOpt frame rate | 30 Hz
Table 14. Comparison of UAV sensing technologies.
Method | Operational Conditions | Range | Cost | Measures Provided
Active radar | Partially affected by weather conditions | Long-range (~5 km) | High-cost | Range, azimuth, elevation
Passive radar | Partially affected by weather conditions | Long-range (~5 km) | Low-cost | Range, azimuth, elevation
Radio frequency sensor | Affected by RF interferences and partially by weather conditions | Medium-range (~2 km) | Low-cost | Azimuth, elevation, classification
Acoustic sensor | Affected by weather/noise conditions | Short-range (~500 m) | Low-cost | Azimuth, classification
Camera sensor | Affected by weather conditions, day/night | Medium-range (~1 or 2 km) | Low-cost | Azimuth, elevation, classification
Back to TopTop