Article

UWB and IMU-Based UAV’s Assistance System for Autonomous Landing on a Platform

by Aitor Ochoa-de-Eribe-Landaberea 1,2,*, Leticia Zamora-Cadenas 1,2, Oier Peñagaricano-Muñoa 3 and Igone Velez 1,2

1 CEIT-Basque Research and Technology Alliance (BRTA), Manuel Lardizabal 15, 20018 San Sebastián, Spain
2 Tecnun School of Engineering, Universidad de Navarra, Manuel Lardizabal 13, 20018 San Sebastián, Spain
3 Alerion Technologies SL, Paseo de Mikeletegi 73B, Suite 304, 20009 San Sebastián, Spain
* Author to whom correspondence should be addressed.
Sensors 2022, 22(6), 2347; https://doi.org/10.3390/s22062347
Submission received: 18 February 2022 / Revised: 14 March 2022 / Accepted: 16 March 2022 / Published: 18 March 2022
(This article belongs to the Special Issue UAV Imaging and Sensing)

Abstract:
This work presents a novel landing assistance system (LAS) capable of locating a drone for a safe landing after its inspection mission. The location of the drone is achieved by a fusion of ultra-wideband (UWB), inertial measurement unit (IMU) and magnetometer data. Unlike other typical landing assistance systems, the UWB fixed sensors are placed around a 2 × 2 m landing platform and two tags are attached to the drone. Since this type of set-up is suboptimal for UWB location systems, a new positioning algorithm is proposed for a correct performance. First, an extended Kalman filter (EKF) algorithm is used to calculate the position of each tag, and then both positions are combined for a more accurate and robust localisation. As a result, the obtained positioning errors can be reduced by 50% compared to a typical UWB-based landing assistance system. Moreover, owing to its small space requirements, the proposed landing assistance system can be used almost anywhere and is easy to deploy.

1. Introduction

The inspection of infrastructures is a necessary task for their correct performance and durability, especially in the energy, petrochemical, construction and transport sectors. However, dangerous zones with difficult accessibility must sometimes be reached by a human worker (or a group of workers), increasing the risk of the work. For this reason, there is a growing interest in the use of drones or unmanned aerial vehicles (UAVs) for infrastructure inspection [1,2,3,4,5,6]. One of the main advantages of UAVs is their high adaptability to any infrastructure, as they can be used to inspect power transmission lines [1,2,3], surfaces in bridges and roads [4], wind turbines [5] or rail viaduct bearings [6], among others. As a consequence, infrastructure inspection already accounts for 45% of the total UAV market [7].
Nevertheless, the use of drones for inspection tasks also has its drawbacks, as investment must be made in the vehicle and in staff training to pilot the UAV. Moreover, since drones must be operated by a person, this solution is still prone to human errors, so the possibility of using autonomous drones should be considered.
The landing manoeuvre is probably one of the riskiest situations of a flight. In the case of an autonomous drone, knowing the real-time location of the vehicle with respect to the landing area is crucial for a successful operation. A positioning error of a few metres could cause significant damage to the drone. A high positioning rate is also important, since adverse conditions such as windy weather could cause sudden velocity changes that might not be detected in time.
In the aeronautics sector, it is common to use the global navigation satellite system (GNSS) for an autonomous landing [8]. Nevertheless, this technology is not always available, and it is sometimes incapable of providing an acceptable level of accuracy, as can happen when the inside of a tank in a petrochemical plant has to be inspected. For this reason, complementary landing assistance systems (LASs) have been proposed in the literature based on computer vision techniques [9,10,11,12,13,14,15,16,17,18,19], a fusion of computer vision techniques and inertial measurement units (IMUs) [20,21,22,23,24,25], computer vision, IMU and ultrasonic sensors [26], computer vision and a Time-of-Flight-based height sensor [27], computer vision and GNSS [28,29] and even an approach fusing onboard cameras and a robotic total station [30]. The main setback of traditional vision-based systems is their strong dependency on weather or lighting conditions. Coming back to the example of a petrochemical plant, there is no light available inside the tanks, so vision-based systems would fail. Recently, Refs. [12,15] proposed to use convolutional neural networks to detect a marker on the landing area in low-illumination environments. However, these networks are yet to be implemented on a resource-limited UAV onboard platform [12,15]. Among the works that use computer vision techniques, Ref. [19] reported very good accuracy. Nevertheless, that work only presented results at short ranges and low drone velocities.
Other vision-based approaches use Time-of-Flight (ToF) cameras [31] and a fusion of ToF cameras and ultrasonic sensors [32] for the location of UAVs. However, these proposals install sensors on the ceiling of a room, which can be difficult to do inside the tank of a petrochemical plant.
Recently, millimetre wave radar has been proposed in [33] to detect the presence of UAVs. However, as in the case of [31,32], the UAV detection is undertaken at the ground station. As this information has to be sent to the drone for UAV navigation, potential latency problems in the wireless communication link between the drone and the ground station could endanger the autonomous landing operation of the UAV.
Other proposals, such as [34,35,36], suggest using ultra-wideband (UWB) technology as a passive radar for a safe landing, where techniques similar to those of computer vision are utilised. However, the developed systems are only used to estimate the roughness of the ground where the landing will be performed.
Other recent works, such as [37,38,39], have explored the possibility of using active impulse-radio ultra-wideband (IR-UWB) technology to estimate the position of the drone with respect to the landing zone. This technology uses two types of sensors: anchors and tags. Anchors are fixed sensors placed at known locations, and they communicate with the tags to calculate the distances between each anchor and tag. With the measured distances, the position of each tag can be calculated. In order to localise drones for an autonomous landing, Ref. [37] runs simulations to optimise the geometry of the infrastructure formed by the UWB anchors and to improve the accuracy of the real-time location system (RTLS). The work of [38] goes a step further and tests the feasibility of a real UWB system to locate UAVs, and in [39] a path generation algorithm is proposed for an autonomous landing of the drone. In all of the mentioned works, it is suggested to place the anchors around the landing zone with a separation of tens of metres between them. Such infrastructures are similar to those used in classic RTLSs [40]. However, one of the main drawbacks of UWB-based UAV positioning systems is that it is sometimes impossible or impractical to deploy such large infrastructures. For example, when inspecting off-shore wind turbines, the drone has to land on a boat or a floating platform with little space available. Another case is the inspection of a tank in a petrochemical plant, where humans are not allowed to stay inside for long periods for safety reasons, making it impossible to deploy a big infrastructure for an RTLS. A fast and easy deployment is crucial for the latter example.
A possible solution is to use UWB technology with a small infrastructure. In [41], the authors demonstrate that good positioning accuracy can be achieved even if the anchors of a UWB-based AGV (Automated Guided Vehicle) positioning system are not placed in a fixed infrastructure with a big separation between them. Similarly, the authors of [42] proposed to use an anchor infrastructure of around 2 × 2 m to locate a UAV. However, according to [42], the errors of their RTLS are twice as big as those of a deployment where the anchors make up a rectangle of 64 m². Moreover, the experiments were run in a controlled environment, where the authors could easily control all the movements of the drone. In a real environment, wind could cause sudden velocity changes to a UAV. As a consequence, the obtained performance could decrease further because of the limited positioning rate of UWB systems. In fact, the low positioning rate of UWB-based UAV positioning systems poses a major limitation on their positioning accuracy.
There are different methods to improve the positioning accuracy of UWB-based UAV positioning systems. For example, in [43] a particle filter algorithm is proposed for an enhanced performance of UWB for the localisation of drones. However, approaches fusing data from different sensors are more popular. It is very common to fuse UWB data with inertial measurement units, as suggested by [44,45,46]. A third sensor can also be added to the UWB/IMU approach, such as a light scanner in [47], a frequency modulated continuous wave (FMCW) radar in [48] or a real-time kinematic global positioning system (RTK-GPS) in [49]. Another popular approach is to add visual data to the UWB-based RTLS as in [50,51,52,53]. Laser imaging detection and ranging (LIDAR) sensors have also been used to improve the UWB accuracy for UAV location in [54], where a drone had to fly close to a wall. Despite the improved performance of the RTLS proposed in the mentioned works, only one of them uses a simple infrastructure [53], where four UWB anchors are placed around a 1.5 × 1 m pad with a system of visual fiducial tags. The UWB data are fused with the visual and inertial data, resulting in a safe landing. However, it is not known how this system would perform in a dark environment, since the pad must remain in the field of view of the camera.
This paper proposes a novel LAS for autonomous drones that combines data from UWB, IMUs and magnetometers to estimate the position of the drone when approaching or moving away from the landing platform. In this LAS, as in the case of [42], UWB anchors are placed around a small landing platform and two tags are placed on the drone. However, in our case, both tags also have IMUs and magnetometers. The proposed drone positioning algorithm takes advantage of the UWB positioning accuracy and of the higher sampling rate of the IMUs and provides accurate estimates of the position of the drone, even when the drone suffers from high accelerations. This positioning algorithm is executed on the single board computer (SBC) of the drone and works in two steps. In the first step, for each tag, the proposed drone positioning algorithm fuses the information of the IMU and magnetometer with the UWB data to estimate the tag's position. In the second step, the position estimates of the two tags are combined to provide a more accurate estimate of the position of the centre of the drone. Unlike other solutions in the state of the art, our proposal neither needs a complex infrastructure deployment, nor does it depend on lighting conditions or the availability of GNSS. Additionally, our proposed system presents high accuracy even with sudden changes in drone velocity, as it achieves a higher positioning rate than traditional UWB-based positioning systems. Finally, the proposed combination of the tags' positions further improves the accuracy of our system. Higher robustness is gained because possible errors from one tag are compensated by the other.
The rest of the article is organised as follows: Section 2 describes how a LAS works when only UWB data are used, Section 3 describes the proposed LAS and its main contributions to the state of the art, Section 4 explains the performed experiments and analysis, and the obtained results are presented in Section 5. Finally, conclusions and future research lines are given in Section 6.

2. State of the Art of UWB-Based Systems

When UWB technology is used as RTLS, two main elements are necessary: anchors and tags. Anchors are fixed sensors at known locations, while tags are the moving sensors to be located. Each tag communicates with the anchors in order to calculate the distance to all of them. With the measured distances and the known locations of anchors, the positions of the tags can be calculated.
For the real-time location of a UAV, a single tag is usually placed on the vehicle and an anchor infrastructure is deployed around the flying space. Ideally, the anchors should have a separation of tens of metres so that the calculation is optimal. This type of infrastructure is typical in the literature, as proposed for example by [37,38,39]. Nevertheless, optimal anchor infrastructures cannot always be deployed, so [42] suggests a deployment where four anchors are placed on a 2 × 2 m square on the floor.
With an anchor infrastructure and a tag on the drone, the estimated distances can be used in different algorithms to calculate the position of the vehicle. One of the most typical methods to calculate the position of a tag from ranging measurements to known anchors is the Extended Kalman Filter (EKF). An algorithm based on the EKF, shown in Figure 1, is used by [41,55].
This algorithm needs a motion model f and an observation model h to be defined
$$\tilde{\mathbf{x}}_i = f(\hat{\mathbf{x}}_{i-1}, \mathbf{u}_{i-1}, \boldsymbol{\omega}_{i-1}) \tag{1}$$
$$\tilde{\mathbf{y}}_i = h(\tilde{\mathbf{x}}_i, \boldsymbol{\nu}_i), \tag{2}$$
where $\mathbf{x}_i$ is the state vector related to the $i$-th estimation. It contains the position of the tag to be estimated and its first and second derivatives
$$\mathbf{x}_i = \begin{bmatrix} x_i & \dot{x}_i & \ddot{x}_i & y_i & \dot{y}_i & \ddot{y}_i & z_i & \dot{z}_i & \ddot{z}_i \end{bmatrix}^T, \tag{3}$$
$\mathbf{u}_i$ is the optional input vector, set to zero, and $f$ is the function representing the motion model of the system. It relates the previous state $\mathbf{x}_{i-1}$ with the current state $\mathbf{x}_i$. The observation vector is represented by $\mathbf{y}_i$, which contains the measured distances between the tag and each anchor. These distances can be calculated with the state estimate $\mathbf{x}_i$ and the function of the observation model $h$. Note that the $\tilde{\mathbf{x}}$ and $\hat{\mathbf{x}}$ notation in (1) represents the a priori and a posteriori state estimates, respectively. The process is characterised by the stochastic random variables $\boldsymbol{\omega}_i$ and $\boldsymbol{\nu}_i$, which represent the process and observation noise, respectively. They are assumed to be independent, white and normally distributed with covariance matrices $\mathbf{Q}_i$ and $\mathbf{R}_i$, respectively.
The above-mentioned a priori estimate of the state is calculated with the linearised version of the motion model $f$
$$\tilde{\mathbf{x}}_i = \boldsymbol{\Phi} \cdot \hat{\mathbf{x}}_{i-1} \tag{4}$$
$$\boldsymbol{\Phi} = \mathbf{I}_{3 \times 3} \otimes \mathbf{B} \tag{5}$$
$$\mathbf{B} = \begin{bmatrix} 1 & \Delta & \frac{\Delta^2}{2} \\ 0 & 1 & \Delta \\ 0 & 0 & 1 \end{bmatrix} \tag{6}$$
$$\tilde{\mathbf{C}}_i = \boldsymbol{\Phi} \cdot \hat{\mathbf{C}}_{i-1} \cdot \boldsymbol{\Phi}^T + \mathbf{Q}_{i-1}, \tag{7}$$
where $\otimes$ represents the Kronecker product of the matrices, $\Delta$ the time difference between two consecutive time steps and $\mathbf{C}$ the error covariance matrix of the state estimate.
Using the predicted estimate of the state vector, the predicted observation vector $\tilde{\mathbf{y}}_i$ can be calculated by means of the observation model $h$. For each anchor $l$, the distance between the predicted position $(\tilde{x}_i, \tilde{y}_i, \tilde{z}_i)^T$ and the fixed sensor position $(X_l, Y_l, Z_l)^T$ is calculated as
$$\tilde{y}_{i,l} = \sqrt{(\tilde{x}_i - X_l)^2 + (\tilde{y}_i - Y_l)^2 + (\tilde{z}_i - Z_l)^2}. \tag{8}$$
Finally, the predicted state $\tilde{\mathbf{x}}_i$ is corrected to obtain $\hat{\mathbf{x}}_i$ by comparing the predicted observation vector $\tilde{\mathbf{y}}_i$ with the measured ranging values $\mathbf{y}_i^r$
$$\hat{\mathbf{x}}_i = \tilde{\mathbf{x}}_i + \mathbf{K}_i \cdot (\mathbf{y}_i^r - \tilde{\mathbf{y}}_i) \tag{9}$$
$$\mathbf{K}_i = \tilde{\mathbf{C}}_i \cdot \mathbf{H}_i^T \cdot (\mathbf{H}_i \cdot \tilde{\mathbf{C}}_i \cdot \mathbf{H}_i^T + \mathbf{R}_i)^{-1} \tag{10}$$
$$\mathbf{H}_i = \frac{\partial h}{\partial \mathbf{x}}\bigg|_{\tilde{\mathbf{x}}_i} \tag{11}$$
$$\hat{\mathbf{C}}_i = (\mathbf{I} - \mathbf{K}_i \cdot \mathbf{H}_i) \cdot \tilde{\mathbf{C}}_i, \tag{12}$$
where $\mathbf{H}_i$ is the Jacobian matrix of the observation model $h$.
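To make this prediction–correction loop concrete, the following Python sketch implements one step of such a constant-acceleration EKF for UWB ranging, assuming a nine-element state ordered as in (3) and a NumPy array of known anchor positions; the function names and structure are illustrative and are not the authors' implementation.

```python
import numpy as np

def make_phi(dt):
    # State transition matrix Phi = I_3x3 (Kronecker product) B, Eqs. (5)-(6).
    B = np.array([[1.0, dt, 0.5 * dt ** 2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    return np.kron(np.eye(3), B)

def predicted_ranges(x, anchors):
    # Observation model h: distance from the estimated position to each anchor, Eq. (8).
    p = x[[0, 3, 6]]                     # (x, y, z) entries of the state of Eq. (3)
    return np.linalg.norm(anchors - p, axis=1)

def range_jacobian(x, anchors):
    # Jacobian H of the observation model with respect to the state, Eq. (11).
    p = x[[0, 3, 6]]
    d = predicted_ranges(x, anchors)
    H = np.zeros((len(anchors), 9))
    H[:, [0, 3, 6]] = (p - anchors) / d[:, None]
    return H

def ekf_step(x, C, ranges, anchors, dt, Q, R):
    # One prediction-correction cycle of the UWB-only EKF, Eqs. (4), (7) and (9)-(12).
    Phi = make_phi(dt)
    x_pred = Phi @ x                     # a priori state estimate
    C_pred = Phi @ C @ Phi.T + Q         # a priori error covariance
    H = range_jacobian(x_pred, anchors)
    K = C_pred @ H.T @ np.linalg.inv(H @ C_pred @ H.T + R)
    x_new = x_pred + K @ (ranges - predicted_ranges(x_pred, anchors))
    C_new = (np.eye(9) - K @ H) @ C_pred
    return x_new, C_new
```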
Despite the high accuracy of UWB technology and the suitability of the EKF for a correct performance of this type of locating system, they still present some drawbacks for UAV localisation. The data rate of UWB systems is limited and may be incapable of detecting sudden changes in the drone's path caused by wind gusts. In order to deal with these conditions, it is better to add the data from an IMU, which can give an accurate estimate of the vehicle's acceleration, track all the trajectory changes and improve the rate of the position estimates.

3. Proposed LAS

In this section, the proposed novel LAS and its main differences from the typical UWB-based systems described in Section 2 are explained. Note that the proposed LAS is designed for a drone that inspects critical infrastructures such as off-shore wind turbines or a tank in a petrochemical plant. After the inspection mission, the drone needs to land on a small platform to charge its batteries. This platform is on a boat or in a confined space, so there is not enough space or time to deploy a typical UWB infrastructure. In our proposal, a small anchor infrastructure with easy deployment is used, which also allows us to make the UWB anchors part of the landing platform and use the same power supply for the anchors and the battery charger. Thus, the resulting LAS needs no complex additional infrastructure. Moreover, our LAS is not affected by the changing lighting conditions caused by day or night time, rain or fog that traditionally affect computer-vision systems. As our system is based on UWB technology, during the landing our LAS will present lower positioning errors than GNSS, whose errors can be around 2 m [29].
In this work, we propose combining UWB, IMU and magnetometer data to estimate the position of the drone. Figure 2 depicts the system architecture.
Similar to [42], eight anchors are placed around the 2 × 2 m landing platform and two tags are installed on both sides of the drone. Taking advantage of the availability of these sensors, a new positioning algorithm is proposed. This algorithm first fuses the UWB, IMU and magnetometer data from each tag to obtain two independent position estimates and then combines them to calculate the position of the centre of the drone. From the resulting data, only the horizontal coordinates of the drone are used, since the vehicle is capable of accurately estimating its altitude with other sensors, e.g., an altimeter.
Figure 3 shows the placement of the tags on the drone.
They are installed on both sides of the drone with a separation of 0.36 m. In this work, the LAS will provide the position of the point that is in the middle of the line formed by the two tags. We will refer to this point as the centre of the drone. In the same picture, the SBC of the drone can be seen, which receives the data from both tags and runs the necessary positioning algorithm.
The tags employ the Decawave DW1000 chip as the UWB transceiver. This transceiver follows the IEEE 802.15.4a standard and is configured with the parameters presented in Table 1.
Apart from the UWB transceiver, the tags also contain the LSM6DSOTR IMU [56] and the LIS2MDLTR magnetometer [57]. Both sensors are developed by STMicroelectronics, and their data can be fused with the MotionFX library of STMicroelectronics [58] in order to subtract gravity from the acceleration measurement and obtain the orientation of the tag. The chosen configuration parameters of MotionFX are shown in Table 2.
Figure 4 shows the flow chart of the proposed algorithm. As described in Section 2, the tags communicate with the anchors in order to calculate the distances between the sensors, $r_{i,l,j}^{(1)}$, using the two-way ranging (TWR) method. The subscripts $i$, $l$ and $j$ of the ranging estimates refer to the time step, the identifier of the anchor and the identifier of the tag, respectively. The obtained data are sent to the SBC of the drone (see Figure 3), which runs the necessary algorithms for a correct position estimation.
Unlike the state of the art, our LAS filters the ranging estimates with a parameter $r_{max}$, which represents the maximum allowed ranging estimate. Since the objective of the proposed LAS is to help the drone during the autonomous landing and not during the rest of the flight, any ranging estimate above $r_{max}$ is discarded.
Moreover, our proposed LAS adds the data of two IMUs and magnetometers to the algorithm, one for each tag. At time $t_{i-1}$ and for tag $j$, the measured specific force $\mathbf{y}_{i-1,j}^{a}$, angular velocity $\mathbf{y}_{i-1,j}^{\omega}$ and magnetic field $\mathbf{y}_{i-1,j}^{m}$ are used by the MotionFX library to calculate the acceleration $\mathbf{a}_{i-1}^{(b_j)}$ and the quaternion $\mathbf{q}_{(b_j)i-1}^{(w)}$. The terms $(b_j)$ and $(w)$ refer to the body frame of tag $j$ and the world frame, respectively. The used frames are shown in Figure 5.
Each tag contains an independent body frame, $(b_1)$ and $(b_2)$, which are fixed to the sensors. All measurements of the IMUs and magnetometers and the resulting acceleration $\mathbf{a}_{i-1}^{(b_j)}$ are referred to their body frames. The quaternion $\mathbf{q}_{(b_j)i-1}^{(w)}$ transforms any vector referred to the body frame $(b_j)$ to the world frame $(w)$, whose x, y and z axes point east, north and up, respectively. However, the inertial frame $(in)$, which is defined by the landing platform, does not have to be aligned with the world frame, so another quaternion $\mathbf{q}_{(w)}^{(in)}$ must be defined to transform any vector referred to the world frame $(w)$ to the inertial frame $(in)$. If the landing platform is on the horizontal plane, $\mathbf{q}_{(w)}^{(in)}$ is defined as a quaternion that rotates any vector by an angle $\phi$ around the z axis. Both quaternions can be combined to calculate the quaternion $\mathbf{q}_{(b_j)i-1}^{(in)}$ that transforms any vector from the body frame $(b_j)$ to the inertial frame $(in)$
$$\mathbf{q}_{(b_j)i-1}^{(in)} = \mathbf{q}_{(w)}^{(in)} \odot \mathbf{q}_{(b_j)i-1}^{(w)}, \tag{13}$$
where $\odot$ is the quaternion multiplication operator.
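As a worked example of (13), the following sketch composes the platform quaternion with a tag's orientation quaternion, assuming quaternions are stored as [w, x, y, z]; the helper names are hypothetical and only illustrate the operation.

```python
import numpy as np

def quat_mult(q, r):
    # Hamilton product of two quaternions stored as [w, x, y, z].
    qw, qx, qy, qz = q
    rw, rx, ry, rz = r
    return np.array([qw * rw - qx * rx - qy * ry - qz * rz,
                     qw * rx + qx * rw + qy * rz - qz * ry,
                     qw * ry - qx * rz + qy * rw + qz * rx,
                     qw * rz + qx * ry - qy * rx + qz * rw])

def platform_yaw_quat(phi):
    # World-to-inertial quaternion: a rotation by angle phi around the z axis.
    return np.array([np.cos(phi / 2.0), 0.0, 0.0, np.sin(phi / 2.0)])

# Eq. (13): q_body_to_inertial = q_world_to_inertial composed with q_body_to_world
# q_b_to_in = quat_mult(platform_yaw_quat(phi), q_b_to_w)
```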
The calculated ranging data, acceleration and orientation of the tags are fused in two parallel EKF algorithms. This way, two position estimates $\hat{\mathbf{p}}_{i,1}$ and $\hat{\mathbf{p}}_{i,2}$ are calculated and finally combined to obtain the position of the centre of the drone $\hat{\mathbf{p}}_{i,D}$. If for some reason one of the tags does not see the anchors for a time $t_{reinit}$, that tag stops giving position estimates. In this case, the position of the centre of the drone can still be calculated with the position estimate of the other tag, its orientation and the relative position of the centre of the drone with respect to the remaining tag. Once the UWB signal is available again at the tag, its EKF is reinitialised. This means that all the parameters of the EKF are set to their initial values. As the EKF algorithm needs some time to converge to the real solution, the position estimates of the newly recovered tag are not used in the combination algorithm during a time period of $t_{converge}$. The EKF algorithm is further explained in Section 3.1 and the combination algorithm in Section 3.2; a sketch of this per-tag availability logic is given below.
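A minimal sketch of the availability logic, assuming the two thresholds correspond to the $t_{reinit}$ and $t_{converge}$ parameters described above; the class and method names are illustrative only.

```python
T_REINIT = 2.0     # seconds without UWB data before the tag's EKF stops and is reset
T_CONVERGE = 3.0   # seconds after a reinitialisation before the tag is trusted again

class TagAvailability:
    def __init__(self):
        self.last_uwb = None      # timestamp of the last accepted UWB ranging
        self.reinit_time = None   # timestamp of the last EKF (re)initialisation

    def on_uwb(self, now):
        # Reinitialise the EKF if the UWB signal had been lost for too long.
        if self.last_uwb is None or now - self.last_uwb > T_REINIT:
            self.reinit_time = now
        self.last_uwb = now

    def usable(self, now):
        # The tag contributes to the combination step only if its UWB data are
        # recent and its EKF has had T_CONVERGE seconds to settle.
        if self.last_uwb is None or now - self.last_uwb > T_REINIT:
            return False
        return now - self.reinit_time >= T_CONVERGE
```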

3.1. EKF with Fusion of Sensors

The first part of the proposed positioning algorithm consists of an EKF that takes advantage of the availability of the data of the IMU and magnetometer. The flow chart that summarises this part is shown in Figure 6.
After a reinitialisation, the first position of the tag is estimated by means of a recursive least squares (RLS) algorithm [59] using the first received UWB data. After this first position estimate, every time a new acceleration estimate is received, the prediction step is performed. Since the IMU and UWB rates are different, while no new UWB measurements are received, the EKF algorithm keeps working with the predicted state estimate. When new UWB data are received, the correction step is performed. The advantage of this method is that the resulting positioning rate of the proposed LAS is 25 Hz, much faster than the UWB ranging rate. However, if no UWB measurements are obtained for a long period of time, the position estimate can drift and get lost. For this reason, if not enough UWB ranging estimates are obtained during an adjustable time interval $t_{reinit}$, the proposed LAS stops giving position estimates. Once the UWB signal is recovered, the algorithm is reinitialised.
This algorithm is run twice in parallel, once for each tag. For simplicity, the subscript $j$ used to refer to the tag is omitted in this subsection.
Unlike the previous algorithm of Section 2, the state vector only contains position and velocity data
$$\mathbf{x}_i = \begin{bmatrix} \mathbf{p}_i^T & \mathbf{v}_i^T \end{bmatrix}^T = \begin{bmatrix} x_i & y_i & z_i & \dot{x}_i & \dot{y}_i & \dot{z}_i \end{bmatrix}^T, \tag{14}$$
with $\mathbf{p}_i$ being the position of the tag at time step $i$ and $\mathbf{v}_i$ its velocity. The acceleration data are introduced in the motion model as one of the input parameters. The inputs are the acceleration referred to the body frame $\mathbf{a}^{(b)}$ and a unit quaternion $\mathbf{q}_{(b)}^{(in)}$ that rotates any vector from the body frame $(b)$ to the inertial frame $(in)$.
The motion model $f$ that transforms the previous state $\hat{\mathbf{x}}_{i-1} = \begin{bmatrix} \hat{\mathbf{p}}_{i-1}^T & \hat{\mathbf{v}}_{i-1}^T \end{bmatrix}^T$ to the current predicted state $\tilde{\mathbf{x}}_i = \begin{bmatrix} \tilde{\mathbf{p}}_i^T & \tilde{\mathbf{v}}_i^T \end{bmatrix}^T$ is defined as
$$\tilde{\mathbf{x}}_i = f(\hat{\mathbf{x}}_{i-1}, \mathbf{u}_{i-1}, \mathbf{e}_{i-1}) \tag{15}$$
$$\mathbf{u}_{i-1} = \begin{bmatrix} \mathbf{a}_{i-1}^{(b)} & \mathbf{q}_{(b)i-1}^{(in)} \end{bmatrix}^T \tag{16}$$
$$\mathbf{e}_{i-1} = \begin{bmatrix} \mathbf{e}_{i-1}^{(a)} & \mathbf{e}_{i-1}^{(\phi)} \end{bmatrix}^T \tag{17}$$
$$\tilde{\mathbf{p}}_i = \hat{\mathbf{p}}_{i-1} + \Delta \cdot \hat{\mathbf{v}}_{i-1} + \frac{\Delta^2}{2} \cdot \mathbf{R}_{(b)i-1}^{(in)} \cdot \left( \mathbf{a}_{i-1}^{(b)} - \mathbf{e}_{i-1}^{(a)} \right) \tag{18}$$
$$\tilde{\mathbf{v}}_i = \hat{\mathbf{v}}_{i-1} + \Delta \cdot \mathbf{R}_{(b)i-1}^{(in)} \cdot \left( \mathbf{a}_{i-1}^{(b)} - \mathbf{e}_{i-1}^{(a)} \right) \tag{19}$$
$$\mathbf{R}_{(b)i-1}^{(in)} = q2R\left( \mathbf{q}_{(b)i-1}^{(in)} \odot f_q\left( \mathbf{e}_{i-1}^{(\phi)} \right) \right), \tag{20}$$
where $\Delta$ represents the time between two consecutive steps and $\mathbf{R}_{(b)}^{(in)}$ the rotation matrix obtained from the unit quaternion $\mathbf{q}_{(b)}^{(in)}$ as explained in [60] with the here-defined function $q2R$. For any unit quaternion $\mathbf{q} = \begin{bmatrix} q_w & q_x & q_y & q_z \end{bmatrix}^T$, its corresponding rotation matrix $\mathbf{R}_q$ is calculated as
$$\mathbf{R}_q = q2R(\mathbf{q}) = \begin{bmatrix} 2q_w^2 + 2q_x^2 - 1 & 2q_xq_y - 2q_wq_z & 2q_xq_z + 2q_wq_y \\ 2q_xq_y + 2q_wq_z & 2q_w^2 + 2q_y^2 - 1 & 2q_yq_z - 2q_wq_x \\ 2q_xq_z - 2q_wq_y & 2q_yq_z + 2q_wq_x & 2q_w^2 + 2q_z^2 - 1 \end{bmatrix}. \tag{21}$$
The noise parameters of the motion model $f$ are represented by $\mathbf{e}^{(a)}$ for the acceleration data and $\mathbf{e}^{(\phi)}$ for the orientation data. The latter is represented as an orientation deviation in the body coordinate frame and is converted to a unit quaternion $\mathbf{q}^{(\phi)}$ with the function $f_q$
$$\mathbf{q}^{(\phi)} = f_q\left( \mathbf{e}^{(\phi)} \right) = \begin{bmatrix} \cos\left( \frac{||\mathbf{e}^{(\phi)}||_2}{2} \right) \\ \frac{\mathbf{e}^{(\phi)}}{||\mathbf{e}^{(\phi)}||_2} \sin\left( \frac{||\mathbf{e}^{(\phi)}||_2}{2} \right) \end{bmatrix}, \tag{22}$$
where $||\mathbf{e}^{(\phi)}||_2$ is the Euclidean norm of the vector $\mathbf{e}^{(\phi)}$.
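For reference, the helper functions $q2R$ in (21) and $f_q$ in (22) could be written as follows; this is an illustrative sketch, again assuming the [w, x, y, z] quaternion ordering.

```python
import numpy as np

def q2R(q):
    # Rotation matrix of a unit quaternion [q_w, q_x, q_y, q_z], Eq. (21).
    qw, qx, qy, qz = q
    return np.array([
        [2*qw**2 + 2*qx**2 - 1, 2*qx*qy - 2*qw*qz,     2*qx*qz + 2*qw*qy],
        [2*qx*qy + 2*qw*qz,     2*qw**2 + 2*qy**2 - 1, 2*qy*qz - 2*qw*qx],
        [2*qx*qz - 2*qw*qy,     2*qy*qz + 2*qw*qx,     2*qw**2 + 2*qz**2 - 1]])

def f_q(e_phi):
    # Convert a small orientation deviation (body frame) into a unit quaternion, Eq. (22).
    n = np.linalg.norm(e_phi)
    if n < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])   # no deviation: identity quaternion
    return np.concatenate(([np.cos(n / 2.0)], (np.asarray(e_phi) / n) * np.sin(n / 2.0)))
```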
Both noise parameters are determined empirically and have zero mean and covariance $\mathbf{Q}_{IMU}$, which is necessary in the prediction step of the EKF to make an a priori estimation of the state error covariance matrix $\tilde{\mathbf{P}}_i$
$$\tilde{\mathbf{P}}_i = \mathbf{F}_{i-1} \cdot \hat{\mathbf{P}}_{i-1} \cdot \mathbf{F}_{i-1}^T + \mathbf{G}_{i-1} \cdot \mathbf{Q}_{IMU} \cdot \mathbf{G}_{i-1}^T \tag{23}$$
$$\mathbf{F}_{i-1} = \frac{\partial f}{\partial \mathbf{x}}\bigg|_{\hat{\mathbf{x}}_{i-1}} \tag{24}$$
$$\mathbf{G}_{i-1} = \frac{\partial f}{\partial \mathbf{e}}\bigg|_{\mathbf{e}_{i-1}}. \tag{25}$$
In the above equations, the Jacobian matrices $\mathbf{F}_{i-1}$ and $\mathbf{G}_{i-1}$ of the motion model $f$ have been calculated with respect to the state vector $\mathbf{x}$ and the noise vector $\mathbf{e}$. The calculation process of useful derivatives for quaternions and rotation matrices is explained in [61].
After the prediction step, the a priori estimate must be corrected with the UWB ranging data as
$$\hat{\mathbf{x}}_i = \tilde{\mathbf{x}}_i + \mathbf{K}_i \cdot (\mathbf{y}_i^r - \tilde{\mathbf{y}}_i), \tag{26}$$
where $\mathbf{y}_i^r$ is the vector of measured ranging values, $\tilde{\mathbf{y}}_i$ is the predicted observation vector calculated with (8) and $\mathbf{K}_i$ represents the Kalman gain matrix. The Kalman gain matrix is calculated as
$$\mathbf{K}_i = \tilde{\mathbf{P}}_i \cdot \mathbf{H}_i^T \cdot (\mathbf{H}_i \cdot \tilde{\mathbf{P}}_i \cdot \mathbf{H}_i^T + \mathbf{R}_i)^{-1}, \tag{27}$$
where $\mathbf{H}_i$ is the Jacobian matrix of the observation model and $\mathbf{R}_i$ the measurement covariance matrix. The Jacobian matrix of the observation model is calculated with (11). Finally, the predicted state error covariance matrix $\tilde{\mathbf{P}}_i$ must be corrected with
$$\hat{\mathbf{P}}_i = (\mathbf{I} - \mathbf{K}_i \cdot \mathbf{H}_i) \cdot \tilde{\mathbf{P}}_i. \tag{28}$$
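Putting the prediction and correction steps together, the following sketch shows one possible IMU-driven prediction and UWB-driven correction for the position–velocity state of (14). It simplifies the noise model by keeping only the accelerometer noise term of (23) (the orientation-noise Jacobian is omitted), so it is an approximation under stated assumptions rather than the full algorithm of this work.

```python
import numpy as np

def q2R(q):
    # Quaternion [w, x, y, z] to rotation matrix, as in Eq. (21).
    w, x, y, z = q
    return np.array([[2*w*w + 2*x*x - 1, 2*x*y - 2*w*z,     2*x*z + 2*w*y],
                     [2*x*y + 2*w*z,     2*w*w + 2*y*y - 1, 2*y*z - 2*w*x],
                     [2*x*z - 2*w*y,     2*y*z + 2*w*x,     2*w*w + 2*z*z - 1]])

def predict(x, P, a_body, q_b_to_in, dt, Q_acc):
    # IMU-driven prediction of the position-velocity state, Eqs. (18), (19) and (23).
    R_bi = q2R(q_b_to_in)                       # body -> inertial rotation
    a_in = R_bi @ a_body                        # gravity-free acceleration in the inertial frame
    p, v = x[:3], x[3:]
    x_pred = np.concatenate((p + dt * v + 0.5 * dt**2 * a_in,
                             v + dt * a_in))
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                  # Jacobian of the motion model w.r.t. the state
    G = np.zeros((6, 3))                        # Jacobian w.r.t. the accelerometer noise only
    G[:3, :] = -0.5 * dt**2 * R_bi              # (the orientation-noise term of Eq. (23) is
    G[3:, :] = -dt * R_bi                       #  omitted in this simplified sketch)
    P_pred = F @ P @ F.T + G @ Q_acc @ G.T      # Q_acc: 3x3 accelerometer noise covariance
    return x_pred, P_pred

def correct(x_pred, P_pred, ranges, anchors, R_uwb):
    # UWB-driven correction, Eqs. (26)-(28), with the range model of Eq. (8).
    p = x_pred[:3]
    y_pred = np.linalg.norm(anchors - p, axis=1)
    H = np.zeros((len(anchors), 6))
    H[:, :3] = (p - anchors) / y_pred[:, None]
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R_uwb)
    x_new = x_pred + K @ (ranges - y_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new
```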

3.2. Combination of Tags

In the last part of the proposed positioning algorithm, the two independent position estimates $\hat{\mathbf{p}}_{i,1}$ and $\hat{\mathbf{p}}_{i,2}$ are combined to calculate the position of the centre of the drone $\hat{\mathbf{p}}_{i,D}$. If the estimates of both tags are available at time step $i$, then the average position is calculated. If at a certain moment only one of the tags gives a position estimate, then the position of the centre of the drone can be calculated with the known orientation $\mathbf{q}_{(b_j)i}^{(in)}$ and the coordinates $\mathbf{d}_j$ of the centre of the drone with respect to the body frame of the remaining tag $(b_j)$. The algorithm is summarised in (29)
$$\hat{\mathbf{p}}_{i,D} = \begin{cases} \dfrac{\hat{\mathbf{p}}_{i,1} + \hat{\mathbf{p}}_{i,2}}{2}, & \text{if both } \hat{\mathbf{p}}_{i,1} \text{ and } \hat{\mathbf{p}}_{i,2} \text{ are available} \\ \hat{\mathbf{p}}_{i,1} + q2R\left( \mathbf{q}_{(b_1)i}^{(in)} \right) \cdot \mathbf{d}_1, & \text{if only } \hat{\mathbf{p}}_{i,1} \text{ is available} \\ \hat{\mathbf{p}}_{i,2} + q2R\left( \mathbf{q}_{(b_2)i}^{(in)} \right) \cdot \mathbf{d}_2, & \text{if only } \hat{\mathbf{p}}_{i,2} \text{ is available.} \end{cases} \tag{29}$$
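A compact sketch of the combination rule (29), where R1 and R2 stand for the body-to-inertial rotation matrices obtained with $q2R$ and d1, d2 are the known offsets of the drone centre in each tag's body frame; names and structure are illustrative.

```python
def combine_tags(p1, p2, R1, R2, d1, d2):
    # Combination rule of Eq. (29): average when both tags are available,
    # otherwise shift the remaining tag's estimate by its known offset.
    if p1 is not None and p2 is not None:
        return 0.5 * (p1 + p2)
    if p1 is not None:
        return p1 + R1 @ d1
    if p2 is not None:
        return p2 + R2 @ d2
    return None          # no valid estimate at this time step
```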

4. Methodology

For the correct assessment of the proposed LAS, some experiments were performed by flying the drone in a controlled indoor environment close to the landing area. Additionally, more experiments were conducted in a real outdoor environment. In both cases, the parameters $r_{max}$, $t_{reinit}$ and $t_{converge}$ described in Section 3 were set to 20 m, 2 s and 3 s, respectively. In the next subsections, the experimental set-ups as well as the employed evaluation methods are described.

4.1. Indoor Experiments

All the indoor tests were run in the Industry 4.0 Laboratory of Ceit-BRTA, which contains an Optitrack motion capture system that allowed us to track the drone with millimetre-level accuracy. Due to the high accuracy of the motion capture system, its measurements were used as ground truth. A picture of the testing zone can be seen in Figure 7a. The developed LAS was deployed inside the observation area of the Optitrack system, as shown in Figure 7b, and the positions of the anchors are given in Table 3.
For safety reasons, some fences were placed around the measurement zone.
Once the set-up was prepared, nine different flights were conducted inside the tracking area. All of them consisted of a take-off, movements close to the landing platform and a landing. The paths followed by the centre of the drone during the flights can be seen in Figure 8.
The flights can be separated into two groups: those with a mean horizontal acceleration under 1 m/s² and those with a mean horizontal acceleration over 1 m/s², as shown in Table 4.
The measured accelerations correspond to the centre of the drone, so the acceleration at each tag may be slightly different. By dividing the flights into two groups, the effect of acceleration on the positioning accuracy can be evaluated.

4.2. Outdoor Experiments

The drone was also flown in a real outdoor environment with the proposed LAS. A picture of the test zone is shown in Figure 9 with the prepared set-up.
The anchors were placed in the same positions as described in Table 3. These experiments were useful to test the LAS at longer distances than in the indoor environment. It is especially interesting to test the ability of the LAS to find the drone once it reaches the visible range of the system, when the landing is about to occur.
The chosen place contains a 2 × 2 m concrete platform to land the drone and deploy the LAS. There is also a wind turbine, which simulates the infrastructure that the drone should inspect. Two different flights were performed, both of which consisted of a take-off, a linear movement towards the wind turbine reaching a height of 14 m, a return and a landing, as shown in Figure 10.
Because of the unavailability of a highly accurate outdoor locating system such as Optitrack, the performance of the system in this environment was evaluated qualitatively by comparing it to the GNSS position estimates.

4.3. Calculation of Errors

In order to evaluate the performance of the proposed system, the positioning error in the horizontal plane XY was calculated as
$$\epsilon_i = \sqrt{(\hat{x}_i - x_i)^2 + (\hat{y}_i - y_i)^2}, \tag{30}$$
where $\epsilon_i$ represents the error of the $i$-th position estimate, $x_i$ and $y_i$ the real 2D position coordinates and $\hat{x}_i$ and $\hat{y}_i$ the estimated 2D position.
Once all the positioning errors were calculated, the system was evaluated with the mean error $\mu$, the standard deviation $\sigma$ and the root mean square error (RMSE) [62]. Additionally, the error below which 80% of the samples lie, the probability of obtaining an error under 1 m and the maximum error $\epsilon_{max}$ were calculated.
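For illustration, these metrics could be computed from the estimated and ground-truth 2D tracks as follows; this is a sketch with arbitrary variable names, not the evaluation script used in this work.

```python
import numpy as np

def horizontal_errors(est_xy, true_xy):
    # Per-sample horizontal positioning error, Eq. (30).
    return np.linalg.norm(np.asarray(est_xy) - np.asarray(true_xy), axis=1)

def summarise(errors):
    errors = np.asarray(errors)
    return {
        "mean": float(np.mean(errors)),
        "std": float(np.std(errors)),
        "rmse": float(np.sqrt(np.mean(errors ** 2))),
        "p80": float(np.percentile(errors, 80)),    # error below which 80% of samples lie
        "P(err < 1 m)": float(np.mean(errors < 1.0)),
        "max": float(np.max(errors)),
    }
```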

5. Results

In this section, the obtained experimental results are presented and discussed. As explained in the previous section, two types of measurements were performed. The first group was performed in a controlled indoor environment with the objective of evaluating the feasibility of the proposed LAS. The performance of the system was evaluated first with only UWB data and later with the fusion of the inertial data, so that the advantages of data fusion could be seen. The second group of experiments was performed in a realistic environment, and the performance of the proposed LAS was qualitatively evaluated.

5.1. Indoor Results

5.1.1. Accuracy with Four UWB Anchors

For a correct comparison with the systems of the state of the art, the performance of our LAS was evaluated using only UWB data. First, a set-up similar to that proposed by [42] was considered; i.e., only the ranging estimates of the four anchors on the corners were used to position the tags.
In Table 5, the accuracy data are given for both tags.
The table shows, for each flight, the mean positioning error, its standard deviation, the root mean square error, the error below which 80% of the samples lie, the percentage of errors below 1 m and the measured maximum error.
The obtained results confirm that a small anchor infrastructure of 2 × 2 m can accurately locate a drone when it flies close to the landing platform. Considering all flights, the RMSE was 0.377 m for Tag T1 and 0.442 m for Tag T2. However, there was a considerable difference between the flights with low horizontal acceleration (Flights 1 to 4) and those with high acceleration (Flights 5 to 9). This is confirmed by the cumulative distribution function plots shown in Figure 11a for Flights 1 to 4 and Figure 11b for Flights 5 to 9.
When the acceleration values were low, as can be seen in Figure 11a, the obtained results were similar to those of [42]. In that case, the authors of [42] measured a mean horizontal acceleration of 0.67 m/s² and a maximum of 2.35 m/s². However, our results demonstrate that when the drone suffered a higher acceleration (Flights 5 to 9), the accuracy of the UWB-based LAS was reduced, so a traditional system using only UWB data could have problems under adverse conditions.

5.1.2. Accuracy with Eight UWB Anchors

For a better performance of the LAS, our proposal adds redundancy by using the estimates of eight anchors instead of four. The benefit of having anchor redundancy is that it is possible to calculate new positions even if an anchor fails to see the tags. If only four anchors are used, the lack of a single anchor-tag distance measurement is enough to skip a new position sample. With eight anchors, however, new positions can be calculated even when the measurements of up to four anchors are missing. We tested the effect of this redundancy on the positioning accuracy, and Table 6 shows the obtained data for the LAS using only UWB data with eight anchors.
The added redundancy reduced the mean error, the RMSE and especially the maximum error. However, just adding more anchors could not solve the problems in the flights with higher accelerations. These flights need a high-sampling-rate sensor such as an IMU, as we propose in our LAS.

5.1.3. Accuracy with Fusion of Data

With the data of eight UWB anchors, an IMU and a magnetometer, our proposed LAS uses the EKF algorithm presented in Section 3.1 to fuse all this information. Table 7 shows the results obtained with this algorithm when it is used to estimate the position of the two tags of the drone.
Compared to the results obtained with only the UWB data of eight anchors, the data fusion improved the accuracy of the system, especially in the second group of flights, where the mean horizontal acceleration was over 1 m/s². For example, significant changes can be noticed in Flight 5 with the data fusion algorithm: the position of Tag T1 had an RMSE of 0.194 m and that of Tag T2 an RMSE of 0.249 m. Without the proposed fusion algorithm, these values were 0.401 m and 0.503 m, respectively, so our proposed positioning algorithm halved the RMSE values in this case. Furthermore, the maximum error in this flight was also reduced by around 50% for both tags. In general, our proposal significantly reduced the mean error, standard deviation and maximum error in all flights with high accelerations.
Moreover, the fusion of data was also beneficial for the flights with low accelerations, as almost all error metrics of every flight improved. Considering all data, with our proposed fusion algorithm, high accuracy can be obtained to locate a drone close to its landing platform.

5.1.4. Accuracy with a Combination of Tags

Finally, our proposed LAS combines both tags of the drone for a more accurate trajectory. After fusing the UWB data from each tag with its IMU, both position estimates are combined to calculate the position of the centre of the drone. Table 8 shows the accuracy of the proposed system.
Compared to the individual results of Table 7, the accuracy was further improved. The mean error and the RMSE were reduced in almost all cases. Moreover, there was a general reduction of the standard deviation of the error, which means a reduction of outliers. When only one tag was used to estimate positions, it could sometimes have an unfavourable orientation with respect to the anchors. In these cases, the position estimates would suffer from high errors. With two tags, it is less likely that both of them give a bad estimate at the same time. Therefore, the biggest errors were compensated with the help of the other tag. Thus, significant improvements can also be seen in the percentage of samples with an error under 1 m and in the maximum error.

5.1.5. Summary of Results

As a summary of the improvements, Table 9 shows the key metrics obtained with a set-up similar to [42] and with our proposed LAS.
The first row is the result of considering all the measured errors of both tags and all flights when positioning with only the UWB data of four anchors. We can observe that the proposed LAS improved all performance metrics. The reduction in the RMSE is remarkable, as it dropped from 0.410 m to 0.208 m; i.e., our novel LAS can reduce the obtained errors by 50% compared to a typical system. Thanks to a higher accuracy and a more frequent data rate, the task of autonomous landing becomes much safer with our proposal.

5.2. Results in a Real Environment

In addition to the indoor measurements, the proposed LAS was also tested in a realistic outdoor environment. In this way, it can be assessed how the LAS finds the drone after the inspection mission and how it tracks the vehicle until the landing manoeuvre.
Figure 12 shows the trajectories estimated by the proposed LAS compared to those of the GNSS. The position estimates of our proposal are shown as red points, while the GNSS trajectories are represented as blue lines. However, these GNSS data could not be used as ground truth, since their errors were similar to or greater than those of the UWB system. As an example, note that in both flights the GNSS incorrectly estimated that the drone landed off the platform, while the proposed system was able to correctly estimate the landing place.
Due to dilution of precision, the accuracy of a UWB positioning system such as the one proposed in [42] degrades at large distances. However, thanks to the information of the IMUs and magnetometers, the results in Figure 12 show that, at large distances, the drone position estimates of our proposed system are similar to those of GNSS in outdoor environments. This accuracy is enough to help the drone approach the platform. Furthermore, near the platform, the accuracy of our system improves significantly, as can be seen in Table 8, where the drone flew as far as 4.5 m from the platform. Thus, when the drone starts its landing operation, the accuracy of the system is good enough to help it land on the platform.

5.3. Comparison with State of the Art Technologies

Table 10 presents a comparison of the proposed system with others in the literature. This table shows the characteristics of a UWB-based LAS, such as the one proposed in [42], whose indicated accuracy is the one presented in Section 5.1.1. We can observe that this system achieved the worst accuracy of the compared LASs. The authors of [19] used vision in their LAS and presented a high accuracy in indoor environments and at short ranges. However, this vision-based LAS was tested at a much lower horizontal velocity than our proposed LAS. Our proposal combines UWB technology with IMUs and magnetometers and, thus, achieves a high positioning rate, which is crucial for autonomous landing. Moreover, our proposed LAS is not affected by lighting conditions, as is the case with vision-based systems. We have shown that it can find the landing platform from at least 20 m away and is robust to high horizontal velocities and accelerations.

6. Conclusions and Future Research

This paper presents a novel landing assistance system capable of locating a UAV for a safe landing after its inspection mission. The proposed LAS is composed of eight UWB anchors placed around the landing platform of the drone and two UWB tags on the vehicle. Both tags also contain an IMU and a magnetometer, which enables the combination of real time acceleration of the drone with the UWB data. Unlike other proposed solutions in the literature, our LAS neither needs a large infrastructure deployment, nor does it depend on lighting conditions or the availability of GNSS.
In a recent study, a similar deployment was proposed for a UWB-based RTLS for an autonomous drone. In contrast to that study, our research tested several flights with different horizontal accelerations, so that the effect of sudden changes in the drone's movement, which could be caused by windy weather, could be studied. It has been concluded that higher accelerations can cause problems in UWB-based RTLSs, as their positioning rate can be too low to correctly track the drone's movements.
Our proposed LAS is more accurate than UWB-only systems when the drone suffers from high accelerations, thanks to the fusion of UWB data with additional sensors, namely IMUs and magnetometers. Our proposed algorithm takes advantage of the high sampling rate of the IMUs to estimate the position of the drone at a higher rate. Thus, it achieves a better tracking performance in those flights with high velocity and/or acceleration. Moreover, the proposed combination of the tags' positions further improves the accuracy of our LAS. Higher robustness is gained because possible errors from one of the tags are compensated by the other. As a result, with our novel LAS, an RMSE of 0.208 m was obtained, compared to an RMSE of 0.410 m for a traditional UWB-based LAS. Thanks to the higher accuracy and sampling rate of our proposal, the decision-making of an autonomous vehicle becomes safer.
Additionally, measurements in a relevant outdoor environment have shown that our system is able to position the drone when it is flying close to the landing platform and to track it accurately until the end of the flight. When the drone is flying far away from the landing platform, our system presents an accuracy similar to that of GNSS. However, when the drone is near the landing platform, our LAS presents better accuracy than GNSS. Furthermore, compared with vision-based systems in the literature, our LAS is not sensitive to lighting conditions. This will allow it to be used with a drone that inspects the inside of critical infrastructures such as off-shore wind turbines or a tank in a petrochemical plant.
In conclusion, this paper has presented an accurate landing assistance system for autonomous drones that combines UWB with IMU and magnetometer data. The system can also be improved to obtain higher flexibility. For example, the case of a moving platform has not been considered, so future research lines could point in this direction.

Author Contributions

Conceptualization, I.V. and O.P.-M.; methodology, A.O.-d.-E.-L., L.Z.-C. and I.V.; software, A.O.-d.-E.-L. and I.V.; validation, A.O.-d.-E.-L., L.Z.-C., I.V. and O.P.-M.; formal analysis, A.O.-d.-E.-L., L.Z.-C. and I.V.; investigation, A.O.-d.-E.-L., L.Z.-C., I.V. and O.P.-M.; writing—original draft preparation, A.O.-d.-E.-L.; writing—review and editing, L.Z.-C., I.V. and O.P.-M.; supervision, L.Z.-C. and I.V.; funding acquisition, L.Z.-C., I.V. and O.P.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Basque Government under Grant KK-2021/00033 of the Elkartek research programme (TREBEZIA project) and by the Provincial Council of Gipuzkoa under Grant 2020-I40P-000047-01 of the Gipuzkoa 4.0 research programme (WINDRON project).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Paul Bustamante, Javier Cejudo, Astrid Desireé Da Silva and Markos Losada for their support in the preparation of the utilised hardware. Our most sincere gratitude is also given to the Alerion team for providing the drone and flying it in the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AGV	Automated Guided Vehicle
EKF	Extended Kalman Filter
FMCW	Frequency Modulated Continuous Wave
GNSS	Global Navigation Satellite System
IMU	Inertial Measurement Unit
IR-UWB	Impulse Radio Ultra-Wideband
LAS	Landing Assistance System
LIDAR	Laser Imaging Detection and Ranging
PRF	Pulse Repetition Frequency
RLS	Recursive Least Squares
RMSE	Root Mean Square Error
RTK-GPS	Real-Time Kinematic Global Positioning System
RTLS	Real-Time Locating System
SBC	Single Board Computer
SFD	Start of Frame Delimiter
ToF	Time-of-Flight
TWR	Two-Way Ranging
UAV	Unmanned Aerial Vehicle
UWB	Ultra-Wideband

References

  1. Jalil, B.; Leone, G.R.; Martinelli, M.; Moroni, D.; Pascali, M.A.; Berton, A. Fault Detection in Power Equipment via an Unmanned Aerial System Using Multi Modal Data. Sensors 2019, 19, 3014. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Rahman, E.U.; Zhang, Y.; Ahmad, S.; Ahmad, H.I.; Jobaer, S. Autonomous Vision-Based Primary Distribution Systems Porcelain Insulators Inspection Using UAVs. Sensors 2021, 21, 974. [Google Scholar] [CrossRef]
  3. Suo, C.; Zhao, J.; Zhang, W.; Li, P.; Huang, R.; Zhu, J.; Tan, X. Research on UAV Three-Phase Transmission Line Tracking and Localization Method Based on Electric Field Sensor Array. Sensors 2021, 21, 8400. [Google Scholar] [CrossRef] [PubMed]
  4. Zhu, Q.; Dinh, T.H.; Phung, M.D.; Ha, Q.P. Hierarchical Convolutional Neural Network With Feature Preservation and Autotuned Thresholding for Crack Detection. IEEE Access 2021, 9, 60201–60214. [Google Scholar] [CrossRef]
  5. Car, M.; Markovic, L.; Ivanovic, A.; Orsag, M.; Bogdan, S. Autonomous Wind-Turbine Blade Inspection Using LiDAR-Equipped Unmanned Aerial Vehicle. IEEE Access 2020, 8, 131380–131387. [Google Scholar] [CrossRef]
  6. Lee, D.; Liu, J.; Lim, R.; Chan, J.L.; Foong, S. Geometrical-Based Displacement Measurement With Pseudostereo Monocular Camera on Bidirectional Cascaded Linear Actuator. IEEE/ASME Trans. Mechatron. 2021, 26, 1923–1931. [Google Scholar] [CrossRef]
  7. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  8. Sivaneri, V.O.; Gross, J.N. Flight-testing of a cooperative UGV-to-UAV strategy for improved positioning in challenging GNSS environments. Aerosp. Sci. Technol. 2018, 82–83, 575–582. [Google Scholar] [CrossRef]
  9. Sharp, C.; Shakernia, O.; Sastry, S. A vision system for landing an unmanned aerial vehicle. In Proceedings of the 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. No.01CH37164), Seoul, Korea, 21–26 May 2001; Volume 2, pp. 1720–1727. [Google Scholar] [CrossRef]
  10. Xu, G.; Zhang, Y.; Ji, S.; Cheng, Y.; Tian, Y. Research on computer vision-based for UAV autonomous landing on a ship. Pattern Recognit. Lett. 2009, 30, 600–605. [Google Scholar] [CrossRef]
  11. Nguyen, P.H.; Kim, K.W.; Lee, Y.W.; Park, K.R. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor. Sensors 2017, 17, 1987. [Google Scholar] [CrossRef] [Green Version]
  12. Nguyen, P.H.; Arsalan, M.; Koo, J.H.; Naqvi, R.A.; Truong, N.Q.; Park, K.R. LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone. Sensors 2018, 18, 1703. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Wubben, J.; Fabra, F.; Calafate, C.T.; Krzeszowski, T.; Marquez-Barja, J.M.; Cano, J.C.; Manzoni, P. Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition. Electronics 2019, 8, 1532. [Google Scholar] [CrossRef] [Green Version]
  14. Antenucci, A.; Mazzaro, S.; Fiorilla, A.E.; Messina, L.; Massa, A.; Matta, W. A ROS Based Automatic Control Implementation for Precision Landing on Slow Moving Platforms Using a Cooperative Fleet of Rotary-Wing UAVs. In Proceedings of the 2020 5th International Conference on Robotics and Automation Engineering (ICRAE), Singapore, 20–22 November 2020; pp. 139–144. [Google Scholar] [CrossRef]
  15. Lin, S.; Jin, L.; Chen, Z. Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments. Sensors 2021, 21, 6226. [Google Scholar] [CrossRef] [PubMed]
  16. Kim, I.; Viksnin, I.; Kaisina, I.; Kuznetsov, V. Computer Vision System for Landing Platform State Assessment Onboard of Unmanned Aerial Vehicle in Case of Input Visual Information Distortion. In Proceedings of the 2021 29th Conference of Open Innovations Association (FRUCT), Tampere, Finland, 12–14 May 2021; pp. 192–198. [Google Scholar] [CrossRef]
  17. Gupta, P.; Pareek, B.; Kumar, R.; Aeron, A.C. Vision-Based Safe Landing of UAV using Tiny-SURF Algorithm. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–20 October 2021; pp. 226–231. [Google Scholar] [CrossRef]
  18. Lee, B.; Saj, V.; Benedict, M.; Kalathil, D. Intelligent Vision-based Autonomous Ship Landing of VTOL UAVs. arXiv 2022, arXiv:2202.13005. [Google Scholar]
  19. Patruno, C.; Nitti, M.; Petitti, A.; Stella, E.; D’Orazio, T. A Vision-Based Approach for Unmanned Aerial Vehicle Landing. J. Intell. Robot. Syst. 2019, 95, 645–664. [Google Scholar] [CrossRef]
  20. Chen, X.; Phang, S.K.; Shan, M.; Chen, B.M. System integration of a vision-guided UAV for autonomous landing on moving platform. In Proceedings of the 2016 12th IEEE International Conference on Control and Automation (ICCA), Kathmandu, Nepal, 1–3 June 2016; pp. 761–766. [Google Scholar] [CrossRef]
  21. Araar, O.; Aouf, N.; Vitanov, I. Vision Based Autonomous Landing of Multirotor UAV on Moving Platform. J. Intell. Robot. Syst. 2017, 85, 369–384. [Google Scholar] [CrossRef]
  22. Yang, Q.; Sun, L. A fuzzy complementary Kalman filter based on visual and IMU data for UAV landing. Optik 2018, 173, 279–291. [Google Scholar] [CrossRef]
  23. Wang, J.; McKiver, D.; Pandit, S.; Abdelzaher, A.F.; Washington, J.; Chen, W. Precision UAV Landing Control Based on Visual Detection. In Proceedings of the 2020 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR), Shenzhen, China, 6–8 August 2020; pp. 205–208. [Google Scholar] [CrossRef]
  24. Chang, C.W.; Lo, L.Y.; Cheung, H.C.; Feng, Y.; Yang, A.S.; Wen, C.Y.; Zhou, W. Proactive Guidance for Accurate UAV Landing on a Dynamic Platform: A Visual-Inertial Approach. Sensors 2022, 22, 404. [Google Scholar] [CrossRef]
  25. Bigazzi, L.; Gherardini, S.; Innocenti, G.; Basso, M. Development of Non Expensive Technologies for Precise Maneuvering of Completely Autonomous Unmanned Aerial Vehicles. Sensors 2021, 21, 391. [Google Scholar] [CrossRef]
  26. Santos, M.C.; Santana, L.V.; Brandão, A.S.; Sarcinelli-Filho, M.; Carelli, R. Indoor low-cost localization system for controlling aerial robots. Control Eng. Pract. 2017, 61, 93–111. [Google Scholar] [CrossRef]
  27. Demirhan, M.; Premachandra, C. Development of an Automated Camera-Based Drone Landing System. IEEE Access 2020, 8, 202111–202121. [Google Scholar] [CrossRef]
  28. Xing, B.Y.; Pan, F.; Feng, X.X.; Li, W.X.; Gao, Q. Autonomous Landing of a Micro Aerial Vehicle on a Moving Platform Using a Composite Landmark. Int. J. Aerosp. Eng. 2019, 2019, 4723869. [Google Scholar] [CrossRef]
29. Supriyono, H.; Akhara, A. Design, building and performance testing of GPS and computer vision combination for increasing landing precision of quad-copter drone. J. Phys. Conf. Ser. 2021, 1858, 012074.
30. Benjumea, D.; Alcántara, A.; Ramos, A.; Torres-Gonzalez, A.; Sánchez-Cuevas, P.; Capitan, J.; Heredia, G.; Ollero, A. Localization System for Lightweight Unmanned Aerial Vehicles in Inspection Tasks. Sensors 2021, 21, 5937.
31. Paredes, J.A.; Álvarez, F.J.; Aguilera, T.; Aranda, F.J. Precise drone location and tracking by adaptive matched filtering from a top-view ToF camera. Expert Syst. Appl. 2020, 141, 112989.
32. Paredes, J.A.; Álvarez, F.J.; Aguilera, T.; Villadangos, J.M. 3D Indoor Positioning of UAVs with Spread Spectrum Ultrasound and Time-of-Flight Cameras. Sensors 2018, 18, 89.
33. Paredes, J.A.; Álvarez, F.J.; Hansard, M.; Rajab, K.Z. A Gaussian Process model for UAV localization using millimetre wave radar. Expert Syst. Appl. 2021, 185, 115563.
34. Shin, Y.H.; Lee, S.; Seo, J. Autonomous safe landing-area determination for rotorcraft UAVs using multiple IR-UWB radars. Aerosp. Sci. Technol. 2017, 69, 617–624.
35. Mohamadi, F. Software-Defined Multi-Mode Ultra-Wideband Radar for Autonomous Vertical Take-Off and Landing of Small Unmanned Aerial Systems. U.S. Patent 9,110,168, 18 August 2015.
36. Mohamadi, F. Vertical Takeoff and Landing (VTOL) Small Unmanned Aerial System for Monitoring Oil and Gas Pipeline. U.S. Patent 8,880,241, 4 November 2014.
37. Kim, E.; Choi, D. A UWB positioning network enabling unmanned aircraft systems auto land. Aerosp. Sci. Technol. 2016, 58, 418–426.
38. Cisek, K.; Zolich, A.; Klausen, K.; Johansen, T.A. Ultra-wide band Real time Location Systems: Practical implementation and UAV performance evaluation. In Proceedings of the 2017 Workshop on Research, Education and Development of Unmanned Aerial Systems, RED-UAS 2017, Linköping, Sweden, 3–5 October 2017; pp. 204–209.
39. Shin, Y.; Kim, E. Primitive Path Generation for a UWB Network Based Auto Landing System. In Proceedings of the 3rd IEEE International Conference on Robotic Computing, IRC 2019, Naples, Italy, 25–27 February 2019; pp. 431–432.
40. Zekavat, R.; Buehrer, R.M. Handbook of Position Location: Theory, Practice, and Advances; Wiley-IEEE Press: Hoboken, NJ, USA, 2019.
41. Zamora-Cadenas, L.; Velez, I.; Sierra-Garcia, J.E. UWB-Based Safety System for Autonomous Guided Vehicles Without Hardware on the Infrastructure. IEEE Access 2021, 9, 96430–96443.
42. Queralta, J.P.; Martínez Almansa, C.; Schiano, F.; Floreano, D.; Westerlund, T. UWB-based System for UAV Localization in GNSS-Denied Environments: Characterization and Dataset. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25 October 2020–24 January 2021; pp. 4521–4528.
43. Khalaf-Allah, M. Particle Filtering for Three-Dimensional TDoA-Based Positioning Using Four Anchor Nodes. Sensors 2020, 20, 4516.
44. Mueller, M.W.; Hamer, M.; D’Andrea, R. Fusing ultra-wideband range measurements with accelerometers and rate gyroscopes for quadrocopter state estimation. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 1730–1736.
45. Fresk, E.; Ödmark, K.; Nikolakopoulos, G. Ultra WideBand enabled Inertial Odometry for Generic Localization. IFAC-PapersOnLine 2017, 50, 11465–11472.
46. Song, Y.; Hsu, L.T. Tightly coupled integrated navigation system via factor graph for UAV indoor localization. Aerosp. Sci. Technol. 2021, 108, 106370.
47. Wang, C.; Li, K.; Liang, G.; Chen, H.; Huang, S.; Wu, X. A Heterogeneous Sensing System-Based Method for Unmanned Aerial Vehicle Indoor Positioning. Sensors 2017, 17, 1842.
48. Zahran, S.; Mostafa, M.M.; Masiero, A.; Moussa, A.M.; Vettore, A.; El-Sheimy, N. Micro-radar and UWB aided UAV navigation in GNSS denied environment. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.-ISPRS Arch. 2018, 42, 469–476.
49. Gryte, K.; Hansen, J.; Johansen, T.; Fossen, T. Robust Navigation of UAV using Inertial Sensors Aided by UWB and RTK GPS. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Grapevine, TX, USA, 9–13 January 2017.
50. Tiemann, J.; Ramsey, A.; Wietfeld, C. Enhanced UAV indoor navigation through SLAM-Augmented UWB Localization. In Proceedings of the 2018 IEEE International Conference on Communications Workshops, ICC Workshops 2018, Kansas City, MO, USA, 20–24 May 2018; pp. 1–6.
51. d’Apolito, F.; Sulzbachner, C. System Architecture of a Demonstrator for Indoor Aerial Navigation. IFAC-PapersOnLine 2019, 52, 316–320.
52. Hoeller, D.; Ledergerber, A.; Hamer, M.; D’Andrea, R. Augmenting Ultra-Wideband Localization with Computer Vision for Accurate Flight. IFAC-PapersOnLine 2017, 50, 12734–12740.
53. Nguyen, T.H.; Cao, M.; Nguyen, T.M.; Xie, L. Post-Mission Autonomous Return and Precision Landing of UAV. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision, ICARCV 2018, Singapore, 18–21 November 2018; pp. 1747–1752.
54. Orjales, F.; Losada-Pita, J.; Paz-Lopez, A.; Deibe, Á. Towards Precise Positioning and Movement of UAVs for Near-Wall Tasks in GNSS-Denied Environments. Sensors 2021, 21, 2194.
55. Zamora-Cadenas, L.; Arrue, N.; Jiménez-Irastorza, A.; Vélez, I. Improving the Performance of an FMCW Indoor Localization System by Optimizing the Ranging Estimator. In Proceedings of the 2010 6th International Conference on Wireless and Mobile Communications, Valencia, Spain, 20–25 September 2010; pp. 226–231.
56. LSM6DSO. Available online: https://www.st.com/en/mems-and-sensors/lsm6dso.html (accessed on 26 January 2022).
57. LIS2MDL. Available online: https://www.st.com/en/mems-and-sensors/lis2mdl.html (accessed on 26 January 2022).
58. Getting Started with MotionFX Sensor Fusion Library in X-CUBE-MEMS1 Expansion for STM32Cube. Available online: https://www.st.com/en/embedded-software/x-cube-mems1.html#documentation (accessed on 26 January 2022).
59. Kay, S.M. Fundamentals of Statistical Signal Processing: Estimation Theory; Prentice-Hall, Inc.: Upper Saddle River, NJ, USA, 1993.
60. Kok, M.; Hol, J.D.; Schön, T.B. Using Inertial Sensors for Position and Orientation Estimation; Now Foundations and Trends: Boston, MA, USA, 2017.
61. Solà, J. Quaternion kinematics for the error-state Kalman filter. arXiv 2017, arXiv:1711.02508.
62. Rubinstein, R.Y.; Kroese, D.P. Simulation and the Monte Carlo Method, 3rd ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2017.
Figure 1. Flow chart of an extended Kalman filter (EKF) with ultra-wideband (UWB) sensors for localisation.
Figure 2. Proposed system architecture. Eight anchors are placed around the landing platform in order to locate the tags on the drone in real-time.
Figure 3. Tags on the drone.
Figure 4. Flow chart of the positioning algorithm.
Figure 5. Body, world and inertial frames.
Figure 6. Flow chart of the EKF that fuses the UWB data with an inertial measurement unit (IMU) and magnetometer.
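Figure 6 outlines the EKF that fuses UWB ranges with IMU and magnetometer data. As a rough orientation for the reader, the sketch below shows a generic EKF predict/update loop with a constant-velocity state vector and a single UWB range measurement. It is only an illustration under simplifying assumptions: it does not reproduce the authors' state vector, which also tracks orientation from the IMU and magnetometer, nor the combination of the two tags.

```python
import numpy as np

# Minimal EKF sketch: IMU-driven prediction followed by a UWB range update.
# State x = [px, py, pz, vx, vy, vz]; all noise parameters are placeholders.

def predict(x, P, acc_world, dt, q_acc=0.5):
    """Propagate position/velocity with a world-frame acceleration from the IMU."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)
    x = F @ x
    x[:3] += 0.5 * dt**2 * acc_world
    x[3:] += dt * acc_world
    Q = q_acc * np.diag([dt**4 / 4] * 3 + [dt**2] * 3)   # simple process noise
    return x, F @ P @ F.T + Q

def update_range(x, P, anchor, r_meas, r_var=0.05**2):
    """Correct the state with one UWB range to a known anchor position."""
    d = x[:3] - anchor
    r_pred = np.linalg.norm(d)
    H = np.zeros((1, 6))
    H[0, :3] = d / r_pred                  # Jacobian of the range w.r.t. position
    S = H @ P @ H.T + r_var
    K = P @ H.T / S                        # Kalman gain
    x = x + (K * (r_meas - r_pred)).ravel()
    P = (np.eye(6) - K @ H) @ P
    return x, P
```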
Figure 7. (a) Location of the tests and (b) set-up for the measurements.
Figure 8. Horizontal trajectory of each flight. The blue line represents the drone's trajectory and the black-edged square the landing platform.
Figure 9. Set-up in the outdoor environment.
Figure 10. Horizontal trajectory of each flight in the outdoor environment.
Figure 11. Cumulative distribution function plots of (a) Flights 1–4 and (b) Flights 5–9.
Figure 12. Trajectory in the outdoor environment estimated by the combination of tags. Estimates of the proposed landing assistance system (LAS) are shown in red; the blue line represents the trajectory given by the drone's global navigation satellite system (GNSS).
Table 1. Configuration parameters of the UWB system.
Parameter | Value | Units
Carrier frequency | 3.9936 | GHz
Bandwidth | 499.2 | MHz
Channel | 2 | -
Bitrate | 6.8 | Mbps
PRF (pulse repetition frequency) | 16 | MHz
Preamble length | 128 | symbols
Preamble code | 3 | -
SFD (start of frame delimiter) | 8 | symbols
Ranging rate | 3.3 | Hz
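For readability, the UWB settings of Table 1 can be grouped into a single configuration object. The snippet below is only a sketch: the dataclass and its field names are ours and do not correspond to any vendor driver API; only the values come from Table 1.

```python
from dataclasses import dataclass

# Hypothetical container for the UWB settings of Table 1 (field names are ours).
@dataclass(frozen=True)
class UwbConfig:
    carrier_freq_ghz: float = 3.9936      # IEEE 802.15.4 UWB channel 2 centre frequency
    bandwidth_mhz: float = 499.2
    channel: int = 2
    bitrate_mbps: float = 6.8
    prf_mhz: int = 16                     # pulse repetition frequency
    preamble_length_symbols: int = 128
    preamble_code: int = 3
    sfd_length_symbols: int = 8           # start of frame delimiter
    ranging_rate_hz: float = 3.3          # ranging updates per second

cfg = UwbConfig()
# Channel 2 of IEEE 802.15.4 UWB is centred at 3993.6 MHz with 499.2 MHz bandwidth.
assert abs(cfg.carrier_freq_ghz * 1000 - 3993.6) < 1e-6
```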
Table 2. Configuration parameters of MotionFX.
Parameter | Value
Sampling rate | 25 Hz
output_type | 1
acc_orientation | ENU
gyro_orientation | ENU
mag_orientation | ESU
LMode | 1
ATime | 0.9
MTime | 1.5
FrTime | 0.667
modx | 2
Table 3. Positions of the anchors during the tests.
Anchor Name | x (m) | y (m) | z (m)
A0 | 1.998 | 0.0 | 0.145
A1 | 1.0 | 0.0 | 0.149
A2 | 0.0 | 0.0 | 0.147
A3 | 0.0 | 0.999 | 0.151
A4 | 0.0 | 1.998 | 0.155
A5 | 1.001 | 1.998 | 0.153
A6 | 1.998 | 1.998 | 0.157
A7 | 1.998 | 0.999 | 0.159
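The anchor coordinates of Table 3 define the measurement geometry around the 2 × 2 m platform. The sketch below, assuming a hypothetical tag position above the platform centre, shows how the noiseless anchor-to-tag ranges follow from that geometry; only the anchor coordinates are taken from the table.

```python
import numpy as np

# Anchor coordinates from Table 3 (metres), in the order A0..A7.
ANCHORS = np.array([
    [1.998, 0.0,   0.145],
    [1.0,   0.0,   0.149],
    [0.0,   0.0,   0.147],
    [0.0,   0.999, 0.151],
    [0.0,   1.998, 0.155],
    [1.001, 1.998, 0.153],
    [1.998, 1.998, 0.157],
    [1.998, 0.999, 0.159],
])

def expected_ranges(tag_xyz: np.ndarray) -> np.ndarray:
    """Euclidean anchor-to-tag distances, i.e. the noiseless UWB ranges."""
    return np.linalg.norm(ANCHORS - tag_xyz, axis=1)

# Hypothetical tag hovering 1.5 m above the centre of the landing platform.
print(expected_ranges(np.array([1.0, 1.0, 1.5])))
```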
Table 4. Measured horizontal acceleration at the centre of the drone.
Flight | Mean Acceleration (m/s²) | Max Acceleration (m/s²)
Flight 1 | 0.456 | 1.632
Flight 2 | 0.483 | 1.651
Flight 3 | 0.445 | 1.588
Flight 4 | 0.563 | 1.838
Flight 5 | 1.092 | 4.145
Flight 6 | 1.958 | 6.496
Flight 7 | 1.404 | 4.412
Flight 8 | 1.374 | 4.199
Flight 9 | 1.285 | 4.807
Table 5. Accuracy using only UWB data with four anchors.
Tag | Flight | μ (m) | σ (m) | RMSE (m) | P(ϵp) < 80% (m) | P(ϵ < 1 m) (%) | ϵmax (m)
T1 | Flight 1 | 0.233 | 0.144 | 0.273 | 0.346 | 100 | 0.608
T1 | Flight 2 | 0.245 | 0.148 | 0.286 | 0.384 | 100 | 0.598
T1 | Flight 3 | 0.214 | 0.176 | 0.277 | 0.336 | 100 | 0.751
T1 | Flight 4 | 0.273 | 0.161 | 0.317 | 0.407 | 100 | 0.685
T1 | Flight 5 | 0.264 | 0.279 | 0.383 | 0.356 | 97.46 | 1.609
T1 | Flight 6 | 0.227 | 0.161 | 0.278 | 0.372 | 100 | 0.681
T1 | Flight 7 | 0.311 | 0.249 | 0.398 | 0.490 | 97.67 | 1.075
T1 | Flight 8 | 0.339 | 0.245 | 0.418 | 0.566 | 98.12 | 1.214
T1 | Flight 9 | 0.411 | 0.323 | 0.522 | 0.618 | 94.95 | 1.709
T1 | All | 0.293 | 0.238 | 0.377 | 0.461 | 98.39 | 1.709
T2 | Flight 1 | 0.184 | 0.133 | 0.227 | 0.318 | 100 | 0.565
T2 | Flight 2 | 0.221 | 0.160 | 0.273 | 0.375 | 100 | 0.649
T2 | Flight 3 | 0.256 | 0.233 | 0.346 | 0.455 | 100 | 0.854
T2 | Flight 4 | 0.327 | 0.246 | 0.409 | 0.580 | 100 | 0.932
T2 | Flight 5 | 0.336 | 0.346 | 0.481 | 0.529 | 93.78 | 1.716
T2 | Flight 6 | 0.301 | 0.302 | 0.426 | 0.480 | 96.00 | 1.499
T2 | Flight 7 | 0.369 | 0.397 | 0.541 | 0.666 | 92.35 | 1.790
T2 | Flight 8 | 0.333 | 0.310 | 0.455 | 0.551 | 94.14 | 1.283
T2 | Flight 9 | 0.433 | 0.387 | 0.580 | 0.736 | 90.15 | 2.001
T2 | All | 0.316 | 0.309 | 0.442 | 0.521 | 95.71 | 2.001
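Tables 5–9 report the same set of error statistics. The sketch below shows how such figures could be computed from a vector of horizontal positioning errors. It reflects our reading of the P(ϵp) < 80% column as the error at the 80% point of the error CDF, and the synthetic error vector is for illustration only, not measured data.

```python
import numpy as np

def error_statistics(errors: np.ndarray) -> dict:
    """Summary statistics of horizontal positioning errors (metres)."""
    return {
        "mean_m": float(np.mean(errors)),                # μ
        "std_m": float(np.std(errors)),                  # σ
        "rmse_m": float(np.sqrt(np.mean(errors ** 2))),  # RMSE
        "p80_m": float(np.percentile(errors, 80)),       # assumed 80% point of the error CDF
        "below_1m_pct": float(100.0 * np.mean(errors < 1.0)),
        "max_m": float(np.max(errors)),
    }

# Usage with a synthetic error vector (illustration only):
rng = np.random.default_rng(0)
print(error_statistics(np.abs(rng.normal(0.25, 0.15, size=1000))))
```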
Table 6. Accuracy using only UWB data with eight anchors.
Tag | Flight | μ (m) | σ (m) | RMSE (m) | P(ϵp) < 80% (m) | P(ϵ < 1 m) (%) | ϵmax (m)
T1 | Flight 1 | 0.226 | 0.161 | 0.277 | 0.334 | 100 | 0.720
T1 | Flight 2 | 0.237 | 0.116 | 0.263 | 0.356 | 100 | 0.528
T1 | Flight 3 | 0.208 | 0.153 | 0.258 | 0.347 | 100 | 0.774
T1 | Flight 4 | 0.240 | 0.135 | 0.275 | 0.365 | 100 | 0.581
T1 | Flight 5 | 0.278 | 0.290 | 0.401 | 0.460 | 96.98 | 1.431
T1 | Flight 6 | 0.259 | 0.205 | 0.330 | 0.498 | 100 | 0.865
T1 | Flight 7 | 0.265 | 0.196 | 0.329 | 0.450 | 100 | 0.789
T1 | Flight 8 | 0.343 | 0.238 | 0.417 | 0.571 | 99.38 | 1.067
T1 | Flight 9 | 0.357 | 0.289 | 0.459 | 0.554 | 95.38 | 1.304
T1 | All | 0.280 | 0.222 | 0.357 | 0.445 | 98.86 | 1.431
T2 | Flight 1 | 0.169 | 0.113 | 0.203 | 0.279 | 100 | 0.492
T2 | Flight 2 | 0.204 | 0.144 | 0.250 | 0.323 | 100 | 0.650
T2 | Flight 3 | 0.246 | 0.224 | 0.333 | 0.433 | 100 | 0.857
T2 | Flight 4 | 0.275 | 0.206 | 0.344 | 0.469 | 100 | 0.904
T2 | Flight 5 | 0.342 | 0.370 | 0.503 | 0.566 | 90.31 | 1.623
T2 | Flight 6 | 0.379 | 0.360 | 0.522 | 0.737 | 91.71 | 1.722
T2 | Flight 7 | 0.327 | 0.269 | 0.423 | 0.555 | 97.00 | 1.267
T2 | Flight 8 | 0.324 | 0.268 | 0.420 | 0.582 | 99.33 | 1.099
T2 | Flight 9 | 0.372 | 0.325 | 0.493 | 0.611 | 93.81 | 1.487
T2 | All | 0.303 | 0.282 | 0.414 | 0.505 | 96.69 | 1.722
Table 7. Accuracy data fusing UWB and inertial data.
Tag | Flight | μ (m) | σ (m) | RMSE (m) | P(ϵp) < 80% (m) | P(ϵ < 1 m) (%) | ϵmax (m)
T1 | Flight 1 | 0.175 | 0.105 | 0.204 | 0.265 | 100 | 0.630
T1 | Flight 2 | 0.199 | 0.122 | 0.233 | 0.306 | 100 | 0.613
T1 | Flight 3 | 0.176 | 0.094 | 0.200 | 0.245 | 100 | 0.578
T1 | Flight 4 | 0.193 | 0.077 | 0.208 | 0.259 | 100 | 0.434
T1 | Flight 5 | 0.165 | 0.103 | 0.194 | 0.241 | 100 | 0.751
T1 | Flight 6 | 0.195 | 0.139 | 0.240 | 0.298 | 100 | 0.814
T1 | Flight 7 | 0.193 | 0.151 | 0.245 | 0.302 | 99.82 | 1.090
T1 | Flight 8 | 0.207 | 0.153 | 0.257 | 0.336 | 100 | 0.794
T1 | Flight 9 | 0.243 | 0.179 | 0.301 | 0.380 | 99.70 | 1.203
T1 | All | 0.198 | 0.137 | 0.241 | 0.296 | 99.93 | 1.203
T2 | Flight 1 | 0.177 | 0.118 | 0.213 | 0.289 | 100 | 0.496
T2 | Flight 2 | 0.167 | 0.110 | 0.200 | 0.252 | 100 | 0.523
T2 | Flight 3 | 0.155 | 0.114 | 0.192 | 0.262 | 100 | 0.577
T2 | Flight 4 | 0.225 | 0.187 | 0.293 | 0.349 | 100 | 0.941
T2 | Flight 5 | 0.209 | 0.135 | 0.249 | 0.326 | 100 | 0.791
T2 | Flight 6 | 0.237 | 0.201 | 0.310 | 0.398 | 99.45 | 1.314
T2 | Flight 7 | 0.218 | 0.137 | 0.258 | 0.346 | 100 | 0.731
T2 | Flight 8 | 0.236 | 0.185 | 0.300 | 0.368 | 99.49 | 1.269
T2 | Flight 9 | 0.223 | 0.156 | 0.272 | 0.325 | 99.79 | 1.060
T2 | All | 0.210 | 0.159 | 0.263 | 0.329 | 99.82 | 1.314
Table 8. Accuracy data when combining tags.
Flight | μ (m) | σ (m) | RMSE (m) | P(ϵp) < 80% (m) | P(ϵ < 1 m) (%) | ϵmax (m)
Flight 1 | 0.145 | 0.081 | 0.166 | 0.234 | 100 | 0.350
Flight 2 | 0.142 | 0.088 | 0.167 | 0.230 | 100 | 0.403
Flight 3 | 0.129 | 0.083 | 0.154 | 0.196 | 100 | 0.417
Flight 4 | 0.156 | 0.095 | 0.182 | 0.223 | 100 | 0.473
Flight 5 | 0.157 | 0.103 | 0.188 | 0.244 | 100 | 0.546
Flight 6 | 0.195 | 0.156 | 0.250 | 0.321 | 100 | 0.862
Flight 7 | 0.179 | 0.135 | 0.224 | 0.299 | 100 | 0.660
Flight 8 | 0.182 | 0.139 | 0.229 | 0.296 | 100 | 0.845
Flight 9 | 0.199 | 0.154 | 0.251 | 0.317 | 99.67 | 1.150
All | 0.168 | 0.124 | 0.208 | 0.260 | 99.95 | 1.150
Table 9. Comparison of all cases.
System | μ (m) | σ (m) | RMSE (m) | P(ϵp) < 80% (m) | P(ϵ < 1 m) (%) | ϵmax (m)
UWB as [42] | 0.304 | 0.275 | 0.410 | 0.488 | 97.10 | 2.001
Proposal | 0.168 | 0.124 | 0.208 | 0.260 | 99.95 | 1.150
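From the RMSE values in Table 9, the relative improvement of the proposal over a UWB-only system configured as in [42] can be checked with a short computation (illustrative arithmetic only):

```python
# Relative RMSE reduction of the proposed LAS versus the UWB-only baseline,
# using the values reported in Table 9.
baseline_rmse, proposal_rmse = 0.410, 0.208
reduction_pct = 100.0 * (1.0 - proposal_rmse / baseline_rmse)
print(f"RMSE reduction: {reduction_pct:.1f}%")  # ≈ 49.3%
```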
Table 10. Comparison with state-of-the-art systems.
System | Indoor Accuracy RMSE X (m) | Indoor Accuracy RMSE Y (m) | Maximum Tested Indoor Range (m) | Maximum Indoor Horizontal Velocity (m/s) | Maximum Tested Outdoor Range (m) | Minimum Positioning Rate (Hz) | Sensitivity to Lighting Conditions
Vision [19] | 0.012 | 0.014 | 2 | 0.190 | - | 10 | Yes
UWB as [42] | 0.355 | 0.205 | 4.5 | 3.139 | - | 3.3 | No
This work | 0.177 | 0.115 | 4.5 | 3.139 | 20 | 25 | No