Article

Equivalent Spatial Plane-Based Relative Pose Estimation of UAVs

1 Department of Aviation Maintenance Engineering, Xi'an Aeronautical Polytechnic Institute, Xi'an 710089, China
2 School of Electronic Information Engineering, Xi'an Technological University, Xi'an 710021, China
3 The Youth Innovation Team of Shaanxi Universities, Xi'an 710021, China
* Author to whom correspondence should be addressed.
Drones 2024, 8(8), 383; https://doi.org/10.3390/drones8080383
Submission received: 28 May 2024 / Revised: 31 July 2024 / Accepted: 6 August 2024 / Published: 8 August 2024

Abstract

The accuracy of relative pose estimation is an important foundation for ensuring the safety and stability of the autonomous aerial refueling (AAR) of unmanned aerial vehicles (UAVs). To address this problem, a relative pose estimation method for UAVs based on the equivalent spatial plane is proposed in this paper. Each UAV is treated as an equivalent spatial polygonal plane, and according to the measurement information of the Global Navigation Satellite System (GNSS) receivers, the equivalent polygonal plane equation is solved either from the three-point normal vector or by minimizing the sum of squared distances from the four points to the plane. The distance between the geometric centers of the two polygonal planes, the angle between the planes, and the angles between lines are then used to calculate the relative pose information of the UAVs. Finally, a simulation environment and initial parameters are configured for numerical simulation and results analysis. The simulation results show that, without relying on a motion model of the UAV, the proposed method can accurately estimate the relative pose information of the UAVs. In addition, in the presence of measurement errors, the relative pose estimation method based on the equivalent triangle plane can identify the position of the erroneous measurement point, and the relative pose estimation method based on the equivalent quadrilateral plane has good robustness. The simulation results verify the feasibility and effectiveness of the proposed method.

1. Introduction

Autonomous aerial refueling technology effectively extends the endurance and airborne holding time of a UAV and alleviates problems such as underloading [1]. In the process of AAR, the relative pose information of the UAVs is usually fed back to the controller so that the UAVs can fly at a safe distance and at the same altitude, ensuring the stability and safety of the refueling process [2,3]. Research on relative pose estimation methods is therefore of great practical significance for ensuring the safety and stability of the AAR process.
At present, relative pose estimation methods for UAVs mainly rely on the Global Navigation Satellite System (GNSS), the inertial navigation system (INS), vision navigation equipment (VisNav), and their combinations [4,5]. The GNSS and INS can directly measure the relative pose parameters of UAVs; however, the GNSS signal updates slowly and is easily lost owing to interference from external factors, while the INS accumulates errors that degrade measurement accuracy. Relative pose estimation methods based on GNSS or INS alone therefore have poor robustness and low estimation accuracy [6,7,8]. VisNav can effectively compensate for the shortcomings of GNSS or INS and improve the accuracy of relative pose estimation [9]. Campa et al. [10] used a camera installed on one UAV to observe feature points at known locations on another UAV and established a relative line-of-sight vector equation to solve the relative pose between the camera and the UAV. Mammarella and Fravolini [11,12] set a series of dot-shaped marker lights on a tanker as auxiliary identification markers, then extracted and tracked these marker points and estimated the relative pose information using iterative algorithms such as the Gauss–Newton method. Ding et al. [13] discussed the application of the orthogonal iteration (OI) algorithm in AAR and analyzed the influence of the number and layout of marker points on the accuracy of relative pose estimation. Doebbler et al. [14] used an active contour algorithm to extract rectangular markers set on the nose and achieved relative pose estimation by tracking the centroids of the rectangular markers. Zhang et al. [15,16] established an optimization method based on the geometric information between stereo vision and the target object to measure the pose of a spacecraft and obtain the attitude solution of a non-cooperative target. Hinterstoisser et al. [17] estimated target pose information by matching the real image against viewpoint-generated templates, including color and depth gradients from different angles, setting a threshold to obtain an ideal matching result. Weaver et al. [18] compared a predicted image of the tanker with the actual measured image and fused the resulting image error with inertial navigation system data to obtain the relative pose information. Li [19] back-calculated the target flight attitude by comparing simulated and real images: the target attitude parameters were determined using a pyramid area-matching algorithm, and external contour point set matching was used in an iterative optimization algorithm. Liebelt et al. [20] used a target contour-matching method to calculate the deviation between the reprojection result and the actual imaging result of the target edge from a given initial value, and then iterative optimization minimized the deviation to solve the target pose. For long-distance situations, Teng et al. [21] used two cameras to extract and cluster image line features and solved the attitude of a helicopter under geometric constraints. Under single-view and multi-view constraints, Yuan et al. [22] established a particle filter algorithm based on multiple cameras to estimate the pose information of a UAV, which improved the estimation accuracy. However, relative pose estimation methods based on VisNav need to know the parameters of the camera and the UAV model in advance; at the same time, there are measurement and quantization errors in the actual imaging process and extraction errors in the feature extraction process, both of which affect the estimation accuracy of the relative pose. In addition, these methods usually combine the position and attitude into a single objective function and iteratively solve for the optimal solution, which requires a large amount of computation and degrades the real-time performance of the method.
Because of the shortcomings of any single system or device, relative pose estimation methods based on multiple measurement systems and devices have been proposed and studied. Williamson et al. [23] established a relative navigation method based on INS/GPS for the formation flight of UAVs, but when the GPS signal was subject to interference, the accuracy of relative navigation decreased. Strohmeier et al. [24] proposed a relative pose estimation method for UAVs based on GPS/MEMS IMU, but the estimation accuracy was affected by the cumulative error of the IMU. Eling et al. [25] proposed a relative pose estimation method for lightweight UAVs based on GPS/MEMS IMU, but the estimation accuracy was affected by the stability of the GPS signal and the cumulative errors of the IMU. Fosbury [26], Shao [27], and Wang [28] researched relative navigation methods for UAVs based on VisNav/INS: images obtained by the camera were used to calculate the relative pose information of the UAVs and correct the errors caused by the INS, which improved the accuracy of relative navigation. However, when the field of view was occluded, VisNav could not observe the feature points, so the relative pose information could not be correctly calculated, which degraded the accuracy of relative navigation. More fundamentally, most existing methods require the establishment of accurate system models. Considering the complexity and bias of the motion model of the tanker/receiver, as well as the computational cost of model-based methods, a relative pose estimation method for UAVs based on the equivalent spatial plane is proposed in this paper. The proposed method treats each UAV as an equivalent spatial polygonal plane and uses the relative distance and angles between the equivalent polygonal planes to solve the relative pose information of the UAVs.
The structure of this paper is as follows. In Section 2, the equivalent spatial triangle plane equation of the UAV based on the three-point normal vector and the equivalent spatial quadrilateral plane equation based on minimizing the sum of squared distances from four points to the plane are determined, and the relative pose of the UAVs is solved. In Section 3, initial simulation parameters are configured to verify the proposed method and the simulation results are analyzed. Finally, conclusions are drawn in Section 4.

2. Relative Pose Estimation Method of UAVs

2.1. Problem Description

In this paper, each UAV is described as an equivalent spatial polygonal plane, so the problem of estimating the relative pose between UAVs is transformed into that of solving the relative distance and angles between two equivalent spatial polygonal planes in the same coordinate system. Taking the Earth-Centered, Earth-Fixed (ECEF) system as the reference coordinate system and a fixed-wing UAV as the research object, GNSS receivers are installed on the nose, tail, and two wings of each UAV to receive satellite navigation signals and realize single-receiver positioning. The positioning information of the GNSS receivers is then used to determine the equivalent spatial polygonal plane equation of each UAV and to calculate the relative position and angles between the equivalent planes, from which the relative pose information of the UAVs is obtained. The schematic diagram of relative pose estimation is shown in Figure 1.

2.2. Determination of Spatial Equivalent Polygon Plane Equation

Taking the ECEF as the reference coordinate system, the measured value of the n-th GNSS receiver installed on the UAVs is Pn = (xn, yn, zn), (n = 1, 2, ..., 8), and each UAV is treated as an equivalent spatial polygonal plane. When the measured values of four points on each UAV are known, any three points determine a triangle plane and all four points determine a quadrilateral plane; therefore, the spatial triangle plane equation based on the three-point normal vector and the spatial quadrilateral plane equation based on minimizing the sum of squared distances from the four points to the plane are derived in this section.

2.2.1. Three Points to Determine the Spatial Triangle Plane

For UAV A, when the measurement values of the GNSS receivers are known, any three of them can be used to determine the equivalent spatial triangle plane equation of UAV A. According to the three measurement values, the normal vector nA of the equivalent spatial triangle plane is:

$$\mathbf{n}_A = \overrightarrow{P_iP_j} \times \overrightarrow{P_iP_k} = \left( L_3L_6 - L_4L_5,\; L_5L_2 - L_6L_1,\; L_1L_4 - L_2L_3 \right) \tag{1}$$

where L1 = xj − xi, L2 = xk − xi, L3 = yj − yi, L4 = yk − yi, L5 = zj − zi, and L6 = zk − zi, while (i, j, k) takes the values (1, 2, 3), (1, 2, 4), (1, 3, 4), and (2, 3, 4).
The point normal form equation of a spatial plane is:
$$a(x - x_0) + b(y - y_0) + c(z - z_0) = 0 \tag{2}$$
where (a, b, c) represents the normal vector and (x0, y0, z0) represents a known point on the plane.
When the normal vector is known, if the equivalent spatial triangle plane of UAV A passes through the origin of the reference coordinate system, that is, point (0, 0, 0), the equation of the spatial triangle plane is:
$$(L_3L_6 - L_4L_5)x + (L_5L_2 - L_6L_1)y + (L_1L_4 - L_2L_3)z = 0 \tag{3}$$
If the equivalent spatial triangle plane of UAV A does not pass through the origin of the reference coordinate system, at this time, point Pn = (xn, yn, zn), (n = i, j, k) can be taken as the known point. Then, the equation of the spatial triangle plane is:
$$(L_3L_6 - L_4L_5)(x - x_n) + (L_5L_2 - L_6L_1)(y - y_n) + (L_1L_4 - L_2L_3)(z - z_n) = 0 \tag{4}$$
Similarly, for UAV B, the normal vector nB of the spatial triangle plane determined by any three points on the aircraft is:

$$\mathbf{n}_B = \overrightarrow{P_lP_q} \times \overrightarrow{P_lP_w} = \left( M_3M_6 - M_4M_5,\; M_5M_2 - M_6M_1,\; M_1M_4 - M_2M_3 \right) \tag{5}$$

where M1 = xq − xl, M2 = xw − xl, M3 = yq − yl, M4 = yw − yl, M5 = zq − zl, and M6 = zw − zl, while (l, q, w) takes the values (5, 6, 7), (5, 6, 8), (5, 7, 8), and (6, 7, 8).
When the normal vector is known, if the equivalent spatial triangle plane of UAV B passes through the origin of the reference coordinate system, that is, point (0, 0, 0), the equation of the spatial triangle plane is:
$$(M_3M_6 - M_4M_5)x + (M_5M_2 - M_6M_1)y + (M_1M_4 - M_2M_3)z = 0 \tag{6}$$
If the equivalent spatial triangle plane of UAV B does not pass through the origin of the reference coordinate system, at this time, point Pn = (xn, yn, zn), (n = l, q, w) can be taken as the known point. Then, the equation of the spatial triangle plane is:
$$(M_3M_6 - M_4M_5)(x - x_n) + (M_5M_2 - M_6M_1)(y - y_n) + (M_1M_4 - M_2M_3)(z - z_n) = 0 \tag{7}$$
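To make the construction concrete, the following short sketch (our own illustration in Python/NumPy, not the authors' code) builds an equivalent triangle plane from three GNSS measurement values; the three positions are the initial nose, tail, and left-wing coordinates of UAV A used later in Section 3.1:

```python
# A minimal sketch of Section 2.2.1: the equivalent spatial triangle
# plane of a UAV from three GNSS points (Equations (1)-(4)).
import numpy as np

def triangle_plane(p_i, p_j, p_k):
    """Return (normal, d) such that the plane is normal . x = d."""
    p_i, p_j, p_k = map(np.asarray, (p_i, p_j, p_k))
    normal = np.cross(p_j - p_i, p_k - p_i)    # n = PiPj x PiPk, Eq. (1)
    if np.allclose(normal, 0.0):
        raise ValueError("points are collinear; no unique plane")
    d = normal @ p_i                           # plane passes through Pi
    return normal, d

# Initial nose, tail, and left-wing positions of UAV A (km, Section 3.1)
P1, P2, P3 = (0.0, 1.05, 2.0), (0.0, 0.93, 2.0), (-0.04, 1.0, 2.0)
n_A, d_A = triangle_plane(P1, P2, P3)
print("plane A1: n =", n_A, ", d =", d_A)
```

The same helper applies unchanged to UAV B by passing any three of P5-P8.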

2.2.2. Four Points to Determine the Spatial Quadrilateral Plane

In the previous section, the three-point normal vector is used to determine the spatial triangle plane equation. In this section, the four points on the UAV will be used to determine the equivalent spatial quadrilateral plane equation of the UAV. Assuming that the equivalent spatial quadrilateral plane equation of the UAV A is:
$$e_A x + f_A y + g_A z = d_A \tag{8}$$
where (eA, fA, gA) is the normal vector and dA is the dot product of the normal vector and a known point on the plane.
The minimum sum of the squares of distances from the point to the spatial plane is used to determine the spatial quadrilateral plane equation. The distance from the measurement points to the spatial quadrilateral plane is:
$$r_{An} = \frac{\left| e_A x_n + f_A y_n + g_A z_n - d_A \right|}{\sqrt{e_A^2 + f_A^2 + g_A^2}}, \quad (n = 1, 2, 3, 4) \tag{9}$$
In order to minimize the sum of the squares of the distances from the four points on UAV A to the equivalent spatial quadrilateral plane, we define an objective function J(eA, fA, gA, dA):

$$J(e_A, f_A, g_A, d_A) = \sum_{n=1}^{4} \left( \frac{\left| x_n e_A + y_n f_A + z_n g_A - d_A \right|}{\sqrt{e_A^2 + f_A^2 + g_A^2}} \right)^2 \tag{10}$$
Equation (10) is a nonlinear function, so the Gauss–Newton method can be used to linearize it around the current estimate and iterate toward the optimal solution.
Let $J_n = \left( \dfrac{\left| x_n e_A + y_n f_A + z_n g_A - d_A \right|}{\sqrt{e_A^2 + f_A^2 + g_A^2}} \right)^2$, (n = 1, 2, 3, 4), and abbreviate $D_n = x_n e_A + y_n f_A + z_n g_A - d_A$ and $S = e_A^2 + f_A^2 + g_A^2$. The partial derivatives are:

$$\begin{cases} \dfrac{\partial J_n}{\partial e_A} = \dfrac{2 x_n D_n S - 2 e_A D_n^2}{S^2} \\[2mm] \dfrac{\partial J_n}{\partial f_A} = \dfrac{2 y_n D_n S - 2 f_A D_n^2}{S^2} \\[2mm] \dfrac{\partial J_n}{\partial g_A} = \dfrac{2 z_n D_n S - 2 g_A D_n^2}{S^2} \\[2mm] \dfrac{\partial J_n}{\partial d_A} = -\dfrac{2 D_n}{S} \end{cases} \tag{11}$$
According to the results, the Jacobian matrix HA can be obtained:
$$H_A = \nabla J_n(e_A, f_A, g_A, d_A), \quad (n = 1, 2, 3, 4) \tag{12}$$
Let αA = (eA, fA, gA, dA)^T and select ζA as the iteration variable; the Gauss–Newton iteration is then:

$$\alpha_A^{m+1} = \alpha_A^m - \left( H_A^T H_A \right)^{-1} H_A^T \zeta_A^m, \quad (m = 0, 1, 2, \ldots) \tag{13}$$
where m is the iteration number.
Substituting the final iterative calculation result into Equation (8), we can obtain the equivalent spatial quadrilateral plane equation of UAV A.
Similarly, we assume that the equivalent spatial quadrilateral plane equation of UAV B is:
$$e_B x + f_B y + g_B z = d_B \tag{14}$$
where (eB, fB, gB) represents the normal vector and dB is the dot product of the normal vector and a known point on the plane.
Defining the objective function J(eB, fB, gB, dB):

$$J(e_B, f_B, g_B, d_B) = \sum_{n=5}^{8} \left( \frac{\left| x_n e_B + y_n f_B + z_n g_B - d_B \right|}{\sqrt{e_B^2 + f_B^2 + g_B^2}} \right)^2 \tag{15}$$
Calculating the partial derivatives of Equation (15) as above, the Jacobian matrix HB is:

$$H_B = \nabla J_n(e_B, f_B, g_B, d_B), \quad (n = 5, 6, 7, 8) \tag{16}$$
Let αB = (eB, fB, gB, dB)^T and select ζB as the iteration variable; the Gauss–Newton iteration is then:

$$\alpha_B^{m+1} = \alpha_B^m - \left( H_B^T H_B \right)^{-1} H_B^T \zeta_B^m, \quad (m = 0, 1, 2, \ldots) \tag{17}$$
Substituting the final iterative calculation result into Equation (14), we can obtain the equivalent spatial quadrilateral plane equation of UAV B.
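The following sketch illustrates one way to implement the Gauss–Newton fit of Equations (9)–(13). The initialization from three of the points, the pseudo-inverse, and the per-iteration re-normalization of the normal vector are our own choices: the scale of (e, f, g, d) does not change the residuals, so H^T H is singular along that direction, a detail the derivation above does not need to spell out. Treat this as a hedged sketch rather than the authors' implementation.

```python
# A minimal sketch of the Gauss-Newton plane fit in Section 2.2.2.
import numpy as np

def fit_plane_gauss_newton(points, iters=20):
    pts = np.asarray(points, dtype=float)           # shape (4, 3)
    # Initial guess: plane through the first three points.
    n0 = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    alpha = np.append(n0, n0 @ pts[0])              # (e, f, g, d)
    for _ in range(iters):
        s2 = alpha[:3] @ alpha[:3]                  # S = e^2 + f^2 + g^2
        s = np.sqrt(s2)
        D = pts @ alpha[:3] - alpha[3]              # D_n, as in Eq. (11)
        r = D / s                                   # residuals r_n, Eq. (9)
        # Jacobian of the residuals w.r.t. (e, f, g, d)
        H = np.column_stack([pts / s - np.outer(D, alpha[:3]) / (s2 * s),
                             -np.ones(len(pts)) / s])
        alpha = alpha - np.linalg.pinv(H.T @ H) @ H.T @ r   # Eq. (13)
        alpha /= np.linalg.norm(alpha[:3])          # fix the scale gauge
    return alpha                                    # e x + f y + g z = d

# Four slightly non-coplanar illustrative points for UAV A (km)
pts_A = [(0, 1.05, 2.0), (0, 0.93, 2.0), (-0.04, 1.0, 2.001), (0.04, 1.0, 2.0)]
print("plane A (e, f, g, d):", fit_plane_gauss_newton(pts_A))
```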

2.3. Relative Pose Estimation of UAVs

The pose of a UAV changes constantly during flight, and accurate estimation of the relative pose is the prerequisite and basis for multiple UAVs to complete coordinated control tasks. In this paper, each UAV is treated as an equivalent spatial polygonal plane, so the problem of the relative pose between UAVs becomes one of the relative position and angles between polygonal planes. We define the distance between the geometric centers of the spatial polygonal planes as the relative distance rrel between the UAVs, the angle between the normal vectors of the planes as the relative angle φrel, the angle between the straight lines formed by the nose and tail of each UAV as the relative angle θrel, and the angle between the straight lines formed by the two wings of each UAV as the relative angle ψrel. When the normal vectors of the equivalent spatial polygonal planes and the measurement values of the GNSS receivers are known, the relative pose information of the UAVs can be calculated.
The relative distance rrel between the UAVs can be solved by calculating the relative distance between the geometric centers of the equivalent spatial polygonal planes. When the measurement values of the GNSS receivers are known, the geometric center position coordinates OA and OB of the equivalent spatial polygon planes of UAV A and B are, respectively:
$$\begin{cases} O_A = \dfrac{1}{N} \left( \displaystyle\sum_{n=1}^{N} x_n, \; \sum_{n=1}^{N} y_n, \; \sum_{n=1}^{N} z_n \right) \\[2mm] O_B = \dfrac{1}{N} \left( \displaystyle\sum_{s=1}^{N} x_s, \; \sum_{s=1}^{N} y_s, \; \sum_{s=1}^{N} z_s \right) \end{cases} \tag{18}$$
where N is the number of GNSS receivers used per plane; in this paper, N is 3 or 4.
According to the results of Equation (18), the relative distance rrel is:
$$r_{rel} = \left| \overrightarrow{O_AO_B} \right| = \frac{1}{N} \sqrt{ \left( \sum_{s=1,n=1}^{N} (x_s - x_n) \right)^2 + \left( \sum_{s=1,n=1}^{N} (y_s - y_n) \right)^2 + \left( \sum_{s=1,n=1}^{N} (z_s - z_n) \right)^2 } \tag{19}$$
When the normal vectors of the spatial polygon planes are known, the relative angle φrel between the equivalent spatial triangle planes is:
$$\phi_{rel} = \arccos \frac{\left| \mathbf{n}_A \cdot \mathbf{n}_B \right|}{\left| \mathbf{n}_A \right| \left| \mathbf{n}_B \right|} = \arccos \frac{\left| (L_3L_6 - L_4L_5)(M_3M_6 - M_4M_5) + (L_5L_2 - L_6L_1)(M_5M_2 - M_6M_1) + (L_1L_4 - L_2L_3)(M_1M_4 - M_2M_3) \right|}{\sqrt{(L_3L_6 - L_4L_5)^2 + (L_5L_2 - L_6L_1)^2 + (L_1L_4 - L_2L_3)^2} \, \sqrt{(M_3M_6 - M_4M_5)^2 + (M_5M_2 - M_6M_1)^2 + (M_1M_4 - M_2M_3)^2}} \tag{20}$$
The relative angle φrel between the equivalent spatial quadrilateral planes is:
$$\phi_{rel} = \arccos \frac{\left| e_A e_B + f_A f_B + g_A g_B \right|}{\sqrt{e_A^2 + f_A^2 + g_A^2} \sqrt{e_B^2 + f_B^2 + g_B^2}} \tag{21}$$
The measurement values of the nose and tail of UAV A are P1 and P2, respectively, and the measurement values of the nose and tail of UAV B are P5 and P6, respectively, so the relative angle θrel is:
$$\theta_{rel} = \arccos \frac{\left| \overrightarrow{P_1P_2} \cdot \overrightarrow{P_5P_6} \right|}{\left| \overrightarrow{P_1P_2} \right| \left| \overrightarrow{P_5P_6} \right|} = \arccos \frac{\left| (x_2 - x_1)(x_6 - x_5) + (y_2 - y_1)(y_6 - y_5) + (z_2 - z_1)(z_6 - z_5) \right|}{\sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2} \, \sqrt{(x_6 - x_5)^2 + (y_6 - y_5)^2 + (z_6 - z_5)^2}} \tag{22}$$
The measurement values of the left and right wings of UAV A are P3 and P4, respectively, and the measurement values of the left and right wings of UAV B are P7 and P8, respectively, so the relative angle ψrel is:
$$\psi_{rel} = \arccos \frac{\left| \overrightarrow{P_3P_4} \cdot \overrightarrow{P_7P_8} \right|}{\left| \overrightarrow{P_3P_4} \right| \left| \overrightarrow{P_7P_8} \right|} = \arccos \frac{\left| (x_4 - x_3)(x_8 - x_7) + (y_4 - y_3)(y_8 - y_7) + (z_4 - z_3)(z_8 - z_7) \right|}{\sqrt{(x_4 - x_3)^2 + (y_4 - y_3)^2 + (z_4 - z_3)^2} \, \sqrt{(x_8 - x_7)^2 + (y_8 - y_7)^2 + (z_8 - z_7)^2}} \tag{23}$$
According to the results of Equations (19)–(23), we can obtain the relative pose information of the UAVs.
Based on the above derivation and analysis, the steps for solving the relative pose are as follows (a code sketch of the complete procedure is given after the steps):
Step 1: Use data P1, P2, P3, and P4 from UAV A to calculate the plane normal vector nA and then use the plane equation to obtain the equivalent plane of UAV A.
Step 2: Use data P5, P6, P7, and P8 from UAV B to calculate the plane normal vector nB and then use the plane equation to obtain the equivalent plane of UAV B.
Step 3: Calculate the angle between two equivalent planes to obtain the relative attitude angle φrel.
Step 4: Calculate the relative attitude angle θrel using data P1, P2, P5, and P6.
Step 5: Calculate the relative attitude angle ψrel using data P3, P4, P7, and P8.
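A compact end-to-end sketch of Steps 1–5 follows (our own code; the eight illustrative positions are laid out as in Section 3.1, with index 0 = nose, 1 = tail, 2 = left wing, 3 = right wing for each UAV):

```python
# A minimal sketch of the complete relative pose solution (Steps 1-5).
import numpy as np

def angle_between(u, v):
    """Unsigned angle between two vectors in degrees (Eqs. (20)-(23))."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    c = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

def relative_pose(pts_A, pts_B):
    pts_A, pts_B = np.asarray(pts_A, float), np.asarray(pts_B, float)
    # Steps 1-2: plane normals of the equivalent planes (three points each)
    n_A = np.cross(pts_A[1] - pts_A[0], pts_A[2] - pts_A[0])
    n_B = np.cross(pts_B[1] - pts_B[0], pts_B[2] - pts_B[0])
    # Relative distance between the geometric centers (Eqs. (18)-(19))
    r_rel = np.linalg.norm(pts_A.mean(axis=0) - pts_B.mean(axis=0))
    phi_rel = angle_between(n_A, n_B)                     # Step 3
    theta_rel = angle_between(pts_A[1] - pts_A[0],        # Step 4:
                              pts_B[1] - pts_B[0])        # nose-tail lines
    psi_rel = angle_between(pts_A[3] - pts_A[2],          # Step 5:
                            pts_B[3] - pts_B[2])          # wing lines
    return r_rel, phi_rel, theta_rel, psi_rel

pts_A = [(0, 1.05, 2.0), (0, 0.93, 2.0), (-0.04, 1.0, 2.0), (0.04, 1.0, 2.0)]
pts_B = [(0, 1.55, 2.5), (0, 1.43, 2.5), (-0.04, 1.5, 2.5), (0.04, 1.5, 2.5)]
print(relative_pose(pts_A, pts_B))  # r_rel ~ 0.707 km, all angles ~ 0 deg
```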

2.4. Influence of Measurement Error on Estimation Results

The GNSS receivers produce errors during the measurement process, which affect the accuracy of the relative pose estimation; in this section, an error analysis method is used to evaluate the estimation results. Assuming that the measurement value Pu (u = 1, 2, ..., n) of a GNSS receiver on UAV A has a measurement error Δ = (Δx, Δy, Δz), then according to Equations (18) and (19), we can obtain:
$$\Delta r_{rel} = \frac{\left( \sum_{s=1,n=1}^{N} (x_n - x_s) \right) \Delta x_u + \left( \sum_{s=1,n=1}^{N} (y_n - y_s) \right) \Delta y_u + \left( \sum_{s=1,n=1}^{N} (z_n - z_s) \right) \Delta z_u}{N \sqrt{ \left( \sum_{s=1,n=1}^{N} (x_s - x_n) \right)^2 + \left( \sum_{s=1,n=1}^{N} (y_s - y_n) \right)^2 + \left( \sum_{s=1,n=1}^{N} (z_s - z_n) \right)^2 }} \tag{24}$$
where Δrrel is the difference between the current moment and the previous moment of the relative position estimation and (Δxu, Δyu, Δzu) is the difference between the current moment and the previous moment of the measurement value Pu.
Substituting the measurement error of the measured value Pu into Equation (24), we can obtain:
$$\Delta r_{rel} = \frac{\left( \sum_{s=1,n=1}^{N} (x_n - x_s) + \Delta x \right) \Delta x_u + \left( \sum_{s=1,n=1}^{N} (y_n - y_s) + \Delta y \right) \Delta y_u + \left( \sum_{s=1,n=1}^{N} (z_n - z_s) + \Delta z \right) \Delta z_u}{N \sqrt{ \left( \sum_{s=1,n=1}^{N} (x_s - x_n) - \Delta x \right)^2 + \left( \sum_{s=1,n=1}^{N} (y_s - y_n) - \Delta y \right)^2 + \left( \sum_{s=1,n=1}^{N} (z_s - z_n) - \Delta z \right)^2 }} \tag{25}$$
Equation (25) gives the relationship between the measurement error and the resulting difference in the relative position estimate: a measurement error inevitably affects the estimation accuracy of the relative distance and biases the result either larger or smaller.
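This effect is easy to check numerically. The sketch below (our own, with an assumed error of about 5 m on each axis of P1) perturbs one GNSS measurement on UAV A and reports the resulting shift in the estimated relative distance:

```python
# A quick numerical illustration of Equations (24)-(25): a single
# erroneous measurement point shifts the estimated relative distance.
import numpy as np

pts_A = np.array([(0, 1.05, 2.0), (0, 0.93, 2.0),
                  (-0.04, 1.0, 2.0), (0.04, 1.0, 2.0)])   # km
pts_B = np.array([(0, 1.55, 2.5), (0, 1.43, 2.5),
                  (-0.04, 1.5, 2.5), (0.04, 1.5, 2.5)])   # km

def r_rel(a, b):
    return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

base = r_rel(pts_A, pts_B)                                # ~0.7071 km
pts_A_err = pts_A.copy()
pts_A_err[0] += np.array([0.005, 0.005, 0.005])           # ~5 m per axis on P1
print("delta r_rel (km):", r_rel(pts_A_err, pts_B) - base)  # non-zero bias
```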
For the relative angle φrel between the equivalent polygonal planes of the UAVs, if the measured value Pi contains a measurement error, we can obtain:

$$\Delta \phi_{rel} = \frac{G \left[ E_2(L_6 - L_5) + E_3(L_3 - L_4) \right] - E^2 \left[ (L_6 - L_5)F_2 + (L_3 - L_4)F_3 \right]}{E^2 \sqrt{E^2F^2 - G^2}} \Delta x_i + \frac{G \left[ E_1(L_5 - L_6) + E_3(L_2 - L_1) \right] - E^2 \left[ (L_5 - L_6)F_1 + (L_2 - L_1)F_3 \right]}{E^2 \sqrt{E^2F^2 - G^2}} \Delta y_i + \frac{G \left[ E_1(L_4 - L_3) + E_2(L_1 - L_2) \right] - E^2 \left[ (L_4 - L_3)F_1 + (L_1 - L_2)F_2 \right]}{E^2 \sqrt{E^2F^2 - G^2}} \Delta z_i \tag{26}$$

where $E_1 = L_3L_6 - L_4L_5$, $E_2 = L_5L_2 - L_6L_1$, $E_3 = L_1L_4 - L_2L_3$, $E = \sqrt{E_1^2 + E_2^2 + E_3^2}$, $F_1 = M_3M_6 - M_4M_5$, $F_2 = M_5M_2 - M_6M_1$, $F_3 = M_1M_4 - M_2M_3$, $F = \sqrt{F_1^2 + F_2^2 + F_3^2}$, and $G = E_1F_1 + E_2F_2 + E_3F_3$.
Substituting the measurement error of the measured value Pi into Equation (26), we can obtain:
$$\Delta \phi_{rel} = \left. \frac{G \left[ E_2(L_6 - L_5) + E_3(L_3 - L_4) \right] - E^2 \left[ (L_6 - L_5)F_2 + (L_3 - L_4)F_3 \right]}{E^2 \sqrt{E^2F^2 - G^2}} \right|_{P_i + \Delta} \Delta x_i + \cdots \tag{27}$$

that is, Equation (26) with every Lk, and hence Ek, E, and G, recomputed from the perturbed measurement Pi + Δ (the Δyi and Δzi terms are perturbed in the same way).
If the measured value Pj contains a measurement error, we can obtain:

$$\Delta \phi_{rel} = \frac{G(L_4E_3 - L_6E_2) - E^2(L_4F_3 - L_6F_2)}{E^2\sqrt{E^2F^2 - G^2}} \Delta x_j + \frac{G(L_6E_1 - L_2E_3) - E^2(L_6F_1 - L_2F_3)}{E^2\sqrt{E^2F^2 - G^2}} \Delta y_j + \frac{G(L_2E_2 - L_4E_1) - E^2(L_2F_2 - L_4F_1)}{E^2\sqrt{E^2F^2 - G^2}} \Delta z_j \tag{28}$$
Substituting the measurement error of the measured value Pj into Equation (28), we can obtain:
$$\Delta \phi_{rel} = \left. \frac{G(L_4E_3 - L_6E_2) - E^2(L_4F_3 - L_6F_2)}{E^2\sqrt{E^2F^2 - G^2}} \right|_{P_j + \Delta} \Delta x_j + \cdots \tag{29}$$

that is, Equation (28) with every Lk, Ek, E, and G recomputed from the perturbed measurement Pj + Δ.
According to Equations (26)–(29), the measurement error affects the estimation results of the relative angle φrel between the equivalent polygonal planes to a certain extent, and the size of the error determines the deviation of the estimation result.
If the measured value P1 contains a measurement error, then according to Equations (22) and (23), we can obtain:
$$\Delta \theta_{rel} = \frac{U^2 \left( V_1 \Delta x_1 + V_2 \Delta y_1 + V_3 \Delta z_1 \right) - T \left( U_1 \Delta x_1 + U_2 \Delta y_1 + U_3 \Delta z_1 \right)}{U^2 \sqrt{U^2 V^2 - T^2}} \tag{30}$$

where $U_1 = x_2 - x_1$, $U_2 = y_2 - y_1$, $U_3 = z_2 - z_1$, $U = \sqrt{U_1^2 + U_2^2 + U_3^2}$, $V_1 = x_6 - x_5$, $V_2 = y_6 - y_5$, $V_3 = z_6 - z_5$, $V = \sqrt{V_1^2 + V_2^2 + V_3^2}$, and $T = U_1V_1 + U_2V_2 + U_3V_3$.
Substituting the measurement error of the measured value P1 into Equation (30), we can obtain:
$$\Delta \theta_{rel} = \frac{\tilde{U}^2 \left( V_1 \Delta x_1 + V_2 \Delta y_1 + V_3 \Delta z_1 \right) - \tilde{T} \left( \tilde{U}_1 \Delta x_1 + \tilde{U}_2 \Delta y_1 + \tilde{U}_3 \Delta z_1 \right)}{\tilde{U}^2 \sqrt{\tilde{U}^2 V^2 - \tilde{T}^2}} \tag{31}$$

where $\tilde{U}_1 = U_1 - \Delta x$, $\tilde{U}_2 = U_2 - \Delta y$, $\tilde{U}_3 = U_3 - \Delta z$, $\tilde{U} = \sqrt{\tilde{U}_1^2 + \tilde{U}_2^2 + \tilde{U}_3^2}$, and $\tilde{T} = \tilde{U}_1V_1 + \tilde{U}_2V_2 + \tilde{U}_3V_3$.
In the same way, we can obtain:
$$\Delta \psi_{rel} = \frac{\tilde{W}^2 \left( Q_1 \Delta x_3 + Q_2 \Delta y_3 + Q_3 \Delta z_3 \right) - \tilde{R} \left( \tilde{W}_1 \Delta x_3 + \tilde{W}_2 \Delta y_3 + \tilde{W}_3 \Delta z_3 \right)}{\tilde{W}^2 \sqrt{\tilde{W}^2 Q^2 - \tilde{R}^2}} \tag{32}$$

where $W_1 = x_4 - x_3$, $W_2 = y_4 - y_3$, $W_3 = z_4 - z_3$, $W = \sqrt{W_1^2 + W_2^2 + W_3^2}$, $Q_1 = x_8 - x_7$, $Q_2 = y_8 - y_7$, $Q_3 = z_8 - z_7$, $Q = \sqrt{Q_1^2 + Q_2^2 + Q_3^2}$, $\tilde{W}_1 = W_1 - \Delta x$, $\tilde{W}_2 = W_2 - \Delta y$, $\tilde{W}_3 = W_3 - \Delta z$, $\tilde{W} = \sqrt{\tilde{W}_1^2 + \tilde{W}_2^2 + \tilde{W}_3^2}$, and $\tilde{R} = \tilde{W}_1Q_1 + \tilde{W}_2Q_2 + \tilde{W}_3Q_3$; here the error is assumed to lie on the measured value P3.
According to Equations (31) and (32), the measurement error affects the estimation results of the relative angles θrel and ψrel, and its magnitude and sign determine whether the estimation result tends to increase or decrease.
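A similar finite-difference check applies to the angle sensitivities. The sketch below (our own, with the same illustrative positions and an assumed 5 m error in x on P1) shows how one erroneous measurement shifts the estimated relative angle θrel:

```python
# A quick numerical illustration of Equations (30)-(31): an error on P1
# tilts the nose-tail line of UAV A and changes theta_rel.
import numpy as np

def theta_rel(P1, P2, P5, P6):
    u = np.asarray(P2, float) - np.asarray(P1, float)     # nose-tail, UAV A
    v = np.asarray(P6, float) - np.asarray(P5, float)     # nose-tail, UAV B
    c = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

P1, P2 = (0.0, 1.05, 2.0), (0.0, 0.93, 2.0)               # UAV A (km)
P5, P6 = (0.0, 1.55, 2.5), (0.0, 1.43, 2.5)               # UAV B (km)
base = theta_rel(P1, P2, P5, P6)                          # 0 deg here
P1_err = (0.005, 1.05, 2.0)                               # 5 m error in x
print("delta theta_rel (deg):", theta_rel(P1_err, P2, P5, P6) - base)
```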

3. Numerical Simulation and Results Analysis

3.1. Simulation Environment and Parameters

The simulations run on a high-performance computer configured with an Intel Xeon Silver 4215R CPU (3.2 GHz base frequency), an NVIDIA GeForce Titan RTX professional graphics card, 32 GB of DDR4 RAM, a 480 GB Samsung 2.5-inch SATA SSD, and the 64-bit Windows 10 operating system. MATLAB 2019 is used to run the simulations and analyze the results of the proposed method in real time.
Eight GNSS receivers are mounted on the noses, tails, and wings of UAV A and UAV B to measure and output position information in real time. We assume that the initial positions of the nose, tail, left wing, and right wing of UAV A are (0, 1.05, 2) km, (0, 0.93, 2) km, (−0.04, 1, 2) km, and (0.04, 1, 2) km, and those of UAV B are (0, 1.55, 2.5) km, (0, 1.43, 2.5) km, (−0.04, 1.5, 2.5) km, and (0.04, 1.5, 2.5) km. The initial positions of the centroids of UAV A and UAV B are (0, 1, 2) km and (0, 1.5, 2.5) km, and their flight motion models are:
$$\begin{cases} v_A = \left[ 0.35\cos\dfrac{t}{16} + \delta_x, \;\; 0.35\cos\dfrac{t}{19} + \delta_y, \;\; 0.25 + 0.1(\delta_z - 0.5) \right]^T \\ x_A = x_{A0} + v_A t \end{cases} \tag{33}$$
where $x_{A0}$ is the initial position of the center of mass of UAV A and $(\delta_x, \delta_y, \delta_z)^T$ is the random noise.
$$\begin{cases} v_B = \left[ 0.35\cos\dfrac{t}{16.5} + \xi_x, \;\; 0.35\cos\dfrac{t}{18.5} + \xi_y, \;\; 0.25 + 0.1(\xi_z - 0.5) \right]^T \\ x_B = x_{B0} + v_B t \end{cases} \tag{34}$$
where $x_{B0}$ is the initial position of the center of mass of UAV B and $(\xi_x, \xi_y, \xi_z)^T$ is the random noise.
The true relative position and angles between UAV A and UAV B can then be written as:
$$r_{AB} = \sqrt{(x_{Ax} - x_{Bx})^2 + (x_{Ay} - x_{By})^2 + (x_{Az} - x_{Bz})^2} \tag{35}$$

where $x_A = (x_{Ax}, x_{Ay}, x_{Az})^T$ and $x_B = (x_{Bx}, x_{By}, x_{Bz})^T$.
$$\begin{cases} \phi_{AB} = \arccos \dfrac{x_{Ax} - x_{Bx}}{r_{AB}\cos\psi_{AB}} \\[2mm] \theta_{AB} = \arcsin \dfrac{x_{Ay} - x_{By}}{r_{AB}\cos\psi_{AB}} \\[2mm] \psi_{AB} = \arcsin \dfrac{x_{Az} - x_{Bz}}{r_{AB}} \end{cases} \tag{36}$$
Setting the simulation time as t = 100 s, the flight trajectory, true relative position, and angle of UAV A and UAV B are shown in Figure 2, Figure 3 and Figure 4.
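For reference, a sketch of this trajectory generation is given below. The paper does not state the distribution of the noise terms, so drawing them once per run from a uniform [0, 1) distribution is our assumption, as is the 0.1 s time grid:

```python
# A minimal sketch of the trajectory models in Equations (33)-(35).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 100.0, 0.1)                        # 100 s simulation

def trajectory(x0, periods, noise):
    dx, dy, dz = noise                                # assumed uniform [0, 1)
    v = np.stack([0.35 * np.cos(t / periods[0]) + dx,
                  0.35 * np.cos(t / periods[1]) + dy,
                  (0.25 + 0.1 * (dz - 0.5)) * np.ones_like(t)])
    return np.asarray(x0)[:, None] + v * t            # x = x0 + v t

x_A = trajectory([0.0, 1.0, 2.0], (16.0, 19.0), rng.random(3))
x_B = trajectory([0.0, 1.5, 2.5], (16.5, 18.5), rng.random(3))
r_AB = np.linalg.norm(x_A - x_B, axis=0)              # Equation (35)
print("relative distance range: %.3f to %.3f km" % (r_AB.min(), r_AB.max()))
```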

3.2. Simulation Results and Analysis

According to the design scheme of the relative pose estimation in Figure 1, let the equivalent triangle planes determined by measurement points 123, 124, 134, and 234 on UAV A be A1, A2, A3, and A4, respectively, and let the equivalent triangle planes determined by measurement points 567, 568, 578, and 678 on UAV B be B1, B2, B3, and B4, respectively. The relative position estimation results, and the errors between the estimation results and the true relative position, for the equivalent spatial triangle plane pairs A1 and B1, A2 and B2, A3 and B3, and A4 and B4 are shown in Figure 5a–d.
As can be seen from Figure 5, the relative position estimation results of the equivalent triangle planes A1 and B1, A2 and B2, A3 and B3, and A4 and B4 are very close to the true relative position, and the estimation error is small. In addition, the estimation results of the different triangle planes are very similar, which means that an equivalent triangle plane determined by any three points on the UAV can reliably estimate the relative position of the UAVs with high accuracy.
The relative angle estimation result and the errors between the estimation results and the true relative angle of the equivalent spatial triangle planes A1 and B1, A2 and B2, A3 and B3, and A4 and B4 are shown in Figure 6a–d.
As can be seen from Figure 6, the relative angle estimation results of the equivalent triangle planes A1 and B1, A2 and B2, A3 and B3, and A4 and B4 are very close to the true relative angle, and the estimation error is small. In addition, the estimation results of the different triangle planes are very similar, which means that an equivalent triangle plane determined by any three points on the UAV can reliably estimate the relative angle of the UAVs with high accuracy.
As can be seen from Figure 5 and Figure 6, the relative pose method based on the equivalent triangle plane can accurately estimate the relative pose of the UAV and has a high estimation accuracy.
We let the equivalent quadrilateral plane of UAV A be A and the equivalent quadrilateral plane of UAV B be B. The relative position estimation results and the errors between the estimation results and the true relative position of the equivalent quadrilateral planes A and B are shown in Figure 7. The relative angle estimation result and the errors between the estimation results and the true relative angle of the equivalent quadrilateral planes A and B are shown in Figure 8.
As can be seen from Figure 7 and Figure 8, the relative pose method based on the equivalent quadrilateral plane can accurately estimate the relative pose of the UAV with high accuracy. The estimation results and accuracy of the quadrilateral-plane and triangle-plane methods are very similar, which shows that once the measurement values of at least three GNSS receivers on the UAV are known, the relative pose information of the UAV can be reliably estimated.
In the process of receiving data, due to the influence of external factors, the measurement data of the GNSS receivers will produce errors. In this section, the influence of the measurement errors on the estimation results is analyzed as follows. If the measurement data of GNSS receiver 1 has a measurement error, the estimation results of the relative pose of the UAVs are shown in Figure 9. The relative estimation accuracy is shown in Table 1.
If the measurement data of GNSS receiver 2 has a measurement error, the estimation results of the relative pose of the UAVs are shown in Figure 10. The relative estimation accuracy is shown in Table 2.
As can be seen from Figure 9 and Figure 10 and Table 1 and Table 2, the measurement errors affect the results of the relative pose estimation, but only for the equivalent polygonal planes that include the erroneous measurement points. Compared with the method based on the equivalent triangle plane, the method based on the equivalent quadrilateral plane is less affected by measurement errors, which means that the relative pose estimation method that fuses multiple GNSS measurements to determine the equivalent polygonal plane has good robustness. In addition, the position of the GNSS receiver with the error can be identified from the influence of the measurement errors on the estimation results.
Based on the above simulation results, the method proposed in this paper can accurately estimate the relative pose of the UAVs and can effectively identify the position of the GNSS receiver with measurement errors; at the same time, this shows that the relative pose estimation method based on multi-data fusion to determine the equivalent spatial plane has good robustness.

4. Conclusions

In this paper, we propose a relative pose estimation method for UAVs based on the equivalent spatial plane. In terms of theoretical derivation, four GNSS receivers are installed on each UAV to measure and output position information, and three of the GNSS measurement values are selected to calculate the normal vector and determine the equivalent triangle plane equation. An objective function based on the minimum sum of the squares of the distances from the four points to the spatial plane is established, and the Gauss–Newton method is used to iteratively calculate the normal vector and solve the equivalent quadrilateral plane equation. When the equivalent polygonal plane equations of the UAVs and the GNSS measurement values are known, the relative position and angles between the UAVs are calculated. Finally, the influence of measurement errors on the relative pose estimation results is analyzed. In terms of the simulation analysis, a simulation environment and initial parameters are configured to verify the performance of the proposed method.
The simulation results show that the proposed method can accurately estimate the relative pose of the tanker/receiver without relying on precise mathematical models, and that measurement errors only affect the relative pose estimation results of the planes that include the erroneous measurement points, which shows that the proposed method can detect and diagnose abnormal changes in GNSS data. The simulation results verify the applicability and effectiveness of the proposed method, and the research results can provide a theoretical and practical basis for cluster control and autonomous formation flight.

Author Contributions

Conceptualization, H.W. and S.G.; methodology, H.W. and S.G.; formal analysis, H.W.; resources, C.C.; data curation, J.L.; writing—original draft preparation, H.W.; writing—review and editing, S.G., C.C. and J.L.; project administration, H.W.; funding acquisition, H.W. and C.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by The Youth Innovation Team of Shaanxi Universities, grant number 2023-997, and the 2023 Scientific Research Plan Project Natural Science Key Project of Xi’an Aeronautical Polytechnic Institute, grant number 23XHZK-01.

Data Availability Statement

The data supporting the findings of this study are available within the article.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Xu, X.B.; Duan, H.B.; Guo, Y.J.; Deng, Y.M. A cascade adaboost and CNN algorithm for drogue detection in UAV autonomous aerial refueling. Neurocomputing 2020, 408, 121–134. [Google Scholar] [CrossRef]
  2. Chao, Z.; Zhou, S.L.; Ming, L.; Zhang, W.G.; Scalia, M. UAV Formation Flight Based on Nonlinear Model Predictive Control. Math. Probl. Eng. 2012, 2012, 181–188. [Google Scholar] [CrossRef]
  3. de Marina, H.G.; Espinosa, F.; Santos, C. Adaptive UAV Attitude Estimation Employing Unscented Kalman Filter, FOAM and Low-Cost MEMS Sensors. Sensors 2012, 12, 9566–9585. [Google Scholar] [CrossRef] [PubMed]
  4. Li, J. Relative pose measurement of binocular vision based on feature circle. Optik 2019, 194, 163121. [Google Scholar] [CrossRef]
  5. Williamson, W.R.; Glenn, G.J.; Dang, V.T.; Speyer, J.L.; Stecko, S.M.; Takacs, J.M. Sensor fusion applied to autonomous aerial refueling. J. Guid. Control Dyn. 2009, 32, 262–275. [Google Scholar] [CrossRef]
  6. Gross, J.; Gu, Y.; Rhudy, M.; Chahl, J. Fixed-Wing UAV Attitude Estimation Using Single Antenna GPS Signal Strength Measurements. Aerospace 2016, 3, 14. [Google Scholar] [CrossRef]
  7. Korbly, R. Sensing Relative Attitudes for Automatic Docking. J. Guid. Control Dyn. 1983, 6, 213–215. [Google Scholar] [CrossRef]
  8. Koksal, N.; Jalalmaab, M.; Fidan, B. Adaptive Linear Quadratic Attitude Tracking Control of a Quadrotor UAV Based on IMU Sensor Data Fusion. Sensors 2018, 19, 46. [Google Scholar] [CrossRef] [PubMed]
  9. Li, B.R.; Mu, C.D.; Wu, B.T. Vision based close-range relative pose estimation for autonomous aerial refueling. J. Tsinghua Univ. (Sci. Technol.) 2012, 52, 1664–1669. [Google Scholar] [CrossRef]
  10. Campa, G.; Napolitano, M.; Fravolini, M. Simulation environment for machine vision based aerial refueling for UAVs. IEEE Trans. Aerosp. Electron. Syst. 2009, 45, 138–151. [Google Scholar] [CrossRef]
  11. Mammarella, M.; Campa, G.; Napolitano, M.R.; Fravolini, M.L. Comparison of point matching algorithms for the UAV aerial refueling problem. Mach. Vis. Appl. 2010, 21, 241–251. [Google Scholar] [CrossRef]
  12. Fravolini, M.L.; Campa, G.; Napolitano, M.R. Evaluation of machine vision algorithms for autonomous aerial refueling for unmanned aerial vehicles. J. Aerosp. Comput. Inf. Commun. 2007, 4, 968–985. [Google Scholar] [CrossRef]
  13. Ding, M.; Wei, L.; Wang, B. Vision-based estimation of relative pose in autonomous aerial refueling. Chin. J. Aeronaut. 2011, 6, 807–815. [Google Scholar] [CrossRef]
  14. Doebbler, J.; Spaeth, T.; Valasek, J.; Monda, M.J.; Schaub, H. Boom and receptacle autonomous air refueling using visual snake optical sensor. J. Guid. Control Dyn. 2009, 30, 175–1769. [Google Scholar] [CrossRef]
  15. Zhang, L.M.; Zhu, F.; Hao, Y.M.; Pan, W. Optimization-based non-cooperative spacecraft pose estimation using stereo cameras during proximity operations. Appl. Opt. 2017, 56, 4522–4531. [Google Scholar] [CrossRef] [PubMed]
  16. Zhang, L.M.; Zhu, F.; Hao, Y.M.; Pan, W. Rectangular-structure-based pose estimation method for non-cooperative rendezvous. Appl. Opt. 2018, 57, 6164–6173. [Google Scholar] [CrossRef] [PubMed]
  17. Hinterstoisser, S.; Lepetit, V.; Ilic, S.; Holzer, S.; Konolige, K.; Bradski, G.; Navab, N. Technical Demonstration on Model Based Training, Detection and Pose Estimation of Texture-Less 3D Objects in Heavily Cluttered Scenes. In Proceedings of the 12th European Conference on Computer Vision (ECCV), Florence, Italy, 7–13 October 2012; Volume 7585, pp. 593–596. [Google Scholar] [CrossRef]
  18. Weaver, A.D.; Veth, M.J. Image-based relative navigation for the autonomous refueling problem using predictive rendering. In Proceedings of the 2009 IEEE Aerospace Conference, Big Sky, MT, USA, 7–14 March 2009; IEEE Computer Society: Piscataway, NJ, USA, 2009; pp. 1–13. [Google Scholar] [CrossRef]
  19. Li, G. Three-Dimensional Attitude Measurement of Complex Rigid Flying Target Based on Perspective Projection Matching. Ph.D. Thesis, Harbin Institute of Technology, Harbin, China, 2015. Available online: https://kns.cnki.net/KCMS/detail/detail.aspx?dbname=CMFD201601&filename=1015980350.nh (accessed on 1 July 2015).
  20. Liebelt, J.; Schmid, C.; Schertler, K. Viewpoint-independent object class detection using 3D feature maps. In Proceedings of the 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2008), Anchorage, AK, USA, 23–28 June 2008; pp. 978–986. [Google Scholar] [CrossRef]
  21. Teng, X.C.; Yu, Q.F.; Luo, J.; Zhang, X.H.; Wang, G. Pose Estimation for Straight Wing Aircraft Based on Consistent Line Clustering and Planes Intersection. Sensors 2019, 19, 342. [Google Scholar] [CrossRef] [PubMed]
  22. Williamson, W.R.; Abdel-Hafez, M.F.; Rhee, I.; Song, E.J.; Wolfe, J.D.; Chichka, D.F.; Speyer, J.L. An instrumentation system applied to formation flight. IEEE Trans. Control Syst. Technol. 2007, 15, 75–85. [Google Scholar] [CrossRef]
  23. Yuan, H.; Xiao, C.; Xiu, S.; Wen, Y.; Zhou, C.; Li, Q. A New Combined Vision Technique for Micro Aerial Vehicle Pose Estimation. Robotics 2017, 6, 6. [Google Scholar] [CrossRef]
  24. Strohmeier, M.; Montenegro, S. Coupled GPS/MEMS IMU Attitude Determination of Small UAVs with COTS. Electronics 2017, 6, 15. [Google Scholar] [CrossRef]
  25. Eling, C.; Klingbeil, L.; Kuhlmann, H. Real-Time Single-Frequency GPS/MEMS-IMU Attitude Determination of Lightweight UAVs. Sensors 2015, 15, 26212–26235. [Google Scholar] [CrossRef] [PubMed]
  26. Fosbury, A.M.; Crassidis, J.L. Relative navigation of air vehicles. J. Guid. Control Dyn. 2008, 31, 824–834. [Google Scholar] [CrossRef]
  27. Shao, W.; Chang, X.H.; Cui, P.Y.; Cui, H.T. Coupled feature matching and INS for small body landing navigation. J. Astronaut. 2010, 31, 1748–1755. [Google Scholar] [CrossRef]
  28. Wang, L.; Dong, X.M.; Zhang, Z.L. Relative pose measurement based on tightly-coupled INS/Vision. J. Chin. Inert. Technol. 2011, 19, 686–691. [Google Scholar] [CrossRef]
Figure 1. Scheme of relative pose estimation between UAVs.
Figure 2. Flight trajectory of UAVs.
Figure 3. Relative position.
Figure 4. Relative angle.
Figure 5. Relative position estimation and errors. (a) Relative position estimation and error between true value and estimation value 1; (b) Relative position estimation and error between true value and estimation value 2; (c) Relative position estimation and error between true value and estimation value 3; (d) Relative position estimation and error between true value and estimation value 4.
Figure 6. Relative angle estimation and errors. (a) Relative angle estimation and error between true value and estimation value 1; (b) Relative angle estimation and error between true value and estimation value 2; (c) Relative angle estimation and error between true value and estimation value 3; (d) Relative angle estimation and error between true value and estimation value 4.
Figure 7. Relative position estimation and errors (equivalent quadrilateral planes).
Figure 8. Relative angle estimation and errors (equivalent quadrilateral planes).
Figure 9. Relative pose estimation results (measurement error on GNSS receiver 1).
Figure 10. Relative pose estimation results (measurement error on GNSS receiver 2).
Table 1. Relative estimation accuracy (RMSE) with a measurement error on GNSS receiver 1.

| RMSE | Relative Position/km | φ | θ | ψ |
|---|---|---|---|---|
| Estimation value 1 and true value | 0.028 | 22.5 | 2.2 | 18.9 |
| Estimation value 2 and true value | 0.031 | 25.3 | 1.9 | 20.2 |
| Estimation value 3 and true value | 0.025 | 20.8 | 1.7 | 22.3 |
| Estimation value 4 and true value | 0.009 | 3.2 | 2.8 | 3.1 |
| Estimation value 5 and true value | 0.019 | 15.7 | 1.3 | 24.7 |

Table 2. Relative estimation accuracy (RMSE) with a measurement error on GNSS receiver 2.

| RMSE | Relative Position/km | φ | θ | ψ |
|---|---|---|---|---|
| Estimation value 1 and true value | 0.032 | 31.6 | 1.5 | 21.1 |
| Estimation value 2 and true value | 0.038 | 33.1 | 1.7 | 22.8 |
| Estimation value 3 and true value | 0.010 | 2.5 | 0.9 | 1.1 |
| Estimation value 4 and true value | 0.041 | 30.9 | 2.1 | 19.7 |
| Estimation value 5 and true value | 0.022 | 32.2 | 2.0 | 18.9 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
