Design of Airport Obstacle-Free Zone Monitoring UAV System Based on Computer Vision

In recent years, a rising number of incidents between Unmanned Aerial Vehicles (UAVs) and planes have been reported at airports and airfields. A design scheme for an airport obstacle-free zone monitoring UAV system based on computer vision is proposed. The system integrates the functions of identification, tracking, and expelling, and is mainly used for the low-cost control of air-floating objects and small aircraft. First, a quadcopter dynamic model and a 2-Degrees-of-Freedom (2-DOF) Pan/Tilt/Zoom (PTZ) model are analyzed, and an attitude backstepping controller based on disturbance compensation is designed. Second, a low-and-slow small-target self-identification and tracking technology is constructed for complex environments. Based on the You Only Look Once (YOLO) and Kernel Correlation Filter (KCF) algorithms, an autonomous target recognition and high-speed tracking scheme with strong robustness and high reliability is designed. Third, a PTZ controller and automatic aiming strategy based on an anti-windup Proportional-Integral-Derivative (PID) algorithm are designed, and a simplified, automatically aimed expelling device, an environmentally friendly gel ball blaster featuring high speed and high accuracy, is built. The feasibility and stability of the system are verified through prototype experiments.


Introduction
The control of air-floating objects and small low-slow aircraft means monitoring and expelling targeted air-floating objects and small low-slow aircraft through technical methods and devices. With the rapid growth of the consumer-grade Unmanned Aerial Vehicle (UAV) market, illegal flights pose risks to security and privacy. Between 19 and 21 December 2018, hundreds of flights were cancelled at Gatwick Airport near London, UK, following reports of drone sightings close to the runway. The reports caused major disruption, affecting approximately 140,000 passengers and 1000 flights [1]. Accidents caused by drones are increasingly frequent, so it is necessary to regulate their use. It is therefore of great significance to deploy anti-UAV and anti-air-drift-target systems in the airport obstacle-free zone. However, UAV reconnaissance is a challenging task due to their small size and low flying speed. Many technologies, such as radar monitoring, audio monitoring, video monitoring, and Radio Frequency (RF) monitoring, have the potential to detect and locate drones. Each technique has its advantages and disadvantages.

1.
Radar: Radar is mainly used for the measurement and tracking of large aircraft, whereas UAVs usually fly at low altitude and low speed. Aleksander Nowak et al. introduced a method for the fast, simultaneous calibration of many mobile Frequency Modulated Continuous Wave (FMCW) radars operating in a network, which is used in an anti-drone scheme [2]. Multerer et al. presented an anti-drone system based on a three-dimensional (3D) FMCW Multiple Input Multiple Output (MIMO) radar.

Table 1 summarizes the above monitoring techniques. The detection ranges in the table are obtained from the literature and may vary with the type of target, the monitoring environment, the hardware parameters, and the corresponding algorithm.

This paper proposes a new identification, tracking, and expelling system, which can realize the fully automatic, over-the-horizon, and low-cost expulsion of a target object in flight. The design framework is divided into four parts: the underlying flight control system, the fast target recognition and tracking system, the aiming and PTZ control system, and the obstacle avoidance control system. The UAV consists of a quadcopter frame with a 680 mm wheelbase, 15 × 7 carbon fiber propellers (Shanghai, China), and 390 kV 5008 brushless motors (Shanghai, China). It is modeled with Pro/E to obtain the UAV structure model. The PIXHAWK autopilot (Berkeley, CA, USA), as the low-level flight controller, runs the self-stabilization algorithm. The Jetson Nano development board is the core image processing unit, Arduino development board 1 is the core unit of the aiming control system, and Arduino development board 2, a single-line lidar, and an ultrasonic sensor are the core units of the obstacle avoidance control system. They are shown in Figures 1-3.
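The detect-then-track handoff used by the vision pipeline (YOLO for acquisition, KCF for fast frame-to-frame tracking) can be sketched as a small state machine. The classes below are stand-in stubs for illustration only; in the real system they would be a YOLO network and a KCF correlation filter, and the redetection interval is an assumed parameter.

```python
# Sketch of the YOLO-acquire / KCF-track handoff. DummyDetector and
# DummyTracker are placeholders; only the control flow is illustrated.

REDETECT_EVERY = 30  # assumed: re-run the (slow) detector every N frames

class DummyDetector:
    """Stands in for YOLO: returns a bounding box (x, y, w, h) or None."""
    def detect(self, frame):
        return frame.get("target")  # pretend detection

class DummyTracker:
    """Stands in for KCF: tracks the last box, fails if the target vanishes."""
    def __init__(self):
        self.box = None
    def init(self, frame, box):
        self.box = box
    def update(self, frame):
        if frame.get("target") is None:
            return False, None            # tracker lost the target
        self.box = frame["target"]
        return True, self.box

def run_pipeline(frames):
    detector, tracker = DummyDetector(), DummyTracker()
    tracking = False
    boxes = []
    for i, frame in enumerate(frames):
        if not tracking or i % REDETECT_EVERY == 0:
            box = detector.detect(frame)          # slow, robust acquisition
            if box is not None:
                tracker.init(frame, box)
                tracking = True
        else:
            tracking, box = tracker.update(frame)  # fast frame-to-frame track
        boxes.append(box if tracking else None)
    return boxes

frames = [{"target": (10 + i, 20, 5, 5)} for i in range(5)] + [{"target": None}]
result = run_pipeline(frames)
```

The key design point is that the expensive detector runs only for acquisition or periodic re-validation, while the lightweight correlation tracker carries the target between detections.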

Quadcopter Dynamic Model
An accurate and reliable mathematical model is the foundation for designing a flight platform, and a dynamic motion simulation platform makes the design and study of the flight control system convenient. Taking the actual flight environment into account, realistic assumptions are proposed in this section based on an analysis of rotor aerodynamics. The kinematic and dynamic equations that accurately describe the flight motion of the UAV are then derived from the flight principle and an overall force analysis.
First, our assumptions are given as follows:
1. The Earth is a standard sphere, the reference ground is a standard horizontal plane, and the effect of the Earth's rotation can be ignored.
2. The quadcopter body is rigid, and its elastic deformation can be ignored.
3. The air drag during flight is proportional to the square of the flight velocity, and its direction is opposite to the velocity direction.

Quadcopter Aerodynamic and Torque Analysis
During flight, the forces acting on the quadcopter mainly include the lift and drag generated by the rotors, gravity, and the frictional drag generated during flight.
The torque acting on it is primarily composed of the aerodynamic and rolling torques generated by the rotors, as well as the gyroscopic torque generated during rotor rotation. The force analysis is shown in Figure 4.

Hence, the external forces received during the quadcopter flight are as follows: where k_tr represents the frictional drag coefficient between the quadcopter and the air; V is its flight speed; T_i is the lift force generated by rotor No. i; and D_i is the drag generated by rotor No. i. The external torques received are presented as follows: where l_x and l_y are the horizontal distances between the rotor center and the quadcopter centroid, as shown in Figure 4; h represents the vertical distance between the rotor center and the quadcopter centroid; g_ix and g_iy are the gyroscopic stress torques generated when rotor No. i rotates; and J_r is the moment of inertia of the motor and rotor rotating group around the motor shaft. When four motors act on the x and y axes of the body coordinate system, the total gyroscopic torque is presented as follows:

Definition of Coordinate System
In order to establish a state equation that can accurately describe quadcopter motion, a reasonable coordinate system should be selected; specific coordinate systems are needed to describe the relevant parameters of the quadcopter's linear and angular motions. This is not only a critical step in establishing the quadcopter model, but also an essential part of designing and studying the quadcopter flight control system. To better describe the quadcopter motion state, two rectangular coordinate systems are presented as follows.

Ground coordinate system C_E

The ground coordinate system is defined as C_E: (O_E, x_E, y_E, z_E), which is fixed to the ground. O_E is the origin of the coordinates; the positive direction of x_E points to the east; the positive direction of y_E points to the south; and the positive direction of z_E is perpendicular to the horizontal plane, pointing to the geocenter according to the right-hand rule.

Body coordinate system C B
The body coordinate system is defined as C_B: (O_B, x_B, y_B, z_B), which is fixed to the quadcopter. O_B is the origin of coordinates, which coincides with the center of the quadcopter (all devices included) and with O_E at the initial moment. The x_B axis lies in the plane that passes through the centroid and is perpendicular to the line connecting motors No. 1 and No. 2, pointing to the front of the body. The y_B axis lies in the plane that passes through the centroid and is perpendicular to the line connecting motors No. 2 and No. 3, pointing to the right of the body. The z_B axis lies in the vertical plane passing through the centroid, pointing to the lower part of the body.
The flight motion of the quadcopter involves linear motion in three directions and angular motion around three axes. Specifically, the linear motion is described in the ground coordinate system, while the angular motion is described in the body coordinate system. Therefore, the ground coordinate system and the body coordinate system are used simultaneously when modeling the flight motion. The ground coordinate system C_E and the body coordinate system C_B used in the quadcopter system modeling are presented in Figure 5. It should be noted that the origins of the two coordinate systems coincide in the initial state. Moreover, the simplified planar structure of the quadcopter is presented in Figure 5 to precisely demonstrate the meaning of the various parameters in the quadcopter model.

l_txi and l_tyi represent the distances between motor No. i and the x_B axis and the y_B axis, respectively, in the simplified planar structure of the quadcopter shown in Figure 6.
In general, for modeling a multirotor quadcopter, the majority of the existing literature assumes that the center is at the geometric center, obtaining l_tx1 = l_tx2 = l_tx3 = l_tx4 and l_ty1 = l_ty2 = l_ty3 = l_ty4. In fact, the center position deviates from the geometric center, affecting the control effect of the multirotor flight control system. Therefore, considering that the center position deviates from the geometric center, the distances between each motor and the x_B and y_B axes possess the following relationship: l_tx1 = l_tx4, l_tx2 = l_tx3, l_ty1 = l_ty2, and l_ty3 = l_ty4. Based on the quadcopter coordinate system shown in Figure 5, the attitude angle of the quadcopter can be defined by the relationship between the body coordinate system C_B and the ground coordinate system C_E, as follows:
1. Roll angle φ: the angle between the body axis O_B y_B and the horizontal plane; right roll is positive.
2. Pitch angle θ: the angle between the body axis O_B x_B and the horizontal plane; nose-up is positive.
3. Yaw angle ψ: the angle between the projection of the body axis O_B x_B on the horizontal plane and the ground coordinate axis O_E x_E; right yaw is positive.

Euler Angles
The body coordinate system C_B can be considered a coordinate system fixed to the quadcopter body within the ground coordinate system C_E. The position vector of the quadcopter centroid is p = [x, y, z]^T, and the three attitude angles a = [φ, θ, ψ]^T represent the attitude of the quadcopter.
The conversion matrix converting from the body coordinate system to the ground coordinate system is:
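As a concrete reference, the conversion can be written out numerically. This sketch assumes the standard aerospace Z-Y-X (yaw-pitch-roll) Euler sequence, which matches the roll/pitch/yaw definitions above; the paper's own matrix is not reproduced in this text, so the exact form here is an assumption.

```python
import math

def rotation_body_to_ground(phi, theta, psi):
    """Standard Z-Y-X Euler rotation matrix R_BE (roll phi, pitch theta, yaw psi).
    Maps a vector expressed in the body frame into the ground frame."""
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cp, sp = math.cos(psi), math.sin(psi)
    return [
        [ct * cp, sf * st * cp - cf * sp, cf * st * cp + sf * sp],
        [ct * sp, sf * st * sp + cf * cp, cf * st * sp - sf * cp],
        [-st,     sf * ct,               cf * ct],
    ]

# With zero attitude angles the body and ground frames coincide.
R0 = rotation_body_to_ground(0.0, 0.0, 0.0)
```

A quick sanity property of any such matrix is orthonormality (R Rᵀ = I), which is worth checking after transcribing the entries.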

Kinetic Equations
The multirotor quadcopter flight motion in the air can be decomposed into the quadcopter translation relative to the ground coordinate system and the rotation relative to the body coordinate system.
The translational motion equation of the quadcopter is presented as follows: where p_t = [x_t, y_t, z_t]^T and v_t = [u_t, v_t, w_t]^T represent the position and velocity of the quadcopter in the ground coordinate system, respectively; m_t is the quadcopter mass; g is the acceleration of gravity; z_E = [0, 0, 1]^T is the unit vector in the ground coordinate system; R_BE is the conversion matrix from the body coordinate system to the ground coordinate system; and F is the combined external force acting on the quadcopter, except for gravity. The attitude motion equation of the quadcopter is presented as follows:
where a_t = [φ_t, θ_t, ψ_t]^T represents the attitude angle of the quadcopter in the ground coordinate system; Φ_t(a_t) is the conversion matrix between the angular velocity in the body coordinate system and the Euler angular velocity in the ground coordinate system; ω_t = [p_t, q_t, r_t]^T is the component of the rotational angular velocity of the body coordinate system relative to the ground coordinate system, expressed along the body axes; J_t is the quadcopter rotational inertia matrix; Ge_t is the rotor gyroscopic stress torque, as shown in Equation (3); and M is the combined external torque acting on the quadcopter, except for the gyroscopic stress torque.
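A minimal numerical sketch of the translational equation above, assuming the z-down ground frame used in the paper (gravity acts along +z_E) and lumping thrust and drag into F expressed in the body frame. The mass m_t = 1.94 kg is taken from the parameter list given later in the paper; the sign conventions are assumptions.

```python
def translational_accel(R_BE, F_body, m_t=1.94, g=9.81):
    """dv/dt = g*z_E + (1/m_t) * R_BE @ F_body, with z_E = [0, 0, 1] pointing down."""
    a = [sum(R_BE[i][k] * F_body[k] for k in range(3)) / m_t for i in range(3)]
    a[2] += g  # gravity acts along +z_E (downward) in this convention
    return a

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# Hover check: level attitude (R = I) with thrust m*g along -z_B gives zero net
# acceleration, matching the equilibrium of the translational equation.
hover = translational_accel(I3, [0.0, 0.0, -1.94 * 9.81])
```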

By combining Equations (5) and (6), the six-degree-of-freedom nonlinear mathematical model of the quadcopter can be obtained:

Target Expulsion Device
The ejection device of the launch system adopts the idea of a gel ball blaster and is manufactured by 3D printing, with the features shown in Figures 7 and 8. It is composed of a high-speed motor, a spring, a piston, a cylinder, a gear set, and a power supply. The launcher uses safe and environmentally friendly gel balls to expel targets, as shown in Figure 9.

Design of High-Speed Steering Gear PTZ
Due to the limitations of the load and power consumption of a quadcopter, the PTZ is required to be light in weight, small in volume, and simple in structure. Based on these principles, a two-degrees-of-freedom PTZ controlled by steering gears is designed, as shown in Figure 10, including the yaw-axis steering gear, a yaw-axis adapter plate, the pitch-axis steering gear, a pitch-axis adapter plate, a lens mounting base and lens, and a launcher mounting base. The steering gear model is DS8611; its torque is 18 kg·cm at 7.4 V and its speed is 0.14 s/60°. The bracket is made using 3D printing. In order to ensure a continuous supply of gel balls to the launching device, a continuous gel ball feeding device is designed, as shown in Figure 8. The gel balls slide downward under gravity and enter the feeding hole. The designed projectile supply device can store up to 200 gel balls at a time, which meets the requirements for expelling targets on a flight mission (see Figure 10).

The actuator model is obtained by identifying test results. The operating characteristics of the steering gear include a dead band, a linear zone, and a nonlinear zone. Within the linear zone, the steering gear can be approximately represented by a second-order model. Since the calculation is discrete, this model is discretized as follows: where u(z) is the control signal of the actuator; y(z) is the feedback signal of the actuator; and a_1, a_2, b_1, and b_2 are the parameters obtained through identification. Actuator identification is performed using the adaptive genetic algorithm [18], obtaining a_1 = 0.7349, a_2 = −0.0076, b_1 = −0.0986, and b_2 = −0.1787.
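With the identified parameters, the discrete second-order actuator model can be simulated directly as a difference equation. The exact transfer-function form (and the sign convention of a_i, b_i) is an assumption here, taken as y(k) = a_1 y(k−1) + a_2 y(k−2) + b_1 u(k−1) + b_2 u(k−2), since the paper's equation is not reproduced in this text.

```python
A1, A2 = 0.7349, -0.0076   # identified denominator parameters a_1, a_2
B1, B2 = -0.0986, -0.1787  # identified numerator parameters b_1, b_2

def simulate_step(n_steps, u=1.0):
    """Step response of y(k) = A1*y(k-1) + A2*y(k-2) + B1*u(k-1) + B2*u(k-2)."""
    y = [0.0, 0.0]  # zero initial conditions
    for _ in range(n_steps):
        y.append(A1 * y[-1] + A2 * y[-2] + B1 * u + B2 * u)
    return y

y = simulate_step(200)
# For a stable model the step response settles at the DC gain.
dc_gain = (B1 + B2) / (1 - A1 - A2)
```

The poles of z² − 0.7349 z + 0.0076 are well inside the unit circle, so the identified model is stable and the step response converges quickly.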

Ground Station
The ground monitoring computer is the basis of the reliable operation for the identification, tracking, and expelling system. With the help of the ground system, the remote control and monitoring of the equipment are realized. The key functions are as follows: (1) Planning the flight route of the UAV and (2) monitoring the status of the UAV in real time, while also sending instructions, as shown in Figure 11.



Design of Backstepping Quadcopter Controller Based on Interference Compensation
The attitude controller is the most important part of a multi-rotor UAV flight control system, because its performance directly affects trajectory tracking. Therefore, an attitude controller with superior performance is a prerequisite for the stable flight of a multi-rotor UAV. At the same time, due to the complex structure of a multi-rotor UAV, it is difficult to build a precise six-degree-of-freedom nonlinear mathematical model, and the flight performance is affected by uncertainties in the kinetic model, external disturbances, and other common factors. Control methods frequently applied today include PID control [19], sliding mode control [20], backstepping control [21], feedback linearization control [22], and robust control [23]. The backstepping method is especially suitable for under-actuated systems. It converts the design of a complicated, high-order controlled system into the design of the low-order subsystems decomposed from it. The method designs a virtual, intermediate control variable for each decomposed subsystem to stabilize it, and thereby obtains the actual control input of the system through iteration. The backstepping method combines the selection of the Lyapunov function with the solution of the control law, which proves Lyapunov stability. When designing the quadcopter controller, this paper fully considers the uncertainties of the kinetic model as well as the influence of external disturbances, and uses a radial basis function neural network (RBFNN) to approximate the uncertain terms of the model. The disturbances caused by the change in mass as gel balls are launched from the quadcopter, as well as by changes in the external air current, are compensated with a nonlinear observer, and a backstepping controller based on disturbance compensation is designed.
The stability of the system is proved with the analysis methods of Lyapunov stability theory. Finally, verification is conducted on the simulation platform, and a flight test is carried out.

Quadcopter Modeling and Processing
The nonlinear model of the 6-DOF quadcopter is expressed by Equation (10): (10) in which p_t = [x_t, y_t, z_t]^T is the position state vector of the quadcopter in the ground coordinate system; υ_t = [u_t, v_t, w_t]^T is the velocity state vector in the ground coordinate system; a_t = [φ_t, θ_t, ψ_t]^T is the attitude state vector in the ground coordinate system; ω_t = [p_t, q_t, r_t]^T is the attitude angular velocity state vector in the body coordinate system; Φ_t(a_t) is the transform matrix between attitude angular velocity and Euler angular velocity, as shown by Equation (11); g is the local gravitational acceleration, g = 9.81 m/s²; m_t is the mass of the quadcopter, m_t = 1.94 kg; R_BE is the transform matrix between the body coordinate system and the ground coordinate system, as shown by Equation (12); k_tr is the air drag coefficient of the quadcopter, given by Equation (13); J_t is the moment of inertia matrix, J_t = diag(2.37 × 10⁻³, 3.51 × 10⁻³, 5.31 × 10⁻³) kg·m²; Ge_t is the gyroscopic moment, as expressed by Equation (14); J_tr is the rotational inertia of the motor and rotary wing around the motor shaft, J_tr = 4.92 × 10⁻⁶ kg·m²; M_t = [u_t1, u_t2, u_t3, u_t4]^T is the resultant moment excluding the gyroscopic moment, with the control variables u_t1, u_t2, u_t3, u_t4 defined by Equation (15); and l_t is the length of the quadcopter's frame arm, l_t = 0.265 m. These parameters are summarized in Table 2. Substituting the above equations into Equation (10) and introducing the external disturbance terms d_tu, d_tv, ..., d_tr, we obtain the nonlinear model of the 6-DOF quadcopter, as shown in Equation (16).
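The rigid-body attitude dynamics in this model (ω̇ = J_t⁻¹(M − ω × J_t ω − Ge_t)) can be evaluated directly with the numeric inertia values above. In this sketch the gyroscopic term is passed in rather than derived from rotor speeds, so only the rigid-body part is illustrated.

```python
J = (2.37e-3, 3.51e-3, 5.31e-3)  # diagonal inertia J_t from the paper, kg*m^2

def angular_accel(omega, M, Ge=(0.0, 0.0, 0.0)):
    """d(omega)/dt = J^-1 * (M - omega x (J*omega) - Ge) for diagonal J."""
    Jw = [J[i] * omega[i] for i in range(3)]
    cross = [omega[1] * Jw[2] - omega[2] * Jw[1],
             omega[2] * Jw[0] - omega[0] * Jw[2],
             omega[0] * Jw[1] - omega[1] * Jw[0]]
    return [(M[i] - cross[i] - Ge[i]) / J[i] for i in range(3)]

# Sanity check: no applied moment and no rotation gives no angular acceleration.
rest = angular_accel((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```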
We select the three virtual control variables as follows: The translational dynamic equation can then be rewritten as: where the three virtual control variables u_tx, u_ty, and u_tz, together with the desired yaw angle ψ_td, can be used to solve for the desired control variable u_t1, the desired roll angle φ_td, and the desired pitch angle θ_td.
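Under one common convention (z-down frames, total thrust u_t1 acting along −z_B), the desired thrust and attitude angles can be recovered from the virtual controls and the desired yaw as sketched below. Since the paper's Equations are not reproduced in this text, the exact sign conventions here are assumptions.

```python
import math

def thrust_and_attitude(ux, uy, uz, psi_d, m=1.94, g=9.81):
    """Recover total thrust u1 and desired roll/pitch from the virtual controls
    (desired accelerations). Assumes z-down frames, thrust along -z_B."""
    u1 = m * math.sqrt(ux**2 + uy**2 + (uz - g)**2)  # nonzero away from free fall
    # Direction cosines of the desired thrust axis in the ground frame:
    fx = -m * ux / u1
    fy = -m * uy / u1
    fz = -m * (uz - g) / u1
    phi_d = math.asin(fx * math.sin(psi_d) - fy * math.cos(psi_d))
    theta_d = math.atan2(fx * math.cos(psi_d) + fy * math.sin(psi_d), fz)
    return u1, phi_d, theta_d

# Hover: zero desired acceleration -> thrust m*g and level attitude.
u1, phi, theta = thrust_and_attitude(0.0, 0.0, 0.0, 0.0)
```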
As can be seen from the nonlinear mathematical model of 6-DoF quadcopters expressed by Equation (16), the resultant external moment acting on a quadcopter leads to changes in its attitude angular velocity and thus in the attitude angle, which ultimately causes changes in the spatial position of the quadcopter. Therefore, similar to the case with the quadcopter, the trajectory tracking controller is also designed with a double closed-loop control structure comprised of an attitude controller (inner loop) and a trajectory tracking controller (outer loop). In the design of the inner-loop controller, a RBFNN is adopted to approximate uncertainties of the model, and a nonlinear observer is adopted to compensate for the influence caused by external disturbances.

Back-Stepping Attitude Controller Design Based on Disturbance Compensation
The design and stability analysis of the RBFNN and the nonlinear disturbance observer in the quadcopter trajectory tracking controller are omitted here for brevity. The estimates of the external disturbance terms, D_tυ in Equation (20) and D_tω in Equation (21), in the quadcopter model are given directly by the following equations.
where u_t = [u_tx, u_ty, u_tz]^T, and L_tυ > 0 and L_tω > 0 are parameters of the observer to be designed. Similarly, the quadcopter attitude controller is designed with a double closed-loop structure comprised of an attitude angular velocity controller (inner loop) and an attitude angle controller (outer loop). The structural diagram of the attitude controller is shown in Figure 12. Next, the process by which the attitude angle controller is designed using the backstepping method is described.
The error between the attitude angle of the quadcopter Y_t and the desired attitude angle Y_d is defined as: Select a Lyapunov function for the quadcopter attitude angle system, as follows: Taking the derivative of the function, and substituting into Equation (22), we obtain: Let the command for attitude angular velocity be: where Λ_Y = Λ_Y^T > 0 is the matrix to be designed. Substituting Equation (25) into Equation (24), we obtain: That is, when adopting the command for attitude angular velocity designated by Equation (25), the attitude angle system of the quadcopter is stabilized.
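The outer-loop step above can be summarized as follows, under the simplifying assumption that the attitude kinematics reduce to dY_t/dt = X_t (the full kinematic matrix is not reproduced in this extract):

```latex
\begin{aligned}
e_Y &= Y_t - Y_d, \qquad V_Y = \tfrac{1}{2}\,e_Y^{T} e_Y,\\
\dot{V}_Y &= e_Y^{T}\left(\dot{Y}_t - \dot{Y}_d\right)
           = e_Y^{T}\left(X_t - \dot{Y}_d\right),\\
X_d &= \dot{Y}_d - \Lambda_Y e_Y
\;\;\Longrightarrow\;\;
\dot{V}_Y = -\,e_Y^{T}\Lambda_Y e_Y \le 0 \quad \text{when } X_t = X_d .
\end{aligned}
```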
The error between the attitude angular velocity of the quadcopter X_t and the desired attitude angular velocity X_d is defined as: where X_t = [p_t, q_t, r_t]^T is the attitude angular velocity of the quadcopter and X_d = [p_d, q_d, r_d]^T is the desired attitude angular velocity. Taking the derivative of the above function, we can substitute it into the quadcopter attitude model as follows: in which L = L^T > 0 is the parameter of the auxiliary system to be designed, D̂_t(t) represents the estimation of the unknown external interference D_t, and Ŵ^T ξ(Y_t) represents the RBFNN estimation of the unknown nonlinear function. We obtain the error dynamics; letting the control moment take the designed form, we have: Select a Lyapunov function for the quadcopter attitude angular velocity system, as follows: Taking the derivative of the above function, and substituting it into Equation (31), we obtain:
Observe and analyze the above equation, and select the adaptive rate of Ŵ, as shown below:
where Λ_W = Λ_W^T > 0 and µ_W > 0 are both parameters to be designed. Substituting Equation (34) into Equation (33), we obtain: In the meantime, the following transformation should also be taken into account: Substituting Equation (36) into Equation (35), we obtain:
Sensors 2020, 20, 2475
To ensure the stability of the attitude angular velocity system of the quadcopter, the parameters of the controller and relevant auxiliary systems should at least satisfy the following conditions: Meanwhile, as is known from the literature [24,25], V_X is convergent and lim t→∞ V_X = C_K. Thus, under the effect of the control moment for the attitude angle, as shown by Equation (30), the attitude angular velocity system of the quadcopter is stable.

Trajectory Tracking Control Based on the Back-Stepping Method
As shown above, the model of the quadcopter position system is as follows: The quadcopter position system shown by Equation (42) is divided into three subsystems along the longitudinal and horizontal movements, which are described in the state space: The following part takes the height controller as an example, where the back-stepping method is adopted in the design.
The height error of the quadcopter is defined as follows: where y_3d is the desired height of the quadcopter and y_3 is its actual height. Select a Lyapunov function for the quadcopter height subsystem, as follows:
Taking the derivative of the above Lyapunov function, we obtain: Design the following virtual control variable according to Equation (46): where Λ y3 > 0 is the controller parameter to be designed. Substituting Equation (47) into Equation (46), we obtain: that is, the virtual control variable designed according to Equation (47) enables the error of quadcopter height to converge. The longitudinal speed error of the quadcopter is defined as follows: where x 3d is the desired longitudinal speed of the quadcopter and x 3 is its actual longitudinal speed.
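A compact sketch of this height-channel step follows; the error sign convention and the relation dy_3/dt = x_3 are assumptions consistent with the state-space description above:

```latex
\begin{aligned}
e_{y3} &= y_{3d} - y_3, \qquad V_{y3} = \tfrac{1}{2}\,e_{y3}^{2},\\
\dot{V}_{y3} &= e_{y3}\left(\dot{y}_{3d} - x_3\right),\\
x_{3d} &= \dot{y}_{3d} + \Lambda_{y3}\,e_{y3}
\;\;\Longrightarrow\;\;
\dot{V}_{y3} = -\,\Lambda_{y3}\,e_{y3}^{2} \le 0 \quad \text{when } x_3 = x_{3d} .
\end{aligned}
```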
Taking the derivative of the longitudinal speed of the quadcopter, we obtain:
Furthermore, we have: Substituting Equation (51) into Equation (50), we obtain: where Λ_x3 > 0 is the controller parameter to be designed. Substituting Equation (52) into Equation (51), we obtain: Select a Lyapunov function for the quadcopter longitudinal speed subsystem, as follows: In the meantime, we use the nonlinear observer to estimate the external longitudinal disturbance term d_tw. The specific definition is shown below; the demonstration of its stability is omitted here.
According to Equation (56), we can obtain: Substituting Equation (57) into Equation (56), we obtain: To ensure the convergence of the quadcopter's longitudinal speed error, the following condition should be satisfied: By the same token, we can derive the virtual control variables for the two subsystems along the horizontal movements.

Target Recognition and Location
The deep learning model has become a hot research topic in computer vision due to its strong representation ability, the accumulation of data, and progress in computing power. It is mainly divided into three levels: classification, detection, and segmentation.
The classification task focuses on the whole image, giving a content description of the whole picture, while detection focuses on a specific object target, requiring the simultaneous acquisition of the class information and location information of the target. Compared with classification, detection requires understanding both the foreground and background of a picture: we need to isolate the target of interest from the background and determine its description (class and location). Therefore, the output of a detection model is a list, and each item in the list gives the class and position of a detected target (often expressed by the coordinates of a rectangular detection box).
With the development of Deep Neural Networks (DNNs), the Convolutional Neural Network (CNN) has been widely used in image recognition. When a video collected by the camera is subject to frame extraction to obtain images, the first task is to detect all kinds of aircraft in the image, that is, the target detection task in image processing. However, a plain CNN classifier can only judge whether the target object appears in the image; it cannot locate the position of the target object in the image. Girshick et al. [26], from UC Berkeley, proposed a new network structure named Regions with Convolutional Neural Network Features (R-CNN) in 2014, which realized the functions of image recognition and object location. Ren et al. proposed Faster R-CNN [27] in 2015. In structure, Faster R-CNN integrates feature extraction, region proposal, bounding box regression, and classification in one network, greatly improving the overall performance, especially in detection speed. The improved versions, Fast R-CNN [28] and Faster R-CNN, realized higher recognition accuracy, but the region-based design of the R-CNN series kept processing speeds low: Fast R-CNN runs at 0.5 frames per second (FPS), while Faster R-CNN reaches 7 FPS. This is a great improvement compared to R-CNN, but still fails to meet the real-time requirement.
In 2016, Redmon et al. [29][30][31] from the University of Washington proposed the real-time object detection network YOLO (You Only Look Once). The YOLO algorithm is a classic algorithm in the field of target detection. Its core idea is to use the entire picture as the input to the network and directly regress the location of the bounding box and its class in the output layer. The object detection problem is thus transformed into a regression problem, and a single neural network can be used to obtain the position coordinates and relative size of an object from one image. On a computer with a GPU, the standard version of YOLO can process images in real time at 45 FPS, and the fast version of YOLO can reach 155 FPS, with more than twice the mean average precision of other real-time object detection methods.
We compared the similarities and differences of Faster R-CNN and YOLOv3 in target detection. Compared with the traditional multiclass detection task, the target detection task in this high-speed application is relatively simple (aircraft detection only). In terms of accuracy, YOLOv3 has better performance; in terms of processing speed, the Faster R-CNN algorithm runs more slowly: the time spent on one image is 5-6 s, which does not meet the speed requirement in actual testing tasks. By contrast, the YOLOv3 algorithm takes only 0.1-0.2 s to process one image and better satisfies the requirement in actual conditions. To improve the accuracy of lane detection in complex scenarios, an adaptive lane feature learning algorithm that can automatically learn the features of a lane in various scenarios has been proposed [32]. As a result, the YOLOv3 algorithm is adopted for target detection.
This paper aims to propose a real-time image recognition, location, and tracking system on the basis of a YOLO network. YOLO divides the whole input image into an S × S grid, where each grid cell predicts the normalized relative coordinates (x, y), normalized relative length and width (w, h), and confidence level conf of the central position of R bounding boxes, as well as C conditional probabilities, that is, the probability that an object belongs to one class when this grid contains the target object. The YOLO network structure is shown in Figure 13, including 24 convolutional layers and two fully connected layers. Convolutional layers are used to extract the features in the image, while the fully connected layers are used to build the relationship between the features and the probability of image position and target. The output of the YOLO network is an S × S × (5 × R + C) tensor, where each 1 × 1 × (5 × R + C) tensor corresponds to one grid cell of the image, including the conditional probability, bounding box size, and coordinate information.
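To make the grid encoding concrete, the sketch below decodes one cell prediction into pixel coordinates; the function name and the S = 7, R = 2, C = 1 configuration are illustrative assumptions, not the paper's implementation:

```python
def decode_cell(pred, row, col, S, img_w, img_h):
    """Convert one bounding-box prediction of a YOLO grid cell into pixels.

    pred = (x, y, w, h, conf): x, y are normalized offsets within cell
    (row, col); w, h are normalized to the whole image.
    Returns the box center and size in pixels, plus the confidence.
    """
    x, y, w, h, conf = pred
    cx = (col + x) / S * img_w   # cell offset plus in-cell offset, scaled
    cy = (row + y) / S * img_h
    return cx, cy, w * img_w, h * img_h, conf

# Output tensor size check: S x S x (5R + C) with S = 7, R = 2, C = 1
S, R, C = 7, 2, 1
assert (S, S, 5 * R + C) == (7, 7, 11)
```

For example, a prediction centered in cell (3, 3) of a 7 × 7 grid on a 448 × 448 image decodes to the image center (224, 224).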
The loss function of YOLO is shown in Equation (61), where η_1, shown in Equation (62), represents the prediction of the bounding box coordinate and its size; η_2, shown in Equation (63), represents the confidence level prediction of a bounding box containing a target object; η_3, shown in Equation (64), represents the confidence level prediction of a bounding box without a target object; and η_4, shown in Equation (65), represents the conditional class probability prediction.
where λ_coord is the predicted weight of the bounding box coordinate; λ_noobj is the confidence weight of a bounding box without the target object; 1_ij^obj is used to judge if the j-th bounding box in the i-th grid cell is responsible for the object; and 1_i^obj is used to judge whether there is an object center in the i-th grid cell. Through this loss function, YOLO can achieve a balance between the bounding box coordinates and size, confidence, and conditional probability. In YOLO, there will be multiple bounding boxes in one grid cell conducting the prediction of the object. However, in the training process, it is hoped that each object can be predicted by only one bounding box in the end. Therefore, if the IOU (Intersection Over Union) of one current bounding box prediction to the GTB (Ground True Box) is the highest, this bounding box will be in charge of the prediction of this object. As the training progresses, each bounding box will provide a better prediction of the responsible object.
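For reference, the four terms as published in the original YOLO paper [29] take the following form (Equations (62)-(65) in this paper follow the same structure):

```latex
\begin{aligned}
\eta_1 &= \lambda_{coord} \sum_{i=0}^{S^2} \sum_{j=0}^{R} \mathbb{1}_{ij}^{obj}
  \Big[ (x_i-\hat{x}_i)^2 + (y_i-\hat{y}_i)^2
      + \big(\sqrt{w_i}-\sqrt{\hat{w}_i}\big)^2
      + \big(\sqrt{h_i}-\sqrt{\hat{h}_i}\big)^2 \Big],\\
\eta_2 &= \sum_{i=0}^{S^2} \sum_{j=0}^{R} \mathbb{1}_{ij}^{obj}\,
          \big(C_i-\hat{C}_i\big)^2,\\
\eta_3 &= \lambda_{noobj} \sum_{i=0}^{S^2} \sum_{j=0}^{R} \mathbb{1}_{ij}^{noobj}\,
          \big(C_i-\hat{C}_i\big)^2,\\
\eta_4 &= \sum_{i=0}^{S^2} \mathbb{1}_{i}^{obj} \sum_{c\,\in\,classes}
          \big(p_i(c)-\hat{p}_i(c)\big)^2,
\end{aligned}
```

and the total loss is the sum η_1 + η_2 + η_3 + η_4.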
Meanwhile, when YOLO is used for the real-time location of an aircraft or a target, only a single object class, the aircraft, needs to be recognized, so C = 1. Moreover, because the output of YOLO contains the bounding box coordinate and size, it is required to translate the coordinate and size of the bounding box in the image into the actual location and distance of the aircraft. Therefore, a last fully connected layer is added to the YOLO network to build the relationship between the coordinate and size of the bounding box and the actual location and distance of the aircraft.
In network training, 2000 images are collected as the specimens, and the locations of the targets in sample images are marked. Some specimens are shown as Figures 14-17. Figure 18 provides the recognition of targets by YOLO after training.
As seen in Table 3, when the UAV is away from the image boundary, it can be located accurately, but when the UAV approaches the image boundary or goes beyond the image range, the accuracy of the location may decrease.

Target Tracking
In the early stage, image tracking algorithms such as Camshift, optical flow, and background subtraction were very popular, and were applied successfully in static background conditions. After 2008, such methods were gradually abandoned, with the research focus shifting to the study of image tracking with a dynamic or complex background.
Currently, the Kernel Correlation Filter (KCF) algorithm is widely used for real-time tracking. The main contributions of the KCF algorithm are as follows: (1) Positive and negative samples are collected by using the cyclic matrix of the surrounding area of the target, and the target detector is trained by using ridge regression. The operation of a matrix is transformed into a Hadamard product of vectors by the diagonalization property of a cyclic matrix in Fourier space, i.e., an element-wise dot product, greatly reducing the amount of computation, improving the speed of operation, and making the algorithm meet the real-time need. (2) The ridge regression of linear space is mapped to a nonlinear space by a kernel function. In the nonlinear space, by solving a dual problem and some common constraints, the calculation can also be simplified by diagonalizing the cyclic matrix in Fourier space. (3) A way to integrate multichannel data into the algorithm is presented. The Histogram of Oriented Gradients (HOG) feature is used when the features of a targeted area are extracted. The HOG feature divides the image into smaller parts called cells. Gradient information is extracted from the cells, and a gradient orientation histogram is drawn to reduce the influence of light. By gathering the orientation histograms of several cells for block normalization, all orientation histograms of the cells are connected in series to get the features of the image.
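The diagonalization trick in contribution (1) can be verified numerically. The sketch below is illustrative, not the KCF implementation: it checks that a circulant matrix-vector product equals an element-wise (Hadamard) product in the Fourier domain:

```python
import numpy as np

# First column of the circulant matrix (e.g., a filter template)
# and an arbitrary sample vector.
c = np.array([1.0, 2.0, 3.0, 4.0])
x = np.array([0.5, -1.0, 2.0, 0.0])

# Explicit circulant matrix: column j is c cyclically shifted by j.
C = np.stack([np.roll(c, j) for j in range(len(c))], axis=1)
direct = C @ x                     # O(n^2) dense product

# The KCF shortcut: C x = F^{-1}( F(c) * F(x) ), an O(n log n)
# element-wise product in Fourier space.
fast = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

assert np.allclose(direct, fast)
```

This equivalence is what lets KCF replace dense matrix operations with FFTs over the cyclically shifted training samples.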
We have built the OpenCV running environment on the Jetson Nano and deployed the YOLO algorithm and the KCF tracker [33]. We then prepared for the next test.

Principle of Visual Feedback Servo Tracking
Through image acquisition and comparison of successive frames, computer vision feedback can be realized. The difference between the target position extracted according to the next image information and the position information extracted from the previous image is used as the input signal of PTZ position control; the space movement of the moving target is converted into frame image plane coordinates, and the angle of rotation required for PTZ aiming is calculated to form the visual feedback.
As shown in Figure 19, the optical center of the camera mounted on the PTZ is used as the reference point O_e, and a space reference coordinate system O_e x_e y_e z_e is built. By means of the space reference coordinate system and an image plane coordinate transformation, the coordinate of the target in the camera imaging plane, the image plane coordinate, is determined. Assuming that at moment t, the coordinate of the target object in the reference coordinate system O_e x_e y_e z_e is A(x_t, y_t, z_t), after the image plane coordinate transformation, its location coordinate A'(x_t1, y_t1) in the image plane can be determined. After a very short time ∆t, the PTZ does not act; the space reference coordinate system remains unchanged and the coordinate of the target object in the reference coordinate system is A(x_t+∆t, y_t+∆t, z_t+∆t). After the image plane coordinate transformation, its location coordinate A'(x_t1+∆t, y_t1+∆t) in the image plane can be determined. After calculation, within the time interval ∆t, the coordinate of the target object turns α and β relative to the x_e axis and y_e axis of the space reference coordinate system.
Assuming that the execution time of the PTZ action is zero, after the PTZ location adjustment, the space datum coordinate system becomes O_e+∆t x_e+∆t y_e+∆t z_e+∆t. The coordinate of the target relative to the new space coordinate system is S_t+∆t(x_t+∆t, y_t+∆t, z_t+∆t), and the image plane coordinate is A_t+∆t(a_x,t+∆t, a_y,t+∆t). Assuming that before every rotation in this process, the camera coordinate system is the space coordinate system, its mathematical model can be described as shown in Equation (66): where O_e is the set space coordinate at moment t, i.e., the standard coordinate system; O_e+∆t is the camera coordinate after rotation at moment ∆t, equivalent to the transition coordinate system introduced for calculating the rotation; and R_e+∆t = R_α × R_β is the rotation matrix. Its coordinate transformation, i.e., the process of movement, is as shown in Figure 19. The target object moves from Point A to Point B, and yaw angle α and pitch angle β are adjusted through the PTZ to ensure that the visual axis is aligned with the target object and achieve target tracking.
The 2-DOF PTZ angle is adjusted to ensure the coincidence of the optical center and rotation center; the tracking motion of the visual axis can be broken into rotation around the x axis and y axis of the camera coordinate system. The rotation matrix is shown in Equation (67):

R_e+∆t =
[ cos∆β          0        −sin∆β
  −sin∆α sin∆β   cos∆α    −sin∆α cos∆β
  cos∆α sin∆β    sin∆α    cos∆α cos∆β ]

The workflow is as follows: in coordinate system o, the starting position of the UAV is the origin of coordinates, the starting position of the target is A(x_t, y_t, z_t), and the detection angle of the airborne camera is 120°. After a short time ∆t, the target position is A(x_t+∆t, y_t+∆t, z_t+∆t). (1) The Jetson Nano collects a video stream through the UAV front-end camera. (2) When the target is found, it marks the target and sends the target frame coordinates and color images to the target tracking algorithm for initialization.
(3) The color image is sent to the target tracking algorithm for iterative updating, and step (3) is repeated for the next color image. (4) According to the color image and the corresponding target frame coordinate information, the UAV flight height, speed, and attitude angle are controlled in order to enable the UAV to approach the target. The Jetson Nano detects distance via the front-end ultrasonic ranging module to keep a safe distance from the target. (5) After entering the range, the Arduino control board executes the fire control program, drives the two-degrees-of-freedom steering gear pan tilt, quickly corrects angles α and β between the collimation and the target, automatically executes the shooting command after aiming, drives the gel ball blaster unit motor to launch through the relay, and completes its task of hitting the target.
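The aiming angles α and β in step (5) can be computed from the target's camera-frame coordinates. The sketch below assumes an axis convention (z forward along the visual axis, x right, y up) that is not specified in the paper:

```python
import math

def aim_angles(x, y, z):
    """Pan (alpha) and tilt (beta) angles, in radians, that align the
    visual axis with a target at (x, y, z) in the camera frame.

    Assumed convention: z forward along the optical axis, x right, y up.
    """
    alpha = math.atan2(x, z)                 # yaw, rotation about the y axis
    beta = math.atan2(y, math.hypot(x, z))   # pitch, rotation about the x axis
    return alpha, beta
```

A target straight ahead yields (0, 0); a target offset equally in x and z yields a 45° pan.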
In ideal conditions, the new coordinate should be the same as the image plane coordinate at moment t. In the actual movement process, due to the extraction error of image plane coordinates in the target recognition process and the tracking error of the control system, and because of the continuous movement of the target during the tracking process, the visual feedback system is subject to an upper limit on tracking speed. In order to further improve the speed and accuracy of target tracking, higher requirements are put forward for the design of a PTZ controller.

Anti-Windup PID Control Algorithm
In the design process of the pan tilt control system, the structural strength and response speed of the steering gear are limited; in general, this is referred to as plant input limitation. In addition, the PTZ module requires frequent switches to different modes, such as from follow mode to target attack mode, which is known as plant input substitution. Due to the existence of input limitation and displacement, the input and output of the control system are sometimes unequal, which leads to further variation in the closed-loop response of the control system, resulting in the windup phenomenon. The PID controller is widely used across various aspects of control system design. In order to eliminate static error, the windup phenomenon is inevitable in the integral part of the controller. The fast-tracking task results in higher requirements in the design of the PTZ control system. Typically, the control system takes a small signal as input in the process of debugging and operation. When the PTZ of the steering gear quickly follows and adjusts the firing angle, the control signal input is given a large range of sudden change, which is prone to large overshoots and vibrations, affecting the stability of the entire flight control system.
In view of the windup phenomenon in the PTZ control system, we have established the PTZ model of the steering gear, analyzed the influence of the structure, speed, and force of the steering gear on tracking, and proposed an Anti-Windup PID controller to reduce the influence of actuator saturation and improve the dynamic response performance of the steering gear PTZ. Firstly, we ignore the nonlinear effect of saturation caused by the actuator limitation of the steering gear and take the deviation between the expected position of the aim point and the actual position as the input value, integrate the saturation error, and weaken the saturation effect by adjusting the adaptive coefficient. When the pan tilt of the steering gear is adjusted slightly, the compensator does not work. When the PTZ of the steering gear is adjusted rapidly and at a large angle, the path and time information and the large signal are taken as input values. The PID controller with anti-integral saturation compensation will play a role in ensuring the control performance as the system saturates. The input video stream resolution is 1080 × 720 pixels. The center coordinate (540, 360) of the frame image of the video stream is taken as the given value, and the center coordinate of the target frame is taken as the output value and negative feedback, all of which form a closed-loop control loop. After the difference between the given value and the feedback value is passed through the Anti-Windup PID controller, it is sent to the Arduino control board through the serial port. From this, the angular speed of rotation is calculated in order to control the rotation of the pan tilt of the steering gear so that the target is in the center of the image.
In this simulation experiment, the Anti-Windup PID controller adopts back-calculation, and the structure is shown in Figure 20.
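A minimal sketch of the back-calculation scheme follows; the gains, limits, and class name are illustrative assumptions, not the paper's tuned values. The integrator is driven by both the tracking error and the difference between the saturated and unsaturated outputs:

```python
class AntiWindupPID:
    """PID with back-calculation anti-windup: the saturation error
    (u_sat - u_raw), scaled by kb, is fed back into the integrator."""

    def __init__(self, kp, ki, kd, kb, u_min, u_max, dt):
        self.kp, self.ki, self.kd, self.kb = kp, ki, kd, kb
        self.u_min, self.u_max, self.dt = u_min, u_max, dt
        self.integral = 0.0
        self.prev_e = 0.0
        self.u_raw = 0.0
        self.u_sat = 0.0

    def update(self, setpoint, measurement):
        e = setpoint - measurement
        # Back-calculation: while saturated, (u_sat - u_raw) < 0 pulls
        # the integrator back toward the linear range.
        self.integral += (self.ki * e + self.kb * (self.u_sat - self.u_raw)) * self.dt
        d = (e - self.prev_e) / self.dt
        self.prev_e = e
        self.u_raw = self.kp * e + self.integral + self.kd * d
        self.u_sat = max(self.u_min, min(self.u_max, self.u_raw))
        return self.u_sat

# With a persistent error, a naive integrator would wind up without bound;
# here the tracking term keeps it near ki*e/kb instead.
pid = AntiWindupPID(kp=1.0, ki=0.5, kd=0.0, kb=1.0, u_min=-1.0, u_max=1.0, dt=0.01)
outputs = [pid.update(1.0, 0.0) for _ in range(500)]
assert all(-1.0 <= u <= 1.0 for u in outputs)
assert pid.integral < 0.6   # bounded, vs. 2.5 for a naive integrator
```

The choice of kb trades off how aggressively the integrator is unwound against noise sensitivity during saturation.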
In the determination of the distance from the target, we adopted the ultrasonic ranging scheme. The threshold value is set to 10 m: when the ultrasonic device detects that it is 10 m from the target, the UAV stops moving toward it. The 10 m distance effectively ensures the safety of the UAV while remaining within effective expelling range. In this paper, the binocular camera is not used for video streaming and target depth information collection, because the data volume of the binocular camera is too large; the maximum processing speed of the airborne processor, the Jetson Nano, can only reach three frames per second, and the real-time performance of the program would be poor.

Experimental Results
The experiment is divided into five stages. In the first stage, in Sections 2.2 and 3.1, the back-stepping method was applied to design the attitude controller and trajectory tracking controller of the quadcopter, and the Lyapunov stability of these controllers was demonstrated. An experimental simulation based on the quadcopter model was conducted to validate the correctness of the designed controllers; to demonstrate their advantages, a traditional PID controller was used as the comparison baseline. The diagram of the trajectory tracking controller designed for the quadcopter is shown in Figure 21. Under the Matlab2018/Simulink environment, the trajectory tracking of the designed controller was simulated with a fixed step size of 0.001 s. The specified trajectory was given by Equation (69), and the initial values for the attitude angle, angular velocity, position, and speed were all zero. The controller parameters were selected as shown in Table 4.
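The fixed-step simulation setup can be illustrated with the following sketch. The reference here is a hypothetical helix standing in for the paper's Equation (69), and the closed-loop dynamics are stubbed as a first-order lag toward the reference rather than the actual back-stepping controller; only the fixed 0.001 s stepping and the error bookkeeping mirror the setup described above.

```python
import math

DT = 0.001  # fixed simulation step, as in the Simulink setup


def reference(t):
    """Hypothetical stand-in for the specified trajectory of Equation (69):
    a circle in x/y with a constant climb in z."""
    return (math.cos(0.2 * t), math.sin(0.2 * t), 0.1 * t)


def simulate(t_end=30.0, gain=20.0):
    """Fixed-step run returning the maximum tracking error after the
    initial transient; the position obeys pos_dot = gain * (ref - pos)."""
    pos = [0.0, 0.0, 0.0]  # all-zero initial state, as in the paper
    t, max_err = 0.0, 0.0
    while t < t_end:
        ref = reference(t)
        err = [r - p for r, p in zip(ref, pos)]
        pos = [p + gain * e * DT for p, e in zip(pos, err)]
        if t > 1.0:  # ignore the initial transient
            max_err = max(max_err, max(abs(e) for e in err))
        t += DT
    return max_err
```

With these stand-in dynamics, the steady-state lag error is roughly the reference rate divided by the gain, i.e. on the order of 0.01 m.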
The external disturbance terms Dtυ(t) and Dtω(t) are given by Equations (70) and (71), respectively.
The simulation results are shown in Figure 22.
As can be seen from the simulation results in Figure 22, the quadcopter tracks the desired trajectories well under the designed trajectory tracking controller. As shown in Figure 22b, when tracking 3D trajectories, the maximum error is kept within 0.15 m, whereas the maximum error under the PID controller reaches 0.3 m; the average error of the designed controller is also lower than that of the PID controller. As seen in Figure 22c–f, the maximum tracking errors along the x- and y-axes are around 0.1 m, and both the maximum and average errors are lower than those of the PID controller. As seen in Figure 22g,h, the tracking error remains within 0.02 m along the z-axis, an evidently better performance than the PID controller's.
As seen in Figure 22i,n, both the designed attitude controller and the PID attitude controller can effectively maintain the quadcopter's attitude. However, the attitude controller designed in this paper can enable the three attitude angles of the quadcopter to rapidly converge towards the desired angles.
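The maximum and average errors quoted above can be computed from logged trajectory samples as follows; the sample data in the check is synthetic, not from the paper's experiments.

```python
def tracking_errors(desired, actual):
    """Per-axis maximum and mean absolute tracking error from logged samples."""
    errs = [abs(d - a) for d, a in zip(desired, actual)]
    return max(errs), sum(errs) / len(errs)
```

Applying this per axis to the simulation logs yields the figures compared above (e.g. 0.15 m maximum for the designed controller versus 0.3 m for PID).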
In the meantime, as seen in Equation (69), the desired trajectories of the quadcopter for t ∈ (0, 5] s and t ∈ (25, 30] s are straight upwards and straight downwards. Under the effect of external disturbances, the tracking errors along the x- and y-axes remain within 0.01 m, and around 0.02 m along the z-axis. These results lay the foundation for quadcopter flight stability. During outdoor testing with disturbance factors such as varied wind force, wind direction, and mass, the controller also delivered fairly good stability and reliability.
In the second stage, we tested the target tracking algorithm. The test was conducted with the selected balloon in the air and the fixed-wing model aircraft in flight. The results showed that fast and accurate tracking can be realized in airborne image processing at around 7 FPS even when the size, speed, posture, and background of the target change, as shown in Figure 23 and Table 5.
In the third stage, we ran a trajectory tracking MATLAB simulation of the steering gear PTZ. Expected trajectories in the shapes of a square and a Z were set up to test the performance of the Anti-Windup PID controller.
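The correlation-based tracking used in the second-stage test can be sketched as below. Note this is a simplified stand-in: the paper's KCF tracker performs kernelized correlation in the Fourier domain, whereas this sketch does a plain normalized cross-correlation search over a pixel window, which illustrates the principle of locating the target patch frame-to-frame.

```python
import numpy as np


def track_patch(frame, template, center, search=20):
    """Locate `template` in `frame` near `center` (row, col) by scanning a
    +/-`search` pixel window and scoring normalized cross-correlation."""
    th, tw = template.shape
    cy, cx = center
    t = template - template.mean()
    tn = np.sqrt((t * t).sum()) + 1e-9
    best, best_pos = -np.inf, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y0, x0 = cy + dy - th // 2, cx + dx - tw // 2
            if y0 < 0 or x0 < 0 or y0 + th > frame.shape[0] or x0 + tw > frame.shape[1]:
                continue  # candidate window falls outside the frame
            w = frame[y0:y0 + th, x0:x0 + tw]
            wz = w - w.mean()
            score = (t * wz).sum() / (tn * (np.sqrt((wz * wz).sum()) + 1e-9))
            if score > best:
                best, best_pos = score, (cy + dy, cx + dx)
    return best_pos
```

KCF achieves the same search far faster by evaluating all shifts at once via the FFT, which is what makes the ~7 FPS onboard rate attainable on a Jetson Nano.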
Compared with a traditional PID controller, the results showed that, under large, rapidly changing signal inputs, the overshoot of the Anti-Windup PID controller is smaller and decays faster, and its PTZ control performance is better than that of the traditional PID controller, as shown in Figure 24.
In the fourth stage, we ran an indoor static test. The UAV was placed on a box 0.5 m high; the target was set 1.5 m high and 8 m away from the UAV. Once the target was calibrated, the Jetson Nano, Arduino, and steering gear PTZ located the target within 2 s, corrected the shooting trajectory, and triggered the shooting procedure. The gel ball trajectory was scattered within a range of 10 cm. High-speed cameras were used to film the target, and the number of impact points in each target area was counted through slow-motion playback to calculate the accuracy of the PTZ launch. The hit rate of the indoor static test (deviation < 20 cm) was 94.44%, which met expectations. The indoor accuracy test results are shown in Table 6 and Figure 25.
In the fifth stage, we ran an outdoor dynamic expelling test, as shown in Table 7 and Figure 26. The target was fixed on a DJI (DJ-Innovations, Shenzhen, China) UAV, which could move randomly in the air. The UAV we designed could follow the flight with a response time of only 0.5 s. Meanwhile, it ran the target tracking program at high speed (7.2 FPS in the test environment), and executed the trajectory correction program (10 times/s) and the launching program (15 times/s). The gel ball dispersion was within 20 cm in field tests. Figure 27 is a close-up shot of the target, and Figure 28 is the gray-scale image of the target, which shows the distribution of impact points visually. The test method is the same as above. The hit rate of the outdoor dynamic test (deviation < 20 cm) was 82.98%. In the outdoor dynamic shooting test, fast and effective expulsion of the target UAV was achieved.
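The hit rates reported for the indoor and outdoor tests follow from counting impact points within the 20 cm deviation threshold. The sketch below shows that computation; the impact coordinates in the check are synthetic, not the paper's measurements.

```python
import math


def hit_rate(impacts, target, radius_m=0.20):
    """Fraction of impact points within `radius_m` of the target point.

    `impacts` is a list of (x, y) coordinates in meters; `target` is the
    (x, y) aim point. A deviation <= radius_m counts as a hit.
    """
    hits = sum(1 for (x, y) in impacts
               if math.hypot(x - target[0], y - target[1]) <= radius_m)
    return hits / len(impacts)
```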

Conclusions and Future Work
We have designed an integrated identification, tracking, and expelling quadcopter UAV system based on computer vision, proposed a real-time image recognition, location, and tracking plan based on the YOLO and KCF algorithms, and designed corresponding target tracking and expelling strategies. We built a prototype UAV; conducted target recognition and tracking tests, a PTZ control algorithm test, an indoor static test, and an outdoor dynamic expelling test; and assessed the design in different scenarios. According to the experiments, the automatic target identification and tracking system based on the YOLO and KCF algorithms achieves high-speed identification and tracking of targets in a complex environment. Additionally, the high-speed steering gear PTZ designed with the Anti-Windup algorithm is adopted to effectively ensure the ability to operate the aiming and expelling device with high speed and precision. The UAV system based on