Article

Vision-Based Target Detection and Tracking for a Miniature Pan-Tilt Inertially Stabilized Platform

1
Tianjin Key Laboratory of Intelligent Control of Electrical Equipment, School of Control Science and Engineering, Tiangong University, Tianjin 300387, China
2
School of Electrical and Electronic Engineering, Tiangong University, Tianjin 300387, China
3
Faculty of Natural and Mathematical Sciences, King’s College London, London WC2R 2LS, UK
*
Author to whom correspondence should be addressed.
Electronics 2021, 10(18), 2243; https://doi.org/10.3390/electronics10182243
Submission received: 6 August 2021 / Revised: 6 September 2021 / Accepted: 8 September 2021 / Published: 13 September 2021
(This article belongs to the Section Systems & Control Engineering)

Abstract

This paper presents a novel visual servoing scheme for a miniature pan-tilt inertially stabilized platform (ISP). A fully customized ISP can be mounted on a miniature quadcopter to achieve stationary or moving target detection and tracking. The airborne pan-tilt ISP can effectively isolate disturbing rotational motion of the carrier, stabilizing the optical axis of the camera so that a clear video image is obtained. Meanwhile, the ISP keeps the target on the optical axis of the camera, so as to achieve target detection and tracking. The vision-based tracking control design adopts a cascaded control structure based on a mathematical model that accurately reflects the dynamic characteristics of the ISP. The inner loop of the proposed controller employs a proportional lag compensator to improve the stability of the optical axis, and the outer loop adopts a feedback linearization-based sliding mode control method to achieve target tracking. Numerical simulations and laboratory experiments demonstrate that the proposed controller achieves satisfactory tracking performance.

1. Introduction

Commonly used small-scale unmanned aerial vehicles (UAVs) can be classified into fixed-wing aircraft, helicopters and multicopters. Before 2010, fixed-wing aircraft and helicopters held overwhelming dominance in both military and civil fields. However, owing to its simple structure, ease of use, high reliability and low cost, the quadcopter, as the most popular multicopter, has consolidated its dominance in the market of small-scale UAVs in recent years [1]. Quadcopters are capable of hovering, vertical take-off and landing, low-speed flight at low altitude and multi-attitude flight. Besides their small size and light weight, they have low requirements for take-off and landing sites, making them suitable for flying in confined spaces and in close proximity to people, such as within urban canyons and even inside buildings. Unfortunately, flying within these environments is very challenging from a navigation perspective, as the Global Positioning System (GPS) signal will be largely degraded or even unavailable due to dropouts and multipath effects [2]. Driven by this practical requirement, vision-based navigation and active ranging sensor-based navigation in GPS-denied environments have been formulated to replace traditional GPS and inertial sensor navigation systems [3].
A quadcopter without payloads has no practical value in airborne flight. Only when the quadcopter carries mission payloads, such as a camera or a laser detection and ranging (LADAR) system, can it effectively adapt to the environment and perform various actual tasks, as mentioned in [4]. In order to achieve superior performance, these payloads need to be installed on an inertially stabilized platform (ISP). The airborne ISP, which takes advantage of gyroscopic measurements to maintain its own stability, is the core device of the navigation system. In order to accomplish the tasks of detecting and tracking ground or low-altitude targets, it is necessary to ensure that the vision sensor can always obtain stable and high-quality target images. Therefore, the vision sensor must be mounted on a highly stabilized platform. The ISP generally has two functions. On the one hand, it ensures the stabilization of the optical axis direction with respect to inertial space by measuring the attitude change of the carrier and reacting to isolate the rotational movement of the carrier. On the other hand, it can quickly respond to control commands, rotating with the desired angular velocity to ensure that the optical axis of the camera points in the desired orientation, so as to achieve target detection and tracking. The ISP can reduce the effects of external factors on payloads to achieve stable low-altitude shooting based on inertial line-of-sight (LOS) stabilization. Recently, due to the continuous miniaturization of the quadcopter, most available commercial ISPs, such as the DJI Zenmuse Z15 camera platform, are no longer suitable for the miniature quadcopter, whose weight is less than 1 kg. Thus, it is crucial to customize an appropriate ISP for the miniature UAV.
The ISP greatly increases the possibilities of target detection and tracking. In general, it must perform successfully regardless of the rotational movement and vibration of the quadcopter or unexpected movement of the object. Related to this topic, visually stabilized platform technology has been studied extensively over the past few decades. Adaptive and fuzzy controllers that decouple the axes for the passive LOS stabilization system were reported in earlier studies [5,6,7]. A rigorous analysis of control problems related to a standard double-gimbal system is presented in [8]; however, it is not directly applicable to inertial stabilization. In order to obtain good adaptability for the servo system with nonlinear properties and uncertain factors, various kinds of fuzzy proportional–integral–derivative (PID) control methods have been proposed for the LOS stabilization of two-axis ISPs in [9,10,11]. Considering model uncertainties and carrier vibration or external disturbance, robust and disturbance rejection controllers are presented in [12,13,14,15,16,17]. Survey paper [18] confirms that this topic is still relevant within the engineering community and for defence technological needs. The above-cited works mostly focus on the task of inertial stabilization only. Another survey paper [19] discusses in detail how to extend the inertial rate stabilizing feedback loop to a visual tracking system by suggesting the common cascaded control structure for every rotational degree-of-freedom (DOF). Following this design method, decoupled cascaded controllers for a two-axis gimbal are reported in [20,21]. Considering the nonlinear characteristics of camera motion, a feedback linearization-based visual pointing and tracking control scheme for an inertially stabilized double-gimbal airborne camera platform is designed in [22,23]. That scheme is thoroughly simulated, verified by laboratory experiments and compared against a more intuitive decoupled control scheme. In addition, pan-tilt camera platform position controllers with a single feedback loop for each axis are designed in [24,25].
This paper presents a new vision-based target tracking controller for a customized miniature pan-tilt ISP in the presence of carrier disturbance and target movement. The target can be detected and recognized based on a computer vision algorithm. In order to facilitate the control design, a mathematical model that accurately reflects the dynamic characteristics of the ISP is established by combining theoretical analysis and system identification. The target tracking controller adopts a two-layer cascaded control method based on the established mathematical model. The parameters of the controller are easier to determine than those of the model-free controllers designed in [9,11]. The proposed controller elaborately combines a lag compensator in the frequency domain with a feedback linearization-based sliding mode control method in the time domain to achieve satisfactory tracking performance. Based on the identified model, the inner loop of the proposed controller employs a proportional lag compensator to ensure the stability of the LOS while achieving satisfactory dynamic and steady-state performance. According to the nonlinear characteristics of camera motion [26], the outer loop adopts the feedback linearization-based sliding mode control method to achieve precise target tracking. The stabilization loop accepts commands from the output of the corresponding position loop. As is well known, sliding mode control can effectively adapt to model uncertainties and suppress external disturbances. When a disturbing rotational movement of the carrier or an unexpected movement of the object exists, the proposed controller offers a significant improvement over the single feedback loop controllers employed in [24,25]. A host-based control system (HCS) [27] is built as the avionics architecture of the ISP.
The control algorithm is implemented in real time on the MATLAB software platform in the ground control station (GCS), so the code can easily be changed and the program debugged through wireless communication. Numerical simulations and laboratory experiments are also given in this paper to support the theoretical analysis. This is an important extension of [19], which presents only the theoretical analysis of the control design. This paper also improves the control method and makes some important theoretical extensions of our previous conference paper [28].
This paper is organized as follows. In the next section, the modeling methodology of the ISP is introduced in detail. Section 3 is the main body of this paper, which presents the vision-based target tracking control design for the ISP. Numerical simulations and laboratory experiments are given in Section 4 and Section 5, respectively. The conclusions and future works are presented in Section 6.

2. Mathematical Model

2.1. Coordinate Systems

Because the ISP with a standard double-gimbal configuration can detect any stationary or moving target within a certain range on the ground, the ISP is designed in a simplified 2-DOF form. As shown in Figure 1, the mathematical model of the double-gimbal camera platform involves several coordinate systems: the inertial frame $O_nX_nY_nZ_n$, the body frame $OX_bY_bZ_b$, the azimuth frame $OX_aY_aZ_a$ and the elevation frame $OX_eY_eZ_e$. The mechanism of the ISP is composed of two gimbals. The outer gimbal, which is fixed to the bottom of the quadrotor UAV by the support shaft $OZ_a$, realizes the rotation of the platform around the azimuth axis. The azimuth angle, denoted by $\theta_a$, is the angle through which the azimuth frame $OX_aY_aZ_a$ rotates around the $OZ_a$ axis relative to the body frame $OX_bY_bZ_b$. The inner gimbal, which is fixed to the azimuth gimbal by the support shaft $OY_e$, achieves the rotation of the platform around the elevation axis. The elevation angle, denoted by $\theta_e$, is the angle through which the elevation frame $OX_eY_eZ_e$ rotates around the $OY_e$ axis relative to the body frame. The support shafts $OZ_a$ and $OY_e$ are perpendicular to each other. The airborne camera, whose optical axis is parallel to the $OX_e$ axis, is mounted on the elevation gimbal.

2.2. Stabilization Principle

When the miniature quadrotor UAV has a disturbing angular velocity $\omega_b = [\omega_{bx}\ \omega_{by}\ \omega_{bz}]^T$ with respect to the body coordinate system, it is transmitted through the mechanism of the platform as the angular velocity $\omega_e^{dis} = [\omega_{ex}^{dis}\ \omega_{ey}^{dis}\ \omega_{ez}^{dis}]^T$ with respect to the elevation frame, resulting in instability of the optical axis. The relationship can be obtained through two Euler rotations as follows:

$$\begin{bmatrix} \omega_{ex}^{dis} \\ \omega_{ey}^{dis} \\ \omega_{ez}^{dis} \end{bmatrix} = \begin{bmatrix} C_{\theta_a}C_{\theta_e} & S_{\theta_a}C_{\theta_e} & -S_{\theta_e} \\ -S_{\theta_a} & C_{\theta_a} & 0 \\ C_{\theta_a}S_{\theta_e} & S_{\theta_a}S_{\theta_e} & C_{\theta_e} \end{bmatrix} \begin{bmatrix} \omega_{bx} \\ \omega_{by} \\ \omega_{bz} \end{bmatrix}. \tag{1}$$

Hereafter, the abbreviations $S_{(\cdot)}$, $C_{(\cdot)}$ and $T_{(\cdot)}$ represent the trigonometric functions $\sin(\cdot)$, $\cos(\cdot)$ and $\tan(\cdot)$, respectively.
In order to compensate for the disturbing rotational movement of the carrier, the servo controller needs to output the compensating angular velocities $\dot{\theta}_e$ and $\dot{\theta}_a$, which are projected onto the angular velocity $\omega_e^{com} = [\omega_{ex}^{com}\ \omega_{ey}^{com}\ \omega_{ez}^{com}]^T$ with respect to the elevation frame. In components,

$$\begin{bmatrix} \omega_{ex}^{com} \\ \omega_{ey}^{com} \\ \omega_{ez}^{com} \end{bmatrix} = \begin{bmatrix} -\dot{\theta}_a S_{\theta_e} \\ \dot{\theta}_e \\ \dot{\theta}_a C_{\theta_e} \end{bmatrix}. \tag{2}$$
Taking a superposition of (1) and (2), we can obtain
$$\begin{aligned} \omega_{ex} &= -\dot{\theta}_a S_{\theta_e} + \omega_{bx} C_{\theta_a}C_{\theta_e} + \omega_{by} S_{\theta_a}C_{\theta_e} - \omega_{bz} S_{\theta_e} \\ \omega_{ey} &= \dot{\theta}_e - \omega_{bx} S_{\theta_a} + \omega_{by} C_{\theta_a} \\ \omega_{ez} &= \dot{\theta}_a C_{\theta_e} + \omega_{bx} C_{\theta_a}S_{\theta_e} + \omega_{by} S_{\theta_a}S_{\theta_e} + \omega_{bz} C_{\theta_e}, \end{aligned} \tag{3}$$

where $\omega_{ex}$, $\omega_{ey}$, $\omega_{ez}$ are the three angular velocity components with respect to the elevation frame. To keep the LOS stabilized about the azimuth and elevation axes, the following conditions are required:

$$\omega_{ey} = 0, \qquad \omega_{ez} = 0. \tag{4}$$
Substituting (3) into (4), we can obtain the compensative angular velocity equation as
$$\dot{\theta}_e = \omega_{bx} S_{\theta_a} - \omega_{by} C_{\theta_a}, \qquad \dot{\theta}_a = -\omega_{bx} C_{\theta_a}T_{\theta_e} - \omega_{by} S_{\theta_a}T_{\theta_e} - \omega_{bz}. \tag{5}$$
Substituting (1) into (5), the simplified compensative angular velocity equation can be expressed as follows
$$\dot{\theta}_e = -\omega_{ey}^{dis}, \qquad \dot{\theta}_a = -\frac{\omega_{ez}^{dis}}{C_{\theta_e}}. \tag{6}$$
The integrated three-axis gyroscope mounted on the elevation gimbal measures the angular velocities $\omega_{ey}$ and $\omega_{ez}$. The stabilization loop drives the servo actuators to rotate with the angular rates $\dot{\theta}_e$ and $\dot{\theta}_a$ around the elevation axis and azimuth axis, respectively. When (6) is satisfied, the ISP isolates the disturbing rotational movement of the carrier and thus stabilizes the optical axis.
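As a numerical check of the stabilization principle, the following sketch (Python rather than the MATLAB used in the paper; function names are illustrative) projects a body-frame disturbance into the elevation frame via the rotation of (1), computes the compensating gimbal rates of (6), and confirms through (3) that the stabilized LOS components vanish.

```python
import numpy as np

def disturbance_to_elevation(omega_b, theta_a, theta_e):
    """Project a body-frame disturbance into the elevation frame (Eq. (1))."""
    ca, sa = np.cos(theta_a), np.sin(theta_a)
    ce, se = np.cos(theta_e), np.sin(theta_e)
    R = np.array([[ca * ce, sa * ce, -se],
                  [-sa,     ca,      0.0],
                  [ca * se, sa * se,  ce]])
    return R @ omega_b

def compensating_rates(omega_e_dis, theta_e):
    """Gimbal rates that cancel the disturbance (Eq. (6))."""
    theta_e_dot = -omega_e_dis[1]
    theta_a_dot = -omega_e_dis[2] / np.cos(theta_e)
    return theta_e_dot, theta_a_dot

# Residual LOS rates of Eq. (3): both should vanish when Eq. (6) is applied.
omega_b = np.array([0.10, -0.20, 0.30])   # example disturbance, rad/s
theta_a, theta_e = 0.5, 0.3               # example gimbal angles, rad
dis = disturbance_to_elevation(omega_b, theta_a, theta_e)
te_dot, ta_dot = compensating_rates(dis, theta_e)
omega_ey = te_dot + dis[1]
omega_ez = ta_dot * np.cos(theta_e) + dis[2]
```

Both residuals evaluate to zero up to floating-point precision, which is the stabilization condition (4).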

2.3. Dynamic Model of the Gimbal

In order to facilitate the performance analysis of the system, we need to establish a mathematical model that accurately describes its dynamic behavior. Generally, theoretical modeling and experimental modeling are the two methods for establishing a mathematical model of an electromechanical system. Theoretical modeling is accomplished by analyzing the motion characteristics of the process with known laws. Experimental modeling is realized from measurements of the input and output signals containing the dynamic characteristics of the process. Due to the complexity of environmental conditions and the uncertainty of model parameters, there are some differences between the actual physical system and the mathematical model based on theoretical analysis. Furthermore, as an off-the-shelf product, the electromechanical parameters of a servo actuator are difficult to obtain. When the analytical method cannot accurately establish a mathematical model of the system, the experimental method shows its advantages. In this paper, the mathematical model of the system is established by combining theoretical analysis with system identification. Based on the theoretical analysis, the structure and order of the model are determined. Then, we use system identification to estimate the unknown parameters, so as to determine the best-fitting model for the measured system.
According to the operation principle, the electromechanical model of the servo actuator can be equivalent to a DC torque motor. The double gimbals and the camera are added to the shafts of servo actuators as inertial loads. The equivalent circuit diagram of the servo actuator is shown in Figure 2.
As shown in Figure 2, $U_a$ is the armature voltage of the motor, and $I_a$ is the armature current. $\Omega$ is the angular velocity of the output shaft of the servo motor. $L_a$ is the motor armature inductance, and $R_a$ is the motor armature resistance. $E_b$ is the counter electromotive force induced by $\Omega$. According to the analysis of mechanical and electrical principles [13,29], the block diagram of the gimbal electromechanical model is shown in Figure 3.
Generally, the transfer function of the current regulator $W_i(s)$ in Figure 3 is designed as a simple proportional–integral (PI) controller, which can be expressed as

$$W_i(s) = K_p\left(1 + \frac{1}{T_i s}\right), \tag{7}$$

where $K_p$ represents the proportional gain, and $T_i$ represents the integral time constant. $U_c$ is the normalized control input of the servo motor, and $M_m$ is the electromagnetic torque of the motor. $\theta$ is the rotational angle of the output shaft of the servo motor. $k_{pwm}$ denotes the power amplification factor of the pulse-width modulation (PWM) driver. $k_i$ indicates the calibration coefficient of the current loop. $C_m$ is the motor torque constant, and $C_e$ is the counter electromotive force constant. $J$ denotes the equivalent total moment of inertia on the motor shaft. According to the equivalent transformation rule of the block diagram, the transfer function between the normalized servo input $U_c$ and the gimbal angular velocity $\Omega$ takes the following form:

$$\frac{\Omega(s)}{U_c(s)} = \frac{K(T_z s + 1)}{(T_\omega^2 s^2 + 2\zeta T_\omega s + 1)(T_p s + 1)}. \tag{8}$$

From (8), we know that the gimbal dynamic model is a third-order under-damped system with a single zero; the expressions of the parameters are given as follows:
$$K = \frac{R_a}{k_i C_e}, \quad \zeta = \frac{T_i (R_a + K_p k_{pwm} k_i)}{2\sqrt{K_p k_{pwm} k_i T_i L_a}}, \quad T_\omega = \sqrt{\frac{T_i L_a}{K_p k_{pwm} k_i}}, \quad T_z = T_i, \quad T_p = \frac{J R_a}{C_e C_m}. \tag{9}$$
In order to estimate the unknown parameters of the linear system, frequency-domain system identification is adopted using the MATLAB System Identification app. According to the model structure obtained from the theoretical analysis, the recursive least-squares method is used to estimate the unknown parameters. Using sinusoidal signals as the excitation, the dynamic model of the gimbal is obtained by varying the frequency and amplitude of the input excitation signal. In order to ensure identification accuracy, input excitation signals spanning a 0.4 to 10 Hz frequency range are applied to identify and validate the model. The number of samples is 210, and the number of iterations is 20. Neglecting the dynamic coupling between the azimuth axis and elevation axis, the transfer function between the normalized servo input of the azimuth axis $u_{cx}$ and the azimuth angular velocity $\omega_{ez}$ can be expressed as
$$G_a(s) = \frac{0.3352\,(4.541 s + 1)}{\left(\dfrac{s^2}{10.48^2} + \dfrac{0.821 s}{10.48} + 1\right)\left(\dfrac{s}{4.328} + 1\right)}. \tag{10}$$
The transfer function between the normalized servo input of the elevation axis $u_{cy}$ and the elevation angular velocity $\omega_{ey}$ can be expressed as
$$G_e(s) = \frac{0.2557\,(6.058 s + 1)}{\left(\dfrac{s^2}{9.285^2} + \dfrac{0.9075 s}{9.285} + 1\right)\left(\dfrac{s}{4.604} + 1\right)}. \tag{11}$$
In order to validate the correctness of the identified models, the actual measured output signal is compared with the output simulated from the identified model and the input excitation signal. As shown in Figure 4 and Figure 5, the measured output signal, depicted by the solid blue line, and the simulated output signal, depicted by the dashed red line, fit well.
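The identified models (10) and (11) can also be inspected numerically. The sketch below (Python with plain complex arithmetic, in place of the paper's MATLAB tooling) evaluates both transfer functions at $s = j\omega$ and confirms the DC gains and high-frequency roll-off implied by their structure.

```python
def G_a(s):
    """Identified azimuth-axis dynamics, Eq. (10), at complex frequency s."""
    num = 0.3352 * (4.541 * s + 1)
    den = ((s / 10.48) ** 2 + 0.821 * (s / 10.48) + 1) * (s / 4.328 + 1)
    return num / den

def G_e(s):
    """Identified elevation-axis dynamics, Eq. (11)."""
    num = 0.2557 * (6.058 * s + 1)
    den = ((s / 9.285) ** 2 + 0.9075 * (s / 9.285) + 1) * (s / 4.604 + 1)
    return num / den

dc_gain_a = abs(G_a(0j))                    # steady-state gain of Eq. (10)
dc_gain_e = abs(G_e(0j))                    # steady-state gain of Eq. (11)
rolloff = abs(G_a(100j)) < abs(G_a(1j))     # magnitude falls at high frequency
```

Evaluating over a frequency grid in this way reproduces the Bode magnitude data used later for the loop-shaping design.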

2.4. Camera Motion and Interaction Matrix

In order to facilitate the design and implementation of a vision-based controller for the task of target pointing and tracking, a camera model must be developed. As shown in Figure 6, the frontal pinhole projection is a commonly used camera model; it is simple and convenient, placing the image plane in front of the optical center. The coordinates of the target expressed in the camera frame $O_cX_cY_cZ_c$ are given by $p_o = [x\ y\ z]^T$. The image plane $O_iX_iY_i$ is orthogonal to the optical axis, located at the focal distance $\lambda$ from the origin of the camera frame. The intersection of the image plane with the line connecting the object and the optical center has coordinates $p_i = [u\ w\ \lambda]^T$ expressed in the camera frame. Correspondingly, the vector $s = [u\ w]^T$ gives the image coordinates. According to the geometrical relationship of the pinhole projection model, we have
$$u = \frac{\lambda x}{z}, \qquad w = \frac{\lambda y}{z}. \tag{12}$$
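The projection relation above is a one-liner in code (Python sketch; the helper name is illustrative):

```python
def project(point, lam):
    """Frontal pinhole projection of Eq. (12): camera-frame point -> image coords."""
    x, y, z = point
    return lam * x / z, lam * y / z

# Example: a point 2 m in front of the camera, with lam = 7.5 (focal length).
u, w = project((0.2, -0.1, 2.0), lam=7.5)
```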
In general, the movement of the camera in inertial space is characterized by its translational velocity $v_c = [v_{cx}\ v_{cy}\ v_{cz}]^T$ and rotational velocity $\omega_c = [\omega_{cx}\ \omega_{cy}\ \omega_{cz}]^T$, both expressed in the camera frame. Stacking them together, a time-dependent vector $\xi(t) = [v_c(t)^T\ \omega_c(t)^T]^T \in \mathbb{R}^6$ is formed. We consider the simple case of a single point feature and assume that the ground object does not move. Extending the results stated here to the case of a moving ground target is feasible, but the resulting interaction matrix will be a function of the velocity of the ground object. The apparent motion of the object, described by the image feature velocity $\dot{s}(t)$, is obtained as the time derivative of the image feature vector $s(t)$. The velocity of the camera $\xi$ can be related to the velocity of the image point $\dot{s}$ by an interaction matrix transform, derived in detail in [26,30] as
$$\begin{bmatrix} \dot{u} \\ \dot{w} \end{bmatrix} = \begin{bmatrix} -\dfrac{\lambda}{z} & 0 & \dfrac{u}{z} & \dfrac{uw}{\lambda} & -\dfrac{\lambda^2 + u^2}{\lambda} & w \\ 0 & -\dfrac{\lambda}{z} & \dfrac{w}{z} & \dfrac{\lambda^2 + w^2}{\lambda} & -\dfrac{uw}{\lambda} & -u \end{bmatrix} \begin{bmatrix} v_{cx} \\ v_{cy} \\ v_{cz} \\ \omega_{cx} \\ \omega_{cy} \\ \omega_{cz} \end{bmatrix}. \tag{13}$$
Note that the interaction matrix is a function of the image coordinates of the point $p_o$ and of the depth $z$ of the point with respect to the camera frame. The focal length $\lambda$ is regarded as a fixed parameter. Therefore, this equation is typically written as
$$\dot{s}(t) = L(u, w, z)\,\xi(t). \tag{14}$$
It is useful to decouple and rewrite (14) as a composition of two parts:

$$\dot{s}(t) = L_v(u, w, z)\,v_c(t) + L_\omega(u, w)\,\omega_c(t). \tag{15}$$
In (15), the part corresponding to the translation of the camera frame, $L_v(u, w, z)$, is a function of both the image coordinates of the point and its depth, while the part corresponding to the rotation of the camera frame, $L_\omega(u, w)$, is a function of only the image coordinates of the point. This can be beneficial in real-world situations when the exact value of $z$ may not be known. In such a case, errors in the value of $z$ merely cause a scaling of the matrix $L_v(u, w, z)$, and this kind of scaling effect can be compensated by fairly simple control methods.
The relationship between the inertial angular rate vector $\omega_c$ with respect to the camera frame and the angular rate vector $\omega_e$ with respect to the elevation frame can be expressed using a constant rotation matrix $R_C^E$:

$$\omega_e = R_C^E \omega_c = \begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \omega_c. \tag{16}$$

3. Control Design

In practical application, the ISP is inevitably subject to a variety of carrier disturbances and external interferences. These disturbances will lead to a serious decline in the stabilization and accuracy of the system, meaning that the target cannot be tracked. Therefore, it is necessary to design a reliable servo controller to overcome the interference and improve the system performance. The superior control design for the ISP can effectively isolate the attitude change of the quadrotor UAV during the flight in order to ensure the stabilization of the optical axis, so that the camera can obtain a stable video image. Meanwhile, the control system steers the gimbal to move the target into the center of the field of view, so as to achieve stationary or moving target detection and tracking. In the following, the visual servoing scheme will be introduced in detail.

3.1. Control System Structure

The visual servoing control design adopts the cascaded control method. The inner loop is the speed loop, which controls the angular velocity of the gimbal to ensure the stabilization of the platform. The outer loop is the position loop, which controls the orientation of the optical axis to achieve the target tracking. As shown in Figure 7, the proposed cascaded controller is composed of the inertial rate stabilization controller and the visual tracking controller. The gyroscope is an inertial rate sensor of the stabilization controller, which can effectively isolate the disturbance of the carrier and stabilize the platform, ensuring that the camera can obtain a stable video image. Meanwhile, the visual tracking controller enables the camera gimbal to rotate as instructed, ensuring that the optical axis points to the desired orientation for dynamic target tracking. Next, the control algorithm design will be introduced in detail.

3.2. Inertial Rate Stabilization Controller

When the quadrotor UAV has a disturbing angular velocity with respect to the inertial coordinate, it will be transmitted to the camera frame through the mechanism of the gimbal, resulting in instability of the optical axis. Therefore, a reliable servo control algorithm based on the gyroscopic angular velocity signal needs to be designed to effectively isolate the angular velocity interference of the carrier, so that the optical axis can be stabilized at the preset orientation with respect to the inertial coordinate system, regardless of the movement of the carrier. As shown in Figure 7, the output of the visual tracking controller is used as the desired angular velocity of the inertial rate stabilization controller. It is compared with the corresponding axial angular velocity measured by the gyroscope to obtain the angular velocity error, which serves as the input of the inertial rate stabilization controller. The control output, which is calculated through the control algorithm, is sent to drive the servo actuator of the corresponding gimbal.
In order to facilitate the control design, the Bode diagrams of the azimuth axis and elevation axis are depicted in Figure 8 and Figure 9. The Bode diagrams of the uncalibrated system, calibrated system and closed-loop system are depicted by the dashed blue line, green dotted line and red solid line, respectively. According to the identified dynamics, the inertial rate stabilization controller is designed as
$$G_c(s) = \frac{K_p (T_d s + 1)}{T_z s + 1}, \tag{17}$$

where $K_p$ is the proportional gain, and $T_d$ denotes the time constant.
Proposition 1.
The control law (17) ensures the stability of the closed-loop system of the inertial rate tracking errors, provided that the control gain $K_p$ and time constant $T_d$ are selected to satisfy the following inequalities:

$$\begin{cases} 0 < K_p < \dfrac{2\zeta (T_\omega^2 + 2\zeta T_\omega T_p + T_p^2)}{K (T_\omega T_p - T_\omega T_d - 2\zeta T_p T_d)}, & 0 < T_d < T_d^{th} \\[2mm] K_p > 0, & T_d \ge T_d^{th}, \end{cases} \tag{18}$$

$$\begin{cases} T_d > 0, & 0 < K_p \le K_p^{th} \\[2mm] T_d > \dfrac{K_p K T_\omega T_p - 2\zeta (T_\omega^2 + 2\zeta T_\omega T_p + T_p^2)}{K_p K (T_\omega + 2\zeta T_p)}, & K_p > K_p^{th}, \end{cases} \tag{19}$$

where $T_d^{th} = \dfrac{T_\omega T_p}{T_\omega + 2\zeta T_p}$ and $K_p^{th} = \dfrac{2\zeta (T_\omega^2 + 2\zeta T_\omega T_p + T_p^2)}{K T_\omega T_p}$.
Proof. 
According to the control law (17), the characteristic polynomial of the closed-loop system is

$$D(s) = a_0 s^3 + a_1 s^2 + a_2 s + a_3, \tag{20}$$

where $a_0 = T_\omega^2 T_p$, $a_1 = T_\omega^2 + 2\zeta T_\omega T_p$, $a_2 = 2\zeta T_\omega + T_p + K_p K T_d$ and $a_3 = K_p K + 1$.
In order to ensure the stability of the closed-loop system of $e_\omega$, the inequalities (18) and (19) can be deduced from the Routh–Hurwitz stability criterion. □
As shown in Figure 8, the azimuth-axis steady-state error for a unit step input of the closed-loop system is 0.2716 when $K_p = 8$ is chosen. The cut-off frequency of the uncalibrated system is 76.5 rad/s, and the phase margin is 9.6 deg. In order to improve the phase margin, lag calibration is adopted. The phase margin is increased to 27.1 deg at 25.3 rad/s when $T_d = 0.4541$ is chosen, and the bandwidth of the closed-loop system is 41.67 rad/s. As shown in Figure 9, the elevation-axis steady-state error for a unit step input of the closed-loop system is 0.3283 when $K_p = 8$ is chosen. The cut-off frequency of the uncalibrated system is 70.5 rad/s, and the phase margin is 10.5 deg. In order to improve the phase margin, lag calibration is adopted. The phase margin is increased to 30.8 deg at 23 rad/s when $T_d = 0.6058$ is chosen, and the bandwidth of the closed-loop system is 39.17 rad/s.
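For the third-order polynomial (20), Proposition 1 amounts to requiring all coefficients positive plus the Routh–Hurwitz condition $a_1 a_2 > a_0 a_3$. The sketch below (Python; the parameter values are read off the identified azimuth model (10) and the gains quoted above) checks the chosen design point, and also shows how an overly aggressive gain without lead action fails the test.

```python
def stabilization_loop_stable(Kp, Td, K, T_w, zeta, T_p):
    """Routh-Hurwitz test for the closed-loop characteristic polynomial (20)."""
    a0 = T_w ** 2 * T_p
    a1 = T_w ** 2 + 2 * zeta * T_w * T_p
    a2 = 2 * zeta * T_w + T_p + Kp * K * Td
    a3 = Kp * K + 1
    return min(a0, a1, a2, a3) > 0 and a1 * a2 > a0 * a3

# Azimuth axis from Eq. (10): K = 0.3352, T_w = 1/10.48, 2*zeta = 0.821, T_p = 1/4.328.
azimuth_ok = stabilization_loop_stable(Kp=8.0, Td=0.4541, K=0.3352,
                                       T_w=1 / 10.48, zeta=0.821 / 2,
                                       T_p=1 / 4.328)
```

With the designed gains the check passes, whereas a very large $K_p$ with $T_d = 0$ violates $a_1 a_2 > a_0 a_3$.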

3.3. Visual Tracking Controller

The visual tracking control design is based on the camera model in (13). A tracking error in the image plane is defined as

$$e(t) = s_{ref}(t) - s(t), \tag{21}$$

where $s_{ref} = [u_{ref}\ w_{ref}]^T$ denotes the reference trajectory. Taking the time derivative of $e$, the open-loop visual tracking error dynamics can be obtained as

$$\dot{e} = \dot{s}_{ref} - L_v(u, w, z)\,v_c - L_\omega(u, w)\,\omega_c. \tag{22}$$
Let the angular velocity tracking error be defined as

$$e_\omega(t) = \omega_c^{des}(t) - \omega_c(t), \tag{23}$$

where $\omega_c^{des} = [\omega_{cx}^{des}\ \omega_{cy}^{des}\ \omega_{cz}^{des}]^T$ denotes the desired angular velocity with respect to the camera frame. According to the constant rotation matrix in (16), the open-loop visual tracking error dynamics can be rewritten as
$$\dot{e} = \dot{s}_{ref} - L_v(u, w, z)\,v_c + L_\omega(u, w)\begin{bmatrix} e_{\omega y} \\ e_{\omega z} \end{bmatrix} - \begin{bmatrix} w \\ -u \end{bmatrix}\omega_{ex} - L_\omega(u, w)\begin{bmatrix} \omega_{ey}^{des} \\ \omega_{ez}^{des} \end{bmatrix}, \tag{24}$$

where $L_\omega(u, w) = \begin{bmatrix} \dfrac{uw}{\lambda} & -\dfrac{\lambda^2 + u^2}{\lambda} \\ \dfrac{\lambda^2 + w^2}{\lambda} & -\dfrac{uw}{\lambda} \end{bmatrix}$. Based on the structure of (24) and the subsequent stability analysis, the desired angular velocity vector is designed as
$$\begin{bmatrix} \omega_{ey}^{des} \\ \omega_{ez}^{des} \end{bmatrix} = L_\omega^{-1}\left( \dot{s}_{ref} - \begin{bmatrix} w \\ -u \end{bmatrix}\omega_{ex} + \rho\,\mathrm{sgn}(e) + k e \right), \tag{25}$$
where $\rho, k \in \mathbb{R}^+$ are control gains. The inverse matrix can be expressed as

$$L_\omega^{-1} = \frac{1}{\lambda(\lambda^2 + u^2 + w^2)}\begin{bmatrix} -uw & \lambda^2 + u^2 \\ -(\lambda^2 + w^2) & uw \end{bmatrix}. \tag{26}$$
After substituting (25) into (24), the closed-loop dynamics of $e(t)$ can be obtained:

$$\dot{e} = -L_v(u, w, z)\,v_c + L_\omega(u, w)\begin{bmatrix} e_{\omega y} \\ e_{\omega z} \end{bmatrix} - \rho\,\mathrm{sgn}(e) - k e. \tag{27}$$
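A compact implementation of the outer-loop law (25) is sketched below (Python/NumPy; variable names are illustrative). Rather than hard-coding the inverse (26), it solves the 2×2 system directly, which is numerically equivalent.

```python
import numpy as np

def desired_gimbal_rates(e, s_ref_dot, u, w, lam, omega_ex, rho, k):
    """Feedback linearization-based sliding mode tracking law, Eq. (25)."""
    L_w = np.array([[u * w / lam, -(lam ** 2 + u ** 2) / lam],
                    [(lam ** 2 + w ** 2) / lam, -u * w / lam]])
    rhs = s_ref_dot - np.array([w, -u]) * omega_ex + rho * np.sign(e) + k * e
    return np.linalg.solve(L_w, rhs)   # = L_w^{-1} @ rhs, cf. Eq. (26)

# Example: 12 px horizontal / -7 px vertical error, stationary reference.
e = np.array([12.0, -7.0])
rates = desired_gimbal_rates(e, s_ref_dot=np.zeros(2), u=10.0, w=5.0,
                             lam=7.5, omega_ex=0.05, rho=0.2, k=1.5)
```

The solve-based form avoids re-deriving (26) by hand while producing the same rates wherever $L_\omega$ is invertible.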
Proposition 2.
The control law (25) ensures global exponential convergence of the visual tracking error,

$$\lim_{t \to \infty} e(t) = 0, \tag{28}$$

provided that the control gain $\rho$ is selected to satisfy the following inequality:

$$\left\| L_v(u, w, z)\,v_c - L_\omega(u, w)\begin{bmatrix} e_{\omega y} \\ e_{\omega z} \end{bmatrix} \right\| \le \rho. \tag{29}$$
Proof. 
To prove the above proposition, we define a Lyapunov function candidate $V(t) \in \mathbb{R}$ as follows:

$$V = \frac{1}{2} e^T e. \tag{30}$$

Taking the time derivative of (30) and substituting (27) into the resulting equation, we obtain

$$\dot{V} = -e^T L_v(u, w, z)\,v_c + e^T L_\omega(u, w)\begin{bmatrix} e_{\omega y} \\ e_{\omega z} \end{bmatrix} - \rho\,e^T \mathrm{sgn}(e) - k\,e^T e. \tag{31}$$

By using the condition in (29), the expression in (31) can be upper-bounded by

$$\dot{V} \le -k\,e^T e = -2 k V. \tag{32}$$
From (32), it can be concluded that $V(t)$ converges exponentially. Since $V(t)$ in (30) is a non-negative function, we conclude that $V(t) \in \mathcal{L}_\infty$. According to (32), $e(t)$ is exponentially stable. Thus, the result in (28) is proven. □

4. Numerical Simulations

This section presents the simulation results of the proposed control algorithm. The resolution of the camera CCD chip is 320 × 240 pixels. The geometric parameter of camera focal length λ = 7.5 mm is estimated by the MATLAB Camera Calibrator App. Closed-loop responses with the proposed controller are simulated when the image of the observed object is initially located outside the center of the image frame. The control goal is to bring the observed object into the center of the field of view. The carrier disturbing angular velocity caused by a gust or turbulence and the target moving velocity are shown in Figure 10, and this will be transmitted to the camera frame through the mechanism of the gimbal.
In order to validate the superior performance of the vision-based tracking controller, comparative numerical simulations between the proposed controller and a cascaded PID controller are conducted. The sampling time of the numerical simulations is 0.01 s. The simulation results of the proposed controller and the cascaded PID controller are depicted by the solid blue line and dashed green line, respectively. To avoid the chattering generated by the signum function in (25), we replace $\mathrm{sgn}(e)$ with a saturation function $\mathrm{sat}(e/B)$, where $B$ is the boundary layer thickness. The trajectories of the horizontal and vertical tracking errors in the image pixel coordinate frame are illustrated in Figure 11. When the miniature quadrotor UAV has a disturbing angular velocity $\omega_{bz}$ with respect to the body coordinate system from 14 s to 16 s, and the target moves with a sinusoidal translational velocity $v_{cz}$ from 20 s to 25 s, the maximum horizontal tracking error $u$ is within ±10 px for the proposed controller, whereas it is within ±30 px for the cascaded PID controller. Moreover, the horizontal tracking error $u$ of the proposed controller is ultimately driven to 0, and its convergence speed is obviously faster than that of the cascaded PID controller. Therefore, we can conclude that the disturbance rejection ability and tracking performance of the proposed controller are better than those of the cascaded PID controller in the presence of carrier disturbance and target movement. The normalized control inputs of the azimuth axis and elevation axis are provided in Figure 12. The control inputs of the proposed controller vary more smoothly than those of the cascaded PID controller, and all remain within reasonable bounds.
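The boundary-layer substitution used in the simulations is a one-liner (Python/NumPy sketch; $B$ is the boundary layer thickness from the text):

```python
import numpy as np

def sat(e, B):
    """Saturation replacement for sgn(e): linear inside the boundary |e| < B."""
    return np.clip(e / B, -1.0, 1.0)

# Outside the boundary layer sat behaves like sgn; inside it is proportional.
smooth = sat(np.array([-5.0, 0.5, 5.0]), B=1.0)
```

Inside the layer the discontinuous switching term becomes a proportional term, which trades a small steady-state band for chatter-free control inputs.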

5. Experimental Results

In order to validate the target detection and tracking performance of the ISP, we customized an experimental prototype of the miniature pan-tilt ISP. As shown in Figure 13, the hardware of the self-developed miniature pan-tilt ISP consists of a servo controller, servo actuators, an inertial measurement unit (IMU), a vision sensor, etc. The volume of the miniature ISP is 85 mm × 38 mm × 75 mm and the total weight is only 60 g, so it is much smaller and lighter than other commercial ISPs and well suited to a miniature UAV with a very limited load capacity. The servo actuators installed in the azimuth gimbal and elevation gimbal steer each gimbal around its corresponding shaft. The vision sensor and the IMU are mounted on the elevation gimbal, while the servo controller and the other electronic equipment are installed in the electronic equipment compartment of the quadrotor UAV. To facilitate the control design and program debugging, HCS is adopted as the avionics architecture of the miniature ISP [28].

5.1. Target Detection

In the target detection and tracking mission, the selected target is a small white ball. Accomplishing this mission requires video image acquisition and preprocessing. First, we create a video input object in the MATLAB environment and set the video device to grayscale mode. Next, we create a timer object and set its callback function to trigger the acquisition event; once the timer starts, the video image acquisition and preprocessing program executes periodically. As shown in Figure 14, images are acquired by triggered capture, and the image preprocessing is then executed in the timer callback function. According to a preset threshold, the gray image is converted into a binary image, which reduces the computational load and improves the real-time tracking performance. Afterwards, small objects generated by noise are removed from the binary image by a filtering algorithm based on mathematical morphology. The outer boundary of each region in the filtered binary image is found with an edge detection algorithm, and the geometric properties of the region, such as area and center coordinates, are calculated. Based on thresholds set in advance, each region is traversed according to these geometric parameters to recognize the target. As shown in Figure 14, the tracked target in a natural background is marked with a small circle at the center of the ball. After the target region has been located in the image, the image-based tracking algorithm is performed.
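The preprocessing chain above (threshold, noise removal, region properties, recognition by geometric thresholds) can be sketched outside MATLAB as well. The following pure-Python version is illustrative only: the threshold and area parameters are hypothetical, and a connected-component area test stands in for the morphological filtering and edge detection used in the paper:

```python
from collections import deque

def detect_target(gray, threshold=128, min_area=20, max_area=500):
    """Illustrative sketch of the detection pipeline (hypothetical
    parameter values): binarize the gray image against a threshold,
    label connected bright regions, discard blobs whose area falls
    outside [min_area, max_area] (noise removal + recognition in one
    test), and return the centroid of the first matching region.

    gray: 2-D list of intensities in [0, 255].
    Returns a (row, col) centroid, or None if no region qualifies.
    """
    rows, cols = len(gray), len(gray[0])
    # Step 1: convert the gray image into a binary image.
    binary = [[1 if px >= threshold else 0 for px in row] for row in gray]
    seen = [[False] * cols for _ in range(rows)]

    for r0 in range(rows):
        for c0 in range(cols):
            if binary[r0][c0] == 0 or seen[r0][c0]:
                continue
            # Step 2: flood-fill one connected region (4-connectivity).
            region, queue = [], deque([(r0, c0)])
            seen[r0][c0] = True
            while queue:
                r, c = queue.popleft()
                region.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and binary[nr][nc] and not seen[nr][nc]):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            # Step 3: geometric test -- the area bounds play the role of
            # both the morphology-style noise filter and the recognition
            # thresholds set in advance.
            if min_area <= len(region) <= max_area:
                cr = sum(r for r, _ in region) / len(region)
                cc = sum(c for _, c in region) / len(region)
                return cr, cc
    return None
```

The returned centroid corresponds to the center coordinates that the tracking loop drives toward the image center; in the paper this step is performed in MATLAB with morphological filtering and edge detection rather than the simplified area test shown here.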

5.2. Target Tracking Experiments

The whole target tracking mission is carried out in an indoor environment. The HCS avionics architecture introduced in [28] is used to validate the functionality and performance of the proposed visual servoing scheme. The experimental tests can be conducted once the electrical connections between the ISP and the GCS are completed, provided that the small ball to be tracked is within the field of view of the airborne camera.
In order to further demonstrate the performance of the target tracking control scheme, comparative experimental results between the proposed controller and the cascaded PID controller of our previous conference paper [28] are given in this section. The control period of the target tracking experiments is 0.1 s. The experimental results of the proposed controller and the cascaded PID controller are depicted by the solid blue line and the dashed green line, respectively, and the trajectories of the horizontal and vertical tracking errors in the image pixel coordinate system are illustrated in Figure 15. For the proposed controller, the carrier disturbance occurs between 7 s and 12 s and the target moves after 17 s. The maximum horizontal tracking error u stays within ±40 px, the maximum vertical tracking error w stays within ±30 px, and both are ultimately driven near 0, which means that the proposed controller steers the gimbal to bring the target into the center of the field of view. The comparative tracking error trajectories of the cascaded PID controller are also given in Figure 15; in that test, the carrier disturbance occurs between 2 s and 7 s and the target moves after 11 s. Its maximum horizontal tracking error u reaches ±60 px, its maximum vertical tracking error w exceeds ±40 px, and neither is driven to 0 by the end of the test. Thus, the disturbance rejection ability and tracking performance of the cascaded PID controller are weaker than those of the proposed controller. The trajectories of the azimuth and elevation angular velocities are shown in Figure 16, where it can be seen that the angular velocities are driven to 0 quickly in the presence of external disturbance and target movement.
The changes in angular velocity, especially ω_ey for the cascaded PID controller, are clearly sharper than those of the proposed controller, which means that the camera image becomes blurred when the cascaded PID controller is used in the presence of carrier disturbance and target movement. The normalized servo inputs of the azimuth axis and elevation axis are provided in Figure 17; the control inputs of the proposed controller vary more smoothly than those of the cascaded PID controller and remain within reasonable values. We can therefore conclude that the proposed controller achieves satisfactory tracking performance in the experimental tests in the presence of carrier disturbance and target movement.

6. Conclusions

This paper has presented a new visual servoing scheme for a miniature pan-tilt ISP. For the purpose of control design, the mathematical model of the ISP is established by combining theoretical analysis and system identification. The proposed controller adopts a cascaded structure based on the identified dynamics: the inner loop uses a proportional lag compensator to ensure the stabilization of the ISP, and the outer loop employs a feedback linearization-based sliding mode control method to achieve target tracking. The parameter domains that guarantee global exponential stability of the closed-loop system are stated as propositions and proven by a frequency-domain approach and a Lyapunov-based method. The superior performance of the proposed controller over a traditional cascaded PID controller is verified by numerical simulations and experimental tests in the presence of carrier disturbance and target movement. In the near future, the miniature pan-tilt ISP will be installed on a quadrotor UAV for actual flight experiments.

Author Contributions

Conceptualization, J.G.; project administration, C.Y.; data curation and interpretation, X.Z.; writing—original draft preparation, F.C.; writing—review and editing, J.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Key Project of Tianjin National Science Foundation (No. 18JCZDJC96700), the Scientific Research Project of Tianjin Education Commission (No. 2019KJ013) and the National College Students Innovation Training Program (No. 201910058008).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Quan, Q. Introduction to Multicopter Design and Control; Springer Nature: Singapore, 2017; pp. 3–5.
  2. Madison, R.; Andrews, G.; DeBitetto, P.; Rasmussen, S.; Bottkol, M. Vision-aided navigation for small UAVs in GPS-challenged environments. In Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Rohnert Park, CA, USA, 7–10 May 2007.
  3. Kendoul, F. Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems. J. Field Robot. 2012, 29, 315–378.
  4. Floreano, D.; Wood, R.J. Science, technology and the future of small autonomous drones. Nature 2015, 521, 460–466.
  5. Hilkert, J.M.; Hullender, D.A. Adaptive control system techniques applied to inertial stabilization systems. In Proceedings of the International Society for Optical Engineering, Rosemont, IL, USA, 27–28 September 1990; Volume 1304, pp. 190–206.
  6. Nie, J. Fuzzy control of multivariable nonlinear servomechanisms with explicit decoupling scheme. IEEE Trans. Fuzzy Syst. 1997, 5, 304–311.
  7. Lee, T.H.; Ge, S.S.; Wong, C.P. Adaptive neural network feedback control of a passive line-of-sight stabilization system. Mechatronics 1998, 8, 887–903.
  8. Osborne, J.; Hicks, G.; Fuentes, R. Global analysis of the double-gimbal mechanism. IEEE Contr. Syst. Mag. 2008, 28, 44–64.
  9. Ji, W.; Li, Q.; Zhao, D.; Fang, S. Adaptive fuzzy PID composite control with hysteresis-band switching for line of sight stabilization servo system. Aerosp. Sci. Technol. 2011, 15, 25–32.
  10. Abdo, M.M.; Vali, A.R.; Toloei, A.R.; Arvan, M.R. Stabilization loop of a two axes gimbal system using self-tuning PID type fuzzy controller. ISA Trans. 2014, 53, 591–602.
  11. Liu, F.; Wang, H.; Shi, Q.; Wang, H.; Zhang, M.; Zhao, H. Comparison of an ANFIS and fuzzy PID control model for performance in a two-axis inertial stabilized platform. IEEE Access 2017, 5, 12951–12962.
  12. Ambrose, H.; Qu, Z.; Johnson, R. Nonlinear robust control for a passive line-of-sight stabilization system. In Proceedings of the 2001 IEEE International Conference on Control Applications, Mexico City, Mexico, 7–10 September 2001; pp. 942–947.
  13. Kim, S.B.; Kim, S.H.; Kwak, Y.K. Robust control for a two-axis gimbaled sensor system with multivariable feedback systems. IET Control Theory Appl. 2010, 4, 539–551.
  14. Řezáč, M.; Hurák, Z. Structured MIMO H∞ design for dual-stage inertial stabilization: Case study for HIFOO and Hinfstruct solvers. Mechatronics 2013, 4, 1084–1093.
  15. Lei, X.; Zou, Y.; Dong, F. A composite control method based on the adaptive RBFNN feedback control and the ESO for two-axis inertially stabilized platforms. ISA Trans. 2015, 59, 424–433.
  16. Safa, A.; Abdolmalaki, R.Y.; Dong, F. Robust output feedback tracking control for inertially stabilized platforms with matched and unmatched uncertainties. IEEE Trans. Contr. Syst. Technol. 2019, 27, 118–131.
  17. Lee, D.H.; Tran, D.Q.; Kim, Y.B.; Chakir, S. A robust double active control system design for disturbance suppression of a two-axis gimbal system. Electronics 2020, 9, 1638.
  18. Hilkert, J.M. Inertially stabilized platform technology: Concepts and principles. IEEE Contr. Syst. Mag. 2008, 28, 26–46.
  19. Masten, M.K. Inertially stabilized platforms for optical imaging systems: Tracking dynamic targets with mobile sensors. IEEE Contr. Syst. Mag. 2008, 28, 47–64.
  20. Zhou, X.; Zhang, H.; Yu, R. Decoupling control for two-axis inertially stabilized platform based on an inverse system and internal model control. Mechatronics 2014, 24, 1203–1213.
  21. Dong, F.; Lei, X.; Chou, W. A dynamic model and control method for a two-axis inertially stabilized platform. IEEE Trans. Ind. Electron. 2017, 64, 432–439.
  22. Hurák, Z.; Řezáč, M. Combined line-of-sight inertial stabilization and visual tracking: Application to an airborne camera platform. In Proceedings of the 48th IEEE Conference on Decision and Control, Shanghai, China, 15–18 December 2009; pp. 8458–8463.
  23. Hurák, Z.; Řezáč, M. Image-based pointing and tracking for inertially stabilized airborne camera platform. IEEE Trans. Contr. Syst. Technol. 2012, 20, 1146–1159.
  24. Olivares-Méndez, M.A.; Campoy, P.; Martínez, C.; Mondragón, I. A pan-tilt camera fuzzy vision controller on an unmanned aerial vehicle. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 2879–2884.
  25. Rajesh, R.J.; Ananda, C.M. PSO tuned PID controller for controlling camera position in UAV using 2-axis gimbal. In Proceedings of the 2015 International Conference on Power and Advanced Control Engineering, Bengaluru, India, 12–14 August 2015; pp. 128–133.
  26. Spong, M.W.; Hutchinson, S.; Vidyasagar, M. Robot Modeling and Control; John Wiley: New York, NY, USA, 2006; pp. 355–374.
  27. Nonami, K.; Kendoul, F.; Suzuki, S.; Wang, W.; Nakazawa, D. Autonomous Flying Robots: Unmanned Aerial Vehicles and Micro Aerial Vehicles; Springer: Tokyo, Japan, 2010; pp. 121–122.
  28. Liu, B.; Guo, J.; Rong, J.; Li, B. Cascaded control design for a stabilized pan-tilt camera platform on a quadrotor UAV. In Proceedings of the 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems, Tianjin, China, 19–23 July 2018; pp. 907–912.
  29. Liu, B.; Wang, C.; Li, W.; Li, Z. Robust controller design using the Nevanlinna-Pick interpolation in gyro stabilized pod. Discret. Dyn. Nat. Soc. 2010, 2010.
  30. Zhao, G.; Chen, G.; Chen, J.; Hua, C. Finite-time control for image-based visual servoing of a quadrotor using nonsingular fast terminal sliding mode. Int. J. Control Autom. Syst. 2020, 18, 2337–2348.
Figure 1. Coordinate systems of the ISP.
Figure 2. Equivalent circuit diagram.
Figure 3. Gimbal electromechanical model.
Figure 4. Model validation of azimuth axis.
Figure 5. Model validation of elevation axis.
Figure 6. Frontal pinhole projection model.
Figure 7. Cascaded control system structure.
Figure 8. Bode diagram of azimuth axis.
Figure 9. Bode diagram of elevation axis.
Figure 10. Carrier disturbing angular velocity and target moving velocity in numerical simulations.
Figure 11. Trajectories of horizontal and vertical tracking errors in numerical simulations.
Figure 12. Normalized control inputs of azimuth axis and elevation axis in numerical simulations.
Figure 13. Experimental prototype of miniature pan-tilt ISP.
Figure 14. Image acquisition and preprocessing.
Figure 15. Trajectories of horizontal and vertical tracking errors in experimental tests.
Figure 16. Trajectories of azimuth and elevation angular velocities in experimental tests.
Figure 17. Normalized control inputs of azimuth gimbal and elevation gimbal in experimental tests.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
