Article

New Approaches to the Integration of Navigation Systems for Autonomous Unmanned Vehicles (UAV)

Institute for Information Transmission Problems RAS, Bolshoy Karetny per. 19, build.1, Moscow 127051, Russia
* Author to whom correspondence should be addressed.
Sensors 2018, 18(9), 3010; https://doi.org/10.3390/s18093010
Submission received: 1 June 2018 / Revised: 31 August 2018 / Accepted: 4 September 2018 / Published: 8 September 2018
(This article belongs to the Special Issue GNSS and Fusion with Other Sensors)

Abstract

The article presents an overview of theoretical and experimental work on the estimation of unmanned aerial vehicle (UAV) motion parameters based on the integration of video measurements obtained by the on-board optoelectronic camera with data from the UAV's own inertial navigation system (INS). Various approaches described in the literature show good characteristics in computer simulations or in fairly simple, near-laboratory conditions; their use, however, demonstrates the considerable complexity of adapting camera parameters to the changing conditions of a real flight. In our experiments, we applied computer simulation methods to real images and used processing methods for videos obtained during real flights. For example, it was noted that using reference images that differ strongly in scale and aspect angle from the images observed in flight makes the methodology of singular points very difficult to apply. At the same time, matching the observed and reference images using rectilinear segments, such as images of road sections and the walls of buildings, looks quite promising. In addition, in our experiments we used frame-to-frame computation of the projective transformation matrix, which, together with filtering estimates of the coordinate and angular velocities, provides additional possibilities for estimating the UAV position. Data on UAV position determination based on video navigation methods obtained during real flights are presented. New approaches to video navigation based on the conjugation of rectilinear segments, characteristic curvilinear elements and the segmentation of textured and colored regions are demonstrated. The application of the frame-to-frame projective transformation computation is also shown; it gives estimates of the displacements and rotations of the vehicle and thereby serves the UAV position estimation by filtering. Thus, the aim of the work was to analyze various approaches to UAV navigation using video data as an additional source of information about the position and velocity of the vehicle.

1. Introduction

The integration of observation channels in control systems of objects subjected to perturbations and motion measurement errors is based on the theory of observation control, which originated in the early 1960s. The first works on this topic were based on a simple Kalman filter property, namely the possibility of determining the root-mean-square estimation error in advance, without observations, by solving the Riccati equation for the error covariance matrix [1]. The development of this methodology made it possible to solve problems with a combination of discrete and continuous observations for stochastic systems of discrete-continuous type. At the same time, methods were developed for solving problems with constraints imposed on the composition of observations, as well as temporal and energy constraints both on separate channels and in aggregate. For a wide class of problems with convex structure, necessary and sufficient conditions for optimality were obtained, both in the form of dynamic programming equations and of the generalized maximum principle, which opens the possibility of a numerical solution [2,3]. The tasks of integrating surveillance and control systems for UAVs open a new wide field of application for observation control methods, especially when performing autonomous flight tasks. One of the most important problems is the detection of erroneous operation of individual observation subsystems, in which case the solution of navigational tasks should be redistributed or transferred to backup subsystems or to other systems operating on different physical principles [4].
A typical example is navigation via satellite channels such as the global positioning system (GPS), which is quite reliable in simple flight conditions; in complex terrain (mountains, gorges), however, it is necessary to determine the vehicle position with the help of other systems based, for example, on landmarks observed either with optoelectronic cameras or with radar [5,6,7,8]. Here the serious problem of converting the signals of these systems into data suitable for navigation arises. A human operator copes with this task on the basis of training; for a computer it remains a serious problem in the computer vision area and one of the mainstream topics in UAV autonomous flight [9]. Meanwhile, the prospects for creating artificial intelligence systems of this level for UAV applications are still far from reality.
At the same time, the implementation of simple flight tasks, such as reaching the aerial survey area or tracking a reference trajectory [10,11,12], the organization of data transfer under limited time and energy storage [13,14], and even landing [15], is quite accessible to UAVs operating in autonomous mode with reliable navigation aids.
Unmanned aerial, land-based and underwater vehicles that perform autonomous missions use, as a rule, an on-board navigation system supplemented by sensors of various physical nature. At the same time, unlike remote control systems in which these sensors present information in a form as operator-friendly as possible, here the measurement results must be converted into the input signals of the control system, which requires other approaches. This is especially evident in the example of an optical or optoelectronic surveillance system, whose purpose in remote control mode is to provide the operator with the best possible image of the surrounding terrain. In an autonomous flight, by contrast, the observing system should be able to search for characteristic objects in the observed landscape, give the control system their coordinates and estimate the distances between them. Of course, the issues of providing an excellent image and of determining the metric properties of the observed images are connected and in no way cancel one another. However, what a human operator does automatically, based on a sufficiently high-quality image of the terrain, the readings of other sensors and undoubtedly on previous experience, the control system algorithm must do by using data from video and other systems, with the same accuracy as the human operator.
In this paper, various approaches to the integration of surveillance channels for UAVs are presented, some of which were tested on real data obtained during flights with registration of the underlying earth's surface by an on-board video camera. In our previous works [16,17] we examined the approach to video navigation using the methodology of detecting special "points", the coordinates of which are predetermined on a reference image of the terrain formed on the basis of satellite or aerial imagery [12]. However, in the practical use of this method, difficulties were encountered in identifying these specific points. These difficulties are associated with a significant difference in the quality of the on-board and reference images together with an essential difference in scales. Therefore, when using real images, the best results were obtained by matching images not by points but by rectilinear segments, such as images of road sections and walls of buildings [18]. This algorithm is a new modification of navigation by the "sparse" optical flow (OF). Moreover, in areas of non-conjugation of the template and on-board images, for tracing the trajectory of the UAV it is necessary to apply methods based on the so-called "dense" OF with the determination of the angular and linear velocities of the camera.
In Section 2 we present a review of various approaches applied to video navigation and tested during work performed by a research group at IITP RAS. Then in Section 3 we consider various approaches to filtering in the estimation of the UAV position. The approach to navigation based on computing projective matrices between successive frames registered by the on-board camera is presented together with the joint filtering algorithm in Section 4. Section 5 concludes the paper.
We should underline that the main contribution of the paper is a review of different approaches to UAV video navigation, along with results of some experiments related to possible new developments which look promising for the implementation of long-term autonomous missions performed by multi-purpose UAVs.

2. General Approach to the Data Fusion Model of UAV Navigation

2.1. UAV Motion Model

In modern conditions, it becomes extremely important to fulfill UAV missions without using external navigation systems; this is why in this paper we focus on the benefits which can be derived directly from video data and on how to convert them into a form suitable for the input of the UAV control system. The solution of autonomous UAV navigation tasks requires obtaining estimates of the coordinates and the camera orientation angles, as well as of the coordinate velocities of the vehicle itself and the angular velocities of the camera orientation change. For navigation one can use a simple UAV model taking into account kinematic relations for position, velocities and accelerations such as (1), which in discrete time has the form
$$X(t_{k+1}) = X(t_k) + V(t_k)\,\Delta t + \mathrm{ACC}(t_k)\,\frac{\Delta t^2}{2},\qquad V(t_{k+1}) = V(t_k) + \mathrm{ACC}(t_k)\,\Delta t,\qquad \mathrm{ACC}(t_k) = U(t_k) + W(t_k). \qquad (1)$$
Here $t_k$ is a sequence of discrete times such that $t_{k+1} - t_k = \Delta t$, $X(t_k) \in \mathbb{R}^3$ is the vector of current UAV coordinates in the Earth coordinate system, $V(t_k) \in \mathbb{R}^3$ is the velocity vector, $\mathrm{ACC}(t_k) \in \mathbb{R}^3$ is the acceleration vector, $U(t_k)$ is the vector of programmed accelerations from the UAV control system, and $W(t_k)$ is a vector of perturbations including aerodynamic influences and control system errors.
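As a minimal illustration, the following sketch propagates the kinematic model (1) in discrete time; the time step, programmed acceleration and perturbation level are illustrative values, not taken from the paper.

```python
import numpy as np

def propagate(X, V, U, dt, sigma_w, rng):
    """One step of the kinematic model (1)."""
    W = rng.normal(0.0, sigma_w, size=3)          # perturbations: aerodynamics, control errors
    ACC = U + W                                   # total acceleration
    X_next = X + V * dt + ACC * dt ** 2 / 2.0     # position update
    V_next = V + ACC * dt                         # velocity update
    return X_next, V_next

# illustrative use: 10 s of level flight with a small programmed lateral acceleration
rng = np.random.default_rng(0)
X, V = np.zeros(3), np.array([20.0, 0.0, 0.0])    # position [m], velocity [m/s]
for _ in range(100):
    X, V = propagate(X, V, U=np.array([0.0, 0.1, 0.0]), dt=0.1, sigma_w=0.05, rng=rng)
```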
For navigation purposes we must obtain current estimates of the attitude and velocity.
Thus, to solve the navigation problems, the motion model must be complemented with a model of observations, which can contain, in explicit or implicit form, the current coordinates and/or velocities and possibly accelerations. Typically, this information comes from the inertial navigation system (INS) and from the sensor system or the global satellite navigation system, and it serves as an additional means which increases the accuracy and detects failures in the navigation system. For autonomous UAV flights this additional system is essential, and while a trained pilot uses such information automatically, for a UAV it is necessary to convert the video information into data suitable for use by the vehicle control system. Below we give a series of examples of video features used for navigation.

2.2. New Possibilities Related to the Usage of On-Board Opto-Electronic Cameras

The use of opto-electronic cameras aboard the UAV opens a multitude of ways to separately or jointly evaluate the coordinates and velocities that characterize the position of the UAV and the orientation of the surveillance system. Some examples of successful usage of on-board cameras for micro aerial vehicles (MAV) in GPS-denied environments were reported in [19,20]. A series of successful uses of such small cameras in various applications, including indoor and outdoor MAV autonomous flights, is known [21,22]. However, in this research we focused on outdoor UAV applications with the usage of the on-board camera as an additional source of navigation information. It should be noted that there are essentially two approaches to video navigation: the first is navigation through ground objects with known coordinates, and the second is the determination of the absolute UAV velocities by observing the evolution of the video image of the underlying surface. In both cases one needs to take into account the filtering of the altitude and the speed of the vehicle received from the INS. In reality, both approaches should be used, but it is necessary first to investigate their accuracy characteristics. These accuracy characteristics, of course, depend on a variety of factors, such as illumination, shooting conditions, seasonality and others. Therefore, it is not possible to determine in advance which algorithms and approaches will be most effective. This is what determines the purpose of this work: to review the existing methods and, if possible, to assess their effectiveness in video navigation issues.
We list only a few of them and give our comments related to our experience obtained with real and/or virtual flights.
  • Usage of terrain maps and comparison of the images of observed specific objects with their positions on a preloaded terrain map. This seemingly most obvious method requires the presence of a huge collection of observable objects on board for reliable operation of the recognition system. These images must be recorded under different observation conditions, including aspect, scale, lighting, and so on. Of course, for some characteristic objects these problems are completely surmountable, but on the whole this creates serious difficulties.
  • To solve this problem, special techniques have been invented that can be attributed to the allocation of some characteristic small regions (singular points) distinguished by a special behavior of the illumination distribution, which can be encoded by a set of features invariant to the scale and to changes of the aspect angles [23,24,25,26]. The application of this approach is described in [16,27], where it is demonstrated on model images using a 3D map of the local area. In that work we used a computer simulation of a UAV flight and simulated the on-board video camera imaging. The simulation program is written in MATLAB; the feature points are of the ASIFT type as realized in OpenCV (Python) [26]. Feature points in this model work as in a real flight because the images for the camera model and the template images were transformed by projective mapping and created from observations by different satellites. However, the use of this method is limited by the need to ensure the closeness of the registration conditions. Moreover, a significant difference in the resolution level of the reference image and the images recorded in flight also leads to significant errors.
  • In tasks of UAV navigation with the use of a preloaded map of the ground, the matching of reference and observed images plays a fundamental role. In recent years, the methodology based on image matching with the use of singular points has been further developed. For example, ORB, in contrast to SIFT and SURF, uses a very economical set of binary features of singular points, which significantly reduces the execution time of the registration operation and demonstrates very high resistance to image noise and rotations [28]; a minimal matching sketch is given after this list. In a series of detailed surveys [29,30,31] various alignment methods are examined both on the Oxford test dataset and on others, and the ORB performance is high in terms of time consumption and the rate of erroneous pixel matching. Meanwhile, from the viewpoint of solving the problems of video navigation, the accuracy of matching is more important, especially for specific images such as aerial photographs. In this connection, the results obtained in photogrammetric surveys using ORB-SLAM2 [32,33,34], which show the high potential of the ORB methodology, are of great interest in applications related to video navigation.
  • Less sensitive to the difference in shooting conditions are methods based on combining extended linear objects such as roads, house walls, rectilinear power lines and so on [18]. The analysis of linear objects naturally leads to the usage of the fast Hough transform [35,36].
Here we give some results of the image matching based on combining the linear objects (see Figure 1).
The estimation of the trajectory based on the alignment of linear elements in the template and the on-board captured image shows good quality of the vehicle position estimation (see Figure 2). Of course, in the areas where there are no alignments the position is estimated on the basis of INS data only.
  • Similarly to linear objects, it is possible to use curvilinear objects that preserve their shape, at least across various seasons, for successful alignment, namely: the boundaries of forests and lands, banks of rivers and water bodies, based on their form [37] and on color-texture domains [38,39]. An example is given in Figure 3 below.
  • It should be noted that the use of the above-mentioned approaches for navigation requires, on the one hand, the solution of the camera calibration problems and the elimination of all kinds of registration nonlinearities, such as distortion [40,41,42] and motion blurring [43,44]; but the most important peculiarity is the registration of images on a 2D photodetector array, that is, the transformation of the 3D coordinates of an object into 2D, which gives only the angular coordinates of objects, known in the literature as bearing-only observations [45]. This is a special area of the nonlinear filtering problem, which may be solved more or less successfully with the aid of linearized or extended Kalman filtering and also with particle and unscented Kalman filtering. The comparison of various filtering solutions shows [46,47] either the presence of uncontrolled bias [48] or the urgent necessity of extending the filter dimension, as for particle and unscented filtering. Meanwhile, comparison of the filtering accuracy shows almost identical accuracy [49], which is why one should prefer the simplest pseudomeasurement Kalman filter without bias, developed on the basis of Pugachev's conditionally optimal filtering [50,51,52,53]. To obtain 3D coordinates, it is also possible to measure the range, which can be done using stereo systems [54,55,56] or active radio or laser range finders [6]. The latter can be limited in use because they need essential power and disclose the UAV position, while stereo systems require very accurate calibration and the creation of a significant triangulation base, which is rather difficult to maintain on a small-sized UAV in flight.
  • The problem of bearing-only observations has long been in the focus of interest of nonlinear filtering specialists, since it leads to the problem of estimating the position from nonlinear measurements. In [16], we described a new filtering approach using the pseudo-measurement method, which allows expanding the observation system up to unbiased estimation of the UAV's own position on the basis of the determination of bearings of terrestrial objects with known coordinates. However, filtering is not the only problem which arises in bearing-only observations. Another issue is the association of observed objects with their images on the template. Here various approaches based on RANSAC solutions are necessary [57], such as [58,59], but the most important is the fusion of the current position estimation with the procedure of outlier rejection [60]; for details see [16].
  • In addition, bearing monitoring requires knowledge of the position of the line of sight of the surveillance system, which is not determined by the orientation angles of the vehicle coming from the INS alone. That is why it is of interest to estimate the line-of-sight position from the evolution of the optical flow (OF) or of the projective matrices describing the transformation of images of terrain sections over two consecutive frames. In the case of visual navigation one also needs the set of angles determining the orientation of the camera optical axis. The general model developed for OF observation and describing the geometry of the observation is given in [61], and the corresponding filtering equations for the UAV attitude parameters have been obtained in [62]. These equations and models were tested with the aid of a special software package [63], and the possibility of estimating the coordinate and angular velocities of the UAV was successfully demonstrated in [64,65,66]. However, neither the OF nor the evolution of projective matrices gives the exact values of the angles determining the position of the line of sight; they define rather the angular velocities, so the problem of angle estimation remains and must be solved with the aid of filtering.
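As an illustration of the ORB matching and the RANSAC-based outlier rejection mentioned above, the sketch below (OpenCV in Python) matches an on-board frame against a reference image; the file names, feature count, ratio test and reprojection threshold are illustrative assumptions, not the authors' settings.

```python
import cv2
import numpy as np

ref = cv2.imread("reference_map.png", cv2.IMREAD_GRAYSCALE)     # placeholder file names
frame = cv2.imread("onboard_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)                             # binary ORB descriptors
kp_ref, des_ref = orb.detectAndCompute(ref, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

# Hamming distance is the appropriate metric for ORB's binary descriptors
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = matcher.knnMatch(des_frm, des_ref, k=2)

# Lowe's ratio test keeps only distinctive correspondences
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

src = np.float32([kp_frm[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects outlier correspondences while fitting a projective transform
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
print("inlier ratio:", inliers.mean() if inliers is not None else 0.0)
```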

2.3. The UAV Position and the Coordinates Velocities Estimation

The methods described above for measuring various parameters associated with the UAV movement supply different information, which must be appropriately converted into the inputs of the control system. In particular, the data on the coordinate velocities of the UAV motion are contained in the OF measurements and are extracted by filtering the dynamics equations with the corresponding measurements [65,67]. More difficult is the use of bearing-only measurements, although a complete set of filtering equations is given in [11,16,53]. In addition, observations of a moving target with the aid of bearing-only observations allow us to evaluate its velocity, as shown in [68].
Filtering equations for the UAV velocities on the basis of the OF measurements have been given in [64,69] with examples of the estimation of the current altitude and coordinate velocity of straightforward motion.

2.4. The UAV Angles and Angular Velocities Estimation

In general, the OF field contains information about the coordinate velocities of the UAV and the angular velocities of the sight line. Reliable results were obtained on virtual series of images modelling a flight with constant coordinate velocity and rotation along the yaw angle [64]. However, experiments with real video show a high correlation level between different motions, for example between the pitch angle and the velocity of descent. Unfortunately, the measurement of the position of the line of sight in the UAV coordinate system, which is very precise in principle, is distorted by the own movement of the vehicle, since the UAV inclinations are necessary for maneuvering and their separation from the angles of the line of sight is a very delicate problem. For example, in experiments with a quadrocopter equipped with a stabilized camera, one needed to distinguish the angle of the camera inclination from the vehicle inclination which is necessary for the UAV motion itself. In our experiments, without careful determination of the UAV inclination angle we did not get any reliable results related to the coordinate and vertical UAV motion [61,62,63].

Estimation of the Angular Position

The UAV angular position is described by three angles $\theta(t_k), \varphi(t_k), \gamma(t_k)$ (pitch, roll and yaw, respectively), the angular velocities $\omega_p(t_k), \omega_r(t_k), \omega_y(t_k)$ and the angular accelerations $a_p(t_k), a_r(t_k), a_y(t_k)$.
The pitch angle and pitch angular velocity dynamics are described by the following relations:
$$\theta(t_{k+1}) = \theta(t_k) + \omega_p(t_k)\,\Delta t + a_p(t_k)\,\frac{\Delta t^2}{2},\qquad \omega_p(t_{k+1}) = \omega_p(t_k) + a_p(t_k)\,\Delta t + W_p(t_k),$$
where $W_p(t_k)$ is white noise with variance $\sigma_p^2$.
The pitch angular velocity measurement using the OF has the following form:
$$m_p(t_k) = \omega_p(t_k) + W_{\omega_p}(t_k),$$
where $W_{\omega_p}(t_k)$ is the noise in the angular velocity measurements using the OF, which is white with variance $\sigma_{\omega_p}^2$.
Similarly to the coordinate velocity estimation, we get the pitch angle $\theta(t_k)$ and pitch angular velocity $\omega_p(t_k)$ estimates:
$$\hat\theta(t_{k+1}) = \hat\theta(t_k) + \hat\omega_p(t_k)\,\Delta t + a_p(t_k)\,\frac{\Delta t^2}{2},$$
$$\hat\omega_p(t_{k+1}) = K_p(t_{k+1})\,m_p(t_{k+1}) + \bigl(1 - K_p(t_{k+1})\bigr)\bigl(\hat\omega_p(t_k) + a_p(t_k)\,\Delta t\bigr),$$
$$K_p(t_{k+1}) = \frac{\hat P_{\omega_p\omega_p}(t_k) + \sigma_p^2}{\hat P_{\omega_p\omega_p}(t_k) + \sigma_p^2 + \sigma_{\omega_p}^2},\qquad \hat P_{\omega_p\omega_p}(t_{k+1}) = \frac{\sigma_{\omega_p}^2\bigl(\hat P_{\omega_p\omega_p}(t_k) + \sigma_p^2\bigr)}{\hat P_{\omega_p\omega_p}(t_k) + \sigma_p^2 + \sigma_{\omega_p}^2}.$$
The formulae for $\hat\varphi$, $\hat\gamma$ and $\hat\omega_r$, $\hat\omega_y$ are analogous and are used in the model based on the OF estimation.
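A minimal sketch of this scalar filter for the pitch channel follows; the variances, time step and measurement sequence are illustrative assumptions.

```python
def pitch_filter(theta0, omega0, a_p, m_p, dt, sigma_p2, sigma_wp2, P0=1.0):
    """Scalar Kalman-type filter for the pitch angle and pitch rate from OF
    rate measurements m_p; a_p is the known pitch angular acceleration sequence."""
    theta, omega, P = theta0, omega0, P0
    estimates = []
    for a, m in zip(a_p, m_p):
        # predict the angle with the previous rate estimate and the known acceleration
        theta = theta + omega * dt + a * dt ** 2 / 2.0
        # gain, rate update and covariance recursion exactly as in the formulas above
        K = (P + sigma_p2) / (P + sigma_p2 + sigma_wp2)
        omega = K * m + (1.0 - K) * (omega + a * dt)
        P = sigma_wp2 * (P + sigma_p2) / (P + sigma_p2 + sigma_wp2)
        estimates.append((theta, omega))
    return estimates
```

The roll and yaw channels would follow the same recursion with their own noise variances.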

3. Joint Estimation of the UAV Attitude with the Aid of Filtering

3.1. Visual-Based Navigation Approaches

Several studies have demonstrated the effectiveness of approaches based on motion field estimation and feature tracking for visual odometry [70]. Vision-based methods have been proposed even in the context of autonomous landing management [12]. In [47] a visual odometry based on geometric homography was proposed. However, the homography analysis uses only 2D reference point coordinates, whereas for evaluation of the current UAV altitude 3D coordinates are necessary. All such approaches presume the presence of some recognition system in order to detect objects nominated in advance. Examples of such objects can be special buildings, crossroads, tops of mountains and so on. The principal difficulties are the different scale and aspect angles of observed and stored images, which lead to the necessity of a huge template library in the UAV control system memory. One can avoid this difficulty by using another approach based on the observation of so-called feature points [71] that are scale and aspect angle invariant. For this purpose the technology of feature points [23] is used. In [10] an approach based on the coordinate correspondence between reference points observed by the on-board camera and reference points on a map loaded into the UAV's memory before the mission start was suggested. During the flight these maps are compared with the frame of the land directly observed with the help of the on-board video camera. As a result one can detect the current location and orientation without time-error accumulation. These methods are invariant to some transformations and also noise-stable, so that the predetermined maps can differ in scale, aspect angle, season, luminosity, weather conditions, etc. This technology appeared in [72]. The contribution of [16] is the usage of a modified unbiased pseudomeasurement filter for bearing-only observations of some reference points with known terrain coordinates.

3.2. Kalman Filter

In order to obtain metric data from visual observations one needs first to make observations from different positions (i.e., triangulation) and then to use nonlinear filtering. However, all nonlinear filters either have unknown bias [48] or are very difficult for on-board implementation, like Bayesian-type estimation [27,73]. Approaches to position estimation based on bearing-only observations were analyzed long ago, especially for submarine applications [49], and nowadays for UAV applications [46].
Comparison of different nonlinear filters for bearing-only observations in the problem of ground-based object localization [74] shows that the EKF (extended Kalman filter), the unscented Kalman filter, the particle filter and the pseudomeasurement filter give almost the same level of accuracy, while the pseudomeasurement filter is usually more stable and simpler for on-board implementation. This observation is in accordance with older results [49], where all these filters were compared in the problem of moving object localization. It has been mentioned that all these filters have bias, which makes their use in data fusion issues rather problematic [45]. The principal requirement for such filters in data fusion is a non-biased estimate with known mean-square characterization of the error. Among the variety of possible filters, the pseudomeasurement filter can be easily modified to satisfy the data fusion demands. The idea of such nonlinear filtering was developed by V. S. Pugachev and I. Sinitsyn in the form of so-called conditionally optimal filtering [50], which provides the non-biased estimate within the class of linear filters with the minimum mean squared error. In our previous works we developed such a filter (the so-called Pseudomeasurement Kalman Filter (PKF)) for the UAV position estimation and gave the algorithm for path planning along a reference trajectory under external perturbations and noisy measurements [16,53].
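For orientation only, the sketch below shows the classical pseudolinear (pseudomeasurement) construction for a single bearing to a landmark with known coordinates; it illustrates the idea on which pseudomeasurement filtering is built, but it is not the authors' modified unbiased PKF of [16,53], and all variable names and noise values are illustrative.

```python
import numpy as np

def pseudo_update(x_est, P, beta, landmark, sigma2):
    """One pseudolinear Kalman update from a single bearing beta (radians,
    measured from the x axis) to a landmark with known 2D coordinates.
    x_est = (x, y) is the current estimate of the UAV horizontal position."""
    xL, yL = landmark
    # geometric identity: (yL - y) cos(beta) - (xL - x) sin(beta) = 0 for the true position,
    # so z = H @ (x, y) + noise with a linear measurement matrix H
    H = np.array([[-np.sin(beta), np.cos(beta)]])
    z = yL * np.cos(beta) - xL * np.sin(beta)
    S = H @ P @ H.T + sigma2          # innovation covariance (in reality range-dependent,
                                      # which is the source of the bias discussed in [45])
    K = P @ H.T / S
    x_new = x_est + (K * (z - H @ x_est)).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

# illustrative use
x, P = np.array([0.0, 0.0]), np.eye(2) * 100.0
x, P = pseudo_update(x, P, beta=np.deg2rad(40.0), landmark=(500.0, 300.0), sigma2=25.0)
```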

3.3. Optical Absolute Positioning

Some known aerospace maps of the terrain in the flight zone are loaded into the aircraft memory before the start of the flight. During the flight these maps are compared with the frame of the land directly observed with the help of the on-board video camera. For this purpose the technology of feature points [23] is used. As a result one can detect the current location and orientation without time-error accumulation. These methods are invariant to some transformations and also noise-stable, so that the predetermined maps can vary in height, season, luminosity, weather conditions, etc. Moreover, since the moment of the previous aerial survey the picture of the landscape can have changed due to human and natural activity. All approaches based on the capturing of objects assigned in advance presume the presence of some on-board recognition system in order to detect and recognise such objects. Here we avoid this difficulty by using the observation of feature points [71] that are scale and aspect angle invariant. In addition, the modified pseudomeasurement Kalman filter (PKF) is used for the estimation of the UAV positions and in the control algorithm.
One should also mention the epipolar position estimation for absolute positioning [75], where it helps at landing on a runway (see Figure 4).

4. Projection Matrices Techniques for Videonavigation

The transformation of images of plane regions when the camera position is changed is described by a projective transformation given by the corresponding matrix. The complete matrix of the projective transformation contains information on the displacement of the principal point of the lens and the rotation of the line of sight. A camera installed on an aircraft flying over a relatively flat portion of the earth's surface registers a sequence of frames, and if there is overlap between consecutive frames, analysis of the displacement of characteristic points in the overlap region carries information about the linear and angular motion of the camera and thereby about the UAV motion. Projective transformations are often used to match images, but here we use the alignment for motion analysis. Although this is quite natural, it is the first time we have used it to estimate UAV motion on actual video survey data. The OF gives the estimation via measurement of the coordinate velocities of the image shift and therefore provides just local estimates, where the angular components are highly correlated with the coordinate velocities and the latter are a few orders of magnitude higher than the angular ones. So the estimation of angular velocities on the basis of the OF looks very difficult. At the same time, the estimation on the basis of the projective matrix evolution looks more promising for estimating the angles of the sight line [76]. The estimation of motion via projective matrices has been known for a long time [77] and remains in the focus of researchers until now [9,78,79]. The OF works only on the basis of local information on the speed of motion, which leads to increasing error drift. Therefore, it would be useful to obtain corrective data about the orientation of the UAV at some intermediate time instances. Such information may be obtained with the aid of the projective transformation between successive frames.
Of course this method has been known for a long time; however, in the literature we found only a few examples of its use for UAV navigation, see for example [80]. Perhaps the reason is that this method needs a good estimation of the initial position and the line-of-sight angles, therefore the fusion with filtering algorithms taking into account the dynamical model of the UAV is rather urgent. The article [80] also presents the estimation of errors, though they depend on the specific landscape features, so it would be desirable to estimate the influence of the projective matrices computation on the accuracy of the shift and rotation evaluation. Here we also present the experimental results of the UAV position estimation based on the computation of projective matrices from real video data in combination with the filtering of coordinates and angles. The basic idea of our approach is similar to [81], though in our implementation of the homography between two successive frames we add the estimation of motion via Kalman filtering. In [80] an interesting application of UAV observation to road traffic is presented, where the projective transform is applied to the estimation of the vehicle velocity by analysis of two successive frames. The disadvantage of this approach is that it is based on knowledge of the coordinates of some specific points within the field of view; generally such points are absent. A very interesting example of projective matrix technique usage is given in [82]. However, all these approaches look like successful applications in rather specific cases. A general approach to camera pose estimation has been presented in [81], and our algorithm basically follows it. The principal novelty is the fusion of the projection technique with Kalman filtering for the angular position of the sight line. This is all the more important since other techniques do not estimate the angular position of the camera directly; the angular position of the camera appears only implicitly. The information about the direction of the line of sight is important in the usage of the OF as a sensor of linear and angular velocities, especially if the UAV performs manoeuvres with large changes of the roll and pitch angles. The projection matrix technique looks rather promising in fusion with the OF; however, this approach needs further experiments and additional flights.

4.1. Coordinate Systems

4.1.1. Earth Coordinate System

We assume a flat earth surface; the earth coordinate system $OXYZ$ is chosen as follows:
  • The origin $O$ belongs to the earth surface.
  • Axis $OX$ is directed to the east.
  • Axis $OY$ is directed to the north.
  • Axis $OZ$ is directed to the zenith.
The earth surface is described by the equation $z = 0$.

4.1.2. The Camera Coordinate System

The pinhole camera model is used with the camera coordinate system $O'X'Y'Z'$, where
  • $O'$ is the principal point;
  • axis $O'X'$ is directed to the right of the image;
  • axis $O'Y'$ is directed to the top of the image;
  • and axis $O'Z'$ is directed straightforwardly along the optical axis of the camera.
Then the coordinates of a point $r' = (x', y', z')^T$ in the camera coordinate system are related to the homogeneous pixel coordinates $p = (p_x, p_y, p_z)^T$ of its image via the camera matrix $C$ by the relation
$$p = Cr'.$$
The coordinate systems are shown on Figure 5.

4.1.3. Position of the Camera

Let at some moment the principal point $O'$ of the camera be represented in the earth coordinate system $OXYZ$ by the vector $t$. Then the coordinates $r'$ of a point in the camera coordinate system and its coordinates $r = (x, y, z)^T$ in the earth coordinate system are related by the formula
$$r' = R(r - t),$$
where $R$ is the matrix of the camera rotation, such that $R^TR = RR^T = E$.
So we have
$$p = CR(r - t).$$

4.1.4. Representation of the Camera Rotation with the Aid of Roll, Pitch, Yaw Angles

For practical reasons the position of the camera can be defined by a superposition of three rotations corresponding to the rotations of the vehicle:
  • Yaw, that is, rotation about the axis $OZ$, such that a positive angle corresponds to an anticlockwise rotation.
  • Pitch, that is, rotation about the axis $OX$, such that for a positive angle the image moves downward.
  • Roll, that is, rotation about the axis $OY$, such that for a positive angle the image moves to the right.
If the optical axis of the camera is directed downward, all rotations are zero and the top of the image is directed to the north.
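For the sketches used later in this section we need a rotation matrix assembled from these three angles. The composition order and the nominal (all-zero) orientation in the following helper are assumptions of this sketch, since the text specifies the rotations only qualitatively.

```python
import numpy as np

def rot_x(a):
    """Rotation about the X axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    """Rotation about the Y axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    """Rotation about the Z axis (positive angle = anticlockwise)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rotation_from_angles(yaw, pitch, roll):
    """Camera rotation R composed of yaw (about Z), pitch (about X) and roll
    (about Y); the order of composition is an assumption of this sketch."""
    return rot_y(roll) @ rot_x(pitch) @ rot_z(yaw)
```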

4.2. Projective Representation of the Frame-to-Frame Transformation

Let us find the matrix $P$ of the projective transformation between homogeneous coordinates on the earth surface $\rho = (x, y, 1)^T$ and the coordinates $p$ of the frame pixels:
$$p = P\rho.$$
Denote by $M_{[1,2]}$ the matrix comprising the first two columns of $M$, that is
$$M_{[1,2]} = M\begin{pmatrix}1 & 0\\ 0 & 1\\ 0 & 0\end{pmatrix}.$$
Then
$$p = CR(r - t) = CR\left(\begin{pmatrix}x\\ y\\ 0\end{pmatrix} - t\right) = CR\begin{pmatrix}x\\ y\\ 0\end{pmatrix} - CRt = CR_{[1,2]}\begin{pmatrix}x\\ y\end{pmatrix} - CRt = C\left[R_{[1,2]},\ -Rt\right]\rho.$$
Therefore,
$$P = C\left[R_{[1,2]},\ -Rt\right].$$
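A small numpy sketch assembling $P = C\left[R_{[1,2]},\ -Rt\right]$ and checking that it reproduces the direct projection $p = CR(r - t)$ for a ground point ($z = 0$); the intrinsic matrix and pose below are made-up values, and rotation_from_angles is the helper from the previous sketch.

```python
import numpy as np

def projection_matrix(C, R, t):
    """P = C [R_[1,2], -R t]: maps homogeneous ground coordinates rho = (x, y, 1)
    to homogeneous pixel coordinates p, as derived above."""
    return C @ np.column_stack((R[:, :2], -R @ t))

C = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])                  # assumed pinhole intrinsics
R = rotation_from_angles(0.1, 0.05, -0.02)           # helper from the sketch in Section 4.1.4
t = np.array([10.0, -20.0, 100.0])                   # camera 100 units above the ground plane

r = np.array([30.0, 40.0, 0.0])                      # a point on the ground (z = 0)
rho = np.array([30.0, 40.0, 1.0])
# consistency check: P @ rho reproduces the direct projection C R (r - t)
print(np.allclose(C @ R @ (r - t), projection_matrix(C, R, t) @ rho))   # True
```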

4.3. Determining of the Camera Position

Assume we have two positions of the camera, where
  • the first one is known: $R_1, t_1$;
  • the second one is to be determined: $R_2, t_2$.
Then
  • $p_1 = CR_1(r - t_1)$, $P_1 = C\left[R_{1[1,2]},\ -R_1t_1\right]$, $p_1 = P_1\rho$;
  • $p_2 = CR_2(r - t_2)$, $P_2 = C\left[R_{2[1,2]},\ -R_2t_2\right]$, $p_2 = P_2\rho$.
And therefore,
$$p_2 = P_2P_1^{-1}p_1.$$
On the other hand, $p_2 = Hp_1$, where $H$ is the matrix of the projective transformation from frame 1 to frame 2, so we get
$$H = P_2P_1^{-1} = C\left[R_{2[1,2]},\ -R_2t_2\right]\left(C\left[R_{1[1,2]},\ -R_1t_1\right]\right)^{-1} = C\left[R_{2[1,2]},\ -R_2t_2\right]\left[R_{1[1,2]},\ -R_1t_1\right]^{-1}C^{-1},$$
$$C^{-1}HC = \left[R_{2[1,2]},\ -R_2t_2\right]\left[R_{1[1,2]},\ -R_1t_1\right]^{-1},$$
$$C^{-1}HC\left[R_{1[1,2]},\ -R_1t_1\right] = \left[R_{2[1,2]},\ -R_2t_2\right].$$
Any matrix obtained from $H$ by multiplication by $k \neq 0$, that is $kH$, determines the same projective transformation. Assume we have obtained an estimate $\hat H$ of the matrix $kH$ on the basis of two successive frames. An example of the matrix $H$ obtained from two successive frames such as those in Figure 6 is shown below.
Example of the calculated matrix $H$ between two successive frames:
$$H = \begin{pmatrix} 1.023 & 0.008753 & 4.014\\ 0.001305 & 1.010 & 70.08\\ 1.131\times 10^{-5} & 6.478\times 10^{-6} & 1.0\end{pmatrix}.$$
To obtain $\hat H$ we use the RANSAC methodology [57] and interpret the difference $\hat H - kH$ as normal noise, so that $\hat H = kH + \varepsilon$, where $\varepsilon$ is a matrix of normal noise added to all entries of $\hat H$. This permits determining the second camera position as the solution of the following minimization problem:
$$\hat R_2, \hat t_2, \hat k = \arg\min_{R, t, k}\left\| C^{-1}\frac{\hat H}{k}C\left[R_{1[1,2]},\ -R_1t_1\right] - \left[R_{[1,2]},\ -Rt\right]\right\|_F^2, \qquad (3)$$
or equivalently
$$\hat R_2, \hat t_2, \hat k = \arg\min_{R, t, k}\left\| C^{-1}\hat HC\left[R_{1[1,2]},\ -R_1t_1\right] - k\left[R_{[1,2]},\ -Rt\right]\right\|_F^2. \qquad (4)$$

4.4. Solution of Minimization Problem

The minimization problems (3) and (4) admit the following solution. Introduce the matrix $G = C^{-1}\hat HC\left[R_{1[1,2]},\ -R_1t_1\right]$. Then the above problem may be reformulated as follows:
$$\hat R_2, \hat t_2, \hat k = \arg\min_{R, t, k}\left\| G - k\left[R_{[1,2]},\ -Rt\right]\right\|_F^2.$$
Denote by $M_{[3]}$ the third column of a matrix $M$ ($M_{[3]} = M(0, 0, 1)^T$), and by analogy denote $M_{[1]}$ and $M_{[2]}$. So the minimization problem may be rewritten as
$$\hat R_2, \hat t_2, \hat k = \arg\min_{R, t, k}\left\| G_{[1,2]} - kR_{[1,2]}\right\|_F^2 + \left\| G_{[3]} + kRt\right\|_2^2.$$
The first term does not depend on $t$; the second term achieves its minimum, equal to zero, at $t = -\frac{1}{k}R^TG_{[3]}$. Thus $\hat t_2 = -\frac{1}{\hat k}\hat R_2^TG_{[3]}$. By substitution into the original minimized term one can reduce the problem to the following:
$$\hat R_2, \hat k = \arg\min_{R, k}\left\| G_{[1,2]} - kR_{[1,2]}\right\|_F^2. \qquad (5)$$
Vectors $G_{[1]}$ and $G_{[2]}$ belong to some plane $\gamma$; therefore the vectors $\hat R_{2[1]}$ and $\hat R_{2[2]}$ giving the minimum in (5) belong to the same plane.
Introduce the orthonormal basis $V = \left[V_{[1]}, V_{[2]}, V_{[3]}\right]$, such that $V_{[1]}, V_{[2]} \in \gamma$ define the system of coordinates on $\gamma$:
$$V_{[1]} = \frac{G_{[1]}}{\|G_{[1]}\|_2},\qquad V_{[2]} = \frac{G_{[2]} - (V_{[1]}^TG_{[2]})V_{[1]}}{\left\|G_{[2]} - (V_{[1]}^TG_{[2]})V_{[1]}\right\|_2},\qquad V_{[3]} = V_{[1]}\times V_{[2]}.$$
In this system of coordinates on $\gamma$, the vectors $G_{[1]}$ and $G_{[2]}$ have coordinates $[g_{[1]}, g_{[2]}] = V^{-1}G_{[1,2]}$.
At the same time, the vectors $\hat R_{2[1]}$ and $\hat R_{2[2]}$ in the same system of coordinates have the representation
$$[r_{[1]}, r_{[2]}] = \begin{pmatrix}\cos\alpha & -\sin\alpha\\ \sin\alpha & \cos\alpha\\ 0 & 0\end{pmatrix}.$$
So the problem is reduced to determining the angle $\hat\alpha$ and $\hat k$:
$$\hat\alpha, \hat k = \arg\min_{\alpha, k}\left\| [g_{[1]}, g_{[2]}] - k\begin{pmatrix}\cos\alpha & -\sin\alpha\\ \sin\alpha & \cos\alpha\\ 0 & 0\end{pmatrix}\right\|_F.$$
By differentiation with respect to $\alpha$ one can obtain that
$$\tan(\hat\alpha) = \frac{g_2[1] - g_1[2]}{g_2[2] + g_1[1]}$$
does not depend on $k$ (here $g_i[j]$ denotes the $i$-th component of $g_{[j]}$).
From this equation one can obtain two solutions; the second one will be rejected later.
Now one can obtain $[r_{[1]}, r_{[2]}]$.
Then the rotation matrix is determined by transforming its columns back to the original coordinate system:
$$\hat R_2 = V\left[r_{[1]}, r_{[2]}, r_{[1]}\times r_{[2]}\right].$$
Then define $\hat k$ in accordance with linear regression with quadratic penalization:
$$\hat k = \frac{g_{[1]}^Tr_{[1]} + g_{[2]}^Tr_{[2]}}{r_{[1]}^Tr_{[1]} + r_{[2]}^Tr_{[2]}},$$
and finally $\hat t_2 = -\frac{1}{\hat k}\hat R_2^TG_{[3]}$. It appears that the two solutions for $\hat t_2$ differ by the sign of the $Z$ coordinate, so the extra solution lies under the earth and must be rejected.
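A compact numerical sketch of this closed-form recovery follows; the function name and noise handling are ours, and the candidate selection by the sign of the altitude implements the rejection rule described above.

```python
import numpy as np

def recover_pose(H_hat, C, R1, t1):
    """Recover the second camera rotation R2, position t2 and the scale k from a
    noisy frame-to-frame homography H_hat, given the first pose (R1, t1) and the
    camera matrix C (a sketch of the closed-form solution above)."""
    G = np.linalg.inv(C) @ H_hat @ C @ np.column_stack((R1[:, :2], -R1 @ t1))

    # orthonormal basis of the plane gamma spanned by the first two columns of G
    v1 = G[:, 0] / np.linalg.norm(G[:, 0])
    w = G[:, 1] - (v1 @ G[:, 1]) * v1
    v2 = w / np.linalg.norm(w)
    V = np.column_stack((v1, v2, np.cross(v1, v2)))

    g = (V.T @ G[:, :2])[:2, :]            # 2D coordinates of G_[1], G_[2] on gamma
    alpha = np.arctan2(g[1, 0] - g[0, 1], g[0, 0] + g[1, 1])

    best = None
    for a in (alpha, alpha + np.pi):       # the two solutions of the tangent equation
        r12 = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)],
                        [0.0,        0.0]])
        R2 = V @ np.column_stack((r12, np.cross(r12[:, 0], r12[:, 1])))
        k = (g[:, 0] @ r12[:2, 0] + g[:, 1] @ r12[:2, 1]) / 2.0
        t2 = -(R2.T @ G[:, 2]) / k
        if t2[2] > 0:                      # the extra solution lies under the earth
            best = (R2, t2, k)
    return best
```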

4.5. Testing of the Algorithm

Suppose that we have a map represented in raster graphics, where $q = (q_x, q_y, 1)^T$ are the homogeneous pixel coordinates on the map. They are related to the earth coordinates by a matrix $Q$, that is,
$$q = Q\rho.$$
Let one length unit on the earth correspond to $k$ pixels on the map, and let the map size be $w$ (width) by $h$ (height) pixels. Suppose, for example, that the origin $O$ of the earth coordinate system is at the map center. Then
$$Q = \begin{pmatrix} k & 0 & w/2\\ 0 & k & h/2\\ 0 & 0 & 1\end{pmatrix}.$$
The relation between the frame pixels $p_i$ and the map pixels $q$ is given by
$$p_i = P_i\rho,\qquad \rho = Q^{-1}q\ \Rightarrow\ p_i = P_iQ^{-1}q,$$
or
$$p_i = A_iq,\qquad A_i = P_iQ^{-1} = C\left[R_{i[1,2]},\ -R_it_i\right]Q^{-1}.$$
Note that
$$H = P_2P_1^{-1} = P_2(Q^{-1}Q)P_1^{-1} = (P_2Q^{-1})(QP_1^{-1}) = (P_2Q^{-1})(P_1Q^{-1})^{-1} = A_2A_1^{-1}.$$
Then the test algorithm is as follows
  • Choose the map and find the connection with the earth coordinates by determining $Q$.
  • Choose $C$.
  • Choose $R_i, t_i$.
  • Calculate $A_i = C\left[R_{i[1,2]},\ -R_it_i\right]Q^{-1}$.
  • Forget for a moment $R_2, t_2$.
  • Obtain two frames from the map in accordance with $A_i$.
  • Visually check that they correspond to $R_i, t_i$.
  • Calculate the matrix $H = A_2A_1^{-1}$.
  • Model $\hat H = kH + \varepsilon$, $k \neq 0$, where $\varepsilon$ is a noise.
  • Find $\hat R_2, \hat t_2$.
  • Recall $R_2, t_2$.
  • Compare $\hat R_2$ with $R_2$, and $\hat t_2$ with $t_2$.
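An illustrative run of this test procedure, reusing rotation_from_angles, projection_matrix and recover_pose from the earlier sketches; all numbers (intrinsics, poses, scale, noise level) are made up, so this is an algebraic consistency check, not flight data.

```python
import numpy as np

rng = np.random.default_rng(1)
C = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])                       # assumed pinhole intrinsics

R1, t1 = rotation_from_angles(0.0, 0.0, 0.0), np.array([0.0, 0.0, 100.0])
R2, t2 = rotation_from_angles(0.05, 0.02, -0.01), np.array([5.0, -3.0, 98.0])

P1, P2 = projection_matrix(C, R1, t1), projection_matrix(C, R2, t2)
H = P2 @ np.linalg.inv(P1)                                # exact frame-to-frame homography

H_hat = 1.7 * H + rng.normal(0.0, 1e-6, size=(3, 3))      # unknown scale k plus small noise
R2_est, t2_est, k_est = recover_pose(H_hat, C, R1, t1)

print("position error:", np.linalg.norm(t2_est - t2))
print("recovered scale:", k_est)                          # should be close to 1.7
```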

4.5.1. Testing Results

In this testing we evaluate the influence of the noise in the determination of the projective matrix on the camera motion estimation. Testing shows that:
  • in the case of noise absence ($\varepsilon = 0$) in the matrix $\hat H$, the method gives the exact values of rotation and shift, $\hat R_2 = R_2$, $\hat t_2 = t_2$;
  • an increase of the noise level evidently produces an increase of the estimation errors, though their exact quantification needs further research with additional flight experiments.
An example of the algorithm testing on real flight data with the estimation of the UAV position on the basis of Kalman filtering is shown in Figure 7. More specific results showing the tracking of separate UAV motion components are also shown in Figure 8, Figure 9 and Figure 10.
They show more or less good correspondence, but only on rather short time intervals of approximately 30 s; however, this is typical for algorithms giving estimates of the UAV velocities, since all algorithms that accumulate drift need correction with the aid of another algorithm. These algorithms should be based on matching of observed images and templates; some of them are described in Section 2 and Section 3.

4.5.2. Statistical Analysis of Projective Matrices Algorithm

Thus, the average error equals 3.1837 m per frame, while the average frame shift equals 16.8158 m. The average relative error per frame equals 19.08%. See Table 1.

4.5.3. Comparison of Projective Matrices Algorithm with OF Estimation

Both the projective matrices algorithm and the OF give information related to the coordinate and angular velocities of the UAV. We tested the OF approach [63] on the same video data as used for the projective matrices.
In Figure 11 and Figure 12, in comparison with the corresponding data in Figure 8 and Figure 9, one can see a very short period of reliable estimation of coordinates, so the projective matrices computation shows more reliable tracking; however, this is just a comparison of the algorithms per se, without fusion with the INS, which is necessary in order to evaluate the current flight parameters. The estimation of the orientation angles is given in Figure 13.
The described algorithm gives only one step in the estimation of the displacement and rotation of the camera, while the initial data for the operation of the algorithm must be obtained from the system for estimating the position of the vehicle. In other words, the algorithm based on the calculation of the projective transformation from frame to frame can complement the sensors of the coordinate and angular velocities of the vehicle, and its performance depends on the accuracy of the position matching of the singular points. Of course, this modelling example needs further verification with new video data and new telemetry data from the INS. It is obvious that the noise in the determination of the projective transformation matrix is decisive in assessing the operability of the algorithm and depends on a variety of factors. Therefore, further analysis of the algorithm will be performed on new flight data sets. The material presented in this Section 4 provides a complex tracking algorithm in which the computation of the projective transformation serves as a sensor of displacement and rotations of the line of sight. Moreover, with the example of a sufficiently long flight, the possibility of determining the UAV velocities on the basis of the algorithm for calculating the projective transformation from frame to frame is shown. This demonstrates the possible efficiency of the approach and opens the way for its integration into the navigation system. Anyway, the efficiency of this algorithm strongly depends on the other systems giving, for example, the initial estimates of position and angles; otherwise increasing errors are inevitable. It is clear that the realization of video navigation from observations of the earth's surface is a difficult task and we are only at the very beginning of the road.

5. Conclusions

The article describes a number of approaches to video navigation based on observation of the earth's surface during an autonomous UAV flight. It should be noted that their performance depends on external conditions and the observed landscape. Therefore, it is difficult to choose the most promising approach in advance, and most likely it is necessary to rely on the use of various algorithms taking into account the environmental conditions and the computational and energy limitations of a real UAV. However, there is an important problem, namely, the determination of the quality of the evaluation and the selection of the most reliable observation channels during the flight. The theory of observation control opens the way to a solution, since the theory of filtering, as a rule, uses the discrepancy between predicted and observed values, and video could help to detect it. This is probably the main direction of future research in the field of integrating video with standard navigation systems.

Author Contributions

The work presented here was carried out in collaboration among all authors. All authors have contributed to, seen and approved the manuscript. B.M. is the main author, having conducted the survey and written the content. A.P. and I.K. developed and analyzed the projective matrix approach to UAV navigation. K.S. and A.M. were responsible for the parts related to the usage of stochastic filtering in the UAV position and attitude estimation. D.S. performed the analysis of video sequences for the computation of the sequence of projective matrices. E.K. developed the algorithm of geolocalization on the basis of linear objects.

Funding

This research was funded by Russian Science Foundation Grant 14-50-00150.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Miller, B.M. Optimal control of observations in the filtering of diffusion processes I,II. Autom. Remote Control 1985, 46, 207–214, 745–754. [Google Scholar]
  2. Miller, B.M.; Runggaldier, W.J. Optimization of observations: A stochastic control approach. SIAM J. Control Optim. 1997, 35, 1030–1052. [Google Scholar] [CrossRef]
  3. Miller, B.M.; Rubinovich, E.Y. Impulsive Control in Continuous and Discrete-Continuous Systems; Kluwer Academic/Plenum Publishers: New York, NY, USA, 2003. [Google Scholar]
  4. Kolosov, K.S. Robust complex processing of UAV navigation measurements. Inf. Process. 2017, 17, 245–257. [Google Scholar]
  5. Andreev, K.; Rubinovich, E. UAV Guidance When Tracking a Ground Moving Target with Bearing-Only Measurements and Digital Elevation Map Support; Southern Federal University, Engineering Sciences: Rostov Oblast, Russia, 2015; pp. 185–195. (In Russian) [Google Scholar]
  6. Sidorchuk, D.S.; Volkov, V.V. Fusion of radar, visible and thermal imagery with account for differences in brightness and chromaticity perception. Sens. Syst. 2018, 32, 14–18. (In Russian) [Google Scholar]
  7. Andreev, K.V. Optimal Trajectories for Unmanned Aerial Vehicle Tracking the Moving Targets Using Linear Antenna Array. Control Sci. 2015, 5, 76–84. (In Russian) [Google Scholar]
  8. Abulkhanov, D.; Konovalenko, I.; Nikolaev, D.; Savchik, A.; Shvets, E.; Sidorchuk, D. Neural Network-based Feature Point Descriptors for Registration of Optical and SAR Images. In Proceedings of the 10th International Conference on Machine Vision (ICMV 2017), Vienna, Austria, 13–15 November 2017; Volume 10696. [Google Scholar]
  9. Kanellakis, C.; Nikolakopoulos, G. Survey on Computer Vision for UAVs: Current Developments and Trends. J. Intell. Robot. Syst. 2017, 87, 141–168. [Google Scholar] [CrossRef] [Green Version]
  10. Konovalenko, I.; Miller, A.; Miller, B.; Nikolaev, D. UAV navigation on the basis of the feature points detection on underlying surface. In Proceedings of the 29th European Conference on Modelling and Simulation (ECMS 2015), Albena, Bulgaria, 26–29 May 2015; pp. 499–505. [Google Scholar]
  11. Karpenko, S.; Konovalenko, I.; Miller, A.; Miller, B.; Nikolaev, D. Visual navigation of the UAVs on the basis of 3D natural landmarks. In Proceedings of the Eighth International Conference on Machine Vision ICMV 2015, Barcelona, Spain, 19–20 November 2015; Volume 9875, pp. 1–10. [Google Scholar] [CrossRef]
  12. Cesetti, A.; Frontoni, E.; Mancini, A.; Zingaretti, P.; Longhi, S. A Vision-Based Guidance System for UAV Navigation and Safe Landing using Natural Landmarks. J. Intell. Robot. Syst. 2010, 57, 233–257. [Google Scholar] [CrossRef]
  13. Miller, B.; Miller, G.; Semenikhin, K. Optimization of the Data Transmission Flow from Moving Object to Nonhomogeneous Network of Base Stations. IFAC PapersOnLine 2017, 50, 6160–6165. [Google Scholar] [CrossRef]
  14. Miller, B.M.; Miller, G.B.; Semenikhin, V.K. Optimal Channel Choice for Lossy Data Flow Transmission. Autom. Remote Control 2018, 79, 66–77. [Google Scholar] [CrossRef]
  15. Miller, A.B.; Miller, B.M. Stochastic control of light UAV at landing with the aid of bearing-only observations. In Proceedings of the 8th International Conference on Machine Vision (ICMV), Barcelona, Spain, 19–21 November 2015; Volume 9875. [Google Scholar] [CrossRef]
  16. Konovalenko, I.; Miller, A.; Miller, B.; Popov, A.; Stepanyan, K. UAV Control on the Basis of 3D Landmark Bearing-Only Observations. Sensors 2015, 15, 29802–29820. [Google Scholar] [Green Version]
  17. Karpenko, S.; Konovalenko, I.; Miller, A.; Miller, B.; Nikolaev, D. Stochastic control of UAV on the basis of robust filtering of 3D natural landmarks observations. In Proceedings of the Conference on Information Technology and Systems, Olympic Village, Sochi, Russia, 7–11 September 2015; pp. 442–455. [Google Scholar]
  18. Kunina, I.; Terekhin, A.; Khanipov, T.; Kuznetsova, E.; Nikolaev, D. Aerial image geolocalization by matching its line structure with route map. Proc. SPIE 2017, 10341. [Google Scholar] [CrossRef]
  19. Pestana, J.; Sanchez-Lopez, J.L.; Saripalli, S.; Campoy, P. Vision based GPS-denied Object Tracking and following for unmanned aerial vehicles. In Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linkoping, Sweden, 21–26 October 2013. [Google Scholar] [CrossRef]
  20. Pestana, J.; Sanchez-Lopez, J.L.; Saripalli, S.; Campoy, P. Computer Vision Based General Object Following for GPS-denied Multirotor Unmanned Vehicles. In Proceedings of the 2014 American Control Conference (ACC), Portland, OR, USA, 4–6 June 2014; pp. 1886–1891. [Google Scholar]
  21. Pestana, J.; Sanchez-Lopez, J.L.; Saripalli, S.; Campoy, P. A Vision-based Quadrotor Swarm for the participation in the 2013 International Micro Air Vehicle Competition. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 617–622. [Google Scholar]
  22. Pestana, J.; Sanchez-Lopez, J.L.; de la Puente, P.; Carrio, A.; Campoy, P. A Vision-based Quadrotor Multi-robot Solution for the Indoor Autonomy Challenge of the 2013 International Micro Air Vehicle Competition. J. Intell. Robot. Syst. 2016, 84, 601–620. [Google Scholar] [CrossRef]
  23. Lowe, D.G. Object recognition from local scale-invariant features. In Proceedings of the International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; Volume 2, pp. 1150–1157. [Google Scholar]
  24. Lowe, D. Distinctive image features from scale-invariant key points. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  25. Morel, J.; Yu, G. ASIFT: A New Framework for Fully Affine Invariant Image Comparison. SIAM J. Imaging Sci. 2009, 2, 438–469. [Google Scholar] [CrossRef] [Green Version]
  26. 2012 opencv/opencv Wiki GitHub. Available online: https://github.com/Itseez/opencv/blob/master/samples/python2/asift.ru (accessed on 24 September 2014).
  27. Bishop, A.N.; Fidan, B.; Anderson, B.D.O.; Dogancay, K.; Pathirana, P.N. Optimality analysis of sensor-target localization geometries. Automatica 2010, 46, 479–492. [Google Scholar] [CrossRef]
  28. Ethan, R.; Vincent, R.; Kurt, K.; Gary, B. ORB: An efficient alternative to SIFT or SURF. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 2564–2571. [Google Scholar]
  29. Işik, Ş.; Özkan, K. A Comparative Evaluation of Well-known Feature Detectors and Descriptors. Int. J. Appl. Math. Electron. Comput. 2015, 3, 1–6. [Google Scholar] [CrossRef]
  30. Adel, E.; Elmogy, M.; Elbakry, H. Image Stitching System Based on ORB Feature-Based Technique and Compensation Blending. Int. J. Adv. Comput. Sci. Appl. 2015, 6, 55–62. [Google Scholar] [CrossRef]
  31. Karami, E.; Prasad, S.; Shehata, M. Image Matching Using SIFT, SURF, BRIEF and ORB: Performance Comparison for Distorted Images. arXiv, 2017; arXiv:1710.02726. [Google Scholar]
  32. Burdziakowski, P. Low Cost Hexacopter Autonomous Platform for Testing and Developing Photogrammetry Technologies and Intelligent Navigation Systems. In Proceedings of the 10th International Conference on “Environmental Engineering”, Vilnius, Lithuania, 27–28 April 2017; pp. 1–6. [Google Scholar]
  33. Burdziakowski, P.; Janowski, A.; Przyborski, M.; Szulwic, J. A Modern Approach to an Unmanned Vehicle Navigation. In Proceedings of the 16th International Multidisciplinary Scientific GeoConference (SGEM 2016), Albena, Bulgaria, 28 June–6 July 2016; pp. 747–758. [Google Scholar]
  34. Burdziakowski, P. Towards Precise Visual Navigation and Direct Georeferencing for MAV Using ORB-SLAM2. In Proceedings of the 2017 Baltic Geodetic Congress (BGC Geomatics), Gdansk, Poland, 22–25 June 2017; pp. 394–398. [Google Scholar]
  35. Ershov, E.; Terekhin, A.; Nikolaev, D.; Postnikov, V.; Karpenko, S. Fast Hough transform analysis: Pattern deviation from line segment. In Proceedings of the 8th International Conference on Machine Vision (ICMV 2015), Barcelona, Spain, 19–21 November 2015; Volume 9875, p. 987509. [Google Scholar]
  36. Ershov, E.; Terekhin, A.; Karpenko, S.; Nikolaev, D.; Postnikov, V. Fast 3D Hough Transform computation. In Proceedings of the 30th European Conference on Modelling and Simulation (ECMS 2016), Regensburg, Germany, 31 May–3 June 2016; pp. 227–230. [Google Scholar]
  37. Savchik, A.V.; Sablina, V.A. Finding the correspondence between closed curves under projective distortions. Sens. Syst. 2018, 32, 60–66. (In Russian) [Google Scholar]
  38. Kunina, I.; Teplyakov, L.M.; Gladkov, A.P.; Khanipov, T.; Nikolaev, D.P. Aerial images visual localization on a vector map using color-texture segmentation. In Proceedings of the International Conference on Machine Vision, Vienna, Austria, 13–15 November 2017; Volume 10696, p. 106961. [Google Scholar] [CrossRef]
  39. Teplyakov, L.M.; Kunina, I.A.; Gladkov, A.P. Visual localisation of aerial images on vector map using colour-texture segmentation. Sens. Syst. 2018, 32, 19–25. (In Russian) [Google Scholar]
  40. Kunina, I.; Gladilin, S.; Nikolaev, D. Blind radial distortion compensation in a single image using a fast Hough transform. Comput. Opt. 2016, 40, 395–403. [Google Scholar] [CrossRef]
  41. Kunina, I.; Terekhin, A.; Gladilin, S.; Nikolaev, D. Blind radial distortion compensation from video using fast Hough transform. In Proceedings of the 2016 International Conference on Robotics and Machine Vision, ICRMV 2016, Moscow, Russia, 14–16 September 2016; Volume 10253, pp. 1–7. [Google Scholar]
  42. Kunina, I.; Volkov, A.; Gladilin, S.; Nikolaev, D. Demosaicing as the problem of regularization. In Proceedings of the Eighth International Conference on Machine Vision (ICMV 2015), Barcelona, Spain, 19–20 November 2015; Volume 9875, pp. 1–5. [Google Scholar]
  43. Karnaukhov, V.; Mozerov, M. Motion blur estimation based on multitarget matching model. Opt. Eng. 2016, 55, 100502. [Google Scholar] [CrossRef]
  44. Bolshakov, A.; Gracheva, M.; Sidorchuk, D. How many observers do you need to create a reliable saliency map in VR attention study? In Proceedings of the ECVP 2017, Berlin, Germany, 27–31 August 2017; Volume 46. [Google Scholar]
  45. Aidala, V.J.; Nardone, S.C. Biased Estimation Properties of the Pseudolinear Tracking Filter. IEEE Trans. Aerosp. Electron. Syst. 1982, 18, 432–441. [Google Scholar] [CrossRef]
  46. Osborne, R.W., III; Bar-Shalom, Y. Statistical Efficiency of Composite Position Measurements from Passive Sensors. IEEE Trans. Aerosp. Electron. Syst. 2013, 49, 2799–2806. [Google Scholar] [CrossRef]
  47. Wang, C.-L.; Wang, T.-M.; Liang, J.-H.; Zhang, Y.-C.; Zhou, Y. Bearing-only Visual SLAM for Small Unmanned Aerial Vehicles in GPS-denied Environments. Int. J. Autom. Comput. 2013, 10, 387–396. [Google Scholar] [CrossRef] [Green Version]
  48. Belfadel, D.; Osborne, R.W., III; Bar-Shalom, Y. Bias Estimation for Optical Sensor Measurements with Targets of Opportunity. In Proceedings of the 16th International Conference on Information Fusion, Istanbul, Turkey, 9–12 July 2013; pp. 1805–1812. [Google Scholar]
  49. Lin, X.; Kirubarajan, T.; Bar-Shalom, Y.; Maskell, S. Comparison of EKF, Pseudomeasurement and Particle Filters for a Bearing-only Target Tracking Problem. Proc. SPIE 2002, 4728, 240–250. [Google Scholar]
  50. Pugachev, V.S.; Sinitsyn, I.N. Stochastic Differential Systems. Analysis and Filtering; Wiley: Hoboken, NJ, USA, 1987. [Google Scholar]
  51. Miller, B.M.; Pankov, A.R. Theory of Random Processes; Phizmatlit: Moscow, Russia, 2007. (In Russian) [Google Scholar]
  52. Amelin, K.S.; Miller, A.B. An Algorithm for Refinement of the Position of a Light UAV on the Basis of Kalman Filtering of Bearing Measurements. J. Commun. Technol. Electron. 2014, 59, 622–631. [Google Scholar] [CrossRef]
  53. Miller, A.B. Development of the motion control on the basis of Kalman filtering of bearing-only measurements. Autom. Remote Control 2015, 76, 1018–1035. [Google Scholar] [CrossRef]
  54. Volkov, A.; Ershov, E.; Gladilin, S.; Nikolaev, D. Stereo-based visual localization without triangulation for unmanned robotics platform. In Proceedings of the 2016 International Conference on Robotics and Machine Vision, Moscow, Russia, 14–16 September 2016; Volume 10253. [Google Scholar] [CrossRef]
  55. Ershov, E.; Karnaukhov, V.; Mozerov, M. Stereovision Algorithms Applicability Investigation for Motion Parallax of Monocular Camera Case. Inf. Process. 2016, 61, 695–704. [Google Scholar] [CrossRef]
  56. Ershov, E.; Karnaukhov, V.; Mozerov, M. Probabilistic choice between symmetric disparities in motion stereo for lateral navigation system. Opt. Eng. 2016, 55, 023101. [Google Scholar] [CrossRef]
  57. Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
  58. Torr, P.H.S.; Zisserman, A. MLESAC: A New Robust Estimator with Application to Estimating Image Geometry. Comput. Vis. Image Underst. 2000, 78, 138–156. [Google Scholar] [CrossRef] [Green Version]
  59. Zuliani, M.; Kenney, C.S.; Manjunath, B.S. The MultiRANSAC Algorithm and its Application to Detect Planar Homographies. In Proceedings of the 12th IEEE International Conference on Image Processing (ICIP 2005), Genova, Italy, 11–14 September 2005; Volume 3, pp. 2969–2972. [Google Scholar]
  60. Civera, J.; Grasa, O.G.; Davison, A.J.; Montiel, J.M.M. 1-Point RANSAC for EKF-Based Structure from Motion. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 11–15 October 2009; pp. 3498–3504. [Google Scholar]
  61. Popov, A.; Miller, A.; Miller, B.; Stepanyan, K. Application of the Optical Flow as a Navigation Sensor for UAV. In Proceedings of the 39th IITP RAS Interdisciplinary Conference & School, Olympic Village, Sochi, Russia, 7–11 September 2015; pp. 390–398, ISBN 978-5-901158-28-9. [Google Scholar]
  62. Popov, A.; Miller, A.; Miller, B.; Stepanyan, K.; Konovalenko, I.; Sidorchuk, D.; Koptelov, I. UAV navigation on the basis of video sequences registered by on-board camera. In Proceedings of the 40th Interdisciplinary Conference & School Information Technology and Systems 2016, Repino, St. Petersburg, Russia, 25–30 September 2016; pp. 370–376. [Google Scholar]
  63. Popov, A.; Stepanyan, K.; Miller, B.; Miller, A. Software package IMODEL for analysis of algorithms for control and navigations of UAV with the aid of observation of underlying surface. In Proceedings of the Abstracts of XX Anniversary International Conference on Computational Mechanics and Modern Applied Software Packages (CMMASS2017), Alushta, Crimea, 24–31 May 2017; pp. 607–611. (In Russian). [Google Scholar]
  64. Popov, A.; Miller, A.; Miller, B.; Stepanyan, K. Optical Flow as a Navigation Means for UAVs with Opto-electronic Cameras. In Proceedings of the 56th Israel Annual Conference on Aerospace Sciences, Tel-Aviv and Haifa, Israel, 9–10 March 2016. [Google Scholar]
  65. Miller, B.M.; Stepanyan, K.V.; Popov, A.K.; Miller, A.B. UAV Navigation Based on Videosequences Captured by the Onboard Video Camera. Autom. Remote Control 2017, 78, 2211–2221. [Google Scholar] [CrossRef]
  66. Popov, A.K.; Miller, A.B.; Stepanyan, K.V.; Miller, B.M. Modelling of the unmanned aerial vehicle navigation on the basis of two height-shifted onboard cameras. Sens. Syst. 2018, 32, 26–34. (In Russian) [Google Scholar]
  67. Popov, A.; Miller, A.; Miller, B.; Stepanyan, K. Estimation of velocities via Optical Flow. In Proceedings of the International Conference on Robotics and Machine Vision (ICRMV), Moscow, Russia, 14 September 2016; pp. 1–5. [Google Scholar] [CrossRef]
  68. Miller, A.; Miller, B. Pseudomeasurement Kalman filter in underwater target motion analysis & Integration of bearing-only and active-range measurement. IFAC PapersOnLine 2017, 50, 3817–3822. [Google Scholar]
  69. Popov, A.; Miller, A.; Miller, B.; Stepanyan, K. Optical Flow and Inertial Navigation System Fusion in UAV Navigation. In Proceedings of the Conference on Unmanned/Unattended Sensors and Sensor Networks XII, Edinburgh, UK, 26 September 2016; pp. 1–16. [Google Scholar] [CrossRef]
  70. Caballero, F.; Merino, L.; Ferruz, J.; Ollero, A. Vision-based Odometry and SLAM for Medium and High Altitude Flying UAVs. J. Intell. Robot. Syst. 2009, 54, 137–161. [Google Scholar] [CrossRef]
  71. Konovalenko, I.; Kuznetsova, E. Experimental Comparison of Methods for Estimation of the Observed Velocity of the Vehicle in Video Stream. Proc. SPIE 2015, 9445. [Google Scholar] [CrossRef]
  72. Guan, X.; Bai, H. A GPU accelerated real-time self-contained visual navigation system for UAVs. In Proceedings of the IEEE International Conference on Information and Automation, Shenyang, China, 6–8 June 2012; pp. 578–581. [Google Scholar]
  73. Jauffret, C.; Pillon, D.; Pignol, A.-C. Leg-by-leg Bearings-Only Target Motion Analysis Without Observer Maneuver. J. Adv. Inf. Fusion 2011, 6, 24–38. [Google Scholar]
  74. Miller, B.M.; Stepanyan, K.V.; Miller, A.B.; Andreev, K.V.; Khoroshenkikh, S.N. Optimal filter selection for UAV trajectory control problems. In Proceedings of the 37th Conference on Information Technology and Systems, Kaliningrad, Russia, 1–6 September 2013; pp. 327–333. [Google Scholar]
  75. Ovchinkin, A.A.; Ershov, E.I. The algorithm of epipole position estimation under pure camera translation. Sens. Syst. 2018, 32, 42–49. (In Russian) [Google Scholar]
  76. Le Coat, F.; Pissaloux, E.E. Modelling the Optical-flow with Projective-transform Approximation for Large Camera Movements. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 199–203. [Google Scholar]
  77. Longuet-Higgins, H.C. The Visual Ambiguity of a Moving Plane. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1984, 223, 165–175. [Google Scholar] [CrossRef]
  78. Raudies, F.; Neumann, H. A review and evaluation of methods estimating ego-motion. Comput. Vis. Image Underst. 2012, 116, 606–633. [Google Scholar] [CrossRef]
  79. Yuan, D.; Liu, M.; Yin, J.; Hu, J. Camera motion estimation through monocular normal flow vectors. Pattern Recognit. Lett. 2015, 52, 59–64. [Google Scholar] [CrossRef]
  80. Babinec, A.; Apeltauer, J. On accuracy of position estimation from aerial imagery captured by low-flying UAVs. Int. J. Transp. Sci. Technol. 2016, 5, 152–166. [Google Scholar] [CrossRef]
  81. Martínez, C.; Mondragón, I.F.; Olivares-Méndez, M.A.; Campoy, P. On-board and Ground Visual Pose Estimation Techniques for UAV Control. J. Intell. Robot. Syst. 2011, 61, 301–320. [Google Scholar] [CrossRef] [Green Version]
  82. Pachauri, A.; More, V.; Gaidhani, P.; Gupta, N. Autonomous Ingress of a UAV through a window using Monocular Vision. arXiv, 2016; arXiv:1607.07006v1. [Google Scholar]
Figure 1. The left image shows poor alignment between the template image and the image captured on board; in this case the coordinates derived from the current image are unreliable and are not used to determine the UAV coordinates. The right image shows good alignment between the template and the on-board image. Note the matching of linear elements that are not connected to each other, which demonstrates the high information capacity of this method.
Figure 2. The solid line shows the position of the vehicle obtained from the Satellite Navigation System (SNS). Blue circles mark the points where good alignment was achieved.
Figure 3. Texture segmentation results: (a) full image; (b) zoomed and contrast-enhanced image containing the coastline.
Figure 4. Landing on a runway with well-structured epipolar lines.
Figure 5. Coordinate systems of two successive frame positions used to determine the projective matrix that relates the coordinates of singular points.
Figure 6. Two successive frames used to determine the projective matrix. Red crosses mark the singular points used for the matrix calculation. The difference between the two frames is rather small, since they are separated by the time interval Δt = 1/25 s. The low resolution is due to the aircraft motion, which produces additional blurring.
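The frame-to-frame projective matrix illustrated in Figures 5 and 6 can be estimated from matched singular points. The following minimal sketch (our illustration rather than the exact on-board implementation; OpenCV and two consecutive grayscale frames are assumed) shows a typical ORB-plus-RANSAC pipeline of this kind:

```python
# Minimal sketch: frame-to-frame projective matrix from matched feature points.
# Assumes OpenCV; `prev_frame` and `next_frame` are consecutive grayscale frames.
import cv2
import numpy as np

def frame_to_frame_homography(prev_frame, next_frame, max_features=1000):
    # Detect and describe feature ("singular") points in both frames.
    orb = cv2.ORB_create(nfeatures=max_features)
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(next_frame, None)
    if des1 is None or des2 is None:
        return None

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 4:
        return None  # at least four point pairs are required for a homography

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robust RANSAC estimation rejects wrong correspondences (outliers).
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
    return H
```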
Figure 7. The solid line shows the position of the vehicle obtained from the Satellite Navigation System (SNS). The blue line shows the position of the vehicle determined with the aid of the sequence of projective matrices. One can observe the growing offset, since the estimate is not corrected by absolute position measurements.
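The growing offset visible in Figure 7 is inherent to dead reckoning: the position is reconstructed by chaining frame-to-frame shifts, so per-frame errors accumulate without bound unless an absolute correction is applied. A toy sketch (hypothetical code and numbers, loosely matching the magnitudes in Table 1) illustrates the effect:

```python
# Illustrative sketch (not the authors' implementation) of drift in a video-only track:
# the position is obtained purely by summing frame-to-frame shifts, so every per-frame
# error accumulates and is never corrected by an absolute position measurement.
import numpy as np

def integrate_shifts(shifts_m, start=(0.0, 0.0)):
    """Dead-reckon the ground-plane position from per-frame (dx, dy) shifts in metres."""
    positions = [np.asarray(start, dtype=float)]
    for s in shifts_m:
        positions.append(positions[-1] + np.asarray(s, dtype=float))
    return np.vstack(positions)

# Toy numbers: a true shift of about 17 m per 0.5 s step plus a few metres of
# per-step error makes the final position error grow with the number of frames.
rng = np.random.default_rng(0)
true_shifts = np.tile([17.0, 0.0], (100, 1))
noisy_shifts = true_shifts + rng.normal(scale=3.0, size=true_shifts.shape)
drift = np.linalg.norm(integrate_shifts(noisy_shifts)[-1] - integrate_shifts(true_shifts)[-1])
print(f"Accumulated position error after 100 steps: {drift:.1f} m")
```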
Figure 8. Tracking of the X, Y coordinates. "Sensor" denotes the estimate obtained via the projective matrices calculation.
Figure 9. Tracking of the yaw angle. The correspondence is rather good, but the flight does not contain yaw maneuvering.
Figure 10. Tracking of the Z coordinate. The correspondence is acceptable only over a very short interval at the beginning; after that the error grows rapidly.
Figure 11. Tracking of the X, Y coordinates via the optical flow (OF) algorithm. In comparison with Figure 8, the period of reliable tracking is very short.
Figure 12. Tracking of the Z coordinate via the OF algorithm. In comparison with Figure 10, the period of reliable tracking is very short.
Figure 13. Orientation angles of the UAV estimated via the projective matrices algorithm. One can observe the yaw turn performed by the aircraft between the 60th and 80th seconds of flight; in Figure 7 this corresponds to approximately 1700 m from the starting point. During the initial period the yaw angle estimate is around 85° (see also Figure 9).
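Orientation angles such as those in Figure 13 can be obtained by decomposing the frame-to-frame homography, provided that the camera intrinsics are known and the observed surface is approximately planar. The sketch below (an illustration under these assumptions, not the authors' exact procedure) uses OpenCV's homography decomposition and extracts yaw-pitch-roll angles from each candidate rotation:

```python
# Hedged sketch: candidate orientation angles from a planar frame-to-frame homography H,
# assuming the 3x3 camera intrinsic matrix K is known.
import cv2
import numpy as np

def candidate_euler_angles(H, K):
    """Decompose a planar homography and return (roll, pitch, yaw) in degrees
    for each candidate rotation produced by the decomposition."""
    _, rotations, _, _ = cv2.decomposeHomographyMat(H, K)
    angles = []
    for R in rotations:
        # ZYX (yaw-pitch-roll) Euler angles from the rotation matrix.
        yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
        pitch = np.degrees(np.arcsin(-R[2, 0]))
        roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
        angles.append((roll, pitch, yaw))
    # Up to four candidates are returned; additional constraints (e.g., INS data
    # or plane-normal consistency) are needed to pick the physically valid one.
    return angles
```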
Table 1. Analysis of the shift estimation provided by the projective matrices algorithm.

Frame Number (Δt = 0.5 s)    L2 Norm of Real Shift from INS    L2 Norm of the Error between INS and Projective Matrices
 1    17.9950    0.5238
 2    16.6470    2.2434
 3    17.4132    0.9915
 4    16.4036    3.2009
 5    17.2792    5.2980
 6    16.2951    3.7480
 7    17.7030    6.7479
 8    16.4029    4.8240
 9    17.8309    0.1152
10    16.1718    3.8627
11    17.2313    2.6757
12    15.8342    6.7456
13    17.2652    2.1774
14    15.7978    1.5271
15    17.4473    2.2599
16    15.9407    3.8367
17    17.4862    5.3902
18    16.0020    1.9714
19    17.2286    2.1162
20    15.9406    3.4193
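For convenience, the following snippet reproduces the data of Table 1 and computes simple summary statistics of the shift estimation error (a sketch for reference; the units are those of the INS shift norms):

```python
# Summary statistics for Table 1: per-step real shift reported by the INS and the
# error of the projective matrices estimate, both as L2 norms.
import numpy as np

real_shift = np.array([17.9950, 16.6470, 17.4132, 16.4036, 17.2792, 16.2951, 17.7030,
                       16.4029, 17.8309, 16.1718, 17.2313, 15.8342, 17.2652, 15.7978,
                       17.4473, 15.9407, 17.4862, 16.0020, 17.2286, 15.9406])
error = np.array([0.5238, 2.2434, 0.9915, 3.2009, 5.2980, 3.7480, 6.7479, 4.8240,
                  0.1152, 3.8627, 2.6757, 6.7456, 2.1774, 1.5271, 2.2599, 3.8367,
                  5.3902, 1.9714, 2.1162, 3.4193])

print(f"Mean shift per 0.5 s step: {real_shift.mean():.2f}")
print(f"Mean error: {error.mean():.2f}, max error: {error.max():.2f}")
print(f"Mean relative error: {100 * (error / real_shift).mean():.1f}%")
```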
