Open Access Article

New Approaches to the Integration of Navigation Systems for Autonomous Unmanned Vehicles (UAV)

Institute for Information Transmission Problems RAS, Bolshoy Karetny per. 19, build.1, Moscow 127051, Russia
Author to whom correspondence should be addressed.
Sensors 2018, 18(9), 3010;
Received: 1 June 2018 / Revised: 31 August 2018 / Accepted: 4 September 2018 / Published: 8 September 2018
(This article belongs to the Special Issue GNSS and Fusion with Other Sensors)
This article presents an overview of theoretical and experimental work on estimating the motion parameters of unmanned aerial vehicles (UAVs) by integrating video measurements from an on-board optoelectronic camera with data from the UAV's own inertial navigation system (INS). Approaches described in the literature that perform well in computer simulations, or under fairly simple conditions close to laboratory ones, reveal the considerable difficulty of adapting camera parameters to the changing conditions of a real flight. In our experiments, we applied computer simulation methods to real images and processed videos recorded during real flights. For example, we found that reference images differing greatly in scale and aspect angle from the images observed in flight make the feature-point (singular-point) methodology very difficult to use. At the same time, matching the observed and reference images using rectilinear segments, such as images of road sections and building walls, looks quite promising. In addition, we computed the projective transformation matrix from frame to frame, which, together with filtering estimates of the coordinate and angular velocities, provides additional possibilities for estimating the UAV position. Data on UAV position determination by video navigation methods, obtained during real flights, are presented. New approaches to video navigation based on matching rectilinear segments, characteristic curvilinear elements, and segmentation of textured and colored regions are demonstrated. We also show the application of the frame-to-frame projective transformation method, which yields estimates of the displacements and rotations of the vehicle and thereby supports UAV position estimation by filtering.
Thus, the aim of this work was to analyze various approaches to UAV navigation that use video data as an additional source of information about the position and velocity of the vehicle.
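The frame-to-frame projective transformation mentioned in the abstract can be illustrated with a minimal sketch. The following Python example is not taken from the paper: the function name and the plain (unnormalized) direct linear transform (DLT) formulation are our own illustrative choices. It estimates a 3×3 homography H mapping feature points in one frame to their matches in the next, which encodes the inter-frame displacement and rotation that a filtering stage can then consume:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 projective transform H with dst ~ H @ src
    from >= 4 point correspondences, using the DLT method.

    src, dst: sequences of (x, y) pixel coordinates in two frames.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in the
        # nine entries of H (row-major, treated as a vector).
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    # The solution is the right singular vector for the smallest
    # singular value of A (the approximate null space of A).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale so H[2,2] == 1
```

For example, correspondences generated by a pure image translation of (2, 3) pixels recover a homography whose last column is (2, 3, 1). In practice, feature matches from real flight video contain outliers, so a robust wrapper (e.g., RANSAC over minimal 4-point samples) would be used around this estimator.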
Keywords: UAV; videonavigation; projective geometry; feature points; Kalman filter
Konovalenko, I.; Kuznetsova, E.; Miller, A.; Miller, B.; Popov, A.; Shepelev, D.; Stepanyan, K. New Approaches to the Integration of Navigation Systems for Autonomous Unmanned Vehicles (UAV). Sensors 2018, 18, 3010.
