Search Results (2)

Search Parameters:
Keywords = videonavigation

20 pages, 2979 KiB  
Article
UAV Landing Based on the Optical Flow Videonavigation
by Alexander Miller, Boris Miller, Alexey Popov and Karen Stepanyan
Sensors 2019, 19(6), 1351; https://doi.org/10.3390/s19061351 - 18 Mar 2019
Cited by 30 | Viewed by 6381
Abstract
Automatic landing of an unmanned aerial vehicle (UAV) is a non-trivial task that requires solving a variety of technical and computational problems. The most important is the precise determination of altitude, especially at the final stage of the approach to the ground. With current altimeters, the measurement errors in the final phase of the descent may be unacceptably large for constructing an algorithm to control the landing manoeuvre. It is therefore desirable to have an additional sensor that makes it possible to estimate the height above the runway surface. All linear and angular UAV velocities can be estimated simultaneously, although in pixel scale, with the help of the so-called optical flow (OF) determined from the sequence of images recorded by an onboard camera. To transform them into real metric values, one must know the current flight altitude and the angular position of the camera. A critical feature of the OF is its sensitivity to the camera resolution and to the shift rate of the observed scene. During the descent phase of the flight, these parameters change, together with the altitude, by a factor of at least one hundred. Reliable application of the OF therefore requires coordinating the shooting parameters with the current altitude. However, if the altimeter fails, the altitude itself must also be estimated with the aid of the OF, so another tool for camera control is needed. One possible and straightforward approach is to change the effective camera resolution by averaging pixels in software, performed in coordination with the theoretically estimated and measured OF velocity. The article presents the results of testing such algorithms on real video sequences obtained in flights with different approaches to the runway, with simultaneous recording of telemetry and video data. Full article
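The two relationships the abstract relies on can be sketched in a few lines: the pixel-to-metric conversion for a downward-looking camera, and an altitude-coordinated pixel-averaging (binning) factor. This is a minimal illustration, not the authors' implementation; the nadir-camera geometry, the helper names, and the 100 m reference altitude are assumptions.

```python
def pixel_flow_to_ground_speed(flow_px_per_s, altitude_m, focal_px):
    """Convert pixel-scale optical flow to metric ground speed.

    Assumes a level, downward-looking (nadir) camera over flat ground:
    flow [px/s] = speed [m/s] * focal [px] / altitude [m].
    """
    return flow_px_per_s * altitude_m / focal_px


def binning_factor(altitude_m, ref_altitude_m=100.0):
    """Pixel-averaging (binning) factor that keeps the expected OF
    magnitude roughly constant as the UAV descends.

    Binning k pixels reduces the effective focal length by k, so the
    flow scales as f / (k * h); holding it constant gives k ~ ref_h / h.
    The 100 m reference altitude is an illustrative assumption.
    """
    return max(1, int(round(ref_altitude_m / max(altitude_m, 1e-3))))
```

For instance, a 200 px/s flow observed at 50 m altitude with a 1000 px focal length corresponds to a ground speed of 10 m/s, and at 25 m the binning factor relative to a 100 m reference is 4, quartering the effective resolution.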

23 pages, 5429 KiB  
Article
New Approaches to the Integration of Navigation Systems for Autonomous Unmanned Vehicles (UAV)
by Ivan Konovalenko, Elena Kuznetsova, Alexander Miller, Boris Miller, Alexey Popov, Denis Shepelev and Karen Stepanyan
Sensors 2018, 18(9), 3010; https://doi.org/10.3390/s18093010 - 8 Sep 2018
Cited by 20 | Viewed by 5193
Abstract
The article presents an overview of the theoretical and experimental work on estimating the motion parameters of unmanned aerial vehicles (UAVs) by integrating video measurements from the on-board optoelectronic camera with data from the UAV’s own inertial navigation system (INS). Various approaches described in the literature show good characteristics in computer simulations or in fairly simple, near-laboratory conditions, which demonstrates how difficult it is to adapt the camera parameters to the changing conditions of a real flight. In our experiments, we applied computer simulation methods to real images, together with processing methods for videos obtained during real flights. For example, we found that using reference images that differ strongly in scale and aspect angle from the images observed in flight makes the feature-point methodology very difficult to apply. At the same time, matching the observed and reference images using rectilinear segments, such as images of road sections and building walls, looks quite promising. In addition, in our experiments we computed the frame-to-frame projective transformation matrix, which, together with filtering estimates of the linear and angular velocities, provides additional possibilities for estimating the UAV position. Data on determining the UAV position by video navigation methods during real flights are presented. New approaches to video navigation based on matching rectilinear segments, characteristic curvilinear elements, and the segmentation of textured and coloured regions are demonstrated. The application of the frame-to-frame projective transformation method is also shown: it yields estimates of the displacements and rotations of the vehicle and thereby supports estimation of the UAV position by filtering.
Thus, the aim of the work was to analyse various approaches to UAV navigation that use video data as an additional source of information about the position and velocity of the vehicle. Full article
(This article belongs to the Special Issue GNSS and Fusion with Other Sensors)
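The frame-to-frame projective transformation mentioned in the abstract is a 3×3 homography, which can be estimated from point correspondences between consecutive frames. A minimal sketch via the standard direct linear transform (DLT) is shown below; it is not the authors' code, and a flight implementation would add point normalisation and robust (e.g. RANSAC) outlier rejection.

```python
import numpy as np


def homography_dlt(src, dst):
    """Estimate the 3x3 projective transformation (homography) mapping
    src points to dst points from >= 4 correspondences via DLT.

    src, dst: (N, 2) arrays of pixel coordinates in frame k and k+1.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in the
        # nine entries of H (stacked row-wise into the vector h).
        rows.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        rows.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    # h is the right singular vector of A with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary projective scale
```

The displacement and rotation of the camera between frames can then be read off from the estimated homography and fed, as the abstract describes, into a filter that accumulates the UAV position estimate.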
