Open Access Article
Sensors 2018, 18(4), 1244; doi:10.3390/s18041244

Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas

Position, Location and Navigation (PLAN) Group, Department of Geomatics Engineering, Schulich School of Engineering, University of Calgary, 2500 University Drive, N.W., Calgary, AB T2N 1N4, Canada
*
Author to whom correspondence should be addressed.
Received: 27 February 2018 / Revised: 3 April 2018 / Accepted: 9 April 2018 / Published: 17 April 2018
(This article belongs to the Collection Positioning and Navigation)

Abstract

This paper presents a method of fusing Global Navigation Satellite System (GNSS) signals with the ego-motion of a robot or land vehicle estimated from an upward-facing camera, for navigation in urban environments. A sky-pointing camera is mounted on the roof of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold. First, the images from the upward-facing camera are classified (segmented) into sky and non-sky regions; a satellite that projects into a non-sky region (e.g., buildings, trees) is treated as obstructed and excluded from the final position solution. Second, a sky-pointing camera (with a field of view of about 90 degrees) is well suited to ego-motion estimation in urban areas because it does not see most moving objects (e.g., pedestrians, cars) and can therefore estimate motion with fewer outliers than a typical forward-facing camera. The GNSS and visual measurements are tightly coupled in a Kalman filter to compute the final position solution. Experimental results in a deep urban canyon demonstrate that the system provides satisfactory navigation solutions and better accuracy than GNSS-only positioning and loosely-coupled GNSS/vision integration, by 20 percent and 82 percent respectively in the worst case, even in conditions with fewer than four GNSS satellites.
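The satellite-rejection step described in the abstract can be illustrated with a minimal sketch: project each satellite's azimuth/elevation onto the upward-facing image and keep it only if it lands on a sky pixel. The equidistant fisheye model (r = f·θ), the axis conventions, and all function and parameter names below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sat_to_pixel(az_deg, el_deg, cx, cy, f):
    """Project a satellite (azimuth/elevation, degrees) onto an
    upward-facing fisheye image, assuming an equidistant model
    (radius = f * zenith angle) with principal point (cx, cy)."""
    theta = np.radians(90.0 - el_deg)       # zenith angle in radians
    r = f * theta                           # radial distance from image centre
    az = np.radians(az_deg)
    u = cx + r * np.sin(az)                 # assumed: east maps to +x
    v = cy - r * np.cos(az)                 # assumed: north maps to -y
    return int(round(u)), int(round(v))

def is_line_of_sight(sky_mask, az_deg, el_deg, cx, cy, f):
    """Return True if the satellite projects into a sky pixel of a
    binary segmentation mask (1 = sky, 0 = building/tree/other)."""
    u, v = sat_to_pixel(az_deg, el_deg, cx, cy, f)
    h, w = sky_mask.shape
    if not (0 <= v < h and 0 <= u < w):
        return False                        # outside the camera's field of view
    return bool(sky_mask[v, u])
```

A satellite near the zenith projects close to the image centre, while a low-elevation satellite projects toward the image edge, where urban masks are most often non-sky; only satellites passing this check would contribute measurements to the tightly-coupled filter.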
Keywords: visual odometry; upward-facing camera; motion estimation; satellites; GNSS; tightly-coupled integration; vehicle navigation; image segmentation; clustering algorithms
This is an open access article distributed under the Creative Commons Attribution (CC BY 4.0) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Gakne, P.V.; O’Keefe, K. Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas. Sensors 2018, 18, 1244.


