# An Autonomous Vehicle Navigation System Based on Inertial and Visual Sensors


## Abstract


## 1. Introduction

## 2. Coordinate Systems and Kalman Filter

#### 2.1. The Reference Coordinate Systems

- i-coordinate: Earth-Centered Inertial (ECI) orthogonal reference coordinate system;
- t-coordinate: Orthogonal reference frame aligned with East-North-Up (ENU) geographic coordinate system;
- b-coordinate: Body coordinate system;
- n-coordinate: Navigation coordinate system;
- c-coordinate: Camera coordinate system;
- im-coordinate: Image coordinate system.
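The navigation equations move quantities between these frames via direction cosine matrices. As an illustrative sketch (the rotation sequence below is a common yaw-pitch-roll convention, not necessarily the exact one used in this paper), a DCM from the b-coordinate to an ENU navigation frame can be composed from Euler angles:

```python
import numpy as np

def dcm_body_to_nav(roll, pitch, yaw):
    """Direction cosine matrix from the b-coordinate (body) to the
    n-coordinate (navigation, taken here as ENU), built from Euler
    angles in radians. Rotation sequence is an assumption for
    illustration: roll about x, pitch about y, yaw about z."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])  # roll
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])  # pitch
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])  # yaw
    return Rz @ Ry @ Rx
```

Any such DCM is orthonormal with determinant 1, which is a useful sanity check when implementing frame transforms.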

#### 2.2. Kalman Filter
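A minimal discrete-time linear Kalman filter, of the kind this section builds on, has the familiar predict/update structure. The matrices F, H, Q, and R below are generic placeholders, not the specific state and measurement models of this paper:

```python
import numpy as np

class KalmanFilter:
    """Minimal discrete-time linear Kalman filter (predict/update)."""

    def __init__(self, F, H, Q, R, x0, P0):
        self.F, self.H, self.Q, self.R = F, H, Q, R  # models and noises
        self.x, self.P = x0, P0                      # state and covariance

    def predict(self):
        # Time update: propagate state and covariance through the model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        # Measurement update: correct the prediction with measurement z.
        S = self.H @ self.P @ self.H.T + self.R          # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        I = np.eye(self.P.shape[0])
        self.P = (I - K @ self.H) @ self.P
        return self.x
```

Fed repeated measurements of a constant quantity, the estimate converges toward the measurement and the covariance shrinks, which is the behavior the fusion algorithm in Section 4 relies on.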

## 3. Visual Image Processing

The line feature ${l}_{im}$ is in the im-coordinate, and its angle feature parameter is ${\theta}_{im}$. The projection of ${l}_{im}$ in the c-coordinate is ${l}_{c}$, and its angle feature parameter is ${\theta}_{c}$. According to the principle of camera imaging, the image plane is parallel to the ${x}_{c}{y}_{c}$ plane of the c-coordinate, so ${l}_{c}//{l}_{im}$ and thus ${\theta}_{c}={\theta}_{im}$. In other words, the feature angle in the c-coordinate is the same as that in the im-coordinate.

Rolling of the vehicle rotates the camera about its optical axis, so when the vehicle rolls, ${\theta}_{c}$ changes. In contrast, the pitching and heading motions of the vehicle do not change ${\theta}_{c}$. Thus, ${\theta}_{c}$ can be used to describe the vehicle rolling angle. We define the vehicle rolling angle error $\Delta {\theta}_{c}$ as the difference between ${\theta}_{c}$ and its mean value ${\overline{\theta}}_{c}$. The visual attitude error $\Delta {A}_{c}$ in the c-coordinate is then derived as
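The angle feature and the rolling-angle error described above can be sketched in a few lines. Extracting the angle from two line-segment endpoints is an assumption for illustration; the paper obtains the line feature from the processed image:

```python
import numpy as np

def line_angle(p1, p2):
    """Angle feature theta (radians) of a line segment in the image
    plane. Because the image plane is parallel to the x_c y_c plane,
    this angle equals theta_c (theta_im = theta_c)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return np.arctan2(dy, dx)

def rolling_angle_error(theta_c_samples):
    """Delta theta_c: deviation of each theta_c sample from the mean
    value theta_c_bar, as defined above."""
    theta = np.asarray(theta_c_samples, dtype=float)
    return theta - theta.mean()
```

By construction, the rolling-angle error series has zero mean, so it isolates the oscillation of the roll about its average attitude.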

## 4. The Proposed Fusion Algorithm

## 5. Experiment and Results

#### 5.1. Camera Information Pre-Processing

#### 5.2. Experimental Results and Discussions

For the GPS, the amplitude variations of latitude and longitude are on the order of $10^{-5}$° and $8 \times 10^{-5}$°, respectively. The GPS position confirms that the position estimates of the other three navigation modes are acceptable. For only-SINS, the amplitude variations of latitude and longitude are $1.4236 \times 10^{-6}$° and $6.3310 \times 10^{-7}$°, respectively. For the direct inertial-visual integrated navigation system, the amplitude variations of latitude and longitude are $9.9625 \times 10^{-7}$° and $6.7123 \times 10^{-7}$°, respectively. This position range does not differ noticeably from that of only-SINS; thus, the direct integration method cannot improve the accuracy of the navigation system. For the proposed inertial-visual integrated navigation system, the amplitude variations of latitude and longitude are $8.3885 \times 10^{-7}$° and $3.5869 \times 10^{-7}$°, respectively. The position estimate of the proposed inertial-visual integrated navigation system is more stable than those of only-SINS and the direct inertial-visual integrated navigation system. This is also reflected by the position standard deviations listed in Table 1. Figure 9 shows that the proposed inertial-visual integrated navigation system improves the accuracy of the position estimation.
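The two stability metrics used in this comparison are simple to compute from a logged position series. A minimal sketch, assuming the latitude and longitude logs are plain arrays in degrees:

```python
import numpy as np

def amplitude_variation(series):
    """Peak-to-peak range of a position component; the 'amplitude
    variation' metric used above to compare navigation modes."""
    s = np.asarray(series, dtype=float)
    return float(s.max() - s.min())

def position_std(series):
    """Standard deviation of a position component, the metric
    reported in Table 1."""
    return float(np.std(np.asarray(series, dtype=float)))
```

A smaller value of either metric on a static trial indicates a more stable position estimate, which is how the three navigation modes are ranked above.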

## 6. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

- Zhao, J.D.; Liang, B.D.; Chen, Q.X. The key technology toward the self-driving car. Int. J. Intell. Autom. Syst. **2018**, 6, 2–20.
- Liu, M.; Gao, Y.B.; Li, G.C.; Guang, X.; Li, S. An Improved Alignment Method for the Strapdown Inertial Navigation System (SINS). Sensors **2016**, 16, 621.
- Zhou, J.; Nie, X.M.; Lin, J. A novel laser Doppler velocimeter and its integrated navigation system with strapdown inertial navigation. Opt. Laser Technol. **2014**, 64, 319–323.
- Xu, F.; Fang, J.C. Velocity and position error compensation using strapdown inertial navigation system/celestial navigation system integration based on ensemble neural network. Aerosp. Sci. Technol. **2008**, 12, 302–307.
- Zega, V.; Comi, C.; Minotti, P.; Langfelder, G.; Falorni, L.; Corigliano, A. A new MEMS three-axial frequency-modulated (FM) gyroscope: A mechanical perspective. Eur. J. Mech. A Solids **2018**, 70, 203–212.
- Minotti, P.; Dellea, S.; Mussi, G.; Bonfanti, A.; Facchinetti, S.; Tocchio, A.; Zega, V.; Comi, C.; Lacaita, A.L.; Langfelder, G. High Scale-Factor Stability Frequency-Modulated MEMS Gyroscope: 3-Axis Sensor and Integrated Electronics Design. IEEE Trans. Ind. Electron. **2018**, 65, 5040–5050.
- Grant, M.J.; Digonner, M.J.F. Double-Ring Resonator Optical Gyroscopes. J. Lightwave Technol. **2018**, 36, 2708–2715.
- Yang, J.; Shi, C.W.; Yang, F.; Han, G.; Ning, J.; Yang, F.; Wang, X. Design and Simulation of a Novel Piezoelectric ALN-Si Cantilever Gyroscope. Micromachines **2018**, 9, 81.
- Li, J.; Yang, J.Y.; Zhou, W.; Yin, R.; Zhou, Q.; Wang, M. Design and fabrication of GaAs-integrated optical chip used in a fiber optic gyroscope. In Proceedings of the Integrated Optoelectronics II, Beijing, China, 18–19 September 1998; pp. 157–162.
- Li, T.; Zhang, H.P.; Gao, Z.Z.; Chen, Q.J.; Niu, X. High-Accuracy Positioning in Urban Environments Using Single-Frequency Multi-GNSS RTK/MEMS-IMU Integration. Remote Sens. **2018**, 10, 205.
- Sun, G.H.; Zhu, Z.H. Fractional order tension control for stable and fast tethered satellite retrieval. Acta Astronaut. **2014**, 104, 304–312.
- Zhang, T.; Xu, X.S. A new method of seamless land navigation for GPS/INS integrated system. Measurement **2012**, 45, 691–701.
- Hu, Y.W.; Gong, J.W.; Jiang, Y.; Liu, L.; Xiong, G.; Chen, H. Hybrid Map-Based Navigation Method for Automatic Ground Vehicle in Urban Scenario. Remote Sens. **2013**, 5, 3662–3680.
- Atia, M.M.; Hilal, A.R.; Stellings, C.; Hartwell, E.; Toonstra, J.; Miners, W.B.; Basir, O.A. A Low-Cost Lane-Determination System Using GNSS/IMU Fusion and HMM-Based Multistage Map Matching. IEEE Trans. Intell. Transp. Syst. **2017**, 18, 3027–3037.
- Chambers, A.; Scherer, S.; Yoder, L.; Jain, S. Robust multi-sensor fusion for micro aerial vehicle navigation in GPS-degraded/denied environments. In Proceedings of the American Control Conference (ACC), Portland, OR, USA, 4–6 June 2014; pp. 1892–1899.
- Huang, W.L.; Wen, D.; Geng, J.; Zheng, N.-N. Task-Specific Performance Evaluation of UGVs: Case Studies at the IVFC. IEEE Trans. Intell. Transp. Syst. **2014**, 15, 1969–1979.
- Satzoda, R.K.; Trivedi, M.M. Drive Analysis Using Vehicle Dynamics and Vision-Based Lane Semantics. IEEE Trans. Intell. Transp. Syst. **2015**, 16, 9–18.
- Vivacqua, R.; Vassallo, R.; Martins, F. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application. Sensors **2017**, 17, 2359.
- Sun, Q.; Zhang, Y.; Wang, J.; Gao, W. An Improved FAST Feature Extraction Based on RANSAC Method of Vision/SINS Integrated Navigation System in GNSS-Denied Environments. Adv. Space Res. **2017**, 60, 2660–2671.
- Qing, Y.Y.; Zhang, H.Y.; Wang, S.H. Principles of Kalman Filtering and Integrated Navigation, 2nd ed.; Northwestern University of Technology Press: Xi'an, China, 2012.
- Qin, Y.Y. Inertial Navigation, 2nd ed.; Science Press: Beijing, China, 2014.
- Corke, P. Robotics, Vision and Control: Fundamental Algorithms in MATLAB, 2nd ed.; Springer Publishing Company: New York, NY, USA, 2013.
- Lee, S.H.; Lee, S.K.; Choi, J.S. Correction of radial distortion using a planar checkerboard pattern and its image. IEEE Trans. Consum. Electron. **2009**, 55, 27–33.
- Shu, X.; Wu, X. Real-time High-Fidelity Compression for Extremely High Frame Rate Video Cameras. IEEE Trans. Comput. Imaging **2018**, 4, 172–180.
- Hu, G.; Gao, S.; Zhong, Y. A derivative UKF for tightly coupled INS/GPS integrated navigation. ISA Trans. **2015**, 56, 135–144.
- Hu, G.; Wang, W.; Zhong, Y.; Gao, B.; Gu, C. A new direct filtering approach to INS/GNSS integration. Aerosp. Sci. Technol. **2018**, 77, 755–764.

**Figure 2.** The relationships of the im-, c-, and b-coordinates. (**a**) The relationship between the im- and c-coordinates; and (**b**) the relationship between the c- and b-coordinates.

**Figure 3.** The schematic diagram of the inertial-visual integrated navigation system. The strapdown inertial navigation system (SINS) and the camera are fixed on the vehicle. When i = 1, ${P}_{0}$, ${V}_{0}$, and ${A}_{0}$ are the original position, velocity, and attitude of the vehicle, respectively; in this paper, they are provided by the Global Positioning System (GPS).

**Figure 5.** Image processing. (**a**) The original image of the camera; (**b**) the calibrated image of (**a**); (**c**) one of the original images during the experiment; (**d**) the calibrated image of (**c**) and the line feature parameter $\theta$.

**Figure 7.** Static attitude error of the integrated navigation system in which the inertial and visual sensors are fused directly.

**Figure 8.** Static attitude error of the integrated navigation system in which the inertial and visual sensors are fused as proposed.

**Figure 9.**The position estimation of GPS, only-SINS, direct inertial-visual integrated navigation system, and proposed inertial-visual integrated navigation system.

**Table 1.** The position standard deviation of GPS, only-SINS, the direct inertial-visual integrated navigation system, and the proposed inertial-visual integrated navigation system.

| | Latitude | Longitude |
|---|---|---|
| GPS | 8.490414716838 × 10^{−6} | 2.278273360829 × 10^{−5} |
| only-SINS | 2.367825000248 × 10^{−7} | 1.528532246718 × 10^{−7} |
| direct integrated | 2.039732103084 × 10^{−7} | 7.745355384118 × 10^{−8} |
| proposed integrated | 7.260776501218 × 10^{−8} | 5.653344095192 × 10^{−8} |
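Reading the Table 1 values back, the relative improvement of the proposed fusion over only-SINS can be checked in a few lines (the numbers are copied directly from the table):

```python
# Position standard deviations from Table 1: (latitude, longitude).
std = {
    "GPS": (8.490414716838e-6, 2.278273360829e-5),
    "only-SINS": (2.367825000248e-7, 1.528532246718e-7),
    "direct integrated": (2.039732103084e-7, 7.745355384118e-8),
    "proposed integrated": (7.260776501218e-8, 5.653344095192e-8),
}

lat_sins, lon_sins = std["only-SINS"]
lat_prop, lon_prop = std["proposed integrated"]

# Fraction of the only-SINS standard deviation that remains under the
# proposed fusion (smaller is better).
lat_ratio = lat_prop / lat_sins  # ~0.31
lon_ratio = lon_prop / lon_sins  # ~0.37
```

The proposed integration thus retains only about a third of the only-SINS position scatter in both components, consistent with the discussion of Figure 9.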

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Guang, X.; Gao, Y.; Leung, H.; Liu, P.; Li, G.
An Autonomous Vehicle Navigation System Based on Inertial and Visual Sensors. *Sensors* **2018**, *18*, 2952.
https://doi.org/10.3390/s18092952
