Article

Optical and Mass Flow Sensors for Aiding Vehicle Navigation in GNSS Denied Environment

1
Department of Geomatics Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
2
Department of Electrical and Computer Engineering, Port-Said University, Port-Said 42523, Egypt
3
Public Works Department, Ain Shams University, Cairo 11517, Egypt
*
Authors to whom correspondence should be addressed.
Sensors 2020, 20(22), 6567; https://doi.org/10.3390/s20226567
Submission received: 18 October 2020 / Revised: 5 November 2020 / Accepted: 14 November 2020 / Published: 17 November 2020

Abstract

Autonomous vehicles have attracted considerable research interest in navigation, environmental perception, and control. Global Navigation Satellite System/Inertial Navigation System (GNSS/INS) integration is a core component of any vehicle navigation system. However, GNSS is limited in some operating scenarios, such as urban regions and indoor environments, where the GNSS signal suffers from multipath or outage. The INS standalone navigation solution, in turn, degrades over time due to INS errors. Therefore, a modern vehicle navigation system integrates additional sensors to aid the INS and mitigate its drift during GNSS signal outages. These aiding sensors, however, face challenges related to high price, high computational cost, and environmental and weather effects. This paper proposes an integrated aiding navigation system for vehicles in an indoor environment (e.g., underground parking). The proposed system integrates optical flow and multiple mass flow sensors to aid a low-cost INS, providing the navigation extended Kalman filter (EKF) with forward velocity and heading change updates to enhance the vehicle navigation. The optical flow is computed from frames taken by a consumer portable device (CPD) camera mounted in the upward-looking direction, which avoids moving objects in front of the camera and exploits the typical features of underground parking structures and tunnels, such as ducts and pipes. The multiple mass flow sensor measurements are modeled to provide forward velocity information. Moreover, a mass flow differential odometry is proposed in which the vehicle heading change is estimated from the multiple mass flow sensor measurements. The integrated aiding system is applicable to both unmanned aerial vehicle (UAV) and land vehicle navigation; the experimental results presented here are for a land vehicle, where the CPD is integrated with the mass flow sensors to aid the navigation system.

1. Introduction

The Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS) integration is the main integrated navigation component in land vehicles. GNSS provides reliable long-term estimates of position and velocity [1]; however, open-sky conditions must be fulfilled [2]. INS provides accurate short-term position, velocity, and attitude estimates for navigation applications [3]. However, this navigation information degrades over time because of INS drift due to sensor (accelerometer and gyroscope) errors [1], especially when using a low-cost MEMS-based INS [4]. Therefore, GNSS/INS integration overcomes the drawbacks of each sensor used in standalone mode and provides a better navigation estimate [5]: GNSS provides position and velocity updates to the navigation filter, while the INS is responsible for the prediction stage in the integration filter [6].
GNSS/INS integration may suffer in some operating environments, such as urban canyons, tunnels, and underground parking [7], where the GNSS signals are degraded or blocked for a long time [8] and the INS alone provides the navigation solution, which deteriorates after a short time because of accelerometer and gyroscope drifts and biases [9]. For example, Equation (1) shows the position error due to the INS biases.
δP = (1/2)·b_a·t² + (1/6)·b_ω·g·t³
where δP is the position error due to the accelerometer bias (b_a) and the gyroscope bias (b_ω), g is the gravitational acceleration, and t is the time of operation in standalone mode.
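The quadratic and cubic growth of the standalone position error in Equation (1) can be evaluated numerically, as in the short sketch below; the bias values used are illustrative assumptions for a MEMS-grade sensor, not figures from the experiment.

```python
import numpy as np

def ins_position_drift(b_a, b_w, t, g=9.80665):
    """Standalone INS position error growth, Equation (1):
    delta_P = (1/2)*b_a*t^2 + (1/6)*b_w*g*t^3

    b_a : accelerometer bias [m/s^2]
    b_w : gyroscope bias [rad/s]
    t   : standalone operation time [s]
    """
    return 0.5 * b_a * t**2 + (1.0 / 6.0) * b_w * g * t**3

# Illustrative MEMS-grade biases (assumed values, not the paper's):
drift_190s = ins_position_drift(b_a=0.05, b_w=np.deg2rad(0.01), t=190.0)
```

The cubic term dominates for long outages, which is why even a small gyroscope bias makes the unaided solution unusable after a few minutes.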
Therefore, the INS should be aided by other sensors to reduce the drift of the final navigation solution. Odometers [10], magnetometers [11], ultrasonic sensors [12], light detection and ranging (LiDAR) [13], radio detection and ranging (RADAR) [14], and cameras [15] are examples of sensors that can be used to aid the INS. Moreover, land vehicle motion constraints can also be applied to aid the INS during GNSS signal outages [16]; Non-holonomic constraints (NHC), Zero velocity UPdaTe (ZUPT), and Zero Integrated Heading Rate (ZIHR) [17] are examples of such constraints. Finally, maps may be used to aid land vehicle navigation [18] and reduce the large INS drift. However, each of these sensors comes with drawbacks.
The odometer is the most common sensor providing velocity information for land vehicles. Unfortunately, several errors affect the precision of the forward velocity estimated with a regular encoder; the wheel, the road surface, and the encoder itself are three sources of odometer errors. Wheel misalignment and unequal wheel diameters are examples of wheel-related errors [19]. The road surface can also affect the estimated forward velocity, for example when surfaces become slippery due to ice or rain, and mountainous roads may cause wheel slipping and skidding [20]. Finally, the encoder resolution and sampling rate may limit the precision of the estimated velocity [19].
A magnetometer aids the INS by providing heading updates that mitigate the heading drift during GNSS signal outages. However, the estimated heading may be affected by magnetic interference from the surrounding environment [21], especially in indoor scenarios (e.g., underground parking) for land vehicles. Ultrasonic sensors are typically used for obstacle avoidance [22] in land vehicles; however, [23] used the ultrasonic sensor as an aiding sensor by providing velocity and change-of-heading updates [24] to the navigation filter. Nevertheless, the sensor installation process and the precision of the estimated velocity and heading change are the major drawbacks of using ultrasonic sensors in land vehicle navigation.
LiDAR is used for environmental perception and navigation in land vehicles [25]. Unfortunately, its major drawbacks are its high price, its computational and processing cost, and the effects of the surrounding environment (i.e., LiDAR measurements are affected by dynamic objects), which lead to mismatching issues; moreover, LiDAR data are degraded by rainy weather. RADAR is usually used for collision avoidance [26] and lane detection [27] as well as navigation in land vehicles, though high computational cost and high power consumption are major RADAR problems.
In indoor scenarios, beacon-based navigation systems [28,29] may be used as aiding sources for land vehicles [30]. However, the additional hardware setup for the beacon transmitters and receivers is one of the main drawbacks of such systems. Vision sensors are used in land vehicles for environmental awareness, and vision can also contribute to the navigation solution either in a vision-based [31] or a vision-aided navigation system [16]. Visual odometry (VO) is based on estimating the camera motion between successive images [32] and is implemented using a single vision sensor or stereo cameras [33].
Optical flow is one of the VO approaches and is inspired by the flight navigation of birds and insects [34]. Optical flow sensors are typically employed for platform velocity estimation. Previous studies integrated VO with INS [35] and/or magnetometers [34], or fused it with motion constraints to aid the INS [16]. VO can also be integrated with maps; for example, [36] proposed a geo-localization land vehicle system based on visual odometry and public-domain map sources for GNSS-denied environments. A similar land vehicle navigation system for the urban environment was proposed by [37], where stereo VO using weighted non-linear estimation is fused with digital maps, and probabilistic map matching is used to limit the VO errors. A VO using the rear parking camera of land vehicles, integrated with GNSS to provide an enhanced navigation solution, is proposed by [38]. Reference [39] proposed a tightly coupled integration between GNSS and visual information from a single sky-pointing camera; the visual information is used to detect open-sky conditions for GNSS and to estimate the ego-motion of the vehicle in non-open-sky regions. That VO is based on estimating the essential matrix, where oriented FAST and rotated BRIEF (ORB) is used for feature extraction and matching, and the random sample consensus (RANSAC) algorithm is applied for outlier rejection.
Another vision-aided inertial navigation system, targeting UAV applications, is presented in [40]; the approach is based on optical flow and Gaussian process regression (GPR), a machine learning technique, to enhance the estimation of the vehicle velocity. A real-time incremental training session runs during GNSS availability; when GNSS becomes unavailable, the GPR then attempts to correct the drift in both the INS and the VO system.
There are some challenges for VO in land vehicle navigation. A camera mounted downwards suffers from a lack of features because of its narrow field of view, in addition to shadowing problems, while a camera facing the vehicle's forward direction suffers from dynamic objects, which lead to false matching. Therefore, a CPD camera facing the upward direction solves some of these challenges regarding the narrow field of view, shadowing, and moving objects, especially in the indoor environment. Moreover, the ceilings of such environments (underground parking and tunnels) contain many features, such as ducts and pipelines, which enhance the feature detection and matching process. Finally, vision sensors are not affected by weather conditions in indoor scenarios.
The differential wheel odometry concept has been employed in much robotics and vehicle research [41]. The idea is based on the vehicle model, where the change of heading is estimated from the velocities measured by multiple sensors mounted along the lateral (transverse) axis of the vehicle or robot. Different studies have implemented differential wheel odometry with different sensors: [42] used multiple RADAR sensors, [23] used dual ultrasonic sensors, and [43,44] used the anti-lock braking system (ABS) to provide velocity and heading information to the navigation system. The CAN (Controller Area Network) bus may provide useful information on the vehicle dynamics, such as forward velocity and steering angle data; however, commercial on-board diagnostics (OBD-II) provides the velocity information but does not typically provide the steering angle unless additional customized hardware and software are developed [45]. Previous studies have used consumer portable devices (CPD) in many land vehicle applications, such as monitoring driving conditions, evaluating road surface quality [46], telematics [47], roadside sign recognition [48], lane tracking [49,50], mapping [51], and navigation [52]. CPDs have been used to provide a full navigation solution, as in [53], which utilized the iPhone 4 inertial sensors along with NHC for land vehicle navigation state estimation. Other studies used CPDs to aid land vehicle navigation, as in [54], which mounted a CPD on the steering wheel of a land vehicle to estimate the steering wheel angle from the CPD accelerometers and compute the heading change, used as an update in the navigation filter to aid the low-cost INS during GNSS signal outages.
Therefore, this paper investigates a new configuration of a typical aiding sensor (an up-looking camera) together with a new aiding sensor (the mass flow meter) to assist the low-cost INS during GNSS signal outages and improve the land vehicle navigation solution.
Section 2 gives an overview of the proposed integrated aiding navigation system, which is composed of a monocular camera and mass air flow sensors, and then illustrates the integration scheme along with the different integration techniques (measurement-based and federated fusion). Section 3 discusses the experimental results, where the velocity and heading change estimation results are described along with the navigation solution. Finally, conclusions are drawn.

2. System Overview

An integrated system is proposed to aid the INS during GNSS signal outages, especially in indoor scenarios such as underground parking and tunnels. The aiding system is composed of a CPD camera and multiple mass flow meters that provide the navigation filter with both velocity and change-of-heading information. The next subsections describe the aiding sensors used in the proposed integrated system in detail.

2.1. Monocular Visual Odometry

A VO is proposed using an up-looking CPD camera, where the X and Y optical flows are extracted through feature detection with the Speeded Up Robust Features (SURF) detector, and the M-estimator Sample Consensus (MSAC) algorithm is applied for outlier detection.
The camera is mounted in the up-looking direction, parallel to the ceiling. This setup keeps dynamic objects out of the camera scene and simplifies the velocity estimation by avoiding roll and pitch compensation. It exploits ceiling features such as pipes and ducts but suffers from abrupt illumination changes due to the lights. The optical flow equations are described in [55], as shown in Equation (2).
p = (f/Z)·P
where P is the space point with coordinates (X, Y, Z), p is the image point with coordinates (x, y, f), and f is the focal length of the camera. Z is the range between the camera and the ceiling of the underground parking or tunnel, which may be estimated using an ultrasonic sensor. The image feature displacements are multiplied by the observation rate to determine the optical flow vectors. MSAC is used to choose a single representative optical flow vector (u, v) based on observation consensus. The velocity is then calculated by Equation (3), where s is the pixel size.
V_VO = (s/f)·u·Z
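A minimal sketch of Equation (3), assuming the flow component u is expressed in pixels per second and that Z, s, and f are known. The helper names are this sketch's own, and the componentwise median is used here only as a crude stand-in for the MSAC consensus step described above.

```python
import numpy as np

def representative_flow(flows):
    """Crude stand-in for MSAC: the componentwise median of the
    per-feature optical flow vectors suppresses outlier flows."""
    return np.median(np.asarray(flows), axis=0)

def optical_flow_forward_velocity(u_px_per_s, pixel_size_m, focal_len_m, range_m):
    """Forward velocity from the representative flow, V_VO = (s/f)*u*Z.

    u_px_per_s   : x-component of the consensus optical flow [pixels/s]
    pixel_size_m : physical pixel size s [m]
    focal_len_m  : camera focal length f [m]
    range_m      : distance Z from the camera to the ceiling [m]
    """
    return (pixel_size_m / focal_len_m) * u_px_per_s * range_m
```

Because V_VO scales linearly with Z, any error in the measured camera-to-ceiling range propagates proportionally into the velocity estimate.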

2.2. Mass Flow Sensors

The mass flow sensor measures the flow rate of air and gases and is typically used in laboratory and medical applications. In this paper, multiple mass flow sensor measurements are processed to estimate the forward velocity and the change of heading of a land vehicle in order to aid the INS in indoor environments, such as tunnels and underground parking, where the GNSS signal is not available. This section is divided into two subsections: the first describes the forward velocity estimation using the mass flow sensors, while the second illustrates the land vehicle heading estimation using the proposed mass flow differential wheel odometry.

2.2.1. Mass Flow Velocity Estimation

The relation between the multiple mass flow sensor measurements and the velocity computed from a reference tactical-grade INS is estimated through a linear regression model, as shown in Figure 1. The mass flow velocity regression model is built in an indoor environment to avoid the wind effects present in outdoor scenarios.
For indoor scenarios, the multiple mass flow sensors use the predetermined regression model to estimate the land vehicle forward velocity, aiding the low-cost INS to mitigate its large drift. Figure 2 exhibits the forward velocity estimation by multiple mass flow sensors and its integration with the low-cost INS during GNSS signal outage.
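The regression step can be sketched as a first-order fit; the calibration readings and reference velocities below are synthetic, illustrative values assumed for this sketch, not data from the experiment.

```python
import numpy as np

# Synthetic calibration data: mass flow readings [slm] against the
# reference forward velocity [m/s] from the tactical-grade INS.
flow_slm = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
v_ref = np.array([0.1, 2.1, 4.0, 6.2, 7.9, 10.1])

# First-order regression model: v = scale * flow + bias
scale, bias = np.polyfit(flow_slm, v_ref, 1)

def mass_flow_velocity(flow):
    """Map a raw mass flow reading [slm] to forward velocity [m/s]
    via the predetermined linear regression model."""
    return scale * flow + bias
```

In the proposed system this model is fitted once during GNSS availability and then applied whenever the vehicle is indoors.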

2.2.2. Mass Flow Heading Change Estimation

The change of heading computation is based on the differential wheel odometry, which depends on the vehicle model concept [56]. Figure 3 and Equations (4)–(6) describe the heading change computation using multiple mass flow sensors.
V_MF,l = V·cos(δ) − (Spacing/2)·(Heading change)
V_MF,r = V·cos(δ) + (Spacing/2)·(Heading change)
Heading change = (V_MF,r − V_MF,l) / Spacing
where the velocity (V) and the rotation angle (δ) are defined at the vehicle's center of gravity.
The land vehicle heading change is estimated by differencing the velocities computed from the left and right mass flow sensors and dividing by the spacing between them, which is determined by linear measurement. However, there are two sources of error in the heading change estimation: the two mass flow velocities and the spacing distance. Therefore, the system is calibrated against a tactical-grade INS through linear regression to compute the bias and the scale factor for the heading computation, as shown in Figure 4.
The scale factor and the bias are used to compensate for the errors of the heading change estimation using the proposed differential mass flow odometry. Figure 5 exhibits the flowchart of the differential mass flow odometry to aid the low-cost INS during GNSS signal outage.
It is worth mentioning that the Z gyroscope is integrated with the mass flow sensors in estimating the land vehicle change of heading as it provides a straight motion constraint to the proposed estimating method with the same concept that was described by [23]. Figure 6 shows the flowchart of the straight motion constraint using Z gyroscope.
On the one hand, when the Z gyroscope angular rate measurements are less than a certain threshold, then the vehicle is in a nearly straight motion, and there is no need to acquire any heading information from the multiple mass flow sensors. On the other hand, when the gyroscope measurements exceed this threshold, then the vehicle is in a turn motion state, and the proposed differential mass flow odometry provides the navigation filter with the change of heading updates.
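The differential mass flow odometry of Equation (6), together with the Z-gyroscope straight-motion gating just described, can be sketched as follows; the function names and the default gating threshold are assumptions of this sketch, not values from the paper.

```python
import numpy as np

def mf_heading_rate(v_left, v_right, spacing, scale=1.0, bias=0.0):
    """Heading change rate from the left/right mass flow velocities
    (Equation (6)), corrected by a calibrated scale factor and bias."""
    return scale * (v_right - v_left) / spacing + bias

def heading_update(gyro_z, v_left, v_right, spacing, gate=np.deg2rad(0.5)):
    """Straight-motion gating with the Z gyroscope: below the threshold
    the vehicle is assumed to move straight and no differential-odometry
    heading update is issued; above it, the turn rate is computed."""
    if abs(gyro_z) < gate:
        return 0.0  # straight motion: zero heading change
    return mf_heading_rate(v_left, v_right, spacing)
```

The gating avoids feeding the filter with noisy differential measurements when the vehicle is driving straight, where the velocity difference between the two sensors is dominated by noise.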

2.3. Integration Scheme

Two sensor integration techniques are implemented: measurement integration and federated integration. On one side, measurement integration is based on fusing the aiding sensors' observations with the INS in one navigation filter, the extended Kalman filter (EKF).
On the other side, the federated integration scheme depends on integrating the navigation solution where each aiding sensor is fused separately with the INS through banks of KF to provide a navigation solution and then these solutions are fused through weighted average least square adjustment (LSA) to determine one integrated solution.
During GNSS availability, GNSS/INS loosely coupled integration is used to estimate the navigation solution through EKF. On the other hand, the proposed aiding navigation system provides the navigation filter with the velocity and heading change updates to aid the low-cost INS during GNSS signal outage.
EKF states include the navigation states’ errors as well as the INS sensor error states (biases and scale factors). The error states vector δx consists of 21 states as follows:
δx(1×21) = [δP(1×3)  δv(1×3)  δα(1×3)  bias_a(1×3)  bias_g(1×3)  SF_a(1×3)  SF_g(1×3)]
where δP, δv, and δα are the position, the velocity, and the attitude error states, respectively. biasa and biasg are the biases of the accelerometers and the gyroscopes, respectively. Finally, SFa and SFg are the scale factor of the accelerometers and gyroscopes, respectively.
The main KF phases are the prediction and update stages [57]. The system model defines the time evolution of the navigation states and drives the prediction stage, while the observation model provides the updates to the navigation filter.
The system and the prediction stages equations are described in the following equations [57].
The system stage:
ẋ(t) = F(t)·x(t) + G(t)·w(t)
x_(k+1) = Φ_(k,k+1)·x_k + w_k
Φ_(k,k+1) = I + F·Δt
Q_k = E(w_k·w_kᵀ)
The prediction stage:
x_k(−) = Φ_(k,k−1)·x_(k−1)(+)
P_k(−) = Φ_(k,k−1)·P_(k−1)(+)·Φ_(k,k−1)ᵀ + Q_(k−1)
where ẋ is the rate of change of the state vector, F is the dynamics matrix, x is the error state vector, G is the shaping matrix, and w_k is the white noise. Φ_(k,k+1) is the transition matrix, I is the identity matrix, Δt is the time interval, and Q_k is the process noise matrix, which describes the uncertainty of the system model. Finally, P_k is the state covariance matrix; (−) refers to predicted elements, and (+) refers to updated elements [57].
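The system and prediction stage equations above can be sketched compactly; the function name and the first-order transition-matrix approximation Φ ≈ I + F·Δt follow the text, while the test dimensions are arbitrary.

```python
import numpy as np

def ekf_predict(x, P, F, Q, dt):
    """EKF prediction stage: propagate the error state and its covariance.

    x, P : error state vector and its covariance matrix
    F    : continuous-time dynamics matrix
    Q    : process noise covariance
    dt   : propagation interval [s]
    """
    n = x.size
    Phi = np.eye(n) + F * dt        # first-order transition matrix
    x_pred = Phi @ x                # predicted error state
    P_pred = Phi @ P @ Phi.T + Q    # predicted covariance
    return x_pred, P_pred
```

Between updates, repeated calls to this step inflate P through Q, reflecting the growing uncertainty of the unaided INS.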
The observation model is described in Equations (13) and (14)
z_k = H_k·x_k + η_k
R_k = E(η_k·η_kᵀ)
where z_k is the observation vector, H_k is the design matrix, and η_k is the measurement noise. R_k is the covariance matrix of the measurement noise, which describes the uncertainty of the observations.
When the forward velocity updates vb=[vMF 0 0]T (for example mass airflow velocity updates) are applied to the navigation filter, the observation model and the design matrix are described in Equations (15) and (16)
δz_(V_MF) = (C_b^l)⁻¹·v^l − v^b
H_(V_MF)(3×21) = [0(3×3)  (C_b^l)⁻¹(3×3)  (C_b^l)⁻¹·[v^l×](3×3)  0(3×12)]
where v^b is the velocity update vector, v_MF is the forward velocity estimated from the mass flow sensors, and C_b^l is the rotation matrix from the body frame to the local-level frame. [v^l×] is the skew-symmetric matrix of the velocity measured in the local-level frame.
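A sketch of assembling the design matrix of Equation (16) for the 21-state vector, with the state ordering following Equation (7); the function names, and the use of the transpose as the rotation-matrix inverse, are this sketch's assumptions.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v x] of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def velocity_design_matrix(C_bl, v_l):
    """Design matrix H for the forward-velocity update (Equation (16)).

    C_bl : body-to-local-level rotation matrix C_b^l
    v_l  : velocity expressed in the local-level frame
    """
    C_lb = C_bl.T                 # inverse of a rotation matrix is its transpose
    H = np.zeros((3, 21))
    H[:, 3:6] = C_lb              # velocity-error block
    H[:, 6:9] = C_lb @ skew(v_l)  # attitude-error block
    return H
```

The trailing 3×12 zero block means the velocity update does not directly observe the IMU bias and scale-factor states; they are corrected only through their correlation with the velocity and attitude errors in P.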
On the other side, the heading update observation model and the design matrix are explained in Equations (17) and (18).
δz_(A_MF) = A_INS − A_MF
H_(A_MF)(1×21) = [0(1×3)  0(1×3)  [0 0 1]  0(1×12)]
where AINS and AMF are the headings from the INS and the Mass Flow sensors differential odometry respectively.
The update stage is illustrated in Equations (19)–(21) [57].
K_k = P_k(−)·H_kᵀ·[H_k·P_k(−)·H_kᵀ + R_k]⁻¹
x_k(+) = x_k(−) + K_k·[z_k − H_k·x_k(−)]
P_k(+) = [I − K_k·H_k]·P_k(−)
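Equations (19)-(21) can be sketched as a standard EKF update; the variable names are assumptions of this sketch.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """EKF update stage (Equations (19)-(21))."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain, Eq. (19)
    x_new = x + K @ (z - H @ x)               # state update, Eq. (20)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P  # covariance update, Eq. (21)
    return x_new, P_new
```

With equal prior and measurement uncertainty, the gain is 0.5 and the update splits the difference between prediction and observation, as the test below illustrates for a one-state filter.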
The proposed aiding navigation system consists of an up-looking CPD camera, multiple mass flow sensors, and in-vehicle sensors (the odometer). The CPD camera provides the filter with the forward velocity through the optical flow concept, while the multiple mass flow sensors provide both the velocity and the heading change of the land vehicle. Finally, the vehicle sensors provide a velocity update through the odometer. These updates aid the low-cost INS to reduce its drift in GNSS-denied environments, specifically indoor scenarios such as tunnels and underground parking. Figure 7 exhibits the flowchart of the proposed aiding system based on the measurement integration scheme.
Figure 8 shows the proposed aiding navigation system based on the federated integration scheme. Each aiding sensor is integrated separately with the INS through an EKF to provide a navigation solution; these navigation solutions, along with their precisions (P matrices), are then fused through a weighted-average LSA.
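A minimal sketch of the weighted-average least-squares fusion, assuming each filter in the bank reports its state estimate and covariance and that the fusion weight is the inverse covariance (an assumption of this sketch).

```python
import numpy as np

def federated_fuse(solutions, covariances):
    """Fuse per-sensor navigation solutions by inverse-covariance
    weighted least squares (weighted-average LSA).

    solutions   : list of state vectors, one per sensor/INS filter
    covariances : list of matching covariance matrices (P matrices)
    """
    weights = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(weights))                       # fused covariance
    x_fused = P_fused @ sum(W @ x for W, x in zip(weights, solutions))
    return x_fused, P_fused
```

A solution reported with a large covariance contributes little to the fused estimate, so a degraded aiding sensor is automatically de-weighted.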
The mass flow sensor measurements are modeled using the tactical-grade INS at the outset; that is, the mass flow regression models are created once. For a typical indoor scenario, the CPD camera velocity is regressed against the mass flow estimated velocities for some time before the navigation filter is provided with the optical flow speed, i.e., the optical flow velocity bias and scale factor are estimated from the mass flow sensors for some time, and then the filter is fed the regressed optical flow speed. It is essential to mention that the mass flow regression models could be enhanced using the final integrated navigation solution by feeding the integrated forward velocity and heading change backward from the EKF.

3. Experimental Results

A real data set was collected in the University of Calgary region, where part of the trajectory passed through underground parking. A Pixhawk 2 board was used, which consists of an InvenSense MPU-9250 IMU with a 50 Hz data rate and a u-blox GNSS receiver with a 5 Hz data rate.
Two mass flow sensors of model SFM3000 and an iPhone 6 camera with an up-looking setup were used in the test. The iPhone 6 camera is a Sony Exmor RS with 1.471 μm pixel size, 29.89 mm focal length, 1920 × 1080 pixel video resolution, and 30 frames per second. The mass flow sensor measures flow rates from −200 slm to +200 slm (slm is the standard liter per minute, a unit of volume flow rate of air corrected to standard temperature and pressure conditions) with an operating temperature from −20 °C to +80 °C. Figure 9 exhibits the mass flow sensor used in the experiment.
The mass flow sensors were mounted on the roof of the car with a spacing of 0.99 m between them. A reference navigation system was used in the experiment to form the linear regression of the mass flow sensors and to evaluate the performance of the proposed aiding navigation system. The reference system is based on SPAN technology, consisting of a NovAtel OEMV GNSS receiver and a tactical-grade iMAR FSAS IMU. The tactical-grade IMU gyroscope and accelerometer performances are as follows: the angular random walk is 0.1°/√hr, and the rate bias is less than 0.75°/hr; the accelerometer bias is 1.0 mg, and the scale factor is 300 PPM for both sensors. The gyroscope input range is ±500°/s, while the accelerometer range is ±5 g. The position performance is around 4 m RMS, and the velocity performance is 0.15 m/s RMS for a 60 s GNSS signal outage. A Uni-link Mini ELM327 OBD-II Bluetooth scanner tool is used to access the regular odometer data. It is worth mentioning that the SPAN GNSS/INS data were post-processed through a loosely coupled scheme with an EKF employing NHC and ZUPT as updates; moreover, forward and backward smoothing was implemented to estimate the reference solution used to evaluate the proposed navigation system.

3.1. Velocity Estimation Results

The proposed aiding navigation system is based on estimating the forward velocity from different aiding sensors (CPD camera and mass flow sensors). The forward velocity estimated by the CPD camera is based on the optical flow, as discussed before, where the vertical distance (Z) between the CPD camera and the underground parking ceiling is pre-surveyed using linear tape measurements. This range may be measured using any ranging sensor such as an ultrasonic sensor.
The mass flow sensor velocity is based on a linear regression model estimating the relation between the mass flow sensor measurements and the reference velocity from the tactical-grade INS, as shown in Figure 10.
Figure 11 shows the velocity estimated from different aiding sensors and the reference velocity determined from the tactical grade INS in the underground parking.
The difference between the forward velocity estimated by the VO and mass flow sensors and the SPAN reference velocity is calculated to evaluate the proposed methods. The root mean square error (RMSE) of the forward velocity estimation is 0.62 m/s and 0.27 m/s for the VO and the mass flow sensors methods, respectively.

3.2. Heading Change Estimation Results

The heading change is estimated using multiple mass flow sensors. The proposed method is based on the differential odometry concept, as described before in Equation (6).
Figure 12 exhibits the heading change estimated by the proposed method and the reference heading change computed from the SPAN tactical grade INS.
The difference between the heading change estimated by the mass flow sensors and the SPAN heading change is calculated to assess the proposed differential mass flow odometry method. The RMSE of the heading change estimation is 2.9°/s.

3.3. Navigation States Estimation Results

Loosely coupled GNSS/INS integration is implemented to estimate the navigation solution for the trajectory. The GNSS signal outage lasted 190 s while the vehicle was inside the underground parking. The position RMSE of the INS standalone solution is around 1.98 km. It is worth mentioning that the low-cost INS standalone solution was computed without any accelerometer or gyroscope calibration and without the aid of motion constraints such as Non-Holonomic Constraints (NHC).
The navigation states are estimated using different updates to investigate their effect on the final navigation solution, as shown in Figure 13.
The velocity updates are computed from the VO, the mass flow meters, and an odometer based on the measurement fusion technique. The difference between the reference trajectory and the navigation solution estimated with each set of updates is computed to calculate the position RMSE and assess the impact of each update on the final result. Table 1 reports the position RMSE of the navigation solutions obtained with the various updates using the measurement fusion technique.
Table 1 shows that aiding the INS with the proposed velocity and heading change updates provides a more reliable navigation solution than aiding the INS with the regular odometer alone during GNSS signal outages; the enhancement when using the proposed aiding system reaches around 35% compared with aiding the INS with the typical odometer.
The Federated fusion technique is implemented to estimate different navigation solutions using various updates to show the difference between this kind of navigation solution integration and the measurement fusion method. Table 2 shows the position RMSE of different solutions using the federated fusion method.
Table 1 and Table 2 show that the measurement fusion technique provides a better navigation solution than the federated one. The performance difference between the measurement integration and federated integration schemes is mainly due to the mechanization correction step adopted in the measurement integration scheme: any misestimation of the position at an early epoch is propagated to subsequent epochs, which induces a bias effect. Figure 14 exhibits the trajectory of the INS/velocity/heading change updates using both the measurement and federated fusion methods.

4. Conclusions

A multi-modal aiding navigation system based on optical and mass air flow is proposed to aid the low-cost INS in autonomous vehicle navigation for the indoor environment. The proposed system consists of an up-looking camera (CPD) and multiple mass flow sensors where the camera provides forward velocity information through VO while mass flow sensors provide the navigation filter with both velocity and change of heading updates.
The up-looking camera overcomes the lack of features issues for indoor environment in addition to avoiding the dynamic objects, which makes the features detection and matching more reliable. Redundant velocity information for the land vehicle navigation from different sensors is implemented for fault detection and identification to overcome some of the odometers’ drawbacks.
The proposed aiding system provides the heading change information, which is not affected by magnetic interference as the case of the magnetometers. Moreover, the estimated velocity from the mass flow sensors is based on the quantity of air that passes through the sensors due to the land vehicle motion and therefore overcomes the slipping and skidding problems facing the regular wheel odometer and therefore the mass flow sensors better reflect the car motion.
The proposed aiding navigation system can be applied to both land vehicles and unmanned aerial vehicles; the up-looking camera is particularly suited to indoor rescue missions. The mass flow sensors provide both velocity and heading change at a high data rate of up to 120 Hz, with good velocity accuracy and heading change performance, which is helpful for applications involving harsh dynamics.
The cost of the mass air flow sensors is relatively high compared with the other sensors (around 200 dollars), although it could be reduced if they were mass-produced for land vehicle navigation. Moreover, extreme lighting variations on the ceiling may deteriorate the optical flow accuracy.
A land vehicle experiment was conducted, and the results showed that the proposed aiding system dramatically enhances the navigation solution compared with the INS stand-alone solution. Two fusion schemes were implemented, the measurement integration scheme and the federated scheme, and the results showed that the measurement-based scheme provides a better solution than the federated one.

Author Contributions

This research work was accomplished under the supervision of N.E.-S. M.M. (Mohamed Moussa), S.Z., M.M. (Mostafa Mostafa), and A.M. designed and implemented the proposed algorithm. M.M. (Mohamed Moussa) and S.Z. performed the experiments. M.M. (Mohamed Moussa) wrote the paper. N.E.-S. contributed the sensors used in the experiments. N.E.-S., M.E. and A.M. reviewed and provided feedback on the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by NSERC, and Canada Research Chairs programs.

Acknowledgments

This work was supported by Naser El-Sheimy research funds from NSERC and Canada Research Chairs programs.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abd Rabbou, M.; El-Rabbany, A. Integration of GPS precise point positioning and MEMS-based INS using unscented particle filter. Sensors 2015, 15, 7228–7245. [Google Scholar] [CrossRef] [Green Version]
  2. Chiang, K.W.; Duong, T.T.; Liao, J.K. The performance analysis of a real-time integrated INS/GPS vehicle navigation system with abnormal GPS measurement elimination. Sensors 2013, 13, 10599–10622. [Google Scholar] [CrossRef] [Green Version]
  3. Iqbal, U.; Georgy, J.; Korenberg, M.J.; Noureldin, A. Augmenting Kalman filtering with parallel cascade identification for improved 2D land vehicle navigation. In Proceedings of the 72nd IEEE Vehicular Technology Conference, VTC Fall 2010, Ottawa, ON, Canada, 6–9 September 2010. [Google Scholar] [CrossRef]
  4. Shin, E.H. Accuracy Improvement of Low Cost INS/GPS for Land Applications. Master’s Thesis, University of Calgary, Calgary, AB, Canada, 2001. [Google Scholar]
  5. Liu, H.; Nassar, S.; El-Sheimy, N. Two-filter smoothing for accurate INS/GPS land-vehicle navigation in urban centers. IEEE Trans. Veh. Technol. 2010, 59, 4256–4267. [Google Scholar] [CrossRef]
  6. Falco, G.; Pini, M.; Marucco, G. Loose and tight GNSS/INS integrations: Comparison of performance assessed in real Urban scenarios. Sensors 2017, 17, 255. [Google Scholar] [CrossRef]
  7. Navidi, N.; Landry, R. A new survey on self-tuning integrated low-cost GPS/INS vehicle navigation system in Harsh environment. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 75–81. [Google Scholar] [CrossRef] [Green Version]
  8. Niu, X.; Nassar, S.; El-Sheimy, N. An Accurate Land-Vehicle MEMS IMU/GPS Navigation System Using 3D Auxiliary Velocity Updates. J. Inst. Navig. 2007, 54, 177–188. [Google Scholar] [CrossRef]
  9. Lambrecht, S.; Nogueira, S.L.; Bortole, M.; Siqueira, A.A.G.; Terra, M.H.; Rocon, E.; Pons, J.L. Inertial sensor error reduction through calibration and sensor fusion. Sensors 2016, 16, 235. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Aftatah, M.; Lahrech, A.; Abounada, A. Fusion of GPS/INS/Odometer measurements for land vehicle navigation with GPS outage. In Proceedings of the 2016 2nd International Conference on Cloud Computing Technologies and Applications (CloudTech), Marrakech, Morocco, 24–26 May 2016; pp. 48–55. [Google Scholar] [CrossRef]
  11. Won, D.; Ahn, J.; Sung, S.; Heo, M.; Im, S.H.; Lee, Y.J. Performance Improvement of Inertial Navigation System by Using Magnetometer with Vehicle Dynamic Constraints. J. Sens. 2015, 2015. [Google Scholar] [CrossRef] [Green Version]
  12. Moussa, M.; Moussa, A.; El-Sheimy, N. Multiple Ultrasonic Aiding System for Car Navigation in GNSS Denied Environment. In Proceedings of the 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, USA, 23–26 April 2018; pp. 133–140. [Google Scholar]
  13. Travis, W.; Simmons, A.T.; Bevly, D.M. Corridor Navigation with a LiDAR / INS Kalman Filter Solution. In Proceedings of the IEEE Intelligent Vehicles Symposium, Las Vegas, NV, USA, 6–8 June 2005; pp. 343–348. [Google Scholar]
  14. Parviainen, J.; López, M.A.V.; Pekkalin, O.; Hautamäki, J.; Collin, J.; Davidson, P. Using Doppler radar and MEMS gyro to augment DGPS for land vehicle navigation. In Proceedings of the 2009 IEEE Control Applications, (CCA) & Intelligent Control, St. Petersburg, Russia, 8–10 July 2009; pp. 1690–1695. [Google Scholar] [CrossRef]
  15. Kim, S.B.; Bazin, J.C.; Lee, H.K.; Choi, K.H.; Park, S.Y. Ground vehicle navigation in harsh urban conditions by integrating inertial navigation system, global positioning system, odometer and vision data. IET Radar Sonar Navig. 2011, 5, 814. [Google Scholar] [CrossRef]
  16. Liu, Z.; El-Sheimy, N.; Yu, C.; Qin, Y. Motion Constraints and Vanishing Point Aided Land Vehicle Navigation. Micromachines 2018, 9, 249. [Google Scholar] [CrossRef] [Green Version]
  17. Niu, X.; Zhang, H.; Chiang, K.; El-Sheimy, N. Using Land-Vehicle Steering Constraint To Improve the Heading Estimation of MEMS GPS/INS Georeferencing Systems. In Proceedings of The 2010 Canadian Geomatics Conference and Symposium of Commission I, ISPRS Convergence in Geomatics—Shaping Canada’s Competitive Landscape, Calgary, AB, Canada, 14–18 June 2010. [Google Scholar]
  18. Velaga, N.R.; Quddus, M.A.; Bristow, A.L.; Zheng, Y. Map-aided integrity monitoring of a land vehicle navigation system. IEEE Trans. Intell. Transp. Syst. 2012, 13, 848–858. [Google Scholar] [CrossRef]
  19. Borenstein, J.; Feng, L. Measurement and Correction of Systematic Odometry Errors in Mobile Robots. IEEE Trans. Robot. Autom. 1996, 12, 869–880. [Google Scholar] [CrossRef] [Green Version]
  20. Wang, Z.; Tan, J.; Sun, Z. Error Factor and Mathematical Model of Positioning with Odometer Wheel. Adv. Mech. Eng. 2015, 7, 305981. [Google Scholar] [CrossRef] [PubMed]
  21. Afzal, M.H.; Renaudin, V.; Lachapelle, G. Assessment of Indoor Magnetic Field Anomalies using Multiple Magnetometers. In Proceedings of the ION GNSS 2010, Portland, OR, USA, 21–24 September 2010; pp. 21–24. [Google Scholar]
  22. Han, S.; Park, S.; Lee, K. Mobile Robot Navigation by Circular Path Planning Algorithm Using Camera and Ultrasonic Sensor. In Proceedings of the 2009 IEEE International Symposium on Industrial Electronics, Seoul, Korea, 5–8 July 2009; pp. 1749–1754. [Google Scholar]
  23. Moussa, M.; Moussa, A.; El-sheimy, N. Ultrasonic Wheel Based Aiding for Land Vehicle Navigation in GNSS denied environment. In Proceedings of the 2019 International Technical Meeting, ION ITM 2019, Reston, VA, USA, 28–31 January 2019; pp. 319–333. [Google Scholar]
  24. Moussa, M.; Moussa, A.; El-Sheimy, N. Ultrasonic based heading estimation for aiding land vehicle navigation in GNSS denied environment. In Proceedings of the ISPRS TC I Mid-term Symposium Innovative Sensing—From Sensors to Methods Applications, Karlsruhe, Germany, 10–12 October 2018; pp. 10–12. [Google Scholar]
  25. Gao, Y.; Liu, S.; Atia, M.M.; Noureldin, A. INS/GPS/LiDAR integrated navigation system for urban and indoor environments using hybrid scan matching algorithm. Sensors 2015, 15, 23286–23302. [Google Scholar] [CrossRef] [Green Version]
  26. Sun, Z.; Bebis, G.; Miller, R. On-road vehicle detection using optical sensors: A review. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems, Washington, DC, USA, 3–6 October 2004; pp. 585–590. [Google Scholar] [CrossRef]
  27. Liu, X.; Mei, H.; Lu, H.; Kuang, H.; Ma, X. A vehicle steering recognition system based on low-cost smartphone sensors. Sensors 2017, 17, 633. [Google Scholar] [CrossRef]
  28. Takahashi, Y.; Honma, N.; Sato, J.; Murakami, T.; Murata, K. Accuracy comparison of wireless indoor positioning using single anchor: Tof only versus tof-doa hybrid method. In Proceedings of the Asia-Pacific Microwave Conference APMC 2019, Singapore, 10–13 December 2019; pp. 1679–1681. [Google Scholar] [CrossRef]
  29. Huh, J.H.; Seo, K. An indoor location-based control system using bluetooth beacons for IoT systems. Sensors 2017, 17, 2917. [Google Scholar] [CrossRef] [Green Version]
  30. Wilfinger, R.; Moder, T.; Wieser, M.; Grosswindhager, B. Indoor Position Determination Using Location Fingerprinting and Vehicle Sensor Data. In Proceedings of the 2016 European Navigation Conference (ENC), Helsinki, Finland, 30 May–2 June 2016; pp. 1–9. [Google Scholar] [CrossRef]
  31. Yol, A.; Delabarre, B.; Dame, A.; Dartois, J.É.; Marchand, E. Vision-based absolute localization for unmanned aerial vehicles. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 3429–3434. [Google Scholar] [CrossRef] [Green Version]
  32. Zheng, F.; Tang, H.; Liu, Y. Odometry-Vision-Based Ground Vehicle Motion. IEEE Trans. Cybern. 2018, 49, 2652–2663. [Google Scholar] [CrossRef]
  33. Dabove, P.; Lingua, A.M.; Piras, M. Photogrammetric visual odometry with unmanned ground vehicle using low cost sensors. In Proceedings of the 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, USA, 23–26 April 2018; pp. 426–431. [Google Scholar] [CrossRef]
  34. Tsai, S.E.; Zhuang, S.H. Optical flow sensor integrated navigation system for quadrotor in GPS-denied environment. In Proceedings of the 2016 International Conference on Robotics and Automation Engineering (ICRAE), Jeju, Korea, 27–29 August 2016; pp. 87–91. [Google Scholar] [CrossRef]
  35. Georgy, J.; Noureldin, A.; Syed, Z.; Goodall, C. Nonlinear Filtering for Tightly Coupled RISS / GPS Integration. In Proceedings of the IEEE/ION Position, Location and Navigation Symposium, Indian Wells, CA, USA, 4–6 May 2010; pp. 1014–1021. [Google Scholar] [CrossRef]
  36. Gupta, A.; Chang, H.; Yilmaz, A. Gps-Denied Geo-Localisation Using Visual Odometry. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, III-3, 263–270. [Google Scholar] [CrossRef]
  37. Parra, I.; Ángel Sotelo, M.; Llorca, D.F.; Fernández, C.; Llamazares, A.; Hernández, N.; García, I. Visual odometry and map fusion for GPS navigation assistance. In Proceedings of the 2011 IEEE International Symposium on Industrial Electronics, Gdansk, Poland, 27–30 June 2011; pp. 832–837. [Google Scholar] [CrossRef] [Green Version]
  38. Lovegrove, S.; Davison, A.J.; Ibañez-Guzmán, J. Accurate visual odometry from a rear parking camera. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden-Baden, Germany, 5–9 June 2011; pp. 788–793. [Google Scholar] [CrossRef] [Green Version]
  39. Gakne, P.V.; O’Keefe, K. Tightly-coupled GNSS/vision using a sky-pointing camera for vehicle navigation in urban areas. Sensors 2018, 18, 1244. [Google Scholar] [CrossRef] [Green Version]
  40. Mostafa, M.M.; Moussa, A.M.; El-Sheimy, N.; Sesay, A.B. A smart hybrid vision aided inertial navigation system approach for UAVs in a GNSS denied environment. Navig. J. Inst. Navig. 2018, 65, 533–547. [Google Scholar] [CrossRef]
  41. Carlson, C.R.; Gerdes, J.C.; Powell, J.D. Practical Position and Yaw Rate Estimation with GPS and Differential Wheelspeeds. In Proceedings of the AVEC 2002 6th International Symposium Of Advanced Vehicle Control, Hiroshima, Japan, 9–13 September 2002. [Google Scholar]
  42. Rogers, R.M. Improved heading using dual speed sensors for angular rate and odometry in land navigation. In Proceedings of the IEEE 1998 Position Location and Navigation Symposium (Cat. No.98CH36153), Palm Springs, CA, USA, 20–23 April 1998; pp. 177–184. [Google Scholar] [CrossRef]
  43. Stephen, J. Development of a Multi-Sensor GNSS Based Vehicle Navigation System. 2000. Available online: https://www.ucalgary.ca/engo_webdocs/GL/00.20140.JStephen.pdf (accessed on 1 October 2020).
  44. Bonnifait, P.; Bouron, P.; Crubille, P.; Meizel, D. Data fusion of four ABS sensors and GPS for an enhanced localization of car-like vehicles. In Proceedings of the 2001 ICRA IEEE International Conference on Robotics and Automation (Cat. No.01CH37164), Seoul, Korea, 21–26 May 2001; Volume 2, pp. 1597–1602. [Google Scholar] [CrossRef] [Green Version]
  45. Xiao, Z.; Li, P.; Havyarimana, V.; Georges, H.M.; Wang, D.; Li, K. GOI: A Novel Design for Vehicle Positioning and Trajectory Prediction Under Urban Environments. IEEE Sens. J. 2018, 18, 5586–5594. [Google Scholar] [CrossRef]
  46. Martinez, F.; Gonzalez, L.C.; Carlos, M.R. Identifying Roadway Surface Disruptions Based on Accelerometer Patterns. IEEE Lat. Am. Trans. 2014, 12, 455–461. [Google Scholar] [CrossRef]
  47. Wahlström, J.; Skog, I.; Händel, P. Smartphone-Based Vehicle Telematics: A Ten-Year Anniversary. IEEE Trans. Intell. Transp. Syst. 2017, 18, 2802–2825. [Google Scholar] [CrossRef] [Green Version]
  48. Lai, C.H.; Chuang, S.M.; Chu, P.C.; Li, C.H. An real-time roadside sign recognition scheme for mobile probing cars with smart phones. In Proceedings of the 2012 IEEE International Conference on Imaging Systems and Techniques, Manchester, UK, 16–17 July 2012; pp. 267–272. [Google Scholar] [CrossRef]
  49. Song, T.; Capurso, N.; Cheng, X.; Yu, J.; Chen, B.; Zhao, W. Enhancing GPS with Lane-Level Navigation to Facilitate Highway Driving. IEEE Trans. Veh. Technol. 2017, 66, 4579–4591. [Google Scholar] [CrossRef]
  50. Zhu, S.; Wang, X.; Zhang, Z.; Tian, X.; Wang, X. Lane-level vehicular localization utilizing smartphones. In Proceedings of the 2016 IEEE 84th Vehicular Technology Conference (VTC-Fall), Montreal, QC, Canada, 18–21 September 2016. [Google Scholar] [CrossRef]
  51. Yokozuka, M.; Hashimoto, N.; Matsumoto, O. Low-cost 3D mobile mapping system by 6 DOF localization using smartphone embedded sensors. In Proceedings of the 2015 IEEE International Conference on Vehicular Electronics and Safety (ICVES), Yokohama, Japan, 5–7 November 2015; pp. 182–189. [Google Scholar] [CrossRef]
  52. Walter, O.; Schmalenstroeer, J.; Engler, A.; Haeb-Umbach, R. Smartphone-based sensor fusion for improved vehicular navigation. In Proceedings of the 2013 10th Workshop on Positioning, Navigation and Communication (WPNC), Dresden, Germany, 20–21 March 2013. [Google Scholar] [CrossRef]
  53. Niu, X.; Zhang, Q.; Li, Y.; Cheng, Y.; Shi, C. Using inertial sensors of iPhone 4 for car navigation. In Proceedings of the 2012 IEEE/ION Position, Location and Navigation Symposium, Myrtle Beach, SC, USA, 23–26 April 2012; pp. 555–561. [Google Scholar] [CrossRef]
  54. Moussa, M.; Moussa, A.; El-Sheimy, N. Steering Angle Assisted Vehicular Navigation Using Portable Devices in GNSS-Denied Environments. Sensors 2019, 19, 1618. [Google Scholar] [CrossRef] [Green Version]
  55. Honegger, D.; Meier, L.; Tanskanen, P.; Pollefeys, M. An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 1736–1741. [Google Scholar] [CrossRef]
  56. Carlson, C.R.; Gerdes, J.C.; Powell, J.D. Error sources when land vehicle dead reckoning with differential wheelspeeds. Navig. J. Inst. Navig. 2004, 51, 13–27. [Google Scholar] [CrossRef]
  57. Noureldin, A.; Karamat, T.B.; Georgy, J. Fundamentals of Inertial Navigation, Satellite-Based Positioning and Their Integration; Springer Publishing: New York, NY, USA, 2013; pp. 1–313. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the mass flow velocity regression model estimation.
Figure 2. Flowchart of the proposed mass flow velocity aiding low-cost Inertial Navigation System (INS) through extended Kalman filter (EKF).
Figure 3. Proposed differential mass flow odometry concept.
Figure 4. Flowchart of the mass flow heading change regression model estimation.
Figure 5. Flowchart of the proposed differential mass flow odometry for aiding low-cost INS through EKF.
Figure 6. Flowchart of the straight motion constraint using Z gyroscope.
Figure 7. Flowchart of the proposed aiding navigation system measurements integration scheme.
Figure 8. Flowchart of the proposed aiding navigation system federated integration scheme.
Figure 9. Mass flow sensor (SFM3000).
Figure 10. Linear regression between the mass flow sensor measurements and reference velocity estimated by the tactical grade INS.
Figure 11. Visual odometry (VO) velocity and mass flow sensors velocity versus SPAN reference velocity.
Figure 12. Mass flow change of heading versus SPAN reference heading change.
Figure 13. The trajectory of the underground parking test using different updates for a 190 s Global Navigation Satellite System (GNSS) signal outage (measurements fusion method).
Figure 14. The trajectory of the underground parking test using INS/velocity/heading change updates for a 190 s GNSS signal outage (measurements and federated fusion methods).
Table 1. Position root mean square error (RMSE) for different navigation solution methods using the measurements fusion technique.

Navigation Solution                     RMSE (m)
INS stand-alone                         1980
INS/non-holonomic constraints (NHC)     14.30
INS/odometer velocity update            4.85
INS/velocity update                     3.96
INS/velocity/NHC                        4.00
INS/velocity/heading change             3.60
Table 2. Position RMSE for different navigation solution methods using the federated fusion technique.

Navigation Solution                     RMSE (m)
INS stand-alone                         1980
INS/NHC                                 14.30
INS/velocity update                     7.94
INS/velocity/NHC                        6.07
INS/velocity/heading change             6.74
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
