Article

Accuracy Assessment of the Integration of GNSS and a MEMS IMU in a Terrestrial Platform

by Sergio Madeira 1,*,†, Wenlin Yan 2,†, Luísa Bastos 2,3,† and José A. Gonçalves 2,3,†

1 University of Trás-os-Montes e Alto Douro, Vila Real 5000-801, Portugal; INESC-TEC - Technology and Science, Rua Dr. Roberto Frias, 4200-465 Porto, Portugal
2 Department of Geosciences, Environment and Spatial Planning, Faculty of Science, University of Porto, Rua Campo Alegre 687, Porto 4169-007, Portugal
3 CIIMAR/CIMAR, University of Porto, Rua dos Bragas 289, Porto 4050-123, Portugal
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Sensors 2014, 14(11), 20866-20881; https://doi.org/10.3390/s141120866
Submission received: 14 August 2014 / Revised: 22 October 2014 / Accepted: 25 October 2014 / Published: 4 November 2014
(This article belongs to the Section Physical Sensors)

Abstract

MEMS Inertial Measurement Units are available at low cost and can replace expensive units in mobile mapping platforms that need direct georeferencing. This is achieved through integration with GNSS measurements, in order to obtain a continuous positioning solution together with the orientation angles. This paper presents the results of an accuracy assessment of a system that integrates GNSS and a MEMS IMU in a terrestrial platform. We describe the methodology used and the tests performed, in which the accuracy of the position and orientation parameters was assessed with an independent photogrammetric technique employing the cameras that are part of the mobile mapping system developed by the authors. The results for the accuracy of attitude angles and coordinates show that accuracies better than one decimeter in position, and below one degree in the angles, can be achieved even considering that the terrestrial platform operates in less than favorable environments.

1. Introduction

A Direct Georeferencing System (DGS) can be defined as a set of sensors onboard a platform, whose goal is to obtain positions (as three coordinates) and attitudes (as three angles) of the origin of a reference system, defined in the platform, without external control points. One main application of a DGS is as a major component of a Mobile Mapping System (MMS), which contains, besides the DGS, a set of remote sensors to acquire information from the surrounding objects. The linkage of absolute positions and orientation parameters to the remote sensors allows the determination of positional and geometrical information of the objects observed [1]. Detailed descriptions of this technology can be found for example in [2,3].

In terms of applications, terrestrial MMS can be used to acquire data on urban or road infrastructures and can be optimized for specific applications such as traffic sign inventory and railway or road inspection, as described in [4–6]. More specific applications of terrestrial MMS are automatic DTM (Digital Terrain Model) generation on sandy beaches [7], river shoreline change detection [8] and pavement surface analysis [9].

Nowadays two lines emerge in MMS. The first is associated with high to moderate cost navigation or tactical grade Inertial Measurement Units (IMUs) and laser/camera sensors, which, when integrated with satellite positioning receivers (Global Navigation Satellite Systems, GNSS), can deliver high accuracy surveys and modeling. The level of accuracy achieved by GNSS/IMU systems used in DGS can vary significantly, depending on the grade of the IMU and on the GNSS data used. For a tactical grade IMU integrated with geodetic grade GNSS receivers (multi-frequency code and phase receivers), accuracies can be of the order of one decimeter for positions and better than 0.03° for attitude [10]. Navigation grade IMUs can provide accuracies one order of magnitude better. Some of these systems are commercially available and are mainly directed to road, urban or railway environment surveys. The second line relies on low cost Micro Electromechanical System (MEMS) IMUs, which, integrated with geodetic grade GNSS receivers, can provide results at the several decimeter level for positions and 0.2° for attitude, as shown for example in [11,12]. A very robust way of developing a DGS is by integrating GNSS and IMU measurements; an insight into this approach can be found in [13–15]. Manufacturers of GNSS receivers and IMUs provide data sheets with information on the expected accuracy that these individual systems can achieve, which usually refers to optimal operating conditions. However, in terrestrial platforms the working environment is more aggressive and an accuracy decrease can therefore be expected. In a MMS, the accuracy performance of the DGS is critical to the final positional accuracy of the objects acquired by the data sensors.

The subject of accuracy evaluation or calibration of MEMS units using known orientation and positions was presented by Artese et al. [16]. El-Sheimy presented a standard technique to obtain relative orientation between pairs of imaging sensors and DGS by using external control points on a bundle adjustment with known parameters as constraints [17].

The main goal of the work presented here was to assess the linear and angular accuracy of a DGS mounted on a terrestrial platform previously developed by the authors and described in [7], considering good conditions for GNSS observation. The tested DGS is composed of a dual frequency GNSS receiver and a MEMS IMU. The methodology used for the accuracy assessment is based on the comparison of the parameters given by the DGS with those obtained independently by photogrammetric techniques, using control points.

In Section 2, we briefly describe the DGS used and address the problem of converting data between the different reference systems (sensors and platform), including the conventions used. In Section 3, the surveying test carried out is described, as well as the methodology implemented to assess the accuracy of the DGS. Finally, in Section 4, a brief discussion of the results obtained is presented.

2. Equipment and Methodology

The implemented DGS relies on the integration of measurements from a GNSS phase receiver with measurements from an IMU through a Kalman filter, a classic configuration that can provide very high accuracy due to the complementarity of the two techniques. A description of GNSS and IMU systems is not in the scope of this work, since there is ample accessible literature on these subjects (for GNSS see for example [18] and for IMU [19]).

2.1. Equipment Used

The accuracy of a GNSS/IMU system depends on the equipment used and on the built-in processing and integration software. The DGS equipment used in our case was a Novatel DL-V3 dual frequency GNSS phase receiver (Novatel Inc., Calgary, Canada) and an AHRS440 IMU from Crossbow (Crossbow: San Jose, CA, USA). Tables 1 and 2 present their accuracies, taken from the manufacturers' specifications.

The remote sensors used were two digital CCD cameras whose characteristics are presented in Table 3.

The cameras were calibrated independently with a methodology developed by the authors [20]. The resulting five parameters (principal point coordinates, focal distance and two radial distortion coefficients) allow for a mean linear re-projection error of 0.35 pixels. The calibrated focal distances, always fixed at the maximum value, were approximately 3.6 mm for both cameras. Images of the vehicle and the equipment mounted on it can be viewed in Figure 1.

2.2. GNSS/INS Integration by Kalman Filter

The Least Squares method, which is purely based on measurements, is the most common way to estimate the state vector in geomatics [21]. For navigation applications it is more efficient to use a sequential approach, and here the Kalman Filter, which is based on knowledge of both the measurements and the state vector dynamics, plays an important role in the integration of GNSS and IMU observations [22–25].

In the Kalman Filter mechanization equations, the position, velocity and attitude computed in the navigation frame, combined with other information, such as sensor errors and the local gravity disturbance, are taken as the states of the filter. Here we have used a 34-state vector for the Extended Kalman Filter (EKF), described as:

$$x = \begin{bmatrix} r^l_e & v^l_e & C^l_b & b_a & b_\omega & s_a & s_\omega & s_{aa} & s_{a\omega} & \delta_a & \delta_\omega & \delta g \end{bmatrix}^T \qquad (1)$$
where the thirty-four states comprise: three position coordinates $r^l_e$, three velocities $v^l_e$, three attitude angles (expressed by the direction cosine matrix $C^l_b$), six constant biases $b_a$ and $b_\omega$, six scale factor biases $s_a$ and $s_\omega$, six scale factor asymmetries $s_{aa}$ and $s_{a\omega}$, six misalignment parameters $\delta_a$ and $\delta_\omega$, and one gravity disturbance $\delta g$. The related error states are:
$$\delta x = \begin{bmatrix} \delta r^l_e & \delta v^l_e & \psi^l & \varepsilon_{b_a} & \varepsilon_{b_\omega} & \varepsilon_{s_a} & \varepsilon_{s_\omega} & \varepsilon_{s_{aa}} & \varepsilon_{s_{a\omega}} & \varepsilon_{\delta_a} & \varepsilon_{\delta_\omega} & \varepsilon_{\delta g} \end{bmatrix}^T \qquad (2)$$

The dynamic error equation is:

$$\delta\dot{x} = F\,\delta x + G\,w \qquad (3)$$
where $F_{34\times34}$ is the system dynamics matrix, $G_{34\times7}$ is the disturbance input matrix and $w$ is the $7\times1$ system white noise vector. A typical Kalman Filter dynamic equation in our application can be written as:
$$x^-_k = \Phi_{k-1}\,x_{k-1} \qquad (4)$$
$$P^-_k = \Phi_{k-1}\,P_{k-1}\,\Phi^T_{k-1} + Q_{k-1} \qquad (5)$$
where the superscript "−" denotes a predicted estimate; the subscript $k$ is the current index of the discrete samplings; $P_{34\times34}$ is the covariance matrix of the state vector; $\Phi_{k-1}$ is the state transition matrix, describing how the system changes over time, which can be approximated by:
$$\Phi_{k-1} \approx I + F_{k-1}\Delta t + \frac{1}{2!}(F_{k-1}\Delta t)^2 + \frac{1}{3!}(F_{k-1}\Delta t)^3 \qquad (6)$$

The noise matrix Qk−1 can be approximated as:

$$Q_{k-1} \approx G_{k-1}\,Q\,\Delta t\,G^T_{k-1} \qquad (7)$$
where $Q$ is the so-called Spectral Density Matrix, which is a function of the white noise $w$ [26]:
$$Q(t)\,\delta(t-\tau) = E\left[\,w(t)\,w(\tau)^T\,\right] \qquad (8)$$

In this equation, $t$ is the current observation time, $\tau$ refers to the time difference between consecutive observations and $\delta$ is the Dirac delta function, with dimension $s^{-1}$. In our application $Q_{k-1}$ has the form:

$$Q_{k-1} = \mathrm{diag}\!\left(\sigma^2_{a_x}\;\; \sigma^2_{a_y}\;\; \sigma^2_{a_z}\;\; \sigma^2_{\omega_x}\;\; \sigma^2_{\omega_y}\;\; \sigma^2_{\omega_z}\;\; \sigma^2_{\delta g}\right)_{7\times7} \qquad (9)$$
where $\sigma_a$ and $\sigma_\omega$ are the standard deviations of the accelerometers and gyros, and $\sigma_{\delta g}$ is the standard deviation of the gravity disturbance. In applications demanding high precision, $Q$ must be specified carefully, particularly for a low cost IMU: if $Q$ is set too small, the Kalman Filter will trust the dynamic model more than the measurements. In the EKF a typical representation for the discrete $Q$ can be given as:
$$Q_{d,k-1} \approx \frac{1}{2}\left[\,\Phi_{k-1}\,G_{k-1}\,Q_{k-1}\,G^T_{k-1}\,\Phi^T_{k-1} + G_k\,Q_k\,G^T_k\,\right]\Delta t \qquad (10)$$

The system observation equations can be written as:

$$z_k = \begin{bmatrix} r^l_{imu} - r^l_{gnss} \\ v^l_{imu} - v^l_{gnss} \end{bmatrix}_{6\times1} = H_k\,x_k + v_k \qquad (11)$$
where $H_k$ is the measurement matrix and $v_k$ is zero-mean Gaussian noise. When GNSS updates are available, the update process can be done using the following equations:
$$K_k = P^-_k\,H^T_k\left[H_k\,P^-_k\,H^T_k + R_k\right]^{-1} \qquad (12)$$
$$x^+_k = x^-_k + K_k\left[z_k - H_k\,x^-_k\right] \qquad (13)$$
$$P^+_k = \left[I - K_k\,H_k\right]P^-_k \qquad (14)$$
where the superscript "+" denotes a corrected estimate; $K_k$ is the so-called Kalman gain and $R_k$ is the measurement noise covariance matrix.
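The prediction and update steps above can be condensed into a short loosely coupled GNSS/IMU loop. The sketch below is only an illustration of Equations (4)–(6) and (10)–(14): the state dimension follows the 34-state vector of Equation (1), but the dynamics matrix F, the input matrix G, the spectral densities and the GNSS noise values are placeholders, not the values used by the authors.

```python
import numpy as np

N = 34          # error-state dimension, as in Equation (1)
dt = 0.01       # IMU sampling interval in seconds (illustrative)

def predict(x, P, F, G, Q_psd):
    """Prediction step: Equations (4)-(6) and (10), with a 3rd order expansion of Phi."""
    Fd = F * dt
    Phi = np.eye(N) + Fd + Fd @ Fd / 2.0 + Fd @ Fd @ Fd / 6.0
    Qd = 0.5 * (Phi @ G @ Q_psd @ G.T @ Phi.T + G @ Q_psd @ G.T) * dt
    return Phi @ x, Phi @ P @ Phi.T + Qd

def update(x_pred, P_pred, z, H, R):
    """Update when a GNSS position/velocity difference z is available: Equations (12)-(14)."""
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # Kalman gain
    x_corr = x_pred + K @ (z - H @ x_pred)
    P_corr = (np.eye(N) - K @ H) @ P_pred
    return x_corr, P_corr

# H selects the three position and three velocity error states, as in Equation (11).
H = np.zeros((6, N))
H[:3, 0:3] = np.eye(3)
H[3:, 3:6] = np.eye(3)
R = np.diag([0.02**2] * 3 + [0.05**2] * 3)     # GNSS measurement noise (illustrative)
Q_psd = np.diag([1e-4] * 6 + [1e-9])           # spectral densities of Equation (9) (illustrative)

# One filter cycle with placeholder dynamics, just to show the flow.
F, G = np.zeros((N, N)), np.zeros((N, 7))
x, P = np.zeros(N), np.eye(N) * 0.1
x, P = predict(x, P, F, G, Q_psd)
x, P = update(x, P, np.zeros(6), H, R)
```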

2.3. Coordinate Transformation between Reference Systems

Figure 2 shows a plan of the platform used to carry out the surveys. As complementary information to the plan, the cameras are leveled over the platform, as can be seen in Figure 1b, and their centers are two centimeters above the center of the DGS. The lengths between components were measured rigorously and the values are shown. There are four reference systems (r.s.) present: the Cam left, Cam right and DGS systems (the origin of the latter is considered at the center of the platform), and the cartographic r.s. in which the platform moves. In order to convert point coordinates between reference frames, three linear and three angular components are needed.

Considering any two of these r.s., say 1 and 2, and assuming that only scale is maintained, the following equation can be established for a point in space (P) [27]:

$$[X]_1 = M^{-1}_{21}\,[X]_2 + [X_{0_2}]_1 \qquad (15)$$
where $[X]_1$ contains the coordinates of P in the 1st r.s., $[X]_2$ the coordinates of P in the 2nd r.s., $[X_{0_2}]_1$ contains the coordinates of the origin of the 2nd r.s. in the 1st r.s. and $M_{21}$ is the rotation matrix from the 1st to the 2nd r.s.

Considering the observation of the rotation angles, and the positions of cameras and DGS in the cartographic r.s., the following equations can be established using Equation (15):

$$[X]_{Cart} = M^{-1}_{LcamCart}\,[X]_{Lcam} + [X_{0Lcam}]_{Cart} \qquad (16)$$
$$[X]_{Cart} = M^{-1}_{RcamCart}\,[X]_{Rcam} + [X_{0Rcam}]_{Cart} \qquad (17)$$
$$[X]_{Lcam} = M^{-1}_{RcamLcam}\,[X]_{Rcam} + [X_{0Rcam}]_{Lcam} \qquad (18)$$
$$[X]_{Cart} = M^{-1}_{DGSCart}\,[X]_{DGS} + [X_{0DGS}]_{Cart} \qquad (19)$$
$$[X]_{DGS} = M^{-1}_{LcamDGS}\,[X]_{Lcam} + [X_{0Lcam}]_{DGS} \qquad (20)$$
where the M matrices contain the rotation angles in the indicated directions and the $X_0$ vectors contain the cartographic coordinates (X, Y, H) of the origins of the r.s. Using Equations (16)–(18), the relative orientation parameters of the right camera relative to the left camera can be deduced:
$$M_{RcamLcam} = M_{RcamCart}\,M^{-1}_{LcamCart} \qquad (21)$$
$$[X_{0Rcam}]_{Lcam} = M_{LcamCart}\left([X_{0Rcam} - X_{0Lcam}]_{Cart}\right) \qquad (22)$$

Using Equations (16), (19) and (20), the offset parameters relating the DGS to the left camera are:

$$M_{LcamDGS} = M_{LcamCart}\,M^{-1}_{DGSCart} \qquad (23)$$
$$[X_{0Lcam}]_{DGS} = M_{DGSCart}\left([X_{0Lcam} - X_{0DGS}]_{Cart}\right) \qquad (24)$$

These results will be used as needed in the rest of the document.
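As a worked illustration of Equations (21) and (22) (Equations (23) and (24) follow the same pattern), the sketch below builds the rotation matrices of the two cameras from the pair 1 angles of Table 5 and recovers the relative orientation and the baseline between the cameras. The elemental rotation definitions used here follow the conventions of Table 4 as we read them, and are an assumption of this sketch rather than the authors' code.

```python
import numpy as np

def rot_opk(omega, phi, kappa):
    """R = R3(kappa) R2(phi) R1(omega), angles in degrees (Table 4 convention, assumed signs)."""
    o, p, k = np.radians([omega, phi, kappa])
    R1 = np.array([[1, 0, 0], [0, np.cos(o), np.sin(o)], [0, -np.sin(o), np.cos(o)]])
    R2 = np.array([[np.cos(p), 0, -np.sin(p)], [0, 1, 0], [np.sin(p), 0, np.cos(p)]])
    R3 = np.array([[np.cos(k), np.sin(k), 0], [-np.sin(k), np.cos(k), 0], [0, 0, 1]])
    return R3 @ R2 @ R1

# Exterior orientation of the left and right cameras for pair 1 (values from Table 5).
M_Lcam_Cart = rot_opk(-0.737, 4.037, 283.200)
M_Rcam_Cart = rot_opk(-2.258, 3.858, 284.374)
X0_Lcam = np.array([-43700.132, 153423.686, 8.435])
X0_Rcam = np.array([-43699.833, 153422.684, 8.448])

# Equations (21) and (22): right camera relative to the left camera.
M_Rcam_Lcam = M_Rcam_Cart @ np.linalg.inv(M_Lcam_Cart)
X0_Rcam_in_Lcam = M_Lcam_Cart @ (X0_Rcam - X0_Lcam)

# The norm of the relative position vector is the camera baseline (~1.045 m for pair 1).
print("T =", np.linalg.norm(X0_Rcam_in_Lcam))
```

The printed baseline matches the T value reported for pair 1 in Table 5 regardless of the exact sign convention, since a rotation does not change the length of the baseline vector.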

2.4. Rotation Angles (ω φ κ versus Roll, Pitch, Heading)

Traditionally, in photogrammetry, the orientation angles were associated with aerial photography and used to calculate Earth referenced coordinates from the photographic coordinates. They are the well-known rotations ω ϕ κ, of the photographic camera axes [27].

With the development of navigation technology applied to mobile surveying platforms, orientation angles different from the usual ones in photogrammetry were adopted, namely the roll, pitch and heading angles—see Table 4.

For the terrestrial images, the authors used an axis convention different from the usual one in photogrammetry, namely considering the photo coordinates as (x, z) and the y axis pointing forward from the photo, thereby achieving a more intuitive relation between the κ and heading angles. Although the axes considered in each case are different and are rotated in a different order, the resulting rotation matrices are the same [27]. The conventions used in this work are presented in Table 4. In each case the rotation angles can be obtained from the rotation matrices as follows:

$$\omega,\varphi,\kappa \text{ case:}\qquad \omega = \arctan\!\left(\frac{-r_{32}}{r_{33}}\right)\qquad \varphi = \arcsin\left(r_{31}\right)\qquad \kappa = \arctan\!\left(\frac{-r_{21}}{r_{11}}\right) \qquad (25)$$
$$\text{roll, pitch, heading case:}\qquad \text{roll} = \arctan\!\left(\frac{-r_{13}}{r_{33}}\right)\qquad \text{pitch} = \arcsin\left(r_{23}\right)\qquad \text{heading} = \arctan\!\left(\frac{-r_{21}}{r_{22}}\right) \qquad (26)$$
where $r_{ij}$ represents the (i, j) element of the corresponding rotation matrix.
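A compact sketch of the angle extraction in Equations (25) and (26), assuming the rotation matrix conventions of Table 4; the use of arctan2 to keep the correct quadrant is an implementation choice of this sketch, not something stated in the paper.

```python
import numpy as np

def opk_from_matrix(R):
    """omega, phi, kappa in degrees from R = R3(kappa) R2(phi) R1(omega), Equation (25)."""
    omega = np.arctan2(-R[2, 1], R[2, 2])
    phi = np.arcsin(R[2, 0])
    kappa = np.arctan2(-R[1, 0], R[0, 0])
    return np.degrees([omega, phi, kappa])

def rph_from_matrix(R):
    """roll, pitch, heading in degrees from R = R2(roll) R1(pitch) R3(heading), Equation (26)."""
    roll = np.arctan2(-R[0, 2], R[2, 2])
    pitch = np.arcsin(R[1, 2])
    heading = np.arctan2(-R[1, 0], R[1, 1])
    return np.degrees([roll, pitch, heading])
```

Composing a rotation matrix with the Table 4 conventions and feeding it back through these functions recovers the input angles, which is a quick way to validate the sign conventions.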

3. Evaluation of the DGS Accuracy

In order to evaluate the quality of the obtained navigation solution, the authors compared the exterior orientation parameters obtained independently from the cameras on the platform with the parameters derived from the DGS itself. The offsets between the DGS and the cameras must be taken into account; however, only the linear offsets are known, by means of rigorous measurements over the platform (Figure 2). In fact, the determination of the offsets using the independent observations from the cameras and the DGS was used in this work to estimate the accuracy of the latter. The observations, the strategy adopted and the results obtained are described below.

3.1. Observations

The data acquisition was conducted in an urban environment with the terrestrial video-based MMS described in [7]. The test zone was selected on pavement with painted crosswalks, because these provide well-defined points with enough variability on the ground and in the photos, and because such observations can easily be repeated at other locations in order to apply this methodology in our future work.

The control points were observed with centimetric accuracy by means of a static relative GNSS survey (referring to Table 1, the acquisition mode was RT-2, Real Time Kinematic), with the reference station at a distance of 6 km. Figure 3 shows a general view of the test area (a) and a scheme with the observed control points (b).

Several passes were made through the test zone, from different directions, either following a straight line or a curve, collecting images at a rate of four per second (the cameras collect images at the same instant since they share the same trigger). Of these, nine pairs in which a sufficient number of control points could be observed were selected, in order to compare the position and orientation parameters with those given by the DGS. This means that the tests were carried out using eighteen images in which the control points could be observed. Of interest is the fact that the 2nd and 3rd pairs were acquired while the vehicle was moving in a curve, whereas all the others were acquired along a straight trajectory.

3.2. Position and Orientation Parameters by Space Resection and Respective Accuracy Estimation

The position and orientation parameters of the cameras in the cartographic space were obtained by space resection from the control points. This is a photogrammetric process by which the six exterior orientation parameters of an image are computed using at least three control points identified on the photo [27]. The image coordinates of the control points were identified manually and independently in each photo. The results of this phase are summarized in Table 5.
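The paper does not state which software performed the space resection, so the following is only a minimal sketch of an equivalent computation using OpenCV's solvePnP. The interior orientation values are taken from Section 2.1, while the control point coordinates (reduced to a local origin), their image measurements and the distortion handling are placeholders; the iterative PnP solver requires at least four points.

```python
import numpy as np
import cv2

# Interior orientation (Section 2.1): ~3.6 mm focal distance, 5.6 um pixels, 640 x 480 array.
f_pix = 3.6e-3 / 5.6e-6                    # focal length in pixels (~643)
K = np.array([[f_pix, 0.0, 320.0],
              [0.0, f_pix, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                         # radial distortion assumed already corrected here

# Ground control points and their measured image coordinates (illustrative values;
# in the tests these were crosswalk corners surveyed by static relative GNSS).
object_pts = np.array([[0.0, 0.0, 0.00],
                       [1.5, 1.2, 0.01],
                       [0.8, -1.1, -0.01],
                       [2.2, -0.3, 0.00]])
image_pts = np.array([[102.5, 310.1],
                      [255.3, 298.7],
                      [180.9, 402.4],
                      [330.2, 385.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)                 # rotation from the object r.s. to the camera r.s.
C = (-R.T @ tvec).ravel()                  # projection centre in the object r.s.
print("camera centre:", C)
```

Converting the rotation returned by OpenCV into the ω, φ, κ or roll, pitch, heading angles of Table 4 would still require accounting for the different axis conventions discussed in Section 2.4.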

Table 5 lists the images used to perform the exterior orientation, including the control points used, the exterior orientation parameters obtained and, in the right column, the calculated relative orientation parameters. The relative orientation matrices and linear relative parameters were obtained with Formulas (21) and (22), and the relative orientation angles were extracted from the rotation matrices as in Formula (25).

The relative orientation between the cameras is constant for all pairs, because they are fixed on the platform and the images of each pair were obtained at the same instant. Moreover, the exterior orientation of the images, as referred to previously, was obtained independently for each photo. It is therefore expected that the calculated relative orientation parameters, regarded as observations, follow a normal law, so their standard deviations are a good estimate of the accuracy of this methodology for obtaining the exterior orientation of an image. Table 6 presents the accuracy estimates for the orientation angles, the 3D positions and the length of vector T.

The separation between the two camera centers is known to be 1.044 m (as shown in Figure 2). It is therefore also possible to present the root mean square error and the mean error of the length of vector T, which is done in Table 6. This is a good estimate of the accuracy achieved by space resection for the positional parameters.

3.3. Position and Orientation Parameters Given by the GNSS / IMU System and Respective Accuracy Estimation

After processing the IMU and GNSS observations, as described in Section 2.2, a navigation solution was obtained for the surveyed path. Of interest are the instants that correspond to the analyzed photogrammetric pairs. The results obtained are presented in Table 7.

The estimation of the accuracy of the IMU, either in coordinates or in orientations, cannot rely on the differences between the IMU and camera observations, as these depend on the direction of travel of the vehicle. The strategy followed was to calculate the linear and angular offsets of the left camera in the IMU r.s. for each observation; in the absence of observation errors, these offsets should be constant.

The offset matrices, MLcamDGS, were calculated using Formula (23), where MLcamCart is the left camera rotation matrix, calculated with the rotation angles in Table 5, and MDGSCart is the rotation matrix of the DGS, calculated with the rotation angles in Table 7. The offset angles were extracted using Formula (26). The linear offsets were calculated using Formula (24), where [X0Lcam − X0DGS]Cart is the difference between the cartographic coordinates of the left camera and the DGS, given respectively in Tables 5 and 7. The results are presented in Table 8.

In this case, as the DGS r.s. is leveled over the platform and its orientation matches the orientation of the platform, the linear offsets are known rigorously by means of precise measurements (see Figure 2). In this way it is possible to assess the mean error and the root mean square error of the three linear offset components, which are also given in Table 8. The linear accuracy of this GNSS/IMU integration can be assessed by means of the root mean square error of the vector T, and the angular accuracy by means of the standard deviation of the obtained offset angles.
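The statistics in Table 8 follow directly from the offsets computed pair by pair. The sketch below, a minimal illustration rather than the authors' code, reproduces those statistics for the X component of the linear offset, the length of vector T and the heading offset, using the per-pair values and the known platform measurements listed in Table 8.

```python
import numpy as np

# Per-pair offsets of the left camera in the DGS r.s. (columns of Table 8).
offset_x = np.array([-0.411, -0.579, -0.558, -0.497, -0.524, -0.537, -0.460, -0.462, -0.462])
t_vector = np.array([0.456, 0.601, 0.563, 0.505, 0.541, 0.555, 0.472, 0.464, 0.467])
heading = np.array([1.697, 2.927, 3.010, 1.423, 1.372, 1.256, 1.168, 1.184, 1.155])

known_x, known_t = -0.522, 0.537     # offsets measured rigorously on the platform (Figure 2)

def stats(values, known=None):
    """Mean and standard deviation; mean error and RMSE when a reference value is known."""
    out = {"mean": values.mean(), "std": values.std(ddof=1)}
    if known is not None:
        err = values - known
        out.update(mean_error=err.mean(), rmse=np.sqrt((err**2).mean()))
    return out

print("offset X:", stats(offset_x, known_x))        # std ~0.055 m, RMSE ~0.057 m
print("|T|:", stats(t_vector, known_t))             # RMSE ~0.055 m
print("heading offset:", stats(heading))            # std ~0.75 deg
print("heading, straight-line pairs only:", stats(np.delete(heading, [1, 2])))
```

Removing the two curve pairs (indices 1 and 2, i.e., the 2nd and 3rd pairs) reproduces the remark made in the Discussion that the heading consistency improves to better than 0.2°.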

4. Discussion

The evaluation of the DGS accuracy relied on the comparison between its parameters, computed using measurements acquired in good GNSS observation conditions, and the ones obtained from the exterior orientation of the left camera at the same instants, by space resection, using rigorous control points on the pavement. It was necessary first to estimate the accuracy of this exterior orientation process, which is summarized in Table 6, showing an accuracy better than 1.5 cm in the 3D positions and better than 0.1 degrees in the orientation angles. This result agrees with what was expected from the geometric characteristics of the camera/lens systems and from previous experiments made during calibration tests [18]. The obtained mean linear re-projection error of 0.35 pixels represents 0.6 cm to 1.5 cm at 4 to 10 meters distance, respectively, the range of distances at which the control points were observed.

The accuracy estimation of the DGS itself relied on the fact that the offset parameters between the DGS and the cameras must be constant. Using this condition, as shown by the results summarized in Table 8, the obtained accuracy was better than 6 cm in 3D positions, around 0.15 degrees in roll and pitch, and 0.75 degrees in heading.

A major issue relates to the fact that the accuracy of the DGS is being assessed through an independent process of obtaining position and orientation. It can be noted, however, that the accuracy estimated for the space resection process is better than the accuracy estimated for the DGS: about 4 times better in 3D positions, 2.5 times in attitude (ω and φ angles) and 20 times in heading.

As a final remark, and regarding Table 8, if the GNSS/IMU observations made in curves, the 2nd and 3rd pairs in the table, are discarded, the heading accuracy achieved is higher, remaining under 0.2°. As expected, the heading accuracy given by the GNSS/IMU system loses quality when the vehicle is changing direction.

5. Conclusions

In this work the accuracy of the navigation solution provided by a DGS, implemented with a particular GNSS/IMU integration, was evaluated for a terrestrial platform moving in an urban environment. The sensors used were a high quality dual frequency GNSS receiver and a medium quality MEMS IMU, integrated by means of an extended Kalman filter. The results show that, for specific environments, this type of sensor can deliver good results and replace the more expensive tactical grade IMUs.

The methodology used was to compare the DGS-derived parameters with those obtained directly from the images by applying space resection at the same instants. The accuracy obtained for the DGS, presented in Table 8, was of 6 cm in the position parameters, 0.2° in attitude and 0.8° in heading.

Acknowledgments

This work was partially funded through FCT (Portuguese Foundation for Science and Technology) in the scope of projects PEst-C/MAR/LA0015/2013, DEOSOM (Detection and Evaluation of Oil Spills by Optical Methods), through the AMPERA ERA-NET, and PTDC/AGR-AAM/104819/2008.

Author Contributions

Madeira S., Gonçalves J. and Bastos L. conceived and designed the experiments; Madeira S. and Yan W. performed the experiments; Madeira S., Yan W. and Gonçalves J. analyzed the data; Madeira S., Gonçalves J., Yan W. and Bastos L. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Madeira, S.; Gonçalves, J.; Bastos, L. Sensor Integration in a Low Cost Land Mobile Mapping System. Sensors 2012, 12, 2935–2953.
2. Schwarz, K.P.; Sheimy, N. Digital Mobile Mapping Systems—State of the Art and Future Trends. In Advances in Mobile Mapping Technology; Taylor & Francis Group: London, UK, 2007; pp. 3–18.
3. Gruen, A.; Huang, T.S., Eds. Calibration and Orientation of Cameras in Computer Vision; Springer Series in Information Sciences; Springer-Verlag: Berlin/Heidelberg, Germany, 2001; p. 235.
4. Madeira, S.; Gonçalves, J.; Bastos, L. Photogrammetric mapping and measuring application using MATLAB. Comput. Geosci. 2010, 36, 699–706.
5. Gikas, V.; Daskalakis, S. Determining Rail Track Axis Geometry Using Satellite and Terrestrial Geodetic Data. Surv. Rev. 2008, 40, 392–405.
6. Wang, C.; Hassan, T.; El-Sheimy, N.; Lavigne, M. Automatic Road Vector Extraction for Mobile Mapping Systems. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2008, XXXVII-B3b, 516–522.
7. Madeira, S.; Gonçalves, J.; Bastos, L. Accurate DTM generation in sand beaches using Mobile Mapping. J. Coast. Conserv. 2013.
8. Vaaja, M.; Hyyppä, J.; Kukko, A.; Kaartinen, H.; Hyyppä, H.; Alho, P. Mapping Topography Changes and Elevation Accuracies Using a Mobile Laser Scanner. Sensors 2011, 11, 587–600.
9. Aoki, K.; Yamamoto, K.; Shimamura, H. Evaluation model for pavement surface distress on 3D point clouds from mobile mapping system. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2012, XXXIX-B3, 87–90.
10. Deurloo, R.A. Development of a Kalman Filter Integrating System and Measurement Models for a Low-cost Strapdown Airborne Gravimetry System. Ph.D. Thesis, University of Porto, Porto, Portugal, 2011.
11. Bastos, L.; Yan, W.; Magalhães, A.; Ayres-Sampaio, D.; Deurloo, R. Assessment the performance of low cost IMUs for strapdown airborne gravimetry using UAVs. Proceedings of the 4th International Galileo Science Colloquium, Scientific and Fundamental Aspects of the Galileo Programme, Prague, Czech Republic, 4–6 December 2013.
12. Ayres-Sampaio, D.; Gonçalves, J.A.; Magalhães, A.; Bastos, L. Evaluating the performance of MEMS-based IMUs for direct georeferencing. Proceedings of the 8th Portuguese and Spanish Assembly of Geodesy and Geophysics, Evora, Portugal, 29–31 January 2014; pp. 337–340.
13. Grewal, M.; Weill, L.; Andrews, A. Global Positioning Systems, Inertial Navigation and Integration; J. Wiley & Sons: Hoboken, NJ, USA, 2007.
14. Schwarz, K.P.; Chapman, M.A.; Cannon, M.W.; Gong, P. An integrated INS/GPS Approach to the Georeferencing of Remotely Sensed Data. ISPRS J. Photogramm. Remote Sens. 1993, 59, 1667–1674.
15. Borgese, G.; Rizzo, L.; Artese, G.; Pace, C. Compact Wireless GPS/inertial System. Proceedings of the 4th Annual Caneus Fly-By-Wireless Workshop, Montreal, QC, Canada, 14–17 June 2011; pp. 15–18.
16. Artese, G.; Trecroci, A. Calibration of a low cost MEMS INS sensor for an integrated navigation system. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2008, XXXVII-B5, 877–882.
17. El-Sheimy, N. The Development of VISAT—A Mobile Survey System For GIS Applications. Ph.D. Thesis, The University of Calgary, Calgary, Canada, 1996; pp. 80–87.
18. Hofmann-Wellenhof, B.; Lichtenegger, H.; Wasle, E. GNSS—Global Navigation Satellite Systems: GPS, GLONASS, Galileo, and more; Springer-Verlag: Wien, Austria, 2008.
19. Britting, K.R. Inertial Navigation System Analysis; John Wiley & Sons, Inc.: New York, NY, USA, 1972.
20. Madeira, S.; Gonçalves, J.; Bastos, L. Fast camera calibration for low cost mobile mapping. Proceedings of the 6th International Symposium on Mobile Mapping Technology, Presidente Prudente, Brazil, 21–24 July 2009.
21. Jekeli, C. Inertial Navigation Systems with Geodetic Applications; Walter de Gruyter: Berlin, Germany, 2001.
22. Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. Trans. ASME J. Basic Eng. 1960, 82, 35–45.
23. Moore, J.B.; Qi, H. Direct Kalman Filtering Approach for GPS/INS Integration. IEEE Trans. Aerosp. Elect. Syst. 2002, 38, 687–693.
24. Yaakov, B.S.; Li, X.R.; Thiagalingam, K. Estimation with Applications to Tracking and Navigation; John Wiley & Sons: New York, NY, USA, 2001.
25. Gikas, V.; Cross, P.A.; Akuamoa, A. A Rigorous and Integrated Approach to Hydrophone and Source Positioning during Multi-Streamer Offshore Seismic Exploration. Hydrogr. J. 1995, 77, 11–24.
26. Gelb, A. Applied Optimal Estimation; M.I.T. Press: Cambridge, UK, 1974.
27. Wolf, P.R.; Dewitt, B.A.; Wilkinson, B. Elements of Photogrammetry, with Applications in GIS, 3rd ed.; McGraw-Hill: New York, NY, USA, 2000; pp. 237–245.
Figure 1. (a) Surveying vehicle; (b) Equipment detail.
Figure 2. Platform plan with components location and measured lengths.
Figure 3. Test zone of the terrestrial case: (a) general view; (b) scheme of the observed control.
Table 1. Novatel manufacturer specifications.

Novatel DL-V3 Dual Frequency GNSS Phase Receiver
Mode                 Horizontal Position Accuracy (RMS)
Single Point L1      1.5 m
Single Point L1/L2   1.2 m
SBAS                 0.6 m
DGPS                 0.4 m
RT-20                0.2 m
RT-2                 1 cm + 1 ppm
Table 2. AHRS440 manufacturer specifications.

Crossbow AHRS440 IMU
Heading
  Range: ±180°
  Accuracy (RMS): <1.0°
  Resolution: <0.1°
Attitude
  Range (Roll, Pitch): ±180°, ±90°
  Accuracy (RMS): <0.2°
  Resolution: <0.02°
Table 3. Characteristics of camera/lens used.

Camera: (3.2 × 2.4) mm² CCD array; cell size of 5.6 μm × 5.6 μm; array size of 640 × 480 pixels.
Lens: focal length: Vario 1.8–3.6 mm; iris range: F1.6–Close; minimum object distance: 0.2 m; angle of view (Hor): 97° to 53° for a (3.2 × 2.4) mm² CCD.
Table 4. Rotations and rotation matrix conventions used for a terrestrial camera and a moving body.

Terrestrial camera (ω, φ, κ), rotation order: 1st ω, 2nd φ, 3rd κ:
$$R = R_3(\kappa)\,R_2(\varphi)\,R_1(\omega) = \begin{pmatrix} \cos\kappa & \sin\kappa & 0 \\ -\sin\kappa & \cos\kappa & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} \cos\varphi & 0 & -\sin\varphi \\ 0 & 1 & 0 \\ \sin\varphi & 0 & \cos\varphi \end{pmatrix}\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\omega & \sin\omega \\ 0 & -\sin\omega & \cos\omega \end{pmatrix}$$

Moving body (roll, pitch, heading), rotation order: 1st heading, 2nd pitch, 3rd roll:
$$R = R_2(\mathrm{roll})\,R_1(\mathrm{pitch})\,R_3(\mathrm{heading}) = \begin{pmatrix} \cos R & 0 & -\sin R \\ 0 & 1 & 0 \\ \sin R & 0 & \cos R \end{pmatrix}\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos P & \sin P \\ 0 & -\sin P & \cos P \end{pmatrix}\begin{pmatrix} \cos H & \sin H & 0 \\ -\sin H & \cos H & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
(R = roll, P = pitch, H = heading)
Table 5. Results of space resection in nine image pairs.

Pair 1 (straight line)
  Left image:  ω = −0.737°, φ = 4.037°, κ = 283.200°; X = −43700.132 m, Y = 153423.686 m, Z = 8.435 m; Stdxy = 1.3; 1.3 pix
  Right image: ω = −2.258°, φ = 3.858°, κ = 284.374°; X = −43699.833 m, Y = 153422.684 m, Z = 8.448 m; Stdxy = 1.7; 1.5 pix
  Relative:    ωrel = −0.174°, φrel = −1.518°, κrel = 1.067°; Xrel = 1.043 m, Yrel = 0.062 m, Zrel = 0.021 m; T = 1.045 m

Pair 2 (curve)
  Left image:  ω = 1.777°, φ = 4.199°, κ = 255.460°; X = −43694.735 m, Y = 153423.886 m, Z = 8.423 m; Stdxy = 2.4; 1.2 pix
  Right image: ω = −0.397°, φ = 4.706°, κ = 256.671°; X = −43694.946 m, Y = 153422.871 m, Z = 8.431 m; Stdxy = 2.2; 0.9 pix
  Relative:    ωrel = −0.147°, φrel = −1.459°, κrel = 1.102°; Xrel = 1.035 m, Yrel = 0.048 m, Zrel = 0.024 m; T = 1.037 m

Pair 3 (curve)
  Left image:  ω = 0.365°, φ = 3.092°, κ = 231.655°; X = −43692.415 m, Y = 153422.506 m, Z = 8.483 m; Stdxy = 1.7; 1.3 pix
  Right image: ω = −0.753°, φ = 4.259°, κ = 232.757°; X = −43692.993 m, Y = 153421.642 m, Z = 8.530 m; Stdxy = 1.3; 1.2 pix
  Relative:    ωrel = −0.224°, φrel = −1.599°, κrel = 1.028°; Xrel = 1.038 m, Yrel = 0.081 m, Zrel = 0.021 m; T = 1.041 m

Pair 4 (straight line)
  Left image:  ω = −4.790°, φ = −0.199°, κ = 10.994°; X = −43687.679 m, Y = 153411.606 m, Z = 8.576 m; Stdxy = 1.4; 0.6 pix
  Right image: ω = −4.626°, φ = −1.677°, κ = 12.053°; X = −43686.669 m, Y = 153411.868 m, Z = 8.570 m; Stdxy = 1.3; 0.9 pix
  Relative:    ωrel = −1.121°, φrel = −1.483°, κrel = 1.054°; Xrel = 1.042 m, Yrel = 0.064 m, Zrel = 0.013 m; T = 1.044 m

Pair 5 (straight line)
  Left image:  ω = −4.645°, φ = 0.253°, κ = 10.846°; X = −43689.485 m, Y = 153418.980 m, Z = 8.487 m; Stdxy = 1.3; 1.1 pix
  Right image: ω = −4.460°, φ = −1.218°, κ = 11.866°; X = −43688.495 m, Y = 153419.236 m, Z = 8.480 m; Stdxy = 1.1; 1.0 pix
  Relative:    ωrel = −0.096°, φrel = −1.479°, κrel = 1.017°; Xrel = 1.020 m, Yrel = 0.065 m, Zrel = 0.018 m; T = 1.023 m

Pair 6 (straight line)
  Left image:  ω = −3.787°, φ = −0.965°, κ = 10.027°; X = −43691.099 m, Y = 153425.459 m, Z = 8.513 m; Stdxy = 1.1; 1.0 pix
  Right image: ω = −3.642°, φ = −2.585°, κ = 11.028°; X = −43690.102 m, Y = 153425.712 m, Z = 8.526 m; Stdxy = 1.5; 1.5 pix
  Relative:    ωrel = −0.140°, φrel = −1.621°, κrel = 0.995°; Xrel = 1.026 m, Yrel = 0.075 m, Zrel = 0.013 m; T = 1.029 m

Pair 7 (straight line)
  Left image:  ω = 4.873°, φ = 1.225°, κ = 190.571°; X = −43694.993 m, Y = 153440.921 m, Z = 8.838 m; Stdxy = 1.6; 0.8 pix
  Right image: ω = 4.798°, φ = 2.613°, κ = 191.616°; X = −43696.017 m, Y = 153440.706 m, Z = 8.874 m; Stdxy = 1.9; 0.9 pix
  Relative:    ωrel = −0.181°, φrel = −1.379°, κrel = 1.041°; Xrel = 1.047 m, Yrel = 0.019 m, Zrel = 0.032 m; T = 1.047 m

Pair 8 (straight line)
  Left image:  ω = 5.271°, φ = 2.060°, κ = 190.560°; X = −43692.895 m, Y = 153432.684 m, Z = 8.678 m; Stdxy = 1.9; 1.0 pix
  Right image: ω = 5.139°, φ = 3.609°, κ = 191.576°; X = −43693.894 m, Y = 153432.413 m, Z = 8.708 m; Stdxy = 2.1; 1.1 pix
  Relative:    ωrel = −0.154°, φrel = −1.547°, κrel = 1.008°; Xrel = 1.033 m, Yrel = 0.079 m, Zrel = 0.018 m; T = 1.036 m

Pair 9 (straight line)
  Left image:  ω = 4.823°, φ = −0.074°, κ = 189.460°; X = −43691.246 m, Y = 153425.281 m, Z = 8.536 m; Stdxy = 1.5; 1.0 pix
  Right image: ω = −4.776°, φ = 1.269°, κ = 190.526°; X = −43692.288 m, Y = 153425.062 m, Z = 8.540 m; Stdxy = 1.3; 0.7 pix
  Relative:    ωrel = −0.175°, φrel = −1.333°, κrel = 1.063°; Xrel = 1.064 m, Yrel = 0.044 m, Zrel = 0.023 m; T = 1.065 m
Table 6. Accuracy estimation of the cameras' exterior orientation by space resection.

Std: Δω = 0.037°, Δφ = 0.094°, Δκ = 0.034°, Δx = 0.013 m, Δy = 0.020 m, Δz = 0.006 m, ΔT = 0.012 m
Mean error of ΔT: −0.003 m
RMSE of ΔT: 0.012 m
Table 7. Position and orientation parameters obtained from GNSS/IMU processing at the same instants of the photogrammetric pairs.

Pair   X (m)        Y (m)        H (m)   Roll (°)   Pitch (°)   Heading (°)
1      −43,700.51   153,422.84   8.38    1.848      0.457       75.039
2      −43,695.31   153,422.95   8.41    0.829      0.417       101.501
3      −43,693.06   153,421.66   8.44    0.711      3.761       125.160
4      −43,687.48   153,411.23   8.57    −0.039     −0.849      347.533
5      −43,689.25   153,418.56   8.52    −0.498     −0.721      347.775
6      −43,690.85   153,425.03   8.51    0.954      −1.791      348.608
7      −43,695.77   153,440.53   8.84    −1.404     0.863       168.090
8      −43,693.66   153,432.23   8.70    −2.073     0.330       168.016
9      −43,692.01   153,424.82   8.48    −0.092     1.061       169.328
Table 8. Offset parameters of the left camera in the GNSS/INS reference system.

Pair           Offset X (m)   Offset Y (m)   Offset H (m)   T vector (m)   Offset Roll (°)   Offset Pitch (°)   Offset Heading (°)
1              −0.411         0.188          0.061          0.456          1.075             −5.792             1.697
2              −0.579         0.158          0.018          0.601          0.867             −5.416             2.927
3              −0.558         0.024          0.073          0.563          0.885             −5.540             3.010
4              −0.497         0.085          0.011          0.505          0.845             −5.580             1.423
5              −0.524         0.133          −0.033         0.541          0.770             −5.316             1.372
6              −0.537         0.138          0.016          0.555          0.967             −5.493             1.256
7              −0.460         0.106          0.014          0.472          1.239             −5.602             1.168
8              −0.462         0.043          0.000          0.464          1.051             −5.498             1.184
9              −0.462         0.027          0.063          0.467          1.135             −5.777             1.155
Mean           −0.499         0.100          0.025          0.514          0.982             −5.557             1.688
Std            0.055          0.059          0.034          0.053          0.154             0.155              0.746
Known Values   −0.522         0.125          0.020          0.537
Mean Error     0.023          −0.025         0.005          −0.023
RMSE           0.057          0.061          0.033          0.055
