# Utilization of a Terrestrial Laser Scanner for the Calibration of Mobile Mapping Systems


## Abstract


## 1. Introduction

## 2. Methodology

#### 2.1. Overview

#### 2.2. Camera Calibration

The collinearity equations relate the image coordinates ($x_i, y_i$), focal length ($c$), principal point ($x_p, y_p$), lens distortion ($\Delta x_i, \Delta y_i$), camera position ($x_c, y_c, z_c$), camera orientation ($m_{11}$ ~ $m_{33}$), and object position ($x_o, y_o, z_o$). The collinearity equation can be represented by Equation (1) [25]:
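As a concrete illustration, the collinearity projection can be sketched in a few lines. The function name, the looking-direction sign convention, and the example numbers are illustrative assumptions, not the paper's notation; distortion terms are omitted here.

```python
import numpy as np

def collinearity_project(obj_pt, cam_pos, M, c, xp=0.0, yp=0.0):
    """Project an object point into image coordinates via the standard
    collinearity equations: rotate the object-to-camera vector into the
    camera frame with M (m11..m33), then scale by the focal length c."""
    d = M @ (np.asarray(obj_pt) - np.asarray(cam_pos))  # camera-frame vector
    x = xp - c * d[0] / d[2]
    y = yp - c * d[1] / d[2]
    return x, y

# Camera at the origin with identity orientation; a point on the optical
# axis should project onto the principal point.
x, y = collinearity_project([0.0, 0.0, 10.0], [0.0, 0.0, 0.0],
                            np.eye(3), c=2.8)
```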

The corrected image coordinates ($x_n, y_n$) can be calculated by Equation (2), and the camera orientation parameters ($m_{11}$ ~ $m_{33}$) can be calculated from the rotation angles ($\omega, \phi, \kappa$) by Equation (3):

where $r_n$ is $\sqrt{{x}_{n}^{2}+{y}_{n}^{2}}$; $A_1$, $A_2$, and $A_3$ are the radial distortion parameters; and $B_1$ and $B_2$ are the tangential distortion parameters.
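These parameter sets can be exercised in a short sketch. The rotation order below ($R_\kappa R_\phi R_\omega$) and the Brown-model form of the distortion terms are common photogrammetric conventions, assumed here for illustration; the paper's Equations (2) and (3) fix the exact order and signs.

```python
import numpy as np

def rotation_from_angles(omega, phi, kappa):
    """Build the 3x3 orientation matrix (m11..m33) from rotation angles
    about the x, y, z axes, using one common multiplication order."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def brown_distortion(xn, yn, A, B):
    """Radial (A1..A3) and tangential (B1, B2) distortion corrections
    for coordinates (xn, yn), in the standard Brown form."""
    r2 = xn**2 + yn**2
    radial = A[0]*r2 + A[1]*r2**2 + A[2]*r2**3
    dx = xn*radial + B[0]*(r2 + 2*xn**2) + 2*B[1]*xn*yn
    dy = yn*radial + B[1]*(r2 + 2*yn**2) + 2*B[0]*xn*yn
    return dx, dy

R = rotation_from_angles(0.1, -0.2, 0.3)           # must be orthonormal
dx0, dy0 = brown_distortion(0.0, 0.0, [-0.33, 0.13, -0.02], [0.0, 0.0])
```

A quick sanity check on any such implementation: the orientation matrix must be orthonormal with determinant +1, and the distortion must vanish at the principal point.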

#### 2.3. Registration and Georeferencing of Terrestrial Laser Scanning Data

#### 2.4. Mobile Mapping System Calibration

Accordingly, the mathematical model for the camera sensor can be defined by Equation (6):
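The generic direct-georeferencing chain for a sensor rigidly mounted on the MMS can be sketched as follows. The symbols and numbers are illustrative assumptions; the paper's Equation (6) defines the exact model.

```python
import numpy as np

def sensor_to_mapping(p_sensor, R_boresight, lever_arm, R_nav, t_nav):
    """Transform a sensor-frame point to the mapping frame: apply the
    boresight rotation and lever-arm offset (sensor -> body), then the
    GNSS/INS attitude and position (body -> mapping)."""
    p_body = R_boresight @ p_sensor + lever_arm
    return R_nav @ p_body + t_nav

p = sensor_to_mapping(
    np.array([1.0, 0.0, 0.0]),       # point observed in the sensor frame
    np.eye(3),                       # boresight rotation (identity here)
    np.array([0.5, 1.0, 0.3]),       # lever-arm offset (m)
    np.eye(3),                       # navigation attitude
    np.array([100.0, 200.0, 50.0]))  # navigation position
```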

#### 2.5. Adjustment Model

where ξ is the vector of unknown parameters ($x_1, x_2, \ldots, x_n$) and e is the random error vector. It is assumed that the random errors follow the normal distribution ($e \sim \left(0,{\sigma}_{0}^{2}{P}^{-1}\right)$). ${\sigma}_{0}^{2}$ is the variance component used as a scale factor, and P is the weight matrix, which is the inverse of the variance-covariance matrix. The weight of an observation is inversely proportional to its variance, and the covariance between uncorrelated observations is zero. A high variance indicates that the observation has a large error and requires a large correction. When a measurement system contains observations of different precisions, the weight matrix is controlled by the observation variances.

In this paper, the weight matrix for image points was set as the identity matrix under the assumption that the observations have identical precisions. The weight matrix is adjusted when features such as lines or planes in images or point clouds are used as observations of the system [24,58]. For the plane features extracted from the point cloud, the precisions of the points in the normal direction of the plane were set to one, and those in the other directions were set to zero. With this approach, the similarity of the plane features in pairwise datasets can be measured, and the optimized parameters that maximize the similarity and minimize the discrepancy can be estimated. Furthermore, when the control features are expected to have different precisions, the weight matrix should reflect those precisions. In this paper, the weights of the plane features extracted from the laser scanning data were set to the inverses of the squared plane-model fitting errors.
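The weighting scheme described above can be illustrated with a minimal weighted least-squares sketch; the design matrix and numbers are made up for illustration, not the paper's data.

```python
import numpy as np

# Observations with different precisions receive weights inversely
# proportional to their variances; the low-precision third observation
# therefore pulls the estimate far less than the first two.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # design matrix
l = np.array([1.02, 1.98, 3.10])                      # observations
sigma = np.array([0.01, 0.01, 0.10])                  # a priori std devs
P = np.diag(1.0 / sigma**2)                           # weight matrix

# Normal equations of weighted least squares: x = (A^T P A)^{-1} A^T P l
x_hat = np.linalg.solve(A.T @ P @ A, A.T @ P @ l)
```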

where ξ$_o$ is the approximate parameter vector before the correction, Δξ is the correction vector of the unknown parameters, ξ$_{new}$ is the updated parameter vector after the correction, F is the nonlinear observation function evaluated at ξ$_o$, and J is the Jacobian matrix, which contains the linearized equations of the nonlinear observation model and is configured as Equation (12):
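The iterative update described above (linearize at the current estimate, solve for the correction, update) can be sketched with a toy one-parameter model; the model F below is illustrative only, not the paper's observation system.

```python
import numpy as np

def F(xi):
    """Toy nonlinear observation model with true parameter xi = 2."""
    return np.array([xi**2, xi**3])

def J(xi):
    """Jacobian of F with respect to xi (column vector)."""
    return np.array([[2*xi], [3*xi**2]])

l = F(2.0)     # synthetic observations generated with the true parameter
xi_o = 1.0     # approximate parameter before correction
for _ in range(20):
    Ji = J(xi_o)
    # Correction from the normal equations (unit weights here)
    d_xi = np.linalg.solve(Ji.T @ Ji, Ji.T @ (l - F(xi_o)))
    xi_o = xi_o + d_xi[0]   # xi_new = xi_o + delta_xi
```

Starting from 1.0, the iteration converges to the true value 2.0 well within 20 steps.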

#### 2.6. Feature Extraction

For a pixel ($x_i, y_i$), the variation ($E(x_i, y_i)$) of the pixel values ($I(x_i, y_i)$) for a shift ($\Delta x, \Delta y$) within a window of size w can be represented by Equation (15):

where $I_x$ and $I_y$ are the gradients of the pixel values in the x and y directions, respectively. The score ($R$) for determining corner points can then be calculated by Equation (17):
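A compact sketch of the Harris corner score follows, assuming a plain 3×3 box window (the paper's window size w and weighting may differ) and the standard response R = det(M) − k·trace(M)².

```python
import numpy as np

def harris_response(I, k=0.04):
    """Per-pixel Harris corner score: accumulate products of the image
    gradients Ix, Iy over a 3x3 window, then score the structure matrix."""
    Iy, Ix = np.gradient(I.astype(float))       # gradients in y and x
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(A):  # 3x3 box sum via shifted copies
        S = np.zeros_like(A)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                S += np.roll(np.roll(A, dy, axis=0), dx, axis=1)
        return S

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return Sxx * Syy - Sxy**2 - k * (Sxx + Syy)**2

# Synthetic image with a bright square: corners score positive, edges
# score negative, so corners are easy to threshold.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
```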

## 3. Data Preparation

#### 3.1. Sensors

#### 3.2. Datasets

## 4. Calibration Results

#### 4.1. Camera Calibration Results

#### 4.2. Boresight and Lever-Arm Calibration Result

## 5. Discussion and Future Work

## 6. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

1. Slattery, K.T.; Slattery, D.K.; Peterson, J.P. Road construction earthwork volume calculation using three-dimensional laser scanning. J. Surv. Eng. **2011**, 138, 96–99.
2. Gonzalez-Jorge, H.; Solla, M.; Armesto, J.; Arias, P. Novel method to determine laser scanner accuracy for applications in civil engineering. Opt. Appl. **2012**, 42, 43–53.
3. Bitenc, M.; Lindenbergh, R.; Khoshelham, K.; Van Waarden, A.P. Evaluation of a lidar land-based mobile mapping system for monitoring sandy coasts. Remote Sens. **2011**, 3, 1472–1491.
4. Cho, H.; Hong, S.; Kim, S.; Park, H.; Park, I.; Sohn, H.-G. Application of a terrestrial lidar system for elevation mapping in Terra Nova Bay, Antarctica. Sensors **2015**, 15, 23514.
5. Pellenz, J.; Lang, D.; Neuhaus, F.; Paulus, D. Real-time 3D mapping of rough terrain: A field report from Disaster City. In Proceedings of the 2010 IEEE Safety Security and Rescue Robotics, Bremen, Germany, 26–30 July 2010; pp. 1–6.
6. Gong, J. Mobile lidar data collection and analysis for post-Sandy disaster recovery. In Proceedings of the 2013 International Workshop of Computing in Civil Engineering, Los Angeles, CA, USA, 23–25 June 2013.
7. Tang, P.; Huber, D.; Akinci, B.; Lipman, R.; Lytle, A. Automatic reconstruction of as-built building information models from laser-scanned point clouds: A review of related techniques. Autom. Constr. **2010**, 19, 829–843.
8. Klein, L.; Li, N.; Becerik-Gerber, B. Imaged-based verification of as-built documentation of operational buildings. Autom. Constr. **2012**, 21, 161–171.
9. Snavely, N.; Seitz, S.M.; Szeliski, R. Photo tourism: Exploring photo collections in 3D. ACM Trans. Graph. **2006**, 25, 835–846.
10. De Reu, J.; Plets, G.; Verhoeven, G.; De Smedt, P.; Bats, M.; Cherretté, B.; De Maeyer, W.; Deconynck, J.; Herremans, D.; Laloo, P. Towards a three-dimensional cost-effective registration of the archaeological heritage. J. Archaeol. Sci. **2013**, 40, 1108–1121.
11. Golparvar-Fard, M.; Bohn, J.; Teizer, J.; Savarese, S.; Peña-Mora, F. Evaluation of image-based modeling and laser scanning accuracy for emerging automated performance monitoring techniques. Autom. Constr. **2011**, 20, 1143–1155.
12. Dai, F.; Rashidi, A.; Brilakis, I.; Vela, P. Comparison of image-based and time-of-flight-based technologies for three-dimensional reconstruction of infrastructure. J. Constr. Eng. Manag. **2013**, 139, 69–79.
13. Puttonen, E.; Lehtomäki, M.; Kaartinen, H.; Zhu, L.; Kukko, A.; Jaakkola, A. Improved sampling for terrestrial and mobile laser scanner point cloud data. Remote Sens. **2013**, 5, 1754–1773.
14. Olsen, M.J.; Kuester, F.; Chang, B.J.; Hutchinson, T.C. Terrestrial laser scanning-based structural damage assessment. J. Comput. Civil Eng. **2009**, 24, 264–272.
15. Gordon, S.J.; Lichti, D.D. Modeling terrestrial laser scanner data for precise structural deformation measurement. J. Surv. Eng. **2007**, 133, 72–80.
16. Schwarz, K.P.; El-Sheimy, N. Mobile mapping systems: State of the art and future trends. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2004**, 35, 10.
17. Novak, K. The Ohio State University highway mapping system: The stereo vision system component. In Proceedings of the 47th Annual Meeting of The Institute of Navigation, Williamsburg, VA, USA, 10–12 June 1991; pp. 121–124.
18. Grejner-Brzezinska, D.A. Mobile mapping technology: Ten years later (part one). Surv. Land Inf. Syst. **2001**, 61, 75–92.
19. Madeira, S.; Gonçalves, J.A.; Bastos, L. Sensor integration in a low cost land mobile mapping system. Sensors **2012**, 12, 2935–2953.
20. Sairam, N.; Nagarajan, S.; Ornitz, S. Development of mobile mapping system for 3D road asset inventory. Sensors **2016**, 16, 367.
21. Jaakkola, A.; Hyyppä, J.; Kukko, A.; Yu, X.; Kaartinen, H.; Lehtomäki, M.; Lin, Y. A low-cost multi-sensoral mobile mapping system and its feasibility for tree measurements. ISPRS J. Photogramm. Remote Sens. **2010**, 65, 514–522.
22. Da Silva, J.F.C.; Camargo, P.d.O.; Gallis, R. Development of a low-cost mobile mapping system: A South American experience. Photogramm. Record **2003**, 18, 5–26.
23. Habib, A.; Bang, K.I.; Kersting, A.P.; Chow, J. Alternative methodologies for lidar system calibration. Remote Sens. **2010**, 2, 874–907.
24. Chan, T.O.; Lichti, D.D.; Glennie, C.L. Multi-feature based boresight self-calibration of a terrestrial mobile mapping system. ISPRS J. Photogramm. Remote Sens. **2013**, 82, 112–124.
25. Habib, A.F.; Morgan, M.F. Automatic calibration of low-cost digital cameras. Opt. Eng. **2003**, 42, 948–955.
26. van den Heuvel, F.A. Estimation of interior orientation parameters from constraints on line measurements in a single image. Int. Arch. Photogramm. Remote Sens. **1999**, 32, W11.
27. Sturm, P.F.; Maybank, S.J. On plane-based camera calibration: A general algorithm, singularities, applications. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Fort Collins, CO, USA, 23–25 June 1999.
28. Zhang, Z. Flexible camera calibration by viewing a plane from unknown orientations. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; pp. 666–673.
29. Bradski, G.; Kaehler, A. Learning OpenCV: Computer Vision with the OpenCV Library; O'Reilly Media, Inc.: Sebastopol, CA, USA, 2008.
30. Blanchet, G.; Charbit, M. Digital Signal and Image Processing Using MATLAB, Volume 2: Advances and Applications: The Deterministic Case; John Wiley & Sons: Hoboken, NJ, USA, 2015.
31. Cain, C.; Leonessa, A. Validation of underwater sensor package using feature based SLAM. Sensors **2016**, 16, 380.
32. Chiang, K.-W.; Tsai, M.-L.; Naser, E.-S.; Habib, A.; Chu, C.-H. New calibration method using low cost MEMS IMUs to verify the performance of UAV-borne MMS payloads. Sensors **2015**, 15, 6560–6585.
33. Chan, T.O.; Lichti, D.D. Automatic in situ calibration of a spinning beam lidar system in static and kinematic modes. Remote Sens. **2015**, 7, 10480–10500.
34. Chan, T.; Lichti, D.D.; Belton, D. Temporal analysis and automatic calibration of the Velodyne HDL-32E lidar system. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. **2013**, 2, 61–66.
35. Glennie, C.; Lichti, D.D. Static calibration and analysis of the Velodyne HDL-64E S2 for high accuracy mobile scanning. Remote Sens. **2010**, 2, 1610–1624.
36. Yang, M.Y.; Förstner, W. Plane detection in point cloud data. In Proceedings of the 2nd International Conference on Machine Control Guidance, Bonn, Germany, 9–11 March 2010; pp. 95–104.
37. Le Scouarnec, R.; Touzé, T.; Lacambre, J.-B.; Seube, N. A new reliable boresight calibration method for mobile laser scanning applications. In Proceedings of the European Calibration and Orientation Workshop, Castelldefels, Spain, 12–14 February 2014.
38. Shang, E.; An, X.; Shi, M.; Meng, D.; Li, J.; Wu, T. An efficient calibration approach for arbitrary equipped 3-D lidar based on an orthogonal normal vector pair. J. Intell. Robot. Syst. **2015**, 79, 21–36.
39. Morales, J.; Martínez, J.L.; Mandow, A.; Reina, A.J.; Pequeño-Boter, A.; García-Cerezo, A. Boresight calibration of construction misalignments for 3D scanners built with a 2D laser rangefinder rotating on its optical center. Sensors **2014**, 14, 20025–20040.
40. Rieger, P.; Studnicka, N.; Pfennigbauer, M.; Zach, G. Boresight alignment method for mobile laser scanning systems. J. Appl. Geod. **2010**, 4, 13–21.
41. Filin, S. Recovery of systematic biases in laser altimetry data using natural surfaces. Photogramm. Eng. Remote Sens. **2003**, 69, 1235–1242.
42. Glennie, C. Calibration and kinematic analysis of the Velodyne HDL-64E S2 lidar sensor. Photogramm. Eng. Remote Sens. **2012**, 78, 339–347.
43. Keller, F.; Sternberg, H. Multi-sensor platform for indoor mobile mapping: System calibration and using a total station for indoor applications. Remote Sens. **2013**, 5, 5805–5824.
44. Puente, I.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P. Review of mobile mapping and surveying technologies. Measurement **2013**, 46, 2127–2145.
45. Reshetyuk, Y. Self-Calibration and Direct Georeferencing in Terrestrial Laser Scanning. Ph.D. Thesis, Royal Institute of Technology, Stockholm, Sweden, 2009.
46. Remondino, F.; Fraser, C. Digital camera calibration methods: Considerations and comparisons. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2006**, 36, 266–272.
47. Ministry of Land, Infrastructure and Transport of Korea. Standard of Horizontal Coordinate. Available online: http://www.law.go.kr/lsBylInfoPLinkR.do?lsiSeq=171727&lsNm=%EA%B3%B5%EA%B0%84%EC%A0%95%EB%B3%B4%EC%9D%98%EA%B5%AC%EC%B6%95%EB%B0%8F%EA%B4%80%EB%A6%AC%EB%93%B1%EC%97%90%EA%B4%80%ED%95%9C%EB%B2%95%EB%A5%A0%EC%8B%9C%ED%96%89%EB%A0%B9&bylNo=0002&bylBrNo=00&bylCls=BE&bylEfYd=&bylEfYdYn=Y (accessed on 29 August 2016).
48. U.S. General Services Administration. GSA BIM Guide for 3D Imaging. Available online: http://www.gsa.gov/bim (accessed on 28 January 2015).
49. Becerik-Gerber, B.; Jazizadeh, F.; Kavulya, G.; Calis, G. Assessment of target types and layouts in 3D laser scanning for registration accuracy. Autom. Constr. **2011**, 20, 649–658.
50. Tan, K.; Cheng, X. Correction of incidence angle and distance effects on TLS intensity data based on reference targets. Remote Sens. **2016**, 8, 251.
51. Rusinkiewicz, S.; Levoy, M. Efficient variants of the ICP algorithm. In Proceedings of the Third International Conference on 3-D Digital Imaging and Modeling, Quebec City, QC, Canada, 28 May–1 June 2001; pp. 145–152.
52. Korean National Geographic Information Institute. GNSS Data Service. Available online: http://gnss.ngii.go.kr/info/opsInfo (accessed on 26 January 2017).
53. Hassan, E. Calibration of Multi-Sensor Laser Scanning Systems; University of Calgary: Calgary, AB, Canada, 2014.
54. Leslar, M.; Wang, J.; Hu, B. Boresight and lever arm calibration of a mobile terrestrial lidar system. Geomatica **2016**, 70, 97–112.
55. El-Sheimy, N. Mobile multi-sensor systems: The new trend in mapping and GIS applications. In Geodesy Beyond 2000; Springer: New York, NY, USA, 2000; pp. 319–324.
56. Mikhail, E.M.; Ackermann, F.E. Observations and Least Squares; IEP Don-Donnelley: New York, NY, USA, 1976.
57. Ghilani, C.D. Adjustment Computations: Spatial Data Analysis; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2010.
58. Habib, A.; Ghanma, M.; Morgan, M.; Al-Ruzouq, R. Photogrammetric and lidar data registration using linear features. Photogramm. Eng. Remote Sens. **2005**, 71, 699–707.
59. Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988; p. 50.
60. AXIS. AXIS F1005-E Sensor Unit. Available online: http://www.axis.com/files/datasheet/ds_f1005e_1498822_en_1509.pdf (accessed on 29 August 2016).
61. Velodyne. HDL-32E Datasheet. Available online: http://velodynelidar.com/lidar/hdlproducts/97–0038d%20HDL-32E_datasheet.pdf (accessed on 29 August 2016).
62. International Electrotechnical Commission. Safety of Laser Products: Equipment Classification, Requirements and User's Guide; International Electrotechnical Commission: Geneva, Switzerland, 1998.
63. Oxford Technical Solutions. Survey+ User Manual. Available online: http://www.oxts.com/Downloads/Products/surveyplus/survey+man.pdf (accessed on 29 August 2016).
64. FARO. FARO Focus3D. Available online: http://www.faro.com/products/3d-surveying/laser-scanner-faro-focus-3d/overview (accessed on 29 August 2016).
65. Trimble. Trimble R8 User Guide. Available online: http://geocourse.kz/Downloads/manuals/GPS/Trimble%20R8-R6-R4_v480A_UserGuide%20ENG.pdf (accessed on 29 August 2016).
66. MathWorks. detectCheckerboardPoints. Available online: http://kr.mathworks.com/help/vision/ref/detectcheckerboardpoints.html#outputarg_boardSize (accessed on 29 August 2016).
67. Geiger, A.; Moosmann, F.; Car, Ö.; Schuster, B. Automatic camera and range sensor calibration using a single shot. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation (ICRA), St. Paul, MN, USA, 14–18 May 2012.
68. FARO. SCENE. Available online: http://www.faro.com/products/faro-software/scene/downloads#Download (accessed on 29 August 2016).

**Figure 1.** Configuration of mobile mapping system: network video cameras (F: front, L: left, R: right), mobile laser scanner, and GNSS/INS.

**Figure 3.** (**a**) Sphere target used for registration of terrestrial laser scanning data; (**b**) sphere target detected in a point cloud (the green sphere is a fitted sphere model).
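The sphere-model fitting shown in Figure 3b can be sketched with an algebraic least-squares fit. The linear formulation and the synthetic, noise-free data below are illustrative assumptions, not the software actually used in the paper.

```python
import numpy as np

def fit_sphere(pts):
    """Algebraic least-squares sphere fit: rewrite ||p - c||^2 = r^2 as
    the linear system ||p||^2 = 2 c.p + (r^2 - ||c||^2) and solve it."""
    A = np.hstack([2 * pts, np.ones((len(pts), 1))])
    b = np.sum(pts**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic points on a sphere with known center and radius
rng = np.random.default_rng(0)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 3.0]) + 0.145 * dirs
center, radius = fit_sphere(pts)
```

With real scan data, the fitting residuals also indicate how reliably the target was detected.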

**Figure 6.** Example of reference feature extraction: (**a**) point feature extracted from image; (**b**) plane feature extracted from mobile laser scanning data.

**Figure 7.** (**a**) Checkerboard and extracted reference points; (**b**) reference points for calibration of CAM(F); (**c**) reference points for calibration of CAM(L); (**d**) reference points for calibration of CAM(R).

**Figure 8.** Terrestrial laser scanning data: (**a**) distribution of scanning stations, registration targets, and georeferencing targets; (**b**) registered and georeferenced point cloud of the calibration site.

**Figure 9.** Datasets observed from image and laser sensors in the MMS: (**a**) CAM(F) image; (**b**) CAM(L) image; (**c**) CAM(R) image; (**d**) mobile laser scanning data (the origin of the coordinate system is the center of the sensor).

**Figure 10.** Distribution of reference features for boresight and lever-arm calibration: (**a**) point features; (**b**) plane features.

**Figure 11.** Result of image rectification with calibrated IOPs: (**a**) original image; (**b**) rectified image.

**Figure 12.** Examples of good and bad geometries of control planes: (**a**) successful case with eight planes; (**b**) failed case with eight planes; (**c**) successful case with four planes; (**d**) failed case with four planes.

**Figure 13.** Boresight and lever-arm calibration results (for CAM(F), CAM(L), and CAM(R), the direction vector in the sensor frame is [0 0 1]; for the mobile laser scanner, it is [1 0 0]): (**a**) 2D view; (**b**) 3D view.

**Figure 15.** Projection of mobile laser scanning data into network video camera images: (**a**) CAM(F); (**b**) CAM(R); (**c**) CAM(L).

**Figure 16.** Operation of the developed MMS under driving conditions: (**a**) georeferenced point cloud (red: observed by the MMS; black: point cloud observed and georeferenced by terrestrial laser scanning data and GNSS); and (**b**) point cloud projected onto an image.

**Figure 17.** Point cloud generated by the developed MMS: (**a**) point cloud projected on an aerial orthophoto; (**b**,**c**) point cloud including color information.

**Figure 18.** Example of 3D road sign mapping (white boxes: faces of people and car registration numbers screened for privacy; red box: road sign; green dots: points indicating road signs; blue dots: points projected onto the image).

| Model | AXIS F1005-E |
|---|---|
| Effective sensor size | 1/2.8″ |
| Focal length | 2.8 mm |
| Field of view (horizontal) | 113° |
| Field of view (vertical) | 62° |
| Image size | 1920 × 1200 pixels |
| Frame rate | 60 fps |

| Model | HDL-32E |
|---|---|
| Number of channels | 32 |
| Measurement range | Up to 100 m |
| Rotation rate | 5–20 Hz |
| Accuracy | ±2 cm |
| Field of view (horizontal) | 360° |
| Field of view (vertical) | −30.67° to +10.67° |
| Angular resolution (horizontal) | 0.1–0.4° |
| Angular resolution (vertical) | 1.33° |
| Laser | Class 1, 903 nm wavelength |

| Model | OxTS Survey+ |
|---|---|
| Position accuracy | Up to 0.01 m |
| Velocity accuracy | 0.1 km/h |
| Roll/pitch accuracy | 0.03° |
| Heading accuracy | 0.1° |
| Output rate | 100 Hz |
| Size | 234 × 120 × 88 mm |

| Model | Focus 3D |
|---|---|
| Type | Amplitude-Modulated Continuous Wave (AMCW) |
| Measurement range | Up to 120 m |
| Scan rate | 976,000 points/s |
| Ranging error | ±2 mm |
| Ranging noise | 0.6 mm * |
| Field of view (horizontal) | 360° |
| Field of view (vertical) | 305° |
| Size | 240 × 200 × 100 mm |

| Parameter (Unit) | | Initial Value | CAM(F) Calibrated ± Precision | CAM(L) Calibrated ± Precision | CAM(R) Calibrated ± Precision |
|---|---|---|---|---|---|
| Focal length (mm) | | 2.5 | 2.47 ± 0.0054 | 2.44 ± 0.0069 | 2.52 ± 0.0085 |
| Principal point (pixel) | x_{p} | 0 | −27.41 ± 0.70 | 0.47 ± 0.87 | −49.65 ± 0.95 |
| | y_{p} | 0 | 33.05 ± 0.96 | 59.03 ± 0.87 | 18.14 ± 0.87 |
| Radial distortion (unitless) | A_{1} (×10^{−1}) | 0 | −3.33 ± 0.017 | −3.39 ± 0.021 | −3.57 ± 0.03 |
| | A_{2} (×10^{−1}) | 0 | 1.29 ± 0.014 | 1.43 ± 0.021 | 1.65 ± 0.03 |
| | A_{3} (×10^{−2}) | 0 | −2.47 ± 0.045 | −3.13 ± 0.078 | −4.15 ± 0.15 |
| Decentering distortion (unitless) | B_{1} (×10^{−4}) | 0 | −5.16 ± 0.73 | −8.59 ± 1.00 | −3.51 ± 1.21 |
| | B_{2} (×10^{−4}) | 0 | 1.98 ± 1.50 | 3.29 ± 1.51 | −5.71 ± 2.35 |
| Projection residuals (pixel) | x-direction | - | 0.47 | 0.36 | 0.50 |
| | y-direction | - | 0.36 | 0.32 | 0.42 |

| Sensor | Parameter (unit) | Calibrated Value ± Precision |
|---|---|---|
| CAM(F) | ${x}_{T}$ (mm) | 776.82 ± 6.25 |
| | ${y}_{T}$ (mm) | 1776.02 ± 5.72 |
| | ${z}_{T}$ (mm) | 383.23 ± 4.12 |
| | $\omega $ (degree) | −81.8879 ± 0.0712 |
| | $\phi $ (degree) | −0.0425 ± 0.0621 |
| | $\kappa $ (degree) | 179.2535 ± 0.0815 |
| CAM(L) | ${x}_{T}$ (mm) | 478.36 ± 6.13 |
| | ${y}_{T}$ (mm) | 1650.72 ± 5.61 |
| | ${z}_{T}$ (mm) | 340.34 ± 4.42 |
| | $\omega $ (degree) | −87.2623 ± 0.0725 |
| | $\phi $ (degree) | −0.7647 ± 0.0303 |
| | $\kappa $ (degree) | −179.8742 ± 0.0853 |
| CAM(R) | ${x}_{T}$ (mm) | 1060.64 ± 7.50 |
| | ${y}_{T}$ (mm) | 1656.25 ± 8.32 |
| | ${z}_{T}$ (mm) | 424.36 ± 5.75 |
| | $\omega $ (degree) | −83.0442 ± 0.0641 |
| | $\phi $ (degree) | −0.2962 ± 0.0446 |
| | $\kappa $ (degree) | 175.9012 ± 0.0671 |
| Mobile laser scanner | ${x}_{T}$ (mm) | 793.87 ± 1.26 |
| | ${y}_{T}$ (mm) | 1120.07 ± 1.34 |
| | ${z}_{T}$ (mm) | 892.54 ± 7.38 |
| | $\omega $ (degree) | −0.2845 ± 0.0642 |
| | $\phi $ (degree) | 5.2074 ± 0.0534 |
| | $\kappa $ (degree) | 88.2112 ± 0.0113 |

| Sensor (unit) | Projection Residual Mean | Projection Residual RMSE * | External Check Mean | External Check RMSE |
|---|---|---|---|---|
| CAM(F) (pixel) | 0.00 | 0.44 | 0.23 | 0.65 |
| CAM(L) (pixel) | 0.00 | 0.65 | 0.15 | 0.85 |
| CAM(R) (pixel) | 0.00 | 0.62 | 0.35 | 0.77 |
| Mobile laser scanner (mm) | 11.58 | 12.99 | 10.97 | 11.58 |

| Number of Planes | Precise Cases/Combinations * (Percentage) | Precision ${\mathit{x}}_{\mathit{T}}$ (mm) ** | Precision ${\mathit{y}}_{\mathit{T}}$ (mm) ** | Precision ${\mathit{z}}_{\mathit{T}}$ (mm) ** | External Check Mean (mm) ** | External Check RMSE (mm) ** |
|---|---|---|---|---|---|---|
| 3 | 0/120 (0%) | - | - | - | - | - |
| 4 | 22/210 (10%) | 5.53 | 3.92 | 11.08 | 15.71 | 15.86 |
| 5 | 76/252 (30%) | 1.15 | 1.79 | 9.13 | 11.41 | 12.16 |
| 6 | 97/210 (46%) | 3.15 | 1.93 | 9.48 | 9.33 | 11.65 |
| 7 | 66/120 (55%) | 0.99 | 1.18 | 7.03 | 8.22 | 9.01 |
| 8 | 31/45 (69%) | 1.31 | 1.48 | 7.53 | 10.24 | 11.28 |
| 9 | 10/10 (100%) | 1.49 | 1.64 | 7.84 | 9.21 | 9.78 |
| 10 | 1/1 (100%) | 1.26 | 1.34 | 7.38 | 10.97 | 11.58 |

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license ( http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hong, S.; Park, I.; Lee, J.; Lim, K.; Choi, Y.; Sohn, H.-G.
Utilization of a Terrestrial Laser Scanner for the Calibration of Mobile Mapping Systems. *Sensors* **2017**, *17*, 474.
https://doi.org/10.3390/s17030474
