Figure 1.
Sensor system description. Our sensor suite includes two 2D LiDARs, four cameras, a GPS, two wheel encoders, an IMU, and an altimeter. Note that the front LiDAR in (b) is used only for moving-object detection and is therefore excluded from the Simultaneous Localization and Mapping (SLAM) framework.
Figure 2.
Software design diagram. The 3D point cloud map generation involves four main steps.
Figure 3.
Depiction of the pose-graph SLAM factors. Odometry constraints (odo) are sequential, whereas wall loop constraints (wall) can be non-sequential. Absolute constraints (abs) include digital map corrections and altimeter measurements. GPS factors may occur irregularly. When a wall factor is created, we also add a digital map correction factor.
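The interplay of these factor types can be illustrated with a toy 1-D linear pose graph: sequential odometry factors chain the nodes, while an absolute factor (a digital-map or altimeter style measurement on a single node) anchors the chain globally. This is a hedged sketch in plain least squares, not the paper's iSAM implementation; all names and weights are illustrative.

```python
import numpy as np

def solve_1d_pose_graph(n, odo, absolute, prior=(0, 0.0)):
    """Tiny 1-D linear pose graph illustrating the factor types in
    Figure 3: sequential odometry factors plus absolute factors.

    odo: list of (i, j, delta, weight) with residual x_j - x_i - delta.
    absolute: list of (i, value, weight) with residual x_i - value.
    prior anchors one node so the problem is well posed.
    Returns the least-squares estimate of the n node positions.
    """
    rows, rhs = [], []
    # Prior factor on the anchor node.
    r = np.zeros(n); r[prior[0]] = 1.0
    rows.append(r); rhs.append(prior[1])
    # Sequential (or non-sequential) relative factors.
    for i, j, delta, w in odo:
        r = np.zeros(n); r[j] = w; r[i] = -w
        rows.append(r); rhs.append(w * delta)
    # Absolute factors, e.g. a digital-map correction on one node.
    for i, value, w in absolute:
        r = np.zeros(n); r[i] = w
        rows.append(r); rhs.append(w * value)
    A, b = np.array(rows), np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With consistent measurements the estimate reproduces the true node positions; with conflicting ones, the weights decide how the error is distributed along the chain.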
Figure 4.
Shape file representations for buildings and contours. (a) shows an example of the polygon data type structure, and (b) an example of a contour map. (c) and (d) show sample shape-maps of the buildings and the DEM, respectively, in the campus data.
Figure 5.
(a) Illustration of point cloud accumulation from the vertically installed side-scanning LiDAR. We collect only a single scan line per vehicle pose and accumulate the lines into a local point cloud via motion. (b) An example of a wall represented by the digital map and the point cloud map (white points in the green box). For wall segmentation, we use 5-point Random Sample Consensus (RANSAC)-based plane fitting.
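RANSAC-based plane fitting as used for wall segmentation can be sketched as follows. This is a minimal illustration using the 3-point minimal hypothesis rather than the paper's 5-point variant; the function name, thresholds, and iteration count are assumptions for the example.

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.05, seed=0):
    """Fit a plane to a point cloud with RANSAC.

    points: (N, 3) array. Returns (normal, d, inlier_mask) for the
    plane n·x + d = 0 with the largest inlier set.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        # Minimal sample: three points define a candidate plane.
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        # Score by perpendicular distance of all points to the plane.
        dist = np.abs(points @ n + d)
        inliers = dist < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model[0], best_model[1], best_inliers
```

For wall segmentation, the inlier set of the winning plane is taken as the wall candidate, and its normal seeds the error-distance test applied to subsequent scan lines (Figure 6).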
Figure 6.
Block diagram for wall segmentation. The overall wall extraction algorithm can be divided into two parts: the first obtains the initial wall normal, and the second segments the wall by applying an error-distance test to each newly incoming scan line using the initial normal information.
Figure 7.
Result of digital map-based SLAM. (a) The magenta lines represent correspondences between the digital map and the building walls (green) extracted from the point cloud. (b) Loop closures in the trajectory generated by wall-to-wall matching, shown as cyan lines. (c) For correction in the Z-axis direction, we match the Digital Elevation Model (DEM) against SLAM nodes; the yellow lines represent these correspondences. (d) The z-directional trajectory can be confirmed from the change of color.
Figure 8.
Coordinate relationships among the camera, image, and world frames. Image coordinates are measured in pixels, and world coordinates in meters. Reprinted, with permission, from Jeong et al. URAI 2016; ©2016 IEEE.
Figure 9.
Side and top views of the IPM model. The illustration shows the focal length of the camera, the pitch angle, and the half angles of the vertical and horizontal field of view (FOV), respectively. Reprinted, with permission, from Jeong et al. URAI 2016; ©2016 IEEE.
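The geometry behind inverse perspective mapping (IPM) can be sketched with a pinhole model: a pixel ray is expressed in the camera frame, rotated by the pitch angle, and intersected with the flat ground plane. This is a minimal stand-in for the FOV-based formulation in the figure; the parameter names (f, cx, cy, pitch, height) and the flat-ground assumption are ours, not the paper's.

```python
import numpy as np

def ipm_pixel_to_ground(u, v, f, cx, cy, pitch, height):
    """Project image pixel (u, v) onto the flat ground plane.

    f: focal length in pixels; (cx, cy): principal point;
    pitch: downward tilt in radians; height: camera height in meters.
    Returns the ground point (X forward, Y right) in meters, or None
    when the pixel ray points at or above the horizon.
    """
    # Ray direction in the camera frame (x right, y down, z forward).
    dx, dy = (u - cx) / f, (v - cy) / f
    denom = dy * np.cos(pitch) + np.sin(pitch)
    if denom <= 0:                 # ray does not hit the ground
        return None
    t = height / denom             # scale factor to the ground plane
    X = t * (np.cos(pitch) - dy * np.sin(pitch))
    Y = t * dx
    return X, Y
```

At the principal point the formula reduces to the familiar X = height · cot(pitch), which is a quick sanity check for any IPM implementation.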
Figure 10.
Result of the self-consistent lane map process. The left (a,c) and right (b,d) images show the resulting lane maps.
Figure 11.
Compensation process for poses interpolated between the DR-computed nodes and the iSAM nodes.
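One simple way to realize such a compensation is to take the correction between each pair of optimized (iSAM) nodes and their dead-reckoning counterparts, and distribute it linearly over the intermediate DR poses. This is a hedged sketch of that idea with assumed names; the paper's actual interpolation scheme may differ.

```python
import numpy as np

def compensate_dr(dr_poses, idx_a, idx_b, opt_a, opt_b):
    """Redistribute optimized-node corrections over intermediate
    dead-reckoning (DR) poses.

    dr_poses: (N, 3) array of [x, y, yaw] DR poses; idx_a/idx_b index
    two optimized graph nodes, opt_a/opt_b their optimized [x, y, yaw].
    Intermediate poses receive a linearly interpolated share of the
    endpoint corrections.
    """
    out = dr_poses.copy()
    corr_a = np.asarray(opt_a) - dr_poses[idx_a]
    corr_b = np.asarray(opt_b) - dr_poses[idx_b]
    # Wrap yaw corrections into (-pi, pi] to avoid 2*pi jumps.
    for c in (corr_a, corr_b):
        c[2] = (c[2] + np.pi) % (2 * np.pi) - np.pi
    for i in range(idx_a, idx_b + 1):
        w = (i - idx_a) / (idx_b - idx_a)
        out[i] += (1 - w) * corr_a + w * corr_b
    return out
```

A pose halfway between two optimized nodes thus receives half of each endpoint's correction, keeping the dense trajectory consistent with the sparse optimized graph.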
Figure 12.
3D mapping result and data logging path. (a) 3D map represented as a pseudo-colored point cloud over the digital map background; nodes are color-coded by height, varying from red (low) to purple (high). (b) Illustration of the data logging path used in the experiments; progression along the trajectory is represented by a color shift from red (start) to purple (end). There are three circular paths covering the eastern, western, and northern parts of the campus.
Figure 13.
Proposed digital map-based SLAM results. (a) Uncertainty versus path length. Whereas the uncertainty of DR navigation (blue, dotted) grows without bound, the uncertainty of SLAM (red, solid) is bounded by wall-to-wall loop closures. The top graph shows the pose uncertainty on a log-scale y axis (unit: m·rad; following [53,54,55], we use m·rad to express pose uncertainty), and the bottom graph shows the position uncertainty (unit: m). (b) Top-down view of the SLAM estimate (red line) versus the dead-reckoning trajectory (gray line). (c) The xy component of the SLAM trajectory estimate plotted against time, where the vertical axis represents mission time. Green lines show wall-to-wall matching for loop closure.
Figure 14.
Sensor availability graph. The nodes are colored by the sensor type associated with each node. Blue dots are GPS nodes, drawn smaller for clear visualization of the digital map nodes (green and red). Cyan nodes are loop-closure nodes obtained via LiDAR comparison.
Figure 15.
3D mapping result for sample buildings, showing the point cloud accumulated and refined using the SLAM trajectory. (a,c) 3D maps of two sample buildings; points are colored by height, and green squares indicate the classified building walls. (b,d) Aerial views of each sample building.
Figure 16.
Generated lane map. (a,b) A narrow straight section without and with a parked car, respectively. (c) An intersection. (d) A section including a speed bump and a crosswalk.
Figure 17.
Accuracy analysis on sample points using VRS-GPS. (a) The four corners of the building rooftop are measured, with compensation for the building walls. (b) Sample points on road marks are measured.
Figure 18.
(a) Red square drawn from the four sampled ground-truth corners. (b) The error is measured as the perpendicular distance (purple lines) from the 3D point cloud to the ground truth (red lines). (c) Top view for clearer illustration.
Figure 19.
Six sample points with accurate global positions are measured by RTK-GPS and plotted on (a) the street view, (b) the lane map, (c) the aerial map, and (d) the digital map. The aerial map (c) shows substantial position error with respect to the ground truth. The digital map (d) shows small position error but lacks detail such as road marks and crosswalks. The proposed lane map (b) achieves position accuracy with respect to the global ground truth while retaining detailed information.
Table 1.
Specifications of Urban Mapping System (UMS).
| Item | Specification |
|---|---|
| Dimensions | 1.67 m × 1.36 m × 0.31 m (L × W × H) |
| Dry weight | 35.8 kg |
| LiDAR | SICK LMS291, 200 (35 Hz) |
| Imaging sensor | Point Grey Flea3, 1380 × 1024 pixel, 12-bit CCD (30 Hz) |
| GPS | HUACE B20 (1 Hz) |
| IMU sensor | Xsens MTi (100 Hz) |
| Altimeter | WITHROBOT myPressure (1 Hz) |
| Wheel encoder | Autonics E68S, rotary encoder type (100 Hz) |
| Processor | Intel(R) Core(TM) i7-3790 CPU |
| Battery | Delkor 80 Ah, 12 V, lead–acid type |
Table 2.
Description of polygon record contents. The fields of the polygon type are Box, NumParts, NumPoints, Parts, and Points. Box is the bounding box of the polygon, stored in the order Xmin, Ymin, Xmax, Ymax. NumParts is the number of closed curves (rings) in the polygon, and NumPoints is the total number of points over all rings. Parts is an array of length NumParts storing, for each ring, the index of its first point in the Points array. Points is an array of length NumPoints; the points of each ring are stored end to end.
| Position | Field | Value | Type | Number | Byte Order |
|---|---|---|---|---|---|
| Byte 0 | Shape Type | 5 | Integer | 1 | Little |
| Byte 4 | Box | Box | Double | 4 | Little |
| Byte 36 | NumParts | NumParts | Integer | 1 | Little |
| Byte 40 | NumPoints | NumPoints | Integer | 1 | Little |
| Byte 44 | Parts | Parts | Integer | NumParts | Little |
| Byte X † | Points | Points | Point | NumPoints | Little |

† X = 44 + 4 × NumParts
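The byte layout above can be read directly with Python's `struct` module. The following is a minimal parsing sketch of a single polygon record body under the little-endian layout in Table 2 (function and key names are ours; no null-shape or multi-record handling).

```python
import struct

def parse_polygon_record(buf):
    """Parse one shapefile polygon record body (layout of Table 2).

    All fields are little-endian. Returns a dict with the shape type,
    bounding box, ring start indices (Parts), and (x, y) point list.
    """
    shape_type, = struct.unpack_from('<i', buf, 0)
    assert shape_type == 5, "not a polygon record"
    box = struct.unpack_from('<4d', buf, 4)        # Xmin, Ymin, Xmax, Ymax
    num_parts, = struct.unpack_from('<i', buf, 36)
    num_points, = struct.unpack_from('<i', buf, 40)
    parts = struct.unpack_from(f'<{num_parts}i', buf, 44)
    pts_off = 44 + 4 * num_parts                   # "Byte X" in Table 2
    flat = struct.unpack_from(f'<{2 * num_points}d', buf, pts_off)
    points = list(zip(flat[0::2], flat[1::2]))
    return {'shape_type': shape_type, 'box': box,
            'parts': list(parts), 'points': points}
```

Each ring's points run end to end in the Points array, so ring k spans indices `parts[k]` up to (but not including) `parts[k+1]`, with the last ring running to NumPoints.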
Table 3.
Discrepancy prior to the compensation (per unit meter).
| | Min | Max | Average |
|---|---|---|---|
| x (m) | | | |
| y (m) | | | |
| yaw | | | |
Table 4.
Summary of digital map-based SLAM results.
| Path Length | Logging Time | Computation Time | GPS Nodes | Wall Nodes | Digital Map Nodes | Total Nodes | No. of Points |
|---|---|---|---|---|---|---|---|
| 9322.35 m | 1952.06 s | 792.04 s (40.57%) | 482 (41.4%) | 249 (20.09%) | 136 (11.67%) | 1165 | 23,017,120 |
Table 5.
Positional error measurement between the 3D building point cloud and the RTK-GPS-measured building corners. The map error is computed as the average RMSE between the ground-truth wall, generated from RTK-measured sample points, and the LiDAR 3D points on the mapped wall. Note that this error is over 9.32 km of travel distance.
| Sample | No. of 3D Map Points | Average RMSE, GPS Only [m] | Average RMSE, Digital Map-Based [m] |
|---|---|---|---|
| Set 1 | 3352 | 0.437 | 0.190 |
| Set 2 | 1967 | 1.010 | 0.193 |
| Set 3 | 227 | 2.070 | 0.347 |
| Set 4 | 3314 | 1.407 | 0.136 |
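The per-wall RMSE in this evaluation boils down to perpendicular point-to-plane distances. A minimal sketch, assuming the ground-truth wall plane is defined by three RTK-measured corners (function and argument names are illustrative):

```python
import numpy as np

def wall_rmse(wall_points, map_points):
    """RMSE of mapped LiDAR points against a ground-truth wall plane.

    wall_points: (>=3, 3) RTK-measured corners defining the wall plane;
    map_points: (N, 3) LiDAR points classified as that wall. The error
    per point is its perpendicular distance to the plane.
    """
    p0 = wall_points[0]
    n = np.cross(wall_points[1] - p0, wall_points[2] - p0)
    n = n / np.linalg.norm(n)             # unit plane normal
    dist = (map_points - p0) @ n          # signed perpendicular distance
    return float(np.sqrt(np.mean(dist ** 2)))
```

Averaging this RMSE over the sampled walls yields a map-level error figure comparable to the per-set values in the table above.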
Table 6.
Error analysis of the proposed method on road markings. Each dataset has two to six sample points. Set 2 has no road marking in the aerial map and is excluded from that comparison. The proposed method's error is also given as a ratio to the aerial-image error for clear comparison.
| Sample | No. of Lane Points | Average RMSE, Aerial Image [m] | Average RMSE, Proposed Method [m] |
|---|---|---|---|
| Set 1 | 2 | 7.180 | 1.000 (13.927%) |
| Set 2 | 2 | - | 1.055 (-) |
| Set 3 | 2 | 8.948 | 1.699 (18.989%) |
| Set 4 | 6 | 8.044 | 0.622 (7.727%) |
| Set 5 | 2 | 11.261 | 0.610 (5.418%) |