# Low-Cost Calibration of Matching Error between Lidar and Motor for a Rotating 2D Lidar


## Abstract


## 1. Introduction

## 2. Problems

- (1) As mentioned in [1,4,5,6,7,8,9,10], the mechanical error between the 2D lidar and the rotating unit can cause distortion of the 3D point cloud. Based on the idea of control variables, in order to focus on other reasons, we need to exclude this item first. By improving the manufacturing and assembly accuracy of the mechanical parts of our prototype, we minimized the impact of mechanical errors on the accuracy of the 3D point cloud. We used computer numerical control (CNC) machine tools to manufacture the key parts of our prototype, with a tolerance level of IT5 (Chinese standard, GB/T1184-1996). In Appendix B, we verified that the prototype's own error sources (which include the mechanical error) have a much smaller impact on the accuracy of the 3D point clouds than the error of the 2D lidar itself (±40 mm). Therefore, we can exclude this item and assume that other reasons cause the observed error.
- (2) The acceleration and deceleration of the motor shaft cause error. In a 3D scan of a rotating 2D lidar, we roughly assume that the shaft of the stepper motor rotates at angular velocity ω for time T, and that it rotates 180° in total. However, in practice, the motion of the motor shaft is more complex. The motor shaft accelerates from standstill to angular velocity ω, then keeps rotating at this speed, after which it decelerates to standstill. We have ignored the acceleration and deceleration of the motor shaft, and this can lead to a shape error of the 3D point cloud, which is especially obvious at the closed position of the point cloud. The so-called closed position here refers to the area containing the points collected by the 2D lidar when the motor shaft starts or stops rotating. The areas of these two types of points are adjacent, and these points are most directly affected by the acceleration and deceleration of the motor shaft.
- (3) The use of photoelectric switches may cause error. Since no encoder or servo motor is used, it is necessary to define the initial position of the motor shaft with a photoelectric switch. When finding the initial position, the motor shaft rotates back and forth by a large margin, causing the shading sheet, which rotates with the motor shaft, to trigger the photoelectric switch. After the photoelectric switch is triggered, the controller sends a stop command to the motor to stop its shaft. The position of the motor shaft after it stops is considered to be the initial position. This process can be described in more detail: the shading sheet blocks the light beam of the photoelectric switch; the photoelectric switch is triggered; the controller sends the stop command to the stepper motor; the shaft of the stepper motor decelerates until it stops. The accuracy of the initial position of the motor shaft is affected by many factors, such as the response time of the photoelectric switch, the rotation direction, and the speed of the motor shaft. For each 3D scan of a rotating 2D lidar, the motor shaft starts rotating from its initial position. Therefore, if the initial position of the motor shaft is not accurately defined, an attitude error of the 3D point cloud results.
- (4) The uncertain time deviation may cause error. In actual engineering operation, we found that although we require the 2D lidar and the stepper motor to start working at the same time, there is an uncertain time deviation between the starting time of the 2D lidar and the starting time of the stepper motor. The value of this time deviation is very small, but it is enough to have a significant impact on the accuracy of the 3D point cloud. The reason for the uncertain time deviation is that the response times of the 2D lidar and the stepper motor to a command are inconsistent and not constant, as mentioned in [16]. Because both the transmission of the command and the response of the motor or the 2D lidar take time, there is an uncertain time deviation between the time when the controller starts to send the command to the motor and the time when the motor starts to work. Similarly, there is an uncertain time deviation between the time when the controller starts to send the command to the 2D lidar and the time when the 2D lidar starts to work. The uncertainty of the transmission time of the command and the response time of the device is a common problem which is difficult to solve. Because of this problem, we can only mark the time when the controller starts to send the command to the motor or the 2D lidar in the program; we cannot know the actual working time of the motor or the 2D lidar. This is the cause of the uncertain time deviation between the starting time of the 2D lidar and the starting time of the stepper motor. For each 3D scan of a rotating 2D lidar, this time deviation is uncertain, so the shape and attitude error of the 3D point cloud for each scan is also uncertain (see Figure 1). The reason for the shape error of the 3D point cloud can be analyzed as follows. Due to the uncertain time deviation, there may be two situations. One is that the motor starts to rotate after the 2D lidar has already been scanning continuously for a short period of time. The 3D point cloud collected in this case is shown in Figure 1b. At the position where the motor starts to rotate (the beginning of the yellow circular arrow), there is a shape error marked by a red line. The other situation is that the motor starts working earlier than the 2D lidar, so that the motor has stopped rotating before the 2D lidar stops scanning. The 3D point cloud collected in this case is shown in Figure 1c. At the position where the motor stops rotating (the end of the yellow circular arrow), there is a shape error marked by a red line. In either case, a shape error of the 3D point cloud results. In addition, an attitude error can also be caused. Therefore, the uncertain time deviation between the starting time of the 2D lidar and the starting time of the stepper motor is one of the reasons for the uncertain error of the 3D point cloud in shape and attitude.

## 3. System Modeling

#### 3.1. Overview of Prototype

#### 3.2. Coordinate Conversion

The definition of the coordinate frame L is as follows. The origin point L of this coordinate frame is the center of the 2D lidar scanning sector. The plane Y_{L}LZ_{L} is coplanar with the scanning sector. The axis Z_{L} is the middle line of the scanning sector, and the axis X_{L} is perpendicular to the scanning sector. The position of the coordinate frame L relative to the 2D lidar is fixed, but its position relative to the rotating unit is not fixed, because the 2D lidar rotates relative to the rotating unit.

The definition of the coordinate frame O’ is as follows. Its axis Z’ coincides with the rotation axis of the motor shaft, and its origin O’ is at a distance of 13.9 mm from the origin L of coordinate frame L. The axis Y’ coincides with the starting and ending positions of a 3D scan, as shown in Figure 5. Since the starting and ending positions of a 3D scan are uncertain (as described in Section 2), the position of the coordinate frame O’ relative to the rotating unit is uncertain, too.

To convert the data of a scan into 3D points, we need the ranging data r, the azimuth angle θ, and the rotation angle φ_{motor} of the motor shaft. Among them, r can be obtained directly, while θ and φ_{motor} can be obtained by linear interpolation.

First, we calculate the 3D point **p**_{L} relative to the coordinate frame L according to the ranging data r and the azimuth angle θ.

Then, we convert the point **p**_{L} in the coordinate frame L to the corresponding point **p**_{O’} in the coordinate frame O’.

The axis X_{L} coincides with the axis X’, the axis Y_{L} and the axis Y’ are parallel, the axis Z_{L} and the axis Z’ are parallel, and the distance between the point L and the point O’ is 13.9 mm. Therefore, we can know that ${\mathit{R}}_{L}^{{O}^{\prime}}$ = **I**, ${\mathit{t}}_{L}^{{O}^{\prime}}$ = (13.9 mm, 0, 0)^{T}.

**R**_{M} is the rotation matrix calculated according to the rotation angle φ_{motor} of the motor shaft, which shows the attitude change of the coordinate frame L relative to its initial position after being rotated by the motor shaft. Since the rotation is about the axis Z’ and there is no rotation component in other directions, **R**_{M} can be calculated as:

The angle φ_{motor} is calculated linearly according to the serial number of the corresponding point among all points. The total number of points in the point cloud obtained by a 3D scan of a rotating 2D lidar is denoted as K; the rotation angle of the motor shaft at the k-th point is:
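As a minimal sketch, the linear interpolation of φ_{motor} can be written as follows; the assumption that the shaft sweeps exactly 0° to 180° uniformly over the K points (with divisor K − 1 so that the last point maps to 180°) follows the rough model above:

```python
def motor_angle(k, K):
    """Rotation angle (in degrees) of the motor shaft at the k-th point,
    k = 0 .. K-1, assuming a uniform sweep of 180 degrees over the scan.
    Whether the divisor should be K or K - 1 is an assumption here."""
    return 180.0 * k / (K - 1)
```

For example, with K = 1081 points, the middle point k = 540 maps to 90°.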

Finally, we convert the point **p**_{O’} in the coordinate frame O’ to the corresponding point **p**_{O} in the coordinate frame O.

The rotation matrix ${\mathit{R}}_{{O}^{\prime}}^{O}$ is calculated according to the angle φ_{offset}, while φ_{offset} shows the deflection angle of the coordinate frame O’ relative to the coordinate frame O in the Z-axis direction. Since the deflection is about the axis Z’ and there is no rotation component in other directions, ${\mathit{R}}_{{O}^{\prime}}^{O}$ can be calculated as:
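The whole conversion chain of Equations (8) and (9) can be sketched in a few lines of Python; the layout of **p**_{L} in frame L, the composition order R_M(p_L + t), and the sign conventions are assumptions made for illustration:

```python
import numpy as np

def rot_z(phi_deg):
    """Rotation matrix about the Z axis (angle in degrees)."""
    phi = np.radians(phi_deg)
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def lidar_point(r, theta_deg):
    """Point p_L in frame L. Assumed layout: the scanning sector lies in the
    Y_L-L-Z_L plane, Z_L is the middle line of the sector (theta = 0), and
    X_L is perpendicular to the sector."""
    theta = np.radians(theta_deg)
    return np.array([0.0, r * np.sin(theta), r * np.cos(theta)])

# t_L^{O'} in millimetres; R_L^{O'} = I as derived in the text.
T_L_O = np.array([13.9, 0.0, 0.0])

def to_frame_O(r, theta_deg, phi_motor_deg, phi_offset_deg):
    """Chain of Equations (8) and (9); the composition order R_M (p_L + t)
    is an assumption made for illustration."""
    p_L = lidar_point(r, theta_deg)
    p_O1 = rot_z(phi_motor_deg) @ (p_L + T_L_O)   # frame L -> frame O'
    return rot_z(phi_offset_deg) @ p_O1           # frame O' -> frame O
```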

The key is the value of the angle φ_{offset}. According to the above description, for each 3D scan of a rotating 2D lidar, the value of the angle φ_{offset} is not constant. If we can accurately calculate the value of φ_{offset} for each 3D scan, then we can eliminate the uncertain attitude error of the 3D point cloud according to Equations (8) and (9). As a result, we can get the 3D point cloud relative to the coordinate frame O accurately, both in shape and attitude. The method for calculating φ_{offset} for each 3D scan will be described in more detail in the next section.

## 4. Method

#### 4.1. Calibration of Uncertain Attitude Error of 3D Point Cloud

After a 3D scan, we obtain the point cloud **C**_{O’}. Next, we correct the attitude error of the point cloud **C**_{O’} according to the value of the angle φ_{offset}. The angle φ_{offset} is calculated as follows.

First, the point cloud corresponding to the triangular plate (denoted as **C**_{tri}) should be extracted from the point cloud **C**_{O’}. We define a special area (the blue area in Figure 6); the point cloud in this area is the point cloud **C**_{tri}, which corresponds to the triangular plate, because only the triangular plate is in this area and no other objects exist.
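A sketch of this extraction step, assuming the special area of Figure 6 can be approximated by an axis-aligned box in frame O’ (the box bounds are hypothetical):

```python
import numpy as np

def extract_plate_points(C, box_min, box_max):
    """Keep the points of C (an (N, 3) array) that lie inside an axis-aligned
    box -- a stand-in for the 'special area' that contains only the plate."""
    mask = np.all((C >= box_min) & (C <= box_max), axis=1)
    return C[mask]
```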

Then, the 3D coordinates of all the points in the point cloud **C**_{tri} are averaged to calculate the center point **c** of the triangular plate. This is done to find the middle line of the triangular plate, which is the perpendicular line from the point **c** to the rotation axis of the motor shaft. The angle α between the middle line and the front direction of the prototype (which is also the direction of the X-axis of coordinate frame O) is known. It is a constant value, which depends on the installation position of the triangular plate on the prototype. The angle β between the middle line and the X’-axis of the coordinate frame O’ can be calculated according to the 3D coordinates of the point **c**. According to the description above, the value of the angle β is uncertain. The angle between the X-axis of the coordinate frame O and the X’-axis of the coordinate frame O’ is φ_{offset}. It can be seen from Figure 6 that the calculation formula for the angle φ_{offset} is:
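In code, the calculation of β from the center point **c** and the resulting φ_{offset} might look as follows; the form φ_{offset} = α − β and the measurement of β in the X’Y’ plane via `arctan2` are assumptions (Figure 6 fixes the exact signs):

```python
import numpy as np

def phi_offset_from_plate(C_tri, alpha_deg):
    """C_tri: (N, 3) array of plate points in frame O'. The middle line of
    the plate passes through the centre point c; beta is taken as its angle
    from the X' axis, measured in the X'Y' plane (a sketch only)."""
    c = C_tri.mean(axis=0)                        # centre point c
    beta_deg = np.degrees(np.arctan2(c[1], c[0]))
    return alpha_deg - beta_deg                   # assumed form of Eq. (10)
```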

Through Equation (10), we obtain the deflection angle φ_{offset} of the coordinate frame O’ relative to the coordinate frame O in the direction of the Z-axis. Then, we can eliminate the uncertain attitude error of the 3D point cloud according to Equations (8) and (9). A point cloud relative to the coordinate frame O can be obtained; this point cloud is called **C**_{O}. It is accurate both in shape and attitude.

The principle of this method can be summarized as follows. We extract the point cloud **C**_{tri} corresponding to the triangular plate from the point cloud **C**_{O’} and calculate its attitude relative to the point cloud **C**_{O’}. Since the attitude of the triangular plate relative to the point cloud **C**_{O’} is known, and the attitude of the triangular plate relative to the coordinate frame O-XYZ is also known, the attitude of the point cloud **C**_{O’} relative to the coordinate frame O-XYZ can be calculated; that is, the attitude error of the point cloud **C**_{O’} can be calculated. Here, the triangular plate plays a key role.

For each 3D scan, the triangular plate is scanned and appears in the point cloud **C**_{O’} corresponding to this scan. Although the attitude error of the point cloud **C**_{O’} is not constant for each 3D scan, we can calculate it in each 3D scan. In this way, we can calibrate it.

Several details should be noted when calculating the angle φ_{offset} with the above method.

- (a) In our method, we define the middle line of the triangular plate according to the center point **c**. We also tried to define the middle line by finding the point in the point cloud **C**_{tri} which is farthest from the rotation axis of the motor shaft (that is, the right-angled vertex of the triangular plate), but it turned out to be less accurate. The reason is obvious: the center point **c** is calculated based on all the points in the point cloud **C**_{tri}, while the farthest point is just one point selected from the point cloud **C**_{tri}. Therefore, the former is more accurate. Moreover, the accuracy of the 2D lidar Hokuyo UST-10LX we use in our prototype is ±40 mm. At this level of accuracy, it is necessary to use the average of multiple points instead of a single point to define the middle line of the triangular plate.
- (b) The larger the triangular plate is, the more points are contained in the point cloud **C**_{tri}, and the less the calculation of the center point **c** is affected by accidental error. However, if the triangular plate is too large, it becomes inconvenient. The triangular plate installed on our prototype is an isosceles right triangle. The lengths of its three edges are 79.2 mm, 79.2 mm and 112 mm, respectively.
- (c) The distance between the triangular plate and the motor shaft should be moderate. If the distance is too small, part of the triangular plate falls in the blind area of the scanning field and cannot be scanned. If the distance is too large, the beam of the 2D lidar irradiates the triangular plate at a more inclined angle, and the number of points contained in the point cloud **C**_{tri} is reduced. In our prototype, the distance between the base edge of the isosceles right triangle and the rotation axis of the motor shaft is 54.75 mm.
- (d) Since the center point **c** is calculated by averaging the 3D coordinates of all the points in the point cloud **C**_{tri}, the points in the point cloud **C**_{tri} should be evenly distributed on the surface of the triangular plate. Therefore, the strip blank area and strip overlap area shown in Figure 5 should be staggered with the point cloud **C**_{tri}. This should be considered when determining the installation position of the triangular plate on the prototype. The installation position determines the value of the angle α. In our prototype, α = 30°.

#### 4.2. Calibration of Installation Error of Triangular Plate

Due to the installation error of the triangular plate, the point clouds **C**_{O} still have an attitude error. Even seemingly trivial installation errors can result in an obvious attitude error of the point clouds **C**_{O}. Unlike the uncertain attitude error described above, the attitude error here is constant. In order to calibrate this error, we need to find the optimal estimated value α^{*} of the angle α. This is an optimization problem. Since this error originates from the installation error of the triangular plate, only one calibration is required, unless the installation position of the triangular plate has been changed, that is, it has been removed and reinstalled. In that case, the error needs to be calibrated again.

First, we need the point cloud **C**_{O}. Through the work described above, the point cloud **C**_{O} is already known. We collect the 3D point cloud of the conference room once again according to the method described in Figure 1, and perform the following processing on it. We extract the planes corresponding to the 4 walls of the room and calculate their unit normal vectors **n**_{1}, **n**_{2}, **n**_{3}, **n**_{4} (this can be done with the Point Cloud Library [31]), as shown in Figure 7.

The installation error of the triangular plate leads to an attitude error of the point cloud **C**_{O}. This error is denoted as E_{atti}, and we quantify E_{atti} by Formula (11).
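The body of Formula (11) is consistent with the perpendicularity conditions stated in this section (**n**_{1}, **n**_{3} perpendicular to **i**, and **n**_{2}, **n**_{4} perpendicular to **j**, for a correctly oriented cloud); a plausible reconstruction, with the pairing of terms taken as an assumption, is:

```latex
E_{\mathrm{atti}} = \left|\mathbf{n}_{1}\cdot\mathbf{i}\right| + \left|\mathbf{n}_{3}\cdot\mathbf{i}\right| + \left|\mathbf{n}_{2}\cdot\mathbf{j}\right| + \left|\mathbf{n}_{4}\cdot\mathbf{j}\right|
```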

We quantify the attitude error E_{atti} of the point cloud **C**_{O} through the sum of the absolute values of the inner products of unit vectors. The vectors **n**_{1}, **n**_{2}, **n**_{3}, **n**_{4} are the unit normal vectors extracted from the point cloud **C**_{O}, and they correspond to the 4 walls shown in Figure 7. The vectors **i** and **j** are the unit vectors of the X-axis and Y-axis of the coordinate frame O, respectively. The vectors **i** and **j** are fixed because the coordinate frame O is fixed relative to the stationary part of the prototype, with the positive direction of the X-axis parallel to the front direction of the prototype (see Section 3.2). In Figure 7, we placed the prototype in the conference room in a fixed and not-skewed way. Therefore, the coordinate frame O is fixed and not skewed relative to the conference room.

If there is no attitude error of the point cloud **C**_{O} relative to the coordinate frame O, the vectors **n**_{1} and **n**_{3} are perpendicular to the vector **i**, and the vectors **n**_{2} and **n**_{4} are perpendicular to the vector **j**; as a result, E_{atti} = 0. While the angular deflection of the 3D point cloud **C**_{O} is between −90° and +90°, E_{atti} increases as the absolute value of the angular deflection increases. Obviously, in the actual situation, the angular deflection of the 3D point cloud **C**_{O} cannot exceed this range.

We denote the attitude error of the point cloud **C**_{O} as φ_{error}, that is, the angular deviation of the point cloud **C**_{O} relative to the correct attitude in the direction of the Z-axis. If we rotate the point cloud **C**_{O} by the angle φ_{error} to make it coincide with the correct attitude, then we can make E_{atti} = 0. Since the vectors **n**_{1}, **n**_{2}, **n**_{3}, **n**_{4} are extracted from the point cloud **C**_{O}, when we rotate the point cloud **C**_{O}, the vectors **n**_{1}, **n**_{2}, **n**_{3}, **n**_{4} are also rotated. We substitute the rotated vectors into Formula (11), and we can get the following formula:
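A sketch of f(φ_{error}) and its minimization, assuming E_{atti} is the sum of absolute inner products described above and using a simple grid search in place of the paper's exact solution procedure:

```python
import numpy as np

def rot_z(phi_deg):
    """Rotation matrix about the Z axis (angle in degrees)."""
    phi = np.radians(phi_deg)
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def attitude_error(normals, phi_deg=0.0):
    """E_atti after rotating the wall normals n1..n4 by phi about Z.
    For a correctly oriented cloud, n1 and n3 are perpendicular to i = X
    and n2 and n4 to j = Y, so the sum of |inner products| vanishes."""
    n1, n2, n3, n4 = (rot_z(phi_deg) @ n for n in normals)
    i, j = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
    return abs(n1 @ i) + abs(n3 @ i) + abs(n2 @ j) + abs(n4 @ j)

def find_phi_error(normals):
    """Grid search for the phi that minimises f(phi) -- a stand-in for
    solving f(phi_error) = 0 within (-90, +90) degrees."""
    grid = np.arange(-90.0, 90.0, 0.01)
    return grid[np.argmin([attitude_error(normals, p) for p in grid])]
```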

The angle φ_{error} is numerically equal to the difference between the actual value and the expected value of the angle α, that is:

Thus, a function from φ_{error} to α, that is, the function f_{cost}(α), can be constituted, as shown below:

We can find the optimal estimated value α^{*} through:

We collect 50 point clouds **C**_{O} in mode 7 as the input of the optimization algorithm, that is, $\left\{{\mathit{C}}_{O}^{1},{\mathit{C}}_{O}^{2},{\mathit{C}}_{O}^{3},\dots ,{\mathit{C}}_{O}^{50}\right\}$. The optimized value of the angle α is the output of the algorithm. The reason why mode 7 is used is that in a lower-resolution mode (such as mode 1), the points in the point cloud **C**_{tri} are sparser, so the error of the 3D coordinates of the center point **c** is bigger. The inaccuracy of the point **c** leads to the inaccuracy of the angle β. Since the value of the angle α is constant, through Equation (10) we can know that the angle φ_{offset} inherits the inaccuracy of the angle β, which ultimately leads to the inaccuracy of the attitude of the point cloud **C**_{O}, as shown in Equations (8) and (9).

Although the attitude of the point cloud **C**_{O} has been corrected, residual errors remain. One stems from the above-mentioned inaccuracy of the point **c**, which is particularly obvious in low-resolution mode. The other stems from the constant error of the angle α caused by the installation error, which depends on the accuracy of the installation and has nothing to do with the scanning mode. In low-resolution mode, the former is the main factor; the phenomenon is that the attitude of the point cloud **C**_{O} obtained from several 3D scans is slightly different. In high-resolution mode, the latter is the main factor; the phenomenon is that the attitude of the point cloud **C**_{O} obtained from several 3D scans is almost the same, but there is a constant difference between the attitude of the point cloud **C**_{O} and the correct attitude. Here we focus on the latter and calibrate the installation error of the triangular plate. Therefore, we use the highest resolution mode to collect the data used as the input of our optimization algorithm, in order to avoid the first type of error (the error caused by low resolution) as much as possible.

The initial value is α_{0} = 30°. Our optimization algorithm is shown in Algorithm A1, which is in Appendix C.
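As an illustration only, a scalar Levenberg–Marquardt loop over a generic residual function, using the symbols μ, τ, v, ρ of Appendix A, could look like the sketch below. The residual function, numerical Jacobian, and stopping constants are assumptions, not the paper's exact Algorithm A1:

```python
import numpy as np

def levenberg_marquardt_1d(residuals, alpha0, k_max=100,
                           eps1=1e-8, eps2=1e-8, tau=1e-3, v0=2.0):
    """Minimal scalar Levenberg-Marquardt loop in the spirit of Algorithm A1.
    residuals(alpha) returns a 1-D numpy array of residual values."""
    alpha, h = float(alpha0), 1e-6
    F = residuals(alpha)
    J = (residuals(alpha + h) - F) / h            # numerical Jacobian
    A, g = J @ J, J @ F                           # A = J^T J, g = J^T F
    mu, v = tau * A, v0                           # initial damping
    for _ in range(k_max):
        if abs(g) < eps1:                         # first stopping criterion
            break
        d_alpha = -g / (A + mu)                   # solve (A + mu I) d = -g
        if abs(d_alpha) < eps2 * (abs(alpha) + eps2):
            break                                 # second stopping criterion
        F_new = residuals(alpha + d_alpha)
        rho = (F @ F - F_new @ F_new) / (d_alpha * (mu * d_alpha - g))
        if rho > 0:                               # accept the step
            alpha += d_alpha
            F = F_new
            J = (residuals(alpha + h) - F) / h
            A, g = J @ J, J @ F
            mu *= max(1.0 / 3.0, 1.0 - (2.0 * rho - 1.0) ** 3)
            v = v0
        else:                                     # reject, increase damping
            mu *= v
            v *= 2.0
    return alpha
```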

After the calibration, when calculating the angle φ_{offset}, the value of the angle α is replaced from the expected value of 30° with the value estimated by Algorithm A1.

Through Equation (10), we can calculate the deflection angle φ_{offset} between the point cloud **C**_{O’} relative to the coordinate frame O’ and the point cloud **C**_{O} relative to the coordinate frame O. Since the point cloud **C**_{O’} and the angle φ_{offset} are both known, the point cloud **C**_{O} can be calculated according to Equations (8) and (9). It is accurate both in shape and attitude.

A 3D scan of the rotating 2D lidar produces the point cloud **C**_{O’}, which has an uncertain attitude error but no shape error. In Section 4.1, we calibrated the uncertain attitude error of the point cloud **C**_{O’} by using a triangular plate. In Section 4.2, we corrected the installation error of the triangular plate. Through the above steps, we can use our prototype to obtain a 3D point cloud with correct shape and attitude. We have calibrated the matching error between the 2D lidar and the motor at low cost. Unlike the existing methods, we do not need a costly sensor to measure the rotation angle of the motor shaft.

## 5. Experiments

#### 5.1. Effectiveness Test

#### 5.1.1. Data of Experiments

In Figure 8, the red 3D point cloud was obtained without attitude correction (that is, point cloud **C**_{O’}), and the blue 3D point cloud was obtained with attitude correction (that is, point cloud **C**_{O}).

We calculated the attitude error φ_{error} of each 3D point cloud in Figure 8 through Equations (12)–(14). It is worth noting that $f\left({\phi}_{\mathrm{error}}\right)=0$ in Formula (14) holds only under ideal conditions. During data processing, we took the φ_{error} that minimizes $f\left({\phi}_{\mathrm{error}}\right)$ as the result.

The values of the angle φ_{error} of the 70 3D point clouds are shown in Figure 9. We used a line chart to make it more intuitive. The horizontal axis is the number of the mode, that is, modes 1 to 7 of the prototype. The vertical axis is the value of the angle φ_{error}, in degrees. Looking down on a 3D point cloud, if the offset is clockwise relative to the correct attitude, φ_{error} is positive, while a counterclockwise offset indicates a negative value. The red line corresponds to the 3D point clouds without attitude correction, and the blue line corresponds to the 3D point clouds with attitude correction. Each line is divided into 7 segments by the 7 modes, and there are 5 sampling points in each mode, corresponding to the 5 3D point clouds collected in that mode.

For the values of the angle φ_{error} of the 5 sampling points in each mode, we calculated the maximum, average and minimum values, respectively, as shown in Figure 10.

#### 5.1.2. Characteristics of Uncertain Attitude Error

#### 5.1.3. Characteristics of Our Algorithm

The points in the point cloud **C**_{tri} obtained in the low-resolution modes are relatively sparse, so the calculated center point **c** has a large error. The error shows in two ways: first, the calculated value of the center point **c** has a large deviation from the true value; second, the calculated value of the center point **c** varies greatly between 3D scans. These two points lead to a large and uncertain error of the angle β, because the angle β is calculated according to the 3D coordinates of the point **c**. As a result, a large and uncertain error of the angle φ_{offset} is caused, as can be seen from Formula (10), where α is a constant value. According to Formulas (8) and (9), this finally results in a large and uncertain attitude error of the point cloud **C**_{O}.

#### 5.1.4. Effectiveness of Our Algorithm

It can be seen from the values of the angle φ_{error} that the attitude error is eliminated by our algorithm. This can be clearly seen from Figure 8, Figure 9 and Figure 10: the 3D point clouds without attitude correction (red) are obviously skewed, and there are obvious deviations between their φ_{error} and 0, while the attitudes of the 3D point clouds with attitude correction (blue) are correct, and their φ_{error} approaches 0.

It can also be seen from the angle φ_{error} that, in general, in the same mode, the uncertainty of φ_{error} of a 3D point cloud with attitude correction is smaller than that of a 3D point cloud without attitude correction (as shown in Figure 10). Comparing the red part and the blue part of Figure 10, our algorithm reduces the uncertainty of φ_{error}, especially in low-resolution mode. In high-resolution mode, our algorithm does not show an obvious effect in reducing the uncertainty of φ_{error}; after all, in high-resolution mode, the uncertainty of the attitude of the 3D point cloud is very small even without attitude correction. However, a mutation point similar to the 22nd point of the red line shown in Figure 9 cannot be excluded, and our algorithm can deal with this situation.

#### 5.2. Accuracy Test and Application Evaluation

The variation curve of the mapping error E_{map} with distance d in the 7 modes is shown in Figure 11. The mapping error E_{map} is calculated by the following formula:

The vertical axis is the mapping error E_{map}, in millimeters. The maximum range of the horizontal axis d is 10 m, which is the measuring range of the 2D lidar we used in our prototype. The three curves in each graph are the variation curves of E_{map} with distance d calculated according to the maximum, average and minimum values of the angle φ_{error}, respectively. The area between the top curve and the bottom curve represents the variation range of the attitude error of the 3D point cloud in this mode.
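Formula (21) itself is not reproduced here; a simple lateral-offset model, E_{map} = d · tan(φ_{error}), matches the qualitative behavior described in this section (the error grows with both distance and attitude error) and is used below purely as an assumed illustration:

```python
import numpy as np

def mapping_error_mm(d_m, phi_error_deg):
    """Lateral displacement (mm) at range d (m) caused by an attitude
    error phi_error (deg). The exact form of Formula (21) is assumed to be
    the small-angle lateral-offset model E_map = d * tan(phi_error)."""
    return d_m * 1000.0 * np.tan(np.radians(phi_error_deg))
```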

Figure 11 also shows the distance within which the mapping error E_{map} of each mode can be limited to ±50 mm. Even in mode 1, which has the lowest accuracy, the mapping error E_{map} can be limited within ±50 mm up to d = 2.2996 m. In mode 7, the mapping error E_{map} over the full measuring range d = 10 m can be limited within ±50 mm. For the 3D mapping of indoor autonomous mobile robots, a mapping error of ±50 mm is acceptable; after all, the accuracy of the 2D lidar Hokuyo UST-10LX used in our prototype is ±40 mm [30]. Moreover, after the robot has approached a target (such as a door), the rotating 2D lidar mounted on it can perform another 3D scan to refresh the 3D map. According to Formula (21), the mapping error E_{map} of this target will be reduced because the distance is closer.

## 6. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Appendix A

| Symbol | Explanation |
|---|---|
| γ | Angular resolution of the rotating 2D lidar in the direction of the motor shaft. |
| n | Number of 2D lidar scans included in a 3D scan. |
| f | Scanning frequency of the 2D lidar, which can be found in its product manual. |
| T | Time used by one 3D scan of a rotating 2D lidar. |
| ω | Angular velocity of the motor shaft in a rotating 2D lidar. |
| L | The coordinate frame of the 2D lidar. |
| O’ | The coordinate frame of the rotating unit (before attitude correction). |
| O | The coordinate frame of the rotating unit (after attitude correction). |
| r | Ranging data of the 2D lidar. |
| θ | Azimuth angle in the scanning sector of the 2D lidar corresponding to the ranging data r. |
| p_{L} | A 3D point with respect to coordinate frame L, calculated according to the data of the 2D lidar. |
| ${\mathit{R}}_{L}^{{O}^{\prime}}$ | Rotation matrix from coordinate frame L to coordinate frame O’ at the moment when a 3D scan of a rotating 2D lidar is started. |
| ${\mathit{t}}_{L}^{{O}^{\prime}}$ | Translation vector from coordinate frame L to coordinate frame O’ at the moment when a 3D scan of a rotating 2D lidar is started. |
| φ_{motor} | Rotation angle of the motor shaft corresponding to the ranging data r. |
| R_{M} | Rotation matrix calculated according to the rotation angle φ_{motor} of the motor shaft. |
| K | Total number of points in the 3D point cloud obtained by a 3D scan of a rotating 2D lidar. |
| k | Ordinal number of a point among all points of a 3D point cloud. |
| p_{O’} | A 3D point with respect to coordinate frame O’, converted from point p_{L} in coordinate frame L. |
| φ_{offset} | Deflection angle of the coordinate frame O’ relative to the coordinate frame O in the Z-axis direction. |
| ${\mathit{R}}_{{O}^{\prime}}^{O}$ | Rotation matrix calculated according to φ_{offset}. |
| p_{O} | A 3D point with respect to coordinate frame O, converted from point p_{O’} in coordinate frame O’. |
| C_{O’} | A 3D point cloud with respect to coordinate frame O’, collected by a rotating 2D lidar. |
| C_{tri} | The point cloud corresponding to the triangular plate, extracted from the point cloud C_{O’}. |
| c | Center point of the triangular plate. |
| α | Angle between the middle line of the triangular plate and the front direction of the prototype (which is also the positive direction of the X-axis of coordinate frame O). |
| β | Angle between the middle line of the triangular plate and the X’-axis of the coordinate frame O’. |
| C_{O} | A 3D point cloud with respect to coordinate frame O, converted from the point cloud C_{O’} through a rigid body transformation. |
| n_{1}~n_{4} | Unit normal vectors of the planes corresponding to the 4 walls of the room in the point cloud C_{O}, respectively. |
| i, j | Unit vectors corresponding to the X-axis and Y-axis of coordinate frame O, respectively. |
| E_{atti} | Attitude error of the point cloud C_{O}, expressed as a sum of absolute values of vector inner products. |
| φ_{error} | Attitude error of the point cloud C_{O}, expressed as an angle. |
| ${\mathit{R}}_{\mathrm{error}}$ | Rotation matrix calculated according to the angle φ_{error}. |
| $f\left({\phi}_{\mathrm{error}}\right)$ | Function showing the attitude error of the point cloud C_{O}, with φ_{error} as the independent variable. |
| f_{cost}(α) | Function showing the attitude error of the point cloud C_{O}, with α as the independent variable. |
| M | Total number of 3D scans used for the calibration of the installation error of the triangular plate. |
| m | Serial number of a 3D scan while calibrating the installation error of the triangular plate. |
| ${F}_{\mathrm{cost}}(\alpha )$ | Total attitude error of the point cloud C_{O}, expressed by the least squares method. |
| N_{1}~N_{4} | Data sets formed by n_{1}~n_{4} from 50 point clouds C_{O}, respectively. |
| k_{itera} | Parameter of Algorithm A1, the serial number of an iteration. |
| k_{max} | Parameter of Algorithm A1, the maximum number of iterations. |
| α_{0} | Parameter of Algorithm A1, the initial value of α. |
| ε_{1} | Parameter of Algorithm A1, the first stopping criterion of the algorithm. |
| ε_{2} | Parameter of Algorithm A1, the second stopping criterion of the algorithm. |
| J(α) | Parameter of Algorithm A1, the Jacobian matrix. |
| A | Parameter of Algorithm A1, A = J(α)^{T}J(α). |
| g | Parameter of Algorithm A1, g = J(α)^{T}F_{cost}(α). |
| μ | Parameter of Algorithm A1, the damping parameter. |
| τ | Parameter of Algorithm A1, a coefficient used to determine the initial value of μ. |
| v | Parameter of Algorithm A1, a coefficient used to adjust the value of μ in each iteration. |
| a_{ii} | Parameter of Algorithm A1, an element on the diagonal of A. |
| I | Parameter of Algorithm A1, the identity matrix. |
| Δα | Parameter of Algorithm A1, the gain of α in each iteration. |
| ρ | Parameter of Algorithm A1, the gain ratio. |
| d | Distance from the object to a rotating 2D lidar. |
| E_{map} | Error of the 3D map at distance d; the map is obtained by a rotating 2D lidar calibrated by our method. |

## Appendix B

**Figure A1.** Four surfaces which were used to evaluate the shape accuracy of the 3D point cloud. We marked them with different colors to distinguish them. The door and the projection screen were both avoided because they might reduce the accuracy of the plane extraction.

**Figure A2.**The evaluation of the shape accuracy of the 3D point clouds obtained by our prototype in 7 modes.

- (a)
- We can observe that the points of each surface are roughly evenly distributed on both sides of the ideal plane, which indicates that the surface fits the ideal plane well. This shows that the shapes of these surfaces are very close to planes. Since they were obtained by scanning the walls with our prototype, this further indicates that the shape of the 3D point cloud collected by our prototype is correct.
- (b)
- For each surface, the X-Z view and Y-Z view show that the points on both sides of the ideal plane are spread over a range that roughly coincides with the accuracy of the 2D lidar used in our prototype (±40 mm). There is no case where a large number of points exceed this range, which would indicate that the surface is obviously distorted.
- (c)
- From the X-Z view and Y-Z view of each surface in Figure A2, we find that very few sampling points exceed the range of ±0.04 m, the accuracy of the 2D lidar. The shape error of the 3D point cloud caused by the intrinsic error sources of our prototype (such as mechanical error) is therefore small compared with the error of the 2D lidar itself; under the influence of the latter, the former is hardly noticeable.
- (d)
- The distribution graphs of the deviations approximately follow a normal distribution (for surfaces with more points, the normal distribution is more obvious), which indicates that the deviations from the points to the ideal plane are approximately normally distributed. This verifies the previous conclusion, that is, the shape of the 3D point cloud collected by our prototype is correct. If the 3D point cloud were distorted, the distribution of the deviations would be strongly affected by the shape of the distorted surface, and a normal distribution would not be evident in the distribution graphs.
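The per-surface check in (a)–(d) can be reproduced with a short script: fit an ideal plane to each extracted surface, compute signed point-to-plane deviations, and count how many fall within the ±40 mm accuracy band. This is a minimal sketch on a synthetic wall; the noise level (σ = 13 mm) and wall dimensions are illustrative assumptions, not the paper's measured data.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD.

    Returns the centroid and unit normal of the best-fit plane; the
    singular vector with the smallest singular value is the normal.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def plane_deviations(points, centroid, normal):
    """Signed distances (meters) from each point to the fitted plane."""
    return (points - centroid) @ normal

# Synthetic wall: a plane at z = 0 with Gaussian range noise (assumed).
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(0, 3.0, 5000),       # x extent of the wall
    rng.uniform(0, 2.5, 5000),       # y extent of the wall
    rng.normal(0, 0.013, 5000),      # noise normal to the wall
])
c, n = fit_plane(pts)
dev = plane_deviations(pts, c, n)
print(np.mean(np.abs(dev) <= 0.04))  # fraction of points within ±40 mm
```

A histogram of `dev` would show the approximately normal distribution described in (d).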

## Appendix C

Algorithm A1 The Calibration of Angle α Based on the Levenberg–Marquardt Method | |

Input: 50 point clouds C_{O}, that is, $\left\{{\mathit{C}}_{O}^{1},{\mathit{C}}_{O}^{2},{\mathit{C}}_{O}^{3},\dots ,{\mathit{C}}_{O}^{50}\right\}$ | |

Output: Angle α, the installation angle of the triangular plate | |

1 | begin program FindAngleAlpha ($\left\{{\mathit{C}}_{O}^{1},{\mathit{C}}_{O}^{2},{C}_{O}^{3},\dots ,{\mathit{C}}_{O}^{50}\right\}$) |

2 | Step 1: for each point cloud C_{O} in $\left\{{\mathit{C}}_{O}^{1},{\mathit{C}}_{O}^{2},{\mathit{C}}_{O}^{3},\dots ,{\mathit{C}}_{O}^{50}\right\}$, extract the planes corresponding to the 4 walls, and calculate the unit normal vectors n_{1}~n_{4} of these 4 planes. The unit normal vectors extracted from the 50 point clouds form the data sets N_{1}~N_{4} |

3 | N_{1}~N_{4} ← ExtractNormal($\left\{{\mathit{C}}_{O}^{1},{\mathit{C}}_{O}^{2},{\mathit{C}}_{O}^{3},\dots ,{\mathit{C}}_{O}^{50}\right\}$) |

4 | Step 2: calculate angle α through the Levenberg–Marquardt method |

5 | k_{itera} := 0; k_{max} := 500; α_{0} := 30°; v := 2; ε_{1} := 10^{−9}; ε_{2} := 10^{−9}; τ := 10^{−6} |

6 | A := J(α)^{T}J(α); g := J(α)^{T}F_{cost}(α) |

7 | found := (||g||_{∞} ≤ ε_{1}); μ := τ·max{a_{ii}} |

8 | while (not found) and (k_{itera} < k_{max}) |

9 | k_{itera} := k_{itera} + 1; solve (A + μI)Δα = −g |

10 | if ||Δα|| ≤ ε_{2}(||α|| + ε_{2}) |

11 | found := true |

12 | else |

13 | $\rho :=\frac{{F}_{\mathrm{cost}}(\alpha +\mathsf{\Delta}\alpha )-{F}_{\mathrm{cost}}(\alpha )}{{F}_{\mathrm{cost}}(\alpha )+\mathit{J}(\alpha )\mathsf{\Delta}\alpha -{F}_{\mathrm{cost}}(\alpha )}=\frac{{F}_{\mathrm{cost}}(\alpha +\mathsf{\Delta}\alpha )-{F}_{\mathrm{cost}}(\alpha )}{\mathit{J}(\alpha )\mathsf{\Delta}\alpha}$ |

14 | if ρ > 0 |

15 | α := α + Δα |

16 | A := J(α)^{T}J(α); g := J(α)^{T}F_{cost}(α) |

17 | found := (||g||_{∞} ≤ ε_{1}) |

18 | μ := μ·max{1/3, 1 − (2ρ − 1)^{3}}; v := 2 |

19 | else |

20 | μ := μ·v; v := 2·v |

21 | end if |

22 | end if |

23 | end while |

24 | return α |

25 | end program |
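The iteration in lines 5–24 can be sketched in Python as follows. Here `f_cost` and `jac` are placeholders for the paper's cost function and its Jacobian built from the normal-vector data sets N_{1}~N_{4}; the gain ratio is written in the equivalent form of Madsen et al. [Reference 33], and the toy residual at the end is purely illustrative.

```python
import numpy as np

def levenberg_marquardt(f_cost, jac, alpha0, k_max=500,
                        eps1=1e-9, eps2=1e-9, tau=1e-6):
    """Levenberg-Marquardt loop following the structure of Algorithm A1.

    f_cost(alpha) returns the residual vector, jac(alpha) its Jacobian.
    """
    alpha = np.atleast_1d(np.asarray(alpha0, dtype=float))
    J, F = jac(alpha), f_cost(alpha)
    A, g = J.T @ J, J.T @ F
    found = np.max(np.abs(g)) <= eps1
    mu = tau * np.max(np.diag(A))     # mu := tau * max{a_ii}
    v, k = 2.0, 0
    while not found and k < k_max:
        k += 1
        delta = np.linalg.solve(A + mu * np.eye(len(alpha)), -g)
        if np.linalg.norm(delta) <= eps2 * (np.linalg.norm(alpha) + eps2):
            found = True
        else:
            alpha_new = alpha + delta
            # Gain ratio: actual cost decrease over the decrease
            # predicted by the linearized model (Madsen et al. form).
            actual = 0.5 * (F @ F - f_cost(alpha_new) @ f_cost(alpha_new))
            predicted = 0.5 * delta @ (mu * delta - g)
            rho = actual / predicted
            if rho > 0:               # step accepted
                alpha = alpha_new
                J, F = jac(alpha), f_cost(alpha)
                A, g = J.T @ J, J.T @ F
                found = np.max(np.abs(g)) <= eps1
                mu *= max(1.0 / 3.0, 1.0 - (2.0 * rho - 1.0) ** 3)
                v = 2.0
            else:                     # step rejected: raise damping
                mu *= v
                v *= 2.0
    return alpha

# Toy residual r(alpha) = alpha - 30: the minimizer is alpha = 30.
sol = levenberg_marquardt(lambda a: a - 30.0,
                          lambda a: np.eye(1), np.array([10.0]))
print(round(float(sol[0]), 6))  # → 30.0
```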

## References

1. Morales, J.; Martinez, J.L.; Mandow, A.; Reina, A.J.; Pequenoboter, A.; Garciacerezo, A. Boresight Calibration of Construction Misalignments for 3D Scanners Built with a 2D Laser Rangefinder Rotating on Its Optical Center. *Sensors* **2014**, 14, 20025–20040.
2. Morales, J.; Martinez, J.L.; Mandow, A.; Pequenoboter, A.; Garciacerezo, A. Design and development of a fast and precise low-cost 3D laser rangefinder. In Proceedings of the International Conference on Mechatronics, Istanbul, Turkey, 13–15 April 2011; pp. 621–626.
3. Wulf, O.; Wagner, B. Fast 3D scanning methods for laser measurement systems. In International Conference on Control Systems and Computer Science; Editura Politehnica Press, 2003. Available online: https://www.researchgate.net/publication/228586709_Fast_3D_scanning_methods_for_laser_measurement_systems (accessed on 6 January 2021).
4. Kang, J.; Doh, N.L. Full-DOF Calibration of a Rotating 2-D LIDAR with a Simple Plane Measurement. *IEEE Trans. Robot.* **2016**, 32, 1245–1263.
5. Gao, Z.; Huang, J.; Yang, X.; An, P. Calibration of rotating 2D LIDAR based on simple plane measurement. *Sens. Rev.* **2019**, 39, 190–198.
6. Alismail, H.; Browning, B. Automatic Calibration of Spinning Actuated Lidar Internal Parameters. *J. Field Robot.* **2015**, 32, 723–747.
7. Zeng, Y.; Yu, H.; Dai, H.; Song, S.; Lin, M.; Sun, B.; Jiang, W.; Meng, M.Q.H. An Improved Calibration Method for a Rotating 2D LIDAR System. *Sensors* **2018**, 18, 497.
8. Martinez, J.L.; Morales, J.; Reina, A.J.; Mandow, A.; Pequeno-Boter, A.; Garcia-Cerezo, A. Construction and Calibration of a Low-Cost 3D Laser Scanner with 360 degrees Field of View for Mobile Robots. In Proceedings of the 2015 IEEE International Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; pp. 149–154.
9. Murcia, H.F.; Monroy, M.F.; Mora, L.F. 3D Scene Reconstruction Based on a 2D Moving LiDAR. In International Conference on Applied Informatics; Springer: Cham, Switzerland, 2018.
10. Olivka, P.; Krumnikl, M.; Moravec, P.; Seidl, D. Calibration of Short Range 2D Laser Range Finder for 3D SLAM Usage. *J. Sens.* **2016**, 2016.
11. Oberlaender, J.; Pfotzer, L.; Roennau, A.; Dillmann, R. Fast Calibration of Rotating and Swivelling 3-D Laser Scanners Exploiting Measurement Redundancies. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems, Hamburg, Germany, 28 September–2 October 2015; pp. 3038–3044.
12. Kurnianggoro, L.; Hoang, V.-D.; Jo, K.-H. Calibration of Rotating 2D Laser Range Finder Using Circular Path on Plane Constraints. In New Trends in Computational Collective Intelligence; Camacho, D., Kim, S.W., Trawinski, B., Eds.; Springer: Cham, Switzerland, 2015; Volume 572, pp. 155–163.
13. Kurnianggoro, L.; Hoang, V.-D.; Jo, K.-H. Calibration of a 2D Laser Scanner System and Rotating Platform using a Point-Plane Constraint. *Comput. Sci. Inf. Syst.* **2015**, 12, 307–322.
14. Pfotzer, L.; Oberlaender, J.; Roennau, A.; Dillmann, R. Development and calibration of KaRoLa, a compact, high-resolution 3D laser scanner. In Proceedings of the IEEE International Symposium on Safety, Hokkaido, Japan, 27–30 October 2014.
15. Lin, C.C.; Liao, Y.D.; Luo, W.J. Calibration method for extending single-layer LIDAR to multi-layer LIDAR. In Proceedings of the 2013 IEEE/SICE International Symposium on System Integration (SII), Kobe, Japan, 15–17 December 2013.
16. Ueda, T.K.H.; Tomizawa, T. Mobile SOKUIKI Sensor System-Accurate Range Data Mapping System with Sensor Motion. In Proceedings of the 2006 International Conference on Autonomous Robots and Agents, Palmerston North, New Zealand, 12–14 December 2006.
17. Nagatani, K.; Tokunaga, N.; Okada, Y.; Yoshida, K. Continuous Acquisition of Three-Dimensional Environment Information for Tracked Vehicles on Uneven Terrain. In Proceedings of the IEEE International Workshop on Safety, Sendai, Japan, 21–24 October 2008.
18. Matsumoto, M.; Yuta, S. 3D laser range sensor module with roundly swinging mechanism for fast and wide view range image. In Proceedings of the Multisensor Fusion & Integration for Intelligent Systems, Salt Lake City, UT, USA, 5–7 September 2010.
19. Walther, M.; Steinhaus, P.; Dillmann, R. A foveal 3D laser scanner integrating texture into range data. In Proceedings of the International Conference on Intelligent Autonomous Systems 9-IAS, Tokyo, Japan, 7–9 March 2006.
20. Raymond, S.; Nawid, J.; Waleed, K.; Claude, S. A Low-Cost, Compact, Lightweight 3D Range Sensor. In Proceedings of the Australasian Conference on Robotics and Automation. Available online: https://www.researchgate.net/publication/228338590_A_Low-Cost_Compact_Lightweight_3D_Range_Sensor (accessed on 6 January 2021).
21. Dias, P.; Matos, M.; Santos, V. 3D Reconstruction of Real World Scenes Using a Low-Cost 3D Range Scanner. *Comput.-Aided Civ. Infrastruct. Eng.* **2006**, 21, 486–497.
22. Nasrollahi, M.; Bolourian, N.; Zhu, Z.; Hammad, A. Designing LiDAR-equipped UAV Platform for Structural Inspection. In Proceedings of the 34th International Symposium on Automation and Robotics in Construction; IAARC Publications, 2018. Available online: https://www.researchgate.net/publication/328370814_Designing_LiDAR-equipped_UAV_Platform_for_Structural_Inspection (accessed on 6 January 2021).
23. Bertussi, S. Spin_Hokuyo—ROS Wiki. Available online: http://wiki.ros.org/spin_hokuyo (accessed on 13 October 2020).
24. Bosse, M.C.; Zlot, R.M. Continuous 3D scan-matching with a spinning 2D laser. In Proceedings of the IEEE International Conference on Robotics & Automation, Kobe, Japan, 12–17 May 2009.
25. Zheng, F.; Shibo, Z.; Shiguang, W.; Yu, Z. A Real-Time 3D Perception and Reconstruction System Based on a 2D Laser Scanner. *J. Sens.* **2018**, 2018, 1–14.
26. Almqvist, H.; Magnusson, M.; Lilienthal, A.J. Improving Point Cloud Accuracy Obtained from a Moving Platform for Consistent Pile Attack Pose Estimation. *J. Intell. Robot. Syst.* **2014**, 75, 101–128.
27. Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-time. In Robotics: Science and Systems; 2014. Available online: https://www.researchgate.net/publication/311570125_LOAM_Lidar_Odometry_and_Mapping_in_real-time (accessed on 6 January 2021).
28. Zhang, T.; Nakamura, Y. Moving Humans Removal for Dynamic Environment Reconstruction from Slow-Scanning LIDAR Data. In Proceedings of the 2018 15th International Conference on Ubiquitous Robots, Honolulu, HI, USA, 26–30 June 2018; pp. 449–454.
29. David, Y.; Kent, W. "Sweep Diy 3d Scanner Kit" Project. Available online: https://www.servomagazine.com/magazine/article/the-multi-rotor-hobbyist-scanse-sweep-3d-scanner-review? (accessed on 16 October 2020).
30. Hokuyo UST-10/20LX. Available online: https://www.hokuyo-aut.co.jp/search/single.php?serial=16 (accessed on 16 October 2020).
31. Point Cloud Library. Available online: https://pointclouds.org/ (accessed on 16 October 2020).
32. More, J.J. The Levenberg-Marquardt algorithm: Implementation and theory. In Lecture Notes in Mathematics; Springer: Berlin/Heidelberg, Germany, 1978; Volume 630.
33. Madsen, K.; Nielsen, H.B.; Tingleff, O. Methods for Non-Linear Least Squares Problems, 2nd ed.; Informatics and Mathematical Modelling (IMM), Technical University of Denmark (DTU): Lyngby, Denmark, 2004.
34. Ricaud, B.; Joly, C.; de La Fortelle, A. Nonurban Driver Assistance with 2D Tilting Laser Reconstruction. *J. Surv. Eng.* **2017**, 143.
35. Yan, F.; Zhang, S.; Zhuang, Y.; Tan, G. Automated indoor scene reconstruction with mobile robots based on 3D laser data and monocular visual odometry. In Proceedings of the 2015 IEEE International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Shenyang, China, 8–12 June 2015.
36. Colas, F.; Mahesh, S.; Pomerleau, F.; Liu, M.; Siegwart, R. 3D path planning and execution for search and rescue ground robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots & Systems, Tokyo, Japan, 3–7 November 2013.

**Figure 1.** The shape error and attitude error of the 3D point cloud: (**a**) a photo of the environment for collecting the 3D point cloud; (**b**) top view of a 3D point cloud collected by a rotating 2D lidar; (**c**) top view of another 3D point cloud collected by a rotating 2D lidar.

**Figure 3.** A photo of our prototype. The 2D lidar Hokuyo UST-10LX is installed on a rotating unit, and a triangular plate is used to calibrate the attitude error of the 3D point cloud.

**Figure 4.** Three Cartesian coordinate frames. The coordinate frame of the 2D lidar (L) is shown in green, the coordinate frame of the rotating unit before attitude correction (O') is shown in blue, and the coordinate frame of the rotating unit after attitude correction (O) is shown in red.

**Figure 5.** The scanning sector of the 2D lidar does not coincide with the rotation axis of the motor shaft. They are parallel to each other, and the distance between them is 13.9 mm. This design yields a 3D point cloud with a mark that roughly shows the position of the motor shaft at the moment a 3D scan is started, that is, the position of the Y'-axis of coordinate frame O'-X'Y'Z'.

**Figure 6.** In order to extract the point cloud **C**_{tri} from the point cloud **C**_{O'}, we define a special area, which is shown in blue.

**Figure 7.** Extract the planes corresponding to the 4 walls of the conference room, and calculate their unit normal vectors. (**a**) The top view of the room, where the red bold lines are the walls used to extract the planes. Our selection criteria are as follows: the area of the walls should be large so as to facilitate the extraction of the planes, and walls with windows should be avoided, because irregularly shaped curtains may interfere with the extraction of the planes. (**b**) The 3D point cloud of the conference room, from which we extracted the planes corresponding to the 4 walls. We marked them with different colors to distinguish them. There are 6 vectors in (**b**): the vectors **n**_{1}, **n**_{2}, **n**_{3}, **n**_{4} are the unit normal vectors of the 4 planes, and the vectors **i** and **j** are the unit vectors of the X-axis and Y-axis of coordinate frame O.

**Figure 10.** The maximum, average, and minimum values of the angle error φ_{error} of 5 sampling points in each mode.

Mode | 1 | 2 | 3 | 4 | 5 | 6 | 7
---|---|---|---|---|---|---|---
T/s | 1 | 2 | 4 | 5 | 8 | 10 | 16
γ/° | 4.5 | 2.25 | 1.125 | 0.9 | 0.5625 | 0.45 | 0.28125
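The γ values in the table are consistent with γ = 180°/(f·T), where T is the time for the motor to sweep 180° and f is the 2D lidar's scan rate; f = 40 Hz is an assumption here (consistent with the UST-10LX's 25 ms scan period), not a figure stated in this table.

```python
# Horizontal angular step between adjacent scan lines of a 3D scan:
# the motor sweeps 180 degrees in T seconds while the 2D lidar
# delivers SCAN_RATE_HZ scan lines per second (assumed 40 Hz).
SCAN_RATE_HZ = 40

def angular_step_deg(T_s: float) -> float:
    return 180.0 / (SCAN_RATE_HZ * T_s)

for T in (1, 2, 4, 5, 8, 10, 16):
    print(T, angular_step_deg(T))  # reproduces the gamma row above
```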

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Yuan, C.; Bi, S.; Cheng, J.; Yang, D.; Wang, W.
Low-Cost Calibration of Matching Error between Lidar and Motor for a Rotating 2D Lidar. *Appl. Sci.* **2021**, *11*, 913.
https://doi.org/10.3390/app11030913
