A Cross-Line Structured Light Scanning System Based on a Measuring Arm

Abstract: The measurement system proposed in this paper, which combines a measuring arm with line structured light, has a wide range of applications. To improve the scanning efficiency, the system uses two single-line structured lights to form crosshair structured light, which we combine with a measuring arm to form a comprehensive scanning measurement system. The calibration method of Zhengyou Zhang and a calibration board are used to complete the parameter calibration of the sensors and cameras, as well as the hand-eye calibration of the measuring arm. For complex curved-surface objects, the extraction of the cross-line structured light optical centers suffers from ambiguity. Therefore, we introduce periodic control of the two line structured light sources in order to resolve this light extraction polysemy. Our experimental results indicate that the proposed system can effectively perform crosshair structured light scanning of large, complex surfaces.


Introduction
With the rapid development of automation technology, increasingly high machining accuracy of complex parts is required, which has put pressure on modern measurement technology in terms of developing the necessary production components and equipment. Structured light scanning measurement technology is fast, highly flexible, and highly accurate, and it is barely influenced by the surface material of the measured object. It has been widely used for surface contour extraction, reverse engineering, and surface defect detection of products, as well as in other fields [1,2]. As the structured light and camera alone cannot scan all of an object, a coordinate measuring instrument, manipulator, or turntable must be used to obtain 3D information about the object [3]. Measuring arms have been widely used for the measurement of workpiece size due to their flexibility, light weight, and wide measuring range [4]. Combining the advantages of a measuring arm and structured light measurement, we constructed a measuring arm structured-light scanning system by fixing the structured light source and a camera at the upper end of a measuring arm.
Structured-light scanning systems can be divided into single-line and crosshair structured light systems according to the type of light. A single-line structured light emitter projects only one laser line, whose image is captured by the camera. The extraction of the light strip is not polysemous and is easy to achieve, so this type of approach is widely used [5-7]. However, along the orthogonal direction of the structured light, the digital modeling accuracy of a complex surface is not high, so the surface must be measured several times along different directions. Crosshair structured light can solve the problem of the measured edge being parallel to the light bar, as its two light planes are perpendicular to each other. However, due to the ambiguity of the plane on which the light strips extracted from the camera lie, 3D reconstruction is difficult. Many scholars have conducted research into how to solve this polysemy problem. Liguo Zhang studied the application of cross-line structured light in searching for and tracking electric welds using a template-matching algorithm [8]. Yu Zhejiang used the light plane constraints of a two-sided camera to match the extracted light bar center points [9]. Michael Bleier considered the use of different colors to distinguish between the two light planes, that is, distinguishing the lasers by the color of their light [10].
In this paper, the method used to solve the ambiguity of light strip extraction was to periodically control the two line laser emitters. Specifically, first, only one laser was turned on; it was then turned off, at which point the other laser was turned on. Then, the two lasers were turned on at the same time. This pattern was continued periodically. The advantages of this method are as follows: it can scan complex surfaces, it has a low cost, and it improves the scanning efficiency.

Methodology
To achieve rapid and accurate acquisition of the contour of the measured object, we designed a crosshair structured-light scanning system based on the structured light 3D vision measurement principle. Figure 1 shows a schematic diagram of the scanning system. The light planes of single-line structured light A and single-line structured light B are perpendicular to each other, thus forming the cross-line structured light. The cross-line structured light and the 2D camera are fixed together on mounting plate A, and the whole assembly is installed at the end of the measuring arm. The measuring arm, camera, and lasers are connected to a computer through a USB interface, enabling unified control.

Model
Each part of the measuring arm cross-line structured light system must be calibrated to work normally.


3D Reconstruction
As shown in Figure 2, the projection of point P in space on the image plane is p_u = [x_u, y_u]^T.
Due to the lens distortion, the actual imaging point is p_d = [x_d, y_d]^T, and the world coordinate of point P is:

P_w = [X_w, Y_w, Z_w]^T. (1)

This is converted to the camera frame O_c X_c Y_c Z_c by:

P_c = [X_c, Y_c, Z_c]^T = R P_w + t, (2)

where R and t represent the rotation and translation, respectively, between the world coordinate system and the camera coordinate system. The relationship between p_u and P_c is as follows:

x_u = X_c / Z_c, y_u = Y_c / Z_c. (3)

Considering only the radial and tangential distortion of the lens, the relationship between the actual imaging point p_d and the theoretical point p_u is:

x_d = x_u (1 + k_1 r^2 + k_2 r^4 + k_5 r^6) + 2 k_3 x_u y_u + k_4 (r^2 + 2 x_u^2),
y_d = y_u (1 + k_1 r^2 + k_2 r^4 + k_5 r^6) + k_3 (r^2 + 2 y_u^2) + 2 k_4 x_u y_u, (4)

where (k_1, k_2, k_5) are the radial and (k_3, k_4) are the tangential distortion parameters and r^2 = x_u^2 + y_u^2. The pixel coordinates on the image p = [u, v]^T are:

u = f_x x_d + c_x, v = f_y y_d + c_y, (5)

where f_x and f_y are the respective focal lengths and c_x and c_y are the principal point coordinates. In the case of known pixel coordinates, Equation (5) can be inverted and Equation (4) used to calculate p_u. However, p_u alone does not determine P_c, as the depth Z_c is unconstrained; additional constraints are required. Suppose the equation of the light plane of the laser in the camera coordinate system is:

a X_c + b Y_c + c Z_c + d = 0. (6)

Putting Equation (3) into Equation (6), we obtain:

(a x_u + b y_u + c) Z_c + d = 0. (7)

P is then solved for in the camera coordinate system as follows:

Z_c = -d / (a x_u + b y_u + c), X_c = x_u Z_c, Y_c = y_u Z_c. (8)

In this paper, the method in [11] was used to calibrate the structured light plane of each line laser.
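To make the pipeline above concrete, the following sketch inverts Equation (5), undoes the distortion of Equation (4) by fixed-point iteration, and intersects the resulting ray with the light plane of Equation (6). The intrinsic and distortion values are illustrative placeholders, not the paper's calibration results.

```python
import numpy as np

# Hypothetical intrinsics/distortion with the structure of Eqs. (4)-(5);
# these are placeholder values, not the paper's calibrated parameters.
FX, FY, CX, CY = 5145.0, 5153.0, 1517.0, 1023.0
K1, K2, K5 = -0.099, 0.195, 0.417          # radial distortion
K3, K4 = -6.6e-5, -9.8e-5                  # tangential distortion

def distort(xu, yu):
    """Apply Eq. (4): ideal normalized point -> distorted normalized point."""
    r2 = xu * xu + yu * yu
    radial = 1 + K1 * r2 + K2 * r2**2 + K5 * r2**3
    xd = xu * radial + 2 * K3 * xu * yu + K4 * (r2 + 2 * xu * xu)
    yd = yu * radial + K3 * (r2 + 2 * yu * yu) + 2 * K4 * xu * yu
    return xd, yd

def undistort(xd, yd, iters=10):
    """Invert Eq. (4) by fixed-point iteration (no closed form exists)."""
    xu, yu = xd, yd
    for _ in range(iters):
        xd_hat, yd_hat = distort(xu, yu)
        xu += xd - xd_hat
        yu += yd - yd_hat
    return xu, yu

def pixel_to_3d(u, v, plane):
    """Triangulate a laser pixel with the light-plane constraint (Eqs. 6-8)."""
    a, b, c, d = plane
    xd, yd = (u - CX) / FX, (v - CY) / FY   # invert Eq. (5)
    xu, yu = undistort(xd, yd)
    zc = -d / (a * xu + b * yu + c)          # Eq. (8)
    return np.array([xu * zc, yu * zc, zc])
```

Projecting a known point on the plane and recovering it with `pixel_to_3d` round-trips to the same 3D coordinates, which is a convenient sanity check for a calibration.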


Laser Line Extraction
The control of the two single-line lasers was carried out as explained next. Figure 3 depicts the manner in which the two lasers were turned off and on periodically. "Time" denotes the time required for the camera to acquire one frame, with which the laser switching is aligned and synchronized in hardware. Figure 4 shows the pictures captured by the camera when laser A, laser B, and lasers A and B together shine on a plane. The direction of the line in the 2D image was estimated locally by computing the eigenvalues and eigenvectors of the Hessian matrix. The response of the ridge detector, given by the maximum absolute eigenvalue, is a good indicator of the saliency of the extracted line points. It should be noted that this algorithm is too complex to run on an FPGA.
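A minimal illustration of this Hessian-based direction estimate, using a synthetic vertical stripe and finite-difference second derivatives in place of a Gaussian-smoothed image (an assumption of this sketch, not necessarily the paper's exact implementation):

```python
import numpy as np

# Synthetic vertical laser stripe with a Gaussian cross-section centered at x = 32.
profile = np.exp(-((np.arange(64) - 32.0) ** 2) / (2 * 2.0 ** 2))
img = np.ones((64, 1)) * profile[None, :]

# Second derivatives by central differences (in practice the image would first
# be Gaussian-smoothed, as in standard ridge detectors).
dxx = np.gradient(np.gradient(img, axis=1), axis=1)
dyy = np.gradient(np.gradient(img, axis=0), axis=0)
dxy = np.gradient(np.gradient(img, axis=1), axis=0)

r, c = 32, 32                                 # a point on the stripe center
H = np.array([[dxx[r, c], dxy[r, c]],
              [dxy[r, c], dyy[r, c]]])
w, v = np.linalg.eigh(H)
normal = v[:, np.argmax(np.abs(w))]           # direction across the stripe
saliency = np.max(np.abs(w))                  # ridge response: max |eigenvalue|
```

For this vertical stripe, the eigenvector of maximum absolute eigenvalue points along x, i.e., perpendicular to the line, and the saliency is large only on the stripe.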
For the extraction of laser lines, we used the gray center method. When laser A is turned on, the current frame is denoted by N, and the center of the light bar is extracted from left to right and top to bottom. In the j-th column of the image, the light bar region is S. Then, the stripe center coordinate is calculated as:

u_N^A(j) = ∑_{y∈S} gray(j, y) · y / ∑_{y∈S} gray(j, y). (9)

All the collected light bar centers are collectively denoted as:

U_N^A = { u_N^A(j) }. (10)

Similarly, when laser B is turned on, the current frame is N + 1, and the light bar in the j-th column is located in region S. The center is likewise:

u_{N+1}^B(j) = ∑_{y∈S} gray(j, y) · y / ∑_{y∈S} gray(j, y). (11)

The coordinates of all the collected light bars are denoted as follows:

U_{N+1}^B = { u_{N+1}^B(j) }. (12)

When lasers A and B are opened at the same time, light bars are found in all column directions. At this time, the frame is N + 2, and the collected light bars are denoted as:

U_{N+2}^C = { u_{N+2}^C(j) }. (13)

The positions of the light bars recorded by U_{N+2}^C in Equation (13) are polysemous and need to be distinguished, for which the following conditions can be used:

|u_{N+2}^C(j) - u_N^A(j)| < T (14)

and

|u_{N+2}^C(j) - u_{N+1}^B(j)| < T. (15)

In the comparison, T is the allowable threshold for the extracted light bar position. If a stripe point satisfies Equation (14), it is determined to belong to laser A; if it satisfies Equation (15), the stripe belongs to laser B. If three pictures are captured by the camera, we obtain four sets of points with the cross-line laser (one each from frames N and N + 1, and two from frame N + 2), compared to three sets of points with a single-line laser. Through this method of increasing the number of 3D points per frame, the scanning frame rate ideally increases to 1.33 times that of a single-line laser.
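The per-column gray-center extraction of Equation (9) and the A/B disambiguation of Equations (14) and (15) can be sketched as follows; the `thresh` parameter for segmenting the stripe region S is an assumption, as the paper does not specify how S is selected.

```python
import numpy as np

T = 10  # matching threshold in pixels, as used in the scanning experiments

def gray_centers(img, thresh=50):
    """Eq. (9): intensity-weighted centroid of the stripe in each image column."""
    centers = {}
    for j in range(img.shape[1]):
        col = img[:, j].astype(float)
        s = col > thresh                      # stripe region S in column j
        if s.any():
            y = np.arange(img.shape[0])[s]
            centers[j] = float((col[s] * y).sum() / col[s].sum())
    return centers

def split_cross_frame(u_c, u_a, u_b, t=T):
    """Assign frame-N+2 stripe points to laser A or B (Eqs. 14-15)."""
    a_pts, b_pts = {}, {}
    for j, y in u_c.items():
        da = abs(y - u_a[j]) if j in u_a else np.inf
        db = abs(y - u_b[j]) if j in u_b else np.inf
        if min(da, db) >= t:
            continue                          # no match within T: leave ambiguous
        (a_pts if da <= db else b_pts)[j] = y # smallest difference wins
    return a_pts, b_pts
```

`split_cross_frame` also encodes the tie-break described later in the paper: when both Equations (14) and (15) are satisfied, the smaller difference decides.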

Hand-Eye Calibration
Hand-eye calibration is required to establish a relationship between the measuring arm and the camera. In the system shown in Figure 1, assuming that there is a fixed point P_board on the calibration board, its coordinate P_base^i in the base coordinate system when the measuring arm takes pose i satisfies the following relationship:

P_base^i = base_tool T_i · tool_cam T · cam_world T_i · P_board, (16)

where base_tool T_i is the transformation matrix between the tool coordinate system at the end of the measuring arm and the base under pose i, which is directly fed back by the measuring arm; tool_cam T is the transformation between the camera coordinate system and the tool coordinate system at the end of the measuring arm, which is the target relationship to be solved in the hand-eye calibration; and cam_world T_i is the relationship between the points on the calibration plate and the camera coordinate system, which can be solved using the calibration method of Zhengyou Zhang [12]. Additionally, in the j-th pose:

P_base^j = base_tool T_j · tool_cam T · cam_world T_j · P_board. (17)

As the calibration plate is stationary with respect to the measuring arm base:

P_base^i = P_base^j. (18)

Combining Equations (16)-(18), we have:

base_tool T_i · tool_cam T · cam_world T_i = base_tool T_j · tool_cam T · cam_world T_j. (19)

Rearranging Transformation (19), we have:

(base_tool T_j)^(-1) · base_tool T_i · tool_cam T = tool_cam T · cam_world T_j · (cam_world T_i)^(-1). (20)

Letting

A = (base_tool T_j)^(-1) · base_tool T_i, (21)

B = cam_world T_j · (cam_world T_i)^(-1), (22)

and X = tool_cam T, Equation (20) can be written as:

A X = X B. (23)

Obviously, A and B are known for each pair of poses i and j. For Equation (23), the method in [13] was used to solve the relationship X between the camera coordinate system and the tool coordinate system at the end of the measuring arm.
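The relationship AX = XB can be checked numerically with synthetic poses. The sketch below simulates Equation (16) with a known (in practice unknown) hand-eye transform X and verifies that the A and B built from two arm poses satisfy Equation (23); it is a consistency check of the derivation, not the solver of [13].

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_T(rng):
    """Random rigid transform: rotation via QR, arbitrary translation."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.linalg.det(q))           # ensure a proper rotation (det = +1)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = q, rng.normal(size=3)
    return T

# Unknown hand-eye transform X (tool <- camera) and fixed board pose (base <- board).
X = rand_T(rng)
base_board = rand_T(rng)

def cam_board(base_tool):
    """Simulate the camera's view of the board for a given arm pose (Eq. 16)."""
    return np.linalg.inv(base_tool @ X) @ base_board

Ti, Tj = rand_T(rng), rand_T(rng)            # two arm poses, fed back by the arm
Ci, Cj = cam_board(Ti), cam_board(Tj)

A = np.linalg.inv(Tj) @ Ti                   # relative tool motion, Eq. (21)
B = Cj @ np.linalg.inv(Ci)                   # relative camera motion, Eq. (22)
assert np.allclose(A @ X, X @ B)             # the hand-eye equation, Eq. (23)
```

In a real calibration, A and B come from measured pose pairs and X is the unknown; collecting several pose pairs (five sets were used in the experiments below) makes the solution well-conditioned.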

Laser Plane Extraction
The relationship R and t in Equation (2) between the circle grid board and the camera can be determined from the known camera parameters in Equations (4) and (5). Thus, we obtained the laser line positions (height A) in the camera coordinate system using R and t. Fixing the camera and moving the circle grid board to another position, we obtained the laser line positions (height B) in the same manner as for height A. The laser line plane was then calculated using the lines at heights A and B.
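A plane fit over the two line point sets can be done by least squares on the stacked points, e.g., via SVD. The synthetic lines below stand in for the measured height-A and height-B stripe centers; the SVD approach is a common choice, though the paper does not state which fitting method [11] uses.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane a*x + b*y + c*z + d = 0 through Nx3 points (SVD)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                          # direction of least variance
    return np.append(normal, -normal @ centroid)

# Two laser-line point sets at different board heights: synthetic stand-ins
# for the "height A" and "height B" lines measured in the camera frame.
t = np.linspace(-1, 1, 50)
line_a = np.stack([t, 0 * t, 500 + 0 * t], axis=1)        # line at y = 0, Z = 500
line_b = np.stack([t, 0.5 + 0 * t, 520 + 0 * t], axis=1)  # shifted line at Z = 520
plane = fit_plane(np.vstack([line_a, line_b]))
```

Two non-collinear lines determine the light plane exactly; with noisy real data, the same fit averages over all stripe points.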

The Hardware System
Figure 5 shows the hardware system used in this experiment. The system mainly consisted of a flexible 7-axis measuring arm, laser A, laser B, and a 2D camera. The hardware parameters were as follows: (1) precision of the measuring arm: 0.02 mm; (2) laser wavelength: 405 nm. Figure 6 shows the calibration camera and the circular array calibration plate used for hand-eye calibration. In Figure 7, the circle spacing was 3.75 mm, the circle diameter was 1.875 mm, and the accuracy was 0.001 mm. The material was ceramic. The metal gauge block used to verify the calibration result is also shown.

Calibration of the Camera and Laser Light Plane
First, the camera was calibrated, following which the calibrated camera parameters were used to calibrate the optical planes of lasers A and B. As shown in Figure 8, 15 pictures were collected from various angles, and the internal parameters of the camera were obtained using the calibration method of Zhengyou Zhang, where fx = 5145.362, fy = 5153.664, cx = 1517.375, and cy = 1023.363. In the obtained distortion parameters, for the radial distortion, k1 = -0.09903, k2 = 0.195427, and k5 = 0.41673; meanwhile, for the tangential distortion, k3 = -0.000065974 and k4 = -0.0000978477. As shown in Figure 9, the upper and lower graphs on the left were used to calibrate the optical plane of laser B, while the upper and lower graphs on the right were used to calibrate the optical plane of laser A. The calibrated light-plane equation of laser A was -0.659905 Xc - 0.551732 Yc + 0.51 Zc - 68.7215 = 0, while that of laser B was 0.76417 Xc - 0.543156 Yc + 0.34787 Zc - 49.616 = 0. The accuracy of the calibration was verified by measuring a height of 20 mm and examining the statistical height distribution, as explained next.
As shown in Figure 10, a measurement block with a height of 20 mm was used for verification. The abscissa represents the measured height (in mm), and the ordinate represents the height value statistics (in counts). The mean height was 19.989 mm, and the standard deviation (std) of the height was 0.006 mm.

Hand-Eye Calibration
As shown in Figure 11, the joint arm with the camera photographed the calibration plate in different postures, and the coordinates fed back by the joint arm were recorded. A total of five sets were captured.


Scanning Test
The calibration result X and laser light plane parameters were used for scanning and testing.

Figure 12 shows a scan of the calibration plate (note that only a single scan could be completed), Figure 13 shows a correct scan of a complex surface, and Figure 14 shows the scanning of a metal pellet.
Figure 15 shows a white flat board, with a scanning flatness of 0.015 mm; the right panel of Figure 15 shows the point cloud obtained from the board. All points were fit to a plane, and the distance from each point to the plane was calculated, yielding the statistical histogram shown in Figure 16. The abscissa represents the distance (in mm), while the ordinate is the number of points in the distribution, with 96% of the points distributed within +/-0.05 mm. The main deviation was due to the hand-eye calibration.
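The flatness evaluation described here (fit a plane, then histogram the point-to-plane distances) can be reproduced as follows on a synthetic noisy board scan; the noise level is illustrative, not the measured data.

```python
import numpy as np

def flatness_stats(points, tol=0.05):
    """Fit a plane to a scanned point cloud and report point-to-plane spread."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                          # unit normal of the best-fit plane
    dist = (points - centroid) @ normal      # signed point-to-plane distances
    within = np.mean(np.abs(dist) <= tol)    # fraction inside +/- tol (mm)
    return dist.std(), within

# Synthetic flat-board scan with ~0.01 mm Gaussian noise, for illustration only.
rng = np.random.default_rng(1)
xy = rng.uniform(-50, 50, size=(2000, 2))
z = 0.01 * rng.normal(size=2000)
std, within = flatness_stats(np.column_stack([xy, z]))
```

On real data, `dist` would be binned into the histogram of Figure 16, and `within` corresponds to the reported fraction inside +/-0.05 mm.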
In the scan, the determination threshold T of Equation (14) was set as 10 pixels. If Equations (14) and (15) both met the conditions, the one with the smallest difference was chosen.

Conclusions
In this paper, we presented a cross-line laser scanning system with an articulated arm. We showed how a single-line laser scanning technique can be extended to a cross-line laser scanning system in order to improve the scanning efficiency. Moreover, we provided our implementation details, through which an object point cloud can finally be obtained, and showed how to differentiate the two lines of the cross in one frame by comparing the pixel positions extracted throughout the frame sequence. We experimentally demonstrated that high-quality, accurate scans can be achieved while reducing the time required to scan a complex surface; in particular, the frame rate of 3D points was 1.33 times that of single-line structured light with the same 2D camera. However, further work is necessary to increase the proportion of frames in which lasers A and B are open simultaneously, which may further improve the scanning efficiency by adjusting the periodic control of the two lasers.


Figure 2. Cross-line projector with a single fixed camera.


Figure 5. Schematic of the circle grid board used to extract the laser line plane.


Figure 7. Calibration board and gauge blocks.


Figure 8. Calibration of the camera. Laser B is fixed by hot glue.

Figure 9. Calibration of the laser line plane.


Figure 15. Flat board and the scanning result obtained.

Figure 16. Histogram of distance to plane.

