Article

A Cross-Line Structured Light Scanning System Based on a Measuring Arm

Department of Electronics and Information, Fukuoka Institute of Technology, Fukuoka 811-0295, Japan
* Author to whom correspondence should be addressed.
Instruments 2023, 7(1), 5; https://doi.org/10.3390/instruments7010005
Submission received: 16 November 2022 / Revised: 7 December 2022 / Accepted: 23 December 2022 / Published: 3 January 2023
(This article belongs to the Special Issue Photonic Devices Instrumentation and Applications II)

Abstract

The measurement system proposed in this paper, which combines a measuring arm with line structured light, has a wide range of applications. To improve scanning efficiency, the system uses two single-line structured lights to form a crosshair structured light, which is combined with a measuring arm to form a complete scanning measurement system. Zhengyou Zhang's calibration method and a calibration board are used to calibrate the sensor and camera parameters, as well as to perform hand–eye calibration of the measuring arm. For complex curved-surface objects, extraction of the cross-line structured light optical centers suffers from ambiguity. We therefore periodically control the two line structured light sources in order to resolve this extraction polysemy. Our experimental results indicate that the proposed system can effectively perform crosshair structured light scanning of large, complex surfaces.

1. Introduction

With the rapid development of automation technology, increasingly high machining accuracy of complex parts is required, which has put pressure on modern measurement technology in terms of developing the necessary production components and equipment. Structured light scanning measurement is fast, flexible, and accurate, and is barely influenced by the surface material of the measured object. It has been widely used for surface contour extraction, reverse engineering, and surface defect detection of products, as well as in other fields [1,2]. As the structured light and camera alone cannot scan all of an object, a coordinate measuring instrument, manipulator, or turntable must be used to obtain complete 3D information about the object [3]. Measuring arms have been widely used for the measurement of workpiece size, due to their flexibility, light weight, and wide measuring range [4]. To combine the advantages of both approaches, we constructed a measuring arm structured-light scanning system by fixing the structured light emitters and a camera at the end of a measuring arm.
Structured-light scanning systems can be divided into single-line structured light systems and crosshair structured light systems according to the type of light. A single-line structured light emitter projects only a single laser line, whose image in the camera is unambiguous; light strip extraction is therefore not polysemous and is easy to achieve, and this type of approach is widely used [5,6,7]. However, along the direction orthogonal to the structured light, the digital model accuracy for a complex surface is low, so the surface must be measured several times along different directions. Crosshair structured light can solve the problem of a measured edge being parallel to the light bar, as its two light planes are perpendicular to each other. However, because the plane to which each light strip extracted from the camera image belongs is ambiguous, 3D reconstruction is difficult. Many scholars have studied how to solve this polysemy problem. Liguo Zhang studied the application of cross-line structured light to searching for and tracking welds using a template-matching algorithm [8]. Zhejiang Yu used the light plane constraints of a binocular camera to match the extracted light bar center points [9]. Michael Bleier distinguished the two light planes by using lasers of different colors [10].
In this paper, the method used to solve the ambiguity of light strip extraction was to periodically control the two line laser emitters. Specifically, first, only one laser was turned on; it was then turned off, at which point the other laser was turned on; then, the two lasers were turned on at the same time. This pattern was repeated periodically. The advantages of this method are as follows: it can scan complex surfaces, it has low cost, and it improves the scanning efficiency.

2. Methodology

To achieve rapid and accurate acquisition of the contour of the measured object, we designed a crosshair structured-light scanning system based on the structured light 3D vision measurement principle. Figure 1 shows a schematic diagram of the scanning system. The light planes of single-line structured lights A and B are perpendicular to each other, forming the cross-line structured light. The cross-line structured light and the 2D camera are fixed together on mounting plate A, and the whole assembly is installed at the end of the measuring arm. The measuring arm, camera, and lasers are connected to a computer through USB interfaces, enabling unified control.

2.1. Model

Each part of the measuring arm cross-line structured light system must be calibrated before the system can work properly.

2.2. 3D Reconstruction

As shown in Figure 2, the projection of a point P in space onto the image plane is p_u = (x_u, y_u)^T. Due to lens distortion, the actual imaging point is p_d = (x_d, y_d)^T, and the world coordinate of point P is:
P_w = (x_w, y_w, z_w)^T.    (1)
This is converted to the camera frame O_c X_c Y_c Z_c by:
P_c = (X_c, Y_c, Z_c)^T = R P_w + t,    (2)
where R and t represent the rotation and translation, respectively, between the world coordinate system and the camera coordinate system. The relationship between p_u and P_c is as follows:
(x_u, y_u)^T = (X_c / Z_c, Y_c / Z_c)^T.    (3)
Considering only the radial and tangential distortion of the lens, the relationship between the actual imaging point p_d and the ideal point p_u is:
x_d = (1 + k_1 r^2 + k_2 r^4 + k_5 r^6) x_u + 2 k_3 x_u y_u + k_4 (r^2 + 2 x_u^2),
y_d = (1 + k_1 r^2 + k_2 r^4 + k_5 r^6) y_u + 2 k_4 x_u y_u + k_3 (r^2 + 2 y_u^2),    (4)
where (k_1, k_2, k_5) are the radial and (k_3, k_4) the tangential distortion parameters, and r^2 = x_u^2 + y_u^2. The pixel coordinates p = (u, v)^T on the image are:
(u, v, 1)^T = [f_x 0 c_x; 0 f_y c_y; 0 0 1] (x_d, y_d, 1)^T,    (5)
where f_x and f_y are the focal lengths and (c_x, c_y) is the principal point. Given a pixel, Equation (5) and then Equation (4) can be used to recover p_u. However, P_c cannot yet be computed, as the constraints are insufficient; an additional constraint is required.
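As a concrete illustration, the pixel-to-ideal-point recovery described above can be sketched in Python/NumPy as follows. This is a minimal sketch, not the authors' implementation: the fixed-point inversion of the distortion model and the function names are our own, and the intrinsic and distortion values are the ones reported in Section 3.2.

```python
import numpy as np

# Intrinsics and distortion parameters from the calibration in Section 3.2.
fx, fy, cx, cy = 5145.362, 5153.664, 1517.375, 1023.363
k1, k2, k5 = 0.09903, 0.195427, 0.41673      # radial terms
k3, k4 = 0.000065974, 0.0000978477           # tangential terms

def distort(xu, yu):
    """Forward model of Equation (4): ideal -> distorted normalized coords."""
    r2 = xu * xu + yu * yu
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k5 * r2**3
    xd = radial * xu + 2.0 * k3 * xu * yu + k4 * (r2 + 2.0 * xu * xu)
    yd = radial * yu + 2.0 * k4 * xu * yu + k3 * (r2 + 2.0 * yu * yu)
    return xd, yd

def undistort_pixel(u, v, iters=20):
    """Invert Equation (5) and then Equation (4) by fixed-point iteration,
    mapping a pixel (u, v) back to the ideal point (x_u, y_u)."""
    xd = (u - cx) / fx
    yd = (v - cy) / fy
    xu, yu = xd, yd                  # initial guess: no distortion
    for _ in range(iters):
        xdh, ydh = distort(xu, yu)
        xu += xd - xdh               # correct by the model residual
        yu += yd - ydh
    return xu, yu
```

Near the optical axis the distortion is a small perturbation, so the fixed-point iteration converges quickly.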
Suppose the equation of the laser light plane in the camera coordinate system is:
a X + b Y + c Z + D = 0.    (6)
Substituting Equation (3) into Equation (6), we obtain:
a x_u Z_c + b y_u Z_c + c Z_c + D = 0.    (7)
P is then solved for in the camera coordinate system as follows:
Z_c = −D / (a x_u + b y_u + c),  X_c = x_u Z_c,  Y_c = y_u Z_c.    (8)
In this paper, the method in [11] was used to calibrate the structured light plane of a line.
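A minimal Python/NumPy sketch of this ray–plane intersection (Equation (8)) is given below; the function name and interface are our own illustration, not the authors' code.

```python
import numpy as np

def triangulate_on_plane(xu, yu, plane):
    """Equation (8): intersect the viewing ray (xu*Zc, yu*Zc, Zc) with the
    calibrated light plane a*X + b*Y + c*Z + D = 0 (camera frame).
    `plane` is the tuple (a, b, c, D); (xu, yu) is the undistorted point."""
    a, b, c, D = plane
    Zc = -D / (a * xu + b * yu + c)
    return np.array([xu * Zc, yu * Zc, Zc])
```

For example, with a plane Z = 100 (i.e., (a, b, c, D) = (0, 0, 1, −100)) and (x_u, y_u) = (0.1, 0.2), the reconstructed point is (10, 20, 100).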

2.3. Laser Line Extraction

The control of the two single-line lasers was carried out as explained next.
Figure 3 depicts the manner in which the two lasers were turned off and on periodically. "Time" denotes the time required for the camera to acquire one frame, to which the laser switching is aligned and synchronized in hardware. Figure 4 shows the pictures captured by the camera when laser A, laser B, and both lasers together illuminate a plane.
The direction of the line in the 2D image was estimated locally by computing the eigenvalues and eigenvectors of the Hessian matrix. The response of the ridge detector, given by the eigenvalue of maximum absolute value, is a good indicator of the saliency of the extracted line points. It should be noted that, due to its complexity, this algorithm is not able to run on an FPGA.
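The Hessian-based ridge response can be sketched with Gaussian derivative filters as below. This is a generic Steger-style illustration we supply (assuming SciPy's `gaussian_filter` for the derivatives), not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_response(img, sigma=3.0):
    """Per pixel, return the eigenvalue of the Hessian with the largest
    magnitude; bright ridges give strongly negative values, and the
    corresponding eigenvector (not returned here) gives the line normal."""
    img = img.astype(float)
    # Second-order Gaussian derivatives (axis 0 = rows/y, axis 1 = cols/x).
    hyy = gaussian_filter(img, sigma, order=(2, 0))
    hxx = gaussian_filter(img, sigma, order=(0, 2))
    hxy = gaussian_filter(img, sigma, order=(1, 1))
    # Closed-form eigenvalues of the symmetric 2x2 Hessian.
    tr = hxx + hyy
    disc = np.sqrt(((hxx - hyy) * 0.5) ** 2 + hxy ** 2)
    lam1 = tr * 0.5 + disc
    lam2 = tr * 0.5 - disc
    # Keep the eigenvalue with the largest absolute value.
    return np.where(np.abs(lam1) >= np.abs(lam2), lam1, lam2)
```

On a synthetic image containing a bright vertical stripe, the response is strongly negative on the stripe and near zero elsewhere.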
For the extraction of the laser lines, we used the gray center method. When laser A is turned on, the current frame is denoted by N, and the center of the light bar is extracted from left to right and top to bottom. If the light bar region in the j-th column of the image is S, the center coordinate is calculated as:
u_j^{N,c} = Σ_{y∈S} gray(j, y) · y / Σ_{y∈S} gray(j, y).    (9)
All the collected light bar centers are collectively denoted as:
U_N^C = {u_j^{N,c}}.    (10)
Similarly, when laser B is turned on, the current frame is N + 1 and the light bar region in the j-th column is S. The center coordinate is:
u_j^{N+1,c} = Σ_{y∈S} gray(j, y) · y / Σ_{y∈S} gray(j, y).    (11)
The coordinates of all the collected light bars are denoted as follows:
U_{N+1}^C = {u_j^{N+1,c}}.    (12)
When lasers A and B are turned on at the same time, light bars are found in all column directions. At this time, the frame is N + 2 and the collected light bars are denoted as:
U_{N+2}^C = {u_j^{N+2,c}}.    (13)
The positions of the light bars recorded in U_{N+2}^C in Equation (13) are polysemous and need to be distinguished, for which the following conditions can be used:
|u_j^{N+2,c} − u_j^{N,c}| ≤ T,    (14)
and
|u_j^{N+2,c} − u_j^{N+1,c}| ≤ T.    (15)
In this comparison, T is the allowable threshold for matching the extracted light bar positions. If a light strip satisfies Equation (14), it is assigned to laser A; if it satisfies Equation (15), it is assigned to laser B. For every three pictures captured by the camera, we obtain four sets of points with the cross-line laser, compared with three sets with a single-line laser. Ideally, this method therefore increases the 3D point frame rate to 1.33 times that of a single-line laser.
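The gray-center extraction of Equation (9) and the threshold test of Equations (14) and (15) can be sketched as follows. This is our own minimal illustration: the intensity threshold defining region S, the NaN convention for empty columns, and the tie-breaking by smaller residual (as described in Section 3.4) are assumptions, not the authors' code.

```python
import numpy as np

def column_centers(img, thresh=30):
    """Gray-weighted center (Equation (9)) of the light bar in every column.
    Returns an array of row coordinates, NaN where no bar is found."""
    h, w = img.shape
    centers = np.full(w, np.nan)
    rows = np.arange(h, dtype=float)
    for j in range(w):
        col = img[:, j].astype(float)
        mask = col > thresh              # region S of the light bar
        if mask.any():
            centers[j] = (rows[mask] * col[mask]).sum() / col[mask].sum()
    return centers

def classify(centers_ab, centers_a, centers_b, T=10.0):
    """Equations (14)-(15): assign each bar found in frame N+2 to laser A or
    laser B by comparing against frames N and N+1; if both thresholds pass,
    the smaller difference wins. Unmatched bars are labeled '?'."""
    da = np.abs(centers_ab - centers_a)
    db = np.abs(centers_ab - centers_b)
    labels = np.full(centers_ab.shape, '?', dtype='<U1')
    labels[(da <= T) & (da <= db)] = 'A'
    labels[(db <= T) & (db < da)] = 'B'
    return labels
```

Comparisons against NaN are False, so columns with no detected bar stay labeled '?'.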

2.4. Hand–Eye Calibration

Hand–eye calibration is required to establish the relationship between the measuring arm and the camera. In the system shown in Figure 1, assuming that there is a fixed point P_board on the calibration board, its coordinate P_base^i in the base frame when the measuring arm takes pose i satisfies the following relationship:
P_base^i = T_tool→base^i T_cam→tool^i T_world→cam^i P_board,    (16)
where T_tool→base^i is the transformation matrix from the tool coordinate system at the end of the measuring arm to the base frame in pose i, which is fed back directly by the measuring arm; T_cam→tool^i is the transformation from the camera coordinate system to the tool coordinate system, which is the target relationship to be solved in the hand–eye calibration; and T_world→cam^i is the transformation from the points on the calibration plate to the camera coordinate system, which can be solved using Zhengyou Zhang's calibration method [12]. Similarly, in the j-th pose:
P_base^j = T_tool→base^j T_cam→tool^j T_world→cam^j P_board.    (17)
As the calibration plate is stationary with respect to the measuring arm base:
P_base^j = P_base^i.    (18)
Combining Equations (16)–(18), we have:
T_tool→base^i T_cam→tool^i T_world→cam^i = T_tool→base^j T_cam→tool^j T_world→cam^j.    (19)
Since the camera is rigidly fixed to the end of the measuring arm, the hand–eye transform is the same in poses i and j, i.e., T_cam→tool^i = T_cam→tool^j = X. Rearranging Equation (19), we then have:
(T_tool→base^j)^{-1} T_tool→base^i X = X T_world→cam^j (T_world→cam^i)^{-1}.    (20)
Letting
A = (T_tool→base^j)^{-1} T_tool→base^i,    (21)
B = T_world→cam^j (T_world→cam^i)^{-1},    (22)
Equation (20) can be written as:
A X = X B.    (23)
For Equation (23), the method in [13] was used to solve for X, the relationship between the camera coordinate system and the tool coordinate system at the end of the measuring arm.
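Reference [13] solves A X = X B with quaternions; as an illustrative alternative under the same model, a linear least-squares solver can be sketched as follows. This is our own sketch, not the authors' implementation: the rotation is recovered from the null space of a Kronecker-product system and the translation from the stacked linear relation (R_A − I) t_X = R_X t_B − t_A.

```python
import numpy as np

def solve_ax_xb(As, Bs):
    """Least-squares solution of A_i X = X B_i for a rigid transform X,
    given lists of 4x4 homogeneous transforms As and Bs.
    Rotation: vec(Ra Rx - Rx Rb) = (I kron Ra - Rb^T kron I) vec(Rx) = 0."""
    K = np.vstack([np.kron(np.eye(3), A[:3, :3]) - np.kron(B[:3, :3].T, np.eye(3))
                   for A, B in zip(As, Bs)])
    _, _, Vt = np.linalg.svd(K)
    Rx = Vt[-1].reshape(3, 3, order='F')   # null vector, column-major vec
    if np.linalg.det(Rx) < 0:
        Rx = -Rx                           # resolve the sign ambiguity
    U, _, Vt2 = np.linalg.svd(Rx)
    Rx = U @ Vt2                           # project onto SO(3)
    # Translation: (Ra - I) tx = Rx tb - ta, stacked over all pairs.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.hstack([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx, *_ = np.linalg.lstsq(M, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

At least two pose pairs with non-parallel rotation axes are required for a unique solution, which is why multiple arm postures are recorded in Section 3.3.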

2.5. Laser Plane Extraction

As shown in Figure 5, the relationship (R, t) in Equation (2) between the circle grid board and the camera can be determined from the known camera parameters in Equations (4) and (5). With R and t, we obtained the laser line positions in the camera coordinate system at one board position (height A). Fixing the camera and moving the circle grid board to another position, we obtained the laser line positions at height B in the same manner. The laser light plane was then calculated from the lines at heights A and B.
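The plane fit over the laser line points collected at the two board heights can be sketched via SVD as below; this is our own minimal illustration of the fitting step, with an assumed function name and interface.

```python
import numpy as np

def fit_light_plane(points):
    """Fit a*X + b*Y + c*Z + D = 0 (unit normal) to the 3D laser-line points
    taken at the two board heights, by SVD of the centered point cloud."""
    P = np.asarray(points, float)
    centroid = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - centroid)
    n = Vt[-1]               # normal = direction of least variance
    D = -n @ centroid
    return n[0], n[1], n[2], D
```

The two lines must not be collinear (hence the two distinct board heights), otherwise the plane normal is not uniquely determined.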

3. Experiment

3.1. The Hardware System

Figure 6 shows the hardware system used in this experiment. The system mainly consisted of a flexible 7-axis measuring arm, laser A, laser B, and a 2D camera. The hardware parameters were as follows:
(1) Precision of the measuring arm: 0.02 mm
(2) For the 2D camera:
    (a) Field of view: 62–72 mm
    (b) Acquisition rate: 80 frames per second (FPS)
    (c) Resolution: 3072 × 2048 pixels
    (d) Working resolution: 2658 × 800 pixels
(3) For the line structured light 3D camera formed by the camera and laser:
    (a) Z direction resolution: 0.01–0.012 mm
    (b) Laser line resolution (near–far): 0.025–0.028 mm
    (c) Laser wavelength: 405 nm
Figure 6 also shows the camera used for calibration, and Figure 7 shows the circular-array calibration plate used for hand–eye calibration, together with the metal gauge block used to verify the calibration results. For the calibration plate, the circle spacing was 3.75 mm, the circle diameter was 1.875 mm, and the accuracy was 0.001 mm; the material was ceramic.

3.2. Calibration of the Camera and Laser Light Plane

First, the camera was calibrated; the calibrated camera parameters were then used to calibrate the optical planes of lasers A and B. As shown in Figure 8, 15 pictures were collected from various angles, and the internal parameters of the camera were obtained using Zhengyou Zhang's calibration method: f_x = 5145.362, f_y = 5153.664, c_x = 1517.375, and c_y = 1023.363. Among the obtained distortion parameters, the radial terms were k_1 = 0.09903, k_2 = 0.195427, and k_5 = 0.41673, while the tangential terms were k_3 = 0.000065974 and k_4 = 0.0000978477.
As shown in Figure 9, the upper and lower graphs on the left were used to calibrate the optical plane of laser B, while the upper and lower graphs on the right were used to calibrate the optical plane of laser A. The calibrated equation of the light plane of laser A was as follows:
(A_a, B_a, C_a, D_a) = (0.659905, 0.551732, 0.51, 68.7215),
while that for laser B was:
(A_b, B_b, C_b, D_b) = (0.76417, 0.543156, 0.34787, 49.616).
The accuracy of the calibration was verified by measuring a block of height 20 mm and examining the statistical distribution of the measured heights, as explained next. As shown in Figure 10, a gauge block with a height of 20 mm was used for verification; the abscissa represents the measured height (in mm), and the ordinate represents the count of each height value. The mean height was 19.989 mm, and the standard deviation (std) of the height was 0.006 mm.

3.3. Hand–Eye Calibration

As shown in Figure 11, the joint arm with the camera photographed the calibration plate in different postures, and the coordinates fed back by the joint arm were recorded. A total of five pose sets were captured.
Solving Equation (23), we obtained:
X = [ 0.42684    0.66270    0.61532      4.1373 ;
      0.904314   0.3163     0.286604    21.393  ;
      0.004728   0.678785   0.73432    112.962  ;
      0          0          0            1      ].

3.4. Scanning Test

The calibration result X and laser light plane parameters were used for scanning and testing.
Figure 12 shows a scan of the calibration plate (note that only a single scan pass was completed), Figure 13 shows a correct scan of a complex surface, and Figure 14 shows the scan of a standard ball.
Figure 15 shows a white flat board, scanned with a flatness of 0.015 mm.
Figure 15 (right) shows the point cloud obtained from the white flat board. All points were fit to a plane, and the distance from each point to the plane was calculated, yielding the statistical histogram shown in Figure 16. The abscissa represents the distance (in mm), while the ordinate is the number of points in the distribution; 96% of the points lie within ±0.05 mm. The main deviation was due to the hand–eye calibration.
In the scans, the determination threshold T in Equations (14) and (15) was set to 10 pixels. If both Equation (14) and Equation (15) were satisfied, the assignment with the smaller difference was chosen.

4. Conclusions

In this paper, we presented a cross-line laser scanning system based on an articulated measuring arm. We showed how a single-line laser scanning technique can be extended to a cross-line laser scanning system in order to improve scanning efficiency. Moreover, we provided our implementation details, through which an object point cloud can be obtained, and showed how to disambiguate the cross-line strips in one frame by comparing the pixel positions extracted throughout the frame sequence. We experimentally demonstrated that high-quality, accurate scans can be achieved while reducing the time required to scan a complex surface; in particular, the frame rate of 3D points was 1.33 times that of single-line structured light with the same 2D camera. However, further work is needed to increase the proportion of frames in which both lasers are on simultaneously, which may further improve scanning efficiency through the periodic control of the two lasers.

Author Contributions

Conceptualization, D.T.; methodology, Z.W., Y.Y. and D.T.; software, Z.W.; validation, D.T., Z.W. and C.L.; formal analysis, D.T. and C.L.; investigation, D.T., Z.W. and Y.Y.; resources, D.T.; data curation, D.T.; writing—original draft preparation, D.T. and Z.W.; writing—review and editing, D.T. and C.L.; visualization, Y.Y. and C.L.; supervision, D.T.; project administration, D.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Lin, A.C.; Hui-Chin, C. Automatic 3D measuring system for optical scanning of axial fan blades. Int. J. Adv. Manuf. Technol. 2011, 57, 701–717.
2. Mei, Q.; Gao, J.; Lin, H.; Chen, Y.; Yunbo, H.; Wang, W.; Zhang, G.; Chen, X. Structure light telecentric stereoscopic vision 3D measurement system based on Scheimpflug condition. Opt. Lasers Eng. 2016, 86, 83–91.
3. Choi, K.H.; Song, B.R.; Yoo, B.S.; Choi, B.H.; Park, S.R.; Min, B.H. Laser scan-based system to measure three-dimensional conformation and volume of tissue-engineered constructs. Tissue Eng. Regen. Med. 2013, 10, 371–379.
4. Huang, K. Research on the Key Technique of Flexible Measuring Arm Coordinate Measuring System; Huazhong University of Science and Technology: Wuhan, China, 2010.
5. Chen, J.; Wu, X.; Wang, M.Y.; Li, X. 3D shape modeling using a self-developed hand-held 3D laser scanner and an efficient HT-ICP point cloud registration algorithm. Opt. Laser Technol. 2013, 45, 414–423.
6. Galantucci, L.M.; Piperi, E.; Lavecchia, F.; Zhavo, A. Semi-automatic low cost 3D laser scanning system for reverse engineering. Procedia CIRP 2015, 28, 94–99.
7. Zhang, X.P.; Wang, J.Q.; Zhang, Y.X.; Wang, S.; Xie, F. Large-scale three-dimensional stereo vision geometric measurement system. Acta Opt. Sin. 2012, 32, 140–147.
8. Zhang, L.; Ye, Q.; Yang, W.; Jiao, J. Weld line detection and tracking via spatial-temporal cascaded hidden Markov models and cross structured light. IEEE Trans. Instrum. Meas. 2014, 63, 742–753.
9. Yu, Z.J.; Wang, W.; Wang, S.; Chang, Z.J. An online matching method for binocular vision measurement by using cross line structured light. Semicond. Optoelectron. 2017, 38, 445–458.
10. Bleier, M.; Nüchter, A. Low cost 3D laser scanning in air or water using self-calibrating structured light. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 105.
11. Liu, S.; Tan, Q.; Zhang, Y. Shaft diameter measurement using structured light vision. Sensors 2015, 15, 19750–19767.
12. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
13. Chou, J.C.; Kamel, M. Finding the position and orientation of a sensor on a robot manipulator using quaternions. Int. J. Robot. Res. 1991, 10, 240–254.
Figure 1. Schematic measuring arm-mounted cross-line structured-laser scanning system.
Figure 2. Cross-line projector with a single fixed camera.
Figure 3. Cross-line laser work period.
Figure 4. Laser A opened (top), laser B opened (middle), and lasers A and B opened (bottom).
Figure 5. Schematic of the circle grid board used to extract the laser line plane.
Figure 6. Hardware system.
Figure 7. Calibration board and gauge blocks.
Figure 8. Calibration of the camera. Laser B is fixed by hot glue.
Figure 9. Calibration of the laser line plane.
Figure 10. Histogram of height.
Figure 11. One of the six postures captured.
Figure 12. Calibration board scanning result.
Figure 13. Air pod scanning result.
Figure 14. Standard ball scanning result.
Figure 15. Flat board and the scanning result obtained.
Figure 16. Histogram of distance to plane.

Tai, D.; Wu, Z.; Yang, Y.; Lu, C. A Cross-Line Structured Light Scanning System Based on a Measuring Arm. Instruments 2023, 7, 5. https://doi.org/10.3390/instruments7010005
