Article

A Novel Calibration Method of Line Structured Light Plane Using Spatial Geometry

Huiping Gao, Guili Xu and Zhongchen Ma

1 College of Automation, Nanjing University of Aeronautics and Astronautics, Jiangjun Road, Nanjing 211106, China
2 School of Computer Science and Communication Engineering, Jiangsu University, Xuefu Road, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(13), 5929; https://doi.org/10.3390/s23135929
Submission received: 22 May 2023 / Revised: 19 June 2023 / Accepted: 22 June 2023 / Published: 26 June 2023
(This article belongs to the Section Optical Sensors)

Abstract

The line structured light plane calibration method using a plane target cannot produce satisfactory calibration results due to inaccurate positioning of the calibrated points. Field of view noise and sensor noise affect the extraction of the target light stripe and the calculation of the camera parameters during calibration. These factors cause the calculated coordinates of the calibrated points to deviate and thus degrade the light plane calibration. To solve this problem, we propose a new method to calculate the calibrated points based on spatial geometry. Firstly, for the projection line corresponding to a feature point on the light stripe and the corresponding line on the target, the common perpendicular of these two lines is established; since the midpoint of the common perpendicular minimizes the sum of the squared distances to the two lines, it is taken as the calibrated point. Secondly, the target is moved to different positions, and non-collinear calibrated points are calculated. Finally, the parameters of the light plane are obtained by fitting these calibrated points. The method requires only a checkerboard target and has a simple calibration process. The experimental results show that the RMS error of the proposed calibration method is 0.011 mm, which is less than the 0.031 mm of the calibration method based on a plane target with cross-ratio invariance.

1. Introduction

Optical 3D measurement techniques are among the most important 3D measuring techniques due to advantages such as non-contact operation, high precision, and fast speed. They can be classified as passive or active according to whether an external light source is used in the measurement system [1]. The active vision technique provides a more accurate result than the passive technique, as it can acquire more information about the shape of the object with the help of an external light source. It has been widely used in many fields, such as reverse engineering [2], industrial inspection [3,4,5,6], 3D reconstruction [7], and robotics [8,9]. According to the light source, the active vision technique can be categorized into point structured light [10], line structured light, and plane structured light [11]. A line structured light measurement system usually contains a camera and one laser projector. The system projects a laser stripe onto the surface of the object and captures the distorted light stripe modulated by the object surface. Three-dimensional information about the profile of the object can then be obtained from the light stripe center and the system calibration results. The architecture of such a system is shown in Figure 1. It can thus be seen that system calibration is a basic and significant step in the whole measurement process.
Line structured light measurement system calibration includes camera calibration and light plane calibration. Camera calibration is necessary to solve for the intrinsic parameters of the camera, and many studies [12,13,14,15,16] have addressed it. The present paper concentrates on the calibration of the light plane. According to the shape of the calibration target, line structured light plane calibration methods can be divided into three-dimensional (3D), two-dimensional (2D), and one-dimensional (1D) methods.
Among 3D light plane calibration methods, Huynh [17] proposed a calibration method based on the principle of cross-ratio invariance using a 3D target. The intersection points of the light stripe and the lines on which the collinear points of the target were located were used as calibrated points, and the light plane was then obtained by fitting these points. However, the 3D target had to consist of two or three mutually orthogonal planes, and it was difficult to obtain high-quality images due to light shielding between the planes. Liu et al. [18] proposed a calibration method based on a single ball target, in which the coefficients of the light plane equation were obtained by calculating the intersection plane of the sphere and the cone determined by the light stripe on the ball and the camera center. However, the extraction of the spherical contour is easily affected by the environment, and an inaccurate contour degrades the calibration accuracy of the light plane. A movable parallel-cylinder target was adopted to calibrate the light plane in [19]. Two ellipses were obtained from the intersection of the light plane and the target, and the equations of the two ellipses and their projected images were established based on the perspective projection transformation. The light plane equation was then calculated under the constraint that the minor axis of each ellipse equals the diameter of the cylinder. However, the diameter error and the parallelism error between the two cylinders affect the light plane calibration accuracy. Pan et al. [5] proposed a light plane calibration method based on a multi-tooth free-moving target, which can be implemented with a camera equipped with an optical filter. This method took the intersection points of the light stripe and the multi-tooth target edges as feature points, and compensated for the positioning deviation of the image feature points based on an uncertainty model. To calibrate the light plane, the method first calculated the coordinates of the feature points in the target coordinate system according to cross-ratio invariance; it then combined the camera intrinsic parameters with the vanishing point of the line on which the feature points were located to calculate the coordinates of the feature points in the camera coordinate system according to the camera perspective projection model; finally, it fitted the non-collinear feature points to obtain the light plane equation parameters. Zhu et al. [20] used a single cylindrical target to calibrate the light plane. The laser projected onto the cylindrical target forms an ellipse; according to the principle of camera perspective projection, relationship equations were constructed using the geometric characteristics of the ellipse, and the parameters of the light plane were calculated from these equations. Wu et al. [21] designed a calibration target with a trapezoidal cross-section, with a number of characteristic straight lines in the horizontal and vertical directions on its inclined surface. The straight lines were detected using the Canny operator and Hough line detection, yielding the feature points on the target. With the angle information between the projections of the collinear feature points and the optical center of the camera, the coordinates of the feature points in the camera coordinate system were calculated using the law of cosines. After that, the coordinates of the feature points on the light stripe were calculated using cross-ratio invariance. Finally, the parameters of the light plane equation were obtained by fitting the non-collinear light stripe points.
Wei [22] calibrated the light plane with a 1D target. The intersection point between the light plane and the target was obtained according to the distances between the feature points on the target. The target had to be moved repeatedly to obtain enough calibrated points for fitting the light plane.
Both 3D and 1D targets need to be designed and manufactured precisely, which is usually expensive. In contrast, a 2D target is often used for camera calibration, and its production is mature and accurate. Zhou [23] presented an on-site light plane calibration method using a planar target, in which the calibrated points were also calculated using the principle of cross-ratio invariance, and non-collinear points were obtained by moving the planar target repeatedly. Using the conversion relationship between the image coordinate system and the camera coordinate system, Yu et al. [24] solved the equation of the light stripe line in the image in the camera coordinate system, further calculated the equation of the light projection plane, and then calculated the target plane in the camera coordinate system. The intersection of these two planes is the line of the light stripe on the calibration board in the camera coordinate system. Multiple points on this line were extracted as calibration points; the target was then moved and the above process repeated to obtain non-collinear calibration points. These methods based on plane targets are free of expensive equipment and are suitable for on-site calibration. However, line structured light plane calibration methods using a plane target fail to obtain satisfactory calibration accuracy due to inaccurate positioning of the calibrated points. To obtain more accurate calibrated points, a novel light plane calibration method is proposed in this paper. According to the model of a line structured light vision sensor and the principle of perspective projection, the projection line corresponding to a feature point on the light stripe intersects the corresponding line on the target in the ideal case. However, field of view noise and sensor noise, such as lens distortion, out-of-focus blur, and poor laser quality, induce camera calibration error and light stripe extraction error, which cause the two calculated lines to be skew. Based on spatial geometry and the least squares principle, the common perpendicular of the two lines is established; since its midpoint minimizes the sum of the squared distances to the two lines, this midpoint is taken as the calibrated point. The plane target is then moved to different positions to obtain several non-collinear calibrated points. Finally, the parameters of the light plane are obtained by fitting these points.
The rest of this paper is organized as follows: Section 2 describes the model of the line structured light vision sensor and the proposed light plane calibration method in detail. Section 3 presents the experiments and evaluates the performance of the proposed method. Section 4 concludes the paper.

2. Methods for Light Plane Calibration

2.1. Model of the Line Structured Light Vision Sensor

The measurement model of the line structured light vision sensor [25] is displayed in Figure 2. $o_c x_c y_c z_c$ is the camera coordinate frame (CCF), $ouv$ is the image coordinate frame in pixels, and $OXY$ is the image coordinate frame in millimeters.
Based on the perspective projection model, the relation between a point $P = (x_c, y_c, z_c)^T$ in the camera coordinate frame and its image coordinates $p = (u, v)^T$ is obtained as:

$$s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} \tag{1}$$
where $K = \begin{bmatrix} a_x & \gamma & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$ is the camera intrinsic parameter matrix obtained by camera calibration. $a_x$ and $a_y$ denote the effective focal lengths along the X and Y axes of the image, respectively, $(u_0, v_0)$ is the principal point, $\gamma$ is the skew of the two image axes, and $s$ is a nonzero scale factor. The light plane equation in the camera coordinate frame can be written as $a_c x_c + b_c y_c + c_c z_c + d_c = 0$. The point $P$ lies on this plane, so the mathematical model of the line structured light vision sensor can be expressed as:
$$s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix}, \qquad a_c x_c + b_c y_c + c_c z_c + d_c = 0 \tag{2}$$
If the camera intrinsic parameter matrix and the parameters of the light plane are known, as well as the image coordinates of a measured point, then the 3D coordinates of the measured point in the camera coordinate frame can be calculated using Equation (2). In this paper, we assume that the camera calibration has been completed [12]; our task is to solve for the parameters of the light plane.
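To make the model concrete, the following minimal sketch (Python with NumPy; the function name and interface are ours, not from the paper) inverts Equation (2): the pixel is back-projected through $K$ to a viewing ray, and the ray is scaled so that the resulting point satisfies the light plane equation.

```python
import numpy as np

def point_from_pixel(u, v, K, plane):
    """Solve Equation (2): recover the 3D camera-frame point for pixel
    (u, v), assuming it lies on the plane a*x + b*y + c*z + d = 0."""
    a, b, c, d = plane
    # Back-project: [x_c, y_c, z_c]^T is proportional to K^{-1} [u, v, 1]^T
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))
    # Choose the scale s so that the point satisfies the plane equation
    s = -d / (a * ray[0] + b * ray[1] + c * ray[2])
    return s * ray  # (x_c, y_c, z_c)
```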

2.2. The Proposed Line Structured Light Plane Calibration Method

A novel light plane calibration method using a free-moving planar target is proposed in this paper. Firstly, the three-dimensional coordinates of the calibrated points are calculated using the spatial geometry of the line structured light measurement system as well as the least squares principle; then, the parameters of the light plane are obtained by fitting the calibrated points. The steps of the light plane calibration are described as follows.
As in [23], the intersection of the light plane and a grid line on the target is used as the calibrated point. According to the model of the line structured light sensor, as shown in Figure 3, the calibrated point $P$ is the intersection of the corresponding light projection line $o_c p$ and the corresponding target line $AB$ when errors are ignored. However, field of view noise and sensor noise, such as lens distortion, out-of-focus blur, and poor laser quality, induce camera calibration error and light stripe extraction error, which cause the lines $o_c p$ and $AB$ to be skew, with no intersection point. Therefore, combining spatial geometry and the least squares principle, we select the midpoint of the common perpendicular of the lines $o_c p$ and $AB$ as the calibrated point, since the sum of the squared distances from this point to the two lines is the smallest.
As shown in Figure 3, the spatial point $P$ is the intersection point of the lines $o_c p$ and $AB$, and its projection is the image point $p$ with pixel coordinates $[u_p, v_p]^T$. The coordinates of the point $o_c$ are $[0, 0, 0]^T$ in the CCF, and the $z_c$ coordinate of $p$ is the focal length $f$ of the camera. According to the pin-hole model of the camera, the coordinates of $p$ in the CCF can be obtained as:
$$\begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix} = \begin{bmatrix} dX & 0 & 0 \\ 0 & dY & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u_p - u_0 \\ v_p - v_0 \\ f \end{bmatrix} \tag{3}$$
where $dX$ and $dY$ are the sizes of one pixel along the X and Y axes, respectively.
The coordinates of the points A and B on the target in the CCF can be given by:
$$\begin{bmatrix} x_i \\ y_i \\ z_i \end{bmatrix} = R \begin{bmatrix} L_{Xi} \\ L_{Yi} \\ 0 \end{bmatrix} + T \tag{4}$$
where $i \in \{A, B\}$, and $[L_{Xi}, L_{Yi}, 0]^T$ and $[x_i, y_i, z_i]^T$ are the coordinates of point $A$ or $B$ in the local world coordinate frame on the target and in the camera coordinate frame, respectively. $R$ and $T$ are the rotation matrix and translation vector from the local world coordinate frame to the camera coordinate frame, which can be calculated with Zhang's method [12].
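Continuing the sketch above, the two coordinate conversions translate directly into code (illustrative names; `R` and `T` come from Zhang's calibration):

```python
def pixel_to_ccf(u_p, v_p, u0, v0, dX, dY, f):
    """Equation (3): coordinates of the image point p in the CCF;
    pixel offsets are scaled by the pixel sizes and z is set to f."""
    return np.array([dX * (u_p - u0), dY * (v_p - v0), f])

def target_point_to_ccf(LX, LY, R, T):
    """Equation (4): a target point (LX, LY, 0) in the local world
    frame mapped to the CCF with the extrinsics (R, T)."""
    return R @ np.array([LX, LY, 0.0]) + T
```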
Since the coordinates of $o_c$, $p$, $A$, and $B$ in the CCF are now known, the equations of the spatial straight lines $o_c p$ and $AB$ are obtained as:
$$f(x, y, z):\quad \frac{x - 0}{a_1} = \frac{y - 0}{b_1} = \frac{z - 0}{c_1} \tag{5}$$

$$g(x, y, z):\quad \frac{x - x_a}{a_2} = \frac{y - y_a}{b_2} = \frac{z - z_a}{c_2} \tag{6}$$
where $[a_1, b_1, c_1]$ and $[a_2, b_2, c_2]$ are the direction vectors of $o_c p$ and $AB$, respectively, and $[x_a, y_a, z_a]^T$ are the coordinates of point $A$ in the CCF.
With $f(x, y, z)$ and $g(x, y, z)$, we can obtain the common perpendicular of $o_c p$ and $AB$, which intersects $o_c p$ and $AB$ at $s_1$ and $s_2$; we take the midpoint $s$ of the segment $s_1 s_2$ as the calibrated point. A diagram of the calibrated point calculation is shown in Figure 4. Other calibrated points can be obtained similarly: the target is moved to a different orientation and the above procedure is repeated until enough non-collinear calibrated points are obtained.
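The calibrated point itself follows from the standard closest-points construction for two skew lines. The sketch below (our own formulation of the geometry the paper describes) parameterizes each line as a point plus a direction, solves the two normal equations for the feet $s_1$ and $s_2$ of the common perpendicular, and returns their midpoint:

```python
def common_perpendicular_midpoint(p1, d1, p2, d2):
    """Midpoint of the common perpendicular of two skew lines, each
    given as (point, direction).  This midpoint minimizes the sum of
    squared distances to the two lines and serves as the calibrated point."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b              # vanishes only for parallel lines
    t = (b * e - c * d) / denom        # parameter of s1 on line 1
    s = (a * e - b * d) / denom        # parameter of s2 on line 2
    s1 = p1 + t * d1                   # foot on the line o_c p
    s2 = p2 + s * d2                   # foot on the line A B
    return 0.5 * (s1 + s2)
```

For the lines of the paper, `p1` is the camera center $[0, 0, 0]^T$ with direction $o_c p$, and `p2` is point $A$ with direction $AB$.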
Finally, the least squares method is used to solve for the parameters of the light plane by fitting these calibrated points. The objective is to minimize the sum of the squared distances from the calibrated points to the fitted plane:
$$f(a_c, b_c, c_c, d_c) = \sum_{i=1}^{k} \left( \frac{a_c x_{ci} + b_c y_{ci} + c_c z_{ci} + d_c}{\sqrt{a_c^2 + b_c^2 + c_c^2}} \right)^2 \tag{7}$$
where $[x_{ci}, y_{ci}, z_{ci}]^T$ are the coordinates of the $i$th calibrated point in the CCF.
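Minimizing Equation (7) is the classical total least squares plane fit, which has a closed-form solution: the optimal normal is the singular vector of the centered point set with the smallest singular value. A sketch, again with illustrative names:

```python
def fit_light_plane(points):
    """Fit a_c*x + b_c*y + c_c*z + d_c = 0 to 3D calibrated points by
    minimizing the sum of squared point-to-plane distances (Equation (7))."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                    # (a_c, b_c, c_c), unit length
    d_c = -normal @ centroid           # optimal plane passes through the centroid
    return np.append(normal, d_c)      # [a_c, b_c, c_c, d_c]
```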
In summary, the procedures of the proposed light plane calibration method are as follows (a compact sketch of steps (6)–(10) follows the list):
(1) Correct the distortion of the calibration images.
(2) Extract the light stripe centers in all calibration images.
(3) Fit a line to the extracted light stripe centers.
(4) Fit a line to the image coordinates of the horizontally collinear corner points on the checkerboard.
(5) Compute the intersection points of the above two lines on the images.
(6) Compute the line linking each intersection point on the image and the camera optical center in the CCF.
(7) Compute the line on which the horizontally collinear corner points lie on the checkerboard in the CCF.
(8) Compute the common perpendicular of the two lines from steps (6) and (7).
(9) Take the midpoint of the common perpendicular as a calibrated point.
(10) Estimate the equation of the light plane in the CCF by fitting the non-collinear calibrated points.
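Assuming the helper functions sketched above, steps (6)–(10) can be strung together roughly as follows; the observation format is our own simplification:

```python
def calibrate(observations, K, dX, dY, f):
    """Each observation: the image intersection point (u, v), the two
    target corners (LX, LY) defining the grid line, and the pose (R, T)."""
    u0, v0 = K[0, 2], K[1, 2]
    calibrated_points = []
    for (u, v), corner_a, corner_b, R, T in observations:
        p = pixel_to_ccf(u, v, u0, v0, dX, dY, f)     # step (6): ray o_c -> p
        A = target_point_to_ccf(*corner_a, R, T)      # step (7): line A B
        B = target_point_to_ccf(*corner_b, R, T)
        calibrated_points.append(                     # steps (8)-(9)
            common_perpendicular_midpoint(np.zeros(3), p, A, B - A))
    return fit_light_plane(calibrated_points)         # step (10)
```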

3. Experiment Results and Discussion

To verify the feasibility and the effectiveness of the proposed method, we conducted simulated and physical experiments. The simulation experiment determines the influence of image noise on the calibration accuracy. The physical experiment compares the accuracy of our method against Zhou's method [23] and Yu's method [24].

3.1. Simulation Experiment

The configuration parameters for the simulation experiment are as follows: the camera resolution is 1600 × 1200 pixels, the focal length is 8 mm, and the intrinsic matrix of the camera is

$$K = \begin{bmatrix} 1000 & 0 & 800 \\ 0 & 1000 & 600 \\ 0 & 0 & 1 \end{bmatrix}$$

The virtual checkerboard is 10 × 7, and the size of each grid is 20 mm × 20 mm. The equation of the light plane is $1.103x - 0.241y - 0.856z + 390.793 = 0$.
The rotation matrices $R_1$ and $R_2$ and the translation vectors $t_1$ and $t_2$ from the virtual target coordinate frame to the camera coordinate frame are:

$$R_1 = \begin{bmatrix} 0.9997 & 0.0079 & 0.0232 \\ 0.0105 & 0.9936 & 0.1123 \\ 0.0222 & 0.1126 & 0.9934 \end{bmatrix}, \quad t_1 = \begin{bmatrix} 10.00 \\ 15.00 \\ 460.00 \end{bmatrix}$$

$$R_2 = \begin{bmatrix} 0.9987 & 0.0017 & 0.0503 \\ 0.0095 & 0.9878 & 0.1557 \\ 0.0494 & 0.1560 & 0.9865 \end{bmatrix}, \quad t_2 = \begin{bmatrix} 10.00 \\ 20.00 \\ 450.00 \end{bmatrix}$$
In the experiment, the target was placed in two different positions. With the preset rotation matrix $R$ and translation vector $t$, the coordinates of the feature points on the target plane in the camera coordinate system can be calculated. By intersecting the line on which the horizontal feature points lie with the preset light plane, the exact coordinates of the calibrated points in the camera coordinate system can be obtained. Combined with the camera intrinsic parameters, the error-free projected image coordinates of the calibrated points can then be obtained. To verify the influence of the light stripe center extraction error on the light plane calibration accuracy, Gaussian noise with zero mean and a standard deviation from 0.1 to 1 pixel, in steps of 0.1 pixels, was added to the light stripe centers. For each noise level, 100 experiments were carried out to calculate the relative errors in the light plane parameters. Note that the true and calculated light plane equations were both normalized to $Ax + By - z + D = 0$, and the relative errors of the plane parameters $A$, $B$, and $D$ were then estimated. The relative errors of the calibration results for our method and Zhou's method at different noise levels are shown in Figure 5.
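A sketch of this Monte Carlo protocol (the `recalibrate` argument stands for the full pipeline above; the normalization matches the $Ax + By - z + D = 0$ form used in the text):

```python
rng = np.random.default_rng(0)

def relative_errors(true_plane, est_plane):
    """Normalize both planes to A*x + B*y - z + D = 0, then compare A, B, D."""
    t = true_plane / -true_plane[2]
    e = est_plane / -est_plane[2]
    return np.abs((e - t) / t)[[0, 1, 3]]

def noise_trial(stripe_centers, sigma, recalibrate):
    """One trial: perturb the stripe centers with zero-mean Gaussian noise
    of standard deviation sigma (pixels) and recalibrate the light plane."""
    noisy = stripe_centers + rng.normal(0.0, sigma, stripe_centers.shape)
    return recalibrate(noisy)
```

Averaging `relative_errors` over 100 such trials per noise level reproduces the kind of curves plotted in Figure 5.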
As shown in Figure 5, the relative errors increase with the noise level for both our method and Zhou's method [23], and the robustness of the proposed method is comparable to that of Zhou's method. Since the extraction accuracy of the light stripe center usually reaches 0.1–0.2 pixels, the simulation results indicate that the relative error in the light plane parameters obtained with our method can stay within 0.5%. In this noise range, the proposed method is slightly better than Zhou's method.

3.2. Physical Experiments

The line structured light vision sensor is composed of a camera with a resolution of 1600 × 1200 pixels and a single line laser projector with a wavelength of 650 nm. The calibration equipment is displayed in Figure 6.
The camera intrinsic parameters calibrated with Zhang’s method [12] are as follows:
$$K = \begin{bmatrix} 1804.75 & 0 & 791.43 \\ 0 & 1805.10 & 601.31 \\ 0 & 0 & 1 \end{bmatrix}$$
The radial distortion coefficients of the camera are $k_1 = 0.1059$ and $k_2 = 0.1710$.
To verify the effectiveness of our method, Zhou's method [23] and Yu's method [24] were compared with the proposed method. A checkerboard with a grid spacing of 30 mm was adopted as the target in the light plane calibration experiments. We placed the target in front of the line structured light vision sensor five times; three captured images were used to calibrate the light plane, and the other two were used as test images to verify the accuracy of the calibration results. The images used in the calibration experiments are shown in Figure 7. Firstly, we corrected the distortion of the calibration images, then extracted the coordinates of the light stripe centers and corner points on the undistorted images, and calculated the calibrated points used for fitting the light plane according to the proposed method, Zhou's method, and Yu's method. We used 18 calibrated points to calibrate the light plane. The three-dimensional coordinates in the CCF of the calibrated points used in the proposed method are shown in Figure 8, where the blue '+' marks represent the calibrated points. The calibration results for Zhou's method, Yu's method, and our method are $1.724x - 0.109y - z + 375.073 = 0$, $1.713x - 0.111y - z + 375.467 = 0$, and $1.727x - 0.111y - z + 374.997 = 0$, respectively.
For the test images, the intersection points of the light plane and the checkerboard lines in the horizontal direction are called test points. The distance between any two test points calculated with the calibrated line structured light vision sensor was taken as the measured distance $d_p$. The test points in the local world coordinate frame were calculated with the principle of cross-ratio invariance and taken as approximate ground-truth values; the ideal distance between two test points was taken as $d_r$, and the distance deviation was recorded as $\Delta d$. The distances between any two test points on the same light stripe are as follows:
$$d_r = \sqrt{(x_{w,i} - x_{w,j})^2 + (y_{w,i} - y_{w,j})^2 + (z_{w,i} - z_{w,j})^2} \tag{8}$$

$$d_p = \sqrt{(x_{c,i} - x_{c,j})^2 + (y_{c,i} - y_{c,j})^2 + (z_{c,i} - z_{c,j})^2} \tag{9}$$
where $[x_{w,i}, y_{w,i}, z_{w,i}]^T$ and $[x_{c,i}, y_{c,i}, z_{c,i}]^T$ are the local world coordinates of the $i$th point and the corresponding 3D camera coordinates, respectively.
The distance deviations are shown in Table 1. From Table 1, it can be seen that the RMS error of Zhou’s method is about 0.031 mm, the RMS error of Yu’s method is about 0.075 mm, and that of the proposed method is 0.011 mm. The calibration accuracy of our method is higher than that of Zhou’s method and Yu’s method.
The coordinates of test points in the CCF calculated with Zhou’s method, Yu’s method and the proposed method are shown in Table 2.

4. Conclusions

In this paper, a novel light plane calibration method is proposed based on spatial geometry and the principle of least squares. The method is validated through simulation and physical experiments. In the simulation experiments, our method is compared with the calibration method based on cross-ratio invariance [23] under different image noise levels; the results show that our method is robust to noise and slightly better than the latter. In the physical experiments, our method is compared with Zhou's method [23] and Yu's method [24], and the results show that our method has the highest calibration accuracy: the RMS error of our method is 0.011 mm, which is less than the 0.031 mm of Zhou's method and the 0.075 mm of Yu's method.

Author Contributions

Methodology, experiment, and writing, H.G.; review and funding, G.X.; review and funding, Z.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant nos. 62073161 and 62006098).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CCF: Camera coordinate frame

References

1. Frauel, Y.; Tajahuerce, E.; Matoba, O.; Castro, A.; Javidi, B. Comparison of passive ranging integral imaging and active imaging digital holography for three-dimensional object recognition. Appl. Opt. 2004, 43, 452–462.
2. Lu, K.; Wang, W. A multi-sensor approach for rapid and precise digitization of free-form surface in reverse engineering. Int. J. Adv. Manuf. Technol. 2015, 79, 1983–1994.
3. Kapłonek, W.; Nadolny, K. Laser methods based on an analysis of scattered light for automated, in-process inspection of machined surfaces: A review. Optik 2015, 126, 2764–2770.
4. Mavrinac, A.; Chen, X.; Alarcon-Herrera, J.L. Semiautomatic model-based view planning for active triangulation 3-D inspection systems. IEEE/ASME Trans. Mechatron. 2014, 20, 799–811.
5. Pan, X.; Liu, Z.; Zhang, G. Line structured-light vision sensor calibration based on multi-tooth free-moving target and its application in railway fields. IEEE Trans. Intell. Transp. Syst. 2020, 22, 5762–5771.
6. Miao, J.; Tan, Q.; Wang, S.; Liu, S.; Chai, B.; Li, X. A vision measurement method for the gear shaft radial runout with line structured light. IEEE Access 2020, 9, 5097–5104.
7. Song, Z.; Jiang, H.; Lin, H.; Tang, S. A high dynamic range structured light means for the 3D measurement of specular surface. Opt. Lasers Eng. 2017, 95, 8–16.
8. Muhammad, J.; Altun, H.; Abo-Serie, E. Welding seam profiling techniques based on active vision sensing for intelligent robotic welding. Int. J. Adv. Manuf. Technol. 2017, 88, 127–145.
9. Wang, Z.; Fan, J.; Jing, F.; Deng, S.; Zheng, M.; Tan, M. An efficient calibration method of line structured light vision sensor in robotic eye-in-hand system. IEEE Sens. J. 2020, 20, 6200–6208.
10. Nguyen, T.T.; Slaughter, D.C.; Max, N.; Maloof, J.N.; Sinha, N. Structured light-based 3D reconstruction system for plants. Sensors 2015, 15, 18587–18612.
11. Salvi, J.; Fernandez, S.; Pribanic, T.; Llado, X. A state of the art in structured light patterns for surface profilometry. Pattern Recognit. 2010, 43, 2666–2680.
12. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
13. Zhang, Z. Camera calibration with one-dimensional objects. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 892–899.
14. Liu, Z.; Wu, Q.; Wu, S.; Pan, X. Flexible and accurate camera calibration using grid spherical images. Opt. Express 2017, 25, 15269–15285.
15. Sun, J.; Cheng, X.; Fan, Q. Camera calibration based on two-cylinder target. Opt. Express 2019, 27, 29319–29331.
16. Chuang, J.H.; Ho, C.H.; Umam, A.; Chen, H.Y.; Hwang, J.N.; Chen, T.A. Geometry-based camera calibration using closed-form solution of principal line. IEEE Trans. Image Process. 2021, 30, 2599–2610.
17. Huynh, D.Q.; Owens, R.A.; Hartmann, P. Calibrating a structured light stripe system: A novel approach. Int. J. Comput. Vis. 1999, 33, 73–86.
18. Liu, Z.; Li, X.; Li, F.; Zhang, G. Calibration method for line-structured light vision sensor based on a single ball target. Opt. Lasers Eng. 2015, 69, 20–28.
19. Liu, Z.; Li, X.; Yin, Y. On-site calibration of line-structured light vision sensor in complex light environments. Opt. Express 2015, 23, 29896–29911.
20. Zhu, Z.; Wang, X.; Zhou, F.; Cen, Y. Calibration method for a line-structured light vision sensor based on a single cylindrical target. Appl. Opt. 2020, 59, 1376–1382.
21. Wu, X.; Tang, N.; Liu, B.; Long, Z. A novel high precise laser 3D profile scanning method with flexible calibration. Opt. Lasers Eng. 2020, 132, 105938.
22. Wei, Z.; Cao, L.; Zhang, G. A novel 1D target-based calibration method with unknown orientation for structured light vision sensor. Opt. Laser Technol. 2010, 42, 570–574.
23. Zhou, F.; Zhang, G. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations. Image Vis. Comput. 2005, 23, 59–67.
24. Yu, L.; Li, Y.; Liu, Y. Line structured light calibrating based on two-dimensional planar target. Chin. J. Sci. Instrum. 2020, 41, 124–131.
25. Zou, W.; Wei, Z.; Liu, F. High-accuracy calibration of line-structured light vision sensors using a plane mirror. Opt. Express 2019, 27, 34681–34704.
Figure 1. The architecture of a line structured light sensor.
Figure 2. The measurement model of the line structured light vision sensor.
Figure 3. The spatial geometry of the calibrated point for light plane calibration.
Figure 4. Diagram of calibrated point calculation.
Figure 5. Relative errors of the calibration results at different noise levels. (a–c) are for plane parameters A, B, and D, respectively.
Figure 6. Line structured light vision sensor.
Figure 7. Images used for the line structured light vision sensor calibration.
Figure 8. The 3D camera coordinates of the calibrated points on the light plane.
Table 1. Accuracy evaluation with distance between two calibrated points.

| Position | Points (i j) | d_r (mm) | d_p Zhou (mm) | Δd Zhou (mm) | d_p Yu (mm) | Δd Yu (mm) | d_p Proposed (mm) | Δd Proposed (mm) |
|---|---|---|---|---|---|---|---|---|
| No. 1 | (0 1) | 30.001 | 29.980 | −0.021 | 29.978 | −0.023 | 30.011 | 0.010 |
| | (0 2) | 60.000 | 59.966 | −0.034 | 59.953 | −0.047 | 60.019 | 0.019 |
| | (0 3) | 90.001 | 89.955 | −0.046 | 89.923 | −0.078 | 90.021 | 0.020 |
| | (0 4) | 120.000 | 119.949 | −0.051 | 119.891 | −0.109 | 120.019 | 0.019 |
| | (0 5) | 150.000 | 149.959 | −0.041 | 149.866 | −0.134 | 150.024 | 0.024 |
| | (1 2) | 30.000 | 29.985 | −0.015 | 29.975 | −0.025 | 30.007 | 0.007 |
| | (1 3) | 60.001 | 59.974 | −0.027 | 59.945 | −0.056 | 60.009 | 0.008 |
| | (1 4) | 90.001 | 89.969 | −0.032 | 89.913 | −0.088 | 90.008 | 0.007 |
| | (1 5) | 120.001 | 119.979 | −0.022 | 119.888 | −0.113 | 120.012 | 0.011 |
| | (2 3) | 30.001 | 29.989 | −0.012 | 29.970 | −0.031 | 30.002 | 0.001 |
| | (2 4) | 60.002 | 59.984 | −0.018 | 59.938 | −0.064 | 60.001 | −0.001 |
| | (2 5) | 90.001 | 89.993 | −0.008 | 89.913 | −0.088 | 90.005 | 0.004 |
| | (3 4) | 30.009 | 29.995 | −0.014 | 29.968 | −0.041 | 29.999 | −0.010 |
| | (3 5) | 60.004 | 60.004 | 0.000 | 59.942 | −0.062 | 60.003 | −0.001 |
| | (4 5) | 30.000 | 30.010 | 0.010 | 29.974 | −0.026 | 30.004 | 0.004 |
| No. 2 | (0 1) | 30.007 | 29.992 | −0.015 | 29.991 | −0.016 | 30.023 | 0.016 |
| | (0 2) | 60.027 | 59.990 | −0.037 | 59.980 | −0.047 | 60.043 | 0.016 |
| | (0 3) | 90.037 | 89.985 | −0.052 | 89.958 | −0.079 | 90.051 | 0.014 |
| | (0 4) | 120.056 | 119.995 | −0.061 | 119.942 | −0.114 | 120.064 | 0.008 |
| | (0 5) | 150.068 | 150.022 | −0.046 | 149.936 | −0.132 | 150.086 | 0.018 |
| | (1 2) | 30.022 | 29.998 | −0.024 | 29.989 | −0.033 | 30.020 | −0.002 |
| | (1 3) | 60.031 | 59.993 | −0.038 | 59.967 | −0.064 | 60.028 | −0.003 |
| | (1 4) | 90.050 | 90.003 | −0.047 | 89.951 | −0.099 | 90.041 | −0.009 |
| | (1 5) | 120.062 | 120.030 | −0.032 | 119.944 | −0.118 | 120.063 | 0.001 |
| | (2 3) | 30.010 | 29.995 | −0.015 | 29.978 | −0.032 | 30.008 | −0.002 |
| | (2 4) | 60.028 | 60.004 | −0.024 | 59.962 | −0.066 | 60.021 | −0.007 |
| | (2 5) | 90.041 | 90.031 | −0.010 | 89.955 | −0.086 | 90.043 | 0.002 |
| | (3 4) | 30.020 | 30.010 | −0.010 | 29.984 | −0.036 | 30.013 | −0.007 |
| | (3 5) | 60.032 | 60.037 | 0.005 | 59.978 | −0.054 | 60.035 | 0.003 |
| | (4 5) | 30.012 | 30.027 | 0.015 | 29.994 | −0.018 | 30.022 | −0.010 |
| RMS error | | | | 0.031 | | 0.075 | | 0.011 |
Table 2. 3D coordinates of the test points obtained using different calibration results.

| Position | Point | x (Zhou) | y (Zhou) | z (Zhou) | x (Proposed) | y (Proposed) | z (Proposed) | x (Yu) | y (Yu) | z (Yu) |
|---|---|---|---|---|---|---|---|---|---|---|
| No. 1 | 1 | 65.287 | −58.002 | 493.947 | 65.321 | −58.032 | 494.205 | 65.252 | −57.971 | 493.680 |
| | 2 | 64.597 | −28.357 | 489.528 | 64.621 | −28.368 | 489.710 | 64.553 | −28.338 | 489.197 |
| | 3 | 63.906 | 1.293 | 485.109 | 63.921 | 1.293 | 485.216 | 63.854 | 1.292 | 484.714 |
| | 4 | 63.216 | 30.946 | 480.689 | 63.220 | 30.948 | 480.723 | 63.156 | 30.917 | 480.233 |
| | 5 | 62.525 | 60.605 | 476.269 | 62.520 | 60.600 | 476.231 | 62.457 | 60.539 | 475.751 |
| | 6 | 61.834 | 90.279 | 471.846 | 61.820 | 90.258 | 471.737 | 61.758 | 90.169 | 471.269 |
| No. 2 | 1 | 63.865 | −57.918 | 491.485 | 63.897 | −57.947 | 491.737 | 63.833 | −57.889 | 491.239 |
| | 2 | 63.118 | −28.276 | 486.970 | 63.141 | −28.287 | 487.147 | 63.078 | −28.259 | 486.662 |
| | 3 | 62.372 | 1.371 | 482.455 | 62.385 | 1.371 | 482.557 | 62.324 | 1.370 | 482.084 |
| | 4 | 61.626 | 31.014 | 477.941 | 61.629 | 31.016 | 477.969 | 61.570 | 30.986 | 477.508 |
| | 5 | 60.879 | 60.673 | 473.424 | 60.874 | 60.667 | 473.380 | 60.816 | 60.610 | 472.932 |
| | 6 | 60.132 | 90.348 | 468.904 | 60.117 | 90.326 | 468.790 | 60.061 | 90.242 | 468.353 |
