Article

A Calibration Method for a Laser Triangulation Scanner Mounted on a Robot Arm for Surface Mapping

by Gerardo Antonio Idrobo-Pizo 1, José Maurício S. T. Motta 2,* and Renato Coral Sampaio 3

1 Faculty of Gama-FGA, Department of Electronics Engineering, University of Brasilia, Brasilia-DF 72.444-240, Brazil
2 Faculty of Technology-FT, Department of Mechanical and Mechatronics Engineering, University of Brasilia, Brasilia-DF 70910-900, Brazil
3 Faculty of Gama-FGA, Department of Software Engineering, University of Brasilia, Brasilia-DF 72.444-240, Brazil
* Author to whom correspondence should be addressed.
Sensors 2019, 19(8), 1783; https://doi.org/10.3390/s19081783
Submission received: 13 February 2019 / Revised: 6 March 2019 / Accepted: 12 March 2019 / Published: 14 April 2019
(This article belongs to the Special Issue Laser Sensors for Displacement, Distance and Position)

Abstract

This paper presents and discusses a method to calibrate a specially built laser triangulation sensor to scan and map the surface of hydraulic turbine blades and to assign 3D coordinates to a dedicated robot that repairs, by welding in layers, the damage on blades eroded by cavitation pitting and/or cracks produced by cyclic loading. Due to the large nonlinearities present in the camera and laser diodes, distances over a large range are difficult to measure with high precision. To improve the precision and accuracy of the laser triangulation range sensor, a calibration model is proposed that involves the parameters of the camera, lens, laser positions, and the sensor position on the robot arm relative to the robot base, so as to achieve the best accuracy within the distance range of the application. The developed sensor is composed of a CMOS camera and two laser diodes that project light lines onto the blade surface, and it relies on image processing to obtain the 3D coordinates. The distances vary from 250 to 650 mm, and the errors obtained within this range are below 1 mm. The calibration process requires a prior camera calibration and special calibration boards to calculate the correct distance between the laser diodes and the camera. The sensor position fixed on the robot arm is found by moving the robot to selected positions. The experimental procedures show the success of the calibration scheme.

1. Introduction

In the last decade, the spread of 3D scanning devices has been increasing progressively in industry, mainly for the inspection and quality control of processes that use robotic and machine vision systems, which need motion control within an unknown workspace [1,2]. The main noncontact measurement methods include visual detection [3,4] and laser scanning methods [5].
Up to now, few works have been published about sensor model calibration describing the combination of motion control with the high positioning accuracy of industrial robots (1 mm maximum error tolerance) and 3D noncontact measuring systems [1,2,6,7,8,9,10,11,12,13,14]. A robotic system can perform measurements from different angles and directions avoiding occlusion, shading problems, and insufficient data from the surface to be measured [15,16].
To achieve an accurate measurement of an object’s pose expressed in a world coordinate system using a vision system mounted on the robot, various components need to be calibrated beforehand [11,17,18]. These include the robot tool position, expressed in the robot base coordinate system; the camera pose, expressed in the robot tool coordinate system; and the object pose, expressed in the camera coordinate system. In recent years, there have been major research efforts to resolve each of these tasks individually [19]. For instance, there is work on the calibration of cameras and laser projectors to find the intrinsic and extrinsic parameters of the digitizer, i.e., the mapping from 3D object coordinates expressed in world coordinate systems to 2D image coordinates [20]. There is also research focused on robot calibration, aiming to increase the accuracy of the robot end-effector positioning by using measurements expressed in a 3D digitizer coordinate system [21]. Once all system components are individually calibrated, the object position expressed in the robot base reference system can be obtained directly from the vision sensor data.
The calibration of the complete robot-vision system can be achieved by calibrating its components or subsystems separately, taking into account that each component calibration procedure is relatively simple. If any component of the system has its relative position modified, the calibration procedure must be repeated only for that component.
Noncontact measurement systems have been analyzed and compared regarding their measurement methodology and accuracy in a comparative and analytical form in [22], considering their high sensitivity to various external factors inherent in the measuring process or in the optical characteristics of the object. However, in the case of noncontact optical scanning systems, and due to the complexity of assessing the process errors, there is no standardized method to evaluate the measurement uncertainty, as described in ISO/TS 14253-2:1999 and ISO/IEC Guide 98-3:2008, which makes it difficult to establish criteria to evaluate the performance of the measurement equipment. In ISO 10360-7:2011, for example, there is currently no specification of performance requirements for the calibration of laser scanners, fringe projection, or structured light systems.
An experimental procedure has been conceived to calibrate the relative position between the vision sensor coordinate system and the robot base coordinate system consisting of moving the robot manipulator to different poses for the digitization of a standard sphere of known radius [10]. Through a graphical visualization algorithm, a trajectory could be chosen by the user for the robot tool to follow. The calibration procedure proposed in that work agreed with the standards specifications of ISO 10360-2 for coordinate measuring machines (CMMs). A similar work is presented in [23].
In this article, a calibration routine is presented to acquire surface 3D maps from a scanner specially built with a vision camera and two laser projectors and to transform these coordinates into object coordinates expressed in the robot controller for surface welding. The calibration of the geometric parameters of the vision sensor can be performed by using a flat standard block to acquire several images of the laser light at different angular positions of the mobile laser projector. The image of the fixed laser line is stored and used to compute the intersection between it and the images of the light projections of the mobile laser projector. The transformation of 3D maps from the sensor coordinates to the robot base coordinates was performed using a method to calibrate the sensor position fixed on the robot arm together with the geometric parameters of the robot. Results have shown that, in this application, the triangulation-based scanning sensor can generate 3D maps expressed in robot base coordinates with accuracy acceptable for welding, with positioning errors smaller than 1 mm in the working depth range.

2. The Optical System

The surface scanning system developed in this research does not depend on positioning sensors to measure the angular displacement of the laser light source. However, the determination of this angular displacement is required for the triangulation process to produce the depth map. The proposed sensor replaces the angular displacement sensor by another laser source, such that the system is composed of two laser projectors and a camera, as shown in the sketch in Figure 1.
In addition to the use of a second laser projector, it is also necessary to impose two geometric restrictions on the mounting system. The first restriction is that the two planes of light projected by the lasers must be perpendicular to each other so that the proposed triangulation equations are valid. The second restriction is that one of the laser projectors is fixed. These restrictions will be discussed in detail later.
Each of the laser diodes projects a light plane onto the surface, generating a light curve on it, as shown in Figure 2.
It is considered that the light plane in Figure 2 is parallel to the X-axis of the camera coordinate system. In the Z–Y plane in Figure 3, the image of a single point, P, on the laser line is projected on the camera sensor such that the image formation of P is a projection by the central perspective model.
There are two triangles in green and blue in Figure 3 from which a relationship between the 3D coordinates of point P and the 2D image coordinates can be formulated.
So,
$$\begin{cases} y_c = \dfrac{b_y\, y_u}{f\cot(\theta_y) + y_u} \\ z_c = \dfrac{b_y\, f}{f\cot(\theta_y) + y_u} \end{cases} \qquad (1)$$
where
  • (xc, yc, zc) → 3D coordinates of point P in camera coordinates (mm);
  • (xu, yu) → image coordinates of point P (mm);
  • by → distance along the Y-axis between the camera origin and the laser plane parallel to the X-axis;
  • θy → angle between the Y-axis and the laser plane parallel to the X-axis;
  • f → camera focal length.
From the perspective equations, xc/xu = yc/yu, it is possible to determine the value of xc from
$$x_c = \frac{b_y\, x_u}{f\cot(\theta_y) + y_u}, \qquad (2)$$
such that the 3D coordinates of point P are completely defined by the 2D image coordinates by
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \frac{b_y}{f\cot(\theta_y) + y_u}\begin{bmatrix} x_u \\ y_u \\ f \end{bmatrix} \qquad (3)$$
Due to the restrictions of the mounting system, both laser planes are perpendicular to each other, so the second laser plane is parallel to the Y-axis of the camera. The equations of this second laser line can be derived from a projection onto the X–Z plane, analogous to Figure 3, such that the image formation of a point P on the line is formulated with the perspective model in Equation (4):
$$\begin{cases} x_c = \dfrac{b_x\, x_u}{f\cot(\theta_x) + x_u} \\ z_c = \dfrac{b_x\, f}{f\cot(\theta_x) + x_u} \end{cases} \qquad (4)$$
where
  • bx → distance along the X-axis between the camera origin and the laser plane parallel to the Y-axis;
  • θx → angle between the X-axis and the laser plane parallel to the Y-axis;
From the perspective equations:
$$y_c = \frac{b_x\, y_u}{f\cot(\theta_x) + x_u}, \qquad (5)$$
such that the 3D coordinates of point P are completely defined by its 2D image coordinates using Equation (6):
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \frac{b_x}{f\cot(\theta_x) + x_u}\begin{bmatrix} x_u \\ y_u \\ f \end{bmatrix} \qquad (6)$$
Equations (3) and (6) define a relationship between the 3D coordinates of a point P and its 2D image coordinates, but these equations are not valid for all points of the light projection. Equation (3) is valid only for one of the laser’s lines, and Equation (6) is valid only for the other, as shown in Figure 4.
However, at the point of intersection Pint between the two lasers’ lines projected on the surface, both equations are valid.
From the image coordinates of the intersection point (xint and yint), the 3D coordinates of Pint can be calculated from both Equations (3) and (6), so a relationship between the angular displacement of both laser diodes, θx and θy, can be obtained as
$$\begin{cases} \cot(\theta_y) = \dfrac{1}{f}\left[\dfrac{b_y}{b_x}\left(f\cot(\theta_x) + x_{int}\right) - y_{int}\right] \\ \cot(\theta_x) = \dfrac{1}{f}\left[\dfrac{b_x}{b_y}\left(f\cot(\theta_y) + y_{int}\right) - x_{int}\right] \end{cases} \qquad (7)$$
Since one of the laser diodes has no degree of freedom, either cot(θx) or cot(θy) is constant and previously known, as are the values of bx, by, and f, which are also calibrated previously. Therefore, the remaining term, cot(θx) or cot(θy), of the mobile laser can be obtained from Equation (7), and then Equation (3) or Equation (6) converts the 2D image coordinates of each point on the line projected by the mobile laser diode onto the surface into 3D coordinates.
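To make this conversion concrete, the short sketch below implements Equations (3) and (7) in Python with NumPy. It assumes the fixed laser is the one parallel to the Y-axis (so cot(θx) is the known constant); the function names and the numbers in the usage lines are illustrative only, not values of the calibrated system.

```python
import numpy as np

def cot_mobile_y(f, b_x, b_y, cot_theta_x, x_int, y_int):
    """Equation (7): recover cot(theta_y) of the mobile laser from the image
    coordinates (x_int, y_int) of the intersection point, given the fixed
    laser angle cot(theta_x) and the calibrated f, b_x, b_y."""
    return (b_y / b_x * (f * cot_theta_x + x_int) - y_int) / f

def point_from_mobile_laser(x_u, y_u, f, b_y, cot_theta_y):
    """Equation (3): 3D camera coordinates of a point on the laser line
    parallel to the X-axis, from its (undistorted) image coordinates."""
    s = b_y / (f * cot_theta_y + y_u)
    return s * np.array([x_u, y_u, f])

# Usage with made-up numbers (mm), for illustration only:
f, b_x, b_y = 9.44, 100.0, 102.0
cot_ty = cot_mobile_y(f, b_x, b_y, cot_theta_x=-0.09, x_int=1.2, y_int=-0.8)
print(point_from_mobile_laser(0.5, -0.7, f, b_y, cot_ty))
```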
However, when rotating the mobile laser projector, the model described by Equation (3), Equation (6), and Equation (7) cannot describe the system geometry. As can be seen in Figure 5, if the mobile laser diode is not aligned with the camera’s coordinate system, the distance, b, does not remain constant while scanning the surface.
To consider this effect in the digitization equations, it is necessary to include a misalignment parameter, and then it is possible to perform a correction on the base distance of the laser for each angular position according to Equation (8) and Figure 6:
$$\begin{cases} b_y' = b_y + d_y\cot(\theta_y) \\ b_x' = b_x + d_x\cot(\theta_x) \end{cases} \qquad (8)$$
It is important to note that although this misalignment can occur in both diodes, it generates variation only on the base distance of the mobile laser beam. For the fixed laser, regardless of the misalignment, the base distance, b’, remains constant. In other words, after determining this distance, no compensation is necessary due to the variation in the position of the mobile beam.
Rewriting the scanning equations, including the effect of the mobile laser misalignment, yields
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \frac{b_y'}{f\cot(\theta_y) + y_u}\begin{bmatrix} x_u \\ y_u \\ f \end{bmatrix} \qquad (9)$$
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \frac{b_x'}{f\cot(\theta_x) + x_u}\begin{bmatrix} x_u \\ y_u \\ f \end{bmatrix} \qquad (10)$$
where bx’ and by’ are given by Equation (8) and
$$\begin{cases} \cot(\theta_y) = \dfrac{1}{f}\left[\dfrac{b_y}{b_x}\left(f\cot(\theta_x) + x_{int}\right) - y_{int}\right]\left[1 - \dfrac{d_y}{f\, b_x}\left(f\cot(\theta_x) + x_{int}\right)\right]^{-1} \\ \cot(\theta_x) = \dfrac{1}{f}\left[\dfrac{b_x}{b_y}\left(f\cot(\theta_y) + y_{int}\right) - x_{int}\right]\left[1 - \dfrac{d_x}{f\, b_y}\left(f\cot(\theta_y) + y_{int}\right)\right]^{-1} \end{cases} \qquad (11)$$
A flowchart shows each of the steps for the complete scan of a surface in Figure 7. It is important to note that the camera model and the parameters bx, by, dx, dy, cot(θx), and cot(θy) are previously calibrated. Depending on which diode laser is used as the mobile laser, either Equation (9) or Equation (10) is used.
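As a complement to the flowchart in Figure 7, the following sketch outlines one scan step with the misalignment correction of Equations (8), (9), and (11). It assumes the mobile laser is the one parallel to the X-axis, that the fixed laser angle cot(θx) is constant, and that the image coordinates have already been undistorted by the camera model; all names are illustrative.

```python
import numpy as np

def scan_step(line_pts_uv, x_int, y_int, f, b_x, b_y, d_y, cot_theta_x):
    # Equation (11), first expression: angle of the mobile laser for this frame
    num = (b_y / b_x) * (f * cot_theta_x + x_int) - y_int
    den = 1.0 - (d_y / (f * b_x)) * (f * cot_theta_x + x_int)
    cot_theta_y = num / (f * den)
    # Equation (8): corrected (apparent) base distance of the mobile laser
    b_y_corr = b_y + d_y * cot_theta_y
    # Equation (9): 3D coordinates of every point on the mobile laser line
    pts = []
    for x_u, y_u in line_pts_uv:
        s = b_y_corr / (f * cot_theta_y + y_u)
        pts.append(s * np.array([x_u, y_u, f]))
    return np.array(pts)
```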

3. Optical System Calibration

Once the camera is calibrated, all camera intrinsic and extrinsic parameters are completely determined, and the optical system can then be calibrated using these parameters.
The calibration of the optical system is the process of identifying the real values of the geometric parameters of the optical system described previously. These parameters can be seen in Figure 8.
A point P on the object reference system with coordinates (xw, yw, zw) has its coordinates expressed in the camera coordinate system (xc, yc, zc) by the following equation:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = R\begin{bmatrix} x_w \\ y_w \\ z_w \end{bmatrix} + T, \qquad (12)$$
where R is a 3 × 3 orthonormal rotation matrix and T is a translation vector representing the spatial coordinates of the origin of the world reference system expressed in the camera coordinate system.
Considering that R and T, defined in Equation (12), perform the transformation of the world reference system to the camera reference system, it is possible to determine the equation of the reference plane in relation to the camera from the transformation below:
$$0\,x_w + 0\,y_w + 1\,z_w + 0 = 0 \;\xrightarrow{R,\,T}\; A\,x_c + B\,y_c + C\,z_c + D = 0, \qquad (13)$$
where zw = 0.
To transform the plane’s normal vector to the camera coordinate system, one can use
$$\begin{bmatrix} A \\ B \\ C \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} r_3 \\ r_6 \\ r_9 \end{bmatrix} \qquad (14)$$
To determine D in Equation (13), consider that the point $[T_x\ T_y\ T_z]^T$ belongs to the calibration plane; then
$$A\,T_x + B\,T_y + C\,T_z + D = 0 \;\Rightarrow\; D = -r_3 T_x - r_6 T_y - r_9 T_z \qquad (15)$$
So, the calibration plane in relation to the camera frame is completely defined as
$$A\,x_c + B\,y_c + C\,z_c + D = 0, \qquad (16)$$
where $A = r_3$, $B = r_6$, $C = r_9$, and $D = -r_3 T_x - r_6 T_y - r_9 T_z$.
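As an illustration, the minimal sketch below implements Equations (14)–(16): it expresses the calibration-board plane zw = 0 in the camera frame from the extrinsic parameters R (world to camera) and T obtained in the camera calibration. The function name is illustrative.

```python
import numpy as np

def board_plane_in_camera(R, T):
    """Return (A, B, C, D) such that A*x_c + B*y_c + C*z_c + D = 0."""
    n = R @ np.array([0.0, 0.0, 1.0])   # Equation (14): normal = third column of R
    A, B, C = n
    D = -float(n @ np.asarray(T))       # Equation (15): the point T lies on the plane
    return A, B, C, D
```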
The next step is the determination of the planes generated by each of the laser beams, as shown in Figure 9, where $V_{NX}$ and $V_{NY}$ represent the normal vectors of the generated planes, and $P_{LX}$ and $P_{LY}$ are the positions of the laser diodes relative to the camera.
The equations of these planes are given by
$$V_{NY} = \begin{bmatrix} 0 \\ 1 \\ \cot(\theta_y) \end{bmatrix},\quad P_{LY} = \begin{bmatrix} 0 \\ b_y \\ d_y \end{bmatrix} \;\Rightarrow\; y_c + \cot(\theta_y)\,z_c - b_y - d_y\cot(\theta_y) = 0 \qquad (17)$$
$$V_{NX} = \begin{bmatrix} 1 \\ 0 \\ \cot(\theta_x) \end{bmatrix},\quad P_{LX} = \begin{bmatrix} b_x \\ 0 \\ d_x \end{bmatrix} \;\Rightarrow\; x_c + \cot(\theta_x)\,z_c - b_x - d_x\cot(\theta_x) = 0 \qquad (18)$$
The intersection of these planes with the calibration board can be determined through Equations (16)–(18). These intersections are the projections of the laser light on the board surface and are mathematically described as lines in space.
For the light plane defined by Equation (17), the intersection with the calibration board is obtained from the system
$$\begin{cases} A\,x_c + B\,y_c + C\,z_c + D = 0 \\ y_c + \cot(\theta_y)\,z_c - b_y - d_y\cot(\theta_y) = 0 \end{cases} \qquad (19)$$
By choosing xc as a free parameter, the solution of the system is given by the parametric equation of the line of intersection between these planes:
$$\begin{cases} x_c = t \\ y_c = \dfrac{A\cot(\theta_y)}{C - B\cot(\theta_y)}\,x_c + \dfrac{C\,b_y + C\,d_y\cot(\theta_y) + D\cot(\theta_y)}{C - B\cot(\theta_y)} \\ z_c = \dfrac{-A}{C - B\cot(\theta_y)}\,x_c + \dfrac{-B\,b_y - B\,d_y\cot(\theta_y) - D}{C - B\cot(\theta_y)} \end{cases} \qquad (20)$$
Similarly, for the plane of light described by Equation (18):
$$\begin{cases} A\,x_c + B\,y_c + C\,z_c + D = 0 \\ x_c + \cot(\theta_x)\,z_c - b_x - d_x\cot(\theta_x) = 0 \end{cases} \qquad (21)$$
$$\begin{cases} x_c = \dfrac{B\cot(\theta_x)}{C - A\cot(\theta_x)}\,y_c + \dfrac{C\,b_x + C\,d_x\cot(\theta_x) + D\cot(\theta_x)}{C - A\cot(\theta_x)} \\ y_c = t \\ z_c = \dfrac{-B}{C - A\cot(\theta_x)}\,y_c + \dfrac{-A\,b_x - A\,d_x\cot(\theta_x) - D}{C - A\cot(\theta_x)} \end{cases} \qquad (22)$$
The free parameter, t, is introduced in the equations of the intersection between the planes to avoid divisions by zero, since the values of xc or yc may be constant when the light planes are parallel to the X or Y axes, respectively.
Thus, once the image coordinates (xim, yim) of a point on the laser line are obtained, and since this point lies on the plane of the calibration board, the coordinates of this point (xc, yc, zc) relative to the camera reference system can be computed using the camera model equations proposed by Tsai [24] and Lenz and Tsai [25], referred to as the Radial Alignment Constraint (RAC) model, with some modifications proposed by Zhuang and Roth [26], comprising the equations below together with the equation of the calibration board:
$$\begin{cases} \dfrac{x_{im} - C_x}{1 - k\,r^2} = f_x\,\dfrac{x_c}{z_c} \\ \dfrac{y_{im} - C_y}{1 - k\,r^2} = f_y\,\dfrac{y_c}{z_c} \\ A\,x_c + B\,y_c + C\,z_c + D = 0 \\ r^2 = \mu^2\,(x_{im} - C_x)^2 + (y_{im} - C_y)^2 \end{cases} \qquad (23)$$
where (Cx, Cy) are the coordinates of the image center in pixels, μ = fy/fx, k = coefficient of image radial distortion, kr2 << 1, and fx and fy are the focal length in pixels corrected for the shape of the pixel dimensions in the X and Y axes, respectively (scale factors sx and sy in Table 1, where fx = f/sx and fy = f/sy).
Solving the system above, the coordinates (xc, yc, and zc) of a point of the laser line can be obtained directly as
$$\begin{cases} x_c = \dfrac{-A_x\,D}{A\,A_x + B\,B_y + C} \\ y_c = \dfrac{-B_y\,D}{A\,A_x + B\,B_y + C} \\ z_c = \dfrac{-D}{A\,A_x + B\,B_y + C} \end{cases} \qquad (24)$$
where
$$\begin{cases} A_x = \dfrac{1}{f_x}\,\dfrac{x - C_x}{1 - k\,r^2} \\ B_y = \dfrac{1}{f_y}\,\dfrac{y - C_y}{1 - k\,r^2} \\ r^2 = \mu^2\,(x - C_x)^2 + (y - C_y)^2 \end{cases} \qquad (25)$$
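The back-projection of Equations (23)–(25) can be sketched as follows, assuming the intrinsic parameters of Table 1 are available and that the point lies on the board plane (A, B, C, D); the function name is illustrative.

```python
import numpy as np

def board_point_from_image(x_im, y_im, Cx, Cy, fx, fy, k, plane):
    """3D camera coordinates of an image point known to lie on the board plane."""
    A, B, C, D = plane
    mu = fy / fx
    r2 = mu**2 * (x_im - Cx)**2 + (y_im - Cy)**2          # Equation (25)
    Ax = (x_im - Cx) / (fx * (1.0 - k * r2))
    By = (y_im - Cy) / (fy * (1.0 - k * r2))
    zc = -D / (A * Ax + B * By + C)                       # Equation (24)
    return np.array([Ax * zc, By * zc, zc])
```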
Therefore, using these obtained coordinates (xc, yc, zc) and the equation of the projection line of the laser plane in space (Equations (20) and (22)), it is possible to obtain a linear system of equations for cot(θ), b, and d·cot(θ):
$$\begin{bmatrix} A\,x_c + B\,y_c + D & C & C \\ B\,z_c & -B & -B \end{bmatrix}\begin{bmatrix} \cot(\theta_y) \\ b_y \\ d_y\cot(\theta_y) \end{bmatrix} = \begin{bmatrix} C\,y_c \\ A\,x_c + C\,z_c + D \end{bmatrix}, \qquad (26)$$
$$\begin{bmatrix} A\,x_c + B\,y_c + D & C & C \\ A\,z_c & -A & -A \end{bmatrix}\begin{bmatrix} \cot(\theta_x) \\ b_x \\ d_x\cot(\theta_x) \end{bmatrix} = \begin{bmatrix} C\,x_c \\ B\,y_c + C\,z_c + D \end{bmatrix} \qquad (27)$$
It is easily seen that Columns 2 and 3 are identical in Equations (26) and (27), i.e., regardless of the number of points used, the system will always have a rank of 2. Therefore, the misalignment parameters, dx and dy, cannot be obtained directly from these systems.
For the calibration of dx and dy, two or more positions of the mobile laser are used, and the terms b and d·cot(θ) are first determined together as a single apparent base distance. The systems of Equations (26) and (27) can thus be modified to
$$\begin{bmatrix} A\,x_c + B\,y_c + D & C \\ B\,z_c & -B \end{bmatrix}\begin{bmatrix} \cot(\theta_y) \\ b_y' \end{bmatrix} = \begin{bmatrix} C\,y_c \\ A\,x_c + C\,z_c + D \end{bmatrix} \qquad (28)$$
$$\begin{bmatrix} A\,x_c + B\,y_c + D & C \\ A\,z_c & -A \end{bmatrix}\begin{bmatrix} \cot(\theta_x) \\ b_x' \end{bmatrix} = \begin{bmatrix} C\,x_c \\ B\,y_c + C\,z_c + D \end{bmatrix}, \qquad (29)$$
where
$$\begin{cases} b_x' = b_x + d_x\cot(\theta_x) \\ b_y' = b_y + d_y\cot(\theta_y) \end{cases} \qquad (30)$$
For the solution of these systems, a single point of the laser line is sufficient; however, the use of several points on the laser line and the optimization based on least squares or the singular value decomposition (SVD) can produce more accurate results.
Therefore, with N different positions of the mobile laser, it is possible to determine the actual base distance of the laser diode and its misalignment value through an overdetermined system, calibrating the laser parameters completely:
$$\begin{cases} b_x + d_x\cot(\theta_x^{(1)}) = b_x'^{(1)} \\ b_x + d_x\cot(\theta_x^{(2)}) = b_x'^{(2)} \\ \;\vdots \\ b_x + d_x\cot(\theta_x^{(N)}) = b_x'^{(N)} \end{cases} \qquad (31)$$
$$\begin{cases} b_y + d_y\cot(\theta_y^{(1)}) = b_y'^{(1)} \\ b_y + d_y\cot(\theta_y^{(2)}) = b_y'^{(2)} \\ \;\vdots \\ b_y + d_y\cot(\theta_y^{(N)}) = b_y'^{(N)} \end{cases} \qquad (32)$$
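A possible implementation of this calibration step is sketched below: Equation (28) is solved by least squares for each position of the mobile laser, and the resulting pairs (cot(θy), by′) are then stacked as in Equation (32) to separate by and dy. The same structure applies to the X direction with Equations (29) and (31). Function names are illustrative.

```python
import numpy as np

def fit_position(points, plane):
    """Equation (28): one pair of rows per point; unknowns [cot(theta_y), b_y']."""
    A, B, C, D = plane
    rows, rhs = [], []
    for xc, yc, zc in points:
        rows.append([A * xc + B * yc + D, C]);  rhs.append(C * yc)
        rows.append([B * zc, -B]);              rhs.append(A * xc + C * zc + D)
    (cot_ty, b_y_app), *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return cot_ty, b_y_app

def fit_base_and_misalignment(cot_list, b_app_list):
    """Equation (32): solve b_y + d_y*cot(theta_y_i) = b_y'_i for (b_y, d_y)."""
    M = np.column_stack([np.ones(len(cot_list)), np.array(cot_list)])
    (b_y, d_y), *_ = np.linalg.lstsq(M, np.array(b_app_list), rcond=None)
    return b_y, d_y
```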
For the calibration of the fixed laser, the same procedure is performed; however, since the angle of inclination of the fixed laser is constant, the determination of the apparent base distance, b’, is sufficient.
The entire calibration of the optical system can be summarized through the algorithms illustrated in Figure 10 and Figure 11.

4. Calibration of the Sensor Position

The 3D laser scanning sensor based on triangulation developed in this research is intended to produce 3D maps of the surfaces of hydraulic turbine blades. The sensor must be mounted and fixed on the robot arm to be moved over the surfaces by the robot. Therefore, for the 3D coordinates of the map to be assigned with respect to the controller coordinate system at the robot base, the sensor needs to have its position and orientation expressed in the robot base coordinate system previously determined.
The process to determine the sensor position is accomplished by moving the robotic arm with the sensor attached to it over a gage block of known dimensions, such that the range images represented in the respective 3D camera coordinates are obtained and recorded. Subsequently, the robot has its end-effector (weld torch) positioned at various points on the gage block surface, and the robot coordinates are recorded and related to the coordinates of the same points expressed in the camera coordinate system of the map.
From several point positions, the transformation between the camera coordinate system and the robot base coordinate system can be obtained and used in the parameter identification routine described in the next sections.

4.1. Robot Forward Kinematic Model

Considering the robot model shown in Figure 12, homogeneous transformation matrices that relate coordinate frames from the robot base (b) to the robot torch/tool (t) can be formulated as follows:
$$T_t^b = T_0^b\,T_1^0\,T_2^1\,T_3^2\,T_4^3\,T_5^4\,T_t^5 = \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad (33)$$
where $T_{i+1}^{i}$ is the homogeneous transformation between two successive joint coordinate frames.
The transformations shown in Equation (33) can be formulated with only 4 elementary motions as proposed by the Denavit–Hartenberg (D–H) convention [27] as below:
$$T_i^{i-1} = R_z(\theta)\,T_z(d)\,T_x(l)\,R_x(\alpha) = \begin{bmatrix} \cos(\theta) & -\cos(\alpha)\sin(\theta) & \sin(\alpha)\sin(\theta) & l\cos(\theta) \\ \sin(\theta) & \cos(\alpha)\cos(\theta) & -\sin(\alpha)\cos(\theta) & l\sin(\theta) \\ 0 & \sin(\alpha) & \cos(\alpha) & d \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad (34)$$
where θ and α are rotation parameters in Z and X axes, respectively, and d and l are translation parameters along the Z and X axes, respectively. The application of Equation (34) to each of the consecutive robot joint frames by using the geometric parameters shown in Figure 12 produces the general homogeneous transformation of the manipulator.
The entries of the general manipulator transformation, $T_5^0$, according to Equation (33), excluding the rotation of the torch tip coordinate frame by the angle β (Figure 12), are formulated below as the robot forward kinematic equations:
$$n_x = -\sin(\theta_1)\sin(\theta_5) + \cos(\theta_5)\cos(\theta_1)\cos(\theta_2+\theta_4) \qquad (35)$$
$$o_x = -\sin(\theta_1)\cos(\theta_5) - \sin(\theta_5)\cos(\theta_1)\cos(\theta_2+\theta_4) \qquad (36)$$
$$a_x = \cos(\theta_1)\sin(\theta_2+\theta_4) \qquad (37)$$
$$p_x = p_{z3}\cos(\theta_1)\sin(\theta_2) + p_{x2}\sin(\theta_1)\sin(\theta_2) - p_{z2}\sin(\theta_1) + p_{z5}\cos(\theta_1)\sin(\theta_2+\theta_4) + p_{x5}\left[\cos(\theta_5)\cos(\theta_1)\cos(\theta_2+\theta_4) - \sin(\theta_1)\sin(\theta_5)\right] \qquad (38)$$
$$n_y = \cos(\theta_1)\sin(\theta_5) + \cos(\theta_5)\sin(\theta_1)\cos(\theta_2+\theta_4) \qquad (39)$$
$$o_y = \cos(\theta_1)\cos(\theta_5) - \sin(\theta_5)\sin(\theta_1)\cos(\theta_2+\theta_4) \qquad (40)$$
$$a_y = \sin(\theta_1)\sin(\theta_2+\theta_4) \qquad (41)$$
$$p_y = p_{z3}\sin(\theta_1)\sin(\theta_2) - p_{x2}\sin(\theta_1)\cos(\theta_1) + p_{z2}\cos(\theta_1) + p_{z5}\sin(\theta_1)\sin(\theta_2+\theta_4) + p_{x5}\left[\cos(\theta_1)\sin(\theta_5) + \cos(\theta_5)\sin(\theta_1)\cos(\theta_2+\theta_4)\right] \qquad (42)$$
$$n_z = -\sin(\theta_2+\theta_4)\cos(\theta_5) \qquad (43)$$
$$o_z = \sin(\theta_2+\theta_4)\sin(\theta_5) \qquad (44)$$
$$a_z = \cos(\theta_2+\theta_4) \qquad (45)$$
$$p_z = p_{z1} + p_{z5}\cos(\theta_2+\theta_4) + p_{z3}\cos(\theta_2) + p_{x2}\sin(\theta_2) - p_{x5}\sin(\theta_2+\theta_4)\cos(\theta_5) \qquad (46)$$
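For reference, a short sketch of Equations (33) and (34) is given below: each link contributes one elementary D–H transformation, and chaining them yields the manipulator transformation. The geometric parameter values would come from Figure 12 and the calibration reported in [28]; none are hard-coded here, and the function names are illustrative.

```python
import numpy as np

def dh(theta, d, l, alpha):
    """Equation (34): T = Rz(theta) * Tz(d) * Tx(l) * Rx(alpha), angles in radians."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -ca * st,  sa * st, l * ct],
                     [st,  ca * ct, -sa * ct, l * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(dh_params):
    """Equation (33): chain the per-link transformations (theta, d, l, alpha)."""
    T = np.eye(4)
    for theta, d, l, alpha in dh_params:
        T = T @ dh(theta, d, l, alpha)
    return T   # 4x4 matrix [n o a p; 0 0 0 1]
```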

4.2. Parameter Identification Modeling

Robot calibration is the process of fitting a complex nonlinear model, consisting of a parametrized kinematic model with error parameters, to experimental data. The error parameters are identified by minimizing an error function [17].
A robot kinematic model consists of a set of nonlinear functions relating joint variables and link geometric parameters to the robot end-effector pose, such as in
$$P = T_1\,T_2 \cdots T_n, \qquad (47)$$
where Ti are any link transformations defined in Equation (34), P is the manipulator transformation and n is the number of links. If the kinematic model uses a convention of 4 elementary transformations per link, like the D–H convention, the manipulator pose error can be expressed as (from Equation (34))
$$\Delta P = \frac{\partial P}{\partial \theta}\,\Delta\theta + \frac{\partial P}{\partial \alpha}\,\Delta\alpha + \frac{\partial P}{\partial d}\,\Delta d + \frac{\partial P}{\partial l}\,\Delta l, \qquad (48)$$
where θ, α, d, and l are geometric parameters that relate a robot joint frame to the next joint frame, where d and l are translation parameters, and θ and α are rotation parameters in two of the three coordinate axes, respectively.
The derivatives shown in Equation (48) characterize the partial contribution of each of the geometric error parameters of each joint, which together constitute the total pose error of the robot’s end-effector and can be measured with proper measuring devices. Considering the measured robot poses (M) and the transformation from the measurement system frame to the robot base (B), ΔP is the vector shown in Figure 13.
The transformation, B, can also be considered as a virtual link belonging to the robot model that must be identified. So, the pose error, ΔP, can be calculated with Equation (49) as [28]
$$\Delta P = M - P\,B = M - C \qquad (49)$$
The manipulator transformation, P, is updated each time a new set of geometric error parameters is fitted through an iterative process; when the calibration process finishes, P has the minimum deviation from the measured poses.
Equation (48) can be rewritten in matrix form for m measured poses as a Jacobian matrix comprising the partial derivatives of P, such that Δx is the vector of the model parameter errors, as in Equation (50):
$$\begin{bmatrix} \Delta P_1 \\ \Delta P_2 \\ \vdots \\ \Delta P_m \end{bmatrix} = \begin{bmatrix} \dfrac{\partial P_1}{\partial \theta} & \dfrac{\partial P_1}{\partial \alpha} & \dfrac{\partial P_1}{\partial d} & \dfrac{\partial P_1}{\partial l} \\ \dfrac{\partial P_2}{\partial \theta} & \dfrac{\partial P_2}{\partial \alpha} & \dfrac{\partial P_2}{\partial d} & \dfrac{\partial P_2}{\partial l} \\ \vdots & \vdots & \vdots & \vdots \\ \dfrac{\partial P_m}{\partial \theta} & \dfrac{\partial P_m}{\partial \alpha} & \dfrac{\partial P_m}{\partial d} & \dfrac{\partial P_m}{\partial l} \end{bmatrix}\begin{bmatrix} \Delta\theta \\ \Delta\alpha \\ \Delta d \\ \Delta l \end{bmatrix} = \begin{bmatrix} J_1 \\ J_2 \\ \vdots \\ J_m \end{bmatrix}\Delta x \;\Rightarrow\; J\,\Delta x = \Delta P \qquad (50)$$
The Jacobian matrix size depends on the number of measured poses in the robot workspace (m) and on the number of error parameters in the model (n). The matrix order is ηm × n, where η is the number of degrees of freedom of a pose (3 position and 3 orientation parameters). The calibration problem can then be set as the solution of the nonlinear system J·x = b.
A widely used method to solve this type of system is Squared Sum Minimization (SSM). Several other methods are discussed extensively, with their related algorithms, in [22]. A successful method for the solution of nonlinear least squares problems in practice is the Levenberg–Marquardt algorithm, many versions of which have proved to be globally convergent. The algorithm is an iterative solution method with a few modifications of the Gauss–Newton method to reduce numerical divergence problems.
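A minimal sketch of this identification step is shown below using SciPy's Levenberg–Marquardt implementation; the kinematic_model callable stands in for the forward kinematics of Section 4.1 and is an assumption of this illustration, not the authors' software.

```python
import numpy as np
from scipy.optimize import least_squares

def identify(kinematic_model, nominal, joints, measured, n_err):
    """Fit the geometric error parameters by minimizing the pose residuals.

    kinematic_model(params, q) -> 3-vector end-effector position (assumed signature);
    nominal: ndarray of nominal geometric parameters (length n_err);
    joints / measured: lists of joint readings and measured positions.
    """
    def residuals(delta):
        # Stack the position error of every measured pose, as in Equation (50)
        return np.concatenate([kinematic_model(nominal + delta, q) - p
                               for q, p in zip(joints, measured)])
    sol = least_squares(residuals, np.zeros(n_err), method="lm")
    return nominal + sol.x, sol
```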

4.3. Algorithm for the Transformation of Coordinates from the Sensor to the Robot Base

Input: The matrix with all the coordinates of the map points scanned by the sensor in a scan, expressed in the sensor coordinate system. Each point coordinate is transformed to coordinates represented in the robot base coordinate system with the homogeneous transformation equations below:
A0P = A01 * A12 * A2S * ASP,
where,
A0P = matrix representing the position of the scanned object point (P) in the robot base coordinate system (0);
A01 = matrix representing the position of Joint 1 (1) in the robot base coordinate system (0);
A12 = matrix representing the position of Joint 2 (2) in the Joint 1 coordinate system (1);
A2S = matrix representing the position of the sensor (S) in the Joint 2 coordinate system (2) (pxs, pys, and pzs) (see Figure 12);
ASP = matrix representing the position of the scanned point (P) in the sensor coordinate system (S) (xc, yc, and zc) (see Figure 12).
The homogeneous transformations are shown below:
$$A_{01} = \begin{bmatrix} \cos(\theta_1) & -\cos(\alpha_1)\sin(\theta_1) & \sin(\alpha_1)\sin(\theta_1) & p_{x1}\cos(\theta_1) \\ \sin(\theta_1) & \cos(\alpha_1)\cos(\theta_1) & -\sin(\alpha_1)\cos(\theta_1) & p_{x1}\sin(\theta_1) \\ 0 & \sin(\alpha_1) & \cos(\alpha_1) & p_{z1} \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
$$A_{12} = \begin{bmatrix} \cos(\theta_2) & -\cos(\alpha_2)\sin(\theta_2) & \sin(\alpha_2)\sin(\theta_2) & p_{x2}\cos(\theta_2) \\ \sin(\theta_2) & \cos(\alpha_2)\cos(\theta_2) & -\sin(\alpha_2)\cos(\theta_2) & p_{x2}\sin(\theta_2) \\ 0 & \sin(\alpha_2) & \cos(\alpha_2) & p_{z2} \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
$$A_{2S} = \begin{bmatrix} \cos(\theta_3) & -\cos(\alpha_3)\sin(\theta_3) & \sin(\alpha_3)\sin(\theta_3) & p_{xs}\cos(\theta_3) - p_{ys}\sin(\theta_3) \\ \sin(\theta_3) & \cos(\alpha_3)\cos(\theta_3) & -\sin(\alpha_3)\cos(\theta_3) & p_{ys}\cos(\theta_3) + p_{xs}\sin(\theta_3) \\ 0 & \sin(\alpha_3) & \cos(\alpha_3) & p_{z3} \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
$$A_{SP} = \begin{bmatrix} 1 & 0 & 0 & x_c \\ 0 & 1 & 0 & y_c \\ 0 & 0 & 1 & z_c \\ 0 & 0 & 0 & 1 \end{bmatrix},$$
where symbols are described in Section 4.1 and:
θ1 = Joint 1 position when scanning, recorded from the robot controller;
θ2 = Joint 2 position when scanning, recorded from the robot controller;
pz3 = Joint 3 position when scanning, recorded from the robot controller;
(xc, yc, zc) = object point coordinates, P, represented in the sensor coordinate system.
The constant parameters were previously determined from a robot calibration process, and details about the calibration process of this robot can be seen in [28]. The pertinent results are listed below:
α1 = −89.827°;
α2 = 90°;
pz1 = 275 mm;
pz2 = 104.718 mm;
pz3 = joint variable position in the controller +103.677 mm;
px1 = −0.059 mm;
px2 = 33.389 mm;
θ1 = joint variable position in the controller + 0.1097°;
θ2 = joint variable position in the controller + 89.602°.
The parameters to be identified are pxs, pys, and pzs, and the results of the identification routine are presented in the homogeneous transformation A2S:
$$A_{2S} = \begin{bmatrix} 0.00087266 & 0.00204203 & 0.99999753 & 29.92711162 \\ 0.99999962 & 0.00000178 & 0.00087266 & 94.9961525 \\ 0 & 0.99999792 & 0.00204203 & p_{z3} \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
Output: The matrix with all the object point coordinates of a scan expressed in the robot base coordinate system. These coordinate values must be input into the robot controller so that, through the forward kinematics, the robot torch reaches the programmed trajectory points.
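The sketch below illustrates this transformation chain in Python, building A01 and A12 from the D–H form of Equation (34) with the calibrated constants listed above and applying the identified A2S; it is an illustration of the algorithm, not the original implementation.

```python
import numpy as np

def dh(theta_deg, d, l, alpha_deg):
    """Elementary D-H transformation of Equation (34), angles in degrees."""
    th, al = np.radians(theta_deg), np.radians(alpha_deg)
    ct, st, ca, sa = np.cos(th), np.sin(th), np.cos(al), np.sin(al)
    return np.array([[ct, -ca * st,  sa * st, l * ct],
                     [st,  ca * ct, -sa * ct, l * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,   1.0]])

def sensor_to_base(points_c, theta1, theta2, A2S):
    """points_c: (N, 3) array in the sensor frame; theta1, theta2 in degrees,
    already corrected by the controller offsets listed above."""
    A01 = dh(theta1, 275.0, -0.059, -89.827)   # pz1, px1, alpha1 from the calibration
    A12 = dh(theta2, 104.718, 33.389, 90.0)    # pz2, px2, alpha2 from the calibration
    T = A01 @ A12 @ A2S
    pts_h = np.hstack([points_c, np.ones((len(points_c), 1))])  # ASP applied implicitly
    return (T @ pts_h.T).T[:, :3]
```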

5. Results and Discussion

5.1. Sensor Calibration

A calibration board (see Figure 14) was used to first calibrate the camera intrinsic parameters, such as the focal length (f), image center (Cx and Cy), scale parameters (sx and sy), and an image radial distortion factor (k). After the camera was calibrated, the geometric parameters of the sensor (Figure 8) could be calibrated from several images acquired from the laser plane projection on a plane board. The sensor with the camera, lens, and laser light projectors can be seen in Figure 15.
The algorithm used to calibrate the camera is based on the RAC model proposed by Tsai and Lenz [25]. Data from the camera calibration process reveal a distance from the target to the camera of approximately 383 mm. Table 1 shows the camera intrinsic parameter results obtained with the calibration routine.
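For readers who wish to reproduce a comparable intrinsic calibration without implementing the RAC model, a commonly used alternative is OpenCV's planar calibration, sketched below for a dot grid similar to Figure 14. This is a substitute technique under stated assumptions (symmetric circle grid, pinhole model with radial distortion) and placeholder file names, not the procedure used here.

```python
import cv2
import numpy as np

pattern = (9, 9)    # dots per row and per column, as in Figure 14
spacing = 26.5      # mm between dot centres
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * spacing

obj_pts, img_pts = [], []
for fname in ["board_01.png", "board_02.png"]:   # placeholder image files
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, centers = cv2.findCirclesGrid(gray, pattern,
                                         flags=cv2.CALIB_CB_SYMMETRIC_GRID)
    if found:
        obj_pts.append(objp)
        img_pts.append(centers)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("focal lengths (px):", K[0, 0], K[1, 1], "principal point:", K[0, 2], K[1, 2])
```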

5.2. Calibration of the Sensor Position

The vision sensor was mounted on the robot arm according to Figure 12. The calibration of the sensor geometric parameters was performed using a flat plate and a gage block for depth verification (100 × 50 × 10 mm), through several images of laser light lines at various positions of the mobile laser projector. The light line of the fixed laser projector reflected on the metal plate must be vertical when projected on the screen, and it can be stored during each test to be subsequently used to calculate the projection of the light lines emitted by the mobile laser projector on the plate. Images from the calibration process can be seen in Figure 16.
The geometric parameters that must be obtained with the parameter identification routine discussed in Section 4.2 and Section 4.3 to calibrate the sensor are shown in Figure 8. The experimental results from the sensor calibration procedures can be seen in Table 2.

5.3. Accuracy Evaluation of the Robot Positioning Using the Surface 3D Sensor Map

To evaluate the accuracy of the surface 3D maps constructed by the vision sensor and expressed in the robot base coordinate system, tests were carried out by scanning the surfaces and positioning the robot’s end-effector (in this case an inductive proximity sensor) along surface trajectories to be followed by the robot. In Figure 17, a metallic 3D block with known dimensions is shown. The block was scanned, and the resulting map is shown in Figure 18. Figure 19 shows the positioning measurements taken with the inductive proximity sensor along a straight trajectory. Figure 20 shows the same trajectory adjusted to the welding torch.
It can be seen from the results that the proximity sensor showed instability when the distance from the surface varied and stability when the distance remained constant. This is because the sensor head has a diameter of 18 mm and a working distance that has to be within the range 0.4–4 mm, which could not be maintained when moving over the curved borders of the depressed surface of the block. However, along the flat surface, it could be observed that the sensor head kept a distance from the surface of 2.5 to 2.6 mm, which satisfies the welding requirements.
It was observed that, within the operating distance range of 350 to 500 mm, there was a systematic translation between the origins of the robot and the map coordinates in the X, Y, and Z axes that could easily be fixed with a simple transformation matrix, resulting in a very good accuracy in tracking the programmed trajectory.

6. Conclusions

This work proposed a calibration method of a laser triangulation scanner mounted on a robot arm to produce 3D surface maps expressed in the robot coordinates to be used in welding tasks on the surfaces of turbine blades. The method assumes that the robot and the camera are previously calibrated. The vision sensor embeds two laser line projectors to scan the surface in such a way that a triangulation process can construct a 3D surface map after the geometric parameters of the sensor are identified. The position of the fixed sensor on the robot arm is then calibrated and the 3D map can have its coordinates expressed in the robot base coordinate system. With the map available it is possible to perform the offline programming of robot welding tasks.
Experimental tests were performed to evaluate the accuracy of the 3D map expressed in the robot controller coordinates by moving the robot’s end-effector along a trajectory programmed over a metal block with a surface depression similar to those found in the field, such that the stand-off should be kept constant. The distance between the robot end-effector and the plate surface along the trajectory was measured with a proximity sensor mounted on the robot welding torch. Results showed an average accuracy of 0.3 mm over a displacement of approximately 180 mm.
This calibration system proposal opens up an alternative to use triangulation-based laser scanners with enough accuracy in applications where the distance from the target is large but within a depth range where calibration has been performed, exactly as the application for which this system was developed.

Author Contributions

Conceptualization, J.M.S.T.M.; Funding acquisition, J.M.S.T.M.; Investigation, G.A.I.-P., J.M.S.T.M., and R.C.S.; Methodology, J.M.S.T.M.; Project administration, J.M.S.T.M.; Software, G.A.I.-P. and R.C.S.; Supervision, J.M.S.T.M.; Validation, G.A.I.-P., J.M.S.T.M., and R.C.S.; Visualization, G.A.I.-P., J.M.S.T.M., and R.C.S.; Writing—original draft, G.A.I.-P.; Writing—review and editing, J.M.S.T.M. and R.C.S.

Funding

This research has been partially supported by the Electrical Power Plants of the North of Brazil (ELETRONORTE), grant number 1203243, the Foundation for Scientific and Technological Enterprises (FINATEC), and the CAPES Foundation, Ministry of Education of Brazil.

Acknowledgments

The authors would like to thank the many students, researchers, and engineers who have contributed in part to this research. The work described in this article is only one part of a larger project to construct a robot dedicated to repairing hydroelectric turbine blades automatically.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pérez, L.; Rodríguez, Í.; Rodríguez, N.; Usamentiaga, R.; García, D. Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review. Sensors (Basel) 2016, 16, 335. [Google Scholar] [CrossRef] [PubMed]
  2. Morozov, M.; Pierce, S.G.; MacLeod, C.N.; Mineo, C.; Summan, R. Off-line scan path planning for robotic NDT. Measurement 2018, 122, 284–290. [Google Scholar] [CrossRef]
  3. He, B.X.; Li, C.-L.; Li, J.-P.; Zhang, Y.; Liu, R.-L. Integration of intelligent measurement and detection for sealing rings used in aerospace systems. Opt. Precis. Eng. 2015, 12, 3395–3404. [Google Scholar]
  4. Shi, Y.; Sun, C.; Wang, P.; Wang, Z.; Duan, H. High-speed measurement algorithm for the position of holes in a large plane. Opt. Lasers Eng. 2012, 50, 1828–1835. [Google Scholar] [CrossRef]
  5. Kondo, Y.; Hasegawa, K.; Kawamata, H.; Morishita, T.; Naito, F. On-machine non-contact dimension-measurement system with laser displacement sensor for vane-tip machining of RFQs. Nuclear Instrum. Methods Phys. Res. A Accel. Spectrom. Detect. Assoc. Equip. 2012, 667, 5–10. [Google Scholar] [CrossRef]
  6. Heeshin, K. Study on Synchronization for Laser Scanner and Industrial Robot. Int. J. Sci. Eng. Appl. Sci. 2016, 2, 2395–3470. [Google Scholar]
  7. Hatwig, J.; Reinhart, G.; Zaeh, M.F. Automated task planning for industrial robots and laser scanners for remote laser beam welding and cutting. Prod. Eng. 2010, 4, 327. [Google Scholar] [CrossRef]
  8. Craig, J.J. Introduction to Robotics, Mechanics and Control, 3rd ed.; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2005; p. 408. ISBN 13 978-0201543612. [Google Scholar]
  9. Niola, V.; Rossi, C.; Sergio, S.; Salvatore, S. A method for the calibration of a 3-D laser scanner. Robot. Comput.-Integr. Manuf. 2010, 27, 479–484. [Google Scholar] [CrossRef]
  10. Ren, Y.; Yin, S.; Zhu, J. Calibration technology in application of robot-laser scanning system. Opt. Eng. 2012, 51. [Google Scholar] [CrossRef]
  11. Li, J.; Chen, M.; Jin, X.; Chen, Y.; Dai, Z.; Ou, Z.; Tang, Q. Calibration of a multiple axes 3-D laser scanning system consisting of robot, portable laser scanner and turntable. Opt. Int. J. Light Electron. Opt. 2011, 122, 324–329. [Google Scholar] [CrossRef]
  12. Tzafestas, S.G.; Raptis, S.; Pantazopoulos, J. A Vision-Based Path Planning Algorithm for a Robot-Mounted Welding Gun. Image Process. Commun. 1996, 2, 61–72. [Google Scholar]
  13. Hatwig, J.; Minnerup, P.; Zaeh, M.F.; Reinhard, G. An Automated Path Planning System for a Robot with a Laser Scanner for Remote Laser Cutting and Welding. In Proceedings of the IEEE International Conference on Mechatronics and Automation, Chengdu, China, 5–8 August 2012. [Google Scholar] [CrossRef]
  14. Shirinzadeh, B.; Teoh, P.L.; Tian, Y.; Dalvand, M.M.; Zhong, Y.; Liaw, H.C. Laser interferometry-based guidance methodology for high precision positioning of mechanisms and robots. Robot. Comput.-Integr. Manuf. 2010, 26, 74–82. [Google Scholar] [CrossRef]
  15. Larsson, S.; Kjellander, J.A.P. An industrial robot and a laser scanner as a flexible solution towards an automatic system for reverse engineering of unknown objects. In Proceedings of the 7th Biennial Conference on Engineering Systems Design and Analysis 2004; ASME: New York, NY, USA, 2004; Volume 2, pp. 341–350. [Google Scholar]
  16. Umeda, K.; Ikushima, K.; Arai, T. 3D shape recognition by distributed sensing of range images and intensity images. In Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, NM, USA, 25 April 1997; pp. 149–154. [Google Scholar] [CrossRef]
  17. Zhuang, H.; Roth, Z.S.; Sudhakar, R. Simultaneous robot/world and tool/flange calibration by solving homogeneous transformation equations of the form AX = YB. IEEE Trans. Robot. Autom. 1994, 10, 549–554. [Google Scholar] [CrossRef]
  18. Zhuang, H.; Wang, K.; Roth, Z.S. Simultaneous calibration of a robot and a hand-mounted camera. IEEE Trans. Robot. Autom. 1998, 11, 649–660. [Google Scholar] [CrossRef]
  19. Yu, C.; Xi, J. Simultaneous and on-line calibration of a robot-based inspecting system. Robot. Comput.-Integr. Manuf. 2018, 49, 349–360. [Google Scholar] [CrossRef]
  20. Forsyth, D.A.; Ponce, J. Computer Vision: A Modern Approach, 2nd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2011; p. 792. ISBN 10 013608592X. [Google Scholar]
  21. Larsson, S.; Kjellander, J.A.P. Motion control and data capturing for laser scanning with an industrial robot. Robot. Auton. Syst. 2006, 54, 453–460. [Google Scholar] [CrossRef]
  22. Barbero, B.R.; Ureta, E.S. Comparative study of different digitization techniques and their accuracy. Comput.-Aided Des. 2011, 43, 188–206. [Google Scholar] [CrossRef]
  23. Shen, C.; Zhu, S. A Robotic System for Surface Measurement Via 3D Laser Scanner. In Proceedings of the 2012 International Conference on Computer Application and System Modeling-ICCSM2012, Cochin, India, 20–21 October 2012; pp. 1237–1239. [Google Scholar]
  24. Tsai, R.Y. A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses. IEEE J. Robot. Autom. 1987, 3, 323–344. [Google Scholar] [CrossRef]
  25. Lenz, R.K.; Tsai, R.Y. Techniques for Calibration of the Scale Factor and Image Center for High Accuracy 3D Machine Vision Metrology. In Proceedings of the IEEE International Conference on Robotics and Automation, Raleigh, NC, USA, 31 March–3 April 1987; pp. 68–75. [Google Scholar]
  26. Zhuang, H.; Roth, Z.S. Camera-Aided Robot Calibration, 1st ed.; CRC Press: Boca Raton, FL, USA, 1996; pp. 11–58. ISBN 0-8493-9407-4. [Google Scholar]
  27. Denavit, J.; Hartenberg, R.S. A kinematic notation for lower-pair mechanisms based on matrices. J. Appl. Mech. 1955, 22, 215–221. [Google Scholar]
  28. Motta, J.M.S.T.; Llanos-Quintero, C.H.; Sampaio, R.C. Optimization of a Five-D.O.F. Robot for Repairing the Surface Profiles of Hydraulic Turbine Blades. Int. J. Adv. Robot. Syst. 2016, 13, 1–15. [Google Scholar] [CrossRef]
Figure 1. Sketch of the laser projectors and camera (VISSCAN-3D).
Figure 2. Projection of a laser light plane on a surface.
Figure 3. A light plane and the image formation of a point on the first and second laser line, showing the triangulation from the laser projection on the object surface.
Figure 4. Mathematical model of the two light projections.
Figure 5. Effect of the misalignment of the mobile laser projector.
Figure 6. Misalignment parameters.
Figure 7. Surface scanning algorithm.
Figure 8. Geometric parameters of the digitization system.
Figure 9. Generated light planes and the geometric parameters.
Figure 10. Mobile laser calibration algorithm.
Figure 11. Fixed laser calibration algorithm.
Figure 12. Robot at zero position with joint coordinate systems, link variables, point P, and sensor position vectors.
Figure 13. Calibration transformations [28].
Figure 14. Calibration board made of photographic paper printed with light through etched glass used to calibrate the camera intrinsic parameters, with 9 × 9 dots of 26.5 ± 0.1 mm from each other.
Figure 15. Scanner with camera and two laser light projectors.
Figure 16. Image of the standard block and processed image data in a laser projector position.
Figure 17. Scanning process of a metallic block.
Figure 18. 3D map of a scanned block and a programmed robot trajectory for welding.
Figure 19. Positioning measurements along a trajectory crossing the metallic block.
Figure 20. Three robot welding torch positions along a trajectory.
Table 1. Camera intrinsic parameters 1.

Focal Length f (mm): 9.43773
Image Center (Cx, Cy) (pixel): (738, 585)
Scale Factors (sx, sy) (pixel/mm): (227.27, 227.27)
Radial Distortion Factor k: 9.4604 × 10−9

1 Camera: CMOS Lumenera LW230—1616 × 1216, 4.4 µm square pixels.
Table 2. Geometric parameters of the laser projectors in the vision sensor after calibration.

Rotatory laser projector: by = 101.973 mm, dy = −16.0156 mm
Fixed laser projector: bx′ = −6.6788 mm, cot(θx) = −0.0922
