Article

A Novel Point Set Registration-Based Hand–Eye Calibration Method for Robot-Assisted Surgery

Wenyuan Sun, Jihao Liu, Yuyun Zhao and Guoyan Zheng
Institute of Medical Robotics, School of Medical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2022, 22(21), 8446; https://doi.org/10.3390/s22218446
Submission received: 16 October 2022 / Revised: 29 October 2022 / Accepted: 31 October 2022 / Published: 3 November 2022

Abstract

Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety when compared with manual implantation. In developing such a system, hand–eye calibration is an essential component that aims to determine the transformation between a position tracking system and a robot-arm system. In this paper, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX = XB equation. Our hand–eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, followed by paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand–eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand–eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.

1. Introduction

Pedicle screw insertion is an effective treatment for spinal diseases such as scoliosis, spinal fracture, and vertebral injury. Manual implantation is challenging, especially in patients with severe spinal deformity, osteoporosis, or tumor [1,2,3]. To address this challenge, one proposed approach is to integrate a robot arm with a computer navigation system [4,5,6,7]. In developing such a system, hand–eye calibration is an essential component, which aims to determine the homogeneous transformation between the robot hand/end-effector and the optical frame affixed to the end-effector [8,9].
Due to its importance, a number of approaches have been developed to solve the problem. Hand–eye calibration can be formulated in the form AX = XB, where A and B are the relative poses of the robotic end-effector and of the optical frame between successive time frames, respectively, and X is the unknown transformation matrix between the robot end-effector and the optical frame. Many solutions have been proposed to recover X given the data streams $\{A_i\}$ and $\{B_i\}$. Solutions to the problem can be roughly classified into four categories, i.e., separable solutions [10,11,12,13,14], simultaneous solutions [15,16,17], iterative solutions [8,18,19,20,21], and probabilistic methods [22,23]. Specifically, given the transformations A and B, the equation can be decomposed into rotational and translational parts (see the decomposition below). Separable solutions utilize this property, solving the rotational part first and the translational part afterwards. In contrast, simultaneous solutions solve the rotational and translational parts at the same time. Methods in the third category solve a nonlinear optimization problem by minimizing a cost such as $\|AX - XB\|$; as the algorithm iterates, it converges on a solution for X. Different from methods of the first three categories, which assume an exact correspondence between the data streams $\{A_i\}$ and $\{B_i\}$, methods in the fourth category eliminate such a requirement.
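For reference, writing A = (R_A, t_A), B = (R_B, t_B), and X = (R_X, t_X) as rotation–translation pairs, the homogeneous equation AX = XB splits into the rotational and translational sub-problems below (this is a standard textbook decomposition, not specific to any of the cited methods):

$$R_A R_X = R_X R_B, \qquad (R_A - I)\, t_X = R_X t_B - t_A$$

Separable solutions solve the left equation for $R_X$ first and then substitute it into the right one, which is why errors in the rotation estimate propagate into the translation estimate.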
Despite these efforts, accurate hand–eye calibration remains challenging for the following reasons. First, although separable methods are useful, any error in the estimation of the rotational part is compounded when it is used to solve the translational part. Second, while simultaneous solutions can significantly reduce the propagation of error [24], they are sensitive to the nonlinearities present in measurements in the form of noise and errors [25]. Third, although nonlinear iterative approaches have been observed to yield better results than linear and closed-form solutions in terms of accuracy [25], they can be computationally expensive and may not always converge to the optimal solution.
In this paper, to tackle these challenges, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via paired-point matching without the need to solve the AX = XB equation. Specifically, in our solution, we reformulate hand–eye calibration as tool-tip pivot calibrations in two coordinate systems followed by a paired-point matching, taking advantage of the steady movement of the robot arm and thus reducing measurement errors and noise. The hand–eye calibration problem is then solved via closed-form solutions to three overdetermined equation systems. Our point set registration-based hand–eye calibration method has the following advantages:
  • Our method is a simultaneous closed-form solution, which guarantees an optimal solution;
  • Unlike other simultaneous solutions, our solution is obtained by solving three nonlinear least-square fitting problems, leading to three overdetermined equation systems. Thus, it is not sensitive to the nonlinearities present in measurements in the form of noise and errors;
  • In comparison with the nonlinear iterative approaches, our method requires only simple matrix operations. Thus, it is computationally efficient;
  • Our method achieves better results than the state-of-the-art (SOTA) methods.
The paper is organized as follows. Section 2 reviews related works. Section 3 presents the proposed method. Section 4 describes the experiments and results. Finally, we present discussions in Section 5, followed by our conclusion in Section 6.

2. Related Works

Considerable effort has been devoted to solving the problem of hand–eye calibration. Due to the wide application of robot-assisted procedures, different types of methods have been developed for increased accuracy and robustness. Existing solutions can be roughly classified into four categories, as shown in Table 1, i.e., separable closed-form solutions [10,11,12,13,14], simultaneous closed-form solutions [15,16,17], iterative solutions [18,19,20], and probabilistic methods [22,23].
The earliest approaches estimated the rotational and translational parts separately. For example, Shiu et al. proposed a method for solving homogeneous transform equations [10]. Tsai presented an efficient 3D robotics hand–eye calibration algorithm that computed the 3D position and orientation separately [11]. Quaternion-based [13], extrinsic hand–eye calibration [12], and dual-quaternion-based [14] methods have been introduced for the individual estimation of the rotational and translational parts. One known problem with separable methods is that any error in the estimation of the rotation matrices may be propagated to the estimation of the translation vector.
To avoid the error propagation problem with separable solutions, methods in the second category simultaneously compute the orientation and position. For example, Lu et al. proposed an approach that transformed the kinematic equation into linear systems using normalized quaternions [16]. Andreff et al. proposed an on-line hand–eye calibration method that derived a linear formulation of the problem [15]. Zhao et al. [17] proposed a hand–eye calibration method based on screw motion theory to establish linear equations and simultaneously solve rotation and translation. As confirmed by experimental results, simultaneous methods have less error than separable solutions [25].
Iterative solutions are another type of method used to solve the problem of error propagation. For example, Zhuang et al. [18] presented an iterative algorithm to solve the unknown matrix X in one stage, thus eliminating error propagation and improving robustness to noise. Mao et al. [20] proposed using a direct linear closed-form solution followed by Jacobian optimization to solve AX = XB for hand–eye calibration. Hirsh et al. [26] proposed a robust iterative method to simultaneously estimate both the hand–eye and robot–world spatial transformations. Based on a metric defined on the group of rigid transformations SE(3), Strobl and Hirzinger [27] presented an error model for nonlinear optimization. They then proposed a calibration method for estimating both the hand–eye and robot–world transformations. While iterative solutions are generally accurate, they can be computationally expensive and may not always converge to the optimal solution [28].
The methods mentioned above assume an exact correspondence between the streams of sensor data, while methods in the fourth category eliminate such a requirement. For example, Ma et al. [23] proposed two probabilistic approaches by giving new definitions of the mean on SE(3), which alleviated the restrictions on the dataset and led to improved accuracy. Although it is worth investigating the situation when the exact correspondence between sensor data is unknown, probabilistic methods usually lead to longer computation times. Additionally, assuming an exact correspondence is not a problem in our study.
Hand–eye calibration is also an active research topic in medical applications. For example, Morgan et al. [29] presented a Procrustean perspective-n-point (PnP) solution for hand–eye calibration for surgical cameras, achieving an average projection error of 12.99 pixels when evaluated on a surgical laparoscope. Özgüner et al. [30] proposed a solution for hand–eye calibration for the da Vinci robotic surgical system by breaking down the calibration procedure into systematic steps to reduce error accumulation. They reported a root mean square (RMS) error of 2.1 mm and a mean rotational error of 3.2° when their calibration method was used to produce visually guided end-effector motions. Using the da Vinci Research Kit (dVRK) and an RGB-D camera, Roberti et al. [31] proposed to separate the calibration of the robotic arms and an endoscope camera manipulator from the hand–eye calibration of the camera for an improved accuracy in a 3D metric space. The proposed method reached a sub-millimeter accuracy in a dual-arm manipulation scenario, while the use of the RGB-D camera limited its actual application in surgery. Sun et al. [32] proposed a hand–eye calibration method for robot-assisted minimally invasive surgery, which relied purely on surgical instruments already in the operating scenario. Their model was formed by the geometric information of the surgical instrument and the remote center-of-motion (RCM) constraint, outperforming traditional hand–eye calibration methods in both simulation and robot experiments.
Deep learning-based methods, especially those based on convolutional neural networks (CNN), have also been developed for low-level image-processing tasks in hand–eye calibration [33,34,35,36]. For example, Valassakis et al. [34] proposed a sparse correspondence model that used a U-Net to detect 2D key points for eye-in-hand camera calibration. Kim et al. [36] introduced deep learning-based methods to restore out-of-focus blurred images for an improved accuracy in hand–eye calibration.

3. Materials and Methods

3.1. System Overview

Our robot-assisted, image-guided pedicle screw insertion system consists of a master computer, an optical tracking camera (Polaris Vega XT, NDI, Waterloo, ON, Canada), and a robot arm (UR 5e, Universal Robots, Odense, Denmark) with a guiding tube. The master computer communicates with the tracking camera to obtain the poses of the different optical tracking frames, and with the remote controller of the UR robot to realize steady movement and to receive feedback information.
During pedicle screw insertion, the target point and the aiming trajectory are planned in a pre-operative CT and are transformed to the tracking camera space via a homogeneous transformation obtained by a surface registration [37]. Then, the pose of the guide is adjusted to align with the planned trajectory. Thus, it is essential to determine the spatial transformation from the tracking camera space to the robot space, as shown in Figure 1. The transformation can be obtained via two calibration procedures, namely the hand–eye calibration and the guiding tube calibration.
Our robot-assisted, image-guided pedicle screw insertion procedure involves the following coordinate systems (COS), as shown in Figure 1. The 3D COS of the optical tracking camera is represented by $O_C$; the 3D COS of the optical reference frame on the end-effector is represented by $O_M$; the 3D COS of the robotic flange is represented by $O_F$; the 3D COS of the guiding tube is represented by $O_T$; the 3D COS of the robot base is represented by $O_B$; the 3D COS of the pre-operative CT data is represented by $O_{CT}$; and the 3D COS of the optical reference frame attached to the patient/phantom is represented by $O_R$. At any time, the poses of the different optical tracking frames with respect to the tracking camera, such as ${}^{C}_{M}T$ and ${}^{C}_{R}T$, are known. At the same time, the pose of the robotic flange with respect to the robot base, ${}^{B}_{F}T$, is known. This transformation information can be retrieved from the API (application programming interface) of the associated devices.

3.2. Registration-Based Hand–Eye Calibration

The aim of the hand–eye calibration is to establish the spatial transformation between the robot system and the optical tracking system. Mathematically, we solve for the 4 × 4 spatial transformation matrix from the COS $O_M$ to the COS $O_F$, referred to as ${}^{F}_{M}T$. In this subsection, the proposed registration-based hand–eye calibration (RHC) is introduced, which mainly consists of two steps: (1) solving tool-tip pivot calibrations in both the optical tracking camera COS $O_C$ and the robot base COS $O_B$; and (2) solving the hand–eye calibration via a paired-point matching.

3.2.1. Tool-Tip Calibration

In the first step, we rigidly fixed a calibration tool with a sharp tip to the flange, as shown in Figure 2a. We then need to determine the coordinates of the tool tip relative to the two coordinate systems $O_M$ and $O_F$. We obtained both by pivot calibration [38]. Once calibrated, the coordinates of the tool tip with respect to $O_M$ and $O_F$ are known, and they are then used in the next step to compute a paired-point matching.
We will start by describing the pivot calibration of the offset of the tool tip with respect to the coordinate system $O_M$. We pivoted the tool tip around a stationary point, as shown in Figure 2b, to estimate the coordinates of the tool tip in both the optical tracking camera COS $O_C$ and the 3D COS $O_M$ of the optical reference frame on the end-effector. During pivoting, we placed the tool tip in a divot, which has the same size and shape as the tool tip, to avoid any possible sliding. Then, we moved the tool around this pivot point while always touching the divot with its tip. We denote the two offsets as $p_C$ and $p_M$, respectively. During pivoting, we kept $p_C$ and $p_M$ static while collecting a set of $n$ homogeneous transformations $\{({}^{C}_{M}T)_i = (({}^{C}_{M}R)_i, ({}^{C}_{M}t)_i);\ 1 \le i \le n\}$ via the tracking camera API. Then, we estimated $p_C$ and $p_M$ by solving the following overdetermined equation system:

$$\begin{bmatrix} ({}^{C}_{M}R)_1 & -I \\ \vdots & \vdots \\ ({}^{C}_{M}R)_n & -I \end{bmatrix} \begin{bmatrix} p_M \\ p_C \end{bmatrix} = \begin{bmatrix} -({}^{C}_{M}t)_1 \\ \vdots \\ -({}^{C}_{M}t)_n \end{bmatrix} \qquad (1)$$

where $I$ is the 3 × 3 identity matrix.
Defining $\tilde{R} = \begin{bmatrix} ({}^{C}_{M}R)_1 & -I \\ \vdots & \vdots \\ ({}^{C}_{M}R)_n & -I \end{bmatrix}$ and $\tilde{t} = \begin{bmatrix} -({}^{C}_{M}t)_1 \\ \vdots \\ -({}^{C}_{M}t)_n \end{bmatrix}$, we have:

$$\tilde{R} \begin{bmatrix} p_M \\ p_C \end{bmatrix} = \tilde{t} \qquad (2)$$

Then, we can solve $p_M$ and $p_C$ using the pseudo-inverse [39]:

$$\begin{bmatrix} p_M \\ p_C \end{bmatrix} = (\tilde{R}^T \tilde{R})^{-1} \tilde{R}^T \tilde{t} \qquad (3)$$
As we are only interested in the offset of the tool tip with respect to $O_M$, we kept $p_M$ and disregarded $p_C$.
Similarly, we used the same pivot calibration technique to estimate the coordinates of the tool tip in both the robotic flange COS $O_F$ and the robot base COS $O_B$. This time, we pivoted the tool tip around a stationary point, as shown in Figure 2c. Again, we placed the tool tip in a divot to avoid sliding. We denote the two offsets as $p_B$ and $p_F$, respectively. During pivoting, we kept $p_B$ and $p_F$ static while collecting a set of $l$ homogeneous transformations $\{({}^{B}_{F}T)_i = (({}^{B}_{F}R)_i, ({}^{B}_{F}t)_i);\ 1 \le i \le l\}$ via the robot arm API. Then, we estimated $p_B$ and $p_F$ by solving the following overdetermined equation system [39]:

$$\begin{bmatrix} ({}^{B}_{F}R)_1 & -I \\ \vdots & \vdots \\ ({}^{B}_{F}R)_l & -I \end{bmatrix} \begin{bmatrix} p_F \\ p_B \end{bmatrix} = \begin{bmatrix} -({}^{B}_{F}t)_1 \\ \vdots \\ -({}^{B}_{F}t)_l \end{bmatrix} \qquad (4)$$

where $I$ is the 3 × 3 identity matrix.
Again, we are only interested in the offset of the tool tip with respect to $O_F$; therefore, we kept $p_F$ and disregarded $p_B$.
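Both pivot calibrations reduce to the same linear least-squares problem. The following Python sketch (our illustration, not the authors' code) shows how the stacked systems of Equations (1)–(4) can be solved with NumPy; the synthetic pose generation at the bottom is purely for demonstration and stands in for the tracking camera or robot API:

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Solve the stacked system [R_i  -I][p_tool; p_pivot] = [-t_i]
    in the least-squares sense (Eqs. (1)-(3)) and return (p_tool, p_pivot)."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        A[3 * i:3 * i + 3, 0:3] = R
        A[3 * i:3 * i + 3, 3:6] = -np.eye(3)
        b[3 * i:3 * i + 3] = -t
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # equivalent to the pseudo-inverse of Eq. (3)
    return x[:3], x[3:]

# Demonstration with synthetic, noise-free poses (assumed values, for illustration only).
rng = np.random.default_rng(0)
p_tool_true = np.array([0.0, 0.0, 100.0])       # tool-tip offset in the marker/flange COS
p_pivot_true = np.array([250.0, -30.0, 400.0])  # pivot point in the camera/base COS
Rs, ts = [], []
for _ in range(50):
    axis = rng.normal(size=3)
    axis /= np.linalg.norm(axis)
    angle = rng.uniform(-0.5, 0.5)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)  # Rodrigues' formula
    Rs.append(R)
    ts.append(p_pivot_true - R @ p_tool_true)   # the tip stays on the pivot point
p_tool, p_pivot = pivot_calibration(Rs, ts)
print(np.allclose(p_tool, p_tool_true), np.allclose(p_pivot, p_pivot_true))  # True True
```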

3.2.2. Solving Hand–Eye Calibration via Paired-Point Matching

After obtaining the offsets of the tool tip with respect to the two coordinate systems $O_M$ and $O_F$, we can compute the coordinates of the tool tip in both the tracking camera COS $O_C$ and the robot base COS $O_B$ at any time via the corresponding device's API. In this section, we present an elegant method to solve the hand–eye calibration via paired-point matching using the setup shown in Figure 3.
Specifically, during the hand–eye calibration, we maintained a stationary spatial relationship between the robot base and the tracking camera while moving the robot flange. By controlling the flange to move to $m$ different positions, we collected two sets of points: $P_C = \{(p_C)_1, \ldots, (p_C)_m\}$, the coordinates of the tool tip measured in the tracking camera COS $O_C$ via $(p_C)_i = ({}^{C}_{M}T)_i\, p_M$, and $P_B = \{(p_B)_1, \ldots, (p_B)_m\}$, the coordinates of the tool tip measured in the robot base COS $O_B$ via $(p_B)_i = ({}^{B}_{F}T)_i\, p_F$. Therefore, we can solve the spatial transformation ${}^{C}_{B}T$ using a paired-point matching algorithm.
As the first step in matching the two paired-point sets, we computed a 3 × 3 matrix $H$ as follows:

$$H = \sum_{j=1}^{m} \left( (p_B)_j - \frac{1}{m}\sum_{i=1}^{m}(p_B)_i \right) \left( (p_C)_j - \frac{1}{m}\sum_{i=1}^{m}(p_C)_i \right)^T \qquad (5)$$
We then used the singular value decomposition (SVD) [39] to decompose the matrix $H$ into $U$, $S$, and $V$:

$$H = U S V^T \qquad (6)$$
Based on the decomposed matrices, we computed the rotation matrix ${}^{C}_{B}R$ as:

$${}^{C}_{B}R = V \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & \lambda \end{bmatrix} U^T \qquad (7)$$

where $\lambda = \det(UV)$.
Based on ${}^{C}_{B}R$, we solved ${}^{C}_{B}t$ using:

$${}^{C}_{B}t = \frac{1}{m}\sum_{i=1}^{m}(p_C)_i - {}^{C}_{B}R \left( \frac{1}{m}\sum_{i=1}^{m}(p_B)_i \right) \qquad (8)$$

Therefore, we obtained the spatial transformation ${}^{C}_{B}T$ as:

$${}^{C}_{B}T = ({}^{C}_{B}R, {}^{C}_{B}t) \qquad (9)$$
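The procedure in Equations (5)–(9) is the classical SVD-based rigid point set registration. A minimal NumPy sketch (our illustration, assuming the two point sets are stored as (m, 3) arrays with matching row order):

```python
import numpy as np

def paired_point_registration(P_B, P_C):
    """Return (R, t) such that (p_C)_i ≈ R (p_B)_i + t, following Eqs. (5)-(9)."""
    c_B = P_B.mean(axis=0)                      # centroid in the robot base COS
    c_C = P_C.mean(axis=0)                      # centroid in the tracking camera COS
    H = (P_B - c_B).T @ (P_C - c_C)             # 3x3 matrix of Eq. (5)
    U, S, Vt = np.linalg.svd(H)                 # H = U S V^T, Eq. (6)
    lam = np.sign(np.linalg.det(U @ Vt.T))      # lambda = det(UV), guards against reflections
    R = Vt.T @ np.diag([1.0, 1.0, lam]) @ U.T   # Eq. (7)
    t = c_C - R @ c_B                           # Eq. (8)
    return R, t
```

In practice, P_B and P_C hold the tool-tip positions collected from the robot arm API and the tracking camera API at the same m flange positions, so the row-wise correspondence is known by construction.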
For each position in the movement trajectory, we computed the spatial transformation $({}^{F}_{M}T)_i$ as:

$$({}^{F}_{M}T)_i = ({}^{B}_{F}T)_i^{-1} \cdot ({}^{C}_{B}T)^{-1} \cdot ({}^{C}_{M}T)_i \qquad (10)$$

where $({}^{B}_{F}T)_i$ and $({}^{C}_{M}T)_i$ are retrieved from the associated device's API when generating $P_C$ and $P_B$.
Each position gives a different $({}^{F}_{M}T)_i$. To improve the robustness and accuracy, we averaged all the obtained transformations. Specifically, we used $(\psi_i, \theta_i, \phi_i)$ to represent the Euler angles of $({}^{F}_{M}R)_i$, so the average rotation matrix $\overline{{}^{F}_{M}R}$ can be written as:

$$\overline{{}^{F}_{M}R} = R\!\left( \frac{1}{m}\sum_{i=1}^{m}\psi_i,\ \frac{1}{m}\sum_{i=1}^{m}\theta_i,\ \frac{1}{m}\sum_{i=1}^{m}\phi_i \right) \qquad (11)$$

where $R(\cdot)$ represents the transformation from Euler angles to a rotation matrix. Meanwhile, the average translation vector $\overline{{}^{F}_{M}t}$ can be written as:

$$\overline{{}^{F}_{M}t} = \frac{1}{m}\sum_{i=1}^{m} ({}^{F}_{M}t)_i \qquad (12)$$

where $({}^{F}_{M}t)_i$ is the translation vector of $({}^{F}_{M}T)_i$.

Therefore, the hand–eye transformation ${}^{F}_{M}T$ is composed of the average rotation matrix $\overline{{}^{F}_{M}R}$ and the average translation vector $\overline{{}^{F}_{M}t}$, written as:

$${}^{F}_{M}T = (\overline{{}^{F}_{M}R}, \overline{{}^{F}_{M}t}) \qquad (13)$$
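A sketch of Equations (10)–(13) using SciPy's Rotation class for the Euler-angle conversions; the 'xyz' convention and the variable names are our assumptions, since the paper does not state which Euler convention was used:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def average_hand_eye(T_flange_in_base_list, T_base_in_camera, T_marker_in_camera_list):
    """Average the per-position estimates of Eq. (10) into a single ^F_M T (Eq. (13)).
    All inputs are 4x4 homogeneous matrices."""
    eulers, translations = [], []
    for T_fb, T_mc in zip(T_flange_in_base_list, T_marker_in_camera_list):
        # Eq. (10): (^F_M T)_i = (^B_F T)_i^-1 (^C_B T)^-1 (^C_M T)_i
        T_fm = np.linalg.inv(T_fb) @ np.linalg.inv(T_base_in_camera) @ T_mc
        eulers.append(Rotation.from_matrix(T_fm[:3, :3]).as_euler('xyz'))
        translations.append(T_fm[:3, 3])
    T_avg = np.eye(4)
    T_avg[:3, :3] = Rotation.from_euler('xyz', np.mean(eulers, axis=0)).as_matrix()  # Eq. (11)
    T_avg[:3, 3] = np.mean(translations, axis=0)                                     # Eq. (12)
    return T_avg
```

Averaging the Euler angles component-wise is well behaved here because all $({}^{F}_{M}T)_i$ estimate the same fixed transformation and therefore differ only slightly from one another.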

3.3. Guiding Tube Calibration

To achieve robot-assisted pedicle screw insertion, the guiding tube that guides the drilling of a screw insertion trajectory needs to be calibrated. The guiding tube calibration is a procedure to estimate the transformation ${}^{M}_{T}T$ of the COS $O_T$ defined on the guiding tube relative to the COS $O_M$ of the optical reference frame attached to the robot end-effector. In this calibration procedure, we utilized two COSs, i.e., the local COS $O_T$ of the guiding tube and the COS $O_M$, as shown in Figure 4.
The local COS $O_T$ can be determined using three points: the two end points of the guiding tube that lie on the center axis of the tube (referred to as $p^{(1)}$ and $p^{(2)}$), and one further point on the guiding tube (referred to as $p^{(3)}$). To digitize $p^{(1)}$ and $p^{(2)}$, we inserted a plug into the guiding tube. We then digitized the coordinates of these three points in the COS $O_M$, referred to as $p_M^{(1)}$, $p_M^{(2)}$, and $p_M^{(3)}$, respectively.
To establish the COS $O_T$, we defined the origin by $p^{(2)}$, the z-axis by $p^{(1)}$ and $p^{(2)}$, and the x-z plane by the three points. We obtained the transformation ${}^{M}_{T}T$ from its origin and axes, as:
$${}^{M}_{T}T = \begin{bmatrix} \dfrac{r_M^{(x)}}{\| r_M^{(x)} \|} & \dfrac{r_M^{(y)}}{\| r_M^{(y)} \|} & \dfrac{r_M^{(z)}}{\| r_M^{(z)} \|} & p_M^{(2)} \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (14)$$

where

$$\begin{aligned} r_M^{(x)} &= \left( (p_M^{(3)} - p_M^{(2)}) \times (p_M^{(1)} - p_M^{(2)}) \right) \times (p_M^{(1)} - p_M^{(2)}) \\ r_M^{(y)} &= (p_M^{(3)} - p_M^{(2)}) \times (p_M^{(1)} - p_M^{(2)}) \\ r_M^{(z)} &= p_M^{(1)} - p_M^{(2)} \end{aligned} \qquad (15)$$
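A small NumPy sketch of Equations (14) and (15); p1_M, p2_M, and p3_M denote the three digitized points $p_M^{(1)}$, $p_M^{(2)}$, and $p_M^{(3)}$ expressed in $O_M$ (the variable names are ours):

```python
import numpy as np

def guiding_tube_frame(p1_M, p2_M, p3_M):
    """Build ^M_T T from the three digitized points (Eqs. (14)-(15))."""
    r_z = p1_M - p2_M                        # tube axis, from p^(2) to p^(1)
    r_y = np.cross(p3_M - p2_M, r_z)         # normal of the plane containing the three points
    r_x = np.cross(r_y, r_z)                 # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0] = r_x / np.linalg.norm(r_x)     # normalized axes as columns, Eq. (14)
    T[:3, 1] = r_y / np.linalg.norm(r_y)
    T[:3, 2] = r_z / np.linalg.norm(r_z)
    T[:3, 3] = p2_M                          # origin at p^(2)
    return T
```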

3.4. Robot-Assisted Pedicle Screw Insertion

Figure 5 illustrates a schematic view of the robot-assisted pedicle screw insertion procedure. The workflow of the robot-assisted, image-guided pedicle screw insertion consists of the following three steps: (1) pre-operative trajectory planning; (2) intra-operative registration; and (3) transforming the planned trajectory to the robot base COS $O_B$ and aligning the guiding tube with the transformed trajectory.

3.4.1. Pre-Operative Planning

In the first step, we obtained a pre-operative CT scan. We segmented the target vertebra in the CT image and defined a trajectory using an entry point $p_{CT}^{(e)}$ and a target point $p_{CT}^{(t)}$ in the image space.

3.4.2. Intra-Operative Registration

In the second step, we performed an intra-operative registration to establish the spatial transformation ${}^{R}_{CT}T$ from the CT image COS $O_{CT}$ to the COS $O_R$. We digitized points on the surface and adopted a surface registration algorithm [37] to solve ${}^{R}_{CT}T$. Based on ${}^{R}_{CT}T$, $p_{CT}^{(e)}$ and $p_{CT}^{(t)}$ can be transformed to the COS $O_R$.

3.4.3. Transforming the Planned Trajectory to the Robot Base COS and Aligning the Guiding Tube with the Transformed Trajectory

In the third step, we transformed the planned trajectory to the robot base COS $O_B$ so that the robot can align the guiding tube with the transformed trajectory, which is calculated as:

$$\begin{bmatrix} p_B^{(e)} & p_B^{(t)} \end{bmatrix} = {}^{B}_{F}T \cdot {}^{F}_{M}T \cdot ({}^{C}_{M}T)^{-1} \cdot {}^{C}_{R}T \cdot {}^{R}_{CT}T \begin{bmatrix} p_{CT}^{(e)} & p_{CT}^{(t)} \end{bmatrix} \qquad (16)$$

In Equation (16), we retrieved ${}^{C}_{R}T$ and ${}^{C}_{M}T$ from the optical tracking camera's API and ${}^{B}_{F}T$ from the robot arm's API; ${}^{F}_{M}T$ is the hand–eye transformation obtained in Section 3.2.
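Equation (16) is simply a composition of the stored transforms applied to the two planned points. A minimal sketch, assuming every transform is kept as a 4x4 homogeneous NumPy array and the points as 3-vectors (function and variable names are ours):

```python
import numpy as np

def to_robot_base(p_ct, T_ct_to_ref, T_ref_to_cam, T_marker_to_cam,
                  T_marker_to_flange, T_flange_to_base):
    """Map a planned point from the CT COS O_CT to the robot base COS O_B, Eq. (16)."""
    chain = (T_flange_to_base @ T_marker_to_flange @
             np.linalg.inv(T_marker_to_cam) @ T_ref_to_cam @ T_ct_to_ref)
    p_h = np.append(p_ct, 1.0)   # homogeneous coordinates
    return (chain @ p_h)[:3]

# Applied to both planned points of the trajectory:
# p_B_entry  = to_robot_base(p_CT_entry,  T_ct_to_ref, T_ref_to_cam,
#                            T_marker_to_cam, T_marker_to_flange, T_flange_to_base)
# p_B_target = to_robot_base(p_CT_target, T_ct_to_ref, T_ref_to_cam,
#                            T_marker_to_cam, T_marker_to_flange, T_flange_to_base)
```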

4. Experiments and Results

In this section, we will introduce the experiments and results of our study. We designed and conducted three experiments to investigate the efficacy of the proposed method: (1) an investigation of the influence of the range of robot movement on the hand–eye calibration; (2) a comparison with state-of-the-art hand–eye calibration methods; and (3) an overall system accuracy study.

4.1. Metrics

In the experiments, the performance is quantified by the deviations between the actual path and the planned trajectory. The deviations consist of the incline angle (unit: °) and the distance deviation (unit: mm). We used the entry point $p^{(e)}$ and the target point $p^{(t)}$ on the planned trajectory to measure the distance, as shown in Figure 6. The distance deviation and incline angle between the guidance path and the planned trajectory are denoted as $d$ and $\phi$, respectively, while the distance deviation and incline angle between the drilled path and the planned trajectory are denoted as $d'$ and $\phi'$, respectively.
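The exact distance convention is defined by Figure 6; for concreteness, the sketch below uses one common choice (the distance from the planned target point to the measured path axis, and the angle between the two direction vectors), which should be treated as our assumption rather than the paper's precise definition:

```python
import numpy as np

def trajectory_deviation(entry_plan, target_plan, entry_meas, target_meas):
    """Return (distance deviation [mm], incline angle [deg]) between a planned
    trajectory and a measured (guidance or drilled) path, each given by an
    entry point and a target point as 3-vectors."""
    dir_plan = target_plan - entry_plan
    dir_meas = target_meas - entry_meas
    # Incline angle between the two direction vectors.
    cos_a = np.dot(dir_plan, dir_meas) / (np.linalg.norm(dir_plan) * np.linalg.norm(dir_meas))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    # Distance from the planned target point to the measured path axis.
    u = dir_meas / np.linalg.norm(dir_meas)
    v = target_plan - entry_meas
    distance = np.linalg.norm(v - np.dot(v, u) * u)
    return distance, angle
```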

4.2. Investigation of the Influence of the Range of Robot Movement on the Hand–Eye Calibration

In this experiment, we investigated the influence of the spatial range of robot movement on the proposed RHC. A plastic phantom, fabricated by 3D printing, was designed and used, as shown in Figure 7a. The phantom had dimensions of 140 × 90 × 85 mm³, and 25 trajectories were planned on it.
During the hand–eye calibration, the robot is controlled to move within an L × L × L mm³ cubic space. To investigate the influence of the range of robot movement, we calibrated different hand–eye transformation matrices with L set to 30, 60, 90, 120, 150, or 200 mm. Each time, after obtaining the hand–eye calibration, we used the obtained transformation to control the robot to align the guiding tube with a planned trajectory. After that, we digitized the guidance path to evaluate the alignment accuracy.
Experimental results are shown in Figure 7 and Table 2. Both $d$ and $\phi$ decreased as L increased. When L was 200 mm, the mean distance deviation was 0.70 mm and the mean incline angle was 0.68°. The results demonstrate that the larger the robot movement range, the higher the hand–eye calibration accuracy. However, further increasing the movement range leads to a failure in tracking by the camera. We found that the maximum allowed robot movement range is 200 × 200 × 200 mm³.

4.3. Comparison with State-of-the-Art Hand–Eye Calibration Methods

The plastic phantom introduced in Section 4.2 was also used in this study to compare our method with state-of-the-art (SOTA) hand–eye calibration methods, including Tsai's method [11], Andreff's method [15], Chou's method [13], Shah's method [40], and Wu's method [8]. Each time, after obtaining the hand–eye calibration using one of these methods, we used the obtained transformation to control the robot to align the guiding tube with a planned trajectory. We then digitized the guidance path to evaluate the alignment accuracy, which reflects the hand–eye calibration accuracy.
The distance deviation $d$ and the angular deviation $\phi$ are shown in Figure 8 and Table 3. We also report the computation time of each method in Table 3. In comparison with the SOTA methods, our method achieved the best results in terms of both distance deviation and incline angle. Meanwhile, the time cost of our method is much lower than that of the iterative calibration method [8], as shown in Table 3.

4.4. Overall System Accuracy Study

To evaluate the overall system accuracy, we conducted trajectory drilling experiments on three types of objects: (a) the plastic phantom used in Section 4.2, (b) four 3D-printed vertebrae, and (c) eight pig vertebrae. Each time, we controlled the robot to align the guiding tube with the planned trajectory and drilled a trajectory into the test object. In total, we planned and drilled 20 trajectories on the plastic phantom, 8 trajectories on the 3D-printed vertebrae, and another 8 trajectories on the pig vertebrae. For each trajectory, after drilling, both the guidance path and the drilled path were digitized to measure the accuracy.
Results are shown in Figure 9 and Table 4. Specifically, on the plastic phantom, the average distance deviation and the average incline angle between the guidance paths and the planned trajectories are 0.70 mm and 0.72°, respectively, while the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 0.93 mm and 1.04°, respectively. On the 3D-printed vertebrae, our system achieved a slightly better result: the average distance deviation and the average incline angle between the guidance paths and the planned trajectories are 0.66 mm and 0.79°, respectively, and the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 0.90 mm and 0.96°, respectively. Finally, we evaluated our system accuracy on the pig vertebrae. The average distance deviation and the average incline angle between the guidance paths and the planned trajectories are 0.71 mm and 0.82°, respectively, while the average distance deviation and the average incline angle between the drilled trajectories and the planned trajectories are 1.01 mm and 1.11°, respectively. Figure 9b shows a post-operative CT scan of the drilled path on a pig vertebra, demonstrating the high accuracy of our system. Both the quantitative and the qualitative results demonstrate that our system is accurate enough for robot-assisted pedicle screw insertion.

5. Discussions

Hand–eye calibration is one of the essential components when developing a robot-assisted, image-guided pedicle screw insertion system. The accuracy of hand–eye calibration will have a direct influence on the system accuracy. However, it is challenging to develop an accurate and robust method for hand–eye calibration. In this paper, we proposed an effective hand–eye calibration method based on tool-tip pivot calibration and paired-point matching without the need to solve the AX = XB equation. Comprehensive experiments were conducted to validate the accuracy of our proposed hand–eye calibration method as well as the robot-assisted, image-guided pedicle screw insertion system. Both qualitative and quantitative results demonstrate the efficacy of our hand–eye calibration method and the high accuracy of our system.
In comparison with the SOTA hand–eye calibration methods, our method has the following advantages. First, our method is a simultaneous closed-form solution, which is derived by solving three overdetermined equation systems, guaranteeing an optimal solution. Second, unlike other simultaneous solutions, we reformulate the hand–eye calibration problem as tool-tip pivot calibrations in two coordinate systems followed by a paired-point matching, taking advantage of the steady movement of the robot arm and thus reducing measurement errors and noise. Third, in comparison with methods depending on iterative solutions [18,19,20,21] or probabilistic models [22,23], our method is much faster because it is not iterative and requires only simple matrix operations.
Based on the novel hand–eye calibration method, we further developed a robot-assisted, image-guided pedicle screw insertion system. We conducted trajectory drilling experiments on a plastic phantom, 3D-printed vertebrae, and pig vertebrae to validate the accuracy of our system. When drilling trajectories on the plastic phantom, our system achieved a mean distance deviation of 0.93 mm and a mean angular deviation of 1.04°. When it was used to drill trajectories on the 3D-printed vertebrae, our system achieved a mean distance deviation of 0.90 mm and a mean angular deviation of 0.96°. To check whether the differences between the results obtained on the plastic phantom and on the 3D-printed vertebrae are statistically significant, we conducted an unpaired t-test with a significance level of α = 0.05. We found a p-value of 0.52 for the distance deviation and a p-value of 0.40 for the angular deviation, indicating that the differences are not statistically significant. When drilling trajectories on the pig vertebrae, our system achieved a mean distance deviation of 1.01 mm and a mean angular deviation of 1.11°, which is regarded as accurate enough for pedicle screw insertion.

6. Conclusions

In this paper, we proposed a novel hand–eye calibration method, namely registration-based hand–eye calibration (RHC), to estimate the calibration transformation via paired-point matching without the need to solve the AX = XB equation. Based on the proposed hand–eye calibration method, we developed a robot-assisted, image-guided pedicle screw insertion system. Comprehensive experiments were conducted to investigate the influence of the range of robot movement on the hand–eye calibration, to compare our method with state-of-the-art methods, and to evaluate the overall system accuracy. Our experimental results demonstrate the efficacy of our hand–eye calibration method and the high accuracy of our system. Our novel hand–eye calibration method can be applied to other types of robot-assisted surgery.

Author Contributions

Conceptualization, G.Z.; Data curation, W.S., J.L. and Y.Z.; Formal analysis, W.S.; Funding acquisition, G.Z.; Investigation, W.S., J.L. and Y.Z.; Methodology, W.S. and G.Z.; Project administration, G.Z.; Software, W.S.; Supervision, G.Z.; Validation, J.L. and Y.Z.; Visualization, W.S.; Writing—original draft, W.S.; Writing—review & editing, G.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially supported by the Shanghai Municipal Science and Technology Commission (20511105205) and by the National Natural Science Foundation of China (U20A20199).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of School of Biomedical Engineering, Shanghai Jiao Tong University, China (Approval No. 2020031, approved on 8 May 2020).

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CT: Computed Tomography
API: Application Programming Interface
COS: Coordinate System
SVD: Singular Value Decomposition
RHC: Registration-based Hand–Eye Calibration
3D: Three-dimensional

References

1. Tian, N.F.; Xu, H.Z. Image-guided pedicle screw insertion accuracy: A meta-analysis. Int. Orthop. 2009, 33, 895–903.
2. Fan, Y.; Du, J.; Zhang, J.; Liu, S.; Xue, X.; Huang, Y.; Zhang, J.; Hao, D. Comparison of accuracy of pedicle screw insertion among 4 guided technologies in spine surgery. Med. Sci. Monit. Int. Med. J. Exp. Clin. Res. 2017, 23, 5960.
3. Nguyen, N.Q.; Priola, S.M.; Ramjist, J.M.; Guha, D.; Dobashi, Y.; Lee, K.; Lu, M.; Androutsos, D.; Yang, V. Machine vision augmented reality for pedicle screw insertion during spine surgery. J. Clin. Neurosci. 2020, 72, 350–356.
4. Solomiichuk, V.; Fleischhammer, J.; Molliqaj, G.; Warda, J.; Alaid, A.; von Eckardstein, K.; Schaller, K.; Tessitore, E.; Rohde, V.; Schatlo, B. Robotic versus fluoroscopy-guided pedicle screw insertion for metastatic spinal disease: A matched-cohort comparison. Neurosurg. Focus 2017, 42, E13.
5. Molliqaj, G.; Schatlo, B.; Alaid, A.; Solomiichuk, V.; Rohde, V.; Schaller, K.; Tessitore, E. Accuracy of robot-guided versus freehand fluoroscopy-assisted pedicle screw insertion in thoracolumbar spinal surgery. Neurosurg. Focus 2017, 42, E14.
6. Kim, H.J.; Jung, W.I.; Chang, B.S.; Lee, C.K.; Kang, K.T.; Yeom, J.S. A prospective, randomized, controlled trial of robot-assisted vs freehand pedicle screw fixation in spine surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2017, 13, e1779.
7. Shaw, K.A.; Murphy, J.S.; Devito, D.P. Accuracy of robot-assisted pedicle screw insertion in adolescent idiopathic scoliosis: Is triggered electromyographic pedicle screw stimulation necessary? J. Spine Surg. 2018, 4, 187.
8. Wu, L.; Ren, H. Finding the kinematic base frame of a robot by hand-eye calibration using 3D position data. IEEE Trans. Autom. Sci. Eng. 2016, 14, 314–324.
9. Liu, G.; Yu, X.; Li, C.; Li, G.; Zhang, X.; Li, L. Space calibration of the cranial and maxillofacial robotic system in surgery. Comput. Assist. Surg. 2016, 21, 54–60.
10. Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29.
11. Tsai, R.Y.; Lenz, R.K. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration. IEEE Trans. Robot. Autom. 1989, 5, 345–358.
12. Wang, C.C. Extrinsic calibration of a vision sensor mounted on a robot. IEEE Trans. Robot. Autom. 1992, 8, 161–175.
13. Chou, J.C.; Kamel, M. Finding the position and orientation of a sensor on a robot manipulator using quaternions. Int. J. Robot. Res. 1991, 10, 240–254.
14. Daniilidis, K. Hand-eye calibration using dual quaternions. Int. J. Robot. Res. 1999, 18, 286–298.
15. Andreff, N.; Horaud, R.; Espiau, B. On-line hand-eye calibration. In Proceedings of the Second International Conference on 3-D Digital Imaging and Modeling (Cat. No. PR00062), Ottawa, ON, Canada, 8 October 1999; pp. 430–436.
16. Lu, Y.C.; Chou, J.C. Eight-space quaternion approach for robotic hand-eye calibration. In Proceedings of the 1995 IEEE International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, Vancouver, BC, Canada, 22–25 October 1995; Volume 4, pp. 3316–3321.
17. Zhao, Z.; Liu, Y. Hand-eye calibration based on screw motions. In Proceedings of the 18th International Conference on Pattern Recognition (ICPR'06), Hong Kong, China, 20–24 August 2006; Volume 3, pp. 1022–1026.
18. Zhuang, H.; Shiu, Y.C. A noise-tolerant algorithm for robotic hand-eye calibration with or without sensor orientation measurement. IEEE Trans. Syst. Man Cybern. 1993, 23, 1168–1175.
19. Wei, G.Q.; Arbter, K.; Hirzinger, G. Active self-calibration of robotic eyes and hand-eye relationships with model identification. IEEE Trans. Robot. Autom. 1998, 14, 158–166.
20. Mao, J.; Huang, X.; Jiang, L. A flexible solution to AX = XB for robot hand-eye calibration. In Proceedings of the 10th WSEAS International Conference on Robotics, Control and Manufacturing Technology, Hangzhou, China, 11–13 April 2010; pp. 118–122.
21. Zhang, Z.; Zhang, L.; Yang, G.Z. A computationally efficient method for hand–eye calibration. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 1775–1787.
22. Li, H.; Ma, Q.; Wang, T.; Chirikjian, G.S. Simultaneous hand-eye and robot-world calibration by solving the AX = YB problem without correspondence. IEEE Robot. Autom. Lett. 2015, 1, 145–152.
23. Ma, Q.; Li, H.; Chirikjian, G.S. New probabilistic approaches to the AX = XB hand-eye calibration without correspondence. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 4365–4371.
24. Aiguo, L.; Lin, W.; Defeng, W. Simultaneous robot-world and hand-eye calibration using dual-quaternions and Kronecker product. Int. J. Phys. Sci. 2010, 5, 1530–1536.
25. Ali, I.; Suominen, O.; Gotchev, A.; Morales, E.R. Methods for simultaneous robot-world-hand–eye calibration: A comparative study. Sensors 2019, 19, 2837.
26. Hirsh, R.L.; DeSouza, G.N.; Kak, A.C. An iterative approach to the hand-eye and base-world calibration problem. In Proceedings of the 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. No. 01CH37164), Seoul, Korea, 21–26 May 2001; Volume 3, pp. 2171–2176.
27. Strobl, K.H.; Hirzinger, G. Optimal hand-eye calibration. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 4647–4653.
28. Shah, M.; Eastman, R.D.; Hong, T. An overview of robot-sensor calibration methods for evaluation of perception systems. In Proceedings of the Workshop on Performance Metrics for Intelligent Systems, Gaithersburg, MD, USA, 19–21 August 2012; pp. 15–20.
29. Morgan, I.; Jayarathne, U.; Rankin, A.; Peters, T.M.; Chen, E. Hand-eye calibration for surgical cameras: A Procrustean perspective-n-point solution. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 1141–1149.
30. Özgüner, O.; Shkurti, T.; Huang, S.; Hao, R.; Jackson, R.C.; Newman, W.S.; Çavuşoğlu, M.C. Camera-robot calibration for the da Vinci robotic surgery system. IEEE Trans. Autom. Sci. Eng. 2020, 17, 2154–2161.
31. Roberti, A.; Piccinelli, N.; Meli, D.; Muradore, R.; Fiorini, P. Improving rigid 3-D calibration for robotic surgery. IEEE Trans. Med. Robot. Bionics 2020, 2, 569–573.
32. Sun, Y.; Pan, B.; Guo, Y.; Fu, Y.; Niu, G. Vision-based hand–eye calibration for robot-assisted minimally invasive surgery. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 2061–2069.
33. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573.
34. Valassakis, E.; Dreczkowski, K.; Johns, E. Learning eye-in-hand camera calibration from a single image. In Proceedings of the Conference on Robot Learning, PMLR, London, UK, 8–11 November 2021; pp. 1336–1346.
35. Huo, J.; Meng, Z.; Zhang, H.; Chen, S.; Yang, F. Feature points extraction of defocused images using deep learning for camera calibration. Measurement 2022, 188, 110563.
36. Kim, H.S.; Kuc, T.Y.; Lee, K.H. Hand-eye calibration using images restored by deep learning. In Proceedings of the 2020 IEEE International Conference on Consumer Electronics-Asia (ICCE-Asia), Seoul, Korea, 1–3 November 2020; pp. 1–4.
37. Low, K.L. Linear least-squares optimization for point-to-plane ICP surface registration. Chapel Hill Univ. North Carol. 2004, 4, 1–3.
38. Khamene, A.; Sauer, F. A novel phantom-less spatial and temporal ultrasound calibration method. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Palm Springs, CA, USA, 26–29 October 2005; pp. 65–72.
39. Petersen, P. Linear Algebra; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
40. Shah, M. Solving the robot-world/hand-eye calibration problem using the Kronecker product. J. Mech. Robot. 2013, 5, 031007.
Figure 1. The involved coordinate systems for robot-assisted, image-guided pedicle screw insertion. During a pedicle screw insertion procedure, the pose of the guide is adjusted to align with a trajectory, which is planned in a pre-operative CT first, and then is transformed to the patient space via a surface registration.
Figure 2. Tool-tip pivot calibration. (a) The calibration tool with a sharp tip is rigidly fixed to the flange during the hand–eye calibration; (b) pivot calibration of the offset of the tool tip with respect to the 3D COS $O_M$ of the optical reference frame on the end-effector; (c) pivot calibration of the offset of the tool tip with respect to the robotic flange COS $O_F$.
Figure 3. Solving ${}^{C}_{B}T$ via paired-point matching. By controlling the flange to move to $m$ different positions, we can obtain the coordinates of the tool tip in both the optical tracking camera COS $O_C$ and the robot base COS $O_B$, generating two point sets. ${}^{C}_{B}T$ is solved by matching the two point sets using a paired-point matching algorithm.
Figure 4. A schematic view of the guiding tube calibration. (a) The plug, which can be inserted into the guiding tube from both ends for digitization. (b) The three points on the tube that are digitized and the COS $O_T$ of the guiding tube established using the three points.
Figure 5. A schematic view of robot-assisted pedicle screw insertion: The target trajectory is planned in the COS $O_{CT}$ and transformed to the COS $O_R$ by intra-operative registration. The target trajectory is further transformed to the robot base COS $O_B$. The guiding tube is aligned with the target trajectory for insertion guidance.
Figure 6. Metrics used to evaluate the accuracy in this study, including the distance deviations $d$ and $d'$, as well as the incline angles $\phi$ and $\phi'$.
Figure 7. Investigation of the influence of the range of robot movement on the hand–eye calibration: (a) the plastic phantom used in the experiment; (b) the box plots of distance deviation and incline angle.
Figure 8. Comparison of our method with the SOTA hand–eye calibration methods.
Figure 9. Overall system accuracy study: (a) the 3D-printed vertebra and pig vertebrae; (b) the CT image of the animal vertebrae after drilling; (c) the box plots of distance deviation and incline angle.
Table 1. Comparison of existing solutions to the hand–eye calibration problem.

Categories | Solutions | Drawbacks
Separable solutions [10,11,12,13,14] | Solve the rotational part first; then solve the translational part. | Error propagation problem.
Simultaneous solutions [15,16,17] | Solve the rotational and translational parts at the same time. | Sensitive to the nonlinearities present in measurements in the form of noise and errors.
Iterative solutions [8,18,19,20,21] | Solve a nonlinear optimization problem by minimizing the error by iteration. | Computationally expensive; may not always converge on the optimal solution.
Probabilistic methods [22,23] | Solve the calibration problem without the assumption of exact correspondence between the data streams. | Computationally expensive.
Table 2. Investigation of the influence of the range of robot movement on the hand–eye calibration.

L [mm] | d [mm] (Mean) | d [mm] (Max.) | ϕ [°] (Mean) | ϕ [°] (Max.)
30 | 1.17 | 1.40 | 0.87 | 1.25
60 | 0.86 | 1.09 | 0.83 | 0.93
90 | 0.82 | 0.95 | 0.72 | 0.91
120 | 0.86 | 1.06 | 0.75 | 0.90
150 | 0.71 | 1.11 | 0.70 | 0.85
200 | 0.70 | 0.88 | 0.68 | 0.96
Table 3. Comparison of our method with the SOTA hand–eye calibration methods.

Method | d [mm] (Mean) | d [mm] (Max.) | ϕ [°] (Mean) | ϕ [°] (Max.) | Computation Time [ms]
Tsai [11] | 0.74 | 0.92 | 0.75 | 0.88 | 1.18
Andreff [15] | 0.73 | 0.87 | 0.70 | 0.92 | 2.23
Chou [13] | 0.73 | 0.84 | 0.69 | 0.89 | 0.82
Shah [40] | 0.74 | 0.92 | 0.72 | 0.97 | 0.63
Wu [8] | 0.72 | 0.88 | 0.68 | 0.90 | 26.84
Ours | 0.70 | 0.88 | 0.68 | 0.96 | 2.21
Table 4. Overall system accuracy study.

Metric | Statistic | Plastic Phantom | 3D-Printed Vertebrae | Pig Vertebrae
d [mm] | Mean | 0.70 | 0.66 | 0.71
d [mm] | Max. | 0.85 | 0.79 | 0.82
d′ [mm] | Mean | 0.93 | 0.90 | 1.01
d′ [mm] | Max. | 1.15 | 1.13 | 1.52
ϕ [°] | Mean | 0.72 | 0.79 | 0.82
ϕ [°] | Max. | 0.94 | 0.91 | 0.96
ϕ′ [°] | Mean | 1.04 | 0.96 | 1.11
ϕ′ [°] | Max. | 1.45 | 1.24 | 1.38

(d and ϕ: guidance path vs. planned trajectory; d′ and ϕ′: drilled path vs. planned trajectory.)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
