Article

Enhanced Calibration Method for Robotic Flexible 3D Scanning System †

by Zhilong Zhou *, Jinyong Shangguan, Xuemei Sun *, Yunlong Liu, Xu Zhang, Dengbo Zhang and Haoran Liu

College of Mechanical and Vehicle Engineering, Linyi University, Linyi 276012, China
* Authors to whom correspondence should be addressed.
This article is a revised and expanded version of a paper entitled “Enhanced calibration method for robotic flexible 3D scanning system”, which was presented at the 40th Annual Youth Academic Conference of the Chinese Association of Automation (YAC), Zhengzhou, China, 17–19 May 2025.
Sensors 2025, 25(15), 4661; https://doi.org/10.3390/s25154661
Submission received: 17 June 2025 / Revised: 20 July 2025 / Accepted: 25 July 2025 / Published: 27 July 2025
(This article belongs to the Special Issue Applications of Manufacturing and Measurement Sensors: 2nd Edition)

Abstract

Large-sized components with numerous small key local features are essential in advanced manufacturing. Achieving high-precision quality control necessitates accurate and highly efficient three-dimensional (3D) measurement techniques. A flexible measurement system integrating a fringe-projection-based 3D scanner with an industrial robot is developed to enable the rapid measurement of large object surfaces. To enhance overall measurement accuracy, we propose an enhanced calibration method utilizing a multidimensional ball-based calibrator to simultaneously calibrate the hand-eye transformation and the robot kinematic parameters. Firstly, a preliminary hand-eye calibration method is introduced to compensate for measurement errors at observation points, leveraging geometric-constraint-based optimization and a virtual single point derived via the barycentric calculation method. Subsequently, a distance-constrained calibration method is proposed to jointly estimate the hand-eye transformation and robot kinematic parameters, wherein a distance error model is constructed to link parameter errors with the measured deviations of a virtual single point. Finally, calibration and validation experiments were carried out, and the results indicate that the maximum and average measurement errors were reduced from 1.053 mm and 0.814 mm to 0.421 mm and 0.373 mm, respectively, thereby confirming the effectiveness of the proposed method.

1. Introduction

Complicated components with numerous small key local features (KLFs) are commonly used in modern advanced manufacturing industries such as aerospace, marine engineering, and automotive sectors. These features, including edges, holes, grooves, and curved surfaces, often exhibit complex geometries and tight tolerances, which pose significant challenges for dimensional inspection and quality assurance. Automated and accurate 3D shape measurement of these features is critical for ensuring quality control, enhancing product reliability, and minimizing manufacturing costs [1,2,3]. As product designs become increasingly complex and lightweight, the demand for high-precision and intelligent inspection technologies continues to grow, driving the need for more effective measurement systems capable of handling diverse geometries under varying operational conditions.
Three-dimensional (3D) shape measurement techniques are generally classified into contact and non-contact approaches. Among contact-based methods, coordinate measuring machines (CMMs) equipped with tactile probes [4] offer exceptional precision. Nevertheless, they are constrained by low efficiency in acquiring high-density 3D point cloud data. With the continued advancement of optical non-contact measurement technologies, line-structured light sensors based on active vision have been widely applied in industrial visual inspection and 3D shape measurement due to their non-contact nature, high precision, and efficiency [5]. However, these sensors have limited fields of view and lack intrinsic motion capabilities, making them unsuitable for measuring the full geometry of large-scale objects. To address this, they are typically mounted on high-precision motion platforms to enable large-area, multi-angle, and full-coverage measurements [6,7]. In recent years, structured light sensors have been widely used in industrial applications and integrated with CMMs for 3D shape measurements. However, the combined method, which integrates CMMs with a structured light sensor [8], is not suitable for online measurements due to its restrictive mechanical structure and limited measurement efficiency. In contrast to CMMs, industrial robots are well-suited for executing complex spatial positioning and orientation tasks with efficiency [9,10]. Equipped with a high-performance controller, these robots can also transport a vision sensor mounted on the end-effector to specified target locations. During the online measurement process, the spatial pose of the vision sensor varies as the robot’s end-effector moves to different positions. Nevertheless, the relative transformation between the sensor’s coordinate system and the end-effector remains fixed; determining this fixed transformation is known as hand-eye calibration. Consequently, the accuracy of this calibration plays a pivotal role in determining the overall measurement accuracy of the system.
For hand-eye calibration, techniques are generally classified into three categories based on the nature of the calibration object: 2D target-based methods, single-point or standard-sphere-based methods, and 3D-object-based methods. One of the most widely recognized hand-eye calibration methods based on a two-dimensional calibration target was introduced by Shiu [11]. The spatial relationship between the sensor and the robot’s wrist frame was identified by analyzing the sensor’s motion in response to controlled movements of the robotic arm, typically modeled using the hand-eye calibration equation AX = XB, where A and B represent the robot and sensor transformations, respectively, and X denotes the unknown sensor pose. Under general conditions, this equation has a closed-form solution, provided that at least two distinct robot motions are used to resolve the inherent ambiguities in rotation and translation. The uniqueness of the solution depends on the rotational characteristics of the robot’s movements. Since then, a wide range of solution methods have been developed for these calibration equations, which are generally categorized into linear and nonlinear algorithms [12,13]. Representative approaches include the distributed closed-loop method, the global closed-loop method, and various iterative techniques. However, these methods tend to be highly sensitive to measurement noise. In contrast to traditional hand-eye calibration methods, which often involve complex setups and labor-intensive manual procedures, Sung et al. [14] proposed a vision-based self-calibration method tailored for industrial robotic applications. The approach utilizes a static 2D calibration plane and a camera mounted on the end-effector of a moving robot. By designing controlled translational and rotational movements of the robot arm, the spatial relationship between the robot and the camera can be accurately estimated without external intervention. This self-calibration strategy significantly reduces human involvement and enhances the practicality of deploying robotic vision systems in automated environments. To streamline the calibration process for line-structured light sensors in robotic eye-in-hand systems, Wang et al. [15] introduced an integrated method that simultaneously determines the camera intrinsics, hand-eye transformation, and structured light plane parameters. Unlike traditional approaches that treat each parameter group separately, often resulting in complex and time-consuming workflows, this method performs all calibrations using a unified image set acquired from multiple robot and target poses. By reusing the extrinsic results from camera calibration, the structured light plane and hand-eye parameters can be efficiently derived without additional data collection. The procedure requires only a low-cost planar checkerboard as the calibration target, making it both practical and accessible. Experimental results demonstrated that the method achieves sub-millimeter accuracy with a total calibration time under 10 min, validating its effectiveness for industrial 3D measurement applications. Pavlovčič et al. [16] proposed an efficient calibration method that simultaneously estimates both the intrinsic parameters of a laser profilometer and the hand-eye transformation with respect to an industrial robot. Unlike conventional approaches that often rely on multiple calibration targets or sequential procedures, their method requires only a single reference geometry scanned from multiple viewpoints. 
By capturing the reference object from 15 different poses and numerically optimizing the transformation parameters to minimize geometric deviations, the approach achieves high accuracy with minimal setup. Experimental validation demonstrated sub-millimeter calibration precision, achieving deviations below 0.105 mm when using a robotic arm and 0.046 mm with a CNC system. The fully automated workflow, which completes in under 10 min, makes this approach highly suitable for regular on-site recalibration in industrial environments. In contrast, Xu et al. [17] proposed a single-point-based calibration method that utilizes a single point, such as a standard sphere, to compute the transformation, offering a relatively simple and practical implementation. Furthermore, hand-eye calibration methods utilizing a standard sphere of known radius have been extensively adopted to estimate the hand-eye transformation parameters [18]. These methods provide an intuitive and user-friendly solution for determining both rotation and translation matrices. Among the 3D-object-based calibration methods, Liu et al. [19] designed a calibration target consisting of a small square with a circular hole. Feature points were extracted from line fringe images and used as reference points for the calibration process. However, the limited number of calibration points and the low precision of the extracted image features resulted in suboptimal calibration accuracy. In addition to hand-eye calibration methods based on calibration objects, several approaches that use environmental images, rather than calibration objects, have been successfully applied to determine hand-eye transformation parameters, as reported by Ma [20] and Qi et al. [21]. To enhance the flexibility and autonomy of 3D measurement systems, Song et al. [22] proposed a hand-eye calibration method that avoids reliance on high-precision or specially designed calibration targets. Instead, their approach leverages irregular and unstructured objects, capturing 3D point cloud data from multiple viewpoints to estimate the spatial relationship between the robot and the sensor. By integrating a fast point feature histogram (FPFH)-based sampling consistency check with an improved iterative closest point registration algorithm that incorporates probabilistic modeling and covariance refinement, the method establishes the hand-eye transformation through robust point cloud alignment. Experimental results confirmed that the technique can perform accurate calibration using arbitrary objects in real-world robotic applications. However, the methods mentioned above do not account for the robot positioning errors introduced by kinematic parameter errors during the hand-eye calibration process.
To overcome this limitation, many researchers have proposed various hand-eye calibration methods that account for the correction of robot kinematic parameter errors. Yin et al. [23] introduced an enhanced hand-eye calibration algorithm. Initially, hand-eye calibration is conducted using a standard sphere, without accounting for robot positioning errors. Then, both the robot’s kinematic parameters and the hand-eye relationship parameters are iteratively refined based on differential motion theory. Finally, a unified identification of hand-eye and kinematic parameter errors is accomplished using the singular value decomposition (SVD) method. However, the spherical constraint-based error model lacks absolute dimensional information, and cumulative sensor measurement errors further limit the accuracy of parameter estimation. Li et al. [24] introduced a method that incorporates fixed-point information as optimization constraints to concurrently estimate both hand-eye transformation and robot kinematic parameters. However, the presence of measurement errors from visual sensors substantially compromises the robustness and generalizability of the resulting parameter estimations. To further optimize measurement system parameters, Mu et al. [25] introduced a unified calibration method that simultaneously estimates hand-eye and robot kinematic parameters based on distance constraints. However, this method does not adequately address sensor measurement error correction during the solution of the hand-eye matrix, resulting in accumulated sensor inaccuracies that significantly degrade the precision of the derived relationship matrix.
To address the aforementioned limitations, we propose an accurate calibration method for a robotic flexible 3D scanning system based on a multidimensional ball-based calibrator (MBC) [26]. This method constructs a distance-based calibration model that concurrently considers measurement errors, hand-eye parameter errors, and robotic kinematic errors. Specifically, by incorporating geometric-constraint-based optimization for compensating coordinate errors of the measurement points, a preliminary hand-eye calibration method is introduced based on a single virtual point determined via the barycenter technique. Subsequently, a distance-constraint-based calibration method is developed to further optimize both hand-eye and kinematic parameters, effectively associating system parameter errors with deviations in the measured coordinates of the single virtual point.
The remainder of this paper is organized as follows. Section 2 introduces the model of the measurement system. Section 3 outlines the proposed calibration methodology. Section 4 presents the experimental setup and accuracy validation. Finally, Section 5 concludes this study with a brief summary of the findings.

2. Measurement Method and Principle

2.1. Measurement Method

The proposed robotic 3D scanning system integrates an industrial robot with a fringe-projection-based 3D scanner, which is mounted on the robot’s end-effector using a custom fixture. For accurate system calibration and validation, a specially designed multidimensional ball-based calibrator (MBC) is utilized. The MBC features eight non-collinear spheres, whose spatial relationships are pre-calibrated using a CMM. As illustrated in Figure 1, the system comprises three coordinate frames: the robot base coordinate system $O_b X_b Y_b Z_b$ (BCS), the robot end-effector coordinate system $O_e X_e Y_e Z_e$ (ECS), and the 3D scanner coordinate system $O_s X_s Y_s Z_s$ (SCS).
During the measurement process, the robot adjusts its pose to guide the scanner in acquiring the target features. The acquired data are subsequently transformed from the SCS to the BCS. Based on the coordinate transformation principle, the measured coordinates $P_c^S$ of the visual points in the SCS are transformed into $P_c^B$ in the BCS as follows:
$$\begin{bmatrix} P_c^B \\ 1 \end{bmatrix} = \begin{bmatrix} R_E^B & T_E^B \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_S^E & T_S^E \\ 0 & 1 \end{bmatrix} \begin{bmatrix} P_c^S \\ 1 \end{bmatrix} \tag{1}$$
where $R_E^B$ and $T_E^B$ represent the rotation matrix and translation vector, respectively, from the ECS to the BCS, while $R_S^E$ and $T_S^E$ represent the rotation matrix and translation vector, respectively, of the hand-eye relation.
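To make the transform chain in Equation (1) concrete, the following minimal Python sketch composes the end-effector pose and the hand-eye transform to map a scanner-frame point into the base frame. It is our illustration rather than code from the paper; the function names and array conventions are assumptions.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation matrix and a 3-vector translation into a 4x4 transform."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = t
    return H

def scanner_to_base(p_scs, R_EB, T_EB, R_SE, T_SE):
    """Map a point from the scanner frame (SCS) to the robot base frame (BCS)
    by chaining the hand-eye transform and the end-effector pose, as in Eq. (1)."""
    H_EB = to_homogeneous(R_EB, T_EB)  # end-effector -> base, from robot kinematics
    H_SE = to_homogeneous(R_SE, T_SE)  # scanner -> end-effector, the hand-eye transform
    p_h = np.append(p_scs, 1.0)        # homogeneous coordinates [x, y, z, 1]
    return (H_EB @ H_SE @ p_h)[:3]
```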
To ensure accurate measurements throughout the system, three calibration procedures must be completed prior to data acquisition: (1) intrinsic calibration of the 3D sensor, (2) hand-eye calibration, and (3) calibration of the robot’s kinematic parameters.
In this study, the intrinsic parameters of the 3D scanner were pre-calibrated and verified by the manufacturer. Therefore, they are assumed to remain constant during operation. Consequently, only the hand-eye calibration and the robot kinematic calibration are required, as further described in Section 3.

2.2. Robot Kinematic Error Model

Due to structural and manufacturing imperfections, deviations arise between a robot’s actual and nominal kinematic parameters, leading to discrepancies between its actual and theoretical poses, commonly referred to as positioning errors. A kinematic error model quantitatively characterizes the relationship between these parameter deviations and the resulting positioning errors. In this study, the error model is formulated based on the Modified Denavit-Hartenberg (MD-H) convention [27] and rigid-body differential motion theory [28] and is expressed as follows:
$$H_E^B + dH_E^B = \prod_{i=1}^{N}\left(H_i^{i-1} + dH_i^{i-1}\right) \tag{2}$$
where $dH_E^B$ is the deviation of the homogeneous transformation matrix between the ECS and the BCS, and $H_i^{i-1}$ is the homogeneous transformation matrix between adjacent joints.
By expanding Equation (2), neglecting second- and higher-order terms, and combining the result with the differential kinematics model $dH_E^B = H_E^B\,\delta H_E^B$, we obtain
$$H_E^B + dH_E^B = H_E^B + H_E^B\,\delta H_E^B = H_E^B + \sum_{i=1}^{n}\left(\frac{\partial H_E^B}{\partial \theta_i}\Delta\theta_i + \frac{\partial H_E^B}{\partial d_i}\Delta d_i + \frac{\partial H_E^B}{\partial a_i}\Delta a_i + \frac{\partial H_E^B}{\partial \alpha_i}\Delta\alpha_i + \frac{\partial H_E^B}{\partial \beta_i}\Delta\beta_i\right) \tag{3}$$
where $\Delta\theta_i$, $\Delta d_i$, $\Delta a_i$, $\Delta\alpha_i$, and $\Delta\beta_i$ are the small link parameter errors.
Consequently, the relationship between the robot positioning error and kinematic parameter errors can be expressed as
$$\Delta E = \begin{bmatrix} \Delta D \\ \Delta\Theta \end{bmatrix} = \begin{bmatrix} M_1 \\ M_2 \end{bmatrix}\Delta\theta + \begin{bmatrix} M_2 \\ 0 \end{bmatrix}\Delta d + \begin{bmatrix} M_3 \\ 0 \end{bmatrix}\Delta a + \begin{bmatrix} M_4 \\ M_3 \end{bmatrix}\Delta\alpha + \begin{bmatrix} M_5 \\ M_6 \end{bmatrix}\Delta\beta = J\,\Delta X \tag{4}$$
where $\Delta E = \left[\Delta e_x\ \Delta e_y\ \Delta e_z\ \delta e_x\ \delta e_y\ \delta e_z\right]^T$ and $\Delta X = \left[\Delta\theta\ \ \Delta d\ \ \Delta a\ \ \Delta\alpha\ \ \Delta\beta\right]^T$; $\Delta D$ and $\Delta\Theta$ represent the differential translation vector and the differential rotation vector, respectively; $\Delta\theta$, $\Delta d$, $\Delta a$, $\Delta\alpha$, and $\Delta\beta$ are the kinematic parameter error vectors; $M_1$–$M_6$ are the $3 \times n$ error coefficient matrices; and $J$ is the Jacobian matrix.
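The paper obtains $J$ analytically from MD-H differential kinematics. Purely as an illustration of what Equation (4) computes, the sketch below approximates the positional block of $J$ by finite differences on a generic MD-H chain; the exact composition order of the elementary transforms, and all names, are our assumptions.

```python
import numpy as np

def mdh_transform(theta, d, a, alpha, beta):
    """Single-link MD-H transform; beta is the extra rotation about y used for
    nearly parallel joint axes (Hayati convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    Rz = np.array([[ct, -st, 0, 0], [st, ct, 0, 0], [0, 0, 1, d], [0, 0, 0, 1]])
    Rx = np.array([[1, 0, 0, a], [0, ca, -sa, 0], [0, sa, ca, 0], [0, 0, 0, 1]])
    Ry = np.array([[cb, 0, sb, 0], [0, 1, 0, 0], [-sb, 0, cb, 0], [0, 0, 0, 1]])
    return Rz @ Rx @ Ry

def forward(params):
    """Chain all links; params is an (n, 5) array of [theta, d, a, alpha, beta]."""
    H = np.eye(4)
    for row in params:
        H = H @ mdh_transform(*row)
    return H

def error_jacobian(params, eps=1e-6):
    """Numerically approximate the 3 x 5n positional part of J in Eq. (4)."""
    p0 = forward(params)[:3, 3]
    J = np.zeros((3, params.size))
    flat = params.ravel()
    for k in range(flat.size):
        pert = flat.copy()
        pert[k] += eps
        J[:, k] = (forward(pert.reshape(params.shape))[:3, 3] - p0) / eps
    return J
```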

3. Measurement System Calibration

To improve the accuracy of hand-eye calibration, this paper proposes an accurate online calibration method using a multidimensional ball-based calibrator (MBC) to simultaneously identify both the hand-eye transformation and the robot kinematic parameters. A schematic of the calibration procedure is presented in Figure 2. First, Section 3.1 introduces an initial hand-eye calibration method that compensates for measurement point errors using a geometric-constraint-based optimization method built on distance and angular constraints. This method is based on a virtual point calculated via the barycenter algorithm. Subsequently, Section 3.2 presents a distance-constraint-based calibration strategy to further refine the estimation of both the hand-eye transformation and the kinematic parameters.

3.1. Initial Calibration of the Hand-Eye Parameters

3.1.1. Preliminary Hand-Eye Calibration Method Based on a Virtual Single Point

During the initial stage of hand-eye calibration, the MBC is stably positioned within a suitable workspace. A single virtual point, serving as the calibration target, is computed from the coordinates of four measurement points that have been optimized using the geometric-constraint-based method described in Section 3.1.2. The initial hand-eye transformation is then determined by capturing this virtual point from multiple robot poses using the 3D scanner. Because the MBC is fixed, the position of the single virtual point relative to the BCS remains constant. According to the principle of barycentric coordinates, an initial (non-optimized) virtual point $P_{gi}^{mS} = \left[x_{gi}^{mS}, y_{gi}^{mS}, z_{gi}^{mS}\right]^T$, as illustrated in Figure 2, is computed from the initial coordinates of the four measurement points $P_{sk}^{mS} = \left[x_{sk}^{mS}, y_{sk}^{mS}, z_{sk}^{mS}\right]^T$ as follows:
$$P_{gi}^{mS} = \left[\frac{1}{4}\sum_{k=1}^{4} x_{sk}^{mS},\ \frac{1}{4}\sum_{k=1}^{4} y_{sk}^{mS},\ \frac{1}{4}\sum_{k=1}^{4} z_{sk}^{mS}\right]^T \tag{5}$$
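As a small illustration of Equation (5) (ours; fitting the four sphere centers from the point cloud is assumed to be done upstream):

```python
import numpy as np

def virtual_point(sphere_centers):
    """Barycenter of the four fitted sphere centers, Eq. (5); averaging
    attenuates the independent center-fitting noise of the four spheres."""
    centers = np.asarray(sphere_centers, dtype=float)  # shape (4, 3), scanner frame
    assert centers.shape == (4, 3)
    return centers.mean(axis=0)
```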
When the 3D scanner moves to the i-th and j-th poses, the following equations, considering the measurement deviation, can be obtained according to Equation (1):
$$\begin{cases} P_{gi}^{B} = R_{E0i}^{B} R_S^{E}\left(P_{gi}^{mS} + \Delta P_{gi}^{mS}\right) + R_{E0i}^{B} T_S^{E} + T_{E0i}^{B} \\ P_{gj}^{B} = R_{E0j}^{B} R_S^{E}\left(P_{gj}^{mS} + \Delta P_{gj}^{mS}\right) + R_{E0j}^{B} T_S^{E} + T_{E0j}^{B} \end{cases} \tag{6}$$
where $\Delta P_{gi}^{mS}$, computed by the optimization method introduced in Section 3.1.2, is the deviation between the nominal and actual coordinates of the single virtual point in the SCS.
By translating the 3D scanner multiple times to measure the virtual point, a matrix equation of the form $R_S^E A = b$ is obtained:
$$R_S^{E}\left[\left(P_{g1}^{mS}+\Delta P_{g1}^{mS}\right)-\left(P_{g2}^{mS}+\Delta P_{g2}^{mS}\right),\ \dots,\ \left(P_{g1}^{mS}+\Delta P_{g1}^{mS}\right)-\left(P_{gn}^{mS}+\Delta P_{gn}^{mS}\right)\right] = \left(R_{E0}^{B}\right)^{T}\left[T_{E02}^{B}-T_{E01}^{B},\ \dots,\ T_{E0n}^{B}-T_{E01}^{B}\right] \tag{7}$$
The unknown rotation matrix $R_S^E$ is determined using the singular value decomposition (SVD) algorithm. Subsequently, the unknown translation vector $T_S^E$ is calculated via the least squares method. Since measurement errors at the target points substantially affect the overall calibration accuracy, it is essential to optimize the coordinates of the measurement points acquired by the 3D scanner.
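The rotation solve is the classical SVD-based (Kabsch-type) alignment of matched displacement vectors, and the translation solve stacks Equation (6) linearly. The sketch below is our reconstruction; it assumes the virtual point's base-frame position is available (in practice it can be eliminated by differencing pose pairs), and all names are illustrative.

```python
import numpy as np

def rotation_from_translations(d_scanner, d_base):
    """Solve R_SE in Eq. (7) from matched displacement vectors (3 x m columns)
    using the SVD-based orthogonal Procrustes / Kabsch procedure."""
    H = d_scanner @ d_base.T                            # cross-covariance of vector sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against a reflection
    return Vt.T @ D @ U.T

def translation_least_squares(R_SE, R_EB_list, T_EB_list, p_scs_list, p_base):
    """Stack Eq. (6) over the rotational poses and solve T_SE by linear least
    squares, treating the fixed virtual point p_base in the BCS as known."""
    A, b = [], []
    for R_EB, T_EB, p_scs in zip(R_EB_list, T_EB_list, p_scs_list):
        A.append(R_EB)                                  # coefficient of T_SE
        b.append(p_base - R_EB @ (R_SE @ p_scs) - T_EB)
    A, b = np.vstack(A), np.hstack(b)
    T_SE, *_ = np.linalg.lstsq(A, b, rcond=None)
    return T_SE
```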

3.1.2. Measurement Error Identification Method

To identify and minimize measurement errors at the target points, an adjustment optimization method is introduced in which prior geometric knowledge is used to construct distance and angular constraints among the target points. The constrained distances constructed by different target points and the spatial angle formed by two nonzero vectors are illustrated in Figure 3a and Figure 3b, respectively.
First, a coordinate measurement optimization model based on distance constraints is constructed to perform an initial correction of the measurement errors at the target points. Let $P^m = \left(x_1^m, y_1^m, z_1^m, \dots, x_n^m, y_n^m, z_n^m\right)$ and $P^r = \left(x_1^r, y_1^r, z_1^r, \dots, x_n^r, y_n^r, z_n^r\right)$ denote the measured and theoretical three-dimensional coordinates, respectively, of the target points on the MBC acquired by the 3D scanner. A virtual target point is then determined by computing the centroid of three target points, and distance constraints are constructed from it together with the coordinates of all points. The theoretical distance $L_{ij}^D$ between any two target points is related to their theoretical coordinates as follows:
$$\begin{cases} \left(x_1^r - x_2^r\right)^2 + \left(y_1^r - y_2^r\right)^2 + \left(z_1^r - z_2^r\right)^2 - \left(L_{12}^D\right)^2 = 0 \\ \qquad\vdots \\ \left(x_i^r - x_j^r\right)^2 + \left(y_i^r - y_j^r\right)^2 + \left(z_i^r - z_j^r\right)^2 - \left(L_{ij}^D\right)^2 = 0 \\ \qquad\vdots \\ \left(x_{n-1}^r - x_n^r\right)^2 + \left(y_{n-1}^r - y_n^r\right)^2 + \left(z_{n-1}^r - z_n^r\right)^2 - \left(L_{(n-1)n}^D\right)^2 = 0 \end{cases} \tag{8}$$
Assuming that
$$f_d = \sqrt{\left(x_i^r - x_j^r\right)^2 + \left(y_i^r - y_j^r\right)^2 + \left(z_i^r - z_j^r\right)^2}\qquad \left(i, j = 1, 2, \dots, n;\ i \neq j\right) \tag{9}$$
Performing a Taylor expansion on the above equation and neglecting second- and higher-order terms, we obtain the linearized equation of the distance constraint:
$$f_d = L_{ij}^m + \frac{\partial f_d}{\partial x_i^r}\delta x_i^m + \frac{\partial f_d}{\partial y_i^r}\delta y_i^m + \frac{\partial f_d}{\partial z_i^r}\delta z_i^m + \frac{\partial f_d}{\partial x_j^r}\delta x_j^m + \frac{\partial f_d}{\partial y_j^r}\delta y_j^m + \frac{\partial f_d}{\partial z_j^r}\delta z_j^m \tag{10}$$
where $L_{ij}^m = \sqrt{\left(x_i^m - x_j^m\right)^2 + \left(y_i^m - y_j^m\right)^2 + \left(z_i^m - z_j^m\right)^2}$ is the measured distance between the two target points; $\delta X^m = \left[\delta x_i^m, \delta y_i^m, \delta z_i^m, \delta x_j^m, \delta y_j^m, \delta z_j^m\right]^T$ is the correction vector for the on-site measurement errors of the two target points; and the partial derivatives, evaluated at the measured coordinates, are
$$\begin{aligned} \left.\frac{\partial f_d}{\partial x_i^r}\right|_0 &= \frac{x_i^m - x_j^m}{L_{ij}^m} = \frac{\Delta x_{ij}^0}{L_{ij}^m}, &\qquad \left.\frac{\partial f_d}{\partial x_j^r}\right|_0 &= \frac{x_j^m - x_i^m}{L_{ij}^m} = -\frac{\Delta x_{ij}^0}{L_{ij}^m},\\ \left.\frac{\partial f_d}{\partial y_i^r}\right|_0 &= \frac{y_i^m - y_j^m}{L_{ij}^m} = \frac{\Delta y_{ij}^0}{L_{ij}^m}, &\qquad \left.\frac{\partial f_d}{\partial y_j^r}\right|_0 &= \frac{y_j^m - y_i^m}{L_{ij}^m} = -\frac{\Delta y_{ij}^0}{L_{ij}^m},\\ \left.\frac{\partial f_d}{\partial z_i^r}\right|_0 &= \frac{z_i^m - z_j^m}{L_{ij}^m} = \frac{\Delta z_{ij}^0}{L_{ij}^m}, &\qquad \left.\frac{\partial f_d}{\partial z_j^r}\right|_0 &= \frac{z_j^m - z_i^m}{L_{ij}^m} = -\frac{\Delta z_{ij}^0}{L_{ij}^m} \end{aligned} \tag{11}$$
Further, we obtain
$$\frac{\Delta x_{ij}^0}{L_{ij}^m}\delta x_i^m + \frac{\Delta y_{ij}^0}{L_{ij}^m}\delta y_i^m + \frac{\Delta z_{ij}^0}{L_{ij}^m}\delta z_i^m - \frac{\Delta x_{ij}^0}{L_{ij}^m}\delta x_j^m - \frac{\Delta y_{ij}^0}{L_{ij}^m}\delta y_j^m - \frac{\Delta z_{ij}^0}{L_{ij}^m}\delta z_j^m + l_{ij}^D = 0 \tag{12}$$
where $l_{ij}^D = L_{ij}^m - L_{ij}^D$ is the closure error.
Thus, the matrix form of Equation (12) can be organized as follows:
$$A_d\,\delta X^m - \delta L_d = 0 \tag{13}$$
where $A_d$ is the coefficient matrix and $\delta L_d = \left[l_{12}^D, l_{13}^D, \dots, l_{(n-1)n}^D\right]^T$ is the closure error vector.
In Equation (13), there are $3n$ unknowns and $n(n-1)/2$ equations. According to the Lagrange multiplier method for conditional extrema, the objective function is obtained as
$$\Phi_d = \left(\delta X^m\right)^T P_d\,\delta X^m - 2K_d^T\left(A_d\,\delta X^m - \delta L_d\right) \tag{14}$$
where $K_d = \left[k_1^d, k_2^d, \dots, k_n^d\right]^T$ is the vector of Lagrange multipliers.
By taking the first derivative with respect to $\delta X^m$ and setting it to zero, we obtain the optimized values of the observations:
$$X_d = P^m + \delta X^m \tag{15}$$
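Equations (12)-(15) are a classical condition adjustment: minimize the weighted norm of the coordinate corrections subject to the linearized distance constraints. A compact sketch (ours; the variable names and the equal-weight default are assumptions):

```python
import numpy as np

def distance_adjustment(points_m, pairs, L_ref, weights=None):
    """Correct measured 3D points under pairwise distance constraints,
    Eqs. (12)-(15): minimize dX^T P dX subject to A dX = dL."""
    points_m = np.asarray(points_m, dtype=float)   # (n, 3) measured target points
    n = len(points_m)
    P = np.eye(3 * n) if weights is None else np.diag(weights)
    A = np.zeros((len(pairs), 3 * n))
    dL = np.zeros(len(pairs))
    for r, (i, j) in enumerate(pairs):
        diff = points_m[j] - points_m[i]
        Lm = np.linalg.norm(diff)
        u = diff / Lm                    # unit direction = Jacobian of L w.r.t. point j
        A[r, 3 * i:3 * i + 3] = -u       # partials w.r.t. point i
        A[r, 3 * j:3 * j + 3] = u        # partials w.r.t. point j
        dL[r] = L_ref[r] - Lm            # closure error: reference minus measured
    Pinv = np.linalg.inv(P)
    K = np.linalg.solve(A @ Pinv @ A.T, dL)   # Lagrange multipliers (Eq. (14))
    dX = Pinv @ A.T @ K                       # minimum-norm correction
    return points_m + dX.reshape(n, 3)
```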
Subsequently, a coordinate measurement optimization model based on angular constraints is established to further enhance the correction of measurement errors at the target points. Because the angle between two vectors in Euclidean space is independent of the coordinate system, the angles formed by the target points on the MBC, which are pre-calibrated using a high-precision CMM, are taken as the reference true values. The angular values derived from the measured coordinates on the MBC are then compared with these reference values to establish an angular error equation.
From the angle information formed by the initial measurement points on the MBC, the angle is given by the arccosine function:
$$\theta_i = \arccos\frac{\mathbf{a}\cdot\mathbf{b}}{\left\|\mathbf{a}\right\|\left\|\mathbf{b}\right\|} \tag{16}$$
where $\theta_i$ is the angle between the vectors $\mathbf{a}$ and $\mathbf{b}$.
Next, the prior angle values are selected as the reference true values, and the angular values derived from the field measurements of each target point are subtracted from them to form the angular error equation. Taking the angle $\theta_{ai}$ as an example, we apply a Taylor expansion to Equation (16) and neglect second- and higher-order terms, which yields the linearized equation for the angular constraint:
$$\hat{\theta}_{ai} = \theta_{ai}^0 + \frac{\partial\theta_{ai}}{\partial\tilde{x}_{s1}^{mD}}\Delta\tilde{x}_{s1}^{mD} + \frac{\partial\theta_{ai}}{\partial\tilde{y}_{s1}^{mD}}\Delta\tilde{y}_{s1}^{mD} + \frac{\partial\theta_{ai}}{\partial\tilde{z}_{s1}^{mD}}\Delta\tilde{z}_{s1}^{mD} + \dots + \frac{\partial\theta_{ai}}{\partial\tilde{z}_{sn}^{mD}}\Delta\tilde{z}_{sn}^{mD} \tag{17}$$
where $\Delta\tilde{x}_{si}^{mD}$, $\Delta\tilde{y}_{si}^{mD}$, and $\Delta\tilde{z}_{si}^{mD}$ represent the coordinate corrections of the initial measurement points; the nominal angle $\theta_{ai}$ is calibrated by the CMM, and $\theta_{ai}^0$ is the measured angle.
The angular error is characterized by the following mathematical expression, which quantifies the relationship between measurement variables and angular deviation:
$$v_{ai} = \hat{\theta}_{ai} - \theta_{ai}^0 \tag{18}$$
To further facilitate analysis, the above equation is reformulated into a matrix form as follows:
$$W_a = B_a\,\Delta\tilde{X}_a - d_a \tag{19}$$
where $\Delta\tilde{X}_a = \left[\Delta\tilde{x}_{s1}^{mD}, \Delta\tilde{y}_{s1}^{mD}, \dots, \Delta\tilde{z}_{sn}^{mD}\right]^T$ denotes the vector of coordinate corrections; $d_a$ represents the angular error vector; and $B_a$ is the coefficient matrix.
According to the principle of least squares adjustment, the normal equations are derived as follows:
$$B_a^T U_a B_a\,\Delta\tilde{X}_a = B_a^T U_a\,d_a \tag{20}$$
where $U_a$ is the weight matrix.
In this paper, the ridge estimation method [29] is employed to compute the optimal parameters. As a result, after compensating for the measurement errors, an optimized single virtual point $\tilde{P}_{gi}^{mS}$ is derived from the adjusted measurement points.
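A sketch of the ridge-regularized solve of Equation (20) (our illustration; in practice the regularization parameter would be chosen, for example, by the L-curve criterion of [29]):

```python
import numpy as np

def ridge_solve(B, d, U=None, lam=1e-6):
    """Solve the angular-constraint normal equations (Eq. (20)) with a ridge
    term that stabilizes an ill-conditioned B^T U B."""
    U = np.eye(B.shape[0]) if U is None else U
    N = B.T @ U @ B + lam * np.eye(B.shape[1])  # ridge term damps small singular values
    return np.linalg.solve(N, B.T @ U @ d)
```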

3.2. Accurate Calibration of Robot Kinematics and Hand-Eye Parameters

Discrepancies between the theoretical and actual kinematic parameters of the robot lead to deviations in the end-effector’s pose. Additionally, the robot motion involved in the initial hand-eye calibration process introduces inevitable positioning errors, further compromising calibration accuracy. Measurement errors from the 3D scanner also impose constraints on improving calibration precision. To address these issues, this section presents a joint calibration method based on a stereo target, aiming to simultaneously compensate for both hand-eye and kinematic parameter errors. Specifically, a distance error model is derived for associating the parameter errors with the deviations in the measured coordinates of the single virtual point. By reducing the impact of 3D scanner measurement noise and kinematic deviations, the proposed approach significantly enhances the calibration accuracy of the integrated robot-scanner system.
Assume that $P_c^{mS} = \left[x_c^{mS}, y_c^{mS}, z_c^{mS}, 1\right]^T$ represents the homogeneous coordinates of a measurement point in the SCS; the corresponding measurement result in the BCS is then
$$P_{ci}^{mB} = H_{Ei}^{B} H_S^{E} P_c^{mS} \tag{21}$$
where $H_{Ei}^{B}$ and $H_S^{E}$ represent the theoretical values of the robot end-effector pose matrix and the hand-eye transformation matrix, respectively.
Considering the influences of hand-eye calibration errors, robotic kinematic parameter deviations, and 3D scanner measurement errors, the actual position of the measurement point in the BCS can be expressed as
$$P_{ci}^{rB} = \left(H_{Ei}^{B} + \Delta H_{Ei}^{B}\right)\left(H_S^{E} + \Delta H_S^{E}\right)\left(P_c^{mS} + \Delta P_c^{mS}\right) \tag{22}$$
where $\Delta P_c^{mS} = \left[\Delta x_c^{mS}, \Delta y_c^{mS}, \Delta z_c^{mS}, 0\right]^T$ represents the 3D scanner measurement error, $\Delta H_{Ei}^{B}$ denotes the robotic end-effector pose error, and $\Delta H_S^{E}$ corresponds to the hand-eye calibration error.
By subtracting Equation (21) from Equation (22) and neglecting second- and higher-order terms, the deviation between the actual and measured values of the point in the BCS is obtained:
$$dP_c^{B} = H_{Ei}^{B} H_S^{E}\,\Delta P_c^{mS} + H_{Ei}^{B}\,\Delta H_S^{E}\,P_c^{\prime mS} + \Delta H_{Ei}^{B} H_S^{E}\,P_c^{\prime mS} \tag{23}$$
where $P_c^{\prime mS} = P_c^{mS} + \Delta P_c^{mS}$ denotes the corrected coordinates of the scanned measurement point. The first term on the right-hand side can be expressed as
$$H_{Ei}^{B} H_S^{E}\,\Delta P_c^{mS} = H_{Ei}^{B} H_S^{E}\left[\Delta x_c^{mS}, \Delta y_c^{mS}, \Delta z_c^{mS}, 0\right]^T = L_i^{S}\,\Delta T_c^{mS} \tag{24}$$
According to differential kinematics, the hand-eye relationship error can be expressed as
$$\Delta H_S^{E} = H_S^{E}\,\delta H_S^{E} \tag{25}$$
where $\delta H_S^{E}$ is the differential operator.
Thus, the second term on the right-hand side of Equation (23) can be simplified as
$$H_{Ei}^{B}\,\Delta H_S^{E}\,P_c^{\prime mS} = H_{Ei}^{B} H_S^{E}\,\delta H_S^{E}\,P_c^{\prime mS} = M_i\,\Delta S \tag{26}$$
Similarly, the error model for the robot end-effector pose can be expressed as
$$\Delta H_{Ei}^{B} H_S^{E}\,P_c^{\prime mS} = N_i\,\Delta E_i = N_i J_i\,\Delta X \tag{27}$$
where $P_c^{\prime mB} = H_{Ei}^{B} H_S^{E} P_c^{\prime mS}$ represents the coordinates of the corrected scan measurement point transformed into the robot base coordinate system, and $\Delta E_i = J_i\,\Delta X$ is the robot end-effector pose error model from Equation (4).
By neglecting second- and higher-order terms, the hand-eye relationship model incorporating the 3D scanner measurement errors and the robot kinematic parameter errors is obtained as follows:
$$dP_c^{B} = L_i^{S}\,\Delta T_c^{mS} + M_i\,\Delta S + N_i J_i\,\Delta X = L_i^{S}\,\Delta T_c^{mS} + G_i\,\Delta K \tag{28}$$
where $G_i = \left[M_i\ \ N_i J_i\right]$ is the coefficient matrix corresponding to the $i$-th calibration pose of the robot, and $\Delta K = \left[\Delta S\ \ \Delta X\right]^T$ is the vector of system parameter errors, comprising the hand-eye parameter errors and the robot kinematic parameter errors.
Building upon the preceding research, a system parameter identification model is further developed based on distance error analysis. In Euclidean space, the theoretical distance between two measured points should remain invariant across different coordinate systems. However, discrepancies arise between the theoretical and measured distances of two points in the BCS due to 3D scanner measurement errors, inaccuracies in hand-eye parameters, and deviations in robot kinematic parameters—collectively referred to as distance errors. In this paper, the distance error is defined as the difference between the calibrated distance and the measured distance between the center points of two spheres on the stereo target, as illustrated in Figure 4.
Let $P_{ca}^{rB} = \left[x_{ca}^{rB}, y_{ca}^{rB}, z_{ca}^{rB}\right]$ and $P_{cb}^{rB} = \left[x_{cb}^{rB}, y_{cb}^{rB}, z_{cb}^{rB}\right]$ represent the actual coordinates of the sphere centers $a$ and $b$ on the stereo target in the BCS, and let $P_{ca}^{mB} = \left[x_{ca}^{mB}, y_{ca}^{mB}, z_{ca}^{mB}\right]$ and $P_{cb}^{mB} = \left[x_{cb}^{mB}, y_{cb}^{mB}, z_{cb}^{mB}\right]$ represent the corresponding measured coordinates. Similarly, let $\mathbf{l}_{ab}^{rB}$ be the actual distance vector between the two points and $\mathbf{l}_{ab}^{mB}$ be the measured distance vector. The error vectors between the measured and actual values of the two points are denoted as $dP_{ca}^{B}$ and $dP_{cb}^{B}$, respectively. Then, we have
$$\begin{cases} \mathbf{l}_{ab}^{rB} = P_{cb}^{rB} - P_{ca}^{rB} \\ \mathbf{l}_{ab}^{mB} = P_{cb}^{mB} - P_{ca}^{mB} \\ dP_{ca}^{B} = P_{ca}^{rB} - P_{ca}^{mB} \\ dP_{cb}^{B} = P_{cb}^{rB} - P_{cb}^{mB} \end{cases} \tag{29}$$
Thus, the distance error $\Delta l_{ab}$ between the two points can be expressed as
$$\Delta l_{ab} = \left\|\mathbf{l}_{ab}^{rB}\right\| - \left\|\mathbf{l}_{ab}^{mB}\right\|, \qquad \Delta\mathbf{l}_{ab} = \mathbf{l}_{ab}^{rB} - \mathbf{l}_{ab}^{mB} \tag{30}$$
where $\Delta\mathbf{l}_{ab}$ is the distance error vector.
Then, we obtain
$$\Delta l_{ab} = \frac{\left(P_{cb}^{mB} - P_{ca}^{mB}\right)^T}{\left\|\mathbf{l}_{ab}^{mB}\right\|}\left(dP_{cb}^{B} - dP_{ca}^{B}\right) \tag{31}$$
Substituting Equation (28) into the above expression yields:
$$\Delta l_{ab} \approx \frac{\left(P_{cb}^{mB} - P_{ca}^{mB}\right)^T}{\left\|\mathbf{l}_{ab}^{mB}\right\|}\left(L_b^{S}\,\Delta T_{cb}^{mS} - L_a^{S}\,\Delta T_{ca}^{mS}\right) + \underbrace{\frac{\left(P_{cb}^{mB} - P_{ca}^{mB}\right)^T}{\left\|\mathbf{l}_{ab}^{mB}\right\|}\left(G_b - G_a\right)}_{Q}\,\Delta K \tag{32}$$
To refine the kinematic model and hand-eye transformation matrix for improved end-effector scanning accuracy, the position of the stereo target was varied, and the above process was repeated multiple times to establish a system of linear equations. The Levenberg-Marquardt (L-M) algorithm was then employed to solve for system parameter errors, resulting in a more accurate representation of the system.
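To illustrate the identification step, the sketch below stacks one linearized distance residual per sphere pair and target placement (Equation (32)) and solves for the parameter error vector with SciPy's Levenberg-Marquardt driver. Assembling the rows of Q from the measured geometry is assumed to happen elsewhere; the names and shapes are ours.

```python
import numpy as np
from scipy.optimize import least_squares

def stacked_residuals(delta_k, Q_rows, dist_errors):
    """Residuals of the stacked distance-error model, Eq. (32):
    observed distance errors minus the model prediction Q @ delta_k."""
    return dist_errors - Q_rows @ delta_k

def identify_parameters(Q_rows, dist_errors):
    """Q_rows: (m, p) coefficient rows stacked over all sphere pairs and target
    placements; dist_errors: (m,); delta_k packs hand-eye and kinematic errors."""
    x0 = np.zeros(Q_rows.shape[1])
    sol = least_squares(stacked_residuals, x0, args=(Q_rows, dist_errors),
                        method="lm")   # Levenberg-Marquardt, as used in the paper
    return sol.x
```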

4. Experiments and Discussion

To verify the effectiveness of the proposed calibration method, a robot-scanner experimental system was established, and corresponding validation experiments were conducted. As illustrated in Figure 5, the system mainly comprises a KUKA KR industrial robot (KUKA Robotics, Augsburg, Germany) with a repeatability of 0.04 mm and a structured light 3D scanner, an LMI Gocator 3210 binocular fringe projection system (LMI Technologies Inc., Burnaby, BC, Canada), whose key performance specifications are listed in Table 1. The calibration experiments are presented in Section 4.1, while Section 4.2 assesses the calibration accuracy by measuring a metric artifact in accordance with VDI/VDE 2634 Part 3 [30]. All experiments were conducted in a stable laboratory environment, with the indoor temperature varying between 22 °C and 23 °C and the relative humidity ranging from 55% to 60%.
For precise calibration and performance evaluation, a customized multidimensional ball-based calibrator (MBC) with a length of 1000 mm was designed specifically for 3D scanner measurements. As shown in Figure 5, the calibrator is made of carbon fiber reinforced polymer (CFRP) and integrates eight matte stainless steel balls (MSBs). Each MSB has a diameter of 12.70 mm and a roundness deviation of less than 0.002 mm. The spatial relationships among all balls were pre-calibrated using a Zeiss PRISMO coordinate measuring machine (Carl Zeiss Industrielle Messtechnik GmbH, Oberkochen, Germany), a high-precision CMM widely used in industrial metrology. In this study, the maximum inter-sphere distance is 1006.863 mm. According to the manufacturer, the machine provides a length measurement error (MPE) of
$$E_0 = 0.9 + L/350\ \mu\mathrm{m}$$
where L is the measured length in millimeters. For our typical inter-sphere distances, this corresponds to a measurement error below 1.5 μm. A Type A uncertainty analysis based on repeated measurements, combined with Type B estimates (instrument specs and environmental control), leads to a conservative combined uncertainty of ±2 μm (k = 2) for each sphere center distance. These uncertainty estimates provide a reliable reference for evaluating the calibration accuracy of the proposed method.
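As a quick numerical check of the quoted MPE formula (the 200 mm spacing below is an illustrative value, not a distance reported in the paper):

```python
def prismo_mpe_um(length_mm):
    """Length-measurement MPE of the CMM from the quoted formula, in micrometers."""
    return 0.9 + length_mm / 350.0

print(prismo_mpe_um(200.0))  # ~1.47 um for a typical inter-sphere distance
```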

4.1. Calibration Experiments

The accuracy of hand-eye calibration is known to be highly sensitive to the spatial distribution and diversity of the robot poses used during the calibration process. To ensure robust and accurate hand-eye calibration, we adopt the following pose selection strategy: First, we generate at least 12–15 poses with varied orientations and positions covering a large portion of the robot’s reachable workspace. Then, the selected poses avoid alignment along a single axis (collinearity) or within a single plane, thereby increasing the observability of rotational and translational parameters. Furthermore, each pose ensures that the 3D scanner maintains a consistent view of the calibration target, avoiding occlusion and maintaining sufficient overlap for accurate point cloud registration. Specifically, the range of end-effector orientations is selected to include rotations about different axes, and translational displacements cover different spatial quadrants.
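One simple way to screen candidate pose sets against the collinearity and coplanarity conditions above is to inspect the singular values of the centered end-effector position matrix. The sketch below is our illustration; it checks positions only, and orientation diversity would need an analogous test on the rotations.

```python
import numpy as np

def pose_positions_span_3d(positions, tol=1e-3):
    """Reject pose sets whose end-effector positions are nearly collinear or
    coplanar: all three singular values of the centered position matrix must
    be well above tol relative to the largest one."""
    P = np.asarray(positions, dtype=float)  # (k, 3) end-effector positions, k >= 4
    s = np.linalg.svd(P - P.mean(axis=0), compute_uv=False)
    return s.size == 3 and s[-1] > tol * s[0]
```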
To validate the effectiveness of this strategy, we conducted a comparative experiment in which three different pose configurations were tested, as shown in Figure 6. Specifically, we compare the radius error of the 0.5-inch MSBs obtained using (i) collinear poses, (ii) planar poses, and (iii) spatially distributed poses under otherwise identical conditions. The experimental steps are as follows: First, the hand-eye transformation matrix was estimated under the three pose configurations using a calibration method based on a single sphere [18]. The resulting point cloud data of the measured MSB were then transformed into the robot coordinate frame. The radius of the MSB was reconstructed and compared with its pre-calibrated reference value to compute the radius error for each configuration, as summarized in Table 2. These results confirm that greater pose diversity significantly improves calibration accuracy and reduces numerical instability.
Based on the comprehensive analysis of robot pose diversity and its influence on calibration accuracy, we proceeded to apply the proposed calibration method using the optimized poses. The following section details the setup, procedure, and results of the calibration experiments conducted to validate the effectiveness and robustness of our method. First, an initial hand-eye calibration experiment was conducted following the method introduced in Section 3.1. The multidimensional ball-based calibrator was positioned within the robot’s workspace, and the robot was programmed to execute six translational movements and six orientation changes. To prevent singularities, the end-effector was translated along the X, Y, and Z axes of the robot’s base coordinate frame during the six translational movements. At each pose, the 3D scanner performed multiple measurements of the target spheres on the stereo target. Using the first six sets of measurements, the rotation matrix $R_S^E$ was computed according to Equation (7). Subsequently, the translation vector $T_S^E$ was determined from the remaining six sets of data, yielding the initial estimate of the hand-eye transformation matrix.
Subsequently, a high-precision hand-eye calibration experiment was performed. The MBC, mounted on a tripod, was successively placed at eight distinct vertical positions within the workspace. At each position, the robot’s end-effector executed eight unique orientation changes. A 3D scanner was employed to capture the target points and extract the coordinates of the sphere centers. Based on these measurements, a system parameter error identification model was constructed to estimate the kinematic parameter errors, as presented in Table 3. The refined hand-eye transformation matrix was then accurately determined through this high-precision calibration process, as shown below:
$$H_S^{E} = \begin{bmatrix} 0.0171 & 0.9998 & 0.0053 & 1.2874 \\ 0.9999 & 0.0173 & 0.0022 & 0.5842 \\ 0.0003 & 0.0056 & 1.0001 & 264.7846 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
The entire calibration procedure is highly automated and completes within 8 min, enabling efficient on-site recalibration of the system when necessary, without interrupting the standard workflow.

4.2. Accuracy Evaluation of the Calibration

To evaluate the calibration accuracy, a standard scale was utilized, as shown in Figure 7. The scale primarily consists of several homogeneous, precision-grade ceramic spheres of nominally identical diameter. Each standard ceramic sphere has a diameter of 10.010 mm and a roundness deviation of 0.001 mm. The center-to-center distance between two selected spheres was pre-calibrated using a coordinate measuring machine (CMM) [30], yielding D15 = 299.546 mm.
The errors between the measured and actual distances of the sphere centers on the standard scale were then computed at ten different spatial positions, using the system parameters obtained both before and after calibration. The corresponding results are presented in Table 4. Following calibration, the maximum error (MPE) and mean error (ME) were reduced from 1.053 mm and 0.814 mm to 0.421 mm and 0.373 mm, respectively. These results satisfy the accuracy requirements for scanning critical and hard-to-reach features, thereby confirming the reliability and effectiveness of the proposed method.
To better understand the accuracy and robustness of the proposed calibration method, we performed a qualitative analysis of the major sources of error involved in the hand-eye and kinematic parameter calibration process. These error sources fall into three categories: (i) robot pose repeatability and mechanical uncertainty. Although the KUKA industrial robot employed in this study offers a high repeatability of ±0.04 mm, it is not immune to small deviations caused by joint backlash, compliance, or control jitter. These minor inconsistencies directly affect the transformation between the robot base and end-effector and propagate into the final calibration results. (ii) 3D scanner measurement error. The LMI Gocator 3 series structured-light scanner used in this study has a nominal accuracy better than 0.035 mm. However, in practical use, factors such as surface reflectivity, ambient lighting, and sensor-to-target distance may introduce fluctuations in the measured point cloud, resulting in uncertainty in the extracted geometric features. (iii) Pose selection and geometric observability.
In summary, while sensor noise and sphere fitting errors primarily affect the input data quality, the pose distribution and optimization formulation govern the observability and convergence of the calibration. These factors jointly determine the final calibration accuracy.
To further validate the effectiveness of the proposed calibration method, mean sphere spacing errors at multiple positions were evaluated using the methods proposed by Ren [18], Mu [25], and this study, respectively, as shown in Figure 8. When using Ren’s method, the mean sphere spacing errors at positions 1 to 3 were 0.627 mm, 0.598 mm, and 0.601 mm, respectively. In comparison, Mu’s method yielded average errors of 0.536 mm, 0.501 mm, and 0.457 mm for the same positions. By contrast, the proposed method achieved lower mean errors of 0.385 mm, 0.322 mm, and 0.307 mm at positions 1 to 3. These experimental results demonstrate the superior accuracy of the calibration technique developed in this study.

5. Conclusions

In this study, a flexible measurement system integrating a fringe-projection 3D scanner with an industrial robot was developed to enable efficient measurement of large-scale surfaces. To improve overall measurement accuracy, we proposed an enhanced calibration method employing a multidimensional ball-based calibrator to simultaneously compensate for hand-eye transformation and robot kinematic errors. Calibration and validation experiments were carried out, and the results indicate that the maximum and average measurement errors were reduced from 1.053 mm and 0.814 mm to 0.421 mm and 0.373 mm, respectively, thereby confirming the effectiveness of the proposed method. The entire calibration workflow is highly automated and can be executed in less than 8 min. This facilitates practical on-site recalibration during system deployment.
Future research will further investigate the pose selection strategy, including experimental evaluations to assess how pose diversity influences calibration accuracy. In addition, a detailed analysis of potential error sources will be conducted. Efforts will also be made to integrate the current system with an automated guided vehicle (AGV) platform, enabling initial measurement trials on large-scale industrial parts. Particular emphasis will be placed on quantitatively validating the measurement accuracy for these large components under real-world conditions.

Author Contributions

Conceptualization, Z.Z.; methodology, Z.Z. and X.S.; software, J.S.; validation, Z.Z., Y.L. and X.Z.; formal analysis, Z.Z. and D.Z.; investigation, Z.Z. and H.L.; resources, Z.Z.; writing—original draft preparation, Z.Z.; writing—review and editing, X.S.; visualization, X.Z.; supervision, X.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Science Foundation of Shandong Province (Grant Nos. ZR2024QE152 and ZR2024QE386), the Science and Technology SMEs Innovation Ability Improvement Project of Shandong Province (Grant Nos. 2024TSGC0829 and 2023TSGC0459), the Scientific Research of Linyi University (Grant No. Z6124007), and the Youth Entrepreneurship Technology Support Program for Higher Education Institutions of Shandong Province (Grant No. 2023KJ215).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding authors.

Acknowledgments

During the preparation of this study, the authors used ChatGPT 4.0 solely for language editing and formatting of some sentences, without involvement in core ideas, data analysis, conclusions, or scientific writing. The authors have reviewed and edited the output and take full responsibility for the content of this publication. Furthermore, we would like to express our gratitude to Liu Wei from the School of Mechanical Engineering at Dalian University of Technology for his guidance.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Du, P.; Duan, Z.; Zhang, J.; Zhao, W.; Lai, E. The design and implementation of a dynamic measurement system for a large gear rotation angle based on an extended visual field. Sensors 2025, 25, 3576. [Google Scholar] [CrossRef]
  2. Sun, B.; Zhu, J.; Yang, L.; Yang, S.; Guo, Y. Sensor for in-motion continuous 3D shape measurement based on dual line-scan cameras. Sensors 2016, 16, 1949. [Google Scholar] [CrossRef]
  3. Du, H.; Chen, X.; Xi, J.; Yu, C.; Zhao, B. Development and verification of a novel robot-integrated fringe projection 3D scanning system for large-scale metrology. Sensors 2017, 17, 2886. [Google Scholar] [CrossRef]
  4. Yau, H.T.; Menq, C.H. An automated dimensional inspection environment for manufactured parts using coordinate measuring machines. Int. J. Prod. Res. 2010, 30, 1517–1536. [Google Scholar] [CrossRef]
  5. Van der Jeught, S.; Dirckx, J.J.J. Real-time structured light profilometry: A review. Opt. Lasers Eng. 2016, 87, 18–31. [Google Scholar] [CrossRef]
  6. Wu, D.; Chen, T.; Li, A. A high precision approach to calibrate a structured light vision sensor in a robot-based three-dimensional measurement system. Sensors 2016, 16, 1388. [Google Scholar] [CrossRef]
  7. Atif, M.; Lee, S. FPGA based adaptive rate and manifold pattern projection for structured light 3D camera system. Sensors 2018, 18, 1139. [Google Scholar] [CrossRef]
  8. Bi, C.; Fang, J.; Li, K.; Guo, Z. Extrinsic calibration of a laser displacement sensor in a non-contact coordinate measuring machine. Chin. J. Aeronaut. 2017, 30, 1528–1537. [Google Scholar] [CrossRef]
  9. Wagner, M.; Heß, P.; Reitelshöfer, S.; Franke, J. Self-calibration method for a robotic based 3D scanning system. In Proceedings of the 2015 IEEE 20th Conference on Emerging Technologies Factory Automation (ETFA), Luxembourg, 8–11 September 2015; pp. 1–6. [Google Scholar]
  10. Idrobo-Pizo, G.A.; Motta, J.M.S.T.; Sampaio, R.C. A Calibration Method for a Laser Triangulation Scanner Mounted on a Robot Arm for Surface Mapping. Sensors 2019, 19, 1783. [Google Scholar] [CrossRef]
  11. Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX=XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29. [Google Scholar] [CrossRef]
  12. Zhuang, H.; Roth, Z.S.; Sudhakar, R. Simultaneous robot/world and tool/flange calibration by solving homogeneous transformation equations of the form AX=YB. IEEE Trans. Robot. Autom. 1994, 10, 549–554. [Google Scholar] [CrossRef]
  13. Dornaika, F.; Horaud, R. Simultaneous robot-world and hand-eye calibration. IEEE Trans Robot. Autom. 1998, 14, 617–622. [Google Scholar] [CrossRef]
  14. Sung, H.; Lee, S.; Kim, D. A robot-camera hand/eye self-calibration system using a planar target. In Proceedings of the IEEE International Symposium on Robotics, Seoul, Republic of Korea, 24–26 October 2013. [Google Scholar]
  15. Wang, Z.; Fan, J.; Jing, F.; Deng, S.; Zheng, M.; Tan, M. An efficient calibration method of line structured light vision sensor in robotic eye-in-hand system. IEEE Sens. J. 2020, 20, 6200–6208. [Google Scholar] [CrossRef]
  16. Pavlovčič, U.; Arko, P.; Jezeršek, M. Simultaneous hand-eye and intrinsic calibration of a laser profilometer mounted on a robot arm. Sensors 2021, 21, 1037. [Google Scholar] [CrossRef]
  17. Xu, H.; Wang, Y.; Wei, C. A self-calibration approach to hand-eye relation using a single point. In Proceedings of the International Conference on Information and Automation, Changsha, China, 20–23 June 2008. [Google Scholar]
  18. Ren, Y.; Yin, S.B.; Zhu, J. Calibration technology in application of robot-laser scanning system. Opt. Eng. 2012, 51, 114204. [Google Scholar] [CrossRef]
  19. Liu, S.; Wang, G. Simultaneous calibration of camera and hand eye in laser vision robot welding. J. South China Univ. Technol. 2008, 36, 74–77. [Google Scholar]
  20. Ma, S.D. A self-calibration technique for active vision systems. IEEE Trans. Robot. Autom. 1996, 12, 114–120. [Google Scholar] [CrossRef]
  21. Qi, Y.; Jing, F.; Tan, M. Line-feature-based calibration method of structured light plane parameters for robot hand-eye system. Opt. Eng. 2013, 52, 7202. [Google Scholar] [CrossRef]
  22. Song, Z.; Sun, C.L.; Sun, Y.Q.; Qi, L. Robotic hand-eye calibration method using arbitrary targets based on refined two-step registration. Sensors 2025, 25, 2976. [Google Scholar] [CrossRef]
  23. Yin, S.; Ren, Y.; Guo, Y.; Zhu, J.; Yang, S.; Ye, S. Development and calibration of an integrated 3D scanning system for high-accuracy large-scale metrology. Measurement 2014, 54, 65–76. [Google Scholar] [CrossRef]
  24. Li, A.; Ma, Z. Calibration for robot-based measuring system. Control. Theory Appl. 2010, 27, 663–667. [Google Scholar]
  25. Mu, N.; Wang, K.; Xie, Z. Calibration of a flexible measurement system based on industrial articulated robot and structured light sensor. Opt. Eng. 2017, 56, 054103. [Google Scholar] [CrossRef]
  26. Duan, S.Y.; Zhou, Z.L.; Sun, X.M.; Liu, S.J.; Zhang, D.B.; Shangguan, J.Y. Enhanced calibration method for robotic flexible 3D scanning system. In Proceedings of the 40th Annual Youth Academic Conference of Chinese Association of Automation (YAC), Zhengzhou, China, 17–19 May 2025. [Google Scholar]
  27. Hayati, S.; Mirmirani, M. Improving the absolute positioning accuracy of robot manipulators. J. Field Robot. 1985, 2, 397–413. [Google Scholar] [CrossRef]
  28. Paul, R.P. Robot Manipulators: Mathematics, Programming and Control, 1st ed.; The MIT Press: Cambridge, MA, USA; London, UK, 1981; pp. 85–118. [Google Scholar]
  29. Hansen, P.C. Analysis of discrete ill-posed problems by means of the L-curve. SIAM Rev. 1992, 34, 561–580. [Google Scholar] [CrossRef]
  30. VDI/VDE Innovation + Technik GmbH. Optical 3D-Measuring Systems-Part 3: Multiple View Systems Based on Area Scanning; VDI/VDE: Berlin, Germany, 2008. [Google Scholar]
Figure 1. Schematic of robotic flexible 3D scanning system.
Figure 2. Diagram of the single virtual point.
Figure 3. Diagram of prior geometric knowledge: (a) diagram of constrained distances constructed by target points and (b) diagram of a spatial angle formed by two nonzero vectors.
Figure 4. Schematic of distance error.
Figure 5. Proposed flexible 3D scanning system.
Figure 6. Experiment on the impact of robot posture on calibration accuracy.
Figure 7. Standard scale used for calibration accuracy verification.
Figure 8. Mean sphere spacing errors at different locations.
Table 1. Main parameters of the LMI 3D scanner.

Parameter                        Value
Field of view (mm)               71 × 98 to 100 × 154
Measuring distance (mm)          165
Scanning speed (Hz)              4
Resolution in XY direction (µm)  60–90
VDE accuracy (mm)                0.035
Optical source                   Blue LED light
Table 2. Measurement result of MSB (unit: mm).

Pose Configuration           Average Radius   Average Radius Error
Collinear poses              7.112            0.756
Planar poses                 6.993            0.637
Spatially distributed poses  6.397            0.412
Table 3. Identification results of robot parameter errors.

Link No.   Δθ_i      Δd_i/mm   Δa_i/mm   Δα_i      Δβ_i
1          0.0206    0.4539    0.1428    0.0074
2          −0.0169   −0.2632   0.0022    0.0079
3          0.0309    −0.2168   0.0508    −0.0127
4          −0.0407   0.0687    0.0268    0.0014
5          0.0274    −0.0832   −0.0034   −0.0137
6          −0.0237   0.1041    0.0378    0.0293
Table 4. Sphere spacing errors before and after calibration (unit: mm).

No.      1      2      3      4      5      6      7      8      9      10     MPE    ME
Before   0.762  0.901  0.826  0.775  0.774  1.053  0.775  0.813  0.728  0.735  1.053  0.814
After    0.337  0.407  0.417  0.384  0.342  0.421  0.363  0.374  0.351  0.341  0.421  0.373
