Article

Position-Constrained Calibration Compensation for Hand–Eye Calibration in Industrial Robots

Department of Mechanical Engineering, Tsinghua University, Beijing 100084, China
* Authors to whom correspondence should be addressed.
Sensors 2024, 24(23), 7554; https://doi.org/10.3390/s24237554
Submission received: 13 September 2024 / Revised: 13 November 2024 / Accepted: 19 November 2024 / Published: 26 November 2024
(This article belongs to the Section Optical Sensors)

Abstract

The hand–eye calibration of laser profilers and industrial robots is a critical component of the laser vision system in welding applications. To improve calibration accuracy and efficiency, this study proposes a position-constrained calibration compensation algorithm aimed at optimizing the hand–eye transformation matrix. Initially, the laser profiler is mounted on the robot and used to scan a standard sphere from various poses to obtain the theoretical center coordinates of the sphere, which are then utilized to compute the hand–eye transformation matrix. Subsequently, the positional data of the standard sphere’s surface are collected at different poses using the welding gun tip mounted on the robot, allowing for the fitting of the sphere’s center coordinates as calibration values. Finally, by minimizing the error between the theoretical and calibrated sphere center coordinates, the optimal hand–eye transformation matrix is derived. Experimental results demonstrate that, following error compensation, the average distance error in hand–eye calibration decreased from 4.5731 mm to 0.7069 mm, indicating that the proposed calibration method is both reliable and effective.

1. Introduction

Robotic welding technology is essential in modern manufacturing, particularly in the automotive, shipbuilding, and pressure vessel industries, where automated welding is in high demand [1,2,3]. The line-structured light sensor, valued for its non-contact measurement, high speed, accuracy, and stability, has emerged as the preferred tool for automated robotic welding [4,5]. A prevalent measurement technique involves affixing a line-structured laser sensor to the end effector of a robot. This method capitalizes on the robot’s inherent flexibility, allowing for a wider range of measurements than a sensor that is stationary [6].
Upon the initial installation of the sensor on the robot flange, it is essential to ascertain the relative pose relationship between the sensor and the robot’s end effector, a process referred to as hand–eye calibration [7]. The accuracy of hand–eye calibration directly impacts the precision of the sensor’s measurements in the robot’s coordinate system, which is crucial for high-precision tasks such as weld seam detection and tracking. The standard approach to hand–eye calibration is to formulate a matrix equation derived from the robot’s kinematic model, employing calibration objects as constraints to determine the transformation matrix that relates the sensor coordinate system to the end effector [8,9,10,11,12,13]. However, during hand–eye calibration, errors accumulate from sensor calibration errors, geometric and motion errors of the robotic arm, and deformation caused by load gravity. The calibration scheme therefore requires optimization.
The accuracy of hand–eye calibration directly impacts the precision of the entire robotic vision system. This challenge has led many experts and scholars to explore algorithms and models that effectively improve the accuracy of robot hand–eye calibration. Shiu and Ahmad [14] first cast hand–eye calibration as the problem of solving for the rotation matrix in the homogeneous transform equation AX = XB. However, since the line-structured light sensor outputs two-dimensional point cloud data, Huang et al. [15] and Yang et al. [16] used the center of a stationary standard sphere as a constraint to establish and solve a matrix equation of the form AX = B instead of AX = XB. An et al. [17] proposed a combined optimization method to address the impact of random errors during hand–eye calibration and applied it to the hand–eye calibration process. However, this method did not consider the motion and geometric errors of the robotic arm, nor did it include the installation of an end effector. Cao et al. [18] proposed a simultaneous calibration method for the robot hand–eye relationship and kinematics using a line-structured light sensor. They utilized two standard spheres with known center distances for calibration, reducing the average distance error from 3.967 mm to 1.001 mm. Still, this method did not account for non-kinematic errors caused by load, and the calibration model and process were complex and inefficient. Furthermore, existing hand–eye calibration algorithms do not account for cumulative errors, rendering them inadequate for weld seam detection requirements.
To address these challenges, this paper proposes a compensation algorithm based on position constraints to optimize the calibration model parameters. An experimental platform is constructed to verify the correctness and effectiveness of the proposed methods. This paper is organized as follows: Section 2 introduces the principles of hand–eye calibration and error compensation techniques. Section 3 describes the industrial robot measurement system and presents the experimental results, including a comprehensive calibration process, error compensation procedures, and verification of the optimized hand–eye transformation matrix. Finally, Section 4 provides a summary of the paper and presents the concluding remarks.

2. Hand–Eye Calibration of the Line-Structured Light Sensor

Using the 3D measurement data from the line-structured light sensor, the trajectory of the robotic arm equipped with a welding torch can be planned to achieve automated welding. The key to this technology is accurately calibrating the rigid body transformation from the sensor coordinate system $O_C X_C Y_C Z_C$ to the robotic arm tool coordinate system $O_{TO} X_{TO} Y_{TO} Z_{TO}$, i.e., calibrating the hand–eye matrices of the sensor used on the industrial robot.

2.1. Principle of the Hand–Eye Calibration Algorithm Based on Spherical Constraints

For an automated welding system, hand–eye calibration is the process of calibrating the relative coordinate transformation between the welding torch tip and the camera. A standard sphere fixed in the robot’s workspace is used as the calibration target. When the laser plane emitted by the sensor intersects the surface of the standard sphere, the intersection is a circle, and the visible laser stripe forms a circular arc, as shown in Figure 1.
Since the circle’s center $C_c$ lies in the light plane, its coordinates in the $O_C X_C Y_C Z_C$ system can be represented as $(x_{cc}, 0, z_{cc})$. By extracting the arc points and fitting the circle, the values of $x_{cc}$, $z_{cc}$, and the radius $r$ can be obtained. Using the Pythagorean theorem and considering the geometric properties of the sphere, the coordinates of the sphere’s center $C_s$ in the $O_C X_C Y_C Z_C$ system can be expressed as $(x_{cs}, y_{cs}, z_{cs})$ as follows [17]:

$$x_{cs} = x_{cc}, \qquad y_{cs} = \pm\sqrt{\rho^{2} - r^{2}}, \qquad z_{cs} = z_{cc} \tag{1}$$

Here, $\rho$ is the radius of the sphere, and the sign of $y_{cs}$ is determined by the relative position of the line-structured light sensor and the standard sphere.
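As an illustration of Equation (1), the short Python sketch below recovers the sphere center in the sensor frame from a circle already fitted to the laser arc; the function name and the example numbers are hypothetical, and NumPy is assumed.

```python
import numpy as np

def sphere_center_in_sensor_frame(x_cc, z_cc, r, rho, y_sign=1.0):
    """Sphere center in the sensor frame O_C from a fitted laser-arc circle (Eq. (1)).

    x_cc, z_cc : center of the circle fitted to the arc in the light plane (y = 0)
    r          : radius of the fitted circle
    rho        : known radius of the standard sphere
    y_sign     : +1 or -1, chosen from the relative sensor/sphere position
    """
    y_cs = y_sign * np.sqrt(rho**2 - r**2)
    return np.array([x_cc, y_cs, z_cc])

# Example with made-up numbers: a 20 mm sphere whose fitted arc circle has
# radius 15 mm and center (x, z) = (2.0, 180.0) in the light plane.
print(sphere_center_in_sensor_frame(2.0, 180.0, 15.0, 20.0))
```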
By keeping the sphere’s center fixed and changing the sensor’s position and orientation multiple times, the coordinates of the sphere center in the $O_C X_C Y_C Z_C$ system are obtained as $P_{cs}^{k} = (x_{cs}^{k}, y_{cs}^{k}, z_{cs}^{k})$, where $k$ denotes the $k$-th measurement state of the robotic arm at different positions and orientations. According to the rigid body transformation, the relationship between the sphere center’s position $P_{fb}$ in the robotic arm base coordinate system $O_B X_B Y_B Z_B$ and $P_{cs}^{k}$ is given as follows:

$$\begin{bmatrix} P_{fb} \\ 1 \end{bmatrix} = \begin{bmatrix} R_{b}^{k} & T_{b}^{k} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{to} & T_{to} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} P_{cs}^{k} \\ 1 \end{bmatrix} \tag{2}$$

Here, $R_{b}^{k}$ (3 × 3) and $T_{b}^{k}$ (3 × 1) are determined by the robotic arm’s calibration algorithm and are considered known. $R_{to}$ (3 × 3) and $T_{to}$ (3 × 1) are the hand–eye matrices to be solved.
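A minimal sketch of Equation (2), assuming NumPy and hypothetical variable names, showing how a sphere center measured in the sensor frame is mapped into the robot base frame through the two homogeneous transforms:

```python
import numpy as np

def to_homogeneous(R, T):
    """Stack a 3x3 rotation R and a 3x1 translation T into a 4x4 homogeneous matrix."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = np.asarray(T).ravel()
    return H

def sphere_center_in_base(R_bk, T_bk, R_to, T_to, P_csk):
    """Map the sphere center P_csk from the sensor frame to the base frame (Eq. (2))."""
    P = np.append(P_csk, 1.0)  # homogeneous coordinates
    P_fb = to_homogeneous(R_bk, T_bk) @ to_homogeneous(R_to, T_to) @ P
    return P_fb[:3]
```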
First, solve for the rotation matrix $R_{to}$. Since the robotic arm only undergoes translational motion and does not change its orientation between two positions, the rotation matrix $R_{b}^{k}$ remains constant [17], i.e., $R_{b}^{1} = R_{b}^{2}$. Based on this, the following equation can be derived:
$$R_{to}\left(P_{cs}^{1} - P_{cs}^{2}\right) = \left(R_{b}^{1}\right)^{-1}\left(T_{b}^{2} - T_{b}^{1}\right) = \left(R_{b}^{1}\right)^{T}\left(T_{b}^{2} - T_{b}^{1}\right) \tag{3}$$

Here, $(R_{b}^{1})^{-1}$ and $(R_{b}^{1})^{T}$ are the inverse and transpose of $R_{b}^{1}$, respectively. Since $R_{b}^{1}$ is an orthogonal matrix, $(R_{b}^{1})^{-1} = (R_{b}^{1})^{T}$.

After moving the robotic arm through $n_t$ ($n_t \geq 4$) translations and measuring $P_{cs}^{k}$, $n_t - 1$ equations can be established from Equation (3):

$$R_{to} A = B, \qquad
A = \left[\, P_{cs}^{1} - P_{cs}^{2} \;\; P_{cs}^{1} - P_{cs}^{3} \;\; \cdots \;\; P_{cs}^{1} - P_{cs}^{n_t} \,\right], \qquad
B = \left[\, (R_{b}^{1})^{T}(T_{b}^{2} - T_{b}^{1}) \;\; (R_{b}^{1})^{T}(T_{b}^{3} - T_{b}^{1}) \;\; \cdots \;\; (R_{b}^{1})^{T}(T_{b}^{n_t} - T_{b}^{1}) \,\right] \tag{4}$$
To obtain the optimal solution for the rotation matrix $R_{to}$, following the idea of the Iterative Closest Point (ICP) registration algorithm [19], let $H_{ICP} = AB^{T}$. Performing SVD [20] on $H_{ICP}$ yields $H_{ICP} = U_{ICP}\,\Sigma_{ICP}\,V_{ICP}^{T}$. The optimal rotation matrix based on the measurement results is then the following:

$$R_{to} = V_{ICP}\,U_{ICP}^{T} \tag{5}$$
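A compact sketch of the SVD construction in Equations (4) and (5); the reflection check at the end is a common safeguard in ICP-style implementations rather than a step stated in the paper, and the function name is hypothetical:

```python
import numpy as np

def solve_rotation(A, B):
    """Solve R_to A = B for the rotation matrix via SVD of H_ICP = A B^T (Eqs. (4)-(5)).

    A, B : 3 x (n_t - 1) arrays of sensor-frame center differences and
           base-frame translation differences, as defined in Equation (4).
    """
    H = A @ B.T
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                       # R_to = V_ICP U_ICP^T
    if np.linalg.det(R) < 0:             # guard against a reflection (det = -1)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R
```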
The translation vector $T_{to}$ can be obtained by changing the position and orientation of the robotic arm through arbitrary rotations and translations. From Equation (2), the following can be derived:

$$\left(R_{b}^{1} - R_{b}^{2}\right)T_{to} = R_{b}^{2} R_{to} P_{cs}^{2} - R_{b}^{1} R_{to} P_{cs}^{1} + T_{b}^{2} - T_{b}^{1} \tag{6}$$

By changing the robotic arm’s position and orientation $n_r$ ($n_r \geq 4$) times and measuring $P_{cs}^{k}$, the following can be obtained:

$$C\,T_{to} = D, \qquad
C = \begin{bmatrix} R_{b}^{1} - R_{b}^{2} \\ R_{b}^{1} - R_{b}^{3} \\ \vdots \\ R_{b}^{1} - R_{b}^{n_r} \end{bmatrix}, \qquad
D = \begin{bmatrix} R_{b}^{2} R_{to} P_{cs}^{2} - R_{b}^{1} R_{to} P_{cs}^{1} + T_{b}^{2} - T_{b}^{1} \\ R_{b}^{3} R_{to} P_{cs}^{3} - R_{b}^{1} R_{to} P_{cs}^{1} + T_{b}^{3} - T_{b}^{1} \\ \vdots \\ R_{b}^{n_r} R_{to} P_{cs}^{n_r} - R_{b}^{1} R_{to} P_{cs}^{1} + T_{b}^{n_r} - T_{b}^{1} \end{bmatrix} \tag{7}$$

Since the coefficient matrix $C$ is known, the optimal translation vector can be obtained using the least squares method:

$$T_{to} = \left(C^{T} C\right)^{-1} C^{T} D \tag{8}$$
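The translation step maps directly to a linear least-squares problem. The following sketch (hypothetical names; NumPy assumed) stacks Equation (7) and solves Equation (8) with `numpy.linalg.lstsq`, which is numerically equivalent to the normal-equation form $(C^{T}C)^{-1}C^{T}D$:

```python
import numpy as np

def solve_translation(R_b, T_b, P_cs, R_to):
    """Least-squares solution of C T_to = D (Eqs. (7)-(8)).

    R_b, T_b : lists of base-frame rotations (3x3) and translations (3,), pose 1 first
    P_cs     : list of sphere centers measured in the sensor frame (3,)
    R_to     : hand-eye rotation already obtained from Equation (5)
    """
    C_blocks, D_blocks = [], []
    for k in range(1, len(R_b)):
        C_blocks.append(R_b[0] - R_b[k])
        D_blocks.append(R_b[k] @ R_to @ P_cs[k] - R_b[0] @ R_to @ P_cs[0]
                        + T_b[k] - T_b[0])
    C = np.vstack(C_blocks)        # shape (3(n_r - 1), 3)
    D = np.concatenate(D_blocks)   # shape (3(n_r - 1),)
    T_to, *_ = np.linalg.lstsq(C, D, rcond=None)
    return T_to
```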

2.2. Position-Constrained Calibration Compensation Algorithm

The aforementioned hand–eye calibration algorithm inevitably encounters errors due to several factors: (a) When solving for the hand–eye parameters, the calibration equations are established based on motion constraints, assuming that the sensor detection data are unbiased. However, errors exist in the sensor’s intrinsic calibration. (b) The robotic arm itself has geometric and motion errors, and relying solely on theoretical matrix decomposition has limitations. (c) The robotic arm, equipped with a welding torch and the sensor, undergoes deformation due to gravity, resulting in deformation errors. These errors contribute to inaccuracies in the theoretical matrix decomposition process, making it challenging to ensure the precision of experimental calibration. To address this issue, a hand–eye calibration error compensation algorithm is proposed:
Step 1: Use the hand–eye parameters $R_{to}$ and $T_{to}$ obtained from the hand–eye calibration algorithm described in Section 2.1 as the initial solution.
Step 2: Control the robotic arm, equipped with the sensor and welding torch, to collect positional information on the standard sphere’s surface points from various orientations at multiple poses. Fit the coordinates of $C_s$ in the $O_B X_B Y_B Z_B$ system to obtain $P_{fb}$, which represents the true position of $C_s$ in the $O_B X_B Y_B Z_B$ system.
Step 3: Substitute the sphere center position $P_{cs}^{k}$ measured at the $k$-th pose and the pose matrices $R_{b}^{k}$ and $T_{b}^{k}$ relative to the $O_B X_B Y_B Z_B$ system into Equation (2) to obtain the theoretical position $\hat{P}_{fb}$ of $C_s$ in the $O_B X_B Y_B Z_B$ system.
Step 4: Based on the principle of minimizing the error between the theoretical sphere center coordinates and the true sphere center coordinates, establish the objective function of Equation (9). Use the LM algorithm to iteratively solve Equation (9), optimizing the hand–eye parameters $R_{to}$ and $T_{to}$ of the robot vision-guided system until the results converge.

$$\min \sum_{k=1}^{n_{com}} \left\| P_{fb} - \hat{P}_{fb} \right\|^{2} \tag{9}$$

where $n_{com}$ is the total number of times the sensor measures the sphere center when obtaining the initial hand–eye parameters, which is the sum of the number of translations $n_t$ and the number of arbitrary pose changes $n_r$ of the robotic arm.
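A sketch of Steps 1–4, assuming SciPy is available; the rotation-vector parameterization of $R_{to}$ and all function names are implementation choices not specified in the paper:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_hand_eye(R_to0, T_to0, R_b, T_b, P_cs, P_fb):
    """Position-constrained refinement of (R_to, T_to) by minimizing Eq. (9) with LM.

    R_to0, T_to0 : initial hand-eye parameters from Section 2.1 (Step 1)
    R_b, T_b     : per-pose base-frame rotations/translations of the tool flange
    P_cs         : per-pose sphere centers measured in the sensor frame
    P_fb         : fitted "true" sphere center in the base frame (Step 2)
    """
    def residuals(x):
        R_to = Rotation.from_rotvec(x[:3]).as_matrix()
        T_to = x[3:]
        res = []
        for Rb, Tb, Pc in zip(R_b, T_b, P_cs):
            P_hat = Rb @ (R_to @ Pc + T_to) + Tb   # theoretical center via Eq. (2)
            res.append(P_hat - P_fb)
        return np.concatenate(res)

    x0 = np.concatenate([Rotation.from_matrix(R_to0).as_rotvec(), np.asarray(T_to0)])
    sol = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```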

3. Experimental Verification and Results Analysis

3.1. Experimental Setup

The line-structured light sensor and the robotic arm used in this system are shown in Figure 2. The camera is a board-level CMOS camera, model MV-CB016-10GM-C from Hikvision (Hangzhou, China), with a resolution of 1440 × 1080 and a pixel size of 3.45 μm × 3.45 μm. It is equipped with a fixed-focus lens with a focal length of 12 mm. Considering the ambient light conditions in the application environment, a laser emitter with a wavelength of 658 ± 5 nm was selected, and a narrow-band filter matching this wavelength range was installed in front of the lens. The robotic arm used is a KUKA KR5 R1400 (Augsburg, Bavaria, Germany), a six-axis industrial robot.

3.2. Hand–Eye Calibration Experiment

In the hand–eye calibration experiment, a calibration tool—hereinafter referred to as the “simulated welding torch”—was employed to replicate the operation of an actual welding torch. A standard stainless-steel ball, with a diameter of 40.00 mm ± 0.01 mm, was securely positioned within the workspace of the robotic arm’s vision guidance system. According to the calibration procedure detailed in Section 2.1, the robotic arm was first controlled to measure the standard ball 10 times using the line-structured light sensor, restricting movement to translation only. Subsequently, the robotic arm measured the standard ball another 10 times using the line-structured light sensor from arbitrary poses [17]. During these measurements, to prevent singularities in the calibration algorithm, variations along the $X_B$, $Y_B$, and $Z_B$ axes of the base coordinate system were ensured when translating the robotic arm. When the robotic arm adopted arbitrary poses, all six joints were varied. Finally, the hand–eye matrices $R_{to}$ and $T_{to}$ were determined:
$$R_{to} = \begin{bmatrix} 0.0383 & 0.0495 & 0.9980 \\ 0.0641 & 0.9968 & 0.0470 \\ 0.9972 & 0.0622 & 0.0414 \end{bmatrix}, \qquad T_{to} = \begin{bmatrix} 162.085 \\ 173.945 \\ 60.7321 \end{bmatrix}$$
After completing the above steps, the robotic arm’s pose was adjusted to measure the positions of three points using the line-structured light sensor from ten different viewpoints. The deviation between the sensor’s measured positions and the actual positions was calculated. The average deviations were 4.3132 mm for point 1, 4.5731 mm for point 2, and 4.0331 mm for point 3, as shown in Table 1. These values significantly exceed the 1.5 mm accuracy required of industrial robot vision guidance systems for weld tracking, indicating the need for further optimization.
To address the insufficient accuracy, 60 sets of standard sphere surface points were randomly collected within the robot’s workspace. A microcontroller connected the robotic arm and the standard sphere to ensure accuracy: contact between the simulated welding torch tip and the standard sphere served as the trigger signal for recording the real-time position of the torch tip at the end of the robotic arm, as shown in Figure 3. The collected surface points were used to fit the sphere equation, thereby identifying the accurate position of the sphere center.
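The sphere fit mentioned above can be posed as a linear least-squares problem. The sketch below (hypothetical function name; NumPy assumed) fits a sphere to the touched surface points and returns the center used as the calibration value $P_{fb}$:

```python
import numpy as np

def fit_sphere(points):
    """Fit a sphere to N x 3 surface points via linear least squares.

    Uses x^2 + y^2 + z^2 = 2 a x + 2 b y + 2 c z + d, where (a, b, c) is the
    center and the radius is sqrt(d + a^2 + b^2 + c^2).
    """
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((P.shape[0], 1))])
    b = np.sum(P**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```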
Following the hand–eye calibration compensation algorithm based on position constraints described in Section 2.2, the optimized hand–eye parameters $R_{to}$ and $T_{to}$ were obtained. The positions of the three points were then remeasured with the line-structured light sensor from the same ten viewpoints used during calibration. Table 1 presents the average errors of the feature points before and after error compensation. The results show that the average deviation was 0.8183 mm for point 1, 0.7069 mm for point 2, and 0.6371 mm for point 3. These deviations are well below 1.5 mm, meeting the requirements for weld seam tracking.
To verify the accuracy of the proposed algorithm in detecting weld seams and guiding welding of thick plates, an S-shaped-trajectory V-groove weldment was designed and manufactured, as shown in Figure 4. This V-groove weldment had a thickness of 60 mm, a length of 300 mm, and a groove angle of 45°.
By comparing all V-groove weld seam feature points detected by the sensor during tracking with the actual weld seam trajectory, as illustrated in Figure 5, the average absolute trajectory error was determined. Figure 6 presents the deviations between the measured and actual values of the trajectories for the three feature points across ten measurement sessions. The average errors for the weld seam trajectory were 0.4567 mm for feature point 1, 0.6374 mm for feature point 2, and 0.4856 mm for feature point 3, as detailed in Table 2. These errors meet the application requirements for effective weld seam tracking.

3.3. Three-Dimensional Reconstruction Accuracy Verification

To further validate the generalizability of the algorithm, a regular boss with a dimensional accuracy of 0.01 mm was designed as the evaluation object. The calibrated robot and the laser vision sensor were then employed to perform the 3D reconstruction. Different numerical labels were assigned to each plane of the boss, as depicted in Figure 7. The results of scanning and reconstructing the boss using the line-structured light sensor are shown in Figure 8a. The point cloud quality is high and meets the evaluation requirements. Additionally, the point cloud of the boss was segmented into planes, each assigned different colors to distinguish the planes corresponding to the numerical labels, as shown in Figure 8b.
To quantitatively evaluate the reconstruction accuracy, both length and angle measurements were considered. The four slanted edges of the boss were used as the length evaluation standard. According to the design dimensions in Figure 7, the theoretical lengths of the slanted edges $L_{34}$, $L_{45}$, $L_{56}$, and $L_{63}$ are 50 mm. The dihedral angles between plane 2 and the four slanted surfaces were used as the angle evaluation standard, with theoretical values of $\theta_{23}$, $\theta_{24}$, $\theta_{25}$, and $\theta_{26}$ being 53.13°. After segmenting the reconstructed point cloud of the boss into planes and performing geometric calculations, the deviations between the point cloud data and the theoretical data were obtained, as shown in Table 3. The average length deviation was 0.5071 mm, and the average angle deviation was 0.2145°, both of which meet the application requirements for weld seam tracking.
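One plausible way to compute the evaluated angles from the segmented point cloud is sketched below (hypothetical names; NumPy assumed): each plane’s normal is estimated by PCA, and the angle between two planes is taken as the acute angle between their normals, which matches the 53.13° design value. The paper does not specify its exact computation, so this is only an illustrative sketch.

```python
import numpy as np

def plane_normal(points):
    """Unit normal of a plane fitted to an N x 3 point cluster via PCA/SVD."""
    P = np.asarray(points, dtype=float)
    _, _, Vt = np.linalg.svd(P - P.mean(axis=0))
    return Vt[-1]                      # direction of least variance

def plane_angle_deg(points_a, points_b):
    """Acute angle (degrees) between two segmented planes, e.g., theta_23 ... theta_26."""
    n_a, n_b = plane_normal(points_a), plane_normal(points_b)
    c = abs(float(np.dot(n_a, n_b)))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))
```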

4. Conclusions

A comprehensive and effective calibration scheme was proposed for a line-structured light sensor based on a constant-focus optical path. By introducing a two-dimensional tilt angle, a more suitable inclined camera imaging model for the line-structured light sensor was established. The detailed process of obtaining initial values and performing nonlinear optimization of the model parameters was thoroughly described. Using a dual-step target as a marker, numerous non-collinear points on the light plane were quickly and conveniently obtained, and the light plane equation in the ideal camera coordinate system was accurately fitted. An integrated automatic calibration device was designed to efficiently complete the calibration experiments of the vision sensor while adhering to imaging constraints.
The experimental results demonstrate that the average distance error of the hand–eye calibration after error compensation decreased from 4.5731 mm to 0.7069 mm. This indicates that the proposed calibration method ensures high detection accuracy and repeatability for the line-structured light sensor. Additionally, the effectiveness of the calibration method in reducing the impact of hand–eye calibration errors for industrial robots was validated.

Author Contributions

J.L.: Conceptualization, Methodology, Formal analysis, Software, Writing—Review and Editing, Supervision, Project administration. W.R.: Methodology, Investigation, Writing—original draft. Y.F.: Conceptualization, Investigation, Validation. J.F.: Resources. J.Z.: Resources, Supervision, Funding acquisition, Project administration. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Tsinghua University (Department of Mechanical Engineering), Shenzhen Chuangxin Laser Co., Ltd. Laser Advanced Manufacturing Joint Research Center Fund (20232910019).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pietraszkiewicz, W.; Konopińska, V. Junctions in shell structures: A review. Thin-Walled Struct. 2015, 95, 310–334. [Google Scholar] [CrossRef]
  2. Wu, Q.; Li, Z.; Gao, C.; Biao, W.; Shen, G. Research on welding guidance system of intelligent perception for steel weldment. IEEE Sens. J. 2023, 23, 5220–5231. [Google Scholar] [CrossRef]
  3. Wang, X.; Zhou, X.; Xia, Z.; Gu, X. A survey of welding robot intelligent path optimization. J. Manuf. Process. 2021, 63, 14–23. [Google Scholar] [CrossRef]
  4. Guan, T. Research on the application of robot welding technology in modern architecture. Int. J. Syst. Assur. Eng. Manag. 2023, 14, 681–690. [Google Scholar] [CrossRef]
  5. Tran, C.C.; Lin, C.Y. An intelligent path planning of welding robot based on multisensor interaction. IEEE Sens. J. 2023, 23, 8591–8604. [Google Scholar] [CrossRef]
  6. Li, M.; Du, Z.; Ma, X.; Dong, W.; Gao, Y. A robot hand-eye calibration method of line laser sensor based on 3D reconstruction. Robot. Comput.-Integr. Manuf. 2021, 71, 102136. [Google Scholar] [CrossRef]
  7. Zhang, L.; Zhang, J.; Jiang, X.; Liang, B. Error correctable hand–eye Calibration for stripe-laser vision-guided robotics. IEEE Trans. Instrum. Meas. 2020, 69, 10. [Google Scholar] [CrossRef]
  8. Wu, Q.; Qiu, J.; Li, Z.; Liu, J.; Wang, B. Hand-Eye Calibration Method of Line Structured Light Vision Sensor Robot Based on Planar Target. Laser Optoelectron. Prog. 2023, 60, 1015002. [Google Scholar]
  9. Pavlovčič, U.; Arko, P.; Jezeršek, M. Simultaneous Hand–Eye and Intrinsic Calibration of a Laser Profilometer Mounted on a Robot Arm. Sensors 2021, 21, 1037. [Google Scholar] [CrossRef] [PubMed]
  10. Xiao, R.; Xu, Y.; Hou, Z.; Chen, C.; Chen, S. An automatic calibration algorithm for laser vision sensor in robotic autonomous welding system. J. Intell. Manuf. 2021, 33, 1419–1432. [Google Scholar] [CrossRef]
  11. Xu, J.; Li, Q.; White, B. A novel hand-eye calibration method for industrial robot and line laser vision sensor. Sens. Rev. 2023, 43, 259–265. [Google Scholar] [CrossRef]
  12. Zhong, K.; Lin, J.; Gong, T.; Zhang, X.; Wang, N. Hand-eye calibration method for a line structured light robot vision system based on a single planar constraint. Robot. Comput.-Integr. Manuf. 2025, 91, 102825. [Google Scholar] [CrossRef]
  13. Xu, J.; Hoo, J.L.; Dritsas, S.; Fernandez, J.G. Hand-eye calibration for 2D laser profile scanners using straight edges of common objects. Robot. Comput.-Integr. Manuf. 2022, 73, 102221. [Google Scholar] [CrossRef]
  14. Shiu, Y.C.; Ahmad, S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX=XB. IEEE Trans. Robot. Autom. 1989, 5, 16–29. [Google Scholar] [CrossRef]
  15. Jia, H.; Zhu, J.; Yi, W. Calibration for 3d profile measurement robot with laser line-scan sensor. Chin. J. Sens. Actuators 2012, 25, 62–66. [Google Scholar] [CrossRef]
  16. Yang, S.; Yin, S.; Ren, Y.; Zhu, J.; Ye, S. Improvement of calibration method for robotic flexible visual measurement systems. Opt. Precis. Eng. 2014, 22, 3239–3246. [Google Scholar] [CrossRef]
  17. An, Y.; Wang, X.; Zhu, X.; Jiang, S.; Ma, X.; Cui, J.; Qu, Z. Application of combinatorial optimization algorithm in industrial robot hand eye calibration. Measurement 2022, 202, 111815. [Google Scholar] [CrossRef]
  18. Cao, D.; Liu, W.; Liu, S.; Chen, J.; Liu, W.; Ge, J.; Deng, Z. Simultaneous calibration of hand-eye and kinematics for industrial robot using line-structured light sensor. Measurement 2023, 221, 113508. [Google Scholar] [CrossRef]
  19. Yang, J.; Li, H.; Campbell, D.; Jia, Y. Go-ICP: A globally optimal solution to 3D ICP point-set registration. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 38, 2241–2254. [Google Scholar] [CrossRef] [PubMed]
  20. Liu, Z.; Li, X.; Li, F.; Zhang, G. Calibration method for line-structured light vision sensor based on a single ball target. Opt. Laser Eng. 2015, 69, 20–28. [Google Scholar] [CrossRef]
Figure 1. Calibration sphere model.
Figure 2. Line-structured light sensor and robot system.
Figure 3. Calibration error compensation system.
Figure 4. V-groove weld of S-shaped trajectory.
Figure 5. Comparison of weld detection trajectory and actual trajectory.
Figure 6. Average deviation results of feature points.
Figure 7. Boss for evaluating 3D reconstruction accuracy. The numbers 1 to 6 correspond to the six planes of the convex platform, respectively.
Figure 8. 3D reconstruction results of boss: (a) boss point cloud; (b) point cloud segmentation. The numbers 1 to 6 correspond to the six planes of the convex platform, respectively.
Table 1. Average deviation of feature points.

Feature Points | Feature Point 1 | Feature Point 2 | Feature Point 3
Average Deviation (mm) After Calibration | 4.3132 | 4.5731 | 4.0331
Average Deviation (mm) After Compensation | 0.8183 | 0.7069 | 0.6371
Table 2. Average deviation of weld seam feature points.

Feature Points | Feature Point 1 | Feature Point 2 | Feature Point 3
Average Deviation (mm) | 0.4567 | 0.6374 | 0.4856
Table 3. Deviations between point cloud data and theoretical data.

Length Parameters | Point Cloud Data (mm) | Deviation (mm) | Angle Parameters | Point Cloud Data (°)
$L_{34}$ | 50.3947 | 0.3947 | $\theta_{23}$ | 53.4567
$L_{45}$ | 50.4891 | 0.5109 | $\theta_{24}$ | 53.0064
$L_{56}$ | 50.6713 | 0.6713 | $\theta_{25}$ | 53.3356
$L_{63}$ | 50.4513 | 0.4513 | $\theta_{26}$ | 52.9278
Average | 50.2516 | 0.5071 | Average | 53.1816
