Article

Design and Error Calibration of a Machine Vision-Based Laser 2D Tracking System

School of Automation and Electrical Engineering, University of Science and Technology Beijing, Beijing 100083, China
*
Author to whom correspondence should be addressed.
Sensors 2026, 26(2), 570; https://doi.org/10.3390/s26020570
Submission received: 7 November 2025 / Revised: 30 December 2025 / Accepted: 8 January 2026 / Published: 14 January 2026
(This article belongs to the Section Physical Sensors)

Abstract

A laser tracker is an essential tool in the field of precise geometric measurement. Its fundamental operating principle is to use a dual-axis rotating device to steer a laser beam so that it remains continuously aligned with a cooperative target whose position is measured. Such systems provide numerous benefits, including a broad measuring range, high precision, outstanding real-time performance, and ease of use. To address the low beam recovery efficiency of typical laser trackers, this paper presents a two-dimensional laser tracking system that incorporates a machine vision module. The system uses a unique off-axis optical design in which the distance measurement and laser tracking paths are independent, decreasing the system's dependence on optical coaxiality and mechanical machining precision. A tracking head error calibration method based on singular value decomposition (SVD) is introduced, in which optical axis point cloud data obtained from experiments on the various components are used for geometric fitting. A complete prototype system was constructed and subjected to accuracy testing. Experimental results show that the proposed system achieves a relative positioning accuracy of better than 0.2 mm (spatial root mean square error (RMSE) = 0.189 mm) at the maximum working distance of 1.5 m, providing an effective solution for the design of high-precision laser tracking systems.

1. Introduction

Precision measurement instruments serve as essential platforms for realizing high-accuracy measurement technologies and are indispensable tools in modern scientific research and advanced engineering manufacturing processes [1,2]. Due to their superior measurement accuracy, quick response time, and strong spatial adaptability, laser trackers have become one of the most popular key devices in high-end precision measurement, particularly in contemporary industrial measurement scenarios that increasingly demand large-scale measurement capabilities, high spatial resolution, and real-time responsiveness [3,4].
Laser trackers typically work on the basis of polar coordinate measurement. Their fundamental components are a laser tracking head (laser, rangefinder, Position Sensitive Detector (PSD), and other optical components), a two-axis rotating device, a controller (communication and external interfaces), a host computer (running the control software), a cooperative target, and additional measuring accessories. These systems allow high-precision localization of target points in three-dimensional space through coordinated control and data fusion [5,6]. To achieve continuous tracking and stable capture of dynamic targets, the system continuously adjusts the laser's orientation during measurement in response to changes in the spatial position of the cooperative target sphere, also referred to as the Spherically Mounted Retroreflector (SMR). Commercial laser tracking systems have attained micrometer-level measurement accuracy and are now widely used in high-end manufacturing industries, including mold inspection, automotive production, and aerospace.
Laser tracking technology has been the subject of engineering application research since the 1980s. Laser tracker systems and products have become more intelligent, multifunctional, and compact owing to ongoing advances in key technologies, including digital signal processing, precision mechanical control, and photoelectric measurement. Although the polar coordinate measurement framework based on laser interferometric distance measurement and two-dimensional angle encoding has been widely adopted and current laser tracker technology is relatively mature, a number of technical bottlenecks remain in practical deployment and use.
First, conventional laser tracker systems usually use high-precision biaxial rotary stages. The accuracy of azimuth (horizontal angle) and elevation angle measurements is directly affected by mechanical backlash, encoder resolution limitations, and rotational axis installation errors, which limit the system's overall stability and measurement accuracy [7]. Second, the laser beam path is interrupted during dynamic measurements if the target sphere is obscured or lost; conventional systems then require a time-consuming and labor-intensive re-initialization of the guiding beam path, which significantly reduces the continuity of measurement tasks and overall system efficiency. Meanwhile, as authoritative standards for the performance evaluation and verification of laser trackers, ASME B89.4.19-2021 and ISO 10360-10:2021 clearly specify core requirements such as periodic calibration and geometric error detection of equipment [8,9]. Traditional systems struggle to balance compliance with these standards against beam break recovery efficiency, which further highlights the necessity of the off-axis optical structure and visual reconnection mechanism designed in this study.
Recent studies have explored integrating computer vision technology to create laser tracking systems with vision-assisted capability in order to address these problems. For instance, such systems can locate and identify the target sphere's pose by integrating industrial cameras and image processing modules, allowing quick recovery and path correction after beam interruptions [10]. However, these systems still face significant challenges in real-world engineering applications, such as intricate calibration procedures, inconsistent image recognition accuracy, and high demands on mechanical assembly precision, which has hindered the development of unified standards or widely adopted mature solutions.

2. Overall System Design

This study presents the design of an intelligent laser tracking system featuring rapid beam break recovery based on machine vision. The core architecture comprises two main components: a two-axis rotary stage and a tracking head.
As shown in Figure 1, this work proposes a machine vision-based design approach for a laser 2D tracking system to overcome these technical constraints. The proposed system introduces a novel off-axis optical arrangement in which the vision recognition path and the distance measurement path are configured separately. While preserving measurement accuracy, this configuration substantially reduces the system's dependence on optical coaxiality and precise mechanical alignment. By combining image processing techniques with target identification algorithms, the system achieves fast localization and reliable tracking of the SMR in image space, supporting autonomous laser path guidance and dynamic attitude control.
Compared with traditional coaxial optical systems, the proposed design offers a more compact system architecture (with overall dimensions of approximately 300 mm × 300 mm × 310 mm). Notably, the system proposed in this study exhibits remarkable cost competitiveness compared with commercial laser trackers with equivalent dynamic tracking performance. The total development cost of the prototype system is approximately $25,000, covering all core hardware components (including the biaxial rotary stage, optical components of the tracking head, industrial camera, PSD sensor, Absolute Distance Meter (ADM) rangefinder, servo motor, and related mechanical structural parts), mechanical processing fees, and software development costs. In contrast, mainstream commercial devices on the market, represented by the API T3 and Leica AT960 laser trackers, can achieve a spatial root mean square error (RMSE) within 0.2 mm, and are equipped with dynamic target tracking and beam break recovery functions, with their market prices generally ranging from $80,000 to $150,000. This significant cost gap highlights the economic advantages of the proposed system, making it a more cost-effective option for budget-constrained scenarios such as laboratory research, small- and medium-sized industrial measurement, and customized precision detection.
To clarify the application positioning and characteristics of the proposed system, Table 1 presents a characteristic comparison with two mainstream commercial laser trackers (API T3 and Leica AT960, Wetzlar, Germany). The parameters of commercial products are comprehensively referenced from manufacturers’ technical manuals, while the indicators of the proposed system are derived from prototype experimental test results, focusing on the adaptability analysis in short-distance measurement scenarios.
With the goal of offering a practical and effective solution for high-robustness, high-precision, and long-range dynamic target tracking measurements, as well as providing technical support and theoretical underpinnings for the development of intelligent measurement systems, this paper thoroughly examines important issues like overall system architecture design, core functional module implementation, error modeling, and calibration methods.
A schematic illustration of the tracking head structure is shown in Figure 2. The tracking head adopts a composite optical-mechanical design, consisting of a laser source, a beam splitter, a PSD, an ADM, an industrial camera, and other essential cage optical components. The components corresponding to the labels in Figure 2 (along with their models and quantities) are detailed in Table 2.
The tracking head uses the center of the cubic beam splitter as its reference point, from which the instrument’s primary optical axis extends both forward and backward. The laser beam emitted from the laser source aligns precisely with this primary optical axis. The PSD sensor and ADM are located on either side of the cubic beam splitter. The distance measurement beam created by the ADM is aligned with the laser beam using an optical path regulator. The beam reflected back from the SMR is sent through the optical path regulator and cubic beam splitter, with the cubic beam splitter’s reflective surface directing a portion of this return beam towards the PSD sensor.
The system calculates the target sphere’s current spherical coordinates using the ADM and PSD sensor outputs. This information is then used to control the two-axis rotary stage, allowing for continuous target tracking. In the event of target loss, the system uses the industrial camera to search for the SMR while controlling the two-axis rotary stage to reorient toward the target, enabling quick beam path reacquisition.
Figure 3 shows the system’s optical path design. The laser beam created by the laser source travels immediately outside the instrument and is reflected back inside by the SMR target sphere. A portion of the returned beam is then routed to the PSD sensor via the cubic beam splitter located in the center of the optical path.
The system adopts an off-axis distance measurement scheme, dividing the optical path into two mutually independent sections: one for distance measurement and the other for target recognition. This design eliminates the need for overly complex optical coupling modules between the distance measurement optical path and the main optical path of the system. The distance measurement beam emitted by the ADM is projected onto the SMR target ball via a reflector, and the distance measurement is completed after the beam is reflected by the prism built into the target ball. The final measured distance value is regarded as the actual distance only after error compensation.
The cubic beam splitter used in this system features a 1:1 beam splitting ratio. Additionally, narrowband optical filters and lens shade tubes are installed on the external surfaces along the three optical transmission directions to ensure the PSD sensor receives sufficient light intensity for normal operation while effectively minimizing interference from ambient light. This design separates the tracking head from the biaxial rotary mechanism, allowing the laser beam to directly project outside the instrument and simplifying the overall system architecture.
Current commercial laser trackers (e.g., Leica AT960, API T3) generally adopt an architecture of "single-path coaxial integration + camera bound to the main optical axis". On the one hand, the laser, distance measurement module, and position detection components must share the same main optical axis, relying on specialized optical coupling components for optical path integration; this imposes extremely high requirements on mechanical assembly precision, which both increases hardware costs and extends the system's assembly and commissioning cycle. On the other hand, the vision-assist camera must adopt a coaxial layout, with precise mechanical adjustment keeping the camera's optical axis in a fixed spatial correspondence with the main optical axis and its field-of-view center coinciding with the main optical axis direction. This calibration process is complex to perform, and if the mechanical structure later deviates under environmental disturbances, the effectiveness of beam break recovery is directly compromised. In contrast, the proposed system adopts an "off-axis splitting + camera optical path decoupling" design, as illustrated in Figure 4. Specifically, two key design features are implemented as follows:
  • The laser and the ADM distance measurement module each maintain their own independent coaxial optical paths. These two paths achieve off-axis coordination through a cube beam splitter, eliminating the requirement for full-link optical path coaxiality and thereby significantly reducing the system’s dependence on high-precision mechanical assembly.
  • The optical path of the visual camera is completely independent from the main optical axis, with no need for rigid mechanical calibration or binding between them. Instead, laser beam guidance is accomplished solely by algorithmically matching the position of the SMR in the camera image with the pointing direction of the main optical axis.
This design not only simplifies the system’s assembly process and controls hardware costs but also avoids the environmental sensitivity issues caused by mechanical binding, improving the reliability of the beam break recovery function.

3. Error Calibration Method

3.1. Monocular Camera Calibration

In the novel laser tracker, the machine vision module is responsible for searching for the target sphere in the field of view when the system is in a beam-interrupted state and then deriving the target's coordinates in the world coordinate system. Therefore, it is necessary to first understand the imaging principle of industrial cameras. As shown in Figure 4, four coordinate systems are involved: the world coordinate system ($O_w X_w Y_w Z_w$, used to describe the positions of the camera and objects in the scene, unit: m), the camera coordinate system ($O_c X_c Y_c Z_c$, with the optical center as the origin, unit: m), the image coordinate system ($o$-$xy$, with its origin at the center of the image plane, unit: mm), and the pixel coordinate system ($o$-$uv$, with its origin at the upper left corner of the image, unit: pixel). Point P is a real point in the physical world, point p is the corresponding point in the image, and f is the camera focal length.
The world coordinate system, also known as the object coordinate system, is used to represent and describe objectively existing objects. Assume that the coordinates of a point P in space are $(x_w, y_w, z_w)$ in the world coordinate system and $(x_c, y_c, z_c)$ in the camera coordinate system. The two coordinates are related by a rigid body transformation, which consists of a translation and a rotation. The transformation relationship is as follows:
$$ \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \tag{1} $$
where $t = [t_x, t_y, t_z]^T$ is the translation vector between the two coordinate systems, and $R$ is the $3 \times 3$ rotation matrix, whose entries are determined by the sines and cosines of the rotation angles about the three axes:
$$ R = \begin{bmatrix} \cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma - \sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma + \sin\alpha\sin\gamma \\ \sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma + \cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma - \cos\alpha\sin\gamma \\ -\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma \end{bmatrix} \tag{2} $$
The camera coordinate system takes the optical center of the lens as its origin, the $Z_c$ axis extends outward along the optical axis, and the plane formed by the $X_c$ and $Y_c$ axes is a translated copy of the physical imaging plane. The transformation from the camera coordinate system to the image coordinate system is a 3D-to-2D perspective projection. By similar triangles, the coordinates $(x_c, y_c, z_c)$ of point P in the camera coordinate system are converted to the image coordinates $(x, y)$:
$$ z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} \tag{3} $$
The image coordinate system is established on the camera's photosensitive sensor and can be regarded as part of the physical imaging plane. An image is essentially an array of pixel gray values. The pixel coordinate system takes the upper left corner of the image as its origin, with u and v denoting the column and row positions of a point in the image (unit: pixel). The image coordinate system and the pixel coordinate system are related by a translation and a scaling: the origin is shifted from the center of the imaging plane to the upper left corner of the image, and the unit is converted from mm to pixels. The conversion relationship is:
$$ \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \dfrac{1}{d_x} & 0 & u_0 \\ 0 & \dfrac{1}{d_y} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{4} $$
where $d_x$ and $d_y$ are the physical sizes (in mm) of a single pixel on the camera sensor in the horizontal and vertical directions, and $(u_0, v_0)$ is the pixel coordinate of the image center when the origin is placed at the upper left corner. Combining the above coordinate transformations gives the mathematical model of camera imaging:
$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \tag{5} $$
where $f_x = f/d_x$ and $f_y = f/d_y$; the matrix $\begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$ is the camera intrinsic parameter matrix, and $\begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix}$ is the camera extrinsic parameter matrix.
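To make the imaging chain of Equations (1)-(5) concrete, the following minimal sketch projects a world point to pixel coordinates with NumPy. It is an illustration only: the intrinsic values, Euler angles, and translation used below are assumed example numbers, not calibration results from this paper.

```python
import numpy as np

def euler_to_rotation(alpha, beta, gamma):
    """Rotation matrix of Eq. (2): R = Rz(alpha) @ Ry(beta) @ Rx(gamma), angles in radians."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return Rz @ Ry @ Rx

def project_point(p_world, K, R, t):
    """Project a 3-D world point to pixel coordinates following Eq. (5).

    K is the 3x3 intrinsic matrix [[fx, 0, u0], [0, fy, v0], [0, 0, 1]];
    (R, t) are the extrinsic rotation and translation of Eq. (1).
    """
    p_cam = R @ p_world + t          # world -> camera frame (rigid transform)
    uvw = K @ p_cam                  # perspective projection plus pixel scaling
    return uvw[:2] / uvw[2]          # divide by z_c to obtain (u, v)

# Illustrative values only
K = np.array([[2000.0, 0.0, 1536.0],
              [0.0, 2000.0, 1024.0],
              [0.0, 0.0, 1.0]])
R = euler_to_rotation(0.01, -0.02, 0.005)
t = np.array([0.05, -0.02, 1.2])     # metres
print(project_point(np.array([0.1, 0.05, 0.0]), K, R, t))
```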
However, due to manufacturing process variations, inconsistencies in magnification between the lens center and edges result in image distortion [11]. Distortion is an inherent characteristic of optical systems. Although it does not affect image sharpness, it does cause geometric deformation of the captured images and requires algorithmic correction. The main types of distortion include the following: radial distortion, which intensifies with increasing distance from the optical axis, with minimal distortion at the center and either outward expansion (barrel distortion) or inward contraction (pincushion distortion) at the image edges; tangential distortion, caused by non-parallel assembly between the lens group and the image plane, resulting in inclined imaging of light rays; and thin prism distortion, which arises from material imperfections or assembly errors in the lens, causing asymmetric lateral displacement of light rays. Although optimized manufacturing processes can reduce these distortions, they cannot be completely eliminated, necessitating a subsequent calibration-based correction.
This study adopts Zhang's calibration method, utilizing multiple images of a checkerboard calibration target captured from different poses to solve the intrinsic parameters of the camera through the Camera Calibration Toolbox in MATLAB R2022b. The selection of this method is based on three core considerations: first, it only requires a planar checkerboard to complete the calibration, without the need for high-precision 3D calibration objects, which significantly reduces hardware costs and operational complexity and suits rapid calibration in laboratories and industrial settings; second, it can simultaneously solve the camera's intrinsic parameters (focal length, principal point coordinates, distortion coefficients) and extrinsic parameters (rotation matrix, translation vector), with a significant correction effect on lens radial and tangential distortions, consistent with the high-precision measurement requirements of this system; third, the algorithm is robust, with excellent noise immunity and tolerance of data redundancy, and has been widely integrated into mainstream toolkits such as OpenCV and MATLAB, facilitating engineering implementation and secondary development [12]. Assuming a spatial coordinate system is defined on the checkerboard, the coordinates of a corner point A on the board can be expressed as $(x_a, y_a, z_a)$. After capturing the calibration board image with the camera, corner point A corresponds to a pixel point a in the image with coordinates $(u_a, v_a)$. According to (5), the transformation relationship between these coordinate systems can be expressed as follows:
$$ s \begin{bmatrix} u_a \\ v_a \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_a \\ y_a \\ z_a \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} r_1 & r_2 & r_3 & t \end{bmatrix} \begin{bmatrix} x_a \\ y_a \\ z_a \\ 1 \end{bmatrix} \tag{6} $$
where s is a scale factor, $M_1$ is the camera intrinsic parameter matrix, and $[r_1\ r_2\ r_3\ t]$ is the extrinsic matrix formed by the columns of the rotation matrix and the translation vector. Since all checkerboard corner points lie on the same plane, when the object coordinate system is established on this plane, the value of $z_a$ becomes zero. Consequently, (6) simplifies to:
$$ s \begin{bmatrix} u_a \\ v_a \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} r_1 & r_2 & r_3 & t \end{bmatrix} \begin{bmatrix} x_a \\ y_a \\ 0 \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} r_1 & r_2 & t \end{bmatrix} \begin{bmatrix} x_a \\ y_a \\ 1 \end{bmatrix} \tag{7} $$
Leaving out the Z-axis value, the transformation from $(x_a, y_a, 0)$ to $(u_a, v_a)$ in (7) can be regarded as a two-dimensional projective transformation, i.e., a homography. The product $s\,M_1 [\,r_1\ r_2\ t\,]$ defines the homography matrix $H_r$, a $3 \times 3$ matrix that encodes the camera intrinsic parameters together with the pose of the calibration plane. Because both sides of (7) are expressed in homogeneous coordinates, proportional scaling does not affect the equality, so $H_r$ has eight degrees of freedom. The homography matrix $H_r$ and its normalization constraint can be written as:
$$ H_r = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix}, \qquad \sum_{i=1}^{3}\sum_{j=1}^{3} h_{ij}^2 = 1 \tag{8} $$
By substituting the reformulated homography matrix into (7) and simplifying, the following equation set is obtained:
$$ \begin{cases} h_{11} x_a + h_{12} y_a + h_{13} - h_{31} x_a u_a - h_{32} y_a u_a - h_{33} u_a = 0 \\ h_{21} x_a + h_{22} y_a + h_{23} - h_{31} x_a v_a - h_{32} y_a v_a - h_{33} v_a = 0 \end{cases} \tag{9} $$
As shown in (8) and (9), solving the nine unknown parameters in $H_r$ requires at least nine independent equations. Therefore, at least four corner points from the checkerboard must be substituted into (9) to generate eight linear equations. By combining these with the constraint equation, a solvable equation system can be formed. The homography matrix is then solved using optimization algorithms such as Singular Value Decomposition (SVD) and the Levenberg–Marquardt algorithm.
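As an illustration of how (8) and (9) are assembled and solved, the sketch below implements the direct linear transform with NumPy's SVD under the unit-norm constraint. The four point correspondences are hypothetical, and the routine is only a simplified stand-in for the toolbox solvers mentioned above.

```python
import numpy as np

def estimate_homography(obj_pts, img_pts):
    """Estimate the 3x3 homography H_r of Eq. (8) from planar correspondences.

    obj_pts: (N, 2) checkerboard corner coordinates (x_a, y_a) on the board plane.
    img_pts: (N, 2) corresponding pixel coordinates (u_a, v_a), with N >= 4.
    Each correspondence contributes the two linear equations of Eq. (9); the
    unit-norm solution is the right singular vector with the smallest singular value.
    """
    A = []
    for (xa, ya), (ua, va) in zip(obj_pts, img_pts):
        A.append([xa, ya, 1, 0, 0, 0, -xa * ua, -ya * ua, -ua])
        A.append([0, 0, 0, xa, ya, 1, -xa * va, -ya * va, -va])
    A = np.asarray(A, float)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)         # solution of the homogeneous system
    return H / H[2, 2]               # optional normalisation so that h33 = 1

# Hypothetical four-point example: board corners (mm) and their pixel positions
obj = np.array([[0, 0], [30, 0], [30, 30], [0, 30]], float)
img = np.array([[912, 640], [1210, 655], [1198, 948], [905, 930]], float)
print(estimate_homography(obj, img))
```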
In this study, camera calibration was conducted using the built-in Camera Calibration Toolbox in MATLAB. A custom-designed aluminum oxide optical calibration board was used as the target. Twenty images of the checkerboard from different camera poses were captured. After corner detection, several images with high similarity or large corner detection errors were discarded. The final calibration results are presented in Table 3.
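For readers who prefer a scripted alternative to the MATLAB toolbox, the following sketch outlines the same Zhang-style calibration pipeline with OpenCV. The checkerboard geometry (11 × 8 inner corners, 15 mm pitch) and the image folder name are assumptions for illustration, not the parameters of the aluminum oxide target used in this study.

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard geometry; substitute the actual target used in the experiment.
pattern = (11, 8)
square = 15.0  # mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, img_size = [], [], None
for path in glob.glob("calib_images/*.png"):      # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue                                   # discard poor detections
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)

# Solves intrinsics, distortion coefficients, and per-view extrinsics jointly.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, img_size, None, None)
print("reprojection RMS (px):", rms)
print("intrinsic matrix:\n", K)
print("distortion (k1, k2, p1, p2, k3):", dist.ravel())
```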

3.2. Optical Axis Calibration of the Tracking Head

Due to manufacturing imperfections, the laser beam emitted from the laser source is not perfectly perpendicular to the exit surface but is instead projected outward at an oblique angle. Therefore, a calibration and compensation procedure for the laser optical axis was designed in this system. First, the beam direction was constrained by using a cage-mounted laser alignment target combined with an adjustable retaining ring lens mount, limiting the beam deviation solely in the pitch (Y) direction. Next, the reference optical path was calibrated by placing an optical height gauge 75 mm in front of the tracking head and adjusting the support rod to ensure that the laser beam passed through an 80 mm clear aperture, aligned with the vertical offset of the rotary stage center.
Subsequently, data acquisition was performed by incrementally moving the height gauge along the laser propagation direction at 25 mm intervals while recording the beam spot height. A point cloud dataset was established with distance as the X-axis and height as the Z-axis. The experimental data obtained are shown in Figure 5. Following data collection, 30 valid data groups within the 75–800 mm range were retained. For the distal range (800–1500 mm), data points were selectively sampled. Finally, a spatial line fitting was conducted using the Singular Value Decomposition (SVD) algorithm [13] to derive the optical axis equation, ensuring coverage of 1.5 times the system’s maximum working distance requirement. The fitted laser optical axis equation is as follows:
$$ \begin{cases} y = 0 \\ z = 0.00049\,x + 0.11172 \end{cases} \tag{10} $$
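A minimal sketch of SVD-based spatial line fitting of this kind is given below; the synthetic samples merely mimic the distance-versus-height data described above and are not the experimental measurements.

```python
import numpy as np

def fit_line_svd(points):
    """Fit a 3-D line to optical-axis point-cloud data via SVD.

    points: (N, 3) array of (x, y, z) beam-spot samples. Returns the centroid and
    the unit direction vector (first right singular vector of the centred data),
    i.e. the axis minimising the squared orthogonal distances.
    """
    points = np.asarray(points, float)
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    return centroid, Vt[0]

# Hypothetical samples: distance along the beam (x) vs. spot height (z), y held at 0
x = np.arange(75.0, 800.0, 25.0)
z = 0.0005 * x + 0.112 + np.random.normal(0, 0.01, x.size)
c, d = fit_line_svd(np.column_stack([x, np.zeros_like(x), z]))
slope = d[2] / d[0]                       # dz/dx of the fitted axis
print("slope:", slope, "z-intercept:", c[2] - slope * c[0])
```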
The calibration of the distance meter optical axis requires prior measurement of its installation error. Because the zero position of the distance meter (the front surface of its housing) does not coincide with the YOZ reference plane of the tracking head, a linear offset of 18.3 mm exists, defined as the primary compensation amount $d_0$. The calibration experiment utilized a two-dimensional detection target. First, the origin of the detection target was fixed, and the mapping coordinates of the tracking head origin on the target surface were calculated for subsequent data normalization. Then, at various mapping distances (the distance from the tracking head origin to the detection target), the two-dimensional coordinates of the distance meter's beam spot were recorded and transformed into Y-axis and Z-axis data of the optical axis point cloud, with the mapping distance used as the X-axis. This process yielded a three-dimensional point cloud for the optical axis.
Incorporating the compensation term  d 0 , spatial fitting algorithms were employed to complete the optical axis calibration and correct the impact of installation errors on measurement accuracy. The point cloud data of the distance meter’s optical axis were imported into MATLAB for spatial line fitting. The resulting fitted line equation is:
$$ \frac{x - 600}{0.99993} = \frac{y + 38.18571}{0.01088} = \frac{z + 0.86}{0.00352} \tag{11} $$
To further improve the distance measurement accuracy, the system achieves strict coaxiality between the distance measurement beam emitted by the ADM and the main optical axis of the laser through adjustment of two reflectors. After reflection, the propagation direction of the distance measurement beam remains consistent with the main optical axis of the laser over the entire working distance, with a coaxiality deviation of 0.01 mm. Since the distance measurement beam must be reflected by two reflectors, there is a fixed physical distance $d_m = 31.22$ mm between them. This distance is included in the raw measurement value of the rangefinder, so an additional spacing compensation must be added on top of the installation error compensation. The actual distance is therefore calculated as:
$$ d_t = d_c - d_0 - d_m \tag{12} $$
where $d_t$ is the actual distance, $d_c$ is the raw measurement value of the rangefinder, $d_0$ is the basic compensation distance, and $d_m$ is the measured compensation value of the reflector spacing.
After fitting with the SVD algorithm, the optical axis equation of the system following the coaxiality adjustment is:
$$ \begin{cases} y = 0.00001\,x + 0.002 \\ z = 0.00049\,x + 0.112 \end{cases} \tag{13} $$
Within the working distance of 75–1500 mm, a two-dimensional detection target is arranged every 50 mm to simultaneously record the spot coordinates of the laser and the rangefinder. The verification results show that the coaxiality deviation is  0.01   mm , which meets the requirements of high-precision measurement.
In this study, camera optical axis calibration and pixel-to-distance conversion factor calibration were conducted simultaneously using the camera software. The optical axis calibration was carried out in conjunction with the distance meter calibration experiment. By adjusting the lens focal length to achieve clear imaging and utilizing the center marker feature of the MVS V4.5.1 software, the mapping coordinates of the image center point on the two-dimensional detection target were recorded. Multiple data sets were obtained by moving the detection target, and spatial fitting was performed to derive the camera optical axis equation, as shown in (14):
$$ \frac{x - 683.33333}{0.99998} = \frac{y + 2.76333}{0.00331} = \frac{z - 30.21667}{0.00216} \tag{14} $$
To meet the requirements for beam break recovery, it was also necessary to calibrate the pixel-to-real-distance conversion factor of the camera. Images of the detection target were captured at different radial distances, and the central 100 mm² region of each image was used to calculate the pixel scale. A two-dimensional conversion curve was fitted using multiple data sets. Experimental results verified that the camera target plane was perpendicular to the reference plane of the tracking head (rotational errors were negligible). Therefore, only the positional offset between the camera and the tracking head needed to be calibrated, eliminating the need for complex pose matrix calibration.
According to prior knowledge, the camera’s length conversion function typically follows a power function form. Using the Curve Fitting Toolbox in MATLAB, a two-dimensional curve fitting for the camera’s length conversion function was performed. The fitting results, presented in Figure 6, demonstrate high reliability based on metrics such as the Sum of Squared Errors (SSE), R-square coefficient, and Root Mean Square Error (RMSE). Therefore, the length conversion function  R ( d )  of the camera at different radial distances d is:
$$ R(d) = 488700\, d^{-1.651} + 6.412 \tag{15} $$
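The following sketch shows one way to reproduce this kind of power-function fit with SciPy's curve_fit and to report SSE, R-square, and RMSE. The sample data are synthesized around the published model purely for illustration and are not the calibration measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def pixel_scale(d, a, b, c):
    """Assumed power-function form R(d) = a * d**b + c for the pixel-to-distance factor."""
    return a * np.power(d, b) + c

# Hypothetical samples: radial distance d (mm) vs. measured pixels-per-mm ratio
d = np.array([200, 300, 400, 600, 800, 1000, 1200, 1500], float)
R = 488700.0 * d ** -1.651 + 6.412 + np.random.normal(0, 0.05, d.size)

popt, _ = curve_fit(pixel_scale, d, R, p0=(1e5, -1.5, 5.0), maxfev=20000)
a, b, c = popt
resid = R - pixel_scale(d, *popt)
sse = float(np.sum(resid ** 2))
rmse = float(np.sqrt(np.mean(resid ** 2)))
r2 = 1.0 - sse / float(np.sum((R - R.mean()) ** 2))
print(f"R(d) = {a:.1f} * d^{b:.3f} + {c:.3f}  (SSE={sse:.4f}, R^2={r2:.5f}, RMSE={rmse:.4f})")
```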
The overall optical axis model of the tracking head and its components is shown in Figure 7.

3.3. Principle of Light Interruption and Reconnection Based on Visual Indexing

Rapid beam break (light interruption) recovery is one of the prominent advantages of this system compared with traditional laser trackers. Based on the tracking head error model described in the previous section, a machine vision-based beam break recovery control model was designed, in which the target ball state information provided by the image processing algorithm is combined with the two parts of the tracking head optical axis model to achieve rapid recovery. The corresponding control principle is as follows:
Assume that the current measurement target is lost and the system enters the beam loss state. The controller captures the video stream from the camera and applies the corresponding image processing algorithms to each frame [14,15]. After successful processing, the pixel coordinates $(u_t, v_t)$ of the target ball's center in the image and the pixel radius $r_t$ of the target ball's shadow ring are obtained. From the ratio of this pixel radius to the target ball's actual radius $r_c$, the pixel-to-distance ratio of the camera image at the current radial distance is calculated:
$$ R_n = \frac{r_t}{r_c} \tag{16} $$
Performing the inverse transformation of Equation (15) and substituting $R_n = R(d)$ gives the current radial distance $d_n$:
$$ d_n = \left( \frac{488700}{R_n - 6.412} \right)^{\frac{1}{1.651}} \tag{17} $$
Based on the camera parameters, the pixel coordinates of the image center $C_0$ are (1536, 1024). The pixel differences between the center of the target ball and $C_0$ in the horizontal and pitch directions are calculated and then converted into real distances using $R_n$:
$$ D_{Ht\text{-}c} = \frac{u_t - 1536}{R_n} \tag{18} $$
$$ D_{Vt\text{-}c} = \frac{v_t - 1024}{R_n} \tag{19} $$
In these equations, $u_t$ and $v_t$ are the horizontal and pitch pixel coordinates of the target ball's center, $D_{Ht\text{-}c}$ is the true distance of the target ball position relative to the camera optical axis in the horizontal direction, and $D_{Vt\text{-}c}$ is the corresponding true distance in the pitch direction.
Substituting the radial distance $d_n$ as the X-axis value into the camera optical axis model, Equation (14), gives the true distance differences $D_{Hc\text{-}o}$ and $D_{Vc\text{-}o}$ between the camera optical axis and the main optical axis of the tracking head in the horizontal and pitch directions at the current radial distance $d_n$:
$$ D_{Hc\text{-}o} = 0.0033\, d_n - 0.5014 \tag{20} $$
$$ D_{Vc\text{-}o} = 0.0022\, d_n + 31.6927 \tag{21} $$
The true distances $D_{Ht\text{-}o}$ and $D_{Vt\text{-}o}$ of the target ball center relative to the main optical axis of the tracking head in the horizontal and pitch directions are then:
$$ D_{Ht\text{-}o} = D_{Ht\text{-}c} + D_{Hc\text{-}o} \tag{22} $$
$$ D_{Vt\text{-}o} = D_{Vt\text{-}c} + D_{Vc\text{-}o} \tag{23} $$
Finally, the arctangent function is used to calculate the deflection angles $Rad_H$ and $Rad_V$ from the instrument reference point (the origin of the tracking head coordinate system) to the center of the target ball in the horizontal and pitch directions:
$$ Rad_H = \tan^{-1}\!\left( \frac{D_{Ht\text{-}o}}{d_n} \right) \tag{24} $$
$$ Rad_V = \tan^{-1}\!\left( \frac{D_{Vt\text{-}o}}{d_n} \right) \tag{25} $$
After all the above calculations are completed, the system drives the two-axis turntable to rotate toward the target ball based on the values of $Rad_H$ and $Rad_V$, so that the main optical axis of the tracking head aligns with the center of the target ball and the instrument's beam break recovery is completed. To encapsulate the closed-loop control logic of the visual indexing-based beam reconnection algorithm detailed earlier, Figure 8 presents a concise flowchart. The diagram integrates the core operational chain from beam-break detection to successful beam reconnection, encompassing target recognition, spatial parameter calculation, offset analysis, and turntable adjustment. Each node reflects a key functional step of the algorithm and maps directly to the analysis and computations in the preceding sections, offering an intuitive overview of how the system achieves rapid, initialization-free beam recovery.
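To make the chain of Equations (16)-(25) concrete, the sketch below performs the same computation for a single camera frame. The pixel inputs are hypothetical, and the assumed SMR radius of 19.05 mm (a 1.5-inch sphere) should be replaced by the actual target radius used with the system.

```python
import numpy as np

# Coefficients below restate the fitted models of Eqs. (15), (20) and (21) as
# reconstructed in this section; the pixel inputs are hypothetical examples.
C0 = (1536.0, 1024.0)          # image centre in pixels
R_C = 19.05                    # assumed SMR radius in mm (1.5-inch sphere)

def recover_beam(u_t, v_t, r_t):
    """Compute the turntable deflection angles from one camera frame (Eqs. (16)-(25))."""
    R_n = r_t / R_C                                      # pixels per mm, Eq. (16)
    d_n = (488700.0 / (R_n - 6.412)) ** (1.0 / 1.651)    # radial distance, Eq. (17)

    # target offset from the camera optical axis, converted to millimetres
    DH_tc = (u_t - C0[0]) / R_n
    DV_tc = (v_t - C0[1]) / R_n

    # offset between camera optical axis and main optical axis at distance d_n
    DH_co = 0.0033 * d_n - 0.5014
    DV_co = 0.0022 * d_n + 31.6927

    DH_to = DH_tc + DH_co
    DV_to = DV_tc + DV_co

    rad_h = np.arctan2(DH_to, d_n)                       # horizontal deflection, Eq. (24)
    rad_v = np.arctan2(DV_to, d_n)                       # pitch deflection, Eq. (25)
    return d_n, np.degrees(rad_h), np.degrees(rad_v)

print(recover_beam(u_t=1710.0, v_t=890.0, r_t=240.0))
```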

4. System Implementation and Experiments

4.1. Hardware Selection and Assembly

The system employs a biaxial rotary stage as the driving mechanism for the tracking head. The core performance parameters are listed in Table 4. The rotary stage utilizes a worm gear drive and a 100 W servo motor, supplemented with a LAMOTION circular encoder to achieve an angular resolution of  0.019 . The system adopts triple closed-loop control, incorporating current, velocity, and position loops. Communication with the host computer is established via an RS232 serial interface, which supports secondary development through the Software Development Kit (SDK).
The tracking head (127 × 113 × 128 mm, 1.2 kg) is connected to the rotary stage platform via an optical support rod. The support rod is adjusted to align the reference point with the 80 mm customized vertical offset from the stage platform, and the optical path is calibrated using an optical scale. The integrated vision-based laser tracker satisfies both laser interferometric distance measurement and dynamic tracking requirements.

4.2. Software Design

The upper computer control software was developed using C++17 and the Qt5.14.2 framework, as shown in Figure 9. It integrates OpenGL for 3D rendering and the Point Cloud Library (PCL1.14.0) for point cloud processing, enabling precise control and data management for the laser tracker. The core software functions include instrument control, dynamic tracking, data visualization, and algorithm integration. Pan-tilt motion control (at a speed of 10°/s), emergency stop, and zero-reset operations are implemented via interface buttons, with command transmission managed through the Qt signal-slot mechanism and the rotary stage control library.
The real-time monitoring module refreshes rotary stage angles (with a grating closed-loop resolution of  0.019 ), speed, PSD spot coordinates (tolerance of 0.1 mm), and distance meter correction values at 50 Hz. Dynamic system status is indicated using color-coded lights: blue for running, red for beam loss, orange for beam alignment, and green for target acquisition. The 3D visualization module, based on OpenGL, constructs the measurement coordinate system, dynamically rendering a magnetic field line model (with a 1 m radius representing actual distance) and measurement points (represented as blue-black spheres). It supports mouse-based interaction for viewpoint adjustment and zooming, offering intuitive visualization of spatial data distribution.
Dynamic tracking functionality is executed in a separate thread, integrating PSD feedback and vision-based algorithms (supporting YOLOv5/v8 and OpenCV) for beam break recovery. When spot displacement exceeds the threshold, the rotary stage triggers closed-loop adjustment. The data recording module supports single or continuous measurements (at 1 s intervals), converting polar coordinates into 3D coordinates for storage while providing invalid data filtering.
Point cloud processing leverages the PCL library for spatial line and plane fitting as well as custom model fitting (e.g., arcs, spheres) using least-squares methods. Fitting results are overlaid onto the OpenGL visualization. The camera module interfaces with the Hikvision SDK for image stream callbacks, performs distortion correction, and utilizes OpenCV or YOLO (converted to the cross-platform ONNX format) for real-time target sphere detection with overlaid visualization of detection results. The parameter configuration interface supports read/write operations for 15 parameters, including rotary stage acceleration and camera exposure settings, automatically detecting serial ports and initializing devices.
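As a simple stand-in for the detection stage described above (the actual software supports both OpenCV-based processing and YOLO models), the following sketch detects a single circular target in an undistorted frame with the Hough circle transform; all thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_smr(frame_bgr, K, dist):
    """One possible OpenCV fallback for SMR detection.

    Returns the pixel centre (u_t, v_t) and radius r_t of the most salient circle,
    or None if nothing is found. K and dist are the calibrated intrinsics/distortion.
    """
    undistorted = cv2.undistort(frame_bgr, K, dist)       # distortion correction first
    gray = cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                        # suppress sensor noise
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                               param1=120, param2=40, minRadius=40, maxRadius=600)
    if circles is None:
        return None
    u, v, r = circles[0, 0]                               # strongest candidate
    return float(u), float(v), float(r)
```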
Communication between the system and the rotary stage control unit is established via RS232 serial communication. Triple closed-loop control (current, velocity, position) ensures high-precision dynamic response, and the grating closed-loop enhances angular resolution to  0.019 . The main control thread coordinates PSD, distance meter, and vision subsystems, ensuring real-time performance and system stability to meet the high-precision measurement and rapid recovery demands of the laser tracker in complex scenarios.

4.3. The Workflow of the Laser Tracking System

Similar to traditional laser trackers, this system operates in multiple states. The specific work tasks and workflow are shown in Figure 10. After receiving the power-on command, the system first performs serial port checks on all sensors. Once these are confirmed correct, it connects all sensors to complete the startup process and enters the basic operating state. During startup, system initialization is also carried out, which mainly includes setting the initial parameters of each sensor and checking the optical axis offset.

After all parameters have been set, the tracking and measurement function can be activated. First, the target SMR is placed at the point to be measured; at this moment the system is in the beam loss state, and it executes the SMR recognition algorithm on the camera image. Relying on the beam break recovery control model, it then deduces the world coordinates of the SMR from the pixel coordinates of the SMR's center and controls the rotary stage to turn toward the target. When the rotary stage reaches a position where the laser beam can enter the SMR, the PSD outputs the offset of the reflected light spot and the system enters the beam alignment state. In this state, the current spherical coordinates of the SMR are calculated from the PSD output and the distance measured by the rangefinder, and the rotary stage continues to rotate until the PSD output is zero, at which point the system enters the target-aligned state. Coordinate measurement can be performed in the target-aligned state; after a measurement, the target SMR is moved to the next measurement point and the procedure is repeated until all points have been measured.

If the system enters the beam loss state during measurement (for example, because the SMR moves too fast or the optical path is blocked), it re-executes the target search task until it returns to the target-aligned state. Once all points have been measured, the system performs point cloud fitting according to the selected fitting target and outputs the basic geometric parameters as reference results. At the end of operation, position re-measurement and data export can optionally be performed; otherwise, all processes are terminated and the system returns to the basic operating state.
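The operating states described above can be summarized by a small state-transition sketch. This is a didactic abstraction of the workflow in Figure 10, not the controller code of the prototype.

```python
from enum import Enum, auto

class TrackerState(Enum):
    BASIC = auto()           # powered on, sensors connected, idle
    BEAM_LOSS = auto()       # no return beam: run camera search and beam break recovery
    BEAM_ALIGNMENT = auto()  # return beam on the PSD, spot offset not yet zeroed
    TARGET_ALIGNED = auto()  # PSD offset at zero: coordinate measurement allowed

def next_state(psd_has_spot: bool, psd_offset_zero: bool) -> TrackerState:
    """Simplified transition rule distilled from the workflow description."""
    if not psd_has_spot:
        return TrackerState.BEAM_LOSS
    if not psd_offset_zero:
        return TrackerState.BEAM_ALIGNMENT
    return TrackerState.TARGET_ALIGNED
```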

4.4. Precision Testing Experiment

A relative error test was conducted on the independently developed off-axis laser tracker. Due to the off-axis measurement approach, where the laser and distance measurement axes are separated, traditional error models were not directly applicable. Therefore, the system’s performance was evaluated by comparing measurement data from the proposed system (experimental group) and a commercial FARO laser tracker (reference group).
A Thorlabs high-precision two-axis electric displacement stage (accuracy: 0.1  μ m) was used to drive the SMR target sphere along the Y and Z axes (horizontal range: 0–8 mm, vertical range: 0–15 mm, with 1 mm steps). A total of 144 sets of 3D coordinate data were simultaneously recorded from both the proposed system and the FARO tracker (preheated for 30 min) [16].
The datasets from the FARO tracker and the proposed system were designated as the reference and experimental groups, respectively. Since both datasets were measured within their respective coordinate systems, the reference group data were rescaled and reprojected onto the experimental system’s coordinate framework for coordinate-aligned comparative analysis. The two point clouds were visually displayed in 3D using MATLAB, as shown in Figure 11.
By assuming parallelism between the XOY planes of the two systems and retaining only rotation around the Z-axis and origin translation parameters, the reference group data were projected onto the coordinate system of the experimental group. Based on the experimental data, a statistical analysis was conducted to evaluate the mean error (ME), maximum error (R), standard deviation (SD), and root mean square error (RMSE) of the proposed system relative to the FARO laser tracker. Relative errors between corresponding measurement points from both datasets were calculated for the X, Y, and Z measurement directions, as well as for the spatial Euclidean distance. The results are presented in Table 5.
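A compact sketch of the coordinate alignment and error statistics described above is given below. It restricts the alignment to a rotation about Z plus a translation, as assumed in the paper; the statistics shown are computed on spatial point-to-point distances, and per-axis statistics follow analogously.

```python
import numpy as np

def align_z_rotation(ref, exp):
    """Project reference-group points into the experimental frame, allowing only
    a rotation about Z and a translation (XOY planes assumed parallel)."""
    ref, exp = np.asarray(ref, float), np.asarray(exp, float)
    ref_c, exp_c = ref.mean(axis=0), exp.mean(axis=0)
    # 2-D Kabsch on the XY coordinates gives the in-plane rotation
    H = (ref[:, :2] - ref_c[:2]).T @ (exp[:, :2] - exp_c[:2])
    U, _, Vt = np.linalg.svd(H)
    R2 = Vt.T @ U.T
    if np.linalg.det(R2) < 0:          # enforce a proper rotation
        Vt[-1] *= -1
        R2 = Vt.T @ U.T
    Rz = np.eye(3)
    Rz[:2, :2] = R2
    t = exp_c - Rz @ ref_c
    return ref @ Rz.T + t               # aligned reference points

def error_stats(aligned_ref, exp):
    """Mean error, maximum error, standard deviation and RMSE of the
    point-to-point spatial distances between two aligned clouds."""
    d = np.linalg.norm(np.asarray(aligned_ref) - np.asarray(exp), axis=1)
    return d.mean(), d.max(), d.std(ddof=1), float(np.sqrt(np.mean(d ** 2)))

# Hypothetical usage with two (144, 3) coordinate arrays in millimetres:
# me, r, sd, rmse = error_stats(align_z_rotation(faro_points, prototype_points), prototype_points)
```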
The results indicate that the mean error in the X direction was −0.025 mm, with an RMSE of 0.030 mm, primarily influenced by the stability of the distance meter. The RMSEs in the Y and Z directions reached 0.150 mm and 0.129 mm, respectively, reflecting combined limitations in distance meter accuracy and rotary stage stability. The spatial distance exhibited a mean error of 0.191 mm and a maximum error of 0.945 mm, revealing deficiencies in the mechanical repeatability of the system.
Additionally, the experiment observed a systematic X-axis displacement of approximately 0.009 mm per millimeter during Y/Z axis movement of the target sphere. This phenomenon is attributed to the assembly inclination of the displacement stage. However, as the reference group data inherently compensated for this effect, this error does not significantly impact the performance evaluation of the proposed system.
The primary error sources of the system were identified as angular error, distance measurement error, tracking error, and assembly error. Firstly, angular error mainly arises from the angle measurement error of the rotary encoder, as well as orthogonality deviation between the two axes of the rotary stage and mechanical installation errors, all of which further affect attitude calculation accuracy. Based on the rotary stage repeatability specification of 0.003°, and applying small-angle approximation, the angular error-induced spatial deviation at a 1 m measurement range was approximately 0.052 mm.
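For reference, the stated value follows directly from the small-angle approximation:

$$ \Delta s \approx \theta L = 0.003^{\circ} \times \frac{\pi}{180^{\circ}} \times 1000\ \text{mm} \approx 5.24 \times 10^{-5}\ \text{rad} \times 1000\ \text{mm} \approx 0.052\ \text{mm} $$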
Secondly, distance measurement error mainly originates from the random noise and systematic bias of the distance meter itself. According to the specifications of the selected distance meter, the distance measurement error within a 1 m range was approximately 0.06 mm. Thirdly, tracking error refers to positional deviations caused by servo lag, dynamic dead zones, and control response delays during dynamic target tracking. Finally, assembly error primarily includes orthogonality deviation between the two axes, installation errors of optical components, and central axis misalignment. These assembly errors lead to discrepancies between the actual measurement direction and the ideal geometric model, thereby affecting measurement accuracy.
According to the measurement results in Table 5, the spatial RMSE of the system was 0.189 mm. Considering that the combined contribution of angular and distance measurement errors was 0.112 mm, the remaining error of approximately 0.077 mm can be primarily attributed to the combined effects of tracking error and assembly error.
Although the system achieved single-axis measurement accuracy at the 0.05 mm level within a 1.0 m measurement range, spatial error was evidently constrained by the stability of the rotary stage control. The observed high maximum error and RMSE indicate that further optimization of the mechanical structure and control algorithms is necessary. The experiments validated the baseline measurement capability of the prototype system, providing essential data support for future accuracy enhancement and functional improvements. To clarify the quantitative improvement effect of reducing or compensating key error sources on the total measurement error, a quantitative analysis is conducted based on the error decomposition results (Table 5) and the independent contribution ratio of each error source, las follows:
  • Quantitative improvement potential of angular error. The current angular error (caused by encoder resolution, turntable axis orthogonality deviation, and mechanical installation errors) results in a spatial deviation of approximately 0.052 mm at a measurement distance of 1 m, accounting for 27.5% of the total error (0.052/0.189). By adopting a higher-resolution encoder (improving the repeatability from the existing 0.003° to 0.001°) and correcting the axis deviation through an orthogonality error compensation algorithm, the spatial deviation caused by angular error can be reduced to 0.017 mm. This single reduction contributes approximately 0.035 mm, corresponding to an 18.5% decrease in the total measurement error.
  • Quantitative improvement potential of distance measurement error. The inherent error of the distance measurement module (random noise + systematic deviation) is approximately 0.06 mm, accounting for 31.7% of the total error (0.06/0.189). By introducing a temperature compensation algorithm (correcting the impact of ambient temperature on distance measurement accuracy) and moving average filtering to handle random noise, the distance measurement error can be reduced to 0.03 mm. This single reduction contributes approximately 0.03 mm, corresponding to a 15.9% decrease in the total measurement error.
  • Quantitative improvement potential of assembly error. The current contribution of assembly error (including axis orthogonality deviation, optical component installation errors, and central axis misalignment) is approximately 0.077 mm, accounting for 40.8% of the total error (0.077/0.189). By improving mechanical processing precision (optimizing the axis orthogonality deviation from the current 0.01 mm to 0.005 mm) and performing precise calibration of the optical axis using a laser interferometer, the assembly error can be reduced to 0.01 mm. This single reduction contributes approximately 0.067 mm, corresponding to a 35.4% decrease in the total measurement error.
Combining the optimal reduction schemes for the three types of errors mentioned above, the total measurement error of the system can be reduced from the current spatial RMSE = 0.189 mm to approximately 0.085 mm, with a relative improvement of 55.0%. At this point, the relative positioning accuracy of the system at the maximum working distance of 1.5 m can be better than 0.1 mm, further meeting the requirements of high-precision industrial measurement scenarios. The above quantitative analysis is based on the assumption of independent error effects; in practical applications, the error coupling effect needs to be considered, and the final accuracy improvement is expected to be between 50% and 55%.

5. Conclusions

This study addresses the technical pain points of traditional laser trackers, such as low beam break recovery efficiency and strict dependence on optical coaxiality and mechanical processing precision. A machine vision-based laser 2D tracking system with rapid beam break recovery capability is proposed, and the system architecture design, multi-dimensional error calibration method research, prototype development, and experimental verification are completed. The main research conclusions are as follows:
1. An off-axis optical splitting design scheme is proposed, where the vision recognition optical path and the distance measurement optical path are independently configured. The coordinated cooperation of the two optical axes is realized through a cube beam splitter, which effectively reduces the system’s dependence on high-precision optical coupling and mechanical assembly. Meanwhile, the system miniaturization (300 mm × 300 mm × 310 mm) and low-cost development (prototype cost of approximately $25,000) are achieved, providing a cost-effective solution for short-distance high-precision measurement scenarios.
2. A visual indexing-based beam break recovery mechanism is constructed, forming a closed-loop process of “beam break detection → target ball visual positioning → pixel-distance conversion → spatial offset calculation → turntable attitude adjustment”. Rapid beam reconnection can be completed without re-initialization, which solves the problem of low recovery efficiency of traditional laser trackers after beam break and improves the continuity of dynamic measurement.
3. A multi-dimensional error calibration system is established to provide key support for the high-precision measurement of the system: (1) Zhang’s calibration method for monocular cameras is adopted, which can simultaneously solve intrinsic and extrinsic parameters using a planar checkerboard target. It effectively corrects lens radial distortion (0.0036) and tangential distortion (−0.0013) without the need for high-precision 3D calibration components, reducing calibration costs and operational complexity. (2) A tracking head optical axis calibration method based on the Singular Value Decomposition (SVD) algorithm is proposed. By fitting the optical axis point cloud data of the laser, rangefinder, and camera, the installation errors of optical components (e.g., 18.3 mm linear offset of the rangefinder) and beam tilt deviations are quantified and compensated. After calibration, the coaxiality deviation of the three optical axes is  0.01   mm , which significantly reduces the inherent system error. (3) A power function-based pixel-distance conversion model is established. Experimental verification shows excellent fitting reliability, and the radial distance can be inversely derived only through the ratio of the pixel radius of the target ball’s shadow ring to the actual radius, providing a fast and accurate distance estimation basis for beam break recovery.
4. Experimental verification and error analysis indicate that the system achieves a spatial Root Mean Square Error (RMSE) of 0.189 mm and a relative positioning accuracy better than 0.2 mm at the maximum working distance of 1.5 m, meeting the requirements of short-distance high-precision measurement. Meanwhile, it is clarified that the tracking head mechanical system, rotary encoder, and distance measurement module are the core factors determining the stable and accurate operation of the device. Among them, the assembly accuracy of the tracking head mechanical structure (e.g., axis orthogonality deviation, optical component installation coaxiality), the angle measurement accuracy of the rotary encoder, and the measurement accuracy of the distance measurement module together constitute the main sources of the system’s total error, and are also the key links restricting the improvement of the system’s measurement accuracy.
Future research will focus on three optimization directions: first, improving optical coupling efficiency and operational stability under complex environmental conditions (e.g., reducing the impact of ambient light interference); second, enhancing the rigidity and assembly precision of the tracking head’s mechanical structure to reduce axis deviation and optical component installation errors; third, adopting high-resolution encoders and developing targeted angular error compensation algorithms to further weaken the constraints of mechanical defects and encoder performance on measurement accuracy, providing technical support for the iterative upgrading of system performance.

Author Contributions

Conceptualization, D.L. and X.W.; methodology, D.L. and T.C.; software, X.W. and T.C.; validation, X.W. and T.C.; formal analysis, D.L.; investigation, X.W. and T.C.; resources, D.L.; data curation, X.W.; writing—original draft preparation, D.L.; writing—review and editing, D.L., X.W. and T.C.; visualization, X.W.; supervision, D.L.; project administration, D.L.; funding acquisition, D.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Science and Technology Major Project of China, grant number 2024ZD0702901.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to thank the University of Science and Technology Beijing for providing the research facilities.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

SMR: Spherically Mounted Retroreflector
PSD: Position Sensitive Detector
ADM: Absolute Distance Meter
SVD: Singular Value Decomposition
RMSE: Root Mean Square Error
SD: Standard Deviation
ME: Mean Error

References

  1. Lv, Z.; Cai, L.; Wu, Z.; Cheng, K.; Chen, B.; Tang, K. Target Locating of Robots Based on the Fusion of Binocular Vision and Laser Scanning. In Proceedings of the 2021 International Conference on Computer Engineering and Artificial Intelligence (ICCEAI), Shanghai, China, 27–29 August 2021; pp. 427–430. [Google Scholar] [CrossRef]
  2. Ren, F.; Yang, T.; Wang, X.; Ma, Z.; Li, X.; Jiang, H.; Zhao, H. On-line Measurement System of Aerospace Large Components Based on Vision Measurement System and Robotic Arm. In Proceedings of the 2024 9th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), Dalian, China, 18–20 July 2024; pp. 132–138. [Google Scholar] [CrossRef]
  3. Feng, T.; Cui, C.; Li, J.; Zhou, W.; Dong, D.; Zhang, Z.; Wang, G.; Qiu, Q.; Wang, S. High-velocity measurement method in dual-frequency laser interference tracker based on beam expander and acousto-optic modulator. Opt. Express 2023, 31, 24230–24242. [Google Scholar] [CrossRef] [PubMed]
  4. Pan, X.; Liu, F.; Liu, Q.; Wang, Z.; Li, J.; Cheng, P. Omnidirectional Vector Measurement Method for Accurate Laser Positioning System. In Proceedings of the 2024 7th International Conference on Intelligent Robotics and Control Engineering (IRCE), Xi’an, China, 7–9 August 2024; pp. 184–189. [Google Scholar] [CrossRef]
  5. Yang, L.; Pan, Y.; Lin, J.; Liu, Y.; Shang, Y.; Yang, S.; Cao, H. Automatic Guidance Method for Laser Tracker Based on Rotary-Laser Scanning Angle Measurement. Sensors 2020, 20, 4168. [Google Scholar] [CrossRef] [PubMed]
  6. Mutilba, U.; Gomez-Acedo, E.; Kortaberria, G.; Olarra, A.; Yagüe-Fabra, J.A. Traceability of On-Machine Tool Measurement: A Review. Sensors 2017, 17, 1605. [Google Scholar] [CrossRef] [PubMed]
  7. Ma, D.; Li, J.; Feng, Q.; Zhao, Y.; Cui, J.; Wu, L. Method and system for simultaneously measuring six degrees of freedom motion errors of a rotary axis based on a semiconductor laser. Opt. Express 2023, 31, 24127–24141. [Google Scholar] [CrossRef] [PubMed]
  8. ASME B89.4.19-2021; Performance Evaluation of Laser-Based Spherical Coordinate Measurement Systems. American Society of Mechanical Engineers: New York, NY, USA, 2021.
  9. ISO 10360-10:2021; Geometrical Product Specifications (GPS)—Acceptance and Reverification Tests for Coordinate Measuring Systems (CMS)—Part 10: Laser Trackers. International Organization for Standardization: Geneva, Switzerland, 2021.
  10. Dong, M.; Qin, X.; Fu, Y.; Xu, H.; Wang, J.; Wu, D.; Wang, Z. A Vision Algorithm for Robot Seam Tracking Based on Laser Ranging. In Proceedings of the 2023 3rd International Conference on Computer, Control and Robotics (ICCCR), Shanghai, China, 24–26 March 2023; pp. 175–179. [Google Scholar] [CrossRef]
  11. Ding, G.; Chen, T.; Sun, L.; Fan, P. High Precision Camera Calibration Method Based on Full Camera Model. In Proceedings of the 2024 36th Chinese Control and Decision Conference (CCDC), Xi’an, China, 25–27 May 2024; pp. 4218–4223. [Google Scholar] [CrossRef]
  12. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  13. López-López, L.E.; Luviano-Cruz, D.; Cota-Ruiz, J.; Díaz-Roman, J.; Sifuentes, E.; Silva-Aceves, J.M.; Enríquez-Aguilera, F.J. A Configurable Parallel Architecture for Singular Value Decomposition of Correlation Matrices. Electronics 2025, 14, 3321. [Google Scholar] [CrossRef]
  14. Nguyen, Q.C.; Hua, H.Q.B.; Pham, P.T. Development of a vision system integrated with industrial robots for online weld seam tracking. J. Manuf. Process. 2024, 119, 414–424. [Google Scholar] [CrossRef]
  15. Xu, W. Research and Design of Weld Seam Tracking Algorithm Based on Machine Vision and Machine Learning. In Proceedings of the 2023 IEEE 5th International Conference on Power, Intelligent Computing and Systems (ICPICS), Shenyang, China, 14–16 July 2023; pp. 359–363. [Google Scholar] [CrossRef]
  16. Deng, R.; Shi, S.; Yang, L.; Lin, J.; Zhu, J. Dynamic measurement method based on temporal-spatial geometry constraint and optimized motion estimation. Measurement 2024, 227, 114269. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of the proposed laser 2D tracking system.
Figure 2. Exploded view of the tracking head structure of the proposed system.
Figure 3. Schematic diagram of the optical path configuration of the proposed system.
Figure 4. Schematic diagram of camera model and imaging coordinate systems.
Figure 5. The fitting results for the laser optical axis.
Figure 6. The fitting results of the camera’s length conversion function at different radial distances.
Figure 7. The overall optical axis model of the tracking head and its components.
Figure 8. Highly concise flowchart of the visual indexing-based beam reconnection algorithm.
Figure 9. The display interface of the host computer software in this system.
Figure 10. Workflow diagram of the proposed system.
Figure 11. Three-dimensional visualization of the two experimental datasets.
Table 1. Characteristic comparison of the proposed system and commercial laser trackers.

| Characteristic Indicators | Proposed System | API T3 | Leica AT960 |
|---|---|---|---|
| Total System Weight | ≤10 kg (simplified structure) | 23 kg (with controller) | 21 kg (integrated host) |
| Core Optical Architecture | Dual-mirror coaxial design (adapted for short-distance) | Built-in multi-optical components (general long-distance) | Built-in multi-optical components (general long-distance) |
| Anti-interference Adaptation Scenarios | Short-distance, indoor complex light environment (no external shading required) | Long-distance, stable lab environment (temperature control dependent) | Long-distance, standard working conditions (temperature control dependent) |
| Measurement Distance Range | 0–1.5 m (short-distance dedicated) | 0–60 m (long-distance general) | 0–60 m (long-distance general) |
| Total Calibration Time (including preheating) | ≤5 min (rapid assembly) | 15 min (13 min preheating + 2 min calibration) | ≥12 min (professional operation required) |
| Main Application Scenarios | Short-distance high-precision assembly, indoor small-space measurement | Large-size workpiece inspection, long-distance positioning | Large equipment calibration, ultra-long-distance measurement |
Table 2. Component information list of the tracking head.

| Figure Label | Component Category | Component Model | Origin (City, Country) | Quantity |
|---|---|---|---|---|
| 1 | Cube Beam Splitter | M2-BS1 | Guangzhou, China | 1 |
| 2 | Industrial Camera | MV-CE060-10UC | Hangzhou, China | 1 |
| 3 | Rangefinder | L2 | Shenzhen, China | 1 |
| 4 | PSD Sensor | DRX-PSD500 | Shenzhen, China | 1 |
| 5 | Laser | HW650DMGX100-1660BD | Xi’an, China | 1 |
| 6 | Industrial Lens | MVL-HF1624M-10MP | Hangzhou, China | 1 |
| 7 | PSD Amplifier Circuit Board | DRX-2DPSD-OA11 | Shenzhen, China | 1 |
| 8 | Adjustable Fixing Frame | CRD-2X | Guangzhou, China | 1 |
| 9 | Cage Plate Adapter | LPCM-C | Guangzhou, China | 1 |
| 10 | Cage Coaxial Rod | PCM-S19 | Guangzhou, China | 4 |
| 10 | Cage Coaxial Rod | PCM-S38 | Guangzhou, China | 4 |
| 10 | Cage Coaxial Rod | PCM-S76 | Guangzhou, China | 4 |
| 11 | Pressure Ring Lens Holder | CSJ-25 | Guangzhou, China | 2 |
| 12 | Laser Mounting Hole Frame | POL-12 | Guangzhou, China | 2 |
| 13 | Lens Light Shield Tube | CSA-45 | Guangzhou, China | 2 |
| 13 | Lens Light Shield Tube | CSA-08 | Guangzhou, China | 1 |
| 13 | Lens Light Shield Tube | CSA-08T | Guangzhou, China | 1 |
| 14 | Narrow Band Filter | NP650 | Guangzhou, China | 3 |
| 15 | Telescopic Optical Support Rod | CAT57-S | Guangzhou, China | 1 |
| 16 | Coaxial Light Shield Tube | Customized | Guangzhou, China | 1 |
Table 3. Camera Parameters.

| Parameters | Value |
|---|---|
| Image Size | 2048, 3072 |
| Radial Distortion | 0.0036, 0.4924 |
| Tangential Distortion | −0.0013, 0.0040141 |
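As a reading aid for Table 3, the sketch below illustrates how radial and tangential distortion coefficients of this kind enter the standard pinhole projection model commonly used in camera calibration (cf. Zhang [12]). It is a minimal illustration only: the focal lengths and principal point are hypothetical placeholders, since the intrinsic parameters are not listed in the table, and the code is not the calibration implementation used in this work.

```python
import numpy as np

# Distortion coefficients as listed in Table 3 (k1, k2 radial; p1, p2 tangential).
k1, k2 = 0.0036, 0.4924
p1, p2 = -0.0013, 0.0040141

# Hypothetical intrinsics for illustration only (focal lengths and principal
# point in pixels); the actual values are not reported in Table 3.
fx, fy = 3500.0, 3500.0
cx, cy = 1536.0, 1024.0

def project_with_distortion(point_cam):
    """Project a 3D point in the camera frame to pixel coordinates,
    applying the standard radial + tangential (Brown) distortion model."""
    X, Y, Z = point_cam
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2    # radial distortion factor
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return np.array([fx * x_d + cx, fy * y_d + cy])

print(project_with_distortion(np.array([0.05, -0.02, 1.5])))
```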
Table 4. Core performance specifications of the selected biaxial rotary stage.

| Structural Parameters | |
|---|---|
| Angle Range | Azimuth ±180°, Elevation ±90° |
| Platform Size | 165 × 80 mm |
| Transmission Ratio | 180:1 |
| Drive Mechanism | Worm gear and worm shaft mechanism |
| Motor | 100 W servo motor |
| Central Load Capacity | 10 kg |
| Accuracy Parameters | |
| Resolution | 0.0005° = 1.8″ (with 20 subdivisions) |
| Closed-Loop Mechanism | Grating closed-loop (resolution: 0.0001125°) |
| Maximum Speed | 50°/s |
| Repeatability | 0.003° |
| End Jump Accuracy | 10 μm |
Table 5. Measurement errors in different dimensions.

| Measurement Dimension | Mean Error (ME) [mm] | Maximum Error (R) [mm] | Standard Deviation (SD) [mm] | Root Mean Square Error (RMSE) [mm] |
|---|---|---|---|---|
| X Direction | −0.025 | 0.083 | 0.030 | 0.030 |
| Y Direction | 0.044 | 0.937 | 0.147 | 0.150 |
| Z Direction | 0.034 | 0.761 | 0.126 | 0.129 |
| Spatial Distance | 0.191 | 0.945 | 0.099 | 0.189 |
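For clarity on how the statistics in Table 5 are typically defined, the sketch below computes per-axis and spatial-distance error metrics from paired measured and reference coordinates. It assumes the conventional definitions (ME as the mean signed residual, R as the maximum absolute residual, SD and RMSE over the residuals) and uses simulated data; it is not the evaluation code used in the experiments.

```python
import numpy as np

def error_statistics(measured, reference):
    """Per-axis and spatial-distance error statistics.

    `measured` and `reference` are (N, 3) arrays of XYZ coordinates in mm.
    Assumes the usual definitions: ME = mean signed residual, R = max |residual|,
    SD = standard deviation of residuals, RMSE = root mean square residual."""
    residuals = measured - reference                 # (N, 3) per-axis errors
    spatial = np.linalg.norm(residuals, axis=1)      # (N,) point-to-point distances
    stats = {}
    for name, e in zip(("X", "Y", "Z", "Spatial"), (*residuals.T, spatial)):
        stats[name] = {
            "ME": e.mean(),
            "R": np.abs(e).max(),
            "SD": e.std(ddof=0),
            "RMSE": np.sqrt(np.mean(e ** 2)),
        }
    return stats

# Toy usage with random data (illustrative only, not the experimental dataset):
rng = np.random.default_rng(0)
ref = rng.uniform(0, 1500, size=(50, 3))             # reference points [mm]
meas = ref + rng.normal(0, 0.1, size=ref.shape)      # simulated measurements
for dim, s in error_statistics(meas, ref).items():
    print(dim, {k: round(v, 3) for k, v in s.items()})
```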
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
