Article

The Design and Implementation of a Dynamic Measurement System for a Large Gear Rotation Angle Based on an Extended Visual Field

1 School of Mechanical Engineering, Shenyang University of Technology, Shenyang 110870, China
2 Engineering Training Centre, Shenyang University of Technology, Shenyang 110870, China
3 School of Mechanical Engineering, Liaoning Mechanical & Electrical College of Technology, Dandong 118009, China
* Author to whom correspondence should be addressed.
Sensors 2025, 25(12), 3576; https://doi.org/10.3390/s25123576
Submission received: 28 April 2025 / Revised: 28 May 2025 / Accepted: 3 June 2025 / Published: 6 June 2025
(This article belongs to the Section Physical Sensors)

Abstract

High-precision measurement of large gear rotation angles is a critical technology in gear meshing-based measurement systems. To address the challenge of high-precision rotation angle measurement for large gears, this paper proposes a binocular vision method. The methodology consists of the following steps: First, sub-pixel edges of calibration circles on a 2D dot-matrix calibration board are extracted using edge detection algorithms to obtain pixel coordinates of the circle centers. Second, a high-precision calibration of the measurement reference plate is achieved through a 2D four-parameter coordinate transformation algorithm. Third, binocular cameras capture images of the measurement reference plate attached to the large gear before and after rotation. Coordinates of the cameras' field-of-view centers in the measurement reference plate coordinate system are calculated via image processing and rotation angle algorithms, thereby determining the rotation angle of the large gear. Finally, a binocular vision rotation angle measurement system was developed, and experiments were conducted on a 600 mm-diameter gear to validate the feasibility of the proposed method. The results demonstrate a measurement accuracy of 7 arcseconds (7″) and a repeatability precision of 3 arcseconds (3″) within the 0–30° rotation range, indicating high accuracy and stability. The proposed method and system effectively meet the requirements for high-precision rotation angle measurement of large gears.

1. Introduction

In the field of precision metrology, angle measurement technology plays a pivotal role, particularly in industrial applications, such as aerospace equipment manufacturing, precision machining, and high-accuracy quality inspection, where the precise measurement of component rotation parameters holds significant value [1,2]. Modern industry imposes two core requirements on this technology: first, the need to achieve micrometer-level measurement accuracy and, second, the demand for intelligent automated measurement [3]. Depending on the measurement principles, current angle measurement technologies can be primarily categorized into four types: contact-based measurement relying on mechanical transmission principles [4,5,6], electromagnetic induction-based measurement [7,8,9,10], optical interference-based measurement [11,12,13], and machine vision-based measurement utilizing digital image processing [14,15,16,17,18]. Among these, the first two contact-based methods are susceptible to environmental interference, such as temperature, humidity, and vibration. Although optical measurement can achieve sub-micrometer accuracy, its measurement field of view is limited, and the optical system setup is complex. In recent years, thanks to advancements in computational processing power and the widespread adoption of high-resolution industrial cameras, non-contact measurement techniques based on machine vision have demonstrated significant advantages [19,20]. These methods combine high measurement accuracy, excellent stability, and strong dynamic measurement capabilities [21,22].
For machine vision-based measurement of component rotation angles, in [23], the authors introduced a monocular vision-based pose measurement method using cooperative targets, where a circular planar target and feature point extraction algorithm were designed, combined with RANSAC and topological fitting optimization, to achieve pitch and roll angle measurements. In [24], the authors presented a non-contact angle measurement approach based on rotating spot images and machine vision, utilizing elliptical trajectory fitting for angle detection. In [25], the authors proposed a high-precision monocular vision-based rotation angle measurement method that employs coordinate rotation equations to determine the rotation angle of calibration plates. In [26], the authors adopted a non-contact measurement method based on machine vision technology to detect winding tilt angles, conducting a comparative study using an improved interval rotation projection method, a quadratic iterative least squares method, and the Hough transform detection approach. In [27], the authors proposed a machine vision-based method for measuring and controlling brush angles, obtaining data on formed angle values and springback angle values using computer vision and image processing algorithms. In [28], the authors designed a dedicated marker and a novel measurement algorithm based on this marker in a monocular vision measurement system. By identifying auxiliary reference lines that can reflect azimuth angle information in real time, they derived a pitch angle calculation formula through mapping relationships. In [29], the authors developed a machine vision system for real-time measurement of fiber angles on the surface of radially braided preforms, enabling precise angular measurement for preforms with varying diameters. The proposed image analysis algorithm achieves automated angle measurement through edge detection and autocorrelation techniques.
Neuromorphic vision sensors offer a novel approach for dynamic measurement. In [30], the authors utilized retinal neural spike data to quantitatively assess decoded visual stimuli through a neural network decoder, establishing a comprehensive decoding evaluation framework incorporating six image quality assessment metrics. This study successfully achieved the decoding of dynamic visual scenes from retinal spike signals. In [31], the authors proposed a line-based pose tracking method for uncooperative spacecraft utilizing a stereo event camera. Additionally, they constructed a stereo event-based uncooperative spacecraft motion dataset encompassing both simulated and real events. These studies have pointed out directions for future research. However, considering the reliability requirements of measurement systems in industrial settings, this study adopts an industrial camera solution with an extended field of view.
The current monocular vision-based angle measurement methods are significantly limited in high-precision inspection of large-scale components, particularly in the high-accuracy measurement of large gear tooth profiles. First, there is the issue of geometric errors caused by large-field-of-view distortion. Monocular cameras exhibit significant radial and tangential distortion in wide fields of view, causing edge feature point coordinates to deviate from their true positions [32]. This is especially pronounced when using wide-angle lenses, where distortion at the image edges can be 3 to 5 times greater than in the central region, leading to nonlinear accumulation of angle calculation errors. Second, edge resolution degradation and feature extraction failure become problematic. As the size of the measured object increases, the physical dimension represented by each pixel also grows significantly [33]. For example, when measuring a 2 m-diameter gear, a monocular system covering the full field of view may experience edge resolution dropping to just one-tenth of that in the central region, resulting in sub-pixel blurring of critical tooth profile features. Optical measurement faces several limitations in large-scale component angle measurement. Line-of-sight occlusion is a major challenge, particularly in complex structures where self-occlusion occurs in angular regions, such as internal right angles. This often necessitates the use of multiple sensors or repeated repositioning, significantly increasing system complexity [34]. For large parts, dimensional effects amplify error propagation, typically requiring supplementary laser tracker measurements for global datum control to ensure accuracy [35,36]. Additionally, surface characteristics affect measurement results. When surface roughness (Ra) is below 0.2 μm, high reflectivity can introduce specular reflection noise. While anti-glare sprays are commonly used in engineering applications to mitigate this, they typically introduce an additional 15–20 μm of measurement error.
In the field of gear measurement, gears with a module ≥10 mm or a pitch circle diameter >500 mm are typically defined as large gears. The in-machine measurement accuracy of large gear tooth profiles based on the meshing method is primarily determined by gear rotation angle errors and meshing line errors [37]. Tooth profile measurement requires a high-precision angular reference. The tooth profile error specified in gear accuracy standards is calculated as a linear value along the meshing line direction. As the diameter of the measured gear increases, the linear error also increases accordingly, leading to a decline in angular measurement accuracy, which is highly unfavorable for tooth profile measurement of large-diameter gears [38,39].
To address the aforementioned issues, this study proposes an extended field-of-view measurement method based on binocular collaborative vision. This method innovatively employs a high-precision measurement reference plate as a unified spatial reference for the binocular vision system, aligning the fields of view of both cameras to the coordinate system of the measurement reference plate. In the specific implementation, the binocular camera system is first rigorously calibrated to precisely calculate the physical coordinates of the two camera centers within the measurement reference plate’s coordinate system. The binocular system synchronously captures image data of the measurement reference plate, and image processing algorithms are used to extract the coordinates of the two cameras’ field-of-view centers on the reference plate. The spatial positional relationship between these two points is then fitted. When the measured large gear rotates, the connecting line between the two camera centers rotates accordingly. By tracking the angular changes of this connecting line in real time, the rotation angle of the large gear can be accurately calculated. Leveraging the spatial complementarity of the dual cameras, the measurement range is effectively extended through field-of-view superposition, enabling high-precision measurement of large gear rotation angles.

2. Basic Theory

In the gear coordinate system (i.e., the measurement reference plate coordinate system $O_4(X^{(4)}, Y^{(4)})$), the pre-rotation field-of-view centers of the binocular cameras are $P_1(X_{P1}, Y_{P1})$ and $Q_1(X_{Q1}, Y_{Q1})$, while the post-rotation field-of-view centers become $P_2(X_{P2}, Y_{P2})$ and $Q_2(X_{Q2}, Y_{Q2})$. See Figure 1.
The displacement $L_P$ and rotation angle $\delta_P$ between the first and second output images from Camera 1 are given by the following:

$$L_P = \sqrt{(X_{P2} - X_{P1})^2 + (Y_{P2} - Y_{P1})^2} \tag{1}$$

$$\delta_P = \arctan\frac{Y_{P2} - Y_{P1}}{X_{P2} - X_{P1}} \tag{2}$$

The displacement $L_Q$ and rotation angle $\delta_Q$ between the first and second output images from Camera 2 are given by the following:

$$L_Q = \sqrt{(X_{Q2} - X_{Q1})^2 + (Y_{Q2} - Y_{Q1})^2} \tag{3}$$

$$\delta_Q = \arctan\frac{Y_{Q2} - Y_{Q1}}{X_{Q2} - X_{Q1}} \tag{4}$$

The chord length $L$ can be calculated using the Law of Cosines as follows:

$$L = \sqrt{L_Q^2 + L_P^2 - 2 L_Q L_P \cos(\delta_Q - \delta_P)} \tag{5}$$

Since the baseline distance $L_C$ of the stereo camera has been calibrated, the rotation angle $\theta$ can be calculated as follows:

$$\theta = 2 \arcsin\frac{L}{2 L_C} \tag{6}$$
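To make the computation above concrete, the following is a minimal Python sketch of Equations (1)–(6); the function name and argument layout are illustrative (not from the paper), and `atan2` stands in for the plain arctangent of Equations (2) and (4) to keep the quadrant unambiguous.

```python
import math

def rotation_angle(P1, P2, Q1, Q2, L_C):
    """Gear rotation angle from the pre-/post-rotation field-of-view
    centers of the two cameras, per Equations (1)-(6).

    P1, P2, Q1, Q2: (x, y) camera-center coordinates in the measurement
    reference plate coordinate system (mm); L_C: calibrated baseline
    distance between the two camera centers (mm). Returns degrees.
    """
    # Displacement and direction of each camera center (Eqs. (1)-(4))
    L_P = math.hypot(P2[0] - P1[0], P2[1] - P1[1])
    delta_P = math.atan2(P2[1] - P1[1], P2[0] - P1[0])
    L_Q = math.hypot(Q2[0] - Q1[0], Q2[1] - Q1[1])
    delta_Q = math.atan2(Q2[1] - Q1[1], Q2[0] - Q1[0])

    # Chord length between the displaced centers (Eq. (5), law of cosines)
    L = math.sqrt(L_Q**2 + L_P**2
                  - 2.0 * L_Q * L_P * math.cos(delta_Q - delta_P))

    # Central angle subtended by the chord on the calibrated baseline (Eq. (6))
    return math.degrees(2.0 * math.asin(L / (2.0 * L_C)))
```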
To improve the measurement accuracy of rotation angles, the length of line PQ must not be excessively short. When measuring the rotation angle of line PQ using a vision system, the camera's field of view must encompass not only the two endpoints of PQ but also the entire trajectory from its initial to its final rotational position. Employing such a large field of view compromises the measurement precision of a single camera. Consequently, this paper proposes a binocular camera system to achieve high-precision rotation angle measurements for large gears.

3. System Solution Design

3.1. General System Design

The binocular vision rotation angle measurement system primarily consists of CMOS industrial cameras, dual telecentric lenses, a measurement reference plate, and two 2D dot array calibration plates, as illustrated in Figure 2. The measurement reference plate is fixed on the large gear and rotates with it. Using the 2D dot array calibration plate as a feature marker, the binocular cameras continuously capture images of the measurement reference plate on the rotating large gear. The rotation angle value is obtained by calculating the coordinates of the binocular cameras’ field-of-view center points within the measurement reference plate coordinate system.

3.2. Measurement Reference Plate Design

The measurement reference plate serves as the mounting base for two parallel 2D dot array calibration targets. Each target comprises a 36 × 76 orthogonal grid of calibration circles with 2 mm diameter and 5 mm pitch. The centers of the two calibration plates are separated by 400 mm on the reference plate. Sixteen high-precision alignment holes are distributed across the plate’s structural framework to establish coordinate transformations between the calibration plates and the reference plate. These holes provide metrological traceability by bridging the coordinate systems of individual calibration plates with the unified reference frame, as illustrated in Figure 3.

3.3. Measurement Reference Plate Calibration Method

3.3.1. Coordinate System Establishment

In the rotation angle measurement system, multi-level coordinate system conversion demonstrates clear engineering necessity. From the perspective of rotation angle measurement principles, the measurement essentially requires establishing a precise correspondence between image pixel space and real-world coordinate systems, a process that inherently involves coordinate conversion. While traditional monocular vision solutions typically adopt a three-level conversion architecture, our binocular measurement system faces special engineering constraints: a large-sized 2D dot array calibration board presents practical challenges, including difficult-to-guarantee machining accuracy, significantly increased accumulated errors, and prohibitively high manufacturing costs. To address these issues, this study innovatively employs a dual 2D dot array calibration board splicing scheme, consequently introducing an additional fourth-level coordinate conversion step. This involves conversions among four distinct coordinate systems: the image pixel coordinate system $O_1(X^{(1)}, Y^{(1)})$, the image physical coordinate system $O_2(X^{(2)}, Y^{(2)})$, the 2D dot array calibration board coordinate system $O_3(X^{(3)}, Y^{(3)})$, and the measurement reference plate coordinate system $O_4(X^{(4)}, Y^{(4)})$. The relationships between these four hierarchical coordinate systems are illustrated in Figure 4. This solution effectively controls machining errors and manufacturing costs by decomposing a single large-sized 2D dot array calibration board into two precisely machinable smaller ones. At the technical implementation level, it establishes spatial mapping relationships between the two calibration boards through optimized algorithms, ultimately completing angle calculations in a unified measurement reference plate coordinate system.
The image pixel coordinate system $O_1(X^{(1)}, Y^{(1)})$ is defined with the top-left corner of the image as the origin $O_1$. The X-axis extends horizontally (along the image width), and the Y-axis extends vertically (along the image height), with both axes measured in pixels. The image physical coordinate system $O_2(X^{(2)}, Y^{(2)})$ uses the center of the image as the origin $O_2$. The X-axis and Y-axis align with the horizontal and vertical directions of the image, respectively, but their coordinates are measured in millimeters (mm). The 2D dot array calibration plate coordinate system $O_3(X^{(3)}, Y^{(3)})$ takes the center of the top-left calibration circle on the calibration board as its origin $O_3$. The X-axis follows the long-edge direction of the board, while the Y-axis aligns with its short-edge direction. The measurement reference plate coordinate system $O_4(X^{(4)}, Y^{(4)})$ has its origin $O_4$ at point $A$, the fitted center derived from the four positioning holes $A_{11}$, $A_{12}$, $A_{21}$, $A_{22}$ at the plate's top-left corner. The X-axis is defined by the line connecting point $A$ to point $B$ (the fitted center of the four positioning holes $B_{11}$, $B_{12}$, $B_{21}$, $B_{22}$ at the top-right corner). The Y-axis is defined by the line connecting point $A$ to point $C$ (the fitted center of the four positioning holes $C_{11}$, $C_{12}$, $C_{21}$, $C_{22}$ at the bottom-left corner).

3.3.2. Calibration Method for Reference Plate

Let a point $A$ have coordinates $(X^{(i)}, Y^{(i)})$ in the pre-transformation coordinate system $O_i(X^{(i)}, Y^{(i)})$ and coordinates $(X^{(i+1)}, Y^{(i+1)})$ in the post-transformation coordinate system $O_{i+1}(X^{(i+1)}, Y^{(i+1)})$. Their relationship is governed by Equation (7) [40]:

$$\begin{bmatrix} X^{(i+1)} \\ Y^{(i+1)} \end{bmatrix} = \begin{bmatrix} \Delta X \\ \Delta Y \end{bmatrix} + m \begin{bmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{bmatrix} \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix} \tag{7}$$

In Equation (7), $m$ is the scale factor, $\alpha$ is the rotation parameter of the coordinate system (i.e., the angle between the $X^{(i)}$-axis and the $X^{(i+1)}$-axis, with counterclockwise defined as positive), and $\Delta X$, $\Delta Y$ are the translation parameters. To solve for the four parameters ($\Delta X$, $\Delta Y$, $m$, $\alpha$), the coordinates of two distinct feature points must be known in both the $O_i(X^{(i)}, Y^{(i)})$ and $O_{i+1}(X^{(i+1)}, Y^{(i+1)})$ coordinate systems.
Based on Equation (7), the conditional equations are formulated as follows:

$$\begin{bmatrix} X^{(i+1)} \\ Y^{(i+1)} \end{bmatrix} - \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix} = \begin{bmatrix} \Delta X \\ \Delta Y \end{bmatrix} + \begin{bmatrix} m\cos\alpha - 1 & -m\sin\alpha \\ m\sin\alpha & m\cos\alpha - 1 \end{bmatrix} \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix} \tag{8}$$

Let $a = m\cos\alpha - 1$ and $b = m\sin\alpha$. Then,

$$\begin{bmatrix} X^{(i+1)} \\ Y^{(i+1)} \end{bmatrix} - \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix} = \begin{bmatrix} \Delta X \\ \Delta Y \end{bmatrix} + \begin{bmatrix} X^{(i)} & -Y^{(i)} \\ Y^{(i)} & X^{(i)} \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} \tag{9}$$

The equivalent expression is as follows:

$$\begin{bmatrix} X^{(i+1)} \\ Y^{(i+1)} \end{bmatrix} - \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix} = \begin{bmatrix} 1 & 0 & X^{(i)} & -Y^{(i)} \\ 0 & 1 & Y^{(i)} & X^{(i)} \end{bmatrix} \begin{bmatrix} \Delta X \\ \Delta Y \\ a \\ b \end{bmatrix} \tag{10}$$

where $L = \begin{bmatrix} X^{(i+1)} \\ Y^{(i+1)} \end{bmatrix} - \begin{bmatrix} X^{(i)} \\ Y^{(i)} \end{bmatrix}$, $B = \begin{bmatrix} 1 & 0 & X^{(i)} & -Y^{(i)} \\ 0 & 1 & Y^{(i)} & X^{(i)} \end{bmatrix}$, $X = \begin{bmatrix} \Delta X & \Delta Y & a & b \end{bmatrix}^T$, and $P$ is the 4 × 4 identity weight matrix.

According to the least squares principle, the parameter vector $X$ is calculated as follows:

$$X = (B^T P B)^{-1} B^T P L \tag{11}$$

After obtaining the solution for $X$, the scale factor $m$ and rotation angle $\alpha$ can be derived from parameters $a$ and $b$ using the following relationships:

$$\alpha = \arctan\frac{b}{a + 1} \tag{12}$$

$$m = \sqrt{(a + 1)^2 + b^2} \tag{13}$$
Through the above calculation process, the transformations from the image pixel coordinate system $O_1(X^{(1)}, Y^{(1)})$ to the image physical coordinate system $O_2(X^{(2)}, Y^{(2)})$, from the image physical coordinate system to the 2D dot array calibration plate coordinate system $O_3(X^{(3)}, Y^{(3)})$, and from the calibration plate coordinate system to the measurement reference plate coordinate system $O_4(X^{(4)}, Y^{(4)})$ are completed. Finally, the four-parameter values from the 2D dot array calibration plate coordinate system to the measurement reference plate coordinate system are obtained, completing the calibration of the measurement reference plate.
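As an illustration of Equations (8)–(13), the least-squares four-parameter solution can be sketched in Python as follows; `solve_four_parameters` is a hypothetical helper name, and NumPy's `lstsq` replaces the explicit normal-equation form of Equation (11) under the identity weight matrix $P$.

```python
import numpy as np

def solve_four_parameters(src_pts, dst_pts):
    """Solve dst = [dX, dY] + m * R(alpha) @ src in the least-squares
    sense (Eqs. (8)-(13)). src_pts/dst_pts: (n, 2) matched coordinates
    in the pre-/post-transformation systems, n >= 2."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)

    rows_B, rows_L = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        # One pair of rows of B and L per point (Eq. (10))
        rows_B.append([1.0, 0.0, x, -y])
        rows_B.append([0.0, 1.0, y, x])
        rows_L.append(xp - x)
        rows_L.append(yp - y)

    # X = (B^T P B)^{-1} B^T P L with P = I (Eq. (11))
    X, *_ = np.linalg.lstsq(np.array(rows_B), np.array(rows_L), rcond=None)
    dX, dY, a, b = X

    alpha = np.arctan2(b, a + 1.0)   # Eq. (12), quadrant-safe arctangent
    m = np.hypot(a + 1.0, b)         # Eq. (13)
    return dX, dY, m, alpha
```

With exactly two matched point pairs, the system of Equation (10) is exactly determined; any additional pairs are reconciled in the least-squares sense of Equation (11).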

4. Experiment and Verification

4.1. Experimental Setup

The experiment utilized a large gear with a module of 20 mm and 30 teeth as the test object, applying the proposed method for rotation angle measurement. The vision measurement unit comprised a CMOS monochrome camera (Photron, MV-E EM HS, Tokyo, Japan) with a resolution of 2248 × 2048 pixels; a bi-telecentric long-focus lens (Opto Engineering, TC12 48, Mantova, Italy) with a focal length of 131.4 mm; an LED collimated white light source (Jenoptik, JL-BRPX-380 × 180W-C, Jena, Germany); and a digital light intensity controller (Jenoptik, JL-APS2-14424-2, Jena, Germany), which together constituted the angle measurement system. To validate the accuracy of the setup, the rotary stage (Newport, GTN-250R, Irvine, CA, USA) integrated with the gear was selected as the reference standard. This stage employs standard pulse signals for angle control and uses an internal rotary optical encoder grating as the traceable angular measurement standard, enabling real-time acquisition of the rotated angle values. The large gear was rigidly coupled with the high-precision rotary stage, and the experimental configuration is illustrated in Figure 5.

4.2. Experimental Environment Control

In terms of temperature control, the CMOS camera and the 2D dot-matrix calibration plate are sensitive to temperature variations; fluctuations may cause sensor thermal noise and thermal expansion of the lens and plate materials. The experimental environment temperature was therefore strictly maintained within 20 ± 2 °C. For vibration conditions, the setup strictly adheres to the ISO 20816-1 standard [41], ensuring that environmental vibration levels remain below 1.0 mm/s RMS. All data acquisition is triggered only when the vibration monitoring system confirms that the readings are within the safe threshold. Regarding lighting conditions, since the system is equipped with a digital light intensity controller, it can automatically adjust brightness based on the site illumination, keeping the camera's field of view clear at all times; the actual lighting conditions therefore impose minimal constraints. These environmental control measures effectively ensure the reliability and repeatability of the experimental results.

4.3. Pixel Equivalent Calibration Experiment

In visual measurement, to obtain the dimensional parameters of an object, it is necessary to establish the correspondence between pixel dimensions and actual physical dimensions by calibrating the actual size represented by each pixel, which is referred to as the pixel equivalent [42,43]. The pixel equivalent is one of the most important parameters in a visual measurement system and has a significant influence on measurement accuracy, thus requiring precise calibration [44]. To improve measurement accuracy while considering the camera's field of view and resolution, this study selected a 5 × 5 array of 25 calibration circles in the central region of the 2D dot array calibration board as the pixel equivalent calibration area, as illustrated in Figure 6. For the pixel equivalent calibration, this work adopted a method based on the center-to-center distance of calibration circles rather than on their edge transition regions, mainly for the following reasons. The edge transition regions of calibration circles exhibit obvious grayscale gradient variations, making them susceptible to image noise and highly sensitive to the choice of edge detection algorithm. In comparison, the center-to-center distance method demonstrates significant advantages: the center coordinates are obtained through least-squares fitting of the entire circumference, which effectively suppresses the influence of local grayscale fluctuations at the edges. Experimental results showed that even when the edges of calibration circles exhibited blurring of up to ±5 pixels, the variation in center-to-center distance could be controlled within 0.1 μm, thereby ensuring calibration precision.
The subpixel edges of the calibration circles were acquired through edge extraction algorithms, followed by least-squares fitting to determine the pixel coordinates of the circle centers. The average pixel distance between centers was derived from the pixel distances between the central circle center and its 24 surrounding centers. Given the known physical center-to-center pitch of 5 mm, the pixel equivalent of the measurement system was determined as the ratio of physical to pixel distance. The calibrated pixel equivalents of the dual-camera system are summarized in Table 1.
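The pixel equivalent computation described above can be sketched as follows; this is one plausible reading of the 24-neighbor averaging (the helper name and the per-pair ratio averaging are assumptions, since the text only states that physical and pixel center distances are ratioed).

```python
import numpy as np

def pixel_equivalent(centers_px, pitch_mm=5.0):
    """Pixel equivalent (mm per pixel) from the fitted centers of the
    5 x 5 calibration-circle block, ordered row by row as a (5, 5, 2)
    array of pixel coordinates."""
    c = np.asarray(centers_px, dtype=float)
    center = c[2, 2]                                   # central circle
    others = np.delete(c.reshape(-1, 2), 12, axis=0)   # its 24 neighbors

    # Known physical distances from the central circle to each neighbor
    ii, jj = np.meshgrid(np.arange(5), np.arange(5), indexing="ij")
    phys = pitch_mm * np.hypot(ii - 2, jj - 2).ravel()
    phys = np.delete(phys, 12)                         # drop the zero self-distance

    px = np.linalg.norm(others - center, axis=1)       # measured pixel distances
    return float(np.mean(phys / px))                   # mm per pixel
```

With the Table 1 values (a 5 mm pitch against a mean spacing of about 266 pixels), this ratio comes out near 0.0188 mm, i.e., the ~18.8 μm pixel equivalents reported for both cameras.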

4.4. Calibration Experiment of Measurement Reference Plate

4.4.1. Transformation from Image Pixel Coordinates $O_1(X^{(1)}, Y^{(1)})$ to Image Physical Coordinates $O_2(X^{(2)}, Y^{(2)})$

Processing image (a) in Figure 6 yielded the coordinates of the centers of the four positioning holes ($A_{11}$, $A_{12}$, $A_{21}$, $A_{22}$) and of the calibration circle centers ($a_{11}$, $a_{22}$) approximately concentric with $A_{11}$ and $A_{22}$ in the image pixel coordinate system. Based on the pixel equivalent derived from the binocular camera calibration, the coordinates of $A_{11}$, $A_{12}$, $A_{21}$, $A_{22}$, $a_{11}$, and $a_{22}$ in the image physical coordinate system were calculated. The coordinates of point $A$ in the image physical coordinate system were then obtained with the mean-value formula. Similarly, processing images (b), (c), and (d) in Figure 6 yielded the results summarized in Table 2.
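As a quick check, the mean-value step can be reproduced directly from the published values; the snippet below simply averages the four hole centers of region A from Table 2 (a sketch using only those coordinates).

```python
import numpy as np

# Image physical coordinates (mm) of holes A11, A12, A21, A22 from Table 2
holes_A = np.array([[-7.7499,  4.5968],
                    [-7.7585, -5.3396],
                    [ 2.2464, -5.3572],
                    [ 2.2550,  4.6411]])

A = holes_A.mean(axis=0)   # mean-value formula for the fitted center
print(A)                   # ≈ (-2.7518, -0.3647), matching point A in Table 2
```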

4.4.2. Transformation from Image Physical Coordinates $O_2(X^{(2)}, Y^{(2)})$ to 2D Dot Array Calibration Plate Coordinates $O_3(X^{(3)}, Y^{(3)})$

To determine the four transformation parameters $\Delta X$, $\Delta Y$, $m$, and $\alpha$ for the conversion between $O_2(X^{(2)}, Y^{(2)})$ and $O_3(X^{(3)}, Y^{(3)})$, two points were selected within each of the regions containing points $A$, $B$, $C$, and $D$, and their coordinates in both the image physical coordinate system and the 2D dot array calibration plate coordinate system were computed. To minimize systematic errors during the transformation and enhance calibration accuracy, diagonal point pairs were selected for the coordinate conversion, as listed in Table 3. The calculated values of the four transformation parameters ($\Delta X$, $\Delta Y$, $m$, $\alpha$) are summarized in Table 4.
Using the coordinates of point $A$ in the image physical coordinate system, $A(X_A^{(2)}, Y_A^{(2)})$, and the four transformation parameters ($\Delta X$, $\Delta Y$, $m$, $\alpha$), the coordinates of $A$ in the 2D dot array calibration plate coordinate system, $A(X_A^{(3)}, Y_A^{(3)})$, can be derived. This process was repeated for images (b), (c), and (d) in Figure 6 to calculate the respective four parameters and subsequently determine the coordinates of points $B$, $C$, and $D$ in the 2D dot array calibration plate coordinate system, namely $B(X_B^{(3)}, Y_B^{(3)})$, $C(X_C^{(3)}, Y_C^{(3)})$, and $D(X_D^{(3)}, Y_D^{(3)})$. The calculated results are summarized in Table 5.

4.4.3. Transformation from 2D Dot Array Calibration Plate Coordinates $O_3(X^{(3)}, Y^{(3)})$ to Measurement Reference Plate Coordinates $O_4(X^{(4)}, Y^{(4)})$

The coordinates of the centers of the 16 high-precision positioning holes ($A_{11}$–$A_{22}$, $B_{11}$–$B_{22}$, $C_{11}$–$C_{22}$, and $D_{11}$–$D_{22}$) on the measurement reference plate were measured in the $O_4(X^{(4)}, Y^{(4)})$ coordinate system using a coordinate measuring machine (CMM). The mean-value formula was applied to calculate the coordinates of points $A$, $B$, $C$, and $D$ in the measurement reference plate coordinate system. The measurement results are listed in Table 6.
Using the coordinates of the point pairs $A$, $B$ and $C$, $D$ in both the 2D dot array calibration plate coordinate system and the measurement reference plate coordinate system, the four transformation parameters ($\Delta X$, $\Delta Y$, $m$, $\alpha$) for converting between the two coordinate systems were calculated. The results are summarized in Table 7.

4.5. Rotation Angle Measurement Experiment

4.5.1. Rotation Angle Measurement Procedure

(1) The measurement reference plate was secured to the large gear using three magnetic mounts, and the plate was leveled via a three-point leveling device.
(2) Prior to initiating the measurement program, the initial coordinates of the calibration circle center closest to the field-of-view center and of its right adjacent circle center were recorded in the 2D dot array calibration plate coordinate system: $a_0(X_{a0}^{(3)}, Y_{a0}^{(3)})$ and $b_0(X_{b0}^{(3)}, Y_{b0}^{(3)})$.
(3) The rotary table was activated to rotate at a constant angular velocity via the measurement program while simultaneously triggering the binocular cameras for continuous image acquisition.
(4) The stereo camera system synchronously captured sequential images of the calibration target at fixed temporal intervals, generating dual image sequences $P_i$ ($i = 1, 2, 3, \ldots, n$) and $Q_i$ ($i = 1, 2, 3, \ldots, n$) from the respective optical channels.

4.5.2. Image Processing

The image processing methods for both cameras are identical. This section details the data processing procedure for Camera 1.
(1) From image $i$, extract the coordinates of the calibration circle center closest to the field-of-view center and of its right adjacent circle center in the image pixel coordinate system: $a_i(X_{ai}^{(1)}, Y_{ai}^{(1)})$ and $b_i(X_{bi}^{(1)}, Y_{bi}^{(1)})$. Additionally, derive the camera center coordinate $P_i(X_{Pi}^{(1)}, Y_{Pi}^{(1)})$ in the same coordinate system. Point $a_i$ was chosen as the reference point because the area near the center of the field of view is least affected by camera distortion; $b_i$ was selected primarily for programming convenience and ease of identification. The positional relationship of these three points is illustrated in Figure 7.
(2) Based on the pixel equivalent of Camera 1, compute the coordinates of the aforementioned three points in the image physical coordinate system: $a_i(X_{ai}^{(2)}, Y_{ai}^{(2)})$, $b_i(X_{bi}^{(2)}, Y_{bi}^{(2)})$, and $P_i(X_{Pi}^{(2)}, Y_{Pi}^{(2)})$.
(3) Given the initial coordinates $a_0(X_{a0}^{(3)}, Y_{a0}^{(3)})$ and $b_0(X_{b0}^{(3)}, Y_{b0}^{(3)})$ of the calibration circle centers closest to the field-of-view center and its immediate right neighbor in the 2D dot array calibration plate coordinate system, iteratively compute the corresponding coordinates $a_i(X_{ai}^{(3)}, Y_{ai}^{(3)})$ and $b_i(X_{bi}^{(3)}, Y_{bi}^{(3)})$ for each image frame using an 8-neighborhood coordinate extraction algorithm.
(4) Compute the four-parameter values $\Delta X_i^{(32)}$, $\Delta Y_i^{(32)}$, $\alpha_i^{(32)}$, and $m_i^{(32)}$ using the coordinates of points $a_i$ and $b_i$ derived from Steps (2) and (3) in both the image physical coordinate system and the 2D dot array calibration plate coordinate system. During rotational angle measurement, as the large gear rotates, the image physical coordinate system rotates relative to the 2D dot array calibration plate coordinate system; consequently, the four transformation parameters for each image $i$ vary with the rotation angle.
(5) Using the four parameters ($\Delta X_i^{(32)}$, $\Delta Y_i^{(32)}$, $\alpha_i^{(32)}$, $m_i^{(32)}$) and $P_i(X_{Pi}^{(2)}, Y_{Pi}^{(2)})$, compute the camera center coordinate $P_i(X_{Pi}^{(3)}, Y_{Pi}^{(3)})$ in the 2D dot array calibration plate coordinate system.
(6) Transform $P_i(X_{Pi}^{(3)}, Y_{Pi}^{(3)})$ to the measurement reference plate coordinate system via the pre-calibrated four-parameter set ($\Delta X$, $\Delta Y$, $m$, $\alpha$), resulting in $P_i(X_{Pi}^{(4)}, Y_{Pi}^{(4)})$. The coordinate distribution of the Camera 1 center in this system is shown in Figure 8a.
Similarly, by repeating the above process for Camera 2 through continuous image acquisition, the coordinates $Q_i(X_{Qi}^{(4)}, Y_{Qi}^{(4)})$ were derived and are visualized in Figure 8b.
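A compact sketch of how Steps (2), (5), and (6) chain together for a single frame is given below; the helper names are hypothetical, and the camera-center pixel coordinates are assumed to already be expressed relative to the image physical origin (the image center) before scaling.

```python
import math

def apply_four_params(pt, dX, dY, m, alpha):
    """Apply the 2D four-parameter transform of Equation (7) to a point."""
    x, y = pt
    c, s = math.cos(alpha), math.sin(alpha)
    return (dX + m * (c * x - s * y),
            dY + m * (s * x + c * y))

def center_on_reference_plate(P_px, pixel_equiv_mm, params_32_i, params_43):
    """Map one camera field-of-view center from pixel coordinates to the
    measurement reference plate coordinate system.

    P_px: center in pixels, relative to the image physical origin;
    pixel_equiv_mm: calibrated pixel equivalent (mm/pixel);
    params_32_i: per-frame (dX, dY, m, alpha), image physical -> calibration plate;
    params_43: pre-calibrated (dX, dY, m, alpha), calibration plate -> reference plate.
    """
    P2 = (P_px[0] * pixel_equiv_mm, P_px[1] * pixel_equiv_mm)  # Step (2)
    P3 = apply_four_params(P2, *params_32_i)                   # Step (5)
    P4 = apply_four_params(P3, *params_43)                     # Step (6)
    return P4
```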

4.5.3. Analysis of Rotation Angle Measurement Results

Based on the coordinates of points $P_i(X_{Pi}^{(4)}, Y_{Pi}^{(4)})$ and $Q_i(X_{Qi}^{(4)}, Y_{Qi}^{(4)})$ obtained through the image processing and measurement procedures, combined with the rotation angle measurement principle for large gears, the rotation angle can be calculated. Using the rotation angle values measured by the circular grating as the theoretical reference, the results obtained by this measurement method were compared and analyzed. The measured large gear was divided uniformly into six equal parts, and rotation angle measurements were conducted at each predetermined measurement position, as shown in Figure 9. The system collected rotation angle data from each measurement point and plotted the rotation angle error distribution curve, as illustrated in Figure 10. Analysis of the experimental data revealed that the maximum error between the actual and theoretical rotation angles was 0.001895°, which is within 7 arcseconds.
Under identical experimental conditions, the measurement system conducted 15 repeated measurements for six angular positions: 5°, 10°, 15°, 20°, 25°, and 30°. The repeatability deviation of any individual measurement was calculated as the difference between that specific measurement result and the arithmetic mean of the 15 measurement outcomes. As statistically demonstrated in Figure 11, the maximum observed repeatability deviation of the system reached 0.000699°, which corresponds to a repeatability precision within 3 arcseconds (3”).
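As a quick arithmetic check of these two figures, converting the reported degree-valued maxima to arcseconds (1° = 3600″) confirms the stated bounds:

```python
max_error_deg = 0.001895    # accuracy verification, Figure 10
max_repeat_deg = 0.000699   # repeatability verification, Figure 11

print(round(max_error_deg * 3600, 2))    # 6.82  -> within 7 arcseconds
print(round(max_repeat_deg * 3600, 2))   # 2.52  -> within 3 arcseconds
```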

4.5.4. Angle Measurement Application

(1) Analysis of System Application Scenarios.
In the field of large gear manufacturing, high-precision measurement and online machining inspection are critical to ensuring gear performance. The dynamic angular measurement system for large gears, based on an extended visual field, achieves full coverage of gear end faces through dual-camera collaborative vision expansion technology. It can capture angular deviation of spur gears in real time, making it particularly suitable for low-speed, heavy-duty applications such as wind turbine gearboxes and marine propulsion systems. During online inspection, the system analyzes parameters like pitch error and tooth profile error using gear meshing measurement methods, providing direct feedback to machining tools for closed-loop compensation. This effectively addresses the latency issues associated with traditional offline inspection. The technology significantly improves consistency in batch gear production and enhances fault prediction capabilities, delivering core data support for intelligent manufacturing.
(2) Application Case of Rotation Angle Measurement.
In large gear profile measurement based on the meshing method, the measurement accuracy of the system is primarily determined by gear rotation angle error and meshing line error. Based on the experimental data from six angle measurements conducted within the 0° to 30° range and the displacement values measured by a laser rangefinder, the total profile deviation of the measured large gear was calculated to be 0.0401 mm, as shown in Figure 12.
According to the GB/T 10095.1-2008 standard [45], for large gears with reference diameters ranging from 560 mm to 1000 mm and modules between 16 and 25, the allowable total profile deviation is 42 μm. The measured value of 0.0401 mm (40.1 μm) is below this limit; therefore, this system can meet the Grade 7 precision inspection requirements for large gears.

4.5.5. System Optimization Prospects

The current measurement system employs a relatively bulky and fixed structural configuration. System optimization will be implemented through the following two aspects:
(1) Modular design concept.
Adopting a modular design philosophy to construct a reconfigurable vision measurement platform, the dual-camera setup and telecentric lenses are integrated onto an adjustable bracket. A composite bracket structure with multi-degree-of-freedom adjustment mechanisms ensures system stability while achieving spatial compression of the measurement system. Mechanical simulations are conducted to validate structural stability. Additionally, a retractable multi-functional reference plate is designed, with partitioned reuse of the reference plate to minimize space occupation.
(2) Intelligent adjustment.
For intelligent regulation, a machine learning-based adaptive zoom system is developed. Combined with closed-loop control algorithms, it enables a dynamic adjustment of working distance. An online calibration and compensation mechanism is established to maintain measurement accuracy. This optimization solution significantly enhances system compactness and environmental adaptability while preserving original measurement performance, providing technical support for industrial field applications.

5. Conclusions

To address the requirement for high-precision rotational angle measurement of a large gear, this study proposes a high-precision binocular vision-based measurement method. Sub-pixel edge extraction algorithms are employed to capture calibration circle contours, with least-squares fitting applied to determine circle center coordinates. The pixel equivalent ratio of the binocular camera is calibrated by comparing the physical and pixel dimensions of inter-circle distances. A measurement reference plate serves as the unified coordinate system benchmark, and its spatial calibration is achieved via a two-dimensional four-parameter transformation algorithm, ensuring multi-view data alignment in a global coordinate system.
The implemented binocular vision system dynamically captures calibration circle images on the large gear to compute rotation angles in real time. The experimental results demonstrate that within a 0–30° rotation range, the system achieves a maximum absolute error of 0.001895° (within 7 arcseconds) and a repeatability deviation below 0.000699° (within 3 arcseconds). Through non-contact measurement, the system avoids the accuracy degradation in rotational angle measurement of large gears caused by mechanical wear or deformation in traditional contact-based sensors, and it is applicable to demanding industrial scenarios requiring ultra-high precision, such as aerospace module assembly and wind turbine gearbox inspection.

Author Contributions

Conceptualization, P.D. and Z.D.; methodology, P.D.; software, P.D. and E.L.; validation, P.D. and G.J.; formal analysis, P.D.; investigation, J.Z.; resources, W.Z. and Z.D.; data curation, P.D.; writing—original draft preparation, P.D.; writing—review and editing, P.D., J.Z., and G.J.; visualization, E.L.; supervision, J.Z. and Z.D.; project administration, W.Z.; funding acquisition, Z.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Scientific Research Fund Project of the Liaoning Provincial Department of Education (Grant Nos. LJGD2020003 and LJKZ0160).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Kumar, A.S.A.; George, B.; Mukhopadhyay, S.C. Technologies and Applications of Angle Sensors: A Review. IEEE Sens. J. 2021, 21, 7195–7206. [Google Scholar] [CrossRef]
  2. Gao, W.; Kim, S.W.; Bosse, H.; Haitjema, H.; Chen, Y.L.; Lu, X.D.; Knapp, W.; Weckenmann, A.; Estler, W.T.; Kunzmann, H. Measurement technologies for precision positioning. CIRP Ann.–Manuf. Technol. 2015, 64, 773–796. [Google Scholar] [CrossRef]
  3. Wang, S.; Ma, R.; Cao, F.; Luo, L.; Li, X. A Review: High-Precision Angle Measurement Technologies. Sensors 2024, 24, 1755. [Google Scholar] [CrossRef]
  4. Li, M.; Guo, L.; Guo, X.; Yang, Y.; Zou, J. Research on the Construction of a Digital Twin Model for a 3D Measurement System of Gears. IEEE Access 2023, 11, 124556–124569. [Google Scholar] [CrossRef]
  5. Guan, X.; Tang, Y.; Dong, B.; Li, G.; Fu, Y.; Tian, C. An Intelligent Detection System for Surface Shape Error of Shaft Workpieces Based on Multi-Sensor Combination. Appl. Sci. 2023, 13, 12931. [Google Scholar] [CrossRef]
  6. Wen, Q.; Li, P.; Zhang, Z.; Hu, H. Displacement Measurement Method Based on Double-Arrowhead Auxetic Tubular Structure. Sensors 2023, 23, 9544. [Google Scholar] [CrossRef] [PubMed]
  7. Das, S.; Chakraborty, B. Design and Realization of an Optical Rotary Sensor. IEEE Sens. J. 2018, 18, 2675–2681. [Google Scholar] [CrossRef]
  8. Ahmadi Tameh, T.; Sawan, M.; Kashyap, R. Novel Analog Ratio-Metric Optical Rotary Encoder for Avionic Applications. IEEE Sens. J. 2016, 16, 6586–6595. [Google Scholar] [CrossRef]
  9. Liu, B.; Wang, Y.; Liu, J.; Chen, A. Self-calibration Method of Circular Grating Error in Rotary Inertial Navigation. In Proceedings of the 2023 2nd International Symposium on Sensor Technology and Control (ISSTC), Hangzhou, China, 11–13 August 2023; pp. 139–143. [Google Scholar]
  10. Du, Y.; Yuan, F.; Jiang, Z.; Li, K.; Yang, S.; Zhang, Q.; Zhang, Y.; Zhao, H.; Li, Z.; Wang, S. Strategy to Decrease the Angle Measurement Error Introduced by the Use of Circular Grating in Dynamic Torque Calibration. Sensors 2021, 21, 7599. [Google Scholar] [CrossRef]
  11. Wu, H.; Zhang, F.; Liu, T.; Li, J.; Qu, X. Absolute distance measurement with correction of air refractive index by using two-color dispersive interferometry. Opt. Express 2016, 24, 24361–24376. [Google Scholar] [CrossRef]
  12. Liang, X.; Lin, J.; Yang, L.; Wu, T.; Liu, Y.; Zhu, J. Simultaneous Measurement of Absolute Distance and Angle Based on Dispersive Interferometry. IEEE Photonics Technol. Lett. 2020, 32, 449–452. [Google Scholar] [CrossRef]
  13. Straube, G.; Fischer Calderón, J.S.; Ortlepp, I.; Füßl, R.; Manske, E. A Heterodyne Interferometer with Separated Beam Paths for High-Precision Displacement and Angular Measurements. Nanomanuf. Metrol. 2021, 4, 200–207. [Google Scholar] [CrossRef]
  14. Kumar, A.; Chandraprakash, C. Computer Vision-Based On-Site Estimation of Contact Angle From 3-D Reconstruction of Droplets. IEEE Trans. Instrum. Meas. 2023, 72, 2524108. [Google Scholar] [CrossRef]
  15. Hsu, Q.-C.; Ngo, N.-V.; Ni, R.-H. Development of a faster classification system for metal parts using machine vision under different lighting environments. Int. J. Adv. Manuf. Technol. 2019, 100, 3219–3235. [Google Scholar] [CrossRef]
  16. Shankar, R.K.; Indra, J.; Oviya, R.; Heeraj, A.; Ragunathan, R. Machine Vision based quality inspection for automotive parts using edge detection technique. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1055, 012029. [Google Scholar] [CrossRef]
  17. McGuinness, B.; Duke, M.; Au, C.K.; Lim, S.H. Measuring radiata pine seedling morphological features using a machine vision system. Comput. Electron. Agric. 2021, 189, 106355. [Google Scholar] [CrossRef]
  18. Zhou, J.; Yu, J. Chisel edge wear measurement of high-speed steel twist drills based on machine vision. Comput. Ind. 2021, 128, 103436. [Google Scholar] [CrossRef]
  19. Li, Z.; Liao, W.; Zhang, L.; Ren, Y.; Sun, G.; Sang, Y. Feature-Model-Based In-Process Measurement of Machining Precision Using Computer Vision. Appl. Sci. 2024, 14, 6094. [Google Scholar] [CrossRef]
  20. Venkatesan, C.; Ai-Turjman, F.; Pelusi, D. Guest Editorial: Smart Measurement in Machine Vision for Challenging Applications. IEEE Instrum. Meas. Mag. 2023, 26, 3. [Google Scholar] [CrossRef]
  21. Wu, J.; Jiang, H.; Wang, H.; Wu, Q.; Qin, X.; Dong, K. Vision-based multi-view reconstruction for high-precision part positioning in industrial robot machining. Measurement 2025, 242, 116042. [Google Scholar] [CrossRef]
  22. Huang, H.L.; Jywe, W.Y.; Chen, G.R. Application of 3D vision and networking technology for measuring the positional accuracy of industrial robots. Int. J. Adv. Manuf. Technol. 2024, 135, 2051–2064. [Google Scholar] [CrossRef]
  23. Shan, D.; Zhu, Z.; Wang, X.; Zhang, P. Pose measurement method based on machine vision and novel directional target. Appl. Sci. 2024, 14, 1698. [Google Scholar] [CrossRef]
  24. Zhang, F.; Wang, L.; Zhang, J. A visual-based angle measurement method using the rotating spot images: Mathematic modeling and experiment validation. IEEE Sens. J. 2021, 21, 16576–16583. [Google Scholar] [CrossRef]
  25. Li, W.; Jin, J.; Li, X.; Li, B. Method of rotation angle measurement in machine vision based on calibration pattern with spot array. Appl. Opt. 2010, 9, 1001–1006. [Google Scholar] [CrossRef] [PubMed]
  26. Xu, J.; Zheng, S.; Sun, K.; Song, P. Research and Application of Contactless Measurement of Transformer Winding Tilt Angle Based on Machine Vision. Sensors 2023, 23, 4755. [Google Scholar] [CrossRef]
  27. Li, J.; Li, J.; Wang, X.; Tian, G.; Fan, J. Machine Vision-Based Method for Measuring and Controlling the Angle of Conductive Slip Ring Brushes. Micromachines 2022, 13, 447. [Google Scholar] [CrossRef]
  28. Zheng, T.; Gong, B.; Li, X.; Zhao, X.; Zhang, Y. A Novel Method for Measuring the Attitude Angles of the Artillery Barrel Based on Monocular Vision. IEEE Access 2024, 12, 155125–155135. [Google Scholar] [CrossRef]
  29. Ahn, J.; Kim, S. Development of online braiding angle measurement system. Text. Res. J. 2024, 95, 1526–1536. [Google Scholar] [CrossRef]
  30. Yu, Z.; Bu, T.; Zhang, Y.; Jia, S.; Huang, T.; Liu, J.K. Robust Decoding of Rich Dynamical Visual Scenes With Retinal Spikes. IEEE Trans. Neural Netw. Learn. Syst. 2025, 36, 3396–3409. [Google Scholar] [CrossRef]
  31. Liu, Z.; Guan, B.; Shang, Y.; Bian, Y.; Sun, P.; Yu, Q. Stereo Event-Based, 6-DOF Pose Tracking for Uncooperative Spacecraft. IEEE Trans. Geosci. Remote Sens. 2025, 63, 5607513. [Google Scholar] [CrossRef]
  32. Jin, J.; Zhao, L.; Xu, S. High-precision rotation angle measurement method based on monocular vision. J. Opt. Soc. Am. A 2014, 31, 1401–1407. [Google Scholar] [CrossRef] [PubMed]
  33. Cui, J.-S.; Huo, J.; Yang, M.; Wang, Y.-K. Research on the rigid body posture measurement using monocular vision by coplanar feature points. Optik 2015, 126, 5423–5429. [Google Scholar] [CrossRef]
  34. Eves, B.J.; Leroux, I.D.; Cen, A.J. Phase shifting angle interferometer. Metrologia 2023, 60, 055006. [Google Scholar] [CrossRef]
  35. Chen, S.-H.; Chen, H.-Y.; Lai, Y.-X.; Hsieh, H.-L. Development of a double-diffraction grating interferometer for measurements of displacement and angle. In Proceedings of the Conference on Optics and Photonics for Advanced Dimensional Metrology, Online, 6–10 April 2020. [Google Scholar]
  36. Xu, X.; Dai, Z.; Tan, Y. A Dual-Beam Differential Method Based on Feedback Interferometry for Noncontact Measurement of Linear and Angular Displacement. IEEE Trans. Ind. Electron. 2023, 70, 6405–6413. [Google Scholar] [CrossRef]
  37. Černe, B.; Petkovšek, M. High-speed camera-based optical measurement methods for in-mesh tooth deflection analysis of thermoplastic spur gears. Mater. Des. 2022, 223, 111184. [Google Scholar] [CrossRef]
  38. Yin, P.; Han, F.; Wang, J.; Lu, C. Influence of module on measurement uncertainty of gear tooth profile deviation on gear measuring center. Measurement 2021, 182, 109688. [Google Scholar] [CrossRef]
  39. Shimizu, Y.; Matsukuma, H.; Gao, W. Optical sensors for multi-axis angle and displacement measurement using grating reflectors. Sensors 2019, 19, 5289. [Google Scholar] [CrossRef]
  40. Shults, R.; Urazaliev, A.; Annenkov, A.; Nesterenko, O.; Kucherenko, O.; Kim, K. Different approaches to coordinate transformation parameters determination of nonhomogeneous coordinate systems. In Proceedings of the 11th International Conference “Environmental Engineering”, Vilnius, Lithuania, 21–22 May 2020. [Google Scholar] [CrossRef]
  41. ISO 20816-1:2016; Mechanical Vibration—Measurement and Evaluation of Machine Vibration. ISO: Geneva, Switzerland, 2016.
  42. Zhang, Z.; Wang, X.; Zhao, H.; Ren, T.; Xu, Z.; Luo, Y. The Machine Vision Measurement Module of the Modularized Flexible Precision Assembly Station for Assembly of Micro- and Meso-Sized Parts. Micromachines 2020, 11, 918. [Google Scholar] [CrossRef]
  43. Yin, Z.; Zhao, L.; Li, T.; Zhang, T.; Liu, H.; Cheng, J.; Chen, M. Automated alignment technology for laser repair of surface micro-damages on large-aperture optics based on machine vision and rangefinder ranging. Measurement 2025, 244, 116511. [Google Scholar] [CrossRef]
  44. Wang, X.; Li, F.; Du, Q.; Zhang, Y.; Wang, T.; Fu, G.; Lu, C. Micro-amplitude vibration measurement using vision-based magnification and tracking. Measurement 2023, 208, 112464. [Google Scholar] [CrossRef]
  45. GB/T 10095.1-2008; Cylindrical Gears—System of Accuracy—Part 1: Definitions and Allowable Values of Deviations Relevant to Corresponding Flanks of Gear Teeth. Standardization Administration of the People’s Republic of China: Beijing, China, 2008.
Figure 1. Schematic of rotation angle measurement.
Figure 2. Rotating angle measurement system for large gear.
Figure 3. Design of reference plate.
Figure 4. The relationship between coordinate systems.
Figure 5. Experimental device.
Figure 6. Calibration of reference plate.
Figure 7. Coordinate position relationship of $a_i$, $b_i$, and $P_i$.
Figure 8. The coordinate trajectory of the camera center in the reference plate coordinate system. (a) The coordinate curve of Camera 1's center point in the reference plate coordinate system. (b) The coordinate curve of Camera 2's center point in the reference plate coordinate system.
Figure 9. Rotation angle measurement positions (30° counterclockwise rotations starting from Gear 1, Gear 6, Gear 11, Gear 16, Gear 21, and Gear 26, respectively).
Figure 10. Verification results of measurement system accuracy.
Figure 11. Verification results of repeatability measurement accuracy.
Figure 12. Measured value of total profile deviation for large gear.
Table 1. Calibration results of pixel equivalent.

Camera | Center Distance of Physical Dimensions/mm | Center Distance of Pixel Dimensions/Pixel | Pixel Equivalent/μm
1 | 5 | 266.0175 | 18.7958
2 | 5 | 266.4734 | 18.7636
Table 2. Coordinate values of the selected points in the image physical coordinate system.

Points | Image Physical Coordinate Values | Points | Image Physical Coordinate Values
A11 | (−7.7499, 4.5968) | B11 | (−7.7462, 3.5232)
A12 | (−7.7585, −5.3396) | B12 | (−7.7394, −6.4343)
A21 | (2.2464, −5.3572) | B21 | (2.1792, −6.4728)
A22 | (2.2550, 4.6411) | B22 | (2.2253, 3.4502)
A | (−2.7518, −0.3647) | B | (−2.7703, −1.4834)
C11 | (−6.8633, 4.1014) | D11 | (−6.9554, 4.0799)
C12 | (−6.9426, −5.8532) | D12 | (−6.9522, −5.9734)
C21 | (3.1689, −5.8435) | D21 | (3.1409, −5.8716)
C22 | (3.0945, 4.1221) | D22 | (3.0282, 4.0708)
C | (−1.8856, −0.8683) | D | (−1.9346, −0.9236)
Table 3. The coordinates of the selected points in the image physical coordinate system and the calibration board coordinate system.

Points | Image Physical Coordinate Values | Calibration Board Coordinate Values
a11 | (−7.8233, 4.6096) | (−5, 170)
a22 | (2.1557, −5.4125) | (−15, 160)
b11 | (−7.7393, 3.6197) | (−360, 170)
b22 | (2.2370, −6.4059) | (−370, 160)
c11 | (−6.9289, 4.1157) | (5, 15)
c22 | (3.0512, −5.9063) | (15, 5)
d11 | (−6.9941, 4.0914) | (360, 15)
d22 | (3.0367, −5.8802) | (370, 5)
Table 4. Four-parameter values from the image physical coordinate system to the calibration board coordinate system.

Points | ΔX | ΔY | m | α
A | 12.8328 | 165.4075 | −0.9999 | 3.1394
B | 367.7474 | 166.3997 | −0.9999 | 3.1391
C | 11.9367 | 10.8993 | −0.9999 | 3.1395
D | 366.9811 | 10.8884 | −0.9999 | 3.1386
Table 5. The coordinates of A, B, C, and D in the calibration board coordinate system.

Points | Calibration Board Coordinate Values
A | (10.08196, 165.0369)
B | (364.9811, 164.9096)
C | (10.05311, 10.02708)
D | (365.0440, 9.970665)
Table 6. The coordinates of selected points in the reference plate coordinate system.

Points | Reference Plate Coordinate Values | Points | Reference Plate Coordinate Values
A11 | (−5.4162, 4.7910) | B11 | (349.5118, 5.0201)
A12 | (−5.3551, −5.1800) | B12 | (349.5381, −4.9659)
A21 | (4.5691, −5.1535) | B21 | (359.5017, −5.0561)
A22 | (4.5462, 4.7649) | B22 | (359.5675, 5.0564)
A | (−0.4140, −0.1944) | B | (354.5298, 0.0136)
C11 | (−5.2281, 249.7988) | D11 | (349.7845, 249.9674)
C12 | (−5.2981, 239.7837) | D12 | (349.7289, 240.0091)
C21 | (4.6984, 239.7735) | D21 | (359.7011, 239.9157)
C22 | (4.7079, 249.7902) | D22 | (359.7338, 250.0244)
C | (−0.2800, 244.7865) | D | (354.7371, 244.9792)
Table 7. Four-parameter values from the calibration board coordinate system to the reference board coordinate system.

Calibration Board | ΔX | ΔY | m | α
1# | −10.3268 | −165.2615 | −1.0003 | 3.1406
2# | −10.3247 | 234.7516 | −1.0001 | 3.1409
