Article

A Three-Dimensional Reconstruction Method Based on Telecentric Epipolar Constraints

1 State Key Laboratory of Radio Frequency Heterogeneous Integration, Shenzhen University, Shenzhen 518060, China
2 Institute of Intelligent Optical Measurement and Detection, Shenzhen University, Shenzhen 518060, China
3 College of Physics and Optoelectronic Engineering, Shenzhen University, Shenzhen 518060, China
* Author to whom correspondence should be addressed.
Photonics 2024, 11(9), 804; https://doi.org/10.3390/photonics11090804
Submission received: 26 June 2024 / Revised: 21 August 2024 / Accepted: 23 August 2024 / Published: 28 August 2024

Abstract

When calibrating a microscopic fringe projection profilometry system with a telecentric camera, the orthographic projection of the camera causes an ambiguity in the positive and negative signs of its extrinsic parameters. A common solution is to introduce additional constraints, which often increases the complexity and the cost of calibration. Another solution is to abandon the intrinsic/extrinsic parameter models derived from the physical imaging process and obtain a numerically optimal projection matrix through a least squares solution. This paper proposes a novel calibration method that derives a telecentric epipolar constraint model from the conventional epipolar constraint relationship and uses this constraint to complete the stereo calibration of the system. On the one hand, since only the camera's intrinsic parameters are needed, there is no need to introduce additional constraints. On the other hand, the solution is optimized with full consideration of the imaging model, so that the parameters conform to the physical model. Our experiments prove the feasibility and accuracy of the method.

1. Introduction

With the rapid upgrading of the microelectronics industry, an increasing number of devices have reached the micrometer scale, demanding a higher level of precision for surface mount technology (SMT). Traditional contact detection methods, such as micro-CMMs (coordinate measurement machines) [1,2], can no longer meet the requirements for fast, high-precision, and non-destructive detection in the microelectronics manufacturing industry. Among the non-contact optical measurement methods, confocal microscopy [3] and white-light interference microscopy [4,5] have the advantage of high measurement accuracy, but their complex structure and high price make them unsuitable for harsh industrial environments [6]. Fringe projection profilometry (FPP) uses phase information to obtain high-precision, high-density three-dimensional topography [7]. Due to its insensitivity to changes in background, contrast, and noise, as well as its ability to acquire full-field data, its flexibility, and its low price [8,9], it has been widely used in industrial inspection, reverse engineering, plastic surgery, cultural relic evaluation, entertainment, and other fields [10,11]. Microscale FPP, also known as fringe projection three-dimensional microscopy (FP-3DM), plays an increasingly important role in micromanufacturing and roughness measurement [12]. Traditional stereo microscopes based on FP-3DM are often modified to achieve micron-level measurements [13,14].
However, the depth of field (DOF) of conventional microscopic profilometry systems is limited to the sub-millimeter level, which is insufficient for measuring three-dimensional objects with height variations of several millimeters. In addition, conventional lenses suffer from perspective effects and lens distortion, which can cause objects to appear distorted at short distances [15]. In contrast, telecentric lenses have many unparalleled advantages, such as a high resolution, almost no distortion, constant magnification, and an increased depth of field [16,17,18,19,20]. Since telecentric lenses can eliminate parallax, determine the exact size of objects, and produce a high measurement accuracy [21], they are very suitable for demanding machine vision applications [22,23,24] and the high-precision three-dimensional (3D) reconstruction of small objects [25,26,27].
The existing FPP system calibration methods can be roughly divided into triangular stereo models and phase–height models [28]. The phase–height model fits the relationship between the absolute phase and the three-dimensional coordinates through calibration; the calibration process is relatively complicated and requires a precise translation platform and reference plane [29]. The triangular stereo model calibrates cameras of different perspectives and completes the reconstruction through the matching of homologous points. Because of the orthographic imaging model of telecentric lenses, the pinhole calibration method is no longer applicable, and many methods for telecentric lens calibration have been proposed. They can be roughly divided into three categories: self-calibration [30,31,32], three-dimensional-target-based calibration, and planar-target-based calibration. The self-calibration method does not require a calibration target and can calibrate a moving camera directly from the scene images; it is flexible but sensitive to noise and is not suitable for a fixed camera. The three-dimensional-target-based calibration method is simple and direct, but the micro-scale three-dimensional calibration cubes suitable for telecentric lenses are very expensive. The planar-target-based method is the most suitable for telecentric cameras because it is easy to apply and has good accuracy. Li Dong et al. proposed a two-step calibration method that takes into account the calibration accuracy under different distortion models [33]; it is widely used due to its flexibility and accuracy [34,35,36]. Li Rao et al. regard the telecentric camera as an entocentric system with a large focal length and assume that the extrinsic parameters except $t_z$ remain unchanged under both the perspective and the affine projection transformations. Haskamp et al. directly use nonlinear optimization methods to estimate camera parameters [37], but this approach is likely to fail to reach the global minimum. Yin et al. introduced a general imaging model and moved the calibration plate in a prescribed way during the calibration process to complete the calibration of a telecentric camera [12]. In refs. [38,39], a telecentric camera was calibrated by a simple and easy-to-implement linear fitting, and the relationship in the $Z$ direction was determined by the movement of a micro-displacement platform. Based on the factorization of ref. [40], Lanman and Liu et al. obtained an initial estimate of a camera's intrinsic and extrinsic parameters [41,42]. A problem with planar-target-based telecentric calibration is that there is sign ambiguity in the extrinsic parameters. In Lanman's method, a checkerboard plane with a known height is introduced to recover the sign. Inspired by Zhang [43], Chen et al. [44] also proposed a closed-form solution for the parameters based on orthogonality, which uses a micro-displacement platform moving along the $Z$-axis to eliminate the ambiguity. Many authors [45,46,47,48,49] have proposed their own calibration methods on this basis. Notably, Chao Chen et al. [46] used a virtual calibration plate to avoid the use of a micro-displacement platform. Most methods introduce a micro-displacement platform and calibrate the intrinsic and extrinsic parameters separately, which makes the calibration complicated and expensive. In refs. [6,50], the authors argue that the intrinsic and extrinsic parameters of a telecentric lens are naturally coupled and difficult to separate directly, so separate calibration leads to uncertainty in the accuracy of the imaging model; that is, the model itself is not robust enough. They convert the calibration corner points to the projector coordinate system (PCS) and directly calculate the projection matrix formed by the coupling of the intrinsic and extrinsic parameters, thus completing the stereo calibration and avoiding the problem of sign ambiguity.
In this paper, we propose a flexible calibration method based on telecentric epipolar constraints and develop a three-dimensional reconstruction approach using it. We derive the telecentric epipolar constraint and use it to describe the pose between the camera and the projector. The issue of sign ambiguity is avoided because the extrinsic parameters of the telecentric camera are not needed when solving for the telecentric essential matrix; thus, no additional devices are required. Furthermore, the telecentric essential matrix does not take the form of an antisymmetric matrix multiplied by an orthogonal matrix, so a new method is introduced to ensure orthogonality between the rotation elements. The experimental results demonstrate the feasibility and accuracy of our approach.
A brief introduction to the calibration principles for cameras and projectors is given in Section 2. The principle of telecentric epipolar constraints is described in detail in Section 3, where we also briefly introduce the reconstruction method based on the projection matrix proposed by Zhang et al. [45], as we will compare it with our method in subsequent experiments. Finally, Section 4 and Section 5 cover the experiments and conclusion, respectively.

2. System Calibration

The calibration of pinhole models and telecentric models has already been well developed. This section will briefly introduce the methods used in this paper for the calibration of the camera and projector, respectively. However, we do not calibrate the extrinsic parameters for the telecentric camera.

2.1. Calibration for Telecentric Lens

The imaging model for telecentric lenses can be described as follows:
$$
\begin{bmatrix} u_c \\ v_c \\ 1 \end{bmatrix} =
\begin{bmatrix} m & 0 & 0 & 0 \\ 0 & m & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} r_{11}^c & r_{12}^c & r_{13}^c & t_x^c \\ r_{21}^c & r_{22}^c & r_{23}^c & t_y^c \\ r_{31}^c & r_{32}^c & r_{33}^c & t_z^c \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
\tag{1}
$$
where $m$ represents the magnification of the telecentric lens, $[X_w\; Y_w\; Z_w\; 1]^T$ represents the homogeneous coordinates of a point in the world coordinate system (WCS), $[u_c\; v_c\; 1]^T$ represents the corresponding camera pixel coordinates, $r_{11}^c \sim r_{33}^c$ represent the rotation components, and $t_x^c, t_y^c, t_z^c$ represent the translation components. The superscript $c$ indicates the extrinsic parameters from the WCS to the telecentric camera coordinate system (CCS). Assuming $Z_w = 0$, the equation above simplifies to
$$
\begin{bmatrix} u_c \\ v_c \\ 1 \end{bmatrix} =
\begin{bmatrix} m & 0 & 0 \\ 0 & m & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} r_{11}^c & r_{12}^c & t_x^c \\ r_{21}^c & r_{22}^c & t_y^c \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}
= M_s K_s P_s = H_s P_s
\tag{2}
$$
where $M_s$ and $K_s$ denote the two matrices on the right-hand side, respectively, and
$$
H_s = M_s K_s = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ 0 & 0 & 1 \end{bmatrix},
$$
which can be solved easily through a direct linear transformation (DLT). Using the orthogonality of the rotation matrix in Equation (1), we obtain
$$
\begin{cases}
r_{11}^c r_{21}^c + r_{12}^c r_{22}^c + r_{13}^c r_{23}^c = 0 \\
r_{13}^c = \pm\sqrt{1 - (r_{11}^c)^2 - (r_{12}^c)^2} \\
r_{23}^c = \pm\sqrt{1 - (r_{21}^c)^2 - (r_{22}^c)^2}
\end{cases}
\tag{3}
$$
Combining Equations (2) and (3), a relation for the square of the magnification $m^2$ can be expressed as
$$
m^4 - \left(h_{11}^2 + h_{12}^2 + h_{21}^2 + h_{22}^2\right) m^2 + \left(h_{11} h_{22} - h_{12} h_{21}\right)^2 = 0
\tag{4}
$$
Solving Equation (4) yields four solutions, of which the two non-negative ones are meaningful. The final magnification is determined by requiring $(r_{ij}^c)^2 = h_{ij}^2 / m^2 \le 1$, where $i = 1, 2$ and $j = 1, 2$.
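For concreteness, the magnification recovery of Equation (4) can be sketched in Python/NumPy as follows (a minimal sketch; the function name and the root-selection rule based on |r_ij| ≤ 1 are our own):

```python
import numpy as np

def telecentric_magnification(H):
    """Magnification m of a telecentric camera from the affine homography
    H_s of Eq. (2), by solving the quadratic in m^2 of Eq. (4)."""
    h11, h12 = H[0, 0], H[0, 1]
    h21, h22 = H[1, 0], H[1, 1]
    b = h11**2 + h12**2 + h21**2 + h22**2
    c = (h11 * h22 - h12 * h21) ** 2
    disc = np.sqrt(max(b * b - 4.0 * c, 0.0))
    # the two non-negative roots of m^4 - b*m^2 + c = 0, largest first
    candidates = [np.sqrt((b + disc) / 2.0), np.sqrt((b - disc) / 2.0)]
    # keep the root for which every |r_ij| = |h_ij| / m stays <= 1
    h_max = np.max(np.abs([h11, h12, h21, h22]))
    for m in candidates:
        if h_max / m <= 1.0 + 1e-9:
            return m
    return candidates[0]
```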

2.2. Projector Calibration

The imaging model for the projectors follows the pinhole model, which is commonly calibrated using Zhang’s [43] calibration method. The key to projector calibration is enabling the projector to “see” corners like a camera. To achieve this, the projector projects a pattern and the camera captures it simultaneously. This indirect process allows the projector to effectively acquire image data.
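Before the imaging model, a note on decoding: the wrapped phase of N equally shifted fringe images is recovered with the standard N-step least-squares formula. A minimal sketch in Python/NumPy (the function name is ours; the paper uses N = 8):

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N equally shifted fringe images
    I_k = A + B*cos(phi + 2*pi*k/N), via the standard N-step formula."""
    I = np.stack([np.asarray(im, dtype=float) for im in images])
    N = I.shape[0]
    delta = 2.0 * np.pi * np.arange(N) / N
    shape = (N,) + (1,) * (I.ndim - 1)          # broadcast shifts over pixels
    num = np.sum(I * np.sin(delta).reshape(shape), axis=0)
    den = np.sum(I * np.cos(delta).reshape(shape), axis=0)
    return np.arctan2(-num, den)                # phi, wrapped to (-pi, pi]
```

The Gray-code bits then resolve the fringe order that unwraps this phase.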
In this paper, we employ an eight-step phase-shifting method combined with complementary Gray codes to enhance levels of robustness in stripe boundary decoding. The imaging model for the projector can be described as follows:
$$
\begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = s
\begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} r_{11}^p & r_{12}^p & r_{13}^p & t_x^p \\ r_{21}^p & r_{22}^p & r_{23}^p & t_y^p \\ r_{31}^p & r_{32}^p & r_{33}^p & t_z^p \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
\tag{5}
$$
where $s$ is a projection coefficient; $(f_x, u_0)$ and $(f_y, v_0)$ represent the focal length and principal point of the projector in the $x$ and $y$ directions; $[u_p\; v_p\; 1]^T$ represents the pixel coordinates in the projector; and the superscript $p$ indicates that the extrinsic parameters transform from the WCS to the PCS.
Assuming that the calibration pattern lies on $Z_w = 0$, the imaging model can be written as
$$
\begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = s\, M \begin{bmatrix} \mathbf{r}_1 & \mathbf{r}_2 & \mathbf{t} \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}
= \begin{bmatrix} \mathbf{h}_1 & \mathbf{h}_2 & \mathbf{h}_3 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}
= H \begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}
\tag{6}
$$
where $M$ represents the intrinsic matrix, $\mathbf{r}_1, \mathbf{r}_2$ denote the first and second columns of the rotation matrix, and $\mathbf{t}$ is the translation vector. Furthermore, $\mathbf{h}_1, \mathbf{h}_2, \mathbf{h}_3$ are the columns of the homography matrix $H$, which can be determined using DLT.
Using the orthogonality of the rotation matrix in Equation (5), the following constraints are obtained:
$$
\begin{cases}
\mathbf{r}_1^T \mathbf{r}_2 = 0 \;\Rightarrow\; \mathbf{h}_1^T M^{-T} M^{-1} \mathbf{h}_2 = 0 \\
\mathbf{r}_1^T \mathbf{r}_1 = \mathbf{r}_2^T \mathbf{r}_2 \;\Rightarrow\; \mathbf{h}_1^T M^{-T} M^{-1} \mathbf{h}_1 = \mathbf{h}_2^T M^{-T} M^{-1} \mathbf{h}_2
\end{cases}
\tag{7}
$$
Let $B = M^{-T} M^{-1} = \begin{bmatrix} B_{11} & B_{12} & B_{13} \\ B_{12} & B_{22} & B_{23} \\ B_{13} & B_{23} & B_{33} \end{bmatrix}$. Inserting this into $\mathbf{h}_i^T B \mathbf{h}_j$ and expanding, we obtain
$$
\mathbf{h}_i^T B \mathbf{h}_j =
\begin{bmatrix} h_{i1} h_{j1} \\ h_{i1} h_{j2} + h_{i2} h_{j1} \\ h_{i2} h_{j2} \\ h_{i3} h_{j1} + h_{i1} h_{j3} \\ h_{i3} h_{j2} + h_{i2} h_{j3} \\ h_{i3} h_{j3} \end{bmatrix}^T
\begin{bmatrix} B_{11} \\ B_{12} \\ B_{22} \\ B_{13} \\ B_{23} \\ B_{33} \end{bmatrix}
= \mathbf{v}_{ij}^T \mathbf{b}
\tag{8}
$$
where $\mathbf{v}_{ij}$ and $\mathbf{b}$ denote the two vectors above.
According to $\mathbf{r}_1^T \mathbf{r}_2 = 0$ and $\|\mathbf{r}_1\| = \|\mathbf{r}_2\|$, Equation (8) becomes
$$
\begin{bmatrix} \mathbf{v}_{12}^T \\ \mathbf{v}_{11}^T - \mathbf{v}_{22}^T \end{bmatrix} \mathbf{b} = 0
\tag{9}
$$
After solving Equation (9) for $\mathbf{b}$ (stacking these two rows for each calibration pose), the elements of the intrinsic and extrinsic matrices can easily be recovered from $B = M^{-T} M^{-1}$ and Equation (6).
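The closed-form step above can be sketched as follows (Python/NumPy; a sketch assuming a zero-skew intrinsic matrix, with hypothetical function names — two constraint rows per calibration pose are stacked and the null vector of the stacked system gives b):

```python
import numpy as np

def v_ij(H, i, j):
    """Row vector v_ij of Eq. (8); i, j index the columns h_i, h_j (0-based)."""
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0] * hj[0],
                     hi[0] * hj[1] + hi[1] * hj[0],
                     hi[1] * hj[1],
                     hi[2] * hj[0] + hi[0] * hj[2],
                     hi[2] * hj[1] + hi[1] * hj[2],
                     hi[2] * hj[2]])

def intrinsics_from_homographies(Hs):
    """Zhang-style closed form: stack Eq. (9) for each homography, solve
    V b = 0 by SVD, rebuild B = M^{-T} M^{-1}, and read off fx, fy, u0, v0."""
    V = []
    for H in Hs:
        V.append(v_ij(H, 0, 1))                    # r1.r2 = 0
        V.append(v_ij(H, 0, 0) - v_ij(H, 1, 1))    # |r1| = |r2|
    b = np.linalg.svd(np.asarray(V))[2][-1]        # null vector of V
    B11, B12, B22, B13, B23, B33 = b
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12**2)
    lam = B33 - (B13**2 + v0 * (B12 * B13 - B11 * B23)) / B11
    fx = np.sqrt(lam / B11)
    fy = np.sqrt(lam * B11 / (B11 * B22 - B12**2))
    u0 = -B13 * fx**2 / lam
    return fx, fy, u0, v0
```

At least three homographies in general poses are needed for the stacked system to have a one-dimensional null space.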

3. Telecentric Epipolar Constraints

The conventional epipolar constraint does not carry over directly from the pinhole model to the telecentric model. However, a similar constraint between the two can still be observed and derived analytically.

3.1. Principle

The conventional epipolar constraint is discussed first. As shown in Figure 1, $O_1, O_2$ represent the optical centers of the camera and projector, respectively. $P$ is an object point in space, $p_1, p_2$ are the projections of point $P$, and $e_1, e_2$ are the intersections of the line $O_1 O_2$ with the two image planes. Connecting $e_1, p_1$ and $e_2, p_2$ forms the epipolar lines $l_1$ and $l_2$. If $p_1$ is known, its corresponding point must lie on the intersection of the plane $O_1 O_2 P$ with the right image plane, which is referred to as the epipolar constraint.
As shown in Figure 2, for an object point $P_W$ in space, there is a telecentric CCS $O_c\text{-}X_c Y_c Z_c$ and a PCS $O_p\text{-}X_p Y_p Z_p$. In practice, the optical center of the camera lies at infinity due to the telecentric lens, preventing the formation of a baseline. Meanwhile, $P_W$ is imaged at $p_l$ in the camera image and at $p_r$ in the projector image. Assuming that the optical center of the projector $O_p$ is imaged at the point $O_c$ in the telecentric camera, we can still observe the geometric relation: a point on the camera image plane corresponds to a line on the projector image plane; that is, the point corresponding to $p_l$ lies on the line $L_r$.
Following the form of the conventional epipolar constraint, the telecentric epipolar constraint is defined in the same way as follows:
$$
P_l^T E P_r = 0, \qquad p_l^T F p_r = 0
\tag{10}
$$
where $E$ and $F$ represent the essential matrix and the fundamental matrix, $P_l$ and $P_r$ are the coordinates of $P_W$ in the CCS and PCS, respectively, and $p_l$ and $p_r$ are the image points in the camera and the projector.
Taking the PCS as the reference coordinate system (RCS), the projection equations of the camera and projector can be written as
$$
p_l = M_l K_l P_w, \qquad s_r\, p_r = M_r K_r P_w
\tag{11}
$$
where $M_l, K_l$ and $M_r, K_r$ are the intrinsic and extrinsic matrices of the camera and projector introduced in Section 2, and $s_r$ is the coefficient in the projector imaging model.
Note that the rotation matrix $R_l$ and the translation vector $T_l$ in the camera extrinsic matrix $K_l$ take the forms
$$
R_l = \begin{bmatrix} r_{11}^l & r_{12}^l & r_{13}^l \\ r_{21}^l & r_{22}^l & r_{23}^l \\ 0 & 0 & 0 \end{bmatrix}, \qquad
T_l = \begin{bmatrix} t_x^l \\ t_y^l \\ 1 \end{bmatrix}
\tag{12}
$$
By substituting $P_w = R_r^{-1} (P_r - T_r)$ into $P_l = R_l P_w + T_l$, we obtain
$$
P_l = R_l R_r^{-1} P_r + T_l - R_l R_r^{-1} T_r
\tag{13}
$$
Comparing this with $P_l = R P_r + T$, the rotation matrix and translation vector from the projector to the camera are given by $R = R_l R_r^{-1}$ and $T = T_l - R_l R_r^{-1} T_r$, respectively. Note that, owing to Equation (12), the third row of $R$ is zero and the third component of $T$ is $1$.
Left-multiplying both sides of Equation (13) by the antisymmetric matrix $\hat{T}$ of $T = [t_x\; t_y\; 1]^T$,
$$
\hat{T} = \begin{bmatrix} 0 & -1 & t_y \\ 1 & 0 & -t_x \\ -t_y & t_x & 0 \end{bmatrix}
\tag{14}
$$
gives $\hat{T} P_l = \hat{T} R P_r$, since $\hat{T} T = 0$. Left-multiplying again by $P_l^T$, whose contribution $P_l^T \hat{T} P_l$ vanishes because $\hat{T}$ is antisymmetric, yields
$$
P_l^T \hat{T} R P_r = 0
\tag{15}
$$
Here, the telecentric essential matrix and the telecentric fundamental matrix are defined as $E = \hat{T} R$ and $F = s_r M_l^{-T} E M_r^{-1}$, respectively, and we obtain
$$
P_l^T E P_r = 0, \qquad p_l^T F p_r = 0
\tag{16}
$$
Flatten the matrix $F$ into the vector $\mathbf{f} = [f_{11}\; f_{12}\; f_{13}\; f_{21}\; f_{22}\; f_{23}\; f_{31}\; f_{32}\; f_{33}]^T$. Ideally, every pair of corresponding corner points in the camera image $(u_l^i, v_l^i)$ and projector image $(u_r^i, v_r^i)$ satisfies Equation (16), so the following overdetermined linear system can be assembled:
$$
\begin{bmatrix}
u_l^1 u_r^1 & u_l^1 v_r^1 & u_l^1 & v_l^1 u_r^1 & v_l^1 v_r^1 & v_l^1 & u_r^1 & v_r^1 & 1 \\
u_l^2 u_r^2 & u_l^2 v_r^2 & u_l^2 & v_l^2 u_r^2 & v_l^2 v_r^2 & v_l^2 & u_r^2 & v_r^2 & 1 \\
\vdots & & & & \vdots & & & & \vdots \\
u_l^N u_r^N & u_l^N v_r^N & u_l^N & v_l^N u_r^N & v_l^N v_r^N & v_l^N & u_r^N & v_r^N & 1
\end{bmatrix}
\mathbf{f} = 0
\tag{17}
$$
where $N$ represents the total number of corner points. Finding the least squares solution of Equation (17), we can obtain the matrices $F$ and $E$.
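In practice, Equation (17) is solved by a singular value decomposition: the right singular vector associated with the smallest singular value is the least squares estimate of f. A minimal sketch (our own helper name; for real, noisy data the pixel coordinates should first be normalized for conditioning):

```python
import numpy as np

def estimate_fundamental(pts_cam, pts_proj):
    """Least-squares solution of Eq. (17). pts_cam and pts_proj are (N, 2)
    arrays of matched pixels; returns F (up to scale)."""
    ul, vl = pts_cam[:, 0], pts_cam[:, 1]
    ur, vr = pts_proj[:, 0], pts_proj[:, 1]
    A = np.column_stack([ul * ur, ul * vr, ul,
                         vl * ur, vl * vr, vl,
                         ur, vr, np.ones_like(ul)])
    f = np.linalg.svd(A)[2][-1]        # right singular vector, smallest sigma
    return f.reshape(3, 3)
```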

3.2. Three-Dimensional Reconstruction Based on Telecentric Epipolar Constraint

To obtain the pose between the camera and projector, the telecentric essential matrix is expanded according to $E = \hat{T} R$:
$$
E = \begin{bmatrix}
-r_{21} & -r_{22} & -r_{23} \\
r_{11} & r_{12} & r_{13} \\
r_{21} t_x - r_{11} t_y & r_{22} t_x - r_{12} t_y & r_{23} t_x - r_{13} t_y
\end{bmatrix}
\tag{18}
$$
The components $r_{11} \sim r_{23}, t_x, t_y$ in Equation (18) represent the extrinsic parameters from the projector to the camera. However, $r_{11} \sim r_{23}$ do not necessarily satisfy the orthogonality of the rotation matrix $R$ mentioned above. Under the traditional epipolar constraint, a singular value decomposition (SVD) of the essential matrix is performed, and its singular value matrix is adjusted to $\Sigma = \mathrm{diag}(1, 1, 0)$ to enforce orthogonality. However, the singular value matrix of the proposed telecentric essential matrix does not satisfy $\Sigma = \mathrm{diag}(1, 1, 0)$, since $E$ is not the product of an antisymmetric matrix and an orthogonal matrix.
The singular values of the telecentric essential matrix are deduced below. Assuming $E = U \Sigma V^T$,
$$
E E^T = \hat{T} R R^T \hat{T}^T = U \Sigma V^T V \Sigma^T U^T = U \Sigma^2 U^T
\tag{19}
$$
which can be written out in detail as
$$
\hat{T} R R^T \hat{T}^T = \begin{bmatrix}
1 & 0 & -t_x \\
0 & 1 & -t_y \\
-t_x & -t_y & t_x^2 + t_y^2
\end{bmatrix}
\tag{20}
$$
Solving Equation (20) gives the eigenvalues $\lambda_1 = t_x^2 + t_y^2 + 1$, $\lambda_2 = 1$, and $\lambda_3 = 0$, so the singular value matrix should be
$$
\Sigma = \mathrm{diag}\!\left(\sqrt{\lambda_1},\, 1,\, 0\right)
\tag{21}
$$
By performing an SVD on matrix $E$ and adjusting its singular value matrix into the form $\Sigma = \mathrm{diag}(\sigma_1, 1, 0)$, where $\sigma_1 = \sqrt{\lambda_1}$, we can finally obtain the rotation vectors $\mathbf{r}_1, \mathbf{r}_2$ and the translation vector $\mathbf{t}$.
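This decomposition step can be sketched as follows (our reading of Equations (18) and (21); `decompose_telecentric_E` is a hypothetical helper, and the overall sign of E, which the homogeneous solve of Equation (17) leaves undetermined, is assumed to have been fixed):

```python
import numpy as np

def decompose_telecentric_E(E):
    """Recover r1, r2 (first two rows of R) and (tx, ty) from a telecentric
    essential matrix known up to scale, enforcing Sigma = diag(sigma1, 1, 0)."""
    U, S, Vt = np.linalg.svd(E)
    E = E / S[1]                              # scale so the middle singular value is 1
    U, S, Vt = np.linalg.svd(E)
    E = U @ np.diag([S[0], 1.0, 0.0]) @ Vt    # project onto the admissible form (Eq. (21))
    r1 = E[1, :]                              # Eq. (18): row 2 of E is  r1
    r2 = -E[0, :]                             #           row 1 of E is -r2
    tx = E[2, :] @ r2                         # row 3 is tx*r2 - ty*r1, with r1, r2
    ty = -(E[2, :] @ r1)                      # orthonormal (up to noise)
    return r1, r2, tx, ty
```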
Taking the PCS as the RCS, we combine the camera and projector projection equations for the three-dimensional reconstruction:
$$
\begin{bmatrix} u_c \\ v_c \\ 1 \end{bmatrix} = M_l K_l P_w = H_{3 \times 4} P_w =
\begin{bmatrix} h_{11} & h_{12} & h_{13} & h_{14} \\ h_{21} & h_{22} & h_{23} & h_{24} \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad
s_r \begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} = M_r K_r P_w =
\begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
\tag{22}
$$
where the intrinsic matrices $M_l, M_r$ are obtained through the calibration steps described above. Eliminating $s_r$ in Equation (22), the equations can be written in matrix form as
$$
\begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ f_x & 0 & u_0 - u_p \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} =
\begin{bmatrix} u_c - h_{14} \\ v_c - h_{24} \\ 0 \end{bmatrix}
\tag{23}
$$
By solving Equation (23), the three-dimensional coordinates can be obtained.
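Per pixel, Equation (23) is a 3 × 3 linear solve. A minimal sketch (hypothetical helper name; `H_cam` is the coupled telecentric camera matrix of Equation (22)):

```python
import numpy as np

def triangulate(uc, vc, up, H_cam, fx, u0):
    """Solve the 3x3 system of Eq. (23) for one pixel: two telecentric
    camera rows plus one projector row give (Xw, Yw, Zw)."""
    A = np.array([H_cam[0, :3],
                  H_cam[1, :3],
                  [fx, 0.0, u0 - up]])
    rhs = np.array([uc - H_cam[0, 3], vc - H_cam[1, 3], 0.0])
    return np.linalg.solve(A, rhs)
```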

3.3. Three-Dimensional Reconstruction Based on Projection Matrix

Song Zhang’s method, which we mentioned above, couples the internal parameters and external parameters of the telecentric camera and calculates its projection matrix directly.
Because of this, we call it the three-dimensional reconstruction method based on the projection matrix.
Assuming that the projector has been calibrated with the method above, the corner points of the calibration board in all poses can be converted to the PCS through the extrinsic parameters of the projector and written as
$$
X_n^i = \begin{bmatrix} X_p & Y_p & Z_p & 1 \end{bmatrix}_n^{i\,T}
\tag{24}
$$
where $n$ is the index of the corner point on each calibration board, and $i$ denotes the serial number of the calibration board pose. According to the imaging model of the telecentric camera,
$$
\begin{bmatrix} u_c \\ v_c \\ 1 \end{bmatrix} =
\begin{bmatrix} h_{11} & h_{12} & h_{13} & h_{14} \\ h_{21} & h_{22} & h_{23} & h_{24} \\ 0 & 0 & 0 & 1 \end{bmatrix} X_n^i
\tag{25}
$$
which can be rewritten as
$$
\begin{bmatrix} X_1^{1T} \\ X_2^{1T} \\ \vdots \\ X_n^{iT} \end{bmatrix}
\begin{bmatrix} h_{11} \\ h_{12} \\ h_{13} \\ h_{14} \end{bmatrix} =
\begin{bmatrix} u_c^1 \\ u_c^2 \\ \vdots \\ u_c^{ni} \end{bmatrix}, \qquad
\begin{bmatrix} X_1^{1T} \\ X_2^{1T} \\ \vdots \\ X_n^{iT} \end{bmatrix}
\begin{bmatrix} h_{21} \\ h_{22} \\ h_{23} \\ h_{24} \end{bmatrix} =
\begin{bmatrix} v_c^1 \\ v_c^2 \\ \vdots \\ v_c^{ni} \end{bmatrix}
\tag{26}
$$
All parameters in the projection matrix can be computed from the least squares solution of Equation (26). Similarly, the three-dimensional reconstruction can be completed by combining Equation (26) with the projection equations of the projector, in the same form as Equation (23). This three-dimensional reconstruction method is thus simple and convenient.
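The least squares fit of Equation (26) can be sketched as follows (hypothetical helper name):

```python
import numpy as np

def fit_telecentric_projection(X, uc, vc):
    """Least-squares fit of the two free rows of the telecentric projection
    matrix H (Eq. (26)). X is an (N, 3) array of points in the PCS;
    uc, vc are their camera pixel coordinates."""
    A = np.column_stack([X, np.ones(len(X))])      # rows X_n^T = [Xp Yp Zp 1]
    h1, *_ = np.linalg.lstsq(A, uc, rcond=None)
    h2, *_ = np.linalg.lstsq(A, vc, rcond=None)
    return np.vstack([h1, h2, [0.0, 0.0, 0.0, 1.0]])
```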

4. Experiment and Discussion

To verify the proposed method, we establish an experimental system to evaluate the accuracy of the three-dimensional reconstruction and compare our method with the projection matrix method. We used a Hikvision camera, model MV-CH210-90YM-M58S-NN, with a resolution of 5120 pixel × 4096 pixel and a pixel size of 4.5 μm × 4.5 μm, together with a CR Canrui XF-PTL05535-F-VI telecentric lens, whose object-side field of view is 35.0 mm × 28.0 mm, depth of field is 0.9~12.0 mm, and magnification is 0.658. We used an Anhua Optoelectronics model H9H projector with a resolution of 1280 pixel × 720 pixel. The calibration board used in the experiment is a GP050-9 × 9, and the standard ball has a nominal diameter of 10 mm, with a manufacturing accuracy of ∅10.0041 mm. The camera and projector are mounted on brackets, which also fix their poses. The platform built for the experiment is shown in Figure 3:
The experiment is carried out within a 3 mm depth range inside the depth of field of the telecentric lens, in which clear images can be captured. The calibration board is placed at different positions at each depth to ensure that the camera's field of view is covered. Specifically, the board is moved at 0.75 mm intervals over the depth range, and at each depth, the board is moved to the four corners and the center so that all positions in the camera's field of view are covered. After photographing the calibration board, the Gray code and phase-shift stripes are projected. Theoretically, the more phase-shifting steps, the higher the accuracy. However, once the number of steps reaches a certain threshold, further increments do not markedly enhance the precision but rather augment the computational burden [51], so we use an eight-step phase shift in this experiment. To cover the projector image plane, we chose a 10-bit Gray code, because the projector's resolution is 1280 pixels × 720 pixels and the stripe is 4 pixels wide; a 9-bit Gray code would have sufficed, but we chose the 10-bit code because of the complementary Gray code. The stripes are shown in Figure 4:
After calibrating the system, the pose between the camera and the projector is calculated through the telecentric epipolar constraint algorithm. Based on the theory above, the code is written to reconstruct the point clouds of the planes and spheres. We utilize Geomagic Studio 2013 for point cloud data processing, which includes modeling, noise reduction, the elimination of non-connected points, fitting, etc. In this study, we use it to fit and calculate the deviation for the plane and sphere data. These results serve as the basis for comparing different methods. However, because Geomagic Studio 2013 cannot draw a diagram with an axis line, we use MATLAB R2020a to plot a precise diagram.
Based on the telecentric epipolar constraints, a reconstruction is performed for the planes and spheres. In the pseudo-colored map, the color represents the distance deviation between the point cloud data and the fitting feature. The correspondence between the degree of deviation and the color is given by the axis next to it.
In Figure 5, the fitting standard deviation of the plane is 11.0 μm, and the color map of the plane-fitting deviation is relatively uniform, indicating that the reconstructed plane is flat overall. The fitting standard deviation of the sphere is 14.9 μm, with a fitted diameter of 10.0064 mm, and the deviation color map is also relatively uniform.
Meanwhile, the three-dimensional reconstruction based on the projection matrix method is performed on the same plane and sphere, following the same process. The final reconstructed results are shown below.
In Figure 6, the fitting standard deviation of the plane is 23.81 μm. The pseudo-colored map reveals distortion at the edges of the plane, indicating that the fitting plane has some curvature. The diameter of the sphere is 10.0304 mm, with a fitting standard deviation of 13.4 μm.
To compare the reconstruction results across the entire field of view, the sphere is placed at different positions within the camera’s view as shown in Figure 7. Two methods are employed to reconstruct the sphere under varying fields of view. The point cloud data are then fitted to find the best-fitting sphere. The standard deviation of the fitting data and the absolute deviation of diameter (the difference between the fitting diameter and the diameter of a standard ball (10 mm)) are calculated. The specific comparative data for the reconstructions are presented in Table 1. Please note that in Table 1 and Table 2, we use ‘M’ to represent the data produced from the projection matrix method and ‘E’ to represent the data produced from the method we proposed.
It can be observed that the standard deviations of the two methods are generally at the same level and relatively stable. The absolute deviation of the sphere’s diameter reconstructed by our proposed method is within 10 μm, which is better than that achieved by the projection matrix method.
Similarly, the plane is reconstructed at different depths. The baseline plane is taken from the farthest depth, and reconstructions are performed by moving along the plane’s Z-axis with a step size of 0.75 mm. The specific comparative data are presented in the following table.
As shown in Table 2, the fitting standard deviations of the plane reconstructed by our method are consistently better than those obtained with the projection matrix method. Taking Plane 3 as an example, the fitting standard deviation achieved by our method is 10.9 μm, which outperforms the projection matrix reconstruction’s fitting standard deviation of 21.9 μm, resulting in an improvement of approximately 50.2%.
To further investigate the cause of the fitting standard deviation difference between the two methods, we uniformly sample a set of point clouds from the four corners and center of each plane. The average distance from these point clouds to the fitted plane is calculated. The absolute difference between the corner-to-plane distance and the center-to-plane distance is defined as the curvature of that plane. The specific experimental data for the curvature of planes reconstructed using the projection matrix method and the epipolar constraint method are shown in Table 3 and Table 4, respectively.
From Table 3 and Table 4, it can be observed that the corner curvature of the planes reconstructed using the projection matrix method is consistently greater than that achieved by our method. Taking Plane 3 as an example, the average curvature of the plane reconstructed using the epipolar constraint method is 5.6 μm, which is significantly lower than the average curvature of 15.9 μm obtained by the projection matrix method. This reduction in curvature by approximately 64.8% further confirms that the larger standard deviation in the projection matrix method primarily stems from distortion at the edges of the reconstructed plane, resulting in an overall curved shape. In contrast, our proposed method yields planes with better planar characteristics. This result aligns with the pseudo-colored deviation maps of the planes shown in Figure 5 and Figure 6.

5. Conclusions

In this paper, we propose a calibration method based on the constraint of telecentric epipolar geometry. It utilizes the epipolar constraint relationship between the telecentric and pinhole models to solve for the pose between the camera and the projector, based on which we realize a three-dimensional reconstruction.
Compared to the projection matrix method, the approach we proposed achieves diametric absolute deviations within 10 μm, with both methods yielding similar standard deviations for the sphere reconstruction. Moreover, in the case of the plane reconstruction, our method results in generally flat planes, while the projection matrix method exhibits curvature at the edges. This discrepancy arises because the projection matrix method does not consider the inherent properties of the rotation matrix; instead, it directly couples the intrinsic and extrinsic parameters. In contrast, our approach derives the essential matrix from the geometric relation, fully considering its properties and optimizing the rotation parameters to ensure orthogonality.
In conclusion, this paper has made improvements in the following three aspects:
  • When recovering the extrinsic parameters between the camera and the projector using the telecentric essential matrix, we ensure the intrinsic properties of the essential matrix under telecentric conditions by performing an SVD and adjusting the singular value matrix. This guarantees that the decomposed rotation matrix satisfies orthogonality constraints. Compared to the projection matrix method, the approach proposed in this paper results in smaller reconstruction standard deviations.
  • It is more flexible. The equation calculating the essential matrix does not involve the extrinsic parameters of the camera. Therefore, during the camera calibration process, it is only necessary to determine the intrinsic parameters, which avoids the symbolic ambiguity of the extrinsic parameters. Additionally, in the experimental procedure, there is no need to introduce a micro-displacement platform, simplifying the process and minimizing potential errors.
  • During the essential matrix estimation process, incorporating the calibration board corner data from all poses for the optimization results in a certain level of average error improvement.
The method we proposed dispenses with the extrinsic parameters when calibrating the telecentric camera but requires the intrinsic parameters of the camera and projector when solving the telecentric essential matrix. In our implementation, nonlinear optimization is adopted to refine the projector intrinsic parameters, while the accuracy of the camera intrinsics depends on the quality of the calibration images. Better accuracy could be achieved with an additional pose optimization between the camera and the projector.

Author Contributions

Conceptualization, X.Z. and X.Y.; methodology, Q.L. and Z.G.; software, Q.L.; validation, Q.L.; investigation, X.Y.; resources, X.Z.; data curation, Z.G.; writing—original draft preparation, Q.L.; writing—review and editing, Q.L.; visualization, Q.L.; supervision, X.Z.; project administration, Q.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article, further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Epipolar constraint in the pinhole model.
Figure 2. Epipolar constraint under telecentric model.
Figure 3. Experiment platform.
Figure 4. Gray code stripes (left) and phase-shift stripes (right).
Figure 5. Reconstruction of point cloud pseudo-colored models for plane and sphere based on telecentric epipolar constraints.
Figure 6. The pseudo-colored point cloud models of the plane and sphere reconstruction using the projection matrix.
Figure 7. The position of the standard sphere in the camera’s field of view.
Table 1. Standard deviation and diameter absolute deviation of sphere reconstruction at different positions.
Sphere | Position | Standard Deviation (M)/μm | Standard Deviation (E)/μm | Diameter Absolute Deviation (M)/μm | Diameter Absolute Deviation (E)/μm
S1 | upper left | 13.4 | 14.9 | 30.4 | 6.4
S2 | upper right | 14.0 | 14.7 | 16.3 | 8.4
S3 | middle | 13.1 | 14.6 | 25.4 | 1.0
S4 | lower left | 14.1 | 15.4 | 26.3 | 9.4
S5 | lower right | 13.9 | 14.8 | 14.1 | 6.3
Table 2. Comparison of standard deviations for plane reconstruction at different depths.
Plane | Z/mm | Standard Deviation (M)/μm | Standard Deviation (E)/μm
Plane 1 | 0.00 | 23.8 | 11.0
Plane 2 | 0.75 | 21.7 | 11.1
Plane 3 | 1.50 | 21.9 | 10.9
Plane 4 | 2.25 | 21.8 | 9.9
Plane 5 | 3.00 | 23.1 | 9.1
Table 3. Corner curvature of plane reconstruction at different depths using the projection matrix method.
(Columns 2-6: Distance to Fitting Plane/μm; columns 7-11: Curvature/μm.)
Plane | Upper Left | Upper Right | Lower Left | Lower Right | Center | Upper Left | Upper Right | Lower Left | Lower Right | AVG
Plane 1 | 41.5 | 62.2 | 27.2 | 60.6 | 47.9 | 6.4 | 14.3 | 20.7 | 12.7 | 13.5
Plane 2 | 39.0 | 65.2 | 28.7 | 55.5 | 47.1 | 8.1 | 18.1 | 18.4 | 8.4 | 13.3
Plane 3 | 24.1 | 52.0 | 31.3 | 66.9 | 43.5 | 19.4 | 8.5 | 12.2 | 23.4 | 15.9
Plane 4 | 29.4 | 63.4 | 33.9 | 61.2 | 47.0 | 17.6 | 16.4 | 13.1 | 14.2 | 15.3
Plane 5 | 33.7 | 77.7 | 26.5 | 87.1 | 56.2 | 22.5 | 21.5 | 29.7 | 30.9 | 26.2
Table 4. Corner curvature of plane reconstruction at different depths using the epipolar constraint method.
(Columns 2-6: Distance to Fitting Plane/μm; columns 7-11: Curvature/μm.)
Plane | Upper Left | Upper Right | Lower Left | Lower Right | Center | Upper Left | Upper Right | Lower Left | Lower Right | AVG
Plane 1 | 10.9 | 15.2 | 9.9 | 9.5 | 8.6 | 2.3 | 6.6 | 1.3 | 0.9 | 2.8
Plane 2 | 15.0 | 16.9 | 13.1 | 12.4 | 8.2 | 6.8 | 8.7 | 4.9 | 4.2 | 6.2
Plane 3 | 17.2 | 16.7 | 9.0 | 12.2 | 8.2 | 9.0 | 8.5 | 0.8 | 4.0 | 5.6
Plane 4 | 8.6 | 17.2 | 10.4 | 13.0 | 8.8 | 0.2 | 8.4 | 1.6 | 4.2 | 3.6
Plane 5 | 5.4 | 27.4 | 8.3 | 18.8 | 7.5 | 2.1 | 19.9 | 0.8 | 11.3 | 8.5
