# 3D Metrology Using One Camera with Rotating Anamorphic Lenses


## Abstract


## 1. Introduction

## 2. Anamorphic Imaging Model

#### 2.1. Anamorphic Imaging Model

O_{W}-X_{W}Y_{W}Z_{W} are the world coordinates, O_{c}-X_{c}Y_{c}Z_{c} are the camera coordinates centered in the CCD plane, and O_{p}-X_{p}Y_{p}Z_{p} are the pixel coordinates centered in the CCD plane. O_{cx} and O_{cy} are the optical centers in the horizontal plane and the vertical plane, respectively, and f_{x} and f_{y} are the focal lengths in the two planes. The imaging model for anamorphic lenses can be expressed as follows:

[X_{w}; Y_{w}; Z_{w}] is a point in the world coordinates, and [X_{c}; Y_{c}; Z_{c}] is the same point expressed in the anamorphic coordinates. R and T denote the rotation matrix and the translation vector from the world coordinates to the anamorphic coordinates. (X_{I}, Y_{I}) are the image coordinates in the camera coordinate system in the image plane, and (X_{P}, Y_{P}) are the image coordinates in the pixel coordinate system in the image plane.

#### 2.2. Anamorphic Distortion Model

The distortion model for anamorphic lenses includes radial distortions (X_{rad}, Y_{rad}), third-order distortions (X_{3}, Y_{3}), and second-order distortions (X_{2}, Y_{2}). (x_{d}, y_{d}) denotes the distorted image coordinates, and (x_{c}, y_{c}) denotes the undistorted image coordinates. [k_{1}, k_{2}, n_{1}, n_{2}, m_{1}, m_{2}, x_{21}, x_{12}, x_{03}, x_{20}, x_{11}, x_{02}, y_{30}, y_{21}, y_{12}, y_{03}, y_{20}, y_{11}, y_{02}] are the 19 distortion coefficients for anamorphic lenses that must be calibrated.

## 3. 3D Metrology Using Rotating Anamorphic Lenses

#### 3.1. Description of 3D Metrology Using Rotating Anamorphic Lenses

(x_{v}, y_{v}) refer to the image position of the object point P_{V} = (X_{C}, Y_{C}, Z_{C}), where P_{V} is expressed in the anamorphic coordinates in the vertical position. After the anamorphic rotation, the coordinates of the object point in the anamorphic coordinates change to P_{H} = [Y_{C}; −X_{C}; Z_{C}], and we have the following equations:

Given f_{x}, f_{y}, and ad, as well as the image coordinates (x_{v}, y_{v}) and (x_{h}, y_{h}) of an object point in an ideal anamorphic lens, the 3D coordinates of this point can be easily reconstructed using Equation (7).
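Equation (7) is not reproduced in this excerpt, but the reconstruction principle can be sketched under an idealized, distortion-free anamorphic pinhole model. The projection below is an assumption for illustration (horizontal coordinates imaged with f_{x}, vertical coordinates imaged with f_{y} from a principal plane displaced by the anamorphic distance ad); it is not the paper's exact model, but it shows why the two lens positions together determine depth:

```python
def project(fx, fy, ad, X, Y, Z):
    """Idealized anamorphic projection (assumed model, not Eq. (3)-(6)):
    horizontal rays use fx; vertical rays are imaged from a principal
    plane displaced by the anamorphic distance ad."""
    return fx * X / Z, fy * Y / (Z + ad)

def reconstruct(fx, fy, ad, xv, yv, xh):
    """Recover (X, Y, Z) from the vertical-position image (xv, yv) and
    the horizontal-position image, where the lens rotation maps the
    object point to P_H = [Y, -X, Z]."""
    # Y is observed twice with different effective depths:
    #   yv = fy*Y/(Z + ad)   and   xh = fx*Y/Z,
    # so their ratio isolates Z; depth is recoverable only because ad != 0.
    Z = ad / (xh * fy / (yv * fx) - 1.0)
    Y = xh * Z / fx
    X = xv * Z / fx
    return X, Y, Z

fx, fy, ad = 12.0, 16.0, 30.0            # mm, values used in Section 4
X, Y, Z = 500.0, 500.0, 1500.0           # object point, vertical-position coords
xv, yv = project(fx, fy, ad, X, Y, Z)    # vertical position
xh, yh = project(fx, fy, ad, Y, -X, Z)   # horizontal position sees P_H = [Y, -X, Z]
print(reconstruct(fx, fy, ad, xv, yv, xh))   # -> approximately (500, 500, 1500)
```

Note that this solve degenerates when Y_{C} ≈ 0, since both yv and xh vanish, which foreshadows the anamorphic gap discussed in Section 4.2.2.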

#### 3.2. Point Matching

A simulation was conducted using an ideal anamorphic lens with f_{x} = 12 mm, f_{y} = 16 mm, and no distortions. As shown in Figure 3, the object to be built is a collection of points on a spherical surface, with their coordinates given in the vertical position of the anamorphic lens. The simulated image points on the image planes are shown in Figure 4, where the dot points and the circle points express the image points when the anamorphic lens is in the vertical and horizontal positions, respectively. It is not easy to match the image points from these two positions at this stage. In our method, the point matching can be greatly simplified by using a parameter of an anamorphic lens known as the anamorphic ratio AR, which is:
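The definition of AR in Equation (8) is not reproduced in this excerpt. As an illustrative sketch, assume AR = f_{y}/f_{x} together with an idealized, distortion-free projection (both assumptions for illustration only). Stretching the vertical-position image horizontally by AR and the horizontal-position image vertically by AR brings corresponding points almost into coincidence, with a residual of order ad/Z, which is what makes the matching in Figure 5 easy:

```python
import math

fx, fy, ad = 12.0, 16.0, 30.0   # mm, values used in Section 4
AR = fy / fx                    # assumed definition of the anamorphic ratio

def vertical_image(X, Y, Z):
    # idealized vertical-position projection (assumption, for illustration)
    return fx * X / Z, fy * Y / (Z + ad)

def horizontal_image(X, Y, Z):
    # after the -90 deg lens rotation, the squeeze direction swaps axes
    return fy * X / (Z + ad), fx * Y / Z

X, Y, Z = 500.0, 500.0, 1500.0
xv, yv = vertical_image(X, Y, Z)
xh, yh = horizontal_image(X, Y, Z)

pv = (AR * xv, yv)   # dot points: rectified horizontally by AR
ph = (xh, AR * yh)   # circle points: rectified vertically by AR
print(math.dist(pv, ph))   # small residual, so the pair nearly overlaps
```

With these numbers the rectified pair differ by roughly 0.15 mm in the image plane, against image coordinates of about 5 mm, so a simple nearest-neighbor match suffices.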

#### 3.3. Stereo Vision with Anamorphic Lenses

The intrinsic parameters [f_{x}, f_{y}, u_{0}, v_{0}, ad, aa] and the 19 distortion coefficients [k_{1}, k_{2}, n_{1}, n_{2}, m_{1}, m_{2}, x_{21}, x_{12}, x_{03}, x_{20}, x_{11}, x_{02}, y_{30}, y_{21}, y_{12}, y_{03}, y_{20}, y_{11}, y_{02}] can be determined.

Then, the transformation [t_{x}, t_{y}, t_{z}, a, b, r] between the 3D calibration target and the anamorphic lens can be easily calibrated:

Ideally, [t_{x}, t_{y}, t_{z}, a, b, r] = [0 mm, 0 mm, 0 mm, 0°, 0°, 90°]. After the anamorphic lens calibration and the stereo calibration for the rotating anamorphic lenses, the 3D reconstruction using rotating anamorphic stereo vision is similar to that in stereo vision with spherical lenses.
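The ideal stereo extrinsics between the two anamorphic positions reduce to a pure roll of r = 90° about the optical axis. As a minimal sketch (under one assumed Euler-angle sign convention), expressing a fixed point in a frame rolled by +90° is the same as rotating the point's coordinates by −90°, which reproduces P_{H} = [Y_{C}; −X_{C}; Z_{C}] from Section 3.1:

```python
import math

def rot_z(deg):
    """Rotation matrix about the optical (Z) axis."""
    t = math.radians(deg)
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, p):
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

P_V = [500.0, 500.0, 1500.0]    # point in the vertical-position frame
# coordinates of the same point in a frame rolled by r = +90 deg:
P_H = apply(rot_z(-90.0), P_V)
print(P_H)   # -> approximately [500.0, -500.0, 1500.0], i.e. [Y, -X, Z]
```

The calibrated roll of 90.0059° reported in Section 4.1 is within 0.006° of this ideal value; the small residual translation and tilt come from the deviation between the optical axis and the rotary-table axis.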

## 4. Experiments

#### 4.1. Experiments

The calibrated transformation was [t_{x}, t_{y}, t_{z}, a, b, r] = [−0.8943 mm, 0.3591 mm, −0.1712 mm, −0.5586°, −0.8074°, 90.0059°]. After the anamorphic lens calibration and the rotating anamorphic stereo calibration, we conducted 3D metrology using the rotating anamorphic lens. To achieve dense 3D points, a checkerboard with a square length of 5 mm was adopted. The original anamorphic images are shown in Figure 10: Figure 10a shows the image when the anamorphic lens was in the vertical position, and Figure 10b shows the image when the anamorphic lens was in the horizontal position, obtained by rotating the anamorphic lens by −90° about the optical axis. Once the corresponding corners in Figure 10a,b were determined, the 3D coordinates of the corners could be reconstructed using the rotating anamorphic stereo vision.

After a shift of (d_{u}, d_{v}) = (105.0792 pixels, −28.8297 pixels), the two images will overlap. In Figure 12b, the corresponding points are very close, similar to the simulation shown in Figure 5.

#### 4.2. Accuracy Analysis

The calibration accuracies of f_{x} and f_{y} are also very important. This method’s measuring principle is based primarily on the anamorphic distance, which changes the image position compared to spherical lenses. The anamorphic distance ad shown in Section 4.1 was small in comparison to the object distance; thus, any small pixel position error has a significant impact on the measurement results. From Equation (7), we have:

δx_{h}, δx_{v}, and δy_{v} are the pixel errors. If we substitute the pixel errors in Equation (14) with δ, the point error can be deduced from Equation (14) as follows:

x_{2}y_{2}, x_{2}, x_{1}y_{1}, and y_{2} in Equation (15) are given by:

The parameters were set to f_{x} = 12 mm, f_{y} = 16 mm, aa = 0°, and ad = 30 mm.

#### 4.2.1. Accuracy Analysis for a Point

An object point was located at [X_{C}, Y_{C}, Z_{C}] = [500 mm, 500 mm, 1500 mm] in the anamorphic coordinates for the vertical position. The pixel coordinates of the point in the vertical and the horizontal anamorphic positions were calculated using Equations (3)–(6). Given the pixel coordinates and the calibrated parameters of the rotating anamorphic stereo vision, the 3D coordinates of the point could be calculated. The pixel errors in the image plane were assumed to have a Gaussian normal distribution with a standard deviation of 0.2 pixels, which was the calibration result of the anamorphic lens. Then, 5000 3D points were reconstructed with varying pixel errors; the reconstructed 3D points are shown in Figure 16. The standard deviation of the reconstructed 3D points was 17.3378 mm.
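This Monte Carlo experiment can be approximated with a short script. The projection below is an idealized, distortion-free stand-in for Equations (3)–(6), and the pixel pitch is an assumed value (it is not stated in this excerpt), so the resulting standard deviation only illustrates the order of magnitude of the 17.3378 mm result, not its exact value:

```python
import random, statistics

fx, fy, ad = 12.0, 16.0, 30.0        # mm
PITCH = 2.4e-3                       # assumed pixel pitch in mm (illustrative)
SIGMA = 0.2 * PITCH                  # 0.2-pixel Gaussian noise, expressed in mm
X, Y, Z = 500.0, 500.0, 1500.0       # object point, vertical-position coords

# idealized, distortion-free image coordinates for the two lens positions
xv, yv = fx * X / Z, fy * Y / (Z + ad)
xh = fx * Y / Z                      # horizontal position sees P_H = [Y, -X, Z]

def reconstruct(xv, yv, xh):
    # depth from the ratio of the two Y observations; X, Y by back-projection
    Zr = ad / (xh * fy / (yv * fx) - 1.0)
    return xv * Zr / fx, xh * Zr / fx, Zr

random.seed(0)
pts = [reconstruct(xv + random.gauss(0, SIGMA),
                   yv + random.gauss(0, SIGMA),
                   xh + random.gauss(0, SIGMA)) for _ in range(5000)]
for i, name in enumerate("XYZ"):
    print(name, statistics.stdev(p[i] for p in pts))  # errors of order 10 mm
```

The large depth sensitivity comes from the factor ad/(k − 1)² in the error propagation: because ad is small relative to the object distance, the ratio k is close to 1, and a sub-pixel image error is amplified into a millimeter-scale 3D error.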

#### 4.2.2. Accuracy Analysis for a Surface

The X_{C} and Y_{C} coordinates ranged from −1000 mm to 1000 mm, with an interval of 50 mm, and all the Z_{C} coordinates were 1500 mm. The pixel errors in the image plane satisfied a Gaussian normal distribution with a standard deviation of 0.2 pixels. The standard deviation for each reconstructed 3D point is shown in Figure 17. As can be seen in Figure 17, the measuring accuracy was high for points away from the X_{C} axis, while it was very low for object points near the X_{C} axis. The points near the X_{C} axis were removed, which left a gap in Figure 17. Thus, there is a measuring gap when rotating anamorphic lenses are used for 3D reconstruction; we named this gap the anamorphic gap.
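The anamorphic gap can be reproduced with the same kind of idealized sketch (assumed projection and pixel pitch, for illustration only): depth comes from the ratio of two image measurements that are both proportional to Y_{C}, so the reconstruction becomes ill-conditioned as Y_{C} → 0 and the depth error grows roughly as 1/|Y_{C}|:

```python
import random, statistics

fx, fy, ad, Z = 12.0, 16.0, 30.0, 1500.0   # mm
SIGMA = 0.2 * 2.4e-3    # 0.2-pixel noise with an assumed 2.4 um pixel pitch
random.seed(1)

def sigma_Z(Y, trials=2000):
    """Monte Carlo depth error for an object point at height Y (mm)."""
    yv, xh = fy * Y / (Z + ad), fx * Y / Z   # the two Y-dependent observations
    zs = [ad / ((xh + random.gauss(0, SIGMA)) * fy /
                ((yv + random.gauss(0, SIGMA)) * fx) - 1.0)
          for _ in range(trials)]
    return statistics.stdev(zs)

for Y in (1000.0, 500.0, 200.0, 50.0):
    print(Y, sigma_Z(Y))   # depth error grows rapidly as Y -> 0: the anamorphic gap
```

In this sketch the depth error at Y_{C} = 50 mm is more than an order of magnitude larger than at Y_{C} = 1000 mm, mirroring the gap left in Figure 17 after removing the points near the X_{C} axis.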

The anamorphic distance ad, f_{x}, and f_{y} are closely related because they are determined by the anamorphic lens structure, and it is not possible to simply change one parameter independently [25]. We proposed paraxial lens designs for anamorphic lenses with zero anamorphic distance [26], but designing an anamorphic attachment with an extremely large anamorphic distance appears to be difficult.

## 5. Conclusions

_{C}axis of the anamorphic lens in the vertical position. These characteristics make the rotating anamorphic stereo vision suitable for a fast 3D reconstruction without a high demand for measurement accuracy, such as car navigation applications. Further research might include high precision anamorphic lens calibration, error compensation, point matching for rotating anamorphic lenses, and anamorphic lens designs with a large anamorphic distance.


## References

- Leach, R. Optical Measurement of Surface Topography; China Science Publishing & Media Ltd.: Beijing, China, 2012; pp. 154–196.
- Gaha, R.; Durupt, A.; Eynard, B. Towards the implementation of the Digital Twin in CMM inspection process: Opportunities, challenges and proposals. Procedia Manuf. **2021**, 54, 216–221.
- Sutton, M.A.; Li, N.; Joy, D.C.; Reynolds, A.P.; Li, X. Scanning electron microscopy for quantitative small and large deformation measurements part I: SEM imaging at magnifications from 200 to 10,000. Exp. Mech. **2007**, 47, 775–787.
- Harding, K. Challenges and opportunities for 3D optical metrology: What is needed today from an industry perspective. Two-Three-Dimens. Methods Insp. Metrol. VI **2008**, 7066, 112–119.
- Marrugo, A.G.; Gao, F.; Zhang, S. State-of-the-art active optical techniques for three-dimensional surface metrology: A review. JOSA A **2020**, 37, B60–B77.
- Soid, S.N.; Zainal, Z.A. Spray and combustion characterization for internal combustion engines using optical measuring techniques—A review. Energy **2011**, 36, 724–741.
- O’Riordan, A.; Newe, T.; Dooly, G.; Toal, D. Stereo vision sensing: Review of existing systems. In Proceedings of the 2018 12th International Conference on Sensing Technology (ICST), Limerick, Ireland, 4–6 December 2018.
- Lazaros, N.; Sirakoulis, G.C.; Gasteratos, A. Review of stereo vision algorithms: From software to hardware. Int. J. Optomech. **2008**, 2, 435–462.
- Hussmann, S.; Ringbeck, T.; Hagebeuker, B. A performance review of 3D TOF vision systems in comparison to stereo vision systems. In Stereo Vision; IntechOpen: London, UK, 2008.
- Zhang, S. High-speed 3D shape measurement with structured light methods: A review. Opt. Lasers Eng. **2018**, 106, 119–131.
- Chen, M.; Tang, Y.; Zou, X.; Huang, Z.; Zhou, H.; Chen, S. 3D global mapping of large-scale unstructured orchard integrating eye-in-hand stereo vision and SLAM. Comput. Electron. Agric. **2021**, 187, 106237.
- Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. **2023**, 211, 118573.
- Navarro, A.V.; Garrido, C.A. Anamorphic Lens. US 9063321 B2, 23 June 2015.
- Lippman, D.H.; Teverovsky, D.S.; Bentley, J.L. Monte Carlo first-order design method for anamorphic cinema zoom lenses. Opt. Eng. **2021**, 60, 051203.
- Xu, C.; Song, W.; Wang, Y. Design of a miniature anamorphic lens with a freeform front group and an aspheric rear group. Opt. Eng. **2021**, 60, 065104.
- Zhang, J.; Wang, X.; Ma, M.; Li, F.; Liu, H.; Cui, H. Camera calibration for anamorphic lenses with three-dimensional targets. Appl. Opt. **2020**, 59, 324–332.
- Dodoc, A. Anamorphic prime and zoom lenses. In Proceedings of Zoom Lenses VI, SPIE, San Diego, CA, USA, 9 September 2019; Volume 11106, pp. 21–40.
- Zhang, J.; Chen, X.; Liu, H.; Li, F.; Sun, X. Thin lens aberrations for anamorphic lenses. Appl. Opt. **2019**, 58, 182–188.
- Soskind, M.; Soskind, Y.G. Propagation invariant laser beams for optical metrology applications. In Proceedings of Modeling Aspects in Optical Metrology V, SPIE, Munich, Germany, 21 June 2015; Volume 9526, pp. 392–398.
- Ma, M.; Shao, H.; Zhang, J.; Wang, X.; Li, G. A Calibration Method of Anamorphic Lens Camera Based on Virtual 3D Target. In Proceedings of the 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China, 8 July 2019.
- Towers, C.E.; Towers, D.P.; Campbell, H.I. Three-dimensional particle imaging by wavefront sensing. Opt. Lett. **2006**, 31, 1220–1222.
- Cao, Z.; Wang, K.; Wu, Q. Aspherical anamorphic lens for shaping laser diode beam. Opt. Commun. **2013**, 305, 53–56.
- Durko, H.L.; Barrett, H.H.; Furenlid, L.R. High-Resolution Anamorphic SPECT Imaging. IEEE Trans. Nucl. Sci. **2014**, 61, 1126–1135.
- Blais, F.; Beraldin, J.-A. Calibration of an anamorphic laser based 3-D range sensor. In Proceedings of SPIE Videometrics V, San Diego, CA, USA, 27 July–1 August 1997; Volume 3174, pp. 113–122.
- Chen, X.; Zhang, J. High-precision anamorphic lens calibration with 3D and 2D calibration targets. Appl. Opt. **2022**, 61, 6062–6075.
- Yuan, S.; Sasian, J. Aberrations of anamorphic optical systems. II. Primary aberration theory for cylindrical anamorphic systems. Appl. Opt. **2009**, 48, 2836–2841.
- Chen, X.; Zhang, J. Lens design for parallel cylindrical anamorphic attachments with finite object distance. Appl. Opt. **2022**, 61, 4610–4619.
- Jinkai, Z.; Chen, X. Paraxial lens design of anamorphic lenses with a fixed anamorphic ratio. OSA Contin. **2019**, 2, 1430–1454.

**Figure 1.** Anamorphic imaging model. (**a**) Imaging rays in the Y_{c}-O_{cy}-Z_{c} plane; (**b**) imaging rays in the X_{c}-O_{cx}-Z_{c} plane.

**Figure 2.** The two anamorphic positions. (**a**) Vertical position and (**b**) horizontal position, which is achieved by rotating the anamorphic lens in the vertical position by −90°.

**Figure 3.**Simulated object points on a spherical surface in anamorphic coordinates of the vertical position.

**Figure 4.**Simulated image points on the image plane. The dot points refer to the image points when the anamorphic lens is in the vertical position, and the circle points refer to the image points when the anamorphic lens is in the horizontal position.

**Figure 5.** Simulated image points after anamorphic ratio (AR) expansion. The dot points are rectified horizontally by AR, and the circle points are rectified vertically by AR.

**Figure 8.** Anamorphic lens composed of a front anamorphic attachment and a rear spherical lens. The anamorphic lens was mounted on a rotary table, which could rotate the anamorphic lens by −90° along the optical axis. (**a**) Side view; (**b**) front view.

**Figure 10.** Anamorphic images. (**a**) Image when the anamorphic lens was in the vertical position, and (**b**) image when the anamorphic lens was in the horizontal position.

**Figure 11.** Anamorphic images after anamorphic ratio (AR) rectification. (**a**) Rectified image when the anamorphic lens was in the vertical position, and (**b**) rectified image when the anamorphic lens was in the horizontal position.

**Figure 12.** Corners after anamorphic ratio (AR) rectification. The red points indicate the corners in Figure 11a, and the green points indicate the corners in Figure 11b. (**a**) Image points in their original positions; (**b**) after the green points are shifted as a whole, the two images almost overlap. This pixel decentering was due to the deviation between the optical axis and the axis of the rotary table.

**Figure 13.** Images for 3D construction when the anamorphic lens was in the vertical position. (**a**,**b**) refer to images of 3D targets, and (**c**–**h**) refer to the images of a 2D target.

**Figure 14.** Images for 3D construction when the anamorphic lens was in the horizontal position. (**a**,**b**) refer to images of 3D targets, and (**c**–**h**) refer to the images of a 2D target.

**Figure 16.** Image showing 5000 reconstructed 3D points for an object point at [500 mm, 500 mm, 1500 mm] with different pixel errors. The standard deviation of the reconstructed 3D points was 17.3378 mm.

**Figure 17.** Standard deviation for object points in a plane. The standard deviations for object points with small Y_{C} coordinates were removed because of their large reconstruction errors. The standard deviation of the pixel position was 0.2 pixels, ad was 30 mm, and Z_{C} was 1500 mm.

**Figure 18.** Standard deviation for object points in a plane. The standard deviations for object points with small Y_{C} coordinates were removed because of their large reconstruction errors. The standard deviation of the pixel position was 0.2 pixels, ad was 100 mm, and Z_{C} was 1500 mm.

| e_{0} (mm) | e_{1} (mm) | e_{2} (mm) | T_{11} (mm) | T_{12} (mm) | T_{2} (mm) |
|---|---|---|---|---|---|
| 16.163062 | 30 | 40 | 4 | 4 | 4 |

| f (mm) | R_{11} (mm) | R_{12} (mm) | R_{13} (mm) | R_{21} (mm) | R_{22} (mm) |
|---|---|---|---|---|---|
| 16 | 155 | 156.4 | −857.6 | −174.8 | 124.5 |

| n_{1} | V_{1} | n_{2} | V_{2} | n_{3} | V_{3} |
|---|---|---|---|---|---|
| 1.516797 | 64.212351 | 1.672702 | 32.17888 | 1.516797 | 64.212351 |

| f_{x} (mm) | f_{y} (mm) | u_{0} (pixel) | v_{0} (pixel) | ad (mm) |
|---|---|---|---|---|
| 12.0520 | 16.1026 | 1.2790 × 10^{3} | 1.0023 × 10^{3} | 26.5516 |

| aa (°) | k_{1} | k_{2} | n_{1} | n_{2} |
|---|---|---|---|---|
| −0.5789 | 0.0304 | −2.7151 × 10^{−5} | 0.0183 | −3.3372 × 10^{−5} |

| m_{1} | m_{2} | x_{21} | x_{12} | x_{03} |
|---|---|---|---|---|
| −0.0155 | 1.2770 × 10^{−5} | 9.1986 × 10^{−5} | −0.0139 | 1.8928 × 10^{−5} |

| y_{30} | y_{21} | y_{12} | y_{03} | x_{20} |
|---|---|---|---|---|
| 1.6727 × 10^{−5} | 0.011 | 1.0611 × 10^{−4} | −0.0023 | 1.3802 × 10^{−4} |

| x_{11} | x_{02} | y_{20} | y_{11} | y_{02} |
|---|---|---|---|---|
| −4.7071 × 10^{−5} | 5.8709 × 10^{−5} | 9.3657 × 10^{−5} | −3.9944 × 10^{−4} | −2.9851 × 10^{−4} |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Chen, X.; Zhang, J.; Xi, J.
3D Metrology Using One Camera with Rotating Anamorphic Lenses. *Sensors* **2022**, *22*, 8407.
https://doi.org/10.3390/s22218407
