# Global Calibration of Multi-Cameras Based on Refractive Projection and Ray Tracing


## Abstract


## 1. Introduction

## 2. Mathematical Model of Camera and Ray Tracing

#### 2.1. Camera Model

#### 2.2. Refractive Projection and Ray Tracing

- The procedure is initialized with $k=1$. ${r}_{1}^{k}$ denotes the direction of the line connecting the camera center ${X}_{C}$ and the 3D point $P$; its intersection with ${S}_{1}$ is the point ${X}_{{i}_{1}}^{k}$.
- When the refractive indices ${n}_{1}$ and ${n}_{2}$ are known, ${r}_{2}^{k}$ is obtained from Equation (8); it intersects ${S}_{2}$ at the point ${X}_{{i}_{2}}^{k}$.
- The ray $-{r}_{2}^{k}$ is projected from $P$ toward the interface ${S}_{1}$; it is parallel to ${r}_{2}^{k}$ but opposite in direction.
- Finally, the ray $-{r}_{2}^{k}$ intersects ${S}_{1}$ at the point ${X}_{{i}_{1}}^{\prime k}$.
- If the distance $\mathrm{\Delta}{X}_{{i}_{1}}^{k}=\left|{X}_{{i}_{1}}^{k}-{X}_{{i}_{1}}^{\prime k}\right|$ between ${X}_{{i}_{1}}^{k}$ and ${X}_{{i}_{1}}^{\prime k}$ exceeds the tolerance, the above steps are repeated with ${X}_{{i}_{1}}^{k+1}=\frac{1}{2}\left({X}_{{i}_{1}}^{k}+{X}_{{i}_{1}}^{\prime k}\right)$. Otherwise, the intersection of ${r}_{1}^{k}$ and ${S}_{1}$ has converged to the optimal solution.
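The iterative search above can be sketched in a few lines of Python. This is an illustrative implementation, not the paper's code: it assumes the interface ${S}_{1}$ is a plane given by a point and a unit normal, and it uses the standard vector form of Snell's law in place of Equation (8), which is not reproduced here. The function names (`refract`, `plane_hit`, `refraction_point`) are hypothetical helpers.

```python
import numpy as np

def refract(d, n_hat, n1, n2):
    """Vector form of Snell's law. d is the incident direction,
    n_hat a unit normal pointing against the incoming ray."""
    d = d / np.linalg.norm(d)
    mu = n1 / n2
    cos_i = -np.dot(n_hat, d)
    cos_t = np.sqrt(1.0 - mu**2 * (1.0 - cos_i**2))
    return mu * d + (mu * cos_i - cos_t) * n_hat

def plane_hit(origin, d, p0, n_hat):
    """Intersection of the ray origin + t*d with the plane (p0, n_hat)."""
    t = np.dot(p0 - origin, n_hat) / np.dot(d, n_hat)
    return origin + t * d

def refraction_point(Xc, P, s1_p0, s1_n, n1, n2, tol=1e-9, max_iter=100):
    """Iteratively locate the point on S1 where the ray leaving the
    camera center Xc refracts toward the 3D point P (Section 2.2)."""
    X = plane_hit(Xc, P - Xc, s1_p0, s1_n)      # k = 1: straight-line guess
    for _ in range(max_iter):
        r2 = refract(X - Xc, s1_n, n1, n2)      # refracted direction at S1
        Xp = plane_hit(P, -r2, s1_p0, s1_n)     # shoot -r2 from P back to S1
        if np.linalg.norm(X - Xp) < tol:        # |X_i1 - X'_i1| below tolerance
            break
        X = 0.5 * (X + Xp)                      # midpoint update for k+1
    return X
```

At convergence, the refracted ray at the returned point passes through $P$, which is exactly the stopping criterion of the bulleted procedure.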

## 3. The Proposed Calibration Method

#### 3.1. Multi-Camera Calibration Based on Refractive Projection

#### 3.2. Solving Intrinsic Camera Parameters and Initial Estimation of Extrinsic Camera Parameters

#### 3.3. Summary

1. Multiple cameras are installed so that their fields of view (FOV) cover the same area of the calibration target simultaneously. The intrinsic parameters and distortion coefficients of each camera are calibrated independently.
2. In the overlapping FOV of the MCS, the cameras acquire images of the calibration target from different orientations; the images captured by each camera contain either the front or the back of the target.
3. The extrinsic parameters of each camera relative to its own WCS are obtained with the DLT method or the theory of multi-layer flat refractive geometry, and are then unified to the master WCS. The rotation and translation of each camera relative to the master camera are given by Equations (12) and (13).
4. The extrinsic parameters of the system and the refractive index of the glass are optimized with the bundle adjustment method and the refractive projection model.
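The unification in step (3) amounts to chaining rigid transforms. A minimal sketch, assuming the common convention ${x}_{cam}=R\,{x}_{world}+t$ for each camera's extrinsics relative to the master WCS; the paper's Equations (12) and (13) are not reproduced here, and the function name below is a hypothetical helper.

```python
import numpy as np

def relative_pose(R_i, t_i, R_m, t_m):
    """Rotation and translation of camera i relative to the master
    camera, given each camera's extrinsics (R, t) with respect to
    the master WCS under the convention x_cam = R @ x_world + t."""
    R_im = R_i @ R_m.T           # maps master-camera frame to camera-i frame
    t_im = t_i - R_im @ t_m      # translation expressed in camera-i frame
    return R_im, t_im
```

These relative poses serve as the initial estimate that the bundle adjustment in step (4) refines together with the refractive index.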

## 4. Experiments and Discussion

#### 4.1. Synthetic Data

#### 4.2. Real Data

The distance between adjacent points is 12 mm in both the horizontal and the vertical directions. The checkerboard pattern is printed on one side of the glass calibration plate with a positioning accuracy of 0.0015 mm.

#### 4.3. Discussion

## 5. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 6.** The relative error of extrinsic parameters for one camera without refraction estimation. (**a**) Relative error for the rotation vector; (**b**) relative error for the translation vector.

**Figure 7.** The relative error of extrinsic parameters for one camera with refraction estimation. (**a**) Relative error for the rotation vector; (**b**) relative error for the translation vector; (**c**) relative error for the refraction index.

**Figure 8.** The relative error of extrinsic parameters for binocular cameras with refraction estimation. (**a**) Relative error for the rotation vector of the left camera; (**b**) relative error for the translation vector of the left camera; (**c**) relative error for the rotation vector of the left and right camera; (**d**) relative error for the translation vector of the left and right camera; (**e**) relative error for the refraction index.

| | Camera 1 | Camera 2 | Camera 3 | Camera 4 | Uncertainty (3σ) |
|---|---|---|---|---|---|
| Focal length | $\left[\begin{array}{c}2618.29\\ 2618.20\end{array}\right]$ | $\left[\begin{array}{c}2625.76\\ 2625.61\end{array}\right]$ | $\left[\begin{array}{c}2617.17\\ 2616.88\end{array}\right]$ | $\left[\begin{array}{c}2620.34\\ 2620.35\end{array}\right]$ | $\left[\begin{array}{c}0.49\\ 0.44\end{array}\right]$ |
| Principal point | $\left[\begin{array}{c}1290.91\\ 1014.72\end{array}\right]$ | $\left[\begin{array}{c}1286.45\\ 1001.44\end{array}\right]$ | $\left[\begin{array}{c}1255.36\\ 1026.86\end{array}\right]$ | $\left[\begin{array}{c}1293.70\\ 1006.56\end{array}\right]$ | $\left[\begin{array}{c}0.80\\ 0.73\end{array}\right]$ |
| Distortion (${k}_{1}$, ${k}_{2}$) | $\left[\begin{array}{c}-0.1338\\ 0.1326\end{array}\right]$ | $\left[\begin{array}{c}-0.1356\\ 0.1462\end{array}\right]$ | $\left[\begin{array}{c}-0.1332\\ 0.1360\end{array}\right]$ | $\left[\begin{array}{c}-0.1324\\ 0.1344\end{array}\right]$ | $\left[\begin{array}{c}0.0008\\ 0.0036\end{array}\right]$ |

| | Camera 2-1 | Camera 3-1 | Camera 4-1 | Uncertainty (3σ) |
|---|---|---|---|---|
| Rotation vector | $\left[\begin{array}{c}0.0892\\ 0.7389\\ 0.0365\end{array}\right]$ | $\left[\begin{array}{c}0.1479\\ 3.0076\\ 0.2330\end{array}\right]$ | $\left[\begin{array}{c}0.0212\\ -2.3761\\ -0.1040\end{array}\right]$ | $\left[\begin{array}{c}0.0018\\ 0.0026\\ 0.0009\end{array}\right]$ |
| Translation vector | $\left[\begin{array}{c}-248.9713\\ 1.0768\\ 93.0314\end{array}\right]$ | $\left[\begin{array}{c}-4.3414\\ -81.7511\\ 766.3633\end{array}\right]$ | $\left[\begin{array}{c}274.9126\\ -42.0589\\ 641.0544\end{array}\right]$ | $\left[\begin{array}{c}0.1858\\ 0.1280\\ 0.3189\end{array}\right]$ |

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Feng, M.; Jia, X.; Wang, J.; Feng, S.; Zheng, T. Global Calibration of Multi-Cameras Based on Refractive Projection and Ray Tracing. *Sensors* **2017**, *17*, 2494.
https://doi.org/10.3390/s17112494
