# Generating Virtual Images from Oblique Frames


## Abstract


## 1. Introduction

## 2. Background

A well-known example of a multi-head system is the Digital Mapping Camera (DMC®), which has four panchromatic and four multispectral heads. The first step is the laboratory geometric and radiometric calibration of each camera head individually. The positions of the camera perspective centers within the cone are directly measured, but the mounting angles cannot be measured with the required accuracy. These mounting angles are estimated in a bundle adjustment step known as platform calibration. This bundle adjustment uses tie points extracted by image matching in the overlapping areas of the four panchromatic images, with the IOP of each head determined in the laboratory calibration. Transformation parameters are then computed to map each single image to the virtual image frame, and the images are projected to generate a panchromatic virtual image. Finally, the four multispectral images are fused with the high-resolution virtual panchromatic image. This process is accurate but requires laboratory facilities for the first steps.
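The per-head mapping step can be sketched in code. A minimal example, assuming each head's transformation to the virtual image frame has already been reduced to a 3 × 3 projective matrix (the matrix below is a hypothetical stand-in for the platform calibration result):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel coordinates through a 3x3 projective transformation."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]              # back to Euclidean

# Hypothetical homography for one camera head: identity rotation plus an
# offset, standing in for the estimated platform calibration parameters.
H = np.array([[1.0, 0.0, 120.0],
              [0.0, 1.0, -45.0],
              [0.0, 0.0,   1.0]])

corners = np.array([[0.0, 0.0], [4256.0, 0.0],
                    [4256.0, 2848.0], [0.0, 2848.0]])
print(apply_homography(H, corners))
```

In the full pipeline each head has its own matrix, and resampling (with interpolation) is performed in the virtual image frame rather than by mapping corners only.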

#### 2.1. Camera Calibration

The extended collinearity equations relate image and object coordinates:

$$
x_f - x_0 + \delta x_i = -f\,\frac{m_{11}(X - X_0) + m_{12}(Y - Y_0) + m_{13}(Z - Z_0)}{m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)}
$$

$$
y_f - y_0 + \delta y_i = -f\,\frac{m_{21}(X - X_0) + m_{22}(Y - Y_0) + m_{23}(Z - Z_0)}{m_{31}(X - X_0) + m_{32}(Y - Y_0) + m_{33}(Z - Z_0)}
$$

where $x_f$, $y_f$ are the image coordinates and $X$, $Y$, $Z$ the coordinates of the same point in object space; $m_{ij}$ are the rotation matrix elements; $X_0$, $Y_0$, $Z_0$ are the coordinates of the camera perspective center (PC); $x_0$, $y_0$ are the principal point coordinates; $f$ is the camera focal length; and $\delta x_i$, $\delta y_i$ are the effects of the radial and decentering lens distortion and affinity models [19].
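As a sketch, the collinearity projection with a Brown-type radial and decentering distortion model can be written as follows; all numeric parameter values are hypothetical and serve only to illustrate the computation:

```python
import numpy as np

def collinearity_project(Xw, R, PC, f, pp, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Project an object-space point to image coordinates using the
    collinearity equations plus Brown radial/decentering distortion."""
    d = R @ (Xw - PC)                       # rotate into the camera frame
    x = -f * d[0] / d[2]                    # ideal (distortion-free) coords
    y = -f * d[1] / d[2]
    r2 = x * x + y * y                      # squared radial distance
    dx = x * (k1 * r2 + k2 * r2**2) + p1 * (r2 + 2 * x * x) + 2 * p2 * x * y
    dy = y * (k1 * r2 + k2 * r2**2) + p2 * (r2 + 2 * y * y) + 2 * p1 * x * y
    return np.array([pp[0] + x + dx, pp[1] + y + dy])

# Toy example with hypothetical parameters: a level camera 10 m above
# a ground point, f = 28.4 mm, small principal point offset, mild k1.
R = np.eye(3)                               # zero rotation angles
PC = np.array([0.0, 0.0, 10.0])             # perspective center
xy = collinearity_project(np.array([1.0, 2.0, 0.0]), R, PC, f=28.4,
                          pp=(0.1, -0.1), k1=1e-5)
print(xy)
```

The sign convention of the distortion terms varies between formulations; this sketch applies the corrections additively in the image plane.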

In a self-calibrating bundle adjustment with near-vertical images of flat terrain, the focal length and the flying height ($f$ and $Z_0$) are not separable and the system becomes singular or ill-conditioned. In addition to this correlation, the coordinates of the principal point are highly correlated with the perspective center coordinates ($x_0$ and $X_0$; $y_0$ and $Y_0$). To cope with these dependencies, several methods have been proposed, such as the mixed range method [20] and the convergent camera method [17]. The maturity of direct orientation techniques, using dual-frequency Global Navigation Satellite System (GNSS) receivers and Inertial Measurement Units (IMU), makes the robust integration of sensor orientation and calibration feasible. The position of the PC and the camera attitude can be treated as observed values and introduced as constraints in the bundle adjustment, aiming at minimizing the correlation problem previously mentioned. For applications requiring real-time response, such as natural disasters or accidents, Choi and Lee [21] presented a method of real-time aerial triangulation combining direct orientation data gathered from GNSS receivers and an IMU with indirect orientation techniques. The proposed solution considered that the IOP were previously determined. Camera calibration can also be performed considering that the IOP vary for each image, but this option is difficult to handle due to the high correlation between some parameters. To make this technique feasible, Nakano and Chikatsu [22] presented a camera calibration technique that combines a consumer-grade camera with a LASER distance meter aligned with the camera optical axis, without the need for GNSS data or Ground Control Points.

#### 2.2. Multi-Head Camera Calibration

with $\Delta R_{lm}^{(i)}$ being the elements of the relative rotation matrix for an image pair $(i)$ and $\Delta R_{lm}^{(k)}$ those for an image pair $(k)$. The first three independent Equations (2) reflect the assumption of relative rotation stability. The last three Equations (3) are based on the assumption that the base components of two different stereopairs should also be the same. He et al. [24] also used the base distance between the camera perspective centers, directly measured by theodolites, as an additional weighted constraint that defines the scale of the local coordinate system. He et al. [24] did not mention how they treated the stochastic properties of the constraints given in Equation (2). In their original formulation, the base components were defined as the differences between the perspective center coordinates of the two cameras and were not pre-multiplied by the rotation matrix, which restricts the applicability of the proposed method. They computed the IOP and the EOP simultaneously.

The relative orientation between the two heads can be expressed as

$$
R_{RO} = R_{C2}\,R_{C1}^{T}
$$

where $R_{RO}$ is the RO matrix and $R_{C1}$ and $R_{C2}$ are the rotation matrices of cameras 1 and 2, respectively. Other elements that can be considered stable during the acquisition are the Euclidean distance $D$ between the camera perspective centers (the base length) or the base components.
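A small sketch of these stability elements, computed from hypothetical EOP of a dual-head rig at two instants: because the rig rotates and translates as a single body, the RO matrix and the base length stay constant while the individual EOP change.

```python
import numpy as np

def rot_z(deg):
    """Rotation about the Z axis (kappa), in degrees."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def relative_orientation(R1, R2, PC1, PC2):
    """RO matrix R_RO = R_C2 R_C1^T and base length D between the heads."""
    return R2 @ R1.T, np.linalg.norm(PC2 - PC1)

# Hypothetical EOP at instants t and t+1: the whole rig rotates by 45
# degrees and moves, with a fixed 0.5 m base between the two heads.
R1_t,  R2_t  = rot_z(0.0),  rot_z(30.0)
R1_t1, R2_t1 = rot_z(45.0), rot_z(75.0)
PC1_t,  PC2_t  = np.array([0.0, 0.0, 0.0]), np.array([0.5, 0.0, 0.0])
PC1_t1 = np.array([10.0, 2.0, 0.0])
PC2_t1 = PC1_t1 + 0.5 * np.array([np.cos(np.radians(45.0)),
                                  np.sin(np.radians(45.0)), 0.0])

ro_t,  D_t  = relative_orientation(R1_t,  R2_t,  PC1_t,  PC2_t)
ro_t1, D_t1 = relative_orientation(R1_t1, R2_t1, PC1_t1, PC2_t1)
print(np.allclose(ro_t, ro_t1), np.isclose(D_t, D_t1))
```

Only the kappa angle is varied here for brevity; with full omega-phi-kappa rotations the same invariance holds.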

Considering $R_{RO}^{(t)}$ as the RO matrix and $D^{2(t)}$ as the squared distance between the camera perspective centers at instant $t$ and, analogously, $R_{RO}^{(t+1)}$ and $D^{2(t+1)}$ at instant $t+1$, it can be assumed that the RO matrix and the distance between the perspective centers are stable, while admitting some random variations. Based on these assumptions, the following constraint equations can be written:

$$
R_{RO}^{(t)} - R_{RO}^{(t+1)} = V_c, \qquad D^{2(t)} - D^{2(t+1)} = v_{ic}
$$

where each $v_{ic}$ is a residual in the corresponding constraint equation.

The combined least squares model relates observations ($L_a$) and unknowns ($X_a$) in an implicit model, which can be represented as $F(L_a, X_a) = 0$; the linearized model takes the following form [32], with some minor changes in notation:

$$
AX + BV + W = 0
$$

where $A$ and $B$ are the design matrices of partial derivatives of $F$ with respect to the parameters and to the observations, respectively, $V$ is the vector of residuals, $X$ the vector of corrections to the approximate parameters and $W$ the misclosure vector. Writing the constraints as pseudo-observation equations $F_c(L_c^a, X_a) = 0$, the corresponding linearized equations are:

$$
CX + B_c V_c + W' = 0
$$

where $L_c^a$ are "pseudo-observations", $B_c$ is the design matrix of the functions $F_c$ with respect to the pseudo-observations, $C$ is the design matrix of the functions $F_c$ with respect to the parameters and $W'$ is also a misclosure vector.

The solution for the parameter correction vector $X$, added to the approximate values $X_0$ and updated iteratively, is given by:

$$
X = -(N + N_C)^{-1}(U + U_C) \quad (17)
$$

with

$$
N = A^{T}(BQB^{T})^{-1}A, \qquad U = A^{T}(BQB^{T})^{-1}W,
$$

$$
N_C = C^{T}(B_C Q_C B_C^{T})^{-1}C, \qquad U_C = C^{T}(B_C Q_C B_C^{T})^{-1}W'.
$$

In Equation (17), $N_C$ and $U_C$ reflect the effects of the constraints, with $Q$ the cofactor matrix of the observations and $Q_C$ the a priori cofactor matrix of the "pseudo-observations", which is derived by covariance propagation from the expected variations in the relative orientation parameters.
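A toy illustration of this solution, fitting a line to four observations with one loosely weighted pseudo-observation constraint; all matrices below belong to this hypothetical example, not to the calibration problem itself:

```python
import numpy as np

def combined_adjustment(A, B, Q, W, C, Bc, Qc, Wp):
    """Correction vector X = -(N + N_C)^(-1) (U + U_C) of the combined
    least squares model with weighted constraints (Equation (17))."""
    M = np.linalg.inv(B @ Q @ B.T)
    N, U = A.T @ M @ A, A.T @ M @ W
    Mc = np.linalg.inv(Bc @ Qc @ Bc.T)
    Nc, Uc = C.T @ Mc @ C, C.T @ Mc @ Wp
    return -np.linalg.solve(N + Nc, U + Uc)

# Toy linear model F = A X - L = 0: fit y = a*t + b, with a loosely
# weighted pseudo-observation enforcing a - b ~ 0.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.1, 2.9, 4.0])
A = np.column_stack([t, np.ones_like(t)])  # dF/dX
B = -np.eye(4)                             # dF/dL
Q = np.eye(4)                              # cofactor matrix of observations
X0 = np.zeros(2)                           # approximate parameters
W = A @ X0 - y                             # misclosure vector
C = np.array([[1.0, -1.0]])                # constraint a - b = 0
Bc = -np.eye(1)                            # dF_c/dL_c
Qc = np.array([[100.0]])                   # loose a priori cofactor
Wp = C @ X0                                # constraint misclosure

X = X0 + combined_adjustment(A, B, Q, W, C, Bc, Qc, Wp)
print(X)
```

Because this toy model is linear, one iteration converges; the loose cofactor of the pseudo-observation pulls the slope and intercept only slightly toward each other.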

## 3. Methodology

#### 3.1. Camera Calibration with RO Constraints

#### 3.2. Image Rectification

#### 3.3. Image Registration

#### 3.4. Images Fusion

## 4. Experimental Assessment

The estimated standard deviations of the principal point coordinates ($x_0$ and $y_0$) for both cameras are presented for each experiment. It can be seen that similar estimated standard deviations were achieved in all experiments, except when the cameras were calibrated independently (Experiment A1).

in columns ($\sigma_c$) and 1 pixel in rows ($\sigma_r$). It is possible to note that the matching of the rectified image pairs is better when using parameters generated by self-calibration with RO constraints, mainly in Experiments C and D (in which angular variations of 1″ and 10″ were considered, respectively). The effects of varying the weight of the base component constraints were not assessed in these experiments.
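The precision measure used in this assessment reduces to sample standard deviations of the row and column discrepancies of matched tie points; a sketch with synthetic discrepancies (the noise levels are hypothetical):

```python
import numpy as np

# Hypothetical tie-point discrepancies (pixels) between a rectified pair:
# column discrepancies with 0.5 px noise, row discrepancies with 1.0 px.
rng = np.random.default_rng(42)
d_cols = rng.normal(0.0, 0.5, size=200)    # column discrepancies
d_rows = rng.normal(0.0, 1.0, size=200)    # row discrepancies

# Sample standard deviations used as the matching precision measure.
sigma_c, sigma_r = d_cols.std(ddof=1), d_rows.std(ddof=1)
print(f"sigma_c = {sigma_c:.2f} px, sigma_r = {sigma_r:.2f} px")
```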

## 5. Conclusions

## Acknowledgments

## References

- Ruy, R.S.; Tommaselli, A.M.G.; Galo, M.; Hasegawa, J.K.; Reis, T.T. Accuracy Analysis of Modular Aerial Digital System SAAPI in Projects of Large Areas. Proceedings of EuroCow2012—International Calibration and Orientation Workshop, Castelldefels, Spain, 8–10 February 2012.
- Mostafa, M.M.R.; Schwarz, K.-P. A multi-sensor system for airborne image capture and georeferencing. Photogramm. Eng. Remote Sens. 2000, 66, 1417–1423.
- Zeitler, D.W.; Doerstel, C.; Jacobsen, D.K. Geometric Calibration of the DMC: Method and Results. Proceedings of the ISPRS Commission I/Pecora 15 Conference, Denver, CO, USA, 10–15 November 2002; pp. 324–333.
- Hunt, E.R.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.
- Ritchie, G.L.; Sullivan, D.G.; Perry, C.D.; Hook, J.E.; Bednarz, C.W. Preparation of a low-cost digital camera system for remote sensing. Appl. Eng. Agric. 2008, 24, 885–896.
- Chao, H.; Jensen, A.M.; Han, Y.; Chen, Y.; McKee, M. AggieAir: Towards Low-Cost Cooperative Multispectral Remote Sensing Using Small Unmanned Aircraft Systems. In Advances in Geoscience and Remote Sensing; Jedlovec, G., Ed.; InTech: Rijeka, Croatia, 2009.
- Schoonmaker, J.; Podobna, Y.; Boucher, C.; Saggese, S.; Runnels, D. Multichannel imaging in remote sensing. Proc. SPIE 2009.
- Hakala, T.; Suomalainen, J.; Peltoniemi, J.I. Acquisition of bidirectional reflectance factor dataset using a micro unmanned aerial vehicle and a consumer camera. Remote Sens. 2010, 2, 819–832.
- Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral remote sensing from unmanned aircraft: Image processing workflows and applications for rangeland environments. Remote Sens. 2011, 3, 2529–2551.
- D'Oleire-Oltmanns, S.; Marzolff, I.; Peter, K.D.; Ries, J.B. Unmanned Aerial Vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sens. 2012, 4, 3390–3416.
- Grenzdörffer, G.; Niemeyer, F.; Schmidt, F. Development of Four Vision Camera System for a Micro-UAV. Proceedings of the XXII ISPRS Congress, Melbourne, Australia, 25 August–1 September 2012; pp. 369–374.
- Yang, C. A high-resolution airborne four-camera imaging system for agricultural remote sensing. Comput. Electron. Agric. 2012, 88, 13–24.
- Holtkamp, D.J.; Goshtasby, A.A. Precision registration and mosaicking of multicamera images. IEEE Trans. Geosci. Remote Sens. 2009, 47, 3446–3455.
- Petrie, G. Systematic oblique aerial photography using multiple digital frame cameras. Photogramm. Eng. Remote Sens. 2009, 75, 102–107.
- Tommaselli, A.M.G.; Galo, M.; Marcato, J., Jr.; Ruy, R.S.; Lopes, R.F. Registration and Fusion of Multiple Images Acquired with Medium Format Cameras. Proceedings of the Canadian Geomatics Conference 2010 and Symposium of Commission I, Calgary, AB, Canada, 15–18 June 2010.
- Tommaselli, A.M.G.; Moraes, M.V.A.; Marcato Junior, J.; Caldeira, C.R.T.; Lopes, R.F.; Galo, M. Using Relative Orientation Constraints to Produce Virtual Images from Oblique Frames. Proceedings of the XXII ISPRS Congress, Melbourne, Australia, 25 August–1 September 2012; pp. 61–66.
- Brown, D. Close-range camera calibration. Photogramm. Eng. 1971, 37, 855–866.
- Clarke, T.; Fryer, J. The development of camera calibration methods and models. Photogramm. Rec. 1998, 16, 51–66.
- Habib, A.F.; Morgan, M.F. Automatic calibration of low-cost digital cameras. Opt. Eng. 2003, 42, 948–955.
- Merchant, D.C. Analytical Photogrammetry: Theory and Practice; Ohio State University: Columbus, OH, USA, 1979.
- Choi, K.; Lee, I. A sequential aerial triangulation algorithm for real-time georeferencing of image sequences acquired by an airborne multi-sensor system. Remote Sens. 2013, 5, 57–82.
- Nakano, K.; Chikatsu, H. Camera-variant calibration and sensor modeling for practical photogrammetry in archeological sites. Remote Sens. 2011, 3, 554–569.
- Zhuang, H. A self-calibration approach to extrinsic parameter estimation of stereo cameras. Robot. Auton. Syst. 1995, 15, 189–197.
- He, G.; Novak, K.; Feng, W. Stereo camera system calibration with relative orientation constraints. Proc. SPIE 1994.
- King, B.A. Methods for the photogrammetric adjustment of bundles of constrained stereopairs. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 1994, 30, 473–480.
- King, B.A. Bundle adjustment of constrained stereopairs-mathematical models. Geomat. Res. Austr. 1995, 63, 67–92.
- El-Sheimy, N. The Development of VISAT—A Mobile Survey System for GIS Applications. 1996.
- Tommaselli, A.M.G.; Alves, A.O. Calibração de uma Estereocâmara Baseada em Vídeo. In Séries em Ciências Geodésicas—30 Anos de Pós-Graduação em Ciências Geodésicas no Brasil; Universidade Federal do Paraná: Curitiba, PR, Brazil, 2001; Volume 1, pp. 199–213.
- Tommaselli, A.; Galo, M.; Bazan, W.; Ruy, R.; Junior, J.M. Simultaneous Calibration of Multiple Camera Heads with Fixed Base Constraint. Proceedings of the 6th International Symposium on Mobile Mapping Technology, Presidente Prudente, SP, Brazil, 21–24 July 2009.
- Lerma, J.L.; Navarro, S.; Cabrelles, M.; Seguí, A.E. Camera calibration with baseline distance constraints. Photogramm. Rec. 2010, 25, 140–158.
- Blázquez, M.; Colomina, I. Fast AT: A simple procedure for quasi direct orientation. ISPRS J. Photogramm. 2012, 71, 1–11.
- Mikhail, E.M.; Ackermann, F.E. Observations and Least Squares; University Press of America: New York, NY, USA, 1983.

**Figure 1.** Resulting rectified images of dual cameras: (**a**) left image from camera 2; (**b**) right image from camera 1; (**c**) resulting fused image from the two rectified images after registration; and (**d**) the fused image cropped without the borders.

**Figure 3.** (**a**) Image of the calibration field; (**b**) origin of the arbitrary object reference system; and (**c**) existing targets and distances directly measured with a precision calliper for quality control.

**Figure 7.** Standard deviations of the rotation elements of the Relative Rotation matrix computed from the estimated exterior orientation parameters (EOP).

**Figure 8.** (**a**) Set of virtual images used in the fusion experiments; (**b**) reduced set used in the bundle block adjustment.

**Figure 9.** Average values for the standard deviations of discrepancies in tie point coordinates of five rectified image pairs with different sets of Interior Orientation Parameters (IOP) and Relative Orientation Parameters (ROP).

**Figure 10.** Average values for the standard deviations of discrepancies in tie point coordinates of five rectified image pairs with different sets of IOP and ROP, after scale change in the right image.

**Figure 11.** Distribution of ground control points (triangles) and check points (circles) in the experiment with bundle adjustment.

**Figure 12.** RMSE in the check point coordinates obtained in a bundle adjustment with three virtual images generated with parameters obtained in the experiments: (**a**) without and (**b**) with scale correction in the right image.

| Cameras | Fuji S3 Pro |
|---|---|
| Sensor | CCD, 23.0 × 15.5 mm |
| Number of pixels | 4,256 × 2,848 (12 MP) |
| Pixel size (mm) | 0.0054 |
| Focal length (mm) | 28.4 |

| Experiment | A1 and A2 | B | C | D | E | F | G |
|---|---|---|---|---|---|---|---|
| RO constraints | Single camera calib. | N | Y | Y | Y | Y | Y |
| Variation of the RO angular elements | – | – | 1″ | 10″ | 15″ | 30″ | 1′ |
| Variation of the base components (mm) | – | – | 1 | 1 | 1 | 1 | 1 |

## Share and Cite

**MDPI and ACS Style**

Tommaselli, A.M.G.; Galo, M.; De Moraes, M.V.A.; Marcato, J., Jr.; Caldeira, C.R.T.; Lopes, R.F.
Generating Virtual Images from Oblique Frames. *Remote Sens.* **2013**, *5*, 1875-1893.
https://doi.org/10.3390/rs5041875
