Article

Camera Calibration for Water-Biota Research: The Projected Area of Vegetation

School of Civil and Building Engineering, Loughborough University, Loughborough, Leicestershire LE11 3TU, UK
*
Author to whom correspondence should be addressed.
Sensors 2015, 15(12), 30261-30269; https://doi.org/10.3390/s151229798
Submission received: 8 October 2015 / Revised: 27 November 2015 / Accepted: 30 November 2015 / Published: 3 December 2015

Abstract

Imaging systems have an indisputable role in revealing vegetation posture under diverse flow conditions, with image sequences generated by off-the-shelf digital cameras. Such sensors are cheap but introduce a range of distortion effects, a trait only marginally tackled in hydraulic studies focusing on water-vegetation dependencies. This paper aims to bridge this gap by presenting a simple calibration method to remove both camera lens distortion and the refractive effects of water. The effectiveness of the method is illustrated using the variable projected area, computed for both simple and complex shaped objects. Results demonstrate the significance of correcting images using a combined lens distortion and refraction model prior to determining projected areas and further data analysis. Use of this technique is expected to increase data reliability for future work on vegetated channels.

1. Introduction

The capability of aquatic plants to deform and “reconfigure” is critical to the functioning of lotic ecosystems [1,2]. Specifically, adverse effects imposed by these barriers (in terms of flow resistance) are counterbalanced by a variety of ecosystem services associated with plant motion, namely regulating services [3]. Thus, some authors have sought to quantify plants’ morphology as a way to assess the performance of different species [4].
Sagnes [4] describes the technical challenges of quantifying the frontal area of a plant, identifying that "the projected frontal surface area (Af) captures flow-induced shape variation and is seemingly the most realistic physical description". Different setups and image perspectives have been adopted to estimate Af or equivalent descriptors, ranging from mirrors attached to the bottom of laboratory facilities or in situ environments combined with top-view images from regular cameras [4,5,6], to images acquired in still air [7] and water [8,9], to submerged digital cameras aligned with the plant mass centre [10,11]. Provided light absorption or scattering is not dominant during image acquisition, underwater techniques offer the only opportunity to accurately inspect the morphological reconfiguration of vegetation specimens in the field. Nevertheless, non-metric sensors such as consumer-grade digital cameras do not possess, as opposed to photogrammetric or metric cameras, a calibration certificate. This demands deriving a set of parameters describing the internal geometry of the imaging system (e.g., focal length, principal point offset, and radial and tangential lens distortion) [12]. This step is crucial if precise spatial information is to be extracted, and is typically carried out through a process known as "self-calibrating bundle adjustment" [13,14]. The impact of lens distortion on subsequent measurements has been mentioned previously in vegetation studies, but has been neither thoroughly investigated nor quantified. For instance, Jalonen et al. [11] identified that scaling errors can distort the estimated projected area by up to 10%; however, these results plausibly include a combination of errors caused by scale constraints and uncorrected lens distortion.
Even the work conducted by Sagnes [4], possibly the most comprehensive on this topic in the literature, overlooks this aspect. Our belief is that this is mainly a consequence of user unawareness, either of imaging geometry or of procedures to appropriately calibrate non-metric imagery. Whittaker [15] states that in the absence of a known focal length, distortion effects cannot be scrutinized, and Wunder et al. [10] assumed, without apparent justification, that camera distortion effects were minimized in their work. Bearing these considerations in mind, this paper presents a method based on well-established photogrammetric principles to eliminate lens distortion in both dry and wet environments, and compares projected areas obtained using non-calibrated and calibrated cameras. Our work demonstrates that a simple methodology, easily adoptable by experimentalists, allows for effective camera calibration, thus enabling refinement of existing experimental protocols, particularly those prevailing in laboratory-based activities. The present analysis is restricted to the projected area due to its relevance in aquatic studies (e.g., in evaluating the drag coefficient), but the conclusions stemming from this work are equally valid for other morphological studies using similar imaging systems.
Tests performed for this work are explained in the next section. Afterwards, the camera calibration procedure is described. Finally, results are presented and some conclusions drawn.

2. Experimental Setup

Three different experimental setups were employed to determine the projected area of an object in dry conditions and in both submerged static and submerged flow conditions (discharge: 0.124 m³·s⁻¹, water depth: 0.275 m, flume length: 5.24 m, flume width: 0.915 m). The objects evaluated in these practical applications included a simple metal cube, which provided an accurate reference area (0.01055 m²), and a real plant (bush species: Buxus sempervirens, height: 0.20 m). In all these measurements, distances between photogrammetric targets attached to a wooden frame (Figure 1) were determined using a vernier calliper and used for scaling when calculating Af from the digital imagery. The target frame was located in the same plane as the front of the test object. A video sequence of each object was acquired using an underwater endoscope camera (Figure 2) at an object-to-camera distance of 0.7 m, oriented approximately perpendicular to the metal cube and vegetation bodies.
Figure 1. Metal cube and target frame used to provide a reference area (Left) and respective dimensions of the cube (Right).
Figure 2. Underwater endoscope camera (resolution: 640 × 480 pixels; price July 2013: £25).
The first trial was conducted under dry conditions and provided the opportunity to test the methodology without the additional distortion caused by light rays refracting as they pass through water. For this trial, a DSLR camera (Nikon D80, resolution: 3872 × 2592 pixels), shown in the past to be suitable for accurate photogrammetric measurement [16], was also employed so that images from the two cameras could be compared. Use of a plastic water tank (submerged static conditions) offered a controlled environment to calibrate the underwater camera and assess whether the lens distortion and refractive effects due to the water could be accurately modelled (Figure 3). Results achieved using the plastic water tank encouraged a further test to determine the projected area of both objects, i.e., the cube and the bush, under flow conditions in an open-channel flume (Figure 4).
Figure 3. Plastic water tank setup.
Figure 4. Open-channel flume.
For the three cases mentioned above (still air, unstressed and stressed conditions), a Matlab routine was developed to manipulate and measure the images containing the object and the target frame (for stressed flow conditions, an image was arbitrarily selected from the video footage). After reading the image file, a Matlab function was used to measure the distance between two photogrammetric targets in image space. This measured distance and the corresponding distance in object space were used to calculate an image scale factor. The routine then converts the RGB image to a binary image using a simple two-class threshold: pixels in the region of interest (i.e., pixels representing the cube or the bush) are set to white, whilst all other pixels are set to black (Figure 5). The white pixels were counted automatically and the area quantified using the image scale factor. Image thresholding is inevitably affected by some degree of uncertainty (for example, Af is slightly overestimated in Figure 5); hence, in practical terms, the threshold value should be adjusted systematically until the desired classification is reached.
Figure 5. Bush and reference frame used for area determination (Left) and binary image obtained from Matlab (Right).
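The pipeline above (threshold to binary, count object pixels, scale by the target-frame distance) can be sketched as follows. This is a minimal Python/NumPy illustration of the idea, not the authors' Matlab routine; the function name, the luminance proxy and the dark-object convention are our own assumptions.

```python
import numpy as np

def projected_area(rgb, threshold, target_px_dist, target_mm_dist):
    """Estimate a projected area from an RGB image via binary thresholding.

    rgb            : H x W x 3 uint8 array
    threshold      : grey-level cut-off separating object from background
    target_px_dist : distance between two photogrammetric targets, in pixels
    target_mm_dist : the same distance measured with the calliper, in mm
    Returns the area in m^2.
    """
    grey = rgb.mean(axis=2)                   # simple luminance proxy
    binary = grey < threshold                 # dark object pixels -> True
    scale = target_mm_dist / target_px_dist   # image scale factor [mm/pixel]
    area_mm2 = binary.sum() * scale ** 2      # each pixel covers scale^2 mm^2
    return area_mm2 / 1e6                     # mm^2 -> m^2
```

In practice the threshold would be tuned iteratively, exactly as described above, until the binary image matches the object outline.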
It needs to be recognized that most cameras are not designed for accurate photogrammetric measurement [17]. Camera lenses exhibit significant distortion, which degrades the achievable accuracy in object space [18]; additionally, in our study, the distortion characteristics of the endoscope camera change radically when it is used underwater, as a result of refraction. Such imaging sensors can be calibrated to minimize the combined effect of these two phenomena, i.e., lens distortion and refraction, for a specific camera-to-object distance. Routinely, this is done by assuming that both components are implicitly absorbed in the distortion terms of the functional model known as the extended collinearity equations [19,20]. The camera calibration process used prior to computing Af in both the dry and underwater studies constitutes the core of this work and is described in the following section.

3. Camera Calibration

The extended collinearity equations provide a framework to directly transform object coordinates into the corresponding photo coordinates [21,22]:

$$x'_a = x_p - c \cdot \frac{r_{11}(X_0 - X_A) + r_{12}(Y_0 - Y_A) + r_{13}(Z_0 - Z_A)}{r_{31}(X_0 - X_A) + r_{32}(Y_0 - Y_A) + r_{33}(Z_0 - Z_A)} + \Delta x'$$

$$y'_a = y_p - c \cdot \frac{r_{21}(X_0 - X_A) + r_{22}(Y_0 - Y_A) + r_{23}(Z_0 - Z_A)}{r_{31}(X_0 - X_A) + r_{32}(Y_0 - Y_A) + r_{33}(Z_0 - Z_A)} + \Delta y' \qquad (1)$$
where (x′a, y′a) and (XA, YA, ZA) represent the coordinates of a generic point A in image and object space, respectively, (xp, yp) are the principal point coordinates, (X0, Y0, Z0) are the coordinates of the perspective centre in object space, rij (with i,j = 1,2,3) are the elements of a rotation matrix, c is the principal distance, and Δx′ and Δy′ are photo coordinate corrections for the combined (radial and decentring) lens distortion. The combined lens distortion terms can be represented by the equations [12]:
$$\Delta x' = \Delta x_{rad} + \Delta x_{dec} = \frac{x\,\Delta r_{rad}}{r} + P_1\!\left(r^2 + 2x^2\right) + 2P_2\,x\,y$$

$$\Delta y' = \Delta y_{rad} + \Delta y_{dec} = \frac{y\,\Delta r_{rad}}{r} + P_2\!\left(r^2 + 2y^2\right) + 2P_1\,x\,y \qquad (2)$$

$$\Delta r_{rad} = K_1 r^3 + K_2 r^5 + K_3 r^7$$
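Equations (2) can be evaluated directly once the calibration coefficients are known. The following Python sketch (our own illustration; the coefficient names mirror the equations) returns the combined radial and decentring corrections for image coordinates taken relative to the principal point:

```python
import numpy as np

def lens_corrections(x, y, K1, K2, K3, P1, P2):
    """Combined radial + decentring corrections (dx, dy) per Equations (2).

    x, y : image coordinates relative to the principal point
           (scalars or NumPy arrays, same units as the radial distance r).
    """
    r2 = x * x + y * y
    r = np.sqrt(r2)
    dr_rad = K1 * r**3 + K2 * r**5 + K3 * r**7   # radial distortion profile
    # The radial term is x * dr_rad / r; guard r = 0, where it vanishes.
    with np.errstate(invalid="ignore", divide="ignore"):
        fac = np.where(r > 0, dr_rad / r, 0.0)
    dx = x * fac + P1 * (r2 + 2 * x * x) + 2 * P2 * x * y
    dy = y * fac + P2 * (r2 + 2 * y * y) + 2 * P1 * x * y
    return dx, dy
```

For a purely radial case (P1 = P2 = 0) the correction acts along the radius, as expected from the Δr_rad term.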
Both the camera exterior orientation (defined by X0, Y0, Z0, and rij) and interior orientation (comprising xp, yp, c, Δx′, and Δy′) are typically obtained through a bundle adjustment [23]. Over recent years, continued advances in digital photogrammetry have broadened its range of applications. In particular, automated image-processing algorithms have reduced the expertise needed to handle photogrammetric projects, which makes photogrammetry a promising solution for hydraulicians studying physical processes with the aid of imaging systems [13]. With these considerations in mind, the PhotoModeler Scanner software (64 bit) [24] was selected to calibrate the two cameras. PhotoModeler models the radial and decentring lens distortions through Equations (2). As an output, the software provides quality indicators (average and maximum residuals) which are extremely useful for judging the overall accuracy of the derived calibration data.
Figure 6 represents the image configuration (image frames 1 to 12) and the calibration board used to determine the camera calibration parameters for the D80 camera and for the underwater probe (in both dry and submerged conditions). The calibration board consisted of 49 coded targets generated by PhotoModeler Scanner. Twelve images of the board were captured, with three frames rotated by 90 degrees (frames 2, 4, and 6 in Figure 6) to enable estimation of the principal point offset (xp, yp) of the camera [17,25]. The camera-to-object distance was set to 0.7 m, the same distance used when imaging the metal cube and the bush. The image files were subsequently uploaded to a PC and processed using the camera calibration tool in PhotoModeler Scanner. Finally, the camera models determined for the D80 camera and the underwater probe were applied to remove the distortion effects from the recorded images. PhotoModeler provides the option to use the estimated camera parameters to produce an undistorted or "idealized" image: during idealization, the software re-maps the image pixel by pixel and removes lens distortion, any principal point offset, and any non-square pixels [24] (Figure 7). The effect of camera calibration on the computation of the projected area is explored in the following section.
Figure 6. Camera calibration configuration—Note that two distinct environments were considered at this phase: dry and wet (using the plastic water tank) to fully consider the fluid at the camera’s interface during area assessment.
Figure 7. Black metal cube image using the endoscope camera in the plastic water tank.
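The pixel-by-pixel "idealization" step can be illustrated with a much-simplified sketch: for every pixel of the ideal output image, evaluate where the distorted input image stored that content and copy the value back (inverse mapping with nearest-neighbour sampling). This Python example assumes a single-term radial model only; PhotoModeler's actual idealization uses the full calibrated camera model, and the function below is purely our illustration.

```python
import numpy as np

def idealize(img, xp, yp, K1):
    """Remove simple radial distortion from a greyscale image (sketch).

    img    : 2-D NumPy array (greyscale image)
    xp, yp : principal point in pixel coordinates
    K1     : first radial distortion coefficient (pixel units)
    """
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    x, y = xs - xp, ys - yp            # centre coordinates on principal point
    r = np.hypot(x, y)
    # Inverse mapping: ideal pixel -> its location in the distorted image.
    xs_src = xp + x * (1 + K1 * r**2)
    ys_src = yp + y * (1 + K1 * r**2)
    xs_src = np.clip(np.rint(xs_src), 0, w - 1).astype(int)
    ys_src = np.clip(np.rint(ys_src), 0, h - 1).astype(int)
    return img[ys_src, xs_src]         # nearest-neighbour resampling
```

With K1 = 0 the mapping is the identity and the image is returned unchanged, which provides a quick sanity check.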

4. Results and Discussion

4.1. Dry Case

The projected area of a metal cube with known dimensions and a real bush were determined under dry conditions using the endoscope camera and a Nikon D80 DSLR camera normally used for spatial measurement. Table 1 summarizes the estimated cube areas using the two cameras. The first column contains the calibration status of the cameras, whilst the second column tabulates the determined cube area using images acquired with the Nikon camera. The percentage error of the cube area obtained with the Nikon camera is identified in column three and the final two columns represent the cube area and the percentage error determined using the underwater endoscope camera. It should be emphasized that the percentage error was computed as:
$$\text{error}\ [\%] = \frac{\left|\,\text{area}_{\text{predicted}} - \text{area}_{\text{actual}}\,\right|}{\text{area}_{\text{actual}}} \times 100$$
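As a one-line check of the tabulated values, the error formula can be computed directly (a trivial Python sketch; the function name is ours):

```python
def percentage_error(area_predicted, area_actual):
    """Percentage error of a computed area against the reference area."""
    return abs(area_predicted - area_actual) / area_actual * 100

# Reproducing Table 1, dry case: reference cube area is 0.01055 m^2.
# percentage_error(0.01063, 0.01055) -> 0.8 % (non-calibrated D80)
# percentage_error(0.01040, 0.01055) -> 1.4 % (non-calibrated endoscope)
```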
Both cameras achieved similar results when calibrated (percentage error of 0.1%). The Nikon D80 camera attained a percentage error of 0.8% when camera calibration parameters were not considered, whilst the determined percentage error of the underwater endoscope camera was 1.4%. The performance of both cameras is mainly affected by lens distortion, which evidently is of a different magnitude in these two cases. Nevertheless, results demonstrate that both camera lenses are able to derive an accurate area in dry conditions, if appropriately calibrated.
The metal cube was then exchanged for the bush and results are presented in Table 2. Computed areas at this stage can only be compared with each other, as no "true" area is available. Results reinforce the viability of using this particular endoscope camera to obtain accurate estimates of the projected area once lens distortion is considered: the determined areas differed by 2.7% using images from non-calibrated cameras, but agreed to within 0.3% when lens distortion was accounted for.
Table 1. Metal cube area.
Camera Calibration   | Area D80 Camera [m²] | Error D80 Camera [%] | Area Endoscope Camera [m²] | Error Endoscope Camera [%]
Not calibrated dry   | 0.01063              | 0.8                  | 0.01040                    | 1.4
Calibrated dry       | 0.01056              | 0.1                  | 0.01056                    | 0.1
Not calibrated tank  |                      |                      | 0.0096                     | 9.0
Calibrated tank      |                      |                      | 0.0104                     | 1.4
Not calibrated flume |                      |                      | 0.0098                     | 7.1
Calibrated flume     |                      |                      | 0.0107                     | 1.4
Table 2. Bush area dry condition.
Camera Calibration | Area D80 Camera [m²] | Area Endoscope Camera [m²] | Difference D80-Endoscope [%]
Not calibrated dry | 0.0324               | 0.0333                     | 2.7
Calibrated dry     | 0.0318               | 0.0317                     | 0.3

4.2. Plastic Water Tank

In the presence of static water, distortions are expected to increase since light paths are refracted twice in the vicinity of camera lenses. Again, underwater images of the metal cube were acquired and results are shown in Table 1. Images not corrected for lens distortion exhibit a marked difference to the known metal cube area (error of 9%). However, the percentage error between the computed area and the reference area is reduced to just 1.4% when distortion effects are modelled using the radial lens parameters. This can dramatically reduce the uncertainty of image analysis in these conditions.
The projected area determined for the bush using the endoscope camera image without a lens model diverged from the bush area in dry conditions by 12.3% (Table 3). This error was reduced to just 1.6% when lens distortion was considered. These areas are usually assumed to be coincident since buoyancy effects are taken to be negligible [15]. This is also likely to be true in our case due to the high flexural rigidity of the vegetation stems. Consequently, we hypothesize that this small difference is related to minor experimental errors, e.g., an imperfect alignment of the underwater camera.
Table 3. Bush area using the endoscope camera in the plastic water tank.
Camera Calibration  | Area Bush Submerged [m²] | Area Bush Dry [m²] | Difference Submerged-Dry [%]
Not calibrated tank | 0.0292                   | 0.0333             | 12.3
Calibrated tank     | 0.0312                   | 0.0317             | 1.6

4.3. Open-Channel Flume

This test was conducted under conditions similar to those found in a field environment, especially with respect to water clarity. The water discharge was 0.124 m³·s⁻¹, and use of the underwater camera had the additional advantage of allowing reduced object-to-camera distances with minimal flow disturbance. For the metal cube, a noteworthy discrepancy was found when calibration was ignored, visible as a substantial departure (7.1%) from the cube reference area (Table 1). Once again, inclusion of a lens model significantly improved the area estimate (the calibrated and non-calibrated areas differ by 8.4%). A similar conclusion was reached for the vegetation specimen (Table 4): when distortion effects are compensated, the projected area determined in flow conditions is 5% greater than the uncorrected estimate. Moreover, the area difference between flowing and dry conditions ranges from 17.3% (not calibrated) to 5.7% (calibrated). This finding is significant since it expresses the morphological adjustment of the specimen as water flows over and around the plant in "stressed" conditions.
Table 4. Bush area using the endoscope camera in the flume.
Camera Calibration                       | Area Bush Submerged [m²] | Area Bush Dry [m²] | Difference Flowing-Dry [%]
Not calibrated flume                     | 0.0284                   | 0.0333             | 17.3
Calibrated flume                         | 0.0299                   | 0.0317             | 5.7
Difference calibrated-not calibrated [%] | 5.0                      | 4.9                |
This trial demonstrated that image acquisition can be problematic in a real river environment. Due to low illumination of the lower parts of the vegetation specimen, external lighting sources had to be used to improve illumination to a suitable level for image processing. Additionally, a high suspended sediment load in the flume appeared to reduce image quality, although not to a level to affect image processing.

5. Conclusions

Imaging systems are increasingly used by experimentalists, due to their ability to clarify certain aspects of flow-vegetation interactions. This, together with the recognition that calibration of non-metric cameras is vital to extracting reliable spatial data [13,14], inspired the present work. The magnitude of lens distortion depends on several factors, namely the focus setting of the lens, the camera depth of field, the medium of data acquisition, and the lens itself. In other words, lens distortion and/or refraction effects will always be present, to a greater or lesser extent, when image-based approaches are used. Considering the two most demanding arrangements used in this study (i.e., results obtained with the underwater endoscope in the tank and in stressed flow conditions) and their worst-case scenarios, failure to consider camera calibration would lead to errors of 9.0% and 12.3% (cube and bush in the tank, respectively), 7.1% (cube in the open-channel flume), and 5% (bush in the open-channel flume). Distortions are clearly case dependent, so a sound calibration procedure such as the one presented here is highly convenient, avoiding simplistic procedures for evaluating lens distortion magnitude (such as the one suggested by Sagnes [4]). Our results illustrate the need to consider these distortion effects explicitly, especially in flume and field studies. This will contribute to the refinement of current experimental practices, particularly in research on vegetated flows, which is largely focussed on the laboratory scale. The requirement is expected to be even greater in turbid waters, where short focal distances will be needed to attain optimum results, and consequently larger distortions will be created.
Although recognizing the existence of other methods to deal with this subject, e.g., the ray tracing approach [20], the authors believe that the approach described above constitutes a valuable starting point for experimentalists whenever environmental conditions (e.g., light and turbidity) are favourable. This can now be accomplished in a relatively straightforward manner by making use of specialized digital photogrammetry tools.

Acknowledgments

The authors thank Nick Rodgers for his assistance during the experiments. This work was conducted with funding provided by the UK's Engineering and Physical Sciences Research Council (Grant EP/K004891/1).

Author Contributions

All authors have contributed equally in terms of data acquisition, analysis, and writing of this research contribution.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Marion, A.; Nikora, V.; Puijalon, S.; Koll, K.; Ballio, F.; Tait, S.; Zaramella, M.; Sukhodolov, S.; O’Hare, M.; Wharton, G.; et al. Aquatic interfaces: A hydrodynamic and ecological perspective. J. Hydraul. Res. 2014, 52, 744–758.
  2. Puijalon, S.; Bornette, G.; Sagnes, P. Adaptations to increasing hydraulic stress: Morphology, hydrodynamics and fitness of two higher aquatic plant species. J. Exp. Bot. 2005, 56, 777–786.
  3. Luhar, M.; Nepf, H. Flow induced reconfiguration of buoyant and flexible aquatic vegetation. Limnol. Oceanogr. 2011, 56, 2003–2017.
  4. Sagnes, P. Using multiple scales to estimate the projected frontal surface area of complex three-dimensional shapes such as flexible freshwater macrophytes at different flow conditions. Limnol. Oceanogr. Methods 2010, 8, 474–483.
  5. Stazner, B.; Lamouroux, N.; Nikora, V.; Sagnes, P. The debate about drag and reconfiguration of freshwater macrophytes: Comparing results obtained by three recently discussed approaches. Freshw. Biol. 2006, 51, 2173–2183.
  6. Neumeier, U. Quantification of vertical density variations of salt-marsh vegetation. Estuar. Coast. Shelf Sci. 2005, 63, 489–496.
  7. Pavlis, M.; Kane, B.; Harris, J.R.; Seiler, J.R. The effects of pruning on drag and bending moment of shade trees. Arboric. Urban For. 2008, 34, 207–215.
  8. Armanini, A.; Righetti, M.; Grisenti, P. Direct measurement of vegetation resistance in prototype scale. J. Hydraul. Res. 2005, 43, 481–487.
  9. Wilson, C.A.M.E.; Hoyt, J.; Schnauder, I. Impact of foliage on the drag force of vegetation in aquatic flows. J. Hydraul. Eng. 2008, 134, 885–891.
  10. Wunder, S.; Lehmann, B.; Nestmann, F. Determination of the drag coefficients of emergent and just submerged willows. Int. J. River Basin Manag. 2011, 9, 231–236.
  11. Jalonen, J.; Järvelä, J.; Aberle, J. Vegetated flows: Drag force and velocity profiles for foliated plant stands. In River Flow 2012, Proceedings of the 6th International Conference on Fluvial Hydraulics, San José, Costa Rica, 5–7 September 2012; Murillo Muñoz, R.E., Ed.; CRC Press: Boca Raton, FL, USA, 2012; pp. 233–239.
  12. Brown, D.C. Close-range camera calibration. Photogramm. Eng. 1971, 37, 855–866.
  13. Chandler, J. Effective application of automated digital photogrammetry for geomorphological research. Earth Surf. Process. Landf. 1999, 24, 51–63.
  14. Lane, S.N.; Chandler, J.H.; Porfiri, K. Monitoring river channel and flume surfaces with digital photogrammetry. J. Hydraul. Eng. 2001, 127, 871–877.
  15. Whittaker, P. Modelling the Hydrodynamic Drag Force of Flexible Riparian Woodland. Ph.D. Thesis, Cardiff University, Cardiff, UK, 2014.
  16. Wackrow, R.; Chandler, J. A convergent image configuration for DEM extraction that minimizes the systematic effects caused by an inaccurate lens model. Photogramm. Rec. 2008, 23, 6–18.
  17. Fryer, J.G. Camera calibration. In Close Range Photogrammetry and Machine Vision; Atkinson, K.B., Ed.; Whittles Publishing: Caithness, UK, 2001.
  18. Wackrow, R.; Chandler, J.H.; Bryan, P. Geometric consistency and stability of consumer-grade digital cameras for accurate spatial measurement. Photogramm. Rec. 2007, 22, 121–134.
  19. Fryer, J.G.; Fraser, C.S. On the calibration of underwater cameras. Photogramm. Rec. 1986, 12, 73–85.
  20. Harvey, E.S.; Shortis, M.R. Calibration stability of an underwater stereo video system: Implications for measurement accuracy and precision. Mar. Technol. Soc. J. 1998, 32, 3–17.
  21. Cooper, M.A.R.; Robson, S. Theory of close range photogrammetry. In Close Range Photogrammetry and Machine Vision; Atkinson, K.B., Ed.; Whittles Publishing: Caithness, UK, 2001.
  22. Wackrow, R. Spatial Measurement with Consumer Grade Digital Cameras. Ph.D. Thesis, Loughborough University, Loughborough, UK, 2008.
  23. Granshaw, S.I. Bundle adjustment methods in engineering photogrammetry. Photogramm. Rec. 2006, 10, 181–207.
  24. PhotoModeler Scanner. Available online: http://photomodeler.com/index.html (accessed on 24 September 2005).
  25. Fraser, C.S. Digital self-calibration. ISPRS J. Photogramm. Remote Sens. 1997, 52, 149–159.

Share and Cite

MDPI and ACS Style

Wackrow, R.; Ferreira, E.; Chandler, J.; Shiono, K. Camera Calibration for Water-Biota Research: The Projected Area of Vegetation. Sensors 2015, 15, 30261-30269. https://doi.org/10.3390/s151229798

