Article

Experimental Study of Radial Distortion Compensation for Camera Submerged Underwater Using Open SaltWaterDistortion Data Set

1 Evocargo LLC, 129085 Moscow, Russia
2 Moscow Institute of Physics and Technology (National Research University), 141701 Dolgoprudny, Russia
3 Smart Engines Service LLC, 117312 Moscow, Russia
4 Federal Research Center “Computer Science and Control” of Russian Academy of Sciences, 119333 Moscow, Russia
5 National University of Science and Technology MISIS, 119049 Moscow, Russia
6 Institute for Information Transmission Problems of Russian Academy of Sciences, 127051 Moscow, Russia
* Author to whom correspondence should be addressed.
J. Imaging 2022, 8(10), 289; https://doi.org/10.3390/jimaging8100289
Submission received: 30 April 2022 / Revised: 14 September 2022 / Accepted: 12 October 2022 / Published: 19 October 2022
(This article belongs to the Special Issue Geometry Reconstruction from Images)

Abstract

This paper describes a new open data set, consisting of images of a chessboard collected underwater at different refractive indices, which allows the quality of different radial distortion correction methods to be investigated. The refractive index is regulated by the degree of salinity of the water. The collected data set consists of 662 images, and the chessboard cell corners are manually marked in each image (for a total of 35,748 nodes). Two mobile phone cameras were used for the shooting: a telephoto and a wide-angle camera. With the help of the collected data set, the practical applicability of the formula for correction of the radial distortion that occurs when the camera is submerged underwater was investigated. Our experiments show that the radial distortion correction formula makes it possible to correct images with high precision, comparable to that of classical calibration algorithms. We also show that this correction method is robust to small inaccuracies in the specified refractive index of water. The data set, as well as the accompanying code, is publicly available.

1. Introduction

Various kinds of aberrations occur in optical systems. Therefore, when shooting with any camera other than a pinhole one, distortions such as radial distortion (RD) may appear in the image [1]. RD can be eliminated by calibration of the camera, during which the parameters needed for subsequent software distortion compensation are estimated.
In the air, the refractive index (and, hence, the distortion parameters) barely changes, which allows the camera calibration to be carried out only once to obtain undistorted images. However, after submersion of the camera underwater, the distortion parameters change significantly due to the refraction of light waves when passing through the water–glass–air interfaces.
Underwater RD correction methods and their characteristics are actively studied in many applied scientific fields. Underwater video analytics systems, for instance, are being developed to monitor the population status of forage fish in closed reservoirs [2]. The quality of the RD correction turns out to be crucial for the high-quality performance of such a system. A related problem was described in [3], where a computer vision system was suggested as an alternative to electrofishing for “sampling” spawning fish. The authors noted that proper underwater RD correction is an important factor affecting the quality of its operation. The author of [4] came to similar conclusions in the context of an underwater object tracking task. Furthermore, in [5], the authors noted the importance of RD correction for the study of fish behaviour in their natural environment.
The authors of [6] focused on a system for autonomous mapping of the ocean floor and demonstrated a number of experimental results using real and synthetic data. Based on these results, they explicitly showed the strong effect of even small errors in the estimated RD parameters on the quality of the constructed map. The same problem was faced during the creation of a system for three-dimensional reconstruction of underwater cultural heritage objects [7]. The task of eliminating RD caused by the submersion of a lens underwater has also arisen in work devoted to the creation of a visual odometry algorithm for an underwater autonomous vehicle [8].
A study that particularly deserves mention is [9], in which the authors encountered the underwater RD calibration problem in the task of improving image aesthetics. Often, to enhance an image, scene depth map restoration is performed; for example, it is needed to simulate the bokeh effect. A distorted depth map structure can, on the contrary, lead to a decrease in aesthetics. Moreover, manufacturers of mobile phones and smartphones are currently switching en masse to the IP68 standard [10], with increased requirements for stable underwater performance, which has led to growth in the popularity of amateur underwater photography.
The classic approach [11] for underwater RD correction is repeating calibration after submersion, which is a rather expensive and time-consuming process: a set of images of a special calibration object from different angles must be collected, followed by performing an automated search for RD correction parameters, the quality of which is determined by the prepared set of calibration images.
Most modern mobile phone cameras are sufficiently well pre-calibrated for shooting in the air, so RD is typically almost absent there. In contrast, when the camera is submerged, distortion appears and recalibration is required (Figure 1); however, due to the labour and complexity involved, it is not always justified.
Although it has been shown in practice that such a shooting protocol allows underwater RD to be eliminated with sufficient quality [12], it is not applicable to every shooting scenario. For example, in amateur photography, under strict limits on the duration of an underwater shooting session, or when the operator’s expertise is limited, such an approach will generally not work. The average mobile phone user is more likely to delete a distorted image than to figure out how to fix it.
The problem of RD correction parameter estimation is also aggravated by the fact that the refractive index may differ in different water reservoirs; consequently, the correction parameters found for some reservoir may not be suitable for another. The same may be true for a single reservoir, if its salinity or the temperature of the water changes [13]. It turns out that either the camera must be recalibrated for every submersion, or the necessity of recalibration must somehow be assessed.
Thus, the question arises: “Is it possible not to recalibrate, but to recalculate the parameters of the RD correction, knowing the refractive index of water?” For the first time, the answer to this question was proposed in [14]. In this study, the authors conducted a number of experiments, on the basis of which they derived an empirical formula for radial distortion correction depending on the refractive index. Later, in the work [15], the authors presented similar results, which were obtained theoretically, not experimentally.
In addition to deriving the RD correction formula, the authors of [15] demonstrated its accuracy on a real image. However, a number of questions remain, the answers to which require a full-fledged experiment:
  • How accurate is such analytical calibration?
  • How does the accuracy of the initial calibration in the air affect the RD correction quality for this type of water?
  • How does the inaccuracy of water refractive index selection affect the RD correction quality?
This work is devoted to answering these questions, thereby assessing the practical applicability of the RD compensation formula proposed in [15].
However, to answer these questions, a data set of underwater images which allows the quality of distortion correction to be assessed numerically is required; for example, a set of images of some calibration object with a known structure. We could not find such an open data set. Therefore, a new set of chessboard images with manually marked cell corners was assembled, called SWD (Salt Water Distortion), which is described in detail in Section 4 of this work.
The new results of numerical experiments confirming the practical applicability of the RD correction formula are described in Section 5.1 and Section 5.2. Section 5.3 evaluates the quality of the performance of an automatic detector of chessboard cell corners in underwater images, used as a tool to assess the quality of associated algorithms.

2. Reference Calibration Algorithm

In [16], it was shown that the use of the pinhole camera model underwater has some restrictions, as the refractive index underwater depends on the wavelength. However, according to [17], for distilled water at 19 °C the refractive index varies from 1.332 at a wavelength of 656 nm to 1.343 at a wavelength of 404 nm. As this variation is small, the resulting angular difference in refraction is insignificant. The authors of [2] came to a similar conclusion.
An alternative to the pinhole camera model is a more complex one, the so-called non-single viewpoint (nSVP) model: while in the traditional camera model all incoming rays pass through a single point, in the nSVP model they fall on a caustic curve [18]. As the authors of [19] have mentioned, the standard calibration object (a chessboard) is not suitable in this case. Instead, a new calibration object was proposed: a perforated lattice illuminated by light of two different wavelengths, forming an exactly known pattern of point light sources.
Thus, as the pinhole camera model is quite accurate, and the complications of the model lead to significant complication of the calibration procedure and the data collection process, only RD correction algorithms based on the classical pinhole camera model were studied in this work.
In this category, there exist many algorithms for RD calibration [20], which can be divided into the following groups:
  • Calibration algorithms calculating RD correction parameters based on a series of images of a calibration object; for example, a chessboard [11], a flat object with evenly spaced LED bulbs [21], or an arbitrary flat textured object [22].
  • Calibration algorithms based on active vision, calculating the parameters of RD correction from a series of scene images; in this case, information about camera movement is known during calibration [23].
  • Self-calibration algorithms that calculate the RD correction parameters by checking the correctness of the epipolar constraint for a series of images of the same scene taken from different angles [24].
  • Self-calibration algorithms that calculate camera parameters from a single-shot image [25,26].
As the purpose of this work is to check the formula for RD parameter re-calculation in laboratory conditions, the algorithm described in [11] is considered in our work as the most suitable reference; hereafter, we denote this algorithm by “classic”.
It is worth mentioning that, while the chosen classic algorithm aims to address more complex problems (i.e., not only RD correction), this algorithm is used only as a reference and to have an adequate baseline for quality assessment of the RD correction formula.

3. Methods

3.1. Classic Algorithm for Radial Distortion Correction

The classic algorithm works with a set of images of a flat calibration object with a periodic structure (e.g., a chessboard) and known geometric parameters, taken from various angles. In these images, special points are defined (e.g., the nodes of a chessboard). This is a traditional approach, in which the parameters of the distortion model are calculated simultaneously with the internal parameters of the camera (in the pinhole camera model), by solving a non-linear optimization problem.
The calibration algorithm typically uses a standard distortion model [27] describing radially symmetric distortions. The software implementation of this algorithm in the OpenCV [28] library uses a model in which the transformation of the image point coordinates (in the plane of the pinhole camera screen) is set as follows:
$$
\begin{pmatrix} x' \\ y' \end{pmatrix} =
\begin{pmatrix} x \\ y \end{pmatrix}
\frac{1 + k_1 r^2 + k_2 r^4 + k_3 r^6}{1 + k_4 r^2 + k_5 r^4 + k_6 r^6},
\qquad (1)
$$
where $(x, y)^T$ denotes the initial image point coordinates; $(x', y')^T$ denotes the coordinates of this image point after distortion correction; $r = \sqrt{x^2 + y^2}$; and $k_1, \ldots, k_6$ are the distortion coefficients.
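To make the mapping concrete, the following is a minimal NumPy sketch of Formula (1) applied to normalized pinhole coordinates; the function name and array layout are illustrative choices of ours, not part of the OpenCV API.

```python
import numpy as np

def apply_rational_model(points, k):
    """Apply the rational radial model of Formula (1) to normalized pinhole
    coordinates given as an (N, 2) array; k = (k1, ..., k6)."""
    k1, k2, k3, k4, k5, k6 = k
    r2 = np.sum(points ** 2, axis=1, keepdims=True)       # r^2 = x^2 + y^2
    num = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3       # numerator polynomial
    den = 1 + k4 * r2 + k5 * r2 ** 2 + k6 * r2 ** 3       # denominator polynomial
    return points * num / den
```

In practice, OpenCV’s cv2.undistortPoints inverts this mapping numerically for observed points, so the explicit form above is mainly useful for synthetic checks.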

3.2. Formula for Radial Distortion Correction

In [15], it was shown that, with a known refractive index of water, the correction of the distortion occurring when the camera is immersed can be described by converting the coordinates of the image in the plane of the pinhole camera screen, as follows:
$$
\begin{pmatrix} x \\ y \end{pmatrix} =
\frac{1}{\sqrt{1 - (n^2 - 1)\, r'^2}}
\begin{pmatrix} x' \\ y' \end{pmatrix},
\qquad (2)
$$
where $(x, y)^T$ and $(x', y')^T$ denote the coordinates of the object on the pinhole camera screen when shooting in the air and underwater, respectively; $r' = \sqrt{(x')^2 + (y')^2}$; and $n$ is the refractive index of water.
This formula is obtained by assuming infinitesimal thickness of the material separating water and air.
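As an illustration, Formula (2) can be applied directly to normalized pinhole coordinates. The sketch below assumes the points have already been brought to the normalized image plane using the in-air calibration (as done in Section 5.1); the function name is ours.

```python
import numpy as np

def correct_underwater_points(points_underwater, n):
    """Map normalized underwater pinhole coordinates, an (N, 2) array, to
    their in-air positions using Formula (2); n is the refractive index."""
    r2 = np.sum(points_underwater ** 2, axis=1, keepdims=True)  # r'^2
    scale = 1.0 / np.sqrt(1.0 - (n ** 2 - 1.0) * r2)            # radial factor
    return points_underwater * scale

# e.g. correct a single off-centre point for fresh water (n = 1.33):
# correct_underwater_points(np.array([[0.30, 0.20]]), 1.33)
```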

4. Calibration Data Set of Images in Salt Water (SWD)

Shooting was conducted using a modern mobile phone (Huawei Mate 20 Pro). The data set was obtained using two cameras: a telephoto camera (focal length 7.485 mm) and a wide-angle camera (focal length 2.35 mm). The image size was the same for both cameras, equal to 1459 × 1094 pixels. A 13 × 9.1 cm chessboard with 10 × 7 cells and a cell side length of 1.3 cm was used as the calibration object.
Underwater image collection was carried out by submerging the cameras into an aquarium filled with tap water (salinity less than 1%). The non-salty water image sets contained 48 and 56 images for the two cameras, respectively. The salinity was then increased, using table salt, to 13%, 27%, and 40%; the required amount of salt was pre-calculated, as the spatial dimensions of the aquarium were known. The resulting salt water image sets consisted of 34, 86, and 88 images for the telephoto camera and 47, 89, and 80 images for the wide-angle camera, respectively.
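As a side note, the pre-calculation of the salt mass mentioned above is simple arithmetic once the water volume is known. The sketch below assumes that salinity is expressed as a mass fraction and that one litre of water weighs about one kilogram; the 60-litre volume in the example is hypothetical, since the actual aquarium volume is not reported.

```python
def required_salt_mass(water_volume_l, target_salinity):
    """Approximate mass of salt (kg) needed to bring fresh water of the given
    volume (litres) to the target mass-fraction salinity; volume change and
    salt impurities are neglected."""
    water_mass_kg = water_volume_l * 1.0
    return target_salinity * water_mass_kg / (1.0 - target_salinity)

# Hypothetical example: a 60-litre aquarium brought to 27% salinity.
# required_salt_mass(60, 0.27)  ->  about 22.2 kg of salt
```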
The coordinates of the chessboard cell corners were marked manually for all collected images. The coordinates of each point were specified with sub-pixel accuracy: depending on the situation, either the centre of a pixel, a node of the pixel grid, or the midpoint of the border between two adjacent pixels was assigned to the pre-image of a chessboard corner. Examples of images and the markup are provided in Figure 2. The collected data set is publicly available (https://github.com/Visillect/SaltWaterDistortion (accessed on 11 October 2022)).
It is also worth noting the technical difficulties that arose during the collection of the data set, which affected its appearance:
  • After the shutter of the smartphone camera was released, the image was processed programmatically from the RAW format, as a result of which the final image was cropped; this cropping was not displayed on the smartphone screen at the time of shooting. Thus, it was difficult to place the board at the edge of the image, where the distortion is known to be maximal. A histogram of the distribution of board cell corners over the original images is shown in Figure 3.
  • Due to the movement of the water during shutter release and subsequent image processing, many photos turned out cloudy. These photos had to be excluded, as some chessboard cell corners became indistinguishable in them.
  • With an increase in salinity, the water turbidity also increased, as the salt used was not pure enough and contained impurities.
  • Laminated paper with the printed chessboard was used for shooting underwater. Because of this, light glare appeared at some angles, making it difficult to determine the location of the chessboard corners; such photos were also manually excluded.

5. Experimental Results

5.1. Precision of the Correction Formula

In this work, all numerical experiments were carried out on images captured only by a telephoto smartphone camera.
The initial calibration of the smartphone camera was carried out in the air, using the classical method with the in-air images from the SWD data set. Thus, the original RD of the lens was eliminated, and the associated correction parameters were fixed. The underwater photographs were first corrected with these parameters, after which the RD correction formula with varying refractive index n was applied. The marked chessboard points of all images in the experiment were likewise re-calculated according to these parameters, using Formula (1). The precision of the correction was then evaluated by analysing the structure of the transformed set of marked points.
To correctly compare the results of such correction with the classical method, the RD correction parameters underwater were calculated using the classical method for each degree of salinity separately. After that, using the obtained RD correction parameters, the marked points were transformed. Similarly to above, the precision of the correction algorithm was evaluated based on the obtained results.
In this paper, a software implementation presented in the OpenCV [28,29] library was used for the classical calibration method. An example of image correction using both methods is shown in Figure 4.
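For orientation, the classical calibration step could be run with OpenCV roughly as follows. This is a sketch under our assumptions, not the authors’ exact code: the 9 × 6 grid of inner corners follows from the 10 × 7-cell board, marked_corners_per_image is a placeholder for the manually marked points of each image, and enabling the rational model (k4–k6) is optional.

```python
import cv2
import numpy as np

# Planar object points for the 9 x 6 inner corners of the 10 x 7-cell board
# (cell side 1.3 cm), lying in the z = 0 plane.
pattern_size = (9, 6)
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * 1.3

obj_points, img_points = [], []
for corners in marked_corners_per_image:          # placeholder: list of (54, 2) arrays
    obj_points.append(objp)
    img_points.append(corners.reshape(-1, 1, 2).astype(np.float32))

image_size = (1459, 1094)
flags = cv2.CALIB_RATIONAL_MODEL                  # estimate k1..k6 as in Formula (1)
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None, flags=flags)
```

The resulting dist_coeffs can then be passed to cv2.undistort (for whole images) or cv2.undistortPoints (for the marked corners) to remove the estimated RD.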
The experiment was carried out in the same way for each level of salinity. Optimal distortion correction coefficients were selected through cross-validation on the training set (i.e., excluding 10 test images), with 75% of the set used for training and 25% for validation.
To assess the quality of the radial distortion correction, the structure of the transformed set of marked points was analysed: the better the points corresponding to one line of chessboard corners fit a straight line, the better the correction. In this paper, two metrics were used to estimate the quality of calibration:
  • Metric 1. The standard deviation of the cell corners from the straight line approximating them, determined using the OLS method, was estimated.
  • Metric 2. The distance between the straight line constructed through the corners of a chessboard line and the most distant cell corner belonging to it.
The second metric is especially important for quantifying the effect, as the most distant corner is likely to lie near the border of the image, where RD correction errors are particularly pronounced.
For each set of images corresponding to different salinity indices, four errors were calculated:
  • M1. The average value of metric 1 on the entire set of images for all lines.
  • M2. The maximum value of metric 1 on the entire set of images among all lines.
  • M3. The average value of metric 2 on the entire set of images for all lines.
  • M4. The average value of metric 2 on the entire set of images for the most distant lines (for each image, one such line was chosen).
The point coordinates were multiplied by a common factor, such that the length of the largest chessboard side in each image was 1000 pixels. This normalization was carried out to eliminate the influence of the scaling factor introduced when correcting the image. The results of the experiment are presented in Table 1.
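The two line-straightness metrics and the normalization can be computed, for instance, as sketched below. One substitution is made: instead of the ordinary least-squares fit mentioned above, the line is fitted orthogonally (total least squares via SVD), which behaves better for near-vertical chessboard lines; the function names and the way the longest board side is measured are our choices.

```python
import numpy as np

def line_deviations(points):
    """Signed perpendicular distances of 2D points, an (N, 2) array, from the
    straight line fitted to them (orthogonal fit via SVD)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                              # unit normal of the fitted line
    return centered @ normal

def line_metrics(points):
    """Metric 1 (standard deviation of the deviations) and Metric 2 (largest
    absolute deviation) for the corners of one chessboard line."""
    d = line_deviations(points)
    return float(np.std(d)), float(np.max(np.abs(d)))

def normalize_grid(grid, target_side=1000.0):
    """Scale a (rows, cols, 2) grid of marked corners so that the longest side
    of the corner grid equals target_side pixels."""
    sides = [grid[0, 0] - grid[0, -1], grid[-1, 0] - grid[-1, -1],
             grid[0, 0] - grid[-1, 0], grid[0, -1] - grid[-1, -1]]
    longest = max(np.linalg.norm(s) for s in sides)
    return grid * (target_side / longest)
```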
It can be concluded from the table that the errors in the air after applying the classical method differed by only a few hundredths of a pixel from the errors before applying it, which is reasonable, as most cameras (and especially smartphone cameras) are designed and optimised for shooting in the air. At the same time, the errors in water at all levels of salinity, after applying either the classical calibration method or the correction Formula (2), were much lower than the errors obtained before RD correction. The correction precision of the two methods was comparable; in some cases, the error of Formula (2) was even lower than that of the classical method. From this, we conclude that the proposed formula is applicable for correcting the RD that occurs when a camera is submerged underwater.

5.2. Dependence of the Correction Precision on Salinity

As is well-known, the refractive index of water is affected by salinity and temperature [13]. In this paper, to conduct experiments with different refractive indices, the degree of salinity of water at room temperature was varied. The salinity of water is easier to control technically; moreover, it influences the refractive index of water more significantly.
Four sets of underwater images with a telephoto camera from the SWD data set collected in water with different salinity levels were used for the experiment. The refractive index n = 1.33 corresponds to distilled water, while n = 1.40 corresponds to 40 % saline solution (i.e., the salinity of the Dead Sea) [30]. The 13 % and 27 % salt solutions corresponded to n = 1.35 and n = 1.37 , respectively.
For each of these sets, correction was performed using Formula (2) with different values of the specified refractive index. The results of the experiments are presented in Table 2. The experiment showed that even significant changes in the assumed refractive index only slightly affected the precision of the final correction; namely, none of the error metrics changed by more than 0.2 pixels.
It should also be noted that, for all experiments, the error increased with an increase in the refractive index. This was due to the imperfection of the image normalization method: with an increase in the refraction parameter, the degree of image distortion increases, which leads to an increase in the final error when normalizing the largest chessboard side to a size of a thousand pixels.
From the results of this experiment, it can be concluded that the correction of radial distortion by Formula (2) with a refractive index of 1.33 provides acceptable precision, in most cases.
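As a usage note, the sensitivity sweep summarised in Table 2 can be approximated by combining the illustrative helpers sketched earlier (correct_underwater_points from Section 3.2 and line_metrics from Section 5.1); corner_lines_per_image below is a placeholder for the per-image lists of marked corner lines in normalized pinhole coordinates, and the re-scaling to the 1000-pixel board side is omitted for brevity.

```python
import numpy as np

for n_assumed in np.arange(1.33, 1.405, 0.01):         # 1.33, 1.34, ..., 1.40
    m1_values = []
    for corner_lines in corner_lines_per_image:         # placeholder input
        for line_pts in corner_lines:                   # one chessboard line, (N, 2)
            corrected = correct_underwater_points(line_pts, n_assumed)
            std_dev, _ = line_metrics(corrected)
            m1_values.append(std_dev)
    print(f"assumed n = {n_assumed:.2f}: mean Metric 1 = {np.mean(m1_values):.4f}")
```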

5.3. Quality of the Automatic Chessboard Cell Corner Detector

To automatically assess the quality of RD correction algorithms, one must be able to programmatically find the cell corners of the calibration object (i.e., the chessboard). This functionality is provided by the findChessboardCorners function of the OpenCV library. A logical question arises: “Is it possible to use this function to assess the correction quality in experiments with underwater images, without relying on a manually marked data set?”
To answer this question, it was considered sufficient to compare the detection results with the marked points in the SWD data set. For each image, the largest Euclidean distance between a pair of corresponding points was estimated; the comparison results are presented in Table 3.
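A hedged sketch of this comparison is given below. Whether sub-pixel refinement (cv2.cornerSubPix) was applied in the paper is not stated, and the assumption that the detected corner ordering matches the manual markup would need to be checked in practice (e.g., by nearest-neighbour matching).

```python
import cv2
import numpy as np

def max_detector_error(gray_image, marked_corners, pattern_size=(9, 6)):
    """Largest Euclidean distance between corners found by
    cv2.findChessboardCorners and the manually marked corners of one image;
    returns None if the detector fails. Assumes both (54, 2) point sets are
    in the same order."""
    found, detected = cv2.findChessboardCorners(gray_image, pattern_size)
    if not found:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    detected = cv2.cornerSubPix(gray_image, detected, (5, 5), (-1, -1), criteria)
    detected = detected.reshape(-1, 2)
    return float(np.max(np.linalg.norm(detected - marked_corners, axis=1)))
```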
The results clearly demonstrate that the accuracy of the cell corner detector was significantly lower than the accuracy of correction. Large values in the column with maximum errors indicate that there were outliers that made at least the M2 and M4 metrics uninformative. Finally, the results in the table indicate that, with increasing salinity and turbidity of the water, the reliability of this measurement method decreases.

6. Conclusions

In this article, we described a new open data set for evaluating the accuracy of underwater radial distortion calibration algorithms under different refractive indices. The data set consists of 662 images of a chessboard collected with two different cameras, with the location of cell corners marked manually.
Based on the collected data set, a number of experiments were conducted to assess the practical applicability of a radial distortion correction formula when the camera is submerged underwater. According to the experimental results, the precision of RD correction using the formula was not inferior to a full-fledged calibration procedure for specific operating conditions. We also showed that the inaccuracy of specifying the refractive index of water does not significantly affect the precision of the correction and, so, it can be set equal to 1.33 .
Thus, this article experimentally confirmed that the use of the radial distortion correction formula not only significantly simplifies and reduces the cost of operating a camera underwater, but also maintains the calibration accuracy at a sufficient level.

Author Contributions

Data collection and preparation, software, research and experiments, visualization, writing, and editing: D.S.; review and editing: D.P.; supervision, review, and editing: E.E.; resources and writing: I.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially supported by the Russian Foundation for Basic Research (19-29-09075).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data set, as well as the accompanying code for all of the experiments described in the article, are publicly available at https://github.com/Visillect/SaltWaterDistortion (accessed on 14 October 2022).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Conrady, A.E. Decentred Lens-Systems. Mon. Not. R. Astron. Soc. 1919, 79, 384–390.
  2. Shortis, M. Camera Calibration Techniques for Accurate Measurement Underwater. Sensors 2015, 15, 30810–30826.
  3. Ellender, B.; Becker, A.; Weyl, O.; Swartz, E. Underwater video analysis as a non-destructive alternative to electrofishing for sampling imperilled headwater stream fishes. Aquat. Conserv. Mar. Freshw. Ecosyst. 2012, 22, 58–65.
  4. Pavin, A.M. Identifikatsiya podvodnykh ob”ektov proizvol’noi formy na fotosnimkakh morskogo dna [Identification of underwater objects of any shape on photos of the sea floor]. Podvodnye Issledovaniya i Robototekhnika [Underw. Res. Robot.] 2011, 2, 26–31.
  5. Somerton, D.; Glendhill, C. Report of the National Marine Fisheries Service Workshop on Underwater Video Analysis. In Proceedings of the National Marine Fisheries Service Workshop on Underwater Video Analysis, Seattle, WA, USA, 4–6 August 2004.
  6. Elibol, A.; Möller, B.; Garcia, R. Perspectives of auto-correcting lens distortions in mosaic-based underwater navigation. In Proceedings of the 2008 23rd International Symposium on Computer and Information Sciences, Istanbul, Turkey, 27–29 October 2008; pp. 1–6.
  7. Skarlatos, D.; Agrafiotis, P. Image-Based Underwater 3D Reconstruction for Cultural Heritage: From Image Collection to 3D. Critical Steps and Considerations. In Visual Computing for Cultural Heritage; Springer: Cham, Switzerland, 2020; pp. 141–158.
  8. Botelho, S.; Drews-Jr, P.; Oliveira, G.; Figueiredo, M. Visual odometry and mapping for Underwater Autonomous Vehicles. In Proceedings of the 6th Latin American Robotics Symposium (LARS 2009), Valparaiso, Chile, 29–30 October 2009; pp. 1–6.
  9. Berman, D.; Levy, D.; Avidan, S.; Treibitz, T. Underwater Single Image Color Restoration Using Haze-Lines and a New Quantitative Dataset. arXiv 2018, arXiv:1811.01343.
  10. Degrees of Protection Provided by Enclosures (IP Code). International Standard IEC 60529:1989+AMD1:1999+AMD2:2013. 2013. Available online: https://webstore.iec.ch/publication/2452 (accessed on 1 April 2022).
  11. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
  12. Heikkila, J. Geometric Camera Calibration Using Circular Control Points. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1066–1077.
  13. Quan, X.; Fry, E.S. Empirical equation for the index of refraction of seawater. Appl. Opt. 1995, 34, 3477–3480.
  14. Lavest, J.M.; Rives, G.; Lapresté, J.T. Underwater Camera Calibration; Springer: Berlin/Heidelberg, Germany, 2000; pp. 654–668.
  15. Konovalenko, I.; Sidorchuk, D.; Zenkin, G. Analysis and Compensation of Geometric Distortions, Appearing when Observing Objects under Water. Pattern Recognit. Image Anal. 2018, 28, 379–392.
  16. Sedlazeck, A.; Koch, R. Perspective and Non-perspective Camera Models in Underwater Imaging—Overview and Error Analysis. In Advances in Computer Communication and Computational Sciences; Springer: Berlin/Heidelberg, Germany, 2012; pp. 212–242.
  17. Daimon, M.; Masumura, A. Measurement of the refractive index of distilled water from the near-infrared region to the ultraviolet region. Appl. Opt. 2007, 46, 3811–3820.
  18. Huang, L.; Zhao, X.; Huang, X.; Liu, Y. Underwater camera model and its use in calibration. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015; pp. 1519–1523.
  19. Yau, T.; Gong, M.; Yang, Y.H. Underwater Camera Calibration Using Wavelength Triangulation. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 2499–2506.
  20. Long, L.; Dongri, S. Review of Camera Calibration Algorithms. In Advances in Computer Communication and Computational Sciences; Springer: Singapore, 2019; pp. 723–732.
  21. Zhang, Y.; Zhou, F.; Deng, P. Camera calibration approach based on adaptive active target. In Proceedings of the Fourth International Conference on Machine Vision (ICMV 2011), Singapore, 9–10 December 2012; Volume 8350, p. 83501G.
  22. Brunken, H.; Gühmann, C. Deep learning self-calibration from planes. In Proceedings of the Twelfth International Conference on Machine Vision (ICMV 2019), Amsterdam, The Netherlands, 16–18 November 2019; Volume 11433, p. 1114333L.
  23. Duan, Y.; Ling, X.; Zhang, Z.; Liu, X.; Hu, K. A Simple and Efficient Method for Radial Distortion Estimation by Relative Orientation. IEEE Trans. Geosci. Remote Sens. 2017, 55, 6840–6848.
  24. Lehtola, V.; Kurkela, M.; Ronnholm, P. Radial Distortion from Epipolar Constraint for Rectilinear Cameras. J. Med. Imaging 2017, 3, 8.
  25. Kunina, I.; Gladilin, S.; Nikolaev, D. Blind radial distortion compensation in a single image using fast Hough transform. Comput. Opt. 2016, 40, 395–403.
  26. Xue, Z.; Xue, N.; Xia, G.S.; Shen, W. Learning to calibrate straight lines for fisheye image rectification. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 1643–1651.
  27. Brown, D.C. Decentering distortion of lenses. Photogramm. Eng. Remote Sens. 1966, 32, 444–462.
  28. Open Source Computer Vision Library. Available online: https://opencv.org (accessed on 7 July 2020).
  29. OpenCV Documentation: Camera Calibration and 3D Reconstruction. Available online: https://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html (accessed on 7 July 2020).
  30. CRC Handbook of Chemistry and Physics, 85th ed.; CRC Press: Boca Raton, FL, USA, 2004; pp. 8–71.
Figure 1. Examples of images made with the same camera in the air (left) and underwater (right).
Figure 2. Example images from SWD data set and cell corner markup: (a) Captured in the air; (b) captured underwater (salinity < 1 % ); (c) captured underwater (salinity 13 % ); (d) captured underwater (salinity 27 % ); and (e) captured underwater (salinity 40 % ).
Figure 3. Histogram of the corner distribution over the image area in the SWD data set for the telephoto camera.
Figure 4. Examples of images (from left to right): Taken underwater (salinity < 1 % ) with parameters for correcting radial distortion when shooting in air; corrected using the classical calibration method; and corrected by Formula (2).
Table 1. Comparison of the precision of radial distortion correction using the classical method and Formula (2).

Conditions                 | Method                 | M1   | M2   | M3   | M4
In the air                 | Without correction     | 0.51 | 2.74 | 1.06 | 1.66
In the air                 | Classic                | 0.49 | 2.67 | 1.02 | 1.59
Underwater (salinity <1%)  | Without correction     | 1.41 | 7.57 | 2.45 | 3.97
Underwater (salinity <1%)  | Classic                | 0.93 | 4.89 | 2.33 | 3.68
Underwater (salinity <1%)  | Formula (2) (n = 1.33) | 0.65 | 3.96 | 1.79 | 2.75
Underwater (salinity 13%)  | Without correction     | 1.51 | 6.80 | 2.79 | 5.28
Underwater (salinity 13%)  | Classic                | 0.90 | 4.38 | 2.13 | 3.49
Underwater (salinity 13%)  | Formula (2) (n = 1.35) | 0.69 | 4.05 | 1.80 | 2.64
Underwater (salinity 27%)  | Without correction     | 1.60 | 8.54 | 3.25 | 5.36
Underwater (salinity 27%)  | Classic                | 1.03 | 4.79 | 2.66 | 3.90
Underwater (salinity 27%)  | Formula (2) (n = 1.38) | 0.91 | 4.55 | 2.44 | 3.54
Underwater (salinity 40%)  | Without correction     | 1.50 | 7.03 | 3.54 | 6.08
Underwater (salinity 40%)  | Classic                | 0.89 | 5.24 | 2.34 | 3.63
Underwater (salinity 40%)  | Formula (2) (n = 1.40) | 0.83 | 5.00 | 2.18 | 3.24
Table 2. Dependence of the RD correction error when using Formula (2) on the accuracy of specifying the water refractive index.

Image Set                  | Assumed n | M1     | M2     | M3     | M4
Underwater, salinity <1%   | 1.33      | 0.6504 | 3.9616 | 1.7934 | 2.7539
(actual n = 1.33)          | 1.34      | 0.6529 | 3.9729 | 1.8062 | 2.7755
                           | 1.35      | 0.6566 | 3.9840 | 1.8198 | 2.7968
                           | 1.36      | 0.6615 | 3.9949 | 1.8336 | 2.8180
                           | 1.37      | 0.6674 | 4.0055 | 1.8488 | 2.8399
                           | 1.38      | 0.6743 | 4.0159 | 1.8645 | 2.8615
                           | 1.39      | 0.6819 | 4.0260 | 1.8838 | 2.8838
                           | 1.40      | 0.6901 | 4.0360 | 1.8975 | 2.9061
Underwater, salinity 13%   | 1.33      | 0.6898 | 3.9957 | 1.7931 | 2.6572
(actual n = 1.35)          | 1.34      | 0.6903 | 4.0225 | 1.7968 | 2.6445
                           | 1.35      | 0.6918 | 4.0487 | 1.8006 | 2.6405
                           | 1.36      | 0.6943 | 4.0743 | 1.8060 | 2.6442
                           | 1.37      | 0.6977 | 4.0994 | 1.8140 | 2.6865
                           | 1.38      | 0.7018 | 4.1239 | 1.8235 | 2.6865
                           | 1.39      | 0.7066 | 4.1478 | 1.8353 | 2.7120
                           | 1.40      | 0.7121 | 4.1712 | 1.8476 | 2.7386
Underwater, salinity 27%   | 1.33      | 0.8940 | 4.4589 | 2.3946 | 3.4778
(actual n = 1.37)          | 1.34      | 0.8950 | 4.4683 | 2.4003 | 3.4911
                           | 1.35      | 0.8967 | 4.4901 | 2.4077 | 3.5040
                           | 1.36      | 0.8992 | 4.5114 | 2.4168 | 3.5174
                           | 1.37      | 0.9022 | 4.5323 | 2.4266 | 3.5308
                           | 1.38      | 0.9058 | 4.5527 | 2.4371 | 3.5445
                           | 1.39      | 0.9100 | 4.5727 | 2.4482 | 3.5581
                           | 1.40      | 0.9146 | 4.5922 | 2.4604 | 3.5720
Underwater, salinity 40%   | 1.33      | 0.8038 | 4.8682 | 2.1883 | 3.2080
(actual n = 1.40)          | 1.34      | 0.8056 | 4.8881 | 2.1842 | 3.2064
                           | 1.35      | 0.8083 | 4.9074 | 2.1812 | 3.2060
                           | 1.36      | 0.8116 | 4.9264 | 2.1794 | 3.2086
                           | 1.37      | 0.8156 | 4.9449 | 2.1789 | 3.2149
                           | 1.38      | 0.8201 | 4.9630 | 2.1799 | 3.2211
                           | 1.39      | 0.8251 | 4.9807 | 2.1814 | 3.2275
                           | 1.40      | 0.8305 | 4.9980 | 2.1841 | 3.2369
Table 3. Accuracy evaluation of chessboard cell corner detection using the OpenCV findChessboardCorners function.

Image Set                  | Mean   | Maximum
In the air                 | 1.7833 | 3.7317
Underwater (salinity <1%)  | 2.8408 | 22.6256
Underwater (salinity 13%)  | 2.6668 | 6.7385
Underwater (salinity 27%)  | 3.3144 | 11.9810
Underwater (salinity 40%)  | 4.1901 | 9.7023
