# Stereo Vision System for Vision-Based Control of Inspection-Class ROVs


## Abstract


## 1. Introduction

## 2. Methodology

#### 2.1. Underwater Imaging

#### 2.2. Local Image Features

#### 2.3. Stereo Vision System

- distortion removal—mathematical removal of radial and tangential lens distortion;
- rectification—mathematically adjusting the angles and distances between the cameras;
- stereo correspondence—finding the same image feature in the left and right image views;
- triangulation—calculating the distances between the cameras and the corresponding points.
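Of these stages, stereo correspondence on rectified images reduces to a one-dimensional search along the epipolar line. A minimal sketch of a sum-of-absolute-differences (SAD) block match over a single scanline; the window size and disparity range are illustrative parameters, not values from the paper:

```python
def sad_match(left_row, right_row, x_left, window, max_disp):
    """Find the column in the right scanline that best matches the patch
    centred at x_left in the left scanline, using the sum of absolute
    differences (SAD). Rectification guarantees that the counterpart lies
    on the same image row, so only one row needs to be searched."""
    half = window // 2
    patch = left_row[x_left - half: x_left + half + 1]
    best_x, best_cost = x_left, float("inf")
    for d in range(0, max_disp + 1):
        x_r = x_left - d                      # candidate column in the right image
        if x_r - half < 0:
            break
        cand = right_row[x_r - half: x_r + half + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_cost, best_x = cost, x_r
    return best_x
```

In practice the match is computed over two-dimensional patches around detected features rather than single rows, but the search direction is the same.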

#### 2.3.1. Distortion Removal

#### 2.3.2. Rectification

- rotation of the left and right images to move the epipolar points to infinity, described by the matrix $Q$;
- rotation of the right image by a matrix $R$.

- the focal lengths of the two cameras are the same;
- the origin of the local camera coordinate system is the principal point of the camera.
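Under these two assumptions, triangulating a rectified correspondence reduces to similar triangles. A minimal sketch; the focal length $f$ (in pixels), baseline $b$, and principal point $(c_x, c_y)$ below are illustrative calibration values, not taken from the paper:

```python
def triangulate(x_l, y_l, x_r, f, b, cx, cy):
    """Triangulate a rectified correspondence (same row in both images).
    f: focal length in pixels, b: baseline in metres, (cx, cy): principal
    point. Returns (X, Y, Z) in the left-camera frame."""
    d = x_l - x_r            # disparity in pixels
    Z = f * b / d            # similar triangles: Z / b = f / d
    X = (x_l - cx) * Z / f
    Y = (y_l - cy) * Z / f
    return X, Y, Z
```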

#### 2.3.3. Stereo Correspondence

#### 2.3.4. Triangulation

#### 2.3.5. Distance Measurement

- focal length;
- distance between the cameras;
- CCD resolution.
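These three factors enter the range error directly: depth is inversely proportional to disparity, so a fixed disparity error (set by the CCD pixel pitch) produces a range error that grows quadratically with distance. A sketch under that standard quantisation model, with illustrative calibration values:

```python
def depth_from_disparity(f_px, b, d_px):
    # Z = f * b / d: a longer focal length or wider baseline
    # yields a larger disparity for the same range
    return f_px * b / d_px

def depth_quantisation_error(f_px, b, z, delta_d_px=1.0):
    # A disparity error of delta_d pixels maps to a range error of
    # roughly Z^2 * delta_d / (f * b); a finer CCD shrinks delta_d
    return z * z * delta_d_px / (f_px * b)
```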

- the distance between the cameras, $b$, and the distance between the vehicle and the bottom, $h$, should ensure 75% visibility of the corresponding region in both images;
- the velocity of the vehicle in the forward direction, the distance from the bottom, and the frame rate of the image acquisition system should ensure 75% visibility of the same region in consecutive frames.
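Both conditions can be checked from the camera geometry. A sketch, assuming a pinhole model in which the footprint width at range $h$ is $2h\tan(\mathrm{FOV}/2)$; the field-of-view values and this particular overlap model are assumptions for illustration, not taken from the paper:

```python
import math

def stereo_overlap(b, h, hfov_deg):
    """Fraction of the footprint seen by both cameras at range h,
    for baseline b and horizontal field of view hfov_deg."""
    w = 2.0 * h * math.tan(math.radians(hfov_deg) / 2.0)
    return max(0.0, 1.0 - b / w)

def frame_overlap(v, h, fps, vfov_deg):
    """Fraction of the forward footprint shared by consecutive frames
    at forward velocity v, range h and frame rate fps."""
    l = 2.0 * h * math.tan(math.radians(vfov_deg) / 2.0)
    return max(0.0, 1.0 - v / (fps * l))
```

Both values can then be tested against the 75% threshold when choosing the baseline, altitude, speed and frame rate.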

#### 2.4. Image-Based Motion Calculation

- distortion removal,
- rectification,
- detection of image features in the left image,
- stereo correspondence to find counterparts of the detected image features in the right image,
- triangulation to find the distances from the bottom for the matched image features,
- image-feature correspondence in successive frames of the left camera.

- $\mathrm{s}a$ denotes $\mathrm{sin}\,a$,
- $\mathrm{c}a$ denotes $\mathrm{cos}\,a$,
- $\mathrm{t}a$ denotes $\mathrm{tan}\,a$.
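This shorthand keeps the rotation matrices of the motion calculation compact. For reference, a sketch of the standard ZYX (roll, pitch, yaw) rotation matrix used in marine-craft kinematics, written with the shorthand expanded back to sin/cos; angles in radians:

```python
import math

def rotation_zyx(phi, theta, psi):
    """Body-to-world rotation matrix for roll (phi), pitch (theta)
    and yaw (psi), composed as R_z(psi) R_y(theta) R_x(phi)."""
    s, c = math.sin, math.cos
    return [
        [c(psi) * c(theta),
         c(psi) * s(theta) * s(phi) - s(psi) * c(phi),
         c(psi) * s(theta) * c(phi) + s(psi) * s(phi)],
        [s(psi) * c(theta),
         s(psi) * s(theta) * s(phi) + c(psi) * c(phi),
         s(psi) * s(theta) * c(phi) - c(psi) * s(phi)],
        [-s(theta),
         c(theta) * s(phi),
         c(theta) * c(phi)],
    ]
```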

#### 2.5. Control System

- the state-transition model of the surge motion and of the distance from the bottom:
  $$A=\left[\begin{array}{ccc}1& \Delta t& \frac{\Delta {t}^{2}}{2}\\ 0& 1& \Delta t\\ 0& 0& 1\end{array}\right],$$
- the observation model of the surge motion:
  $$H=\left[\begin{array}{ccc}0& 1& 0\end{array}\right],$$
- the covariance of the process noise of the surge motion:
  $$Q=\left[\begin{array}{ccc}0.1& 0& 0\\ 0& 0.1& 0\\ 0& 0& 0.1\end{array}\right],$$
- the covariance of the observation noise of the surge motion: $R=25$;
- the observation model of the distance from the bottom:
  $$H=\left[\begin{array}{ccc}1& 0& 0\end{array}\right],$$
- the covariance of the process noise of the distance from the bottom:
  $$Q=\left[\begin{array}{ccc}1& 0& 0\\ 0& 1& 0\\ 0& 0& 1\end{array}\right],$$
- the covariance of the observation noise of the distance from the bottom: $R=125$.
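The matrices above define a constant-acceleration Kalman filter with a scalar measurement, so the innovation covariance is a scalar and no matrix inversion is needed. A minimal pure-Python sketch of the predict/update cycle; the sampling time $\Delta t$ and the initial state covariance are illustrative choices:

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def transpose(A):
    return [list(col) for col in zip(*A)]

class ScalarKalman3:
    """Three-state (position, velocity, acceleration) Kalman filter with a
    single scalar measurement, matching the A, H, Q, R listed above."""

    def __init__(self, A, H, Q, R, x0=(0.0, 0.0, 0.0)):
        self.A, self.H, self.Q, self.R = A, H, Q, R
        self.x = list(x0)
        self.P = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]

    def step(self, z):
        # Predict: x = A x, P = A P A^T + Q
        x = [sum(a * xi for a, xi in zip(row, self.x)) for row in self.A]
        P = mat_add(mat_mul(mat_mul(self.A, self.P), transpose(self.A)), self.Q)
        # Update: H selects one state, so S = H P H^T + R is a scalar
        hx = sum(h * xi for h, xi in zip(self.H, x))
        PHt = [sum(p * h for p, h in zip(row, self.H)) for row in P]
        S = sum(h * p for h, p in zip(self.H, PHt)) + self.R
        K = [p / S for p in PHt]                      # Kalman gain (3-vector)
        self.x = [xi + k * (z - hx) for xi, k in zip(x, K)]
        I_KH = [[(1.0 if i == j else 0.0) - K[i] * self.H[j]
                 for j in range(3)] for i in range(3)]
        self.P = mat_mul(I_KH, P)
        return self.x
```

With the surge parameters ($H=[0\;1\;0]$, $R=25$) the filter smooths the vision-based velocity measurement; with the bottom-distance parameters ($H=[1\;0\;0]$, $R=125$) it smooths the measured altitude.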

## 3. Results and Discussion

#### 3.1. Heading Control

#### 3.2. Surge Control

#### 3.3. Distance-from-the-Bottom Control

## 4. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest


**Figure 6.** Corresponding region of visibility for different distances from the bottom and vehicle velocities.

**Figure 15.** Heading control at a velocity of 0.1 m/s: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

**Figure 16.** Heading control at a velocity of 0.2 m/s: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

**Figure 17.** Surge control at variable speed: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

**Figure 18.** Surge control at a constant speed of 0.2 m/s: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

**Figure 19.** Distance-from-the-bottom control at a speed of 0.1 m/s: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

**Figure 20.** Distance-from-the-bottom control at a speed of 0.2 m/s: (**a**) stereo vision; (**b**) SLAM lasers; (**c**) SLAM stereo; (**d**) SLAM IMU.

| $\frac{de(t)}{dt}$ \ $e(t)$ | UD | US | Z | DS | DD |
|---|---|---|---|---|---|
| UD | UD | UD | US | US | Z |
| US | UD | UD | US | Z | DS |
| Z | US | US | Z | DS | DS |
| DS | US | Z | DS | DD | DD |
| DD | Z | DS | DS | DD | DD |
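The rule base above can be encoded as a direct table lookup. A sketch with the membership functions and the defuzzification stage omitted; UD, US, Z, DS and DD are the linguistic labels exactly as they appear in the table:

```python
# Columns are indexed by the error e(t), rows by its derivative de(t)/dt.
LABELS = ["UD", "US", "Z", "DS", "DD"]
RULES = {
    "UD": ["UD", "UD", "US", "US", "Z"],
    "US": ["UD", "UD", "US", "Z",  "DS"],
    "Z":  ["US", "US", "Z",  "DS", "DS"],
    "DS": ["US", "Z",  "DS", "DD", "DD"],
    "DD": ["Z",  "DS", "DS", "DD", "DD"],
}

def rule_output(de_label, e_label):
    """Return the consequent label for one fuzzy rule."""
    return RULES[de_label][LABELS.index(e_label)]
```

In the complete controller every rule fires to the degree given by the membership of $e(t)$ and $de(t)/dt$ in its antecedent labels, and the weighted consequents are defuzzified into a crisp control signal.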

| | Stereo Vision | SLAM Lasers | SLAM Stereo | SLAM IMU |
|---|---|---|---|---|
| u = 0.1 m/s, variable heading | 58,823 | 93,058 | 73,005 | 67,244 |
| u = 0.2 m/s, variable heading | 56,344 | 89,976 | 68,679 | 66,995 |
| u = 0.1 m/s, constant heading | 28,432 | 51,234 | 35,244 | 31,556 |
| u = 0.2 m/s, constant heading | 27,564 | 47,564 | 49,567 | 34,773 |

| | Stereo Vision | SLAM Lasers | SLAM Stereo | SLAM IMU |
|---|---|---|---|---|
| u = 0.1 m/s, variable course | 39.2 | 43.5 | 62.4 | 73.4 |
| u = 0.2 m/s, variable course | 31.1 | 35.2 | 44.6 | 54.2 |
| u = 0.1 m/s, constant course | 21.2 | 24.6 | 31.7 | 32.8 |
| u = 0.2 m/s, constant course | 18.9 | 21.3 | 25.3 | 25.3 |

| | Stereo Vision | SLAM Lasers | SLAM Stereo | SLAM IMU |
|---|---|---|---|---|
| u = 0.1 m/s, variable distance | 120.5 | 187.9 | 191.6 | 168.7 |
| u = 0.2 m/s, variable distance | 130.3 | 189.8 | 199.7 | 169.8 |
| u = 0.1 m/s, constant distance | 63.5 | 77.1 | 90.2 | 137.5 |
| u = 0.2 m/s, constant distance | 65.2 | 88.5 | 90.3 | 141.8 |

| | Stereo Vision | SLAM Lasers | SLAM Stereo | SLAM IMU |
|---|---|---|---|---|
| heading control, variable distance, variable speed | 57,268 | 90,098 | 68,728 | 66,870 |
| surge control, variable heading, variable distance | 34.1 | 41.2 | 46.2 | 71.8 |
| distance control, variable heading, variable speed | 1283.5 | 182.2 | 196.3 | 178.2 |

| | Mean Computational Cost |
|---|---|
| Stereo vision | 176 ms |
| SLAM lasers | 153 ms |
| SLAM stereo | 191 ms |
| SLAM IMU | 143 ms |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hożyń, S.; Żak, B.
Stereo Vision System for Vision-Based Control of Inspection-Class ROVs. *Remote Sens.* **2021**, *13*, 5075.
https://doi.org/10.3390/rs13245075

**AMA Style**

Hożyń S, Żak B.
Stereo Vision System for Vision-Based Control of Inspection-Class ROVs. *Remote Sensing*. 2021; 13(24):5075.
https://doi.org/10.3390/rs13245075

**Chicago/Turabian Style**

Hożyń, Stanisław, and Bogdan Żak.
2021. "Stereo Vision System for Vision-Based Control of Inspection-Class ROVs" *Remote Sensing* 13, no. 24: 5075.
https://doi.org/10.3390/rs13245075