Monocular Vision Guidance for Unmanned Surface Vehicle Recovery
Abstract
1. Introduction
- Designed an artificial marker consisting of a circular ring for close-range visual localization.
- Used the least squares method to obtain the pixel coordinates of the centers of the two circles within the ring, and corrected the non-coincidence of the projected centers caused by perspective transformation using cross-ratio invariance.
- Analyzed the influence of distance on angle measurement and the influence of viewing angle on distance measurement. Experimental comparison demonstrated that the proposed method is practical for close-range positioning and can be applied to recovery guidance for unmanned surface vehicles.
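The center-extraction step above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it fits a general conic to edge points of one projected ellipse by ordinary least squares (normalizing the constant term to 1, which assumes the conic does not pass through the image origin) and recovers the center as the point where the conic's gradient vanishes. The function name and the synthetic test points are assumptions for the sketch; the paper's cross-ratio correction of the two centers is a separate, subsequent step.

```python
import numpy as np

def fit_ellipse_center(pts):
    """Least-squares fit of a conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1
    to 2D points, then recovery of the ellipse center from the coefficients."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x**2, x * y, y**2, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
    a, b, c, d, e = coef
    # The center (xc, yc) is where both partial derivatives of the conic vanish:
    #   2a*xc + b*yc + d = 0
    #   b*xc + 2c*yc + e = 0
    M = np.array([[2 * a, b], [b, 2 * c]])
    xc, yc = np.linalg.solve(M, np.array([-d, -e]))
    return xc, yc

# Synthetic check: noise-free points on an ellipse centered at (3, -1).
t = np.linspace(0, 2 * np.pi, 50)
pts = np.column_stack([3 + 2 * np.cos(t), -1 + 1.5 * np.sin(t)])
center = fit_ellipse_center(pts)
```

With noisy real edge pixels the same linear system is solved in the least-squares sense, which is why the fitted centers of the inner and outer ellipses generally do not coincide under perspective and need the cross-ratio correction described in Section 2.3.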
2. Materials and Methods
2.1. Design of Artificial Markers
2.2. Detection of Artificial Markers and Extraction of Ellipse Centers
2.3. Correction of Circular Ring Center Points
2.4. Visual Localization of the USV
2.4.1. Coordinate System Conversion
2.4.2. Visual Localization Process
3. Experimental Results and Analysis
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Parameter | Setting |
|---|---|
| Camera model | MER-500-7UC |
| Image resolution (pixels) | 2592 × 1944 |
| Angle sensor | BWT61CL |
| Ranging sensor accuracy | 1 mm |
| CPU | Intel(R) Core(TM) i5-8250U (Intel, Santa Clara, CA, USA) |
| GPU | NVIDIA GeForce MX150 (NVIDIA, Santa Clara, CA, USA) |
| Actual Distance (mm) | Range Error at 0° (mm) | Range Error at 30° (mm) | Range Error at 60° (mm) |
|---|---|---|---|
| 1000 | 0 | 1 | 1 |
| 2000 | 1 | 1 | 2 |
| 3000 | 1 | 2 | 2 |
| 4000 | 2 | 2 | 3 |
| 5000 | 2 | 4 | 5 |
| 6000 | 10 | 16 | 18 |
| 7000 | 18 | 25 | 26 |
| 8000 | 25 | 30 | 35 |
| 9000 | 32 | 34 | 36 |
| 10,000 | 35 | 38 | 39 |
| Actual Angle (°) | Average Angle Error at 1000 mm (°) | Average Angle Error at 2000 mm (°) | Average Angle Error at 3000 mm (°) |
|---|---|---|---|
| −60 | −2 | 2 | −2 |
| −40 | 2 | 2 | 2 |
| −20 | 2 | −3 | 4 |
| 0 | 0 | 1 | −1 |
| 20 | 1 | 2 | 2 |
| 40 | −2 | 1 | −3 |
| 60 | 4 | 2 | 1 |
Share and Cite
Li, Z.; Xi, Q.; Shi, Z.; Wang, Q. Monocular Vision Guidance for Unmanned Surface Vehicle Recovery. Appl. Sci. 2024, 14, 5104. https://doi.org/10.3390/app14125104