Abstract
In response to the inefficiencies and high costs associated with manual buoy inspection, this paper presents the design and testing of an autonomously navigating unmanned surface vehicle (USV) tailored for this purpose. The research is structured into three main components. First, the hardware framework and communication system of the USV are detailed, incorporating the Robot Operating System (ROS) and additional nodes to meet practical requirements, and a buoy tracking system utilizing the Kernelized Correlation Filter (KCF) algorithm is introduced. Second, buoy image training is conducted using the YOLOv7 object detection algorithm, establishing a robust model for accurate buoy state recognition. Finally, an improved line-of-sight (LOS) method for USV path tracking, which assumes an attraction potential field around the inspected buoy, is proposed to enable a comprehensive 360-degree inspection. Experimental testing includes validation of buoy image target tracking and detection, assessment of the USV's autonomous navigation and obstacle avoidance capabilities, and evaluation of the enhanced LOS path tracking algorithm. The results demonstrate the USV's efficacy in conducting practical buoy inspection missions. This research contributes insights and advancements to the fields of maritime patrol and routine buoy inspection.
1. Introduction
A buoy serves as a critical component in maritime navigation, delineating navigational channels, highlighting hazardous areas, and providing essential positional references []. Consequently, regular inspection and maintenance of buoys constitute imperative measures to ensure maritime safety and facilitate smooth maritime traffic flow. Presently, the predominant approach to buoy inspection relies on manual methods, encompassing assessments for color fading, physical damage, inclination deviations, and adherence to predefined spatial ranges. Nonetheless, conventional manual inspections entail substantial human resources, material investments, and financial allocations, while suffering from inefficiencies, prolonged processes, and sluggish response times [,,].
To address the shortcomings of manual buoy inspections, Li [] has developed a novel maritime buoy detection system leveraging the advantages of unmanned aerial vehicles (UAVs), including low cost, rapid response, and high flexibility, for monitoring coastal buoys. Additionally, a feasible path planning approach based on Convolutional Hyper Neural Networks (CHNNs) and genetic algorithms has been devised to obtain the shortest-distance trajectory for accessing each buoy once. Nevertheless, UAVs often encounter challenges in endurance when conducting prolonged, long-range inspection tasks, posing significant demands on their battery life, which is typically limited in current inspection UAVs [,]. Furthermore, UAV operations for buoy inspections are susceptible to interference from strong winds, thereby introducing new challenges for precise control and disturbance resilience of UAVs during inspections.
Intelligent buoys, by means of retrofitting conventional buoys with additional sensors such as GPS, enable the real-time monitoring of the buoy’s position and pertinent conditions of the surrounding water bodies [,]. Furthermore, some intelligent buoys, equipped with controllers, have the capability to adjust their routes and positions, thereby achieving the dual objectives of real-time monitoring and manipulation []. Despite the capacity of these intelligent buoys to provide real-time localization and self-diagnosis of damage, they are still unable to offer immediate feedback regarding their outward appearance. Moreover, the development of intelligent buoys necessitates substantial investment costs, thus hindering widespread adoption in many regions [].
Unmanned surface vehicles (USVs), emerging as novel waterborne unmanned platforms, possess the capabilities for long-range, prolonged operations and can be outfitted with a variety of sensors onboard to fulfill specific tasks, thus finding extensive applications in recent years in fields such as marine exploration, environmental monitoring, and patrol inspections [,]. Xiong et al. [] devised a USV tailored for water quality assessment and investigated its trajectory control algorithm. Sotelo-Torres et al. [] elaborated on the development of a cost-effective and practical USV for lake depth measurement. This system integrates an autonomous navigation framework, environmental sensors, and multibeam sonar to collect data on underwater terrain, temperature, and wind speed. Cheng et al. [] amalgamated USVs with target detection algorithms for surface target identification, establishing a comprehensive USV-based surface target recognition system.
Compared with UAVs, USVs offer extended endurance, rendering them suitable for prolonged, wide-ranging maritime inspection tasks []. Leveraging USVs for buoy inspection alleviates constraints on manpower and resources. USVs enable proactive observation of buoy conditions, allowing personnel to prepare targeted maintenance protocols and thereby significantly enhancing inspection efficiency. This approach also mitigates the risks posed to personnel by direct exposure to adverse sea conditions, improving the safety of task execution. Furthermore, owing to their high maneuverability, USVs can swiftly respond to unforeseen events such as buoy malfunctions or displacements, promptly conducting on-site inspection and intervention [].
Based on the foregoing analysis, this paper presents the design of an unmanned surface vehicle tailored for routine buoy inspection tasks, with both simulation and field trials conducted. Reference [] describes a UAV reaching the specified buoy position for inspection, but the subsequent inspection procedure remains unexplored; this study offers a more comprehensive scheme for buoy inspection. The primary contributions are as follows: (1) To achieve autonomous navigation and obstacle avoidance, the paper integrates the Robot Operating System (ROS) into the USV and supplements relevant topics and nodes based on inspection requirements. (2) To conduct automated detection of buoy conditions, a substantial dataset of buoy images is collected and subjected to image enhancement, and the YOLOv7 object detection algorithm is trained on it for buoy status determination. (3) To achieve comprehensive inspection of buoys, a buoy tracking system based on the Kernelized Correlation Filter (KCF) algorithm is developed, coupled with enhancements to the traditional line-of-sight (LOS) USV path tracking algorithm. Together, they keep the buoy continuously centered within the inspection frame and enable comprehensive, all-angle buoy inspection.
3. Buoy Target Detection Algorithm Based on YOLOv7
In order to achieve the automated detection of buoy conditions, the integration of target detection algorithms with target tracking algorithms is imperative. Current target detection technologies can be categorized into two primary types: those based on traditional algorithms and those based on deep learning methodologies. Within the realm of deep learning, the rapidly evolving algorithms primarily encompass the two-stage R-CNN series and the one-stage YOLO (You Only Look Once) [] and SSD (Single Shot MultiBox Detector) series []. Specifically, the R-CNN series (including RCNN, Fast-RCNN, Faster-RCNN, Mask-RCNN, Cascade-RCNN, etc.) and R-FCN, as two-stage methods, initially conduct a coarse localization of candidate regions followed by precise classification. While this approach enhances the detection accuracy, its slower processing speed may render it less suitable for applications requiring real-time responses. In contrast, the YOLO series algorithms represent one-stage detection methods capable of directly predicting the location and category of objects through a single forward pass. This not only significantly improves the processing speed to enable real-time detection but also effectively utilizes global information to enhance detection accuracy []. These convolutional neural network (CNN)-based algorithms have been widely applied in multiple domains, including ocean exploration [], environmental monitoring, and equipment maintenance []. However, the research literature related to the detection and classification of buoy images remains relatively scarce []. Early research efforts attempted to develop a fine-grained navigational marker classification model based on ResNet, dubbed ResNet-Multiscale-Attention (RMA), yet experimental results indicated the need for improved accuracy in complex real-world settings.
The architecture diagram for buoy detection based on YOLOv7, as shown in Figure 9 and Figure 10, comprises four main components: the Input layer, Backbone network, Head network, and Prediction layer.
Figure 9.
YOLOv7 target detection algorithm framework diagram 1.
Figure 10.
YOLOv7 target detection algorithm framework diagram 2.
The input module of YOLOv7 adopts an adaptive anchor box calculation method, coupled with data concatenation and mixed augmentation techniques, to meet the requirement of 640 × 640 input images. The backbone network module comprises CBS, ELAN, and MP1 convolutional structures, tasked with extracting fundamental features, expanding receptive fields, and enhancing feature extraction capabilities, respectively. The Neck module integrates CBS, SPPCSPC, MP, and ELAN structures, extracting and merging feature information from different layers of the backbone network to enhance model performance.
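As a concrete illustration of the CBS structure mentioned above, the following is a minimal PyTorch sketch of the Conv-BatchNorm-SiLU block commonly used in YOLOv7 implementations; the channel counts and stride are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class CBS(nn.Module):
    """Conv + BatchNorm + SiLU, the basic building block of the YOLOv7 backbone."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        padding = kernel_size // 2          # "same" padding for odd kernels
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

# Example: a 640 x 640 RGB input, as required by the YOLOv7 input module
x = torch.randn(1, 3, 640, 640)
stem = CBS(3, 32, kernel_size=3, stride=2)  # downsamples to 320 x 320
print(stem(x).shape)                        # torch.Size([1, 32, 320, 320])
```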
The prediction module of YOLOv7 introduces a RepConv module featuring a reparameterization structure []. During training, the module is split into multiple branches, which may be identical or distinct; after training, these branches are re-merged into equivalent convolutional layers. The multi-branch training model is thus transformed into a fast single-branch inference model, reducing inference-time network complexity (at the cost of a longer training duration) while maintaining predictive performance, thereby improving inference results [].
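To make the reparameterization idea concrete, here is a minimal PyTorch sketch that trains with a 3 × 3 branch plus a 1 × 1 branch and then folds them into a single 3 × 3 convolution for inference. It is a simplification of RepConv (the BatchNorm and identity branches are omitted), not the exact YOLOv7 implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RepConvSketch(nn.Module):
    """Multi-branch (3x3 + 1x1) during training; fused single 3x3 conv for inference."""
    def __init__(self, ch):
        super().__init__()
        self.conv3 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv1 = nn.Conv2d(ch, ch, 1)
        self.fused = None

    def forward(self, x):
        if self.fused is not None:              # fast single-branch inference path
            return F.relu(self.fused(x))
        return F.relu(self.conv3(x) + self.conv1(x))

    def reparameterize(self):
        """Fold the 1x1 branch into the 3x3 kernel so inference uses one convolution."""
        fused = nn.Conv2d(self.conv3.in_channels, self.conv3.out_channels, 3, padding=1)
        kernel = self.conv3.weight.data.clone()
        kernel += F.pad(self.conv1.weight.data, [1, 1, 1, 1])  # place 1x1 at the kernel centre
        fused.weight.data = kernel
        fused.bias.data = self.conv3.bias.data + self.conv1.bias.data
        self.fused = fused

m = RepConvSketch(16)
x = torch.randn(1, 16, 64, 64)
y_train = m(x)                  # multi-branch output
m.reparameterize()
y_infer = m(x)                  # single-branch output after fusion
print(torch.allclose(y_train, y_infer, atol=1e-5))  # True: same result, fewer branches
```

The printed `True` confirms that the fused single-branch module reproduces the multi-branch output, which is the property that allows the faster inference path.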
Increasing the quantity and quality of the training set significantly improves the confidence and effectiveness of the target detection algorithm. To improve the accuracy of buoy target detection, the collected buoy images underwent data augmentation. Figure 11 shows examples of the navigation mark images we collected. The dataset was divided proportionally into training, test, and validation sets, and various data augmentation techniques were applied to the training set images to strengthen generalization and mitigate overfitting in the detection model.
Figure 11.
Collected buoy images.
Our experiments mainly use rotation, translation, flipping, brightness adjustment, and additive noise to augment the data. Following this suite of data augmentation processes, the augmented training set ultimately comprised 10,368 buoy images: 1296 original images and 9072 images generated through data augmentation.
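The five augmentations listed above could be implemented as in the following OpenCV/NumPy sketch; the rotation angles, translation fractions, brightness offsets, noise level, and file path are illustrative choices, not the values used in the paper.

```python
import cv2
import numpy as np

def augment(img, rng=np.random.default_rng()):
    """Return five augmented variants of one buoy image (ranges are illustrative)."""
    h, w = img.shape[:2]
    out = []

    # Rotation about the image centre
    angle = rng.uniform(-15, 15)
    M_rot = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    out.append(cv2.warpAffine(img, M_rot, (w, h)))

    # Translation by up to 10% of the image size
    tx, ty = rng.uniform(-0.1, 0.1, size=2) * (w, h)
    M_shift = np.float32([[1, 0, tx], [0, 1, ty]])
    out.append(cv2.warpAffine(img, M_shift, (w, h)))

    # Horizontal flip
    out.append(cv2.flip(img, 1))

    # Brightness adjustment
    out.append(cv2.convertScaleAbs(img, alpha=1.0, beta=rng.uniform(-40, 40)))

    # Additive Gaussian noise
    noise = rng.normal(0, 10, img.shape)
    out.append(np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8))

    return out

img = cv2.imread("buoy.jpg")       # illustrative path
augmented = augment(img)           # five augmented variants of one image
```

In practice the bounding-box annotations must be transformed consistently with the geometric augmentations (rotation, translation, and flipping), otherwise the labels no longer match the images.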
5. Experiments and Discussion
5.1. Buoy Image Tracking Experiment
After the tracking system is powered up, its performance is evaluated using buoy images; the results are shown in Figure 16 below. As the buoy image moves, the system demonstrates proficient tracking, consistently keeping the buoy at the center of the camera frame and thus achieving the intended tracking objective.
Figure 16.
(a–d) Buoy image tracking experiment.
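The behavior described above, keeping the tracked buoy centered in the frame, can be sketched with OpenCV's KCF tracker driving a proportional pan correction. This is an illustrative reconstruction rather than the paper's implementation: the gain, camera source, and gimbal interface (`gimbal.set_pan_rate`) are hypothetical placeholders.

```python
import cv2

cap = cv2.VideoCapture(0)                      # onboard camera stream (illustrative)
ok, frame = cap.read()
bbox = cv2.selectROI("init", frame)            # initial buoy bounding box
tracker = cv2.TrackerKCF_create()              # needs opencv-contrib; some versions expose
tracker.init(frame, bbox)                      # it as cv2.legacy.TrackerKCF_create()

KP = 0.05                                      # proportional gain (illustrative)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    if found:
        # Horizontal pixel error between the buoy centre and the frame centre
        err_x = (x + w / 2) - frame.shape[1] / 2
        pan_correction = -KP * err_x
        # gimbal.set_pan_rate(pan_correction)  # hypothetical single-axis gimbal interface
```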
5.2. Buoy Detection Algorithm Based on YOLOv7 Experiment
Building upon the analysis presented in Section 3, a series of tests was conducted on the buoy target detection model. The model was trained for 300 epochs, a duration chosen to ensure thorough learning of buoy characteristics and thus higher detection accuracy. Because the number of images differs across the navigational mark types in the dataset, the loss function was specially designed, as expressed by Equation (26). It addresses data imbalance by weighting the contributions of the different image types, so that even types represented by few images are learned effectively. This improves overall detection accuracy and strengthens the model's generalization capability.
Within this formulation, a weight factor, pre-calculated from the dataset, modulates the contribution of each sample type to the loss function; sample types with fewer images receive larger weights, compensating for the effects of data imbalance. The true label of a sample takes the value 0 or 1, indicating whether the sample belongs to the positive class, and the network output enters the loss as the log-odds of the predicted probability. The loss function thus emphasizes accurate predictions for minority classes, ensuring a more equitable treatment of all categories during training and improving detection performance on underrepresented classes.
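Since Equation (26) is not reproduced here, the following PyTorch sketch shows one plausible form of such a class-weighted cross-entropy on log-odds, with weights inversely proportional to class frequency; the exact weighting used in the paper may differ.

```python
import torch

def weighted_bce_with_logits(logits, targets, class_counts):
    """Class-weighted BCE on logits: rarer buoy classes receive larger weights.
    The exact weighting of Equation (26) may differ; this only illustrates the idea."""
    counts = torch.as_tensor(class_counts, dtype=torch.float32)
    weights = counts.sum() / (len(counts) * counts)          # inverse-frequency weights
    probs = torch.sigmoid(logits)
    per_class = -(targets * torch.log(probs + 1e-7) +
                  (1 - targets) * torch.log(1 - probs + 1e-7))
    return (weights * per_class).mean()

# Example: three navigational-mark classes with imbalanced image counts (illustrative)
logits = torch.randn(8, 3)                       # raw scores (log-odds) for a batch of 8
targets = torch.randint(0, 2, (8, 3)).float()    # 0/1 labels per class
loss = weighted_bce_with_logits(logits, targets, class_counts=[900, 250, 146])
print(loss.item())
```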
To facilitate a faster convergence to the optimal solution, Stochastic Gradient Descent (SGD) with a momentum term was selected as the optimization algorithm []. The inclusion of the momentum term helps to mitigate fluctuations in gradient updates to some extent, thereby stabilizing the training process and accelerating convergence. To assess model performance and conduct comparative analysis, experiments were not only conducted on the original dataset but also on a dataset subjected to contour enhancement. The contour-enhanced dataset aims to improve the model’s ability to recognize targets by enhancing the contour information of objects in the images. The experimental results are illustrated in Figure 17a,b (see below), which depict the loss and accuracy curves of the model on these two datasets.
Figure 17.
(a,b) Model loss and accuracy variation curves.
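The optimizer configuration described above could look like the following PyTorch sketch; the learning rate, momentum, and weight decay values are illustrative, and the small linear model stands in for the detection network.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)                     # stand-in for the YOLOv7 network
optimizer = torch.optim.SGD(model.parameters(),
                            lr=0.01,         # illustrative values, not the paper's
                            momentum=0.937,  # momentum term damps gradient oscillations
                            weight_decay=5e-4)

x, y = torch.randn(16, 10), torch.randn(16, 2)
for _ in range(5):                           # a few training steps
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```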
To further verify the effectiveness of the algorithm presented in this paper, we compared it with several common algorithms; their average accuracy and recognition speed are shown in Table 3. As the table shows, the YOLOv7 algorithm slightly outperforms the other algorithms in both recognition accuracy and speed.
Table 3.
Comparison with common object detection algorithms.
The final recognition outcomes are illustrated in the figures below. Figure 18 shows a buoy under normal conditions, detected with a high confidence level, indicating that the target detection algorithm can accurately identify the buoy in its normal state.
Figure 18.
Normal buoy detection results.
Figure 19 presents the detection results for buoys under abnormal conditions. It is observed that when buoys are tilted or damaged, they are identified by the target detection algorithm as anomalies. The outcomes of the detection align with the actual state of the buoys, demonstrating the algorithm’s applicability in real-world scenarios.
Figure 19.
Abnormal buoy detection results.
However, because the training dataset cannot encompass all real-world scenarios, detection in complex aquatic environments may occasionally yield low-confidence results, as shown in Figure 20a,b, missed detections (Figure 20c), or false positives (Figure 20d). Testing on the test dataset demonstrates that the detection results for the majority of navigation marks are nevertheless satisfactory.
Figure 20.
(a,b) Low confidence; (c) missed detection; (d) false positive.
5.3. Buoy Circumnavigation Control Algorithm Experiment
As analyzed in Section 4, the USV's buoy circumnavigation control algorithm was simulated in Matlab 2020a. The parameters of Equation (5) used in the simulation are listed in Table 4:
Table 4.
Simulated hydrodynamic parameters.
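For readers without access to the Matlab setup, the following Python sketch shows how a simplified 3-DOF (surge-sway-yaw) USV model of the kind behind Equation (5) can be integrated. The inertia and damping values are placeholders loosely based on the CyberShip II model cited in the references, not the values of Table 4.

```python
import numpy as np

# Placeholder hydrodynamic parameters (Table 4 values are not reproduced here)
m11, m22, m33 = 25.8, 33.8, 2.76      # inertia terms including added mass
d11, d22, d33 = 12.0, 17.0, 0.5       # linear damping terms

def step(state, tau, dt=0.05):
    """One Euler step of a simplified 3-DOF (surge, sway, yaw) USV model."""
    x, y, psi, u, v, r = state
    tau_u, tau_r = tau                                 # surge force, yaw moment
    # Body-frame dynamics with Coriolis coupling and linear damping
    du = (tau_u + m22 * v * r - d11 * u) / m11
    dv = (-m11 * u * r - d22 * v) / m22
    dr = (tau_r + (m11 - m22) * u * v - d33 * r) / m33
    # Kinematics: rotate body velocities into the earth frame
    dx = u * np.cos(psi) - v * np.sin(psi)
    dy = u * np.sin(psi) + v * np.cos(psi)
    return np.array([x + dx * dt, y + dy * dt, psi + r * dt,
                     u + du * dt, v + dv * dt, r + dr * dt])

state = np.zeros(6)
for _ in range(200):                                   # 10 s of simulated motion
    state = step(state, tau=(10.0, 1.0))
print(state[:3])                                       # final position and heading
```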
Based on the desired yaw angle defined in Equation (25) for the buoy circumnavigation control algorithm, simulation tests were conducted to evaluate how different values of the two weighting coefficients affect the control algorithm. The weighting coefficients, the USV's forward-looking distance, and the lateral drift speed for each scenario are presented in Table 5:
Table 5.
Simulation parameter settings.
The path tracking performance and errors of the USV’s buoy circumnavigation motion control algorithm under the aforementioned five scenarios are shown in Figure 21a,b.
Figure 21.
(a) Path tracking performance; (b) lateral error.
Based on the simulation results, it is evident that the best path tracking performance with the smallest lateral error is obtained in the fourth scenario. An analysis of the other scenarios reveals that as the weighting coefficient increases, i.e., as the LOS-based heading carries more weight in determining the final desired yaw angle, the actual path of the unmanned surface vehicle tends to move closer to the buoy, resulting in larger errors. In the fifth scenario, the final desired yaw angle of the USV is determined entirely by the LOS-based heading, corresponding to the traditional line-of-sight (LOS) guidance algorithm. A comparison between the fourth and fifth scenarios indicates that the modified LOS guidance algorithm outperforms the traditional LOS algorithm for buoy circumnavigation path tracking, yielding a smaller lateral error. The changes in course angle and vessel speed across the five scenarios are shown in Figure 22a,b.
Figure 22.
(a) USV heading angle; (b) USV speed.
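Equation (25) is not reproduced in this section, so the following Python sketch is only a plausible reconstruction of the blended guidance law: the desired yaw is taken as a weighted combination of the classic LOS heading toward a look-ahead point on the inspection circle and an attraction-field heading toward the buoy, with `k = 1` recovering traditional LOS guidance. The function and parameter names are assumptions for illustration.

```python
import numpy as np

def desired_yaw(usv_pos, buoy_pos, radius, delta, k):
    """Blend the classic LOS heading with an attraction-field heading toward the buoy.
    k in [0, 1] weights the LOS term; k = 1 reduces to traditional LOS guidance."""
    rel = usv_pos - buoy_pos
    theta = np.arctan2(rel[1], rel[0])                  # angular position on the circle
    theta_ahead = theta + delta / radius                # look-ahead point further along the circle
    target = buoy_pos + radius * np.array([np.cos(theta_ahead), np.sin(theta_ahead)])
    d = target - usv_pos
    psi_los = np.arctan2(d[1], d[0])                    # classic LOS heading
    b = buoy_pos - usv_pos
    psi_att = np.arctan2(b[1], b[0])                    # attraction-field heading toward the buoy
    # Weighted combination; valid as long as the two headings do not straddle the +/-pi cut
    psi_d = k * psi_los + (1.0 - k) * psi_att
    return np.arctan2(np.sin(psi_d), np.cos(psi_d))     # wrap to [-pi, pi]

# USV 8 m west of the buoy, following a 5 m inspection circle (illustrative values)
psi = desired_yaw(np.array([-8.0, 0.0]), np.array([0.0, 0.0]),
                  radius=5.0, delta=3.0, k=0.7)
print(np.degrees(psi))
```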
5.4. USV Field Test Experiment
To validate the practical navigation and obstacle avoidance efficacy of the USV, field tests were conducted at Day Moon Lake on the Jinan University Zhuhai Campus, as depicted in Figure 23.
Figure 23.
Field test.
Initially, a straight-line cruising test of the USV was conducted. Waypoints were set at the ground station, and the vehicle autonomously cruised to these points before returning. The image transmitted by the fixed front-facing camera was clear and stable, as shown in Figure 24.
Figure 24.
(a) USV begins straight-line cruising; (b) USV returns to base.
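The waypoint navigation described above runs on the ROS navigation framework; a common way to issue such waypoint goals is through the `move_base` action interface, as in the hedged rospy sketch below. The frame name and waypoint coordinates are illustrative assumptions, not the configuration used on the vehicle.

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def goto(x, y, frame="map"):
    """Send one waypoint to the navigation stack and wait for the result."""
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0       # heading left to the local planner
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == "__main__":
    rospy.init_node("buoy_waypoint_cruise")
    for wx, wy in [(10.0, 0.0), (10.0, 10.0), (0.0, 0.0)]:   # out-and-back waypoints (illustrative)
        goto(wx, wy)
```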
Following the straight-line cruising tests, obstacle avoidance tests were conducted, as shown in Figure 25. The USV was initially positioned within an obstacle-laden environment, with the navigation endpoint set outside the obstacle region, in order to observe its obstacle avoidance performance. The results indicate that the USV effectively detected the obstacles and successfully navigated around them, reaching the target endpoint.
Figure 25.
(a) USV obstacle avoidance; (b) USV reached the target point.
Because surface disturbances on the lake were minimal, circumnavigation tests were conducted at a coastal location in Zhuhai to validate the performance of the buoy circumnavigation control algorithm under real-world conditions. The point at latitude 22.292655 and longitude 113.578515 was designated as the circumnavigation point, and 12 waypoints were then configured manually. Following waypoint configuration, the USV commenced circumnavigation maneuvers around the designated point under the control of the buoy circumnavigation control algorithm, as shown in Figure 26, where the yellow dot marks the circumnavigation point and the yellow line traces the tracking path.
Figure 26.
(a) USV begins circumnavigation; (b) USV completes circumnavigation.
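The 12 waypoints were configured manually in this test; for repeatability they could equally be generated programmatically on a circle around the circumnavigation point, as in the following sketch. The 15 m radius and the flat-earth degree-to-metre conversion are illustrative assumptions.

```python
import math

def circle_waypoints(lat0, lon0, radius_m=15.0, n=12):
    """Evenly spaced waypoints on a circle around (lat0, lon0).
    Uses a flat-earth approximation, adequate for a radius of tens of metres."""
    deg_per_m_lat = 1.0 / 111320.0
    deg_per_m_lon = 1.0 / (111320.0 * math.cos(math.radians(lat0)))
    points = []
    for i in range(n):
        ang = 2.0 * math.pi * i / n
        dlat = radius_m * math.sin(ang) * deg_per_m_lat
        dlon = radius_m * math.cos(ang) * deg_per_m_lon
        points.append((lat0 + dlat, lon0 + dlon))
    return points

# Circumnavigation point used in the field test
for lat, lon in circle_waypoints(22.292655, 113.578515):
    print(f"{lat:.6f}, {lon:.6f}")
```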
The coordinates of the USV at each waypoint were exported and used to plot the path tracking and error curves, as shown in Figure 27.
Figure 27.
(a) Path tracking curve; (b) path tracking error.
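The paper does not spell out how the exported coordinates were converted into the error curve of Figure 27b; one plausible definition, sketched below, is the radial deviation of each logged position from the desired inspection circle. The center, radius, and sample fixes are illustrative.

```python
import math

def radial_errors(track, center, radius_m):
    """Deviation of each logged fix from the desired inspection circle, in metres.
    'track' is a list of (lat, lon) fixes; flat-earth approximation as in the waypoint sketch."""
    lat0, lon0 = center
    m_per_deg_lat = 111320.0
    m_per_deg_lon = 111320.0 * math.cos(math.radians(lat0))
    errors = []
    for lat, lon in track:
        dx = (lon - lon0) * m_per_deg_lon
        dy = (lat - lat0) * m_per_deg_lat
        errors.append(math.hypot(dx, dy) - radius_m)
    return errors

center = (22.292655, 113.578515)
track = [(22.292700, 113.578660), (22.292790, 113.578520)]   # illustrative logged fixes
print([round(e, 2) for e in radial_errors(track, center, radius_m=15.0)])
```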
The results indicate that the circumnavigation control algorithm exhibits certain errors, which can be attributed to the disturbances caused by coastal waves on the USV. Overall, the circumnavigation algorithm still demonstrates satisfactory tracking performance.
6. Conclusions
To mitigate the low efficiency prevalent in manual buoy inspection processes, this study proposes the design of an autonomous unmanned surface vehicle (USV) tailored for buoy inspection tasks, followed by comprehensive field trials encompassing navigation, buoy target tracking, and detection. The findings indicate that, supported by the Robot Operating System (ROS) navigation framework, the USV autonomously navigates and maneuvers through obstacles, while the YOLOv7 buoy target detection algorithm exhibits excellent performance in buoy identification, even under challenging conditions such as tilt and damage, ensuring high-precision buoy detection. Moreover, in conjunction with the Kernelized Correlation Filter (KCF)-based target tracking gimbal and an enhanced line-of-sight (LOS) path control algorithm, the USV achieves comprehensive circumnavigation of buoys for thorough inspection, thus presenting a viable direction towards the automation and intelligence enhancement of traditional manual buoy inspection practices.
However, in real-world maritime environments, the USV is subject to disturbances, with significant hull motion affecting the efficacy of target detection algorithms. Furthermore, substantial wave disturbances can impede the performance of the target tracking gimbal, highlighting deficiencies in one-dimensional gimbal systems. Future endeavors will entail field trials of the USV in maritime channels and the augmentation of the dimensionality of target tracking gimbals, alongside optimizations of target detection algorithms to address these challenges.
Author Contributions
Data curation, J.W. and C.L.; Formal analysis, C.L.; Investigation, W.L. and Z.Z.; Project administration, Z.L. and W.L.; Validation, Z.L., J.W. and Z.Z.; Visualization, Z.Z.; Writing—original draft, Z.L. and C.L.; Writing—review and editing, W.L. and X.Z. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by Special Funds for the Cultivation of Guangdong College Student’s Scientific and Technological Innovation (pdjh2024a050), Guangdong Innovation and Entrepreneurship Training Program for Undergraduate (S202310559038).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Data are contained within the article.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Hasbullah, M.I.; Osnin, N.A.; Mohd Salleh, N.H. A Systematic Review and Meta-Analysis on the Development of Aids to Navigation. Aust. J. Marit. Ocean Aff. 2023, 15, 247–267. [Google Scholar] [CrossRef]
- Turner, N.M. Traditional Aids to Navigation: The next 25 Years. J. Navig. 1997, 50, 234–241. [Google Scholar] [CrossRef]
- MahmoudZadeh, S.; Yazdani, A. A Cooperative Fault-Tolerant Mission Planner System for Unmanned Surface Vehicles in Ocean Sensor Network Monitoring and Inspection. IEEE Trans. Veh. Technol. 2023, 72, 1101–1115. [Google Scholar] [CrossRef]
- Li, B.; Gao, S.; Li, C.; Wan, H. Maritime Buoyage Inspection System Based on an Unmanned Aerial Vehicle and Active Disturbance Rejection Control. IEEE Access 2021, 9, 22883–22893. [Google Scholar] [CrossRef]
- Trasviña-Moreno, C.A.; Blasco, R.; Marco, Á.; Casas, R.; Trasviña-Castro, A. Unmanned Aerial Vehicle Based Wireless Sensor Network for Marine-Coastal Environment Monitoring. Sensors 2017, 17, 460. [Google Scholar] [CrossRef]
- Lomax, A.S.; Corso, W.; Etro, J.F. Employing Unmanned Aerial Vehicles (UAVs) as an Element of the Integrated Ocean Observing System. In Proceedings of OCEANS 2005 MTS/IEEE, Washington, DC, USA, 17–23 September 2006. [Google Scholar]
- Jin, J.-Y.; Dae Do, J.; Park, J.-S.; Park, J.S.; Lee, B.; Hong, S.-D.; Moon, S.-J.; Hwang, K.C.; Chang, Y.S. Intelligent Buoy System (INBUS): Automatic Lifting Observation System for Macrotidal Coastal Waters. Front. Mar. Sci. 2021, 8, 668673. [Google Scholar] [CrossRef]
- Xin, M.; Yang, F.; Liu, H.; Shi, B.; Zhang, K.; Zhai, M. Single-Difference Dynamic Positioning Method for GNSS-Acoustic Intelligent Buoys Systems. J. Navig. 2020, 73, 646–657. [Google Scholar] [CrossRef]
- Zhang, D.; Ashraf, M.A.; Liu, Z.; Peng, W.-X.; Golkar, M.J.; Mosavi, A. Dynamic Modeling and Adaptive Controlling in GPS-Intelligent Buoy (GIB) Systems Based on Neural-Fuzzy Networks. Ad Hoc. Netw. 2020, 103, 102149. [Google Scholar] [CrossRef]
- Yuan, S.; Li, Y.; Bao, F.; Xu, H.; Yang, Y.; Yan, Q.; Zhong, S.; Yin, H.; Xu, J.; Huang, Z.; et al. Marine Environmental Monitoring with Unmanned Vehicle Platforms: Present Applications and Future Prospects. Sci. Total Environ. 2023, 858, 159741. [Google Scholar] [CrossRef]
- de Sousa, J.B.; Andrade Gonçalves, G. Unmanned Vehicles for Environmental Data Collection. Clean Technol. Environ. Policy 2011, 13, 369–380. [Google Scholar] [CrossRef]
- Xiong, Y.; Zhu, H.; Pan, L.; Wang, J. Research on Intelligent Trajectory Control Method of Water Quality Testing Unmanned Surface Vessel. J. Mar. Sci. Eng. 2022, 10, 1252. [Google Scholar] [CrossRef]
- Sotelo-Torres, F.; Alvarez, L.V.; Roberts, R.C. An Unmanned Surface Vehicle (USV): Development of an Autonomous Boat with a Sensor Integration System for Bathymetric Surveys. Sensors 2023, 23, 4420. [Google Scholar] [CrossRef] [PubMed]
- Cheng, L.; Deng, B.; Yang, Y.; Lyu, J.; Zhao, J.; Zhou, K.; Yang, C.; Wang, L.; Yang, S.; He, Y. Water Target Recognition Method and Application for Unmanned Surface Vessels. IEEE Access 2022, 10, 421–434. [Google Scholar] [CrossRef]
- Kim, Y.; Ryou, J. A Study of Sonar Image Stabilization of Unmanned Surface Vehicle Based on Motion Sensor for Inspection of Underwater Infrastructure. Remote Sens. 2020, 12, 3481. [Google Scholar] [CrossRef]
- Zhou, Y.; Yang, W.; Shen, Y. Scale-Adaptive KCF Mixed with Deep Feature for Pedestrian Tracking. Electronics 2021, 10, 536. [Google Scholar] [CrossRef]
- Liu, K.; Tang, H.; He, S.; Yu, Q.; Xiong, Y.; Wang, N. Performance Validation of Yolo Variants for Object Detection. In Proceedings of the 2021 International Conference on Bioinformatics and Intelligent Computing, Harbin, China, 22–24 January 2021; ACM: New York, NY, USA, 2021. [Google Scholar]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the Computer Vision—ECCV 2016, Amsterdam, The Netherlands, 11–14 October 2016; Springer International Publishing: Cham, Switzerland, 2016; pp. 21–37. [Google Scholar]
- Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023. [Google Scholar]
- Zhou, X.; Ding, W.; Jin, W. Microwave-Assisted Extraction of Lipids, Carotenoids, and Other Compounds from Marine Resources. In Innovative and Emerging Technologies in the Bio-Marine Food Sector; Elsevier: Amsterdam, The Netherlands, 2022; pp. 375–394. [Google Scholar]
- Liu, Y.; Anderlini, E.; Wang, S.; Ma, S.; Ding, Z. Ocean Explorations Using Autonomy: Technologies, Strategies and Applications. In Offshore Robotics; Springer: Singapore, 2022; pp. 35–58. [Google Scholar]
- Pan, M.; Liu, Y.; Cao, J.; Li, Y.; Li, C.; Chen, C.-H. Visual Recognition Based on Deep Learning for Navigation Mark Classification. IEEE Access 2020, 8, 32767–32775. [Google Scholar] [CrossRef]
- Ding, X.; Hao, T.; Tan, J.; Liu, J.; Han, J.; Guo, Y.; Ding, G. ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting. In Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 11–17 October 2021. [Google Scholar]
- McCue, L. Handbook of Marine Craft Hydrodynamics and Motion Control [Bookshelf]. IEEE Control Syst. 2016, 36, 78–79. [Google Scholar]
- Skjetne, R.; Smogeli, Ø.; Fossen, T.I. Modeling, Identification, and Adaptive Maneuvering of CyberShip II: A Complete Design with Experiments. IFAC Proc. Vol. 2004, 37, 203–208. [Google Scholar] [CrossRef]
- Ding, X.; Zhang, X.; Ma, N.; Han, J.; Ding, G.; Sun, J. RepVGG: Making VGG-Style ConvNets Great Again. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021. [Google Scholar]
- Fossen, T.I.; Breivik, M.; Skjetne, R. Line-of-Sight Path Following of Underactuated Marine Craft. IFAC Proc. Vol. 2003, 36, 211–216. [Google Scholar] [CrossRef]
- Xu, H.; Guedes Soares, C. Review of Path-Following Control Systems for Maritime Autonomous Surface Ships. J. Mar. Sci. Appl. 2023, 22, 153–171. [Google Scholar] [CrossRef]
- Moe, S.; Pettersen, K.Y.; Fossen, T.I.; Gravdahl, J.T. Line-of-Sight Curved Path Following for Underactuated USVs and AUVs in the Horizontal Plane under the Influence of Ocean Currents. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016. [Google Scholar]
- Khatib, O. Real-Time Obstacle Avoidance for Manipulators and Mobile Robots. Int. J. Rob. Res. 1986, 5, 90–98. [Google Scholar] [CrossRef]
- Bradley, A.V.; Gomez-Uribe, C.A.; Vuyyuru, M.R. Shift-Curvature, SGD, and Generalization. Mach. Learn. Sci. Technol. 2022, 3, 045002. [Google Scholar] [CrossRef]