Towards Intelligent Water Safety: Robobuoy, a Deep Learning-Based Drowning Detection and Autonomous Surface Vehicle Rescue System
Abstract
1. Introduction
- We introduce Robobuoy, a novel real-time drowning detection and rescue system that integrates dual deep learning models with autonomous unmanned surface vehicle (USV) navigation.
- We design an efficient framework combining YOLO12m for drowning detection and YOLOv5m for USV tracking, supported by a lightweight geometric localization algorithm for precise and low-cost navigation.
- We conduct comprehensive validation using both datasets and hardware experiments, demonstrating state-of-the-art detection accuracy and reliable autonomous rescue performance.
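The geometric localization algorithm itself is not reproduced in this excerpt, so the sketch below is only a minimal stand-in: it assumes the monitoring station sees both the USV marker and the victim in a single overhead camera frame, and reduces "localization" to a distance and bearing in pixel coordinates. The function name and the coordinate convention are my own, not the paper's.

```python
import math

def navigation_vector(usv_xy, victim_xy):
    """Return (distance_px, bearing_deg) from the USV marker to the
    victim, both given as (x, y) pixel coordinates in the overhead
    camera frame.  The bearing is measured from the image +x axis
    toward +y (which points downwards in image coordinates)."""
    dx = victim_xy[0] - usv_xy[0]
    dy = victim_xy[1] - usv_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))
```

For example, a victim 100 px directly "below" the USV in the image, `navigation_vector((100, 100), (100, 200))`, yields a distance of 100.0 px and a bearing of 90.0°. A real system would additionally calibrate pixels to metres and account for the USV's current heading.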
2. System Architecture
2.1. Unmanned Surface Vehicle
2.2. Monitoring Station
- Radio Transmitter: Relays real-time navigation commands from the computation unit to the USV, enabling autonomous operation.
- Raspberry Pi 5: Serves as the primary processing unit; it was selected for this prototype because it is widely available and sufficiently capable of running object detection models in a controlled environment. Connected to a webcam for real-time visual input, it processes the video stream with computer vision algorithms to detect drowning victims. The detection model runs on the default CPU-based software configuration of the Raspberry Pi 5, without GPU support. Once a victim is identified, the Raspberry Pi locates the USV via the bright magenta balls mounted on top of the vehicle, computes the required navigation path, and sends this data to the Arduino over a wired serial link.
- Arduino Nano 3.0 Microcontroller: Receives control signals from the Raspberry Pi and relays commands to the RC controller of the USV. Using the Servo.h library, the Arduino generates pulse-width modulation (PWM) signals to drive the servos mechanically coupled to the RC boat controls, adjusting the steering and throttle according to the computed navigation path. This process is repeated continuously until the USV reaches the designated point. As a fail-safe mechanism, if no new command is received for more than 3 s, the Arduino neutralizes the control outputs and stops the USV.
- Camera: A Logitech C615 webcam captures video at 1920 × 1080 (Full HD) resolution and provides the primary visual input for drowning detection and USV localization.
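The Arduino-side fail-safe described above (neutralize the controls if no command arrives for more than 3 s) can be captured in a few lines. The sketch below models that logic in Python rather than as an Arduino sketch; the neutral servo values and the class/method names are assumptions for illustration — only the 3 s timeout comes from the system description.

```python
import time

# Assumed neutral servo positions (degrees); the paper does not state them.
NEUTRAL_STEER = 90
NEUTRAL_THROTTLE = 90
TIMEOUT_S = 3.0  # fail-safe window stated for the Arduino

class FailSafeController:
    """Mirrors the Arduino-side logic: apply incoming steering/throttle
    commands, but neutralize both outputs if no new command arrives
    within TIMEOUT_S seconds."""

    def __init__(self, now=time.monotonic):
        self._now = now                  # injectable clock, eases testing
        self._last_cmd = self._now()
        self.steer = NEUTRAL_STEER
        self.throttle = NEUTRAL_THROTTLE

    def on_command(self, steer, throttle):
        """Called whenever a serial command arrives from the Raspberry Pi."""
        self.steer, self.throttle = steer, throttle
        self._last_cmd = self._now()

    def tick(self):
        """Called every control-loop iteration; stops the USV on timeout."""
        if self._now() - self._last_cmd > TIMEOUT_S:
            self.steer = NEUTRAL_STEER
            self.throttle = NEUTRAL_THROTTLE
```

On the actual hardware this check would run inside `loop()`, with the neutralized values written to the servos through the Servo.h library's PWM outputs.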
3. Experiments
3.1. Object Detection Experiment
3.1.1. Dataset Construction
3.1.2. Model Selection
3.2. Controlled Navigation Experiment
3.2.1. Experimental Setup
3.2.2. Procedure
3.2.3. Results and Performance Analysis
- Left target: 3/3 successful trials;
- Center target: 3/3 successful trials;
- Right target: 3/3 successful trials.
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Stop Drowning Now. Facts & Stats About Drowning. Available online: https://www.stopdrowningnow.org/drowning-statistics/ (accessed on 14 April 2024).
- World Health Organization. Drowning; World Health Organization: Geneva, Switzerland, 2024; Available online: https://www.who.int/news-room/fact-sheets/detail/drowning (accessed on 14 April 2024).
- Dworkin, G.M. 3-Year-Old Child Drowns in Guarded Pool in Front of Lifeguard in Elevated Stand. Lifesaving.com. Available online: https://lifesaving.com/case-studies/3-year-old-child-drowns-in-guarded-pool-in-front-of-lifeguard-in-elevated-stand/ (accessed on 25 April 2024).
- Jalalifar, S.; Karami, S.; Salimi, S.; Riahi, A.; Sedaghat, M.; Wang, H. Enhancing Water Safety: Exploring Recent Technological Approaches for Drowning Detection. Sensors 2024, 24, 331.
- Pratap, K.; Marjorie, R. Anti Drowning System with Remote Alert Using Zigbee. Int. J. Pharm. Technol. 2016, 8, 20523–20527.
- Kulkarni, A.; Lakhani, K.; Lokhande, S. A Sensor-Based Low-Cost Drowning Detection System for Human Life Safety. In Proceedings of the 2016 5th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), Noida, India, 7–9 September 2016; pp. 301–306.
- Liu, T.; He, X.; He, L.; Yuan, F. A Video Drowning Detection Device Based on Underwater Computer Vision. IET Image Process. 2023, 17, 1905–1918.
- Hayat, M.A.; Yang, G.; Iqbal, A. Mask R-CNN Based Real-Time Near-Drowning Person Detection System in Swimming Pools. In Proceedings of the 2022 Mohammad Ali Jinnah University International Conference on Computing (MAJICC), Karachi, Pakistan, 27–28 October 2022; pp. 1–6.
- He, Q.; Zhang, H.; Mei, Z.; Xu, X. High Accuracy Intelligent Real-Time Framework for Detecting Infant Drowning Based on Deep Learning. Expert Syst. Appl. 2023, 228, 120204.
- Wang, H.; Zhang, G.; Cao, H.; Hu, K.; Wang, Q.; Deng, Y.; Gao, J.; Tang, Y. Geometry-Aware 3D Point Cloud Learning for Precise Cutting-Point Detection in Unstructured Field Environments. J. Field Robot. 2025, 42, 3063–3076.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- He, T.; Ye, X.; Wang, M. An Improved Swimming Pool Drowning Detection Method Based on YOLOv8. In Proceedings of the 2023 IEEE 7th Information Technology and Mechatronics Engineering Conference (ITOEC), Chongqing, China, 15–17 September 2023; Volume 7, pp. 835–839.
- Rusakov, K.D.; Gladkikh, T.Y.; Grafenkov, A.V.; Mostakov, N.A.; Goloburdin, N.V.; Migachev, A.N. Drowning Detection Algorithm in Coastal Zones. In Proceedings of the 2023 7th International Conference on Information, Control, and Communication Technologies (ICCT), Astrakhan, Russia, 2–6 October 2023; pp. 1–5.
- Tran, N.-H.; Pham, Q.-H.; Lee, J.-H.; Choi, H.-S. VIAM-USV2000: An Unmanned Surface Vessel with Novel Autonomous Capabilities in Confined Riverine Environments. Machines 2021, 9, 133.
- Tran, H.D.; Nguyen, N.T.; Tran Cao, T.N.; Gia, L.X.; Ho, K.; Nguyen, D.D.; Pham, B.T.; Truong, V.N. Unmanned Surface Vehicle for Automatic Water Quality Monitoring. E3S Web Conf. 2024, 496, 03005.
- Yan, X.; Yang, X.; Feng, B.; Liu, W.; Ye, H.; Zhu, Z.; Shen, H.; Xiang, Z. A Navigation Accuracy Compensation Algorithm for Low-Cost Unmanned Surface Vehicles Based on Models and Event Triggers. Control Eng. Pract. 2024, 146, 105896.
- Wang, Y.; Liu, W.; Liu, J.; Sun, C. Cooperative USV–UAV Marine Search and Rescue with Visual Navigation and Reinforcement Learning-Based Control. ISA Trans. 2023, 137, 222–235.
- Hamid, N.; Dharmawan, W.; Nambo, H. Dynamic Path Planning for Unmanned Surface Vehicles with a Modified Neuronal Genetic Algorithm. Appl. Syst. Innov. 2023, 6, 109.
- Sotelo-Torres, F.; Alvarez, L.V.; Roberts, R.C. An Unmanned Surface Vehicle (USV): Development of an Autonomous Boat with a Sensor Integration System for Bathymetric Surveys. Sensors 2023, 23, 4420.
- Chen, M.; Zhang, X.; Xiong, X.; Zeng, F.; Zhuang, W. Transformer: A Multifunctional Fast Unmanned Aerial Vehicles–Unmanned Surface Vehicles Coupling System. Machines 2021, 9, 146.
- Pillai, B.M.; Suthakorn, J.; Sivaraman, D.; Nakdhamabhorn, S.; Nillahoot, N.; Ongwattanakul, S.; Magid, E. A Heterogeneous Robots Collaboration for Safety, Security, and Rescue Robotics: E-ASIA Joint Research Program for Disaster Risk and Reduction Management. Adv. Robot. 2024, 38, 129–151.
- IEEE Region 10. IEEE RoboComp 2024. Available online: https://robocomp.ieeer10.org/ (accessed on 14 December 2025).
- Lifeguarding Project. Lifeguarding w/YOLOv10 Dataset. Roboflow Universe, Roboflow. Available online: https://universe.roboflow.com/lifeguarding-project/lifeguarding-w-yolov10 (accessed on 7 September 2025).
- Project-Rrsvg. Swimming Pool Safety Management Dataset. Roboflow Universe, Roboflow. Available online: https://universe.roboflow.com/project-rrsvg/swimming-pool-safety-management (accessed on 7 September 2025).
- Jocher, G. YOLOv5; GitHub Repository: San Francisco, CA, USA, 2020. Available online: https://github.com/ultralytics/yolov5 (accessed on 14 December 2025).
- Jocher, G. YOLOv8; Ultralytics: Frederick, MD, USA, 2023. Available online: https://docs.ultralytics.com/models/yolov8/ (accessed on 14 December 2025).
- Wang, C.-Y.; Yeh, I.-H.; Liao, H.-Y.M. YOLOv9: Learning What You Want to Learn Using Programmable Gradient Information. arXiv 2024, arXiv:2402.13616.
- Wang, A.; Chen, H.; Liu, L.; Chen, K.; Lin, Z.; Han, J.; Ding, G. YOLOv10: Real-Time End-to-End Object Detection. arXiv 2024, arXiv:2405.14458.
- Ultralytics. YOLOv11; Ultralytics Documentation: Frederick, MD, USA, 2024. Available online: https://docs.ultralytics.com/models/yolo11/ (accessed on 14 December 2025).
- Ultralytics. YOLOv12; Ultralytics Documentation: Frederick, MD, USA, 2024. Available online: https://docs.ultralytics.com/models/yolo12/ (accessed on 14 December 2025).
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the Computer Vision—ECCV 2016, Amsterdam, The Netherlands, 11–14 October 2016; Springer: Cham, Switzerland, 2016; Volume 9905.








| Dataset | Model | Precision | Recall | mAP@0.5 | Time * |
|---|---|---|---|---|---|
| 1 | YOLOv5m | 0.8944 | 0.8058 | 0.8779 | 88.12 |
| 1 | YOLOv8m | 0.8684 | 0.8094 | 0.8730 | 93.74 |
| 1 | YOLOv9m | 0.8672 | 0.7990 | 0.8718 | 116.16 |
| 1 | YOLOv10m | 0.8921 | 0.7786 | 0.8588 | 88.30 |
| 1 | YOLO11m | 0.8829 | 0.8206 | 0.8737 | 95.18 |
| 1 | YOLO12m | 0.8889 | 0.8064 | 0.8698 | 124.90 |
| 1 | VGG16 + SSD300 | 0.0182 | 0.5206 | 0.1879 | 8.14 |
| 2 | YOLOv5m | 0.9094 | 0.9403 | 0.9664 | 84.16 |
| 2 | YOLOv8m | 0.9316 | 0.9568 | 0.9729 | 98.97 |
| 2 | YOLOv9m | 0.9387 | 0.9434 | 0.9689 | 127.22 |
| 2 | YOLOv10m | 0.9321 | 0.9340 | 0.9643 | 94.62 |
| 2 | YOLO11m | 0.9112 | 0.9444 | 0.9662 | 106.07 |
| 2 | YOLO12m | 0.9401 | 0.9431 | 0.9718 | 136.76 |
| 2 | VGG16 + SSD300 | 0.4401 | 0.8112 | 0.7490 | 7.47 |
| 3 | YOLOv5m | 0.9171 | 0.8512 | 0.9117 | 83.72 |
| 3 | YOLOv8m | 0.8903 | 0.8499 | 0.9138 | 93.28 |
| 3 | YOLOv9m | 0.9081 | 0.8589 | 0.9155 | 121.20 |
| 3 | YOLOv10m | 0.9011 | 0.8493 | 0.9099 | 93.06 |
| 3 | YOLO11m | 0.9154 | 0.8435 | 0.9076 | 99.31 |
| 3 | YOLO12m | 0.9249 | 0.8602 | 0.9284 | 134.72 |
| 3 | VGG16 + SSD300 | 0.0238 | 0.7070 | 0.5046 | 7.88 |
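The per-dataset winner in the comparison table above can be checked mechanically. The sketch below transcribes the mAP@0.5 column for the three datasets and picks the top scorer for each; the dictionary layout and function name are my own, while the numbers come directly from the table.

```python
# mAP@0.5 values transcribed from the comparison table above,
# keyed by dataset id and then by model name.
MAP50 = {
    1: {"YOLOv5m": 0.8779, "YOLOv8m": 0.8730, "YOLOv9m": 0.8718,
        "YOLOv10m": 0.8588, "YOLO11m": 0.8737, "YOLO12m": 0.8698,
        "VGG16+SSD300": 0.1879},
    2: {"YOLOv5m": 0.9664, "YOLOv8m": 0.9729, "YOLOv9m": 0.9689,
        "YOLOv10m": 0.9643, "YOLO11m": 0.9662, "YOLO12m": 0.9718,
        "VGG16+SSD300": 0.7490},
    3: {"YOLOv5m": 0.9117, "YOLOv8m": 0.9138, "YOLOv9m": 0.9155,
        "YOLOv10m": 0.9099, "YOLO11m": 0.9076, "YOLO12m": 0.9284,
        "VGG16+SSD300": 0.5046},
}

def best_model(dataset_id: int) -> str:
    """Return the model with the highest mAP@0.5 on the given dataset."""
    scores = MAP50[dataset_id]
    return max(scores, key=scores.get)
```

For dataset 3 this yields `"YOLO12m"` (mAP@0.5 of 0.9284), consistent with the system's choice of YOLO12m for drowning detection; on datasets 1 and 2 the narrow winners are YOLOv5m and YOLOv8m, respectively.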
| Model | Precision | Recall | mAP@0.5 | Time * |
|---|---|---|---|---|
| YOLO12n | 0.9245 | 0.8362 | 0.9182 | 69.90 |
| YOLO12s | 0.8968 | 0.8777 | 0.9184 | 77.07 |
| YOLO12m | 0.9249 | 0.8602 | 0.9284 | 134.72 |
| YOLO12l | 0.9054 | 0.8819 | 0.9204 | 199.42 |
| YOLO12x | 0.8927 | 0.8670 | 0.9191 | 321.18 |
| Model | Precision | Recall | mAP@0.5 | Time * |
|---|---|---|---|---|
| YOLOv5m | 0.9778 | 0.9775 | 0.9858 | 9.67 |
| YOLOv8m | 0.9715 | 0.9720 | 0.9776 | 10.24 |
| YOLOv9m | 0.9718 | 0.9645 | 0.9795 | 12.43 |
| YOLOv10m | 0.9767 | 0.9386 | 0.9717 | 9.44 |
| YOLO11m | 0.9766 | 0.9434 | 0.9770 | 9.92 |
| YOLO12m | 0.9723 | 0.9734 | 0.9762 | 12.14 |
| VGG16 + SSD300 | 0.9078 | 0.6782 | 0.6566 | 13.15 |
| MobileNetV2 | 0.0252 | 0.1379 | 0.0252 | 22.82 |
| Model | Precision | Recall | mAP@0.5 | Time * |
|---|---|---|---|---|
| YOLOv5n | 0.9774 | 0.8658 | 0.9713 | 10.07 |
| YOLOv5s | 0.9832 | 0.9594 | 0.9725 | 8.04 |
| YOLOv5m | 0.9768 | 0.9765 | 0.9848 | 9.55 |
| YOLOv5l | 0.9709 | 0.9659 | 0.9734 | 11.34 |
| YOLOv5x | 0.9708 | 0.9663 | 0.9705 | 32.66 |
© 2025 by the authors. Published by MDPI on behalf of the International Institute of Knowledge Innovation and Invention. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Srijiranon, K.; Varisthanist, N.; Tardtong, T.; Pumthurean, C.; Tanantong, T. Towards Intelligent Water Safety: Robobuoy, a Deep Learning-Based Drowning Detection and Autonomous Surface Vehicle Rescue System. Appl. Syst. Innov. 2026, 9, 12. https://doi.org/10.3390/asi9010012

