Design of Monitoring System for River Crab Feeding Platform Based on Machine Vision
Abstract
1. Introduction
- (1) An innovative automated framework for aquaculture. This work presents a fully automated, electronically controlled feeding platform for river crab farming, engineered to modernize industry production methods.
- (2) Enhanced adaptability to complex conditions. Because river crabs are most active at night and in the early morning, the system integrates the Contrast Limited Adaptive Histogram Equalization (CLAHE) algorithm. This preprocessing step enhances underwater image quality, ensuring stable detection across a wide range of lighting scenarios.
- (3) A task-adaptive detection model for non-invasive growth assessment. We propose an online, real-time method to estimate the body length and weight of crabs. To comprehensively assess crab growth and feeding status, the system employs a YOLOv11 model optimized for this task. Key enhancements (a lightweight backbone, an attention mechanism, and a refined loss function) collectively boost the accuracy of multi-object detection and counting in underwater pond environments.
- (4) A closed-loop information system for users. This work implements a comprehensive cyber-physical system that collects, processes, and visualizes complex data. The integrated platform gives users easily accessible information and a strong, evidence-based reference for management decisions.
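The CLAHE step in contribution (2) can be sketched as follows. This is a minimal, single-tile NumPy approximation under our own assumptions: real CLAHE divides the image into tiles, equalizes each with a clipped histogram, and bilinearly blends the tile mappings (in practice one would call OpenCV's `cv2.createCLAHE`). The synthetic frame, the `clip_limit` value, and the function name `clhe` are illustrative, not taken from the paper.

```python
import numpy as np

def clhe(img, clip_limit=0.02):
    """Contrast-limited histogram equalization on a single tile.

    CLAHE applies this per tile and bilinearly blends the tile
    mappings; a single global tile keeps the sketch short.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    hist = hist.astype(np.float64) / img.size
    # Clip the histogram and redistribute the excess uniformly:
    # this clipping is what limits noise amplification in flat regions.
    excess = np.clip(hist - clip_limit, 0, None).sum()
    hist = np.minimum(hist, clip_limit) + excess / 256
    cdf = np.cumsum(hist)
    lut = np.round(255 * cdf).astype(np.uint8)  # intensity remapping table
    return lut[img]

# Synthetic low-contrast frame standing in for a dim underwater image.
rng = np.random.default_rng(0)
frame = rng.integers(90, 130, size=(240, 320), dtype=np.uint8)
enhanced = clhe(frame)
print("input spread:", int(frame.max()) - int(frame.min()))
print("output spread:", int(enhanced.max()) - int(enhanced.min()))
```

The intensity spread of the output is much wider than the narrow 90–129 input band, which is the effect the paper relies on for stable night-time detection.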
2. Materials and Methods
2.1. Overall System Design
2.2. Machine Vision Technology Development
2.2.1. Underwater Image Processing
2.2.2. Multi-Target Detection Based on Improved YOLOv11
FasterNet-Based Backbone Network Improvements
Triplet Attention Mechanism
SDIoU Loss Function
2.2.3. Ablation Experiment and Performance Evaluation
2.2.4. River Crab Size Detection
2.3. Underwater Environmental Monitoring System Design
2.4. Information Platform Construction
3. Results
3.1. Experiments
3.2. Quality Estimation and Bait Quantity Detection
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Khan, Z.; Shen, Y.; Liu, H. Object detection in agriculture: A comprehensive review of methods, applications, challenges, and future directions. Agriculture 2025, 15, 1351.
- Sun, Y.P.; Chen, Z.X.; Zhao, D.A.; Zhan, T.T.; Zhou, W.Q.; Ruan, C.Z. Design and experiment of precise feeding system for pond crab culture. Trans. Chin. Soc. Agric. Mach. 2022, 53, 291–301.
- Ji, W.; Pan, Y.; Xu, B.; Wang, J. A real-time apple targets detection method for picking robot based on ShufflenetV2-YOLOX. Agriculture 2022, 12, 856.
- Hu, T.; Wang, W.; Gu, J.; Xia, Z.; Zhang, J.; Wang, B. Research on apple object detection and localization method based on improved YOLOX and RGB-D images. Agronomy 2023, 13, 1816.
- Zhang, Z.; Lu, Y.; Zhao, Y.; Pan, Q.; Jin, K.; Xu, G.; Hu, Y. TS-YOLO: An all-day and lightweight tea canopy shoots detection model. Agronomy 2023, 13, 1411.
- Zhang, Q.; Chen, Q.; Xu, W.; Xu, L.; Lu, E. Prediction of feed quantity for wheat combine harvester based on improved YOLOv5s and weight of single wheat plant without stubble. Agriculture 2024, 14, 1251.
- Cao, S.; Zhao, D.A.; Sun, Y.P.; Ruan, C.Z. Learning-based low-illumination image enhancer for underwater live crab detection. ICES J. Mar. Sci. 2021, 78, 979–993.
- Girshick, R.B.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Columbus, OH, USA, 23–28 June 2014; pp. 580–587.
- Girshick, R.B. Fast R-CNN. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; pp. 1440–1448.
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single shot multibox detector. In Proceedings of the European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37.
- Redmon, J.; Divvala, S.K.; Girshick, R.B.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 June 2017; pp. 6517–6525.
- Banan, A.; Nasiri, A.; Taheri-Garavand, A. Deep learning-based appearance features extraction for automated carp species identification. Aquac. Eng. 2020, 89, 102053.
- Allken, V.; Rosen, S.; Handegard, N.O.; Malde, K.; Demer, D. A deep learning-based method to identify and count pelagic and mesopelagic fishes from trawl camera images. ICES J. Mar. Sci. 2022, 78, 3780–3792.
- Albuquerque, P.L.; Garcia, V.; Junior, A.D.; Lewandowski, T.; Detweiler, C.; Gonçalves, A.B.; Costa, C.S.; Naka, M.H.; Pistori, H. Automatic live fingerlings counting using computer vision. Comput. Electron. Agric. 2019, 167, 105015.
- Han, F.L.; Yao, J.Z.; Zhu, H.T.; Wang, C.H. Underwater image processing and object detection based on deep CNN method. J. Sens. 2020, 2020, 6707328.
- Hu, K.; Lu, F.Y.; Lu, M.X.; Deng, Z.L.; Liu, Y.P. A marine object detection algorithm based on SSD and feature enhancement. Complexity 2020, 2020, 5476142.
- Lei, F.; Tang, F.F.; Li, S.H. Underwater target detection algorithm based on improved YOLOv5. J. Mar. Sci. Eng. 2022, 10, 310.
- Siripattanadilok, W.; Siriborvornratanakul, T. Recognition of partially occluded soft-shell mud crabs using Faster R-CNN and Grad-CAM. Aquacult. Int. 2024, 32, 2977–2997.
- Zhang, Z.; Liu, F.F.; He, X.F.; Wu, X.Y.; Xu, M.J.; Feng, S. Soft-shell crab detection model based on YOLOF. Aquacult. Int. 2024, 32, 5269–5298.
- Llorens, S.; Pérez-Arjona, I.; Soliveres, E.; Espinosa, V. Detection and target strength measurements of uneaten feed pellets with a single beam echosounder. Aquac. Eng. 2017, 78, 216–220.
- Liu, H.Y.; Xu, L.H.; Li, D.W. Detection and recognition of uneaten fish food pellets in aquaculture using image processing. In International Conference on Graphic and Image Processing; SPIE: Bellingham, WA, USA, 2015; Volume 9443, p. 94430G.
- Li, D.W.; Xu, L.H.; Liu, H.Y. Detection of uneaten fish food pellets in underwater images for aquaculture. Aquac. Eng. 2017, 78, 85–94.
- Hu, X.L.; Liu, Y.; Zhao, Z.X.; Liu, J.T.; Yang, X.T.; Sun, C.H.; Chen, S.H.; Li, B.; Zhou, C. Real-time detection of uneaten feed pellets in underwater images for aquaculture using an improved YOLOv4 network. Comput. Electron. Agric. 2021, 185, 106135.
- Ren, S.Q.; He, K.M.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149.
- Chen, J.R.; Kao, S.; He, H.; Zhuo, W.P.; Wen, S.; Lee, C.H.; Chan, S.H.G. Run, don’t walk: Chasing higher FLOPS for faster neural networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023; pp. 12021–12031.
- Misra, D.; Nalamada, T.; Arasanipalai, A.U.; Hou, Q. Rotate to attend: Convolutional triplet attention module. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–8 January 2021; pp. 3139–3148.
- Wang, X.Y.; Fu, C.Y.; He, J.W.; Wang, S.J.; Wang, J.W. StrongFusionMOT: A multi-object tracking method based on LiDAR-camera fusion. IEEE Sens. J. 2023, 23, 11241–11252.
- Redmon, J.; Farhadi, A. YOLOv3: An incremental improvement. arXiv 2018, arXiv:1804.02767.
- Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
- Long, X.; Deng, K.P.; Wang, G.Z.; Zhang, Y.; Dang, Q.Q.; Gao, Y.; Shen, H.; Ren, J.G.; Han, S.M.; Ding, E.R.; et al. PP-YOLO: An effective and efficient implementation of object detector. arXiv 2020, arXiv:2007.12099.
- Wang, C.Y.; Yeh, I.H.; Liao, H.Y.M. You only learn one representation: Unified network for multiple tasks. arXiv 2021, arXiv:2105.04206.
| Model | Parameters/M | FLOPs/G | Precision/% | Recall/% | mAP/% | FPS |
|---|---|---|---|---|---|---|
| YOLOv11-FasterNet | 1.58 | 3.7 | 94.2 | 93.3 | 97.5 | 122 |
| YOLOv11-MobileNetV3 | 1.73 | 3.9 | 93.9 | 91.9 | 96.9 | 119 |
| YOLOv11-ShuffleNet | 1.21 | 3.3 | 92.8 | 91.8 | 95.8 | 133 |
| Model | Parameters/M | FLOPs/G | Precision/% | Recall/% | mAP/% | FPS |
|---|---|---|---|---|---|---|
| YOLOv11 | 2.58 | 6.3 | 94.8 | 93.9 | 97.9 | 102 |
| YOLOv11-FasterNet | 1.58 | 3.7 | 94.2 | 93.3 | 97.5 | 122 |
| YOLOv11-FasterNet-Triplet | 1.58 | 3.7 | 94.6 | 93.5 | 97.6 | 107 |
| YOLOv11-FasterNet-Triplet-SDIoU | 1.58 | 3.7 | 94.8 | 95.5 | 98.2 | 112 |
| Algorithms | Parameters/M | FLOPs/G | Precision/% | Recall/% | mAP/% | FPS |
|---|---|---|---|---|---|---|
| YOLOv3 | 61.5 | 154.6 | 92.1 | 93.1 | 92.6 | 39 |
| YOLOv4 | 64.4 | 142.8 | 93.9 | 94.2 | 94.3 | 52 |
| YOLOv5 | 12.3 | 16.1 | 94.6 | 95.2 | 95.1 | 58 |
| PP-YOLO | 45.1 | 45.1 | 94.3 | 94.9 | 94.5 | 46 |
| YOLOR | 52.9 | 120.4 | 94.1 | 95.4 | 95.6 | 31 |
| YOLOv8 | 3.01 | 8.1 | 94.6 | 95.3 | 97.6 | 88 |
| Ours | 1.58 | 3.7 | 94.8 | 95.5 | 98.2 | 112 |
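The headline gains in the comparison table can be checked with a few lines of arithmetic. The dictionaries below simply restate the table's figures for YOLOv8 and the improved model; the variable names are ours, not from the paper.

```python
# Figures copied from the comparison table (YOLOv8 row vs. "Ours" row).
yolov8 = {"params_M": 3.01, "flops_G": 8.1, "mAP": 97.6, "fps": 88}
ours = {"params_M": 1.58, "flops_G": 3.7, "mAP": 98.2, "fps": 112}

# Relative reduction in model size and compute, and relative speed-up.
param_cut = (1 - ours["params_M"] / yolov8["params_M"]) * 100
flops_cut = (1 - ours["flops_G"] / yolov8["flops_G"]) * 100
fps_gain = (ours["fps"] / yolov8["fps"] - 1) * 100

print(f"parameters: -{param_cut:.1f}%, FLOPs: -{flops_cut:.1f}%, FPS: +{fps_gain:.1f}%")
```

So the improved model uses roughly half the parameters and compute of YOLOv8 while running about a quarter faster, with a slightly higher mAP.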
| Number | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Estimated shell length/cm | 4.8 | 5.7 | 3.8 | 4.9 | 5.1 | 4.6 | 5.9 | 3.8 | 4.6 | 5.5 |
| Estimated shell width/cm | 4.3 | 5.6 | 3.4 | 4.4 | 4.5 | 4.1 | 4.9 | 3.3 | 3.9 | 4.8 |
| Estimated mass/g | 42.69 | 75.77 | 20.45 | 45.66 | 51.98 | 37.18 | 84.18 | 20.35 | 36.98 | 66.7 |
| Real shell length/cm | 4.6 | 5.8 | 3.7 | 5 | 5 | 4.6 | 5.8 | 3.7 | 4.5 | 5.6 |
| Real shell width/cm | 4.2 | 5.3 | 3.2 | 4.4 | 4.2 | 4.2 | 5.1 | 3.3 | 3.8 | 5 |
| Real mass/g | 40.09 | 80.75 | 19.02 | 46.68 | 53.64 | 40.80 | 80.26 | 19.8 | 33.86 | 73.76 |
| Mass error ratio/% | 6.48 | 6.17 | 7.53 | 2.18 | 3.10 | 8.87 | 4.88 | 2.79 | 9.23 | 9.57 |
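The mass error ratio row is consistent with |estimated − real| / real × 100. Recomputing it from the first three columns of the table reproduces the printed values to within about 0.01 percentage points; the tiny residual differences are expected, since the published ratios were presumably computed from unrounded raw measurements. The list names below are ours.

```python
# Columns 1-3 from the table above (system estimate vs. ground-truth mass, grams).
estimated = [42.69, 75.77, 20.45]
real = [40.09, 80.75, 19.02]

# Mass error ratio as reported: |estimated - real| / real * 100.
ratios = [abs(e - r) / r * 100 for e, r in zip(estimated, real)]
print([round(x, 2) for x in ratios])
```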
| Categories | Trial | Number of Experimental Feeds | Detections (Well-Lit) | Detection Rate (Well-Lit)/% | Detections (Poorly Lit) | Detection Rate (Poorly Lit)/% |
|---|---|---|---|---|---|---|
| Pellets | 1 | 100 | 98 | 98 | 92 | 92 |
| Pellets | 2 | 100 | 95 | 95 | 94 | 94 |
| Pellets | 3 | 100 | 96 | 96 | 96 | 96 |
| Corn | 1 | 50 | 47 | 94 | 45 | 90 |
| Corn | 2 | 50 | 48 | 96 | 45 | 90 |
| Corn | 3 | 50 | 48 | 96 | 47 | 94 |
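The detection-rate columns are simply detections divided by the number of experimental feeds. As a quick check against the first pellet trial (variable names are ours):

```python
# First pellet trial from the bait detection table.
feeds = 100
detected_well_lit = 98
detected_poorly_lit = 92

# Detection rate = detections / feeds * 100.
rate_well = detected_well_lit / feeds * 100
rate_poor = detected_poorly_lit / feeds * 100
print(rate_well, rate_poor)
```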
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Sun, Y.; Li, Z.; Yang, Z.; Yuan, B.; Zhao, D.; Ren, N.; Cheng, Y. Design of Monitoring System for River Crab Feeding Platform Based on Machine Vision. Fishes 2026, 11, 88. https://doi.org/10.3390/fishes11020088