EF yolov8s: A Human–Computer Collaborative Sugarcane Disease Detection Model in Complex Environment
Abstract
1. Introduction
2. Materials and Methods
2.1. Image Acquisition and Data Enhancement
2.1.1. Image Acquisition
2.1.2. Image Enhancement
2.2. Experimental Design
2.3. Construction of Sugarcane Disease Detection Model
2.3.1. Construction of Common Disease Detection Model and Disease Segmentation Model Based on yolov8s
2.3.2. Construction of Improved EF-yolov8s Sugarcane Disease Detection Model
2.4. Evaluation Metrics
3. Results
3.1. Analysis of Experimental Results of Different Deep Learning Models
3.1.1. Performance Analysis of Different Deep Learning Algorithms in Sugarcane Disease Detection
3.1.2. Performance Comparison and Analysis of Different Deep Learning Algorithms in Detecting Different Sugarcane Diseases
3.2. Analysis of Experimental Results of Improved Deep Learning Model
3.2.1. Overall Analysis of Sugarcane Disease Detection Results after Model Improvement
3.2.2. Analysis of Detection Results of Different Sugarcane Diseases after Model Improvement
3.2.3. Performance Comparison before and after Model Improvement
3.3. Analysis of Experimental Results of Monitoring Nutrient-Deficiency Symptoms in Sugarcane Tip Growth
4. Discussion
4.1. Analysis of the Practicality of a Sugarcane Disease Detection Model Based on Human–Machine Collaboration in Complex Environments
4.2. The Improved EF-yolov8s as a New Method for Constructing a Sugarcane Disease Detection Model
4.3. Application Potential of the Intelligent Sugarcane Tip Growth Monitoring Model for Nutrient-Deficiency Symptoms
- Model underfitting: the model's capacity is insufficient to capture the complex patterns in the data. This usually manifests as uniformly low evaluation metrics, and may require increasing model complexity (e.g., adding network layers or widening the filters).
- Training data problems: the sample distribution across target categories is uneven. Common categories are over-represented and rare categories under-represented, which can bias the model toward common categories during training and weaken its detection of rare ones; when expanding the dataset in the future, attention should be paid to balancing the sample distribution.
- Multi-scale detection: YOLO models detect objects of different sizes at different scales. If the multi-scale feature fusion or scale-prediction modules are poorly designed, detection performance can be uneven across scales. Optimizing the multi-scale detection mechanism, or introducing more effective cross-scale information transfer, can improve overall detection performance.
- Mismatch between model and data: the training data may differ from the sugarcane disease instances in the actual video in lighting, viewing angle, occlusion, etc., and the dataset may lack the diversity needed to cover all situations that occur in the video.
- Video preprocessing: video frames may be preprocessed differently from the training images, e.g., in scaling, cropping, or normalization.
- Post-processing strategy: video inference may require more sophisticated post-processing to reconcile detections across consecutive frames, for example using a tracking algorithm to keep detection boxes stable.
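The frame-to-frame stabilization idea in the last point can be sketched with a minimal IoU-based matcher that blends each detection with its best-matching box from the previous frame. This is an illustrative sketch, not the tracking method used in the paper; boxes are assumed to be (x1, y1, x2, y2) tuples, and the function names and thresholds are chosen here for exposition only.

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    if inter == 0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def smooth_tracks(prev_tracks, detections, iou_thr=0.5, alpha=0.5):
    """Greedily match this frame's detections to last frame's boxes by IoU
    and exponentially smooth matched boxes to suppress jitter."""
    tracks, used = [], set()
    for det in detections:
        best, best_iou = None, iou_thr
        for i, trk in enumerate(prev_tracks):
            if i in used:
                continue
            v = iou(trk, det)
            if v > best_iou:
                best, best_iou = i, v
        if best is not None:
            used.add(best)
            trk = prev_tracks[best]
            # Blend previous and current coordinates (alpha = inertia).
            det = tuple(alpha * t + (1 - alpha) * d for t, d in zip(trk, det))
        tracks.append(det)
    return tracks
```

Unmatched detections pass through unchanged, so new objects still appear immediately; a full tracker would additionally age out stale tracks and handle identity switches.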
5. Conclusions
- Under the same hyperparameters, yolov8s was compared with yolov5n, yolov7, yolov8n, yolov8m, Faster R-CNN, Libra R-CNN, and other models, and proved a relatively effective algorithm for sugarcane disease detection. Compared with yolov8m, its mAP_0.5, precision, recall, and F1 increased by 10.7%, 12.8%, 12.8%, and 13.7%, respectively, while its model depth, parameter count, gradient count, and floating-point operations decreased by 23.7%, 56.9%, 56.9%, and 63.7%, respectively. The improved EF-yolov8s was then compared with the EF-yolov8m, EF-yolov8n, EF-yolov7, and EF-yolov5n models carrying the same EF module, and it achieved better detection results on the sugarcane disease image set. Although its mAP_0.5, precision, and recall are 0.95%, 1.67%, and 0.62% lower than those of EF-yolov8m, respectively, its model depth is reduced by 70 layers, its parameter and gradient counts are more than halved, and its FLOPs are cut to little more than a third, laying a foundation for lightweight deployment of subsequent models. Therefore, under comprehensive evaluation, yolov8s and the improved model outperform the comparison algorithms, providing model support for rapid, multi-object sugarcane disease detection in complex environments.
- The yolov8s-seg model was used to train instance segmentation on sugarcane nutrient-deficiency images. In the same experimental environment, four instance segmentation models (yolov8n-seg, yolov8m-seg, yolov8l-seg, and yolov8x-seg) were compared. The yolov8s-seg model is superior to the other algorithms in mAP_0.5, recall, model depth, parameter count, and gradient count. Overall, yolov8s-seg combines high detection accuracy with a lighter model, making it an effective tool for instance segmentation of sugarcane nutrient deficiency. It can also serve as a reference for deploying intelligent sugarcane-deficiency detection on mobile terminal devices such as unmanned aerial vehicles.
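The precision, recall, and F1 values compared throughout these conclusions follow the standard detection definitions. As a reminder of the arithmetic, here is a minimal sketch with illustrative counts (the paper's actual per-class counts are not reproduced here):

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall, and F1 from true positives, false positives,
    and false negatives counted at a fixed IoU threshold (e.g., 0.5)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts: 80 correct boxes, 20 spurious, 20 missed.
p, r, f1 = detection_metrics(tp=80, fp=20, fn=20)
```

mAP_0.5 then averages, over classes, the area under each class's precision-recall curve at IoU 0.5, while mAP_0.5-0.95 further averages over IoU thresholds from 0.5 to 0.95.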
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Disease Name | Number of Pictures | Number of Images after Data Enhancement | Number of Category Tags | Number of Labels after Data Enhancement |
---|---|---|---|---|
Sugarcane brown stripe | 970 | 2910 | 4106 | 12,318 |
Sugarcane rust | 852 | 2556 | 3434 | 10,302 |
Sugarcane brown spot | 1193 | 3579 | 17,823 | 53,469 |
Sugarcane ring spot | 1130 | 3390 | 5376 | 16,128 |
Sugarcane red rot | 2125 | 6375 | 5012 | 15,036 |
Aggregate | 6270 | 18,810 | 35,751 | 107,253 |
Sugarcane sulfur deficiency | 1019 | 3057 | 18,685 | 56,056 |
Sugarcane phosphorus deficiency | 667 | 2001 | 6111 | 18,333 |
Aggregate | 1686 | 5058 | 24,796 | 74,389 |
Model | mAP_0.5 | Precision | Recall | F1 | Depths | Total Parameters | Gradients | FLOPS (G) |
---|---|---|---|---|---|---|---|---|
Yolov5n | 67.80% | 66.40% | 63.50% | 64.00% | 193 | 2,503,919 | 2,509,423 | 7.1 |
Yolov7 | 64.10% | 62.40% | 63.70% | 63.00% | 407 | 37,216,250 | 37,216,250 | 105.2 |
Yolov8n | 70.60% | 67.60% | 66.40% | 67.00% | 168 | 3,006,623 | 3,011,807 | 8.1 |
Yolov8s | 85.70% | 84.56% | 81.12% | 83.00% | 225 | 11,137,535 | 11,137,519 | 28.7 |
Yolov8m | 77.40% | 75.00% | 71.90% | 73.00% | 295 | 25,859,215 | 25,859,199 | 79.1 |
Faster R-CNN | 62.59% | 45.31% | 73.60% | 51.00% | 19 | 136,770,964 | | 401.788 |
Libra R-CNN | 65.30% | 51.40% | 69.30% | 59.00% | 50 | 25,564,732 | | |
Model | Disease | Precision | Recall | mAP_0.5 | mAP_0.5–0.95 |
---|---|---|---|---|---|
Yolov5n | red rot disease | 71.40% | 73.20% | 77.00% | 39.30% |
 | ring spot disease | 66.80% | 64.10% | 66.80% | 29.20% |
 | rust disease | 58.70% | 65.30% | 63.60% | 32.20% |
 | brown spot disease | 66.10% | 70.60% | 70.70% | 30.80% |
 | brown stripe disease | 69.30% | 44.30% | 61.10% | 29.90% |
Yolov7 | red rot disease | 54.30% | 68.30% | 64.10% | 25.70% |
 | ring spot disease | 52.90% | 63.30% | 57.90% | 21.30% |
 | rust disease | 45.10% | 49.00% | 44.70% | 19.20% |
 | brown spot disease | 50.80% | 71.60% | 60.40% | 22.20% |
 | brown stripe disease | 40.50% | 39.70% | 36.90% | 11.80% |
Yolov8n | red rot disease | 72.80% | 76.30% | 80.40% | 42.40% |
 | ring spot disease | 69.50% | 64.60% | 69.00% | 30.80% |
 | rust disease | 61.90% | 68.10% | 67.70% | 34.80% |
 | brown spot disease | 67.00% | 71.50% | 72.00% | 31.80% |
 | brown stripe disease | 66.80% | 51.70% | 64.00% | 32.60% |
Yolov8s | red rot disease | 88.60% | 81.10% | 85.70% | 53.70% |
 | ring spot disease | 82.90% | 74.70% | 81.80% | 44.00% |
 | rust disease | 79.70% | 75.80% | 81.70% | 50.70% |
 | brown spot disease | 81.20% | 77.30% | 82.90% | 42.50% |
 | brown stripe disease | 90.60% | 90.30% | 90.70% | 73.10% |
Yolov8m | red rot disease | 81.20% | 82.20% | 86.80% | 50.50% |
 | ring spot disease | 77.10% | 70.50% | 76.40% | 37.70% |
 | rust disease | 70.40% | 70.40% | 74.10% | 41.40% |
 | brown spot disease | 74.90% | 72.60% | 77.40% | 35.20% |
 | brown stripe disease | 71.30% | 64.50% | 72.50% | 42.30% |
Faster R-CNN | red rot disease | 53.30% | 78.40% | 67.60% | 34.40% |
 | ring spot disease | 44.50% | 68.70% | 55.30% | 30.60% |
 | rust disease | 34.80% | 65.80% | 48.00% | 28.50% |
 | brown spot disease | 32.40% | 74.50% | 63.80% | 27.30% |
 | brown stripe disease | 61.90% | 77.90% | 77.80% | 37.20% |
Libra R-CNN | red rot disease | 54.80% | 74.00% | 71.30% | 36.70% |
 | ring spot disease | 47.90% | 65.90% | 62.80% | 29.40% |
 | rust disease | 45.60% | 58.80% | 49.50% | 27.50% |
 | brown spot disease | 51.90% | 68.30% | 69.30% | 29.10% |
 | brown stripe disease | 56.90% | 79.70% | 72.10% | 38.10% |
Model | mAP_0.5 | Precision | Recall | F1 | Depths | Total Parameters | Gradients | FLOPS (G) |
---|---|---|---|---|---|---|---|---|
EF-yolov8s | 89.70% | 88.70% | 86.00% | 88.00% | 249 | 11,191,743 | 11,191,727 | 29.5 |
EF-yolov8m | 90.65% | 90.37% | 86.62% | 88.00% | 319 | 25,940,431 | 25,940,415 | 80.6 |
EF-yolov8n | 69.30% | 68.80% | 65.70% | 67.00% | 192 | 3,007,519 | 3,012,703 | 8.1 |
EF-yolov7 | 84.67% | 82.50% | 80.19% | 81.00% | 295 | 6,025,868 | 6,025,868 | 13.3 |
EF-yolov5n | 67.20% | 67.10% | 63.70% | 67.20% | 217 | 2,504,815 | 2,510,319 | 7.1 |
Model | Disease | Precision | Recall | mAP_0.5 | mAP_0.5–0.95 |
---|---|---|---|---|---|
EF-yolov8s | red rot disease | 92.90% | 90.30% | 93.60% | 67.20% |
 | ring spot disease | 87.30% | 81.90% | 88.00% | 54.40% |
 | rust disease | 88.00% | 81.70% | 88.30% | 61.00% |
 | brown spot disease | 84.90% | 84.30% | 88.00% | 52.30% |
 | brown stripe disease | 93.30% | 92.30% | 91.70% | 78.40% |
EF-yolov8m | red rot disease | 91.80% | 88.70% | 92.80% | 69.50% |
 | ring spot disease | 88.90% | 84.10% | 89.20% | 60.10% |
 | rust disease | 89.20% | 83.90% | 90.40% | 67.00% |
 | brown spot disease | 88.50% | 85.40% | 90.00% | 58.80% |
 | brown stripe disease | 93.40% | 90.90% | 90.80% | 80.20% |
EF-yolov8n | red rot disease | 74.10% | 70.80% | 75.90% | 42.60% |
 | ring spot disease | 67.40% | 55.20% | 60.00% | 27.20% |
 | rust disease | 57.70% | 55.30% | 57.80% | 31.50% |
 | brown spot disease | 66.90% | 63.80% | 68.00% | 30.50% |
 | brown stripe disease | 77.80% | 83.60% | 84.70% | 65.40% |
EF-yolov7 | red rot disease | 85.60% | 83.50% | 89.10% | 48.80% |
 | ring spot disease | 81.10% | 72.80% | 80.20% | 37.40% |
 | rust disease | 80.80% | 75.40% | 82.70% | 44.10% |
 | brown spot disease | 76.70% | 80.50% | 82.10% | 36.80% |
 | brown stripe disease | 88.40% | 88.80% | 89.30% | 63.00% |
EF-yolov5n | red rot disease | 71.50% | 67.10% | 72.40% | 39.60% |
 | ring spot disease | 66.90% | 53.40% | 58.80% | 25.90% |
 | rust disease | 57.60% | 53.20% | 55.90% | 29.60% |
 | brown spot disease | 63.70% | 65.90% | 66.80% | 29.40% |
 | brown stripe disease | 75.80% | 78.90% | 82.00% | 61.70% |
Model | mAP_0.5 | Precision | Recall | F1 | Depths | Total Parameters | Gradients | FLOPS (G) |
---|---|---|---|---|---|---|---|---|
yolov8n-seg | 75.10% | 71.90% | 73.10% | 72.00% | 261 | 3,264,006 | 3,263,990 | 12.1 |
yolov8s-seg | 80.30% | 74.90% | 79.00% | 72.00% | 261 | 11,790,870 | 11,790,854 | 42.7 |
yolov8m-seg | 78.40% | 74.62% | 77.55% | 76.00% | 331 | 27,240,806 | 27,240,790 | 110.4 |
yolov8l-seg | 78.28% | 75.69% | 77.72% | 77.00% | 401 | 45,937,590 | 45,937,574 | 220.8 |
yolov8x-seg | 79.38% | 76.56% | 78.41% | 77.00% | 401 | 71,752,774 | 71,752,758 | 344.5 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sun, J.; Li, Z.; Li, F.; Shen, Y.; Qian, Y.; Li, T. EF yolov8s: A Human–Computer Collaborative Sugarcane Disease Detection Model in Complex Environment. Agronomy 2024, 14, 2099. https://doi.org/10.3390/agronomy14092099