Detection of Floricane Raspberry Shrubs from Unmanned Aerial Vehicle Imagery Using YOLO Models
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Collection
2.1.1. Drone
2.1.2. Overview of the Research Area
2.2. Data Preprocessing
2.2.1. Image Scaling
2.2.2. Image Labeling
2.2.3. Preparing Datasets
2.3. Models Training
2.3.1. YOLO Models
2.3.2. Training Parameters
2.4. Evaluation of Trained Models
2.5. Hardware Configuration
3. Results
3.1. Evaluation of YOLO Models Trained and Tested on Corresponding Image Modalities
3.2. Cross-Domain Evaluation of YOLO Models Trained and Tested on Different Image Modalities
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Avhale, V.R.; Senthil Kumar, G.; Kumaraperumal, R.; Prabukumar, G.; Bharathi, C.; Sathya Priya, R.; Yuvaraj, M.; Muthumanickam, D.; Parasuraman, P.; Pazhanivelan, S. AgriDrones: A Holistic Review on the Integration of Drones in Indian Agriculture. Agric. Res. 2025, 14, 34–46.
- Merz, M.; Pedro, D.; Skliros, V.; Bergenhem, C.; Himanka, M.; Houge, T.; Matos-Carvalho, J.P.; Lundkvist, H.; Cürüklü, B.; Hamrén, R.; et al. Autonomous UAS-Based Agriculture Applications: General Overview and Relevant European Case Studies. Drones 2022, 6, 128.
- Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in Agriculture: A Review and Bibliometric Analysis. Comput. Electron. Agric. 2022, 198, 107017.
- Guebsi, R.; Mami, S.; Chokmani, K. Drones in Precision Agriculture: A Comprehensive Review of Applications, Technologies, and Challenges. Drones 2024, 8, 686.
- Weiss, M.; Jacob, F.; Duveiller, G. Remote Sensing for Agricultural Applications: A Meta-Review. Remote Sens. Environ. 2020, 236, 111402.
- Zhang, X.; Bi, P.; Zhou, Q.; Liu, L.; Ren, L.; Luo, Y. Monitoring of Yellow Leaf Disease (YLD) Damage Based on Ground-Based LiDAR and UAV Multispectral Data. Comput. Electron. Agric. 2025, 236, 110461.
- Teixeira, I.; Morais, R.; Sousa, J.J.; Cunha, A. Deep Learning Models for the Classification of Crops in Aerial Imagery: A Review. Agriculture 2023, 13, 965.
- Niu, H.; Chen, Y. Smart Big Data in Digital Agriculture Applications: Acquisition, Advanced Analytics, and Plant Physiology-Informed Artificial Intelligence; Agriculture Automation and Control; Springer Nature: Cham, Switzerland, 2024.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90.
- Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252.
- Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-Based Learning Applied to Document Recognition. Proc. IEEE 1998, 86, 2278–2324.
- Nair, V.; Hinton, G.E. Rectified Linear Units Improve Restricted Boltzmann Machines. In Proceedings of the 27th International Conference on Machine Learning (ICML 2010); Omnipress: Madison, WI, USA, 2010; pp. 807–814. Available online: https://dl.acm.org/doi/10.5555/3104322.3104425 (accessed on 19 October 2025).
- Zeiler, M.D.; Fergus, R. Visualizing and Understanding Convolutional Networks. In Computer Vision—ECCV 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2014; Volume 8689, pp. 818–833.
- Scherer, D.; Müller, A.; Behnke, S. Evaluation of Pooling Operations in Convolutional Architectures for Object Recognition. In Artificial Neural Networks—ICANN 2010; Diamantaras, K., Duch, W., Iliadis, L.S., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6354, pp. 92–101.
- Goodfellow, I.; Courville, A.; Bengio, Y. Deep Learning; Adaptive Computation and Machine Learning; The MIT Press: Cambridge, MA, USA, 2016.
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition; IEEE: Columbus, OH, USA, 2014; pp. 580–587.
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. arXiv 2015, arXiv:1506.01497.
- Girshick, R. Fast R-CNN. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV); IEEE: Santiago, Chile, 2015; pp. 1440–1448.
- Sapkota, R.; Flores-Calero, M.; Qureshi, R.; Badgujar, C.; Nepal, U.; Poulose, A.; Zeno, P.; Vaddevolu, U.B.P.; Khan, S.; Shoman, M.; et al. YOLO Advances to Its Genesis: A Decadal and Comprehensive Review of the You Only Look Once (YOLO) Series. Artif. Intell. Rev. 2025, 58, 274.
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Computer Vision—ECCV 2016; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2016; Volume 9905, pp. 21–37.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR); IEEE: Las Vegas, NV, USA, 2016; pp. 779–788.
- Ali, M.L.; Zhang, Z. The YOLO Framework: A Comprehensive Review of Evolution, Applications, and Benchmarks in Object Detection. Computers 2024, 13, 336.
- Katkuri, A.V.R.; Madan, H.; Khatri, N.; Abdul-Qawy, A.S.H.; Patnaik, K.S. Autonomous UAV Navigation Using Deep Learning-Based Computer Vision Frameworks: A Systematic Literature Review. Array 2024, 23, 100361.
- Xing, Y.; Liu, X.; Wang, X. Integrating UAVs, Satellite Remote Sensing, and Machine Learning in Precision Agriculture: Pathways to Sustainable Food Production, Resource Efficiency, and Scalable Innovation. Front. Agron. 2026, 7, 1670380.
- Agrawal, J.; Arafat, M.Y. Transforming Farming: A Review of AI-Powered UAV Technologies in Precision Agriculture. Drones 2024, 8, 664.
- Buczyński, K.; Kapłan, M.; Jarosz, Z. Review of the Report on the Nutritional and Health-Promoting Values of Species of the Rubus L. Genus. Agriculture 2024, 14, 1324.
- DJI Agriculture. Available online: https://ag.dji.com/Mavic-3-m (accessed on 20 October 2025).
- Kamilczynski. GitHub Repository. 2026. Available online: https://github.com/kamilczynski/detection-of-floricane-raspberry-shrubs-from-unmanned-aerial-vehicles-imagery-using-yolo-models (accessed on 16 February 2026).
- Tzutalin. LabelImg: Image Annotation Tool, Version 1.8.6; GitHub Repository. 2015. Available online: https://github.com/tzutalin/labelimg (accessed on 1 June 2025).
- Yaseen, M. What Is YOLOv8: An In-Depth Exploration of the Internal Features of the Next-Generation Object Detector. arXiv 2024, arXiv:2408.15857.
- Alif, M.A.R.; Hussain, M. YOLOv1 to YOLOv10: A Comprehensive Review of YOLO Variants and Their Application in the Agricultural Domain. arXiv 2024, arXiv:2406.10139.
- Ultralytics. Available online: https://docs.ultralytics.com/models/yolov8/#supported-tasks-and-modes (accessed on 20 October 2025).
- Khanam, R.; Hussain, M. YOLOv11: An Overview of the Key Architectural Enhancements. arXiv 2024, arXiv:2410.17725.
- Jegham, N.; Koh, C.Y.; Abdelatti, M.; Hendawi, A. YOLO Evolution: A Comprehensive Benchmark and Architectural Review of YOLOv12, YOLO11, and Their Previous Versions. arXiv 2025, arXiv:2411.00201.
- Ultralytics. Available online: https://docs.ultralytics.com/models/yolo11/ (accessed on 20 October 2025).
- Tian, Y.; Ye, Q.; Doermann, D. YOLOv12: Attention-Centric Real-Time Object Detectors. arXiv 2025, arXiv:2502.12524.
- Ultralytics. Available online: https://docs.ultralytics.com/models/yolo12 (accessed on 20 October 2025).
- Ultralytics. Available online: https://docs.ultralytics.com/guides/yolo-performance-metrics/#class-wise-metrics (accessed on 10 November 2025).
- Zhu, Y.; Zhou, J.; Yang, Y.; Liu, L.; Liu, F.; Kong, W. Rapid Target Detection of Fruit Trees Using UAV Imaging and Improved Light YOLOv4 Algorithm. Remote Sens. 2022, 14, 4324.
- Tian, H.; Fang, X.; Lan, Y.; Ma, C.; Huang, H.; Lu, X.; Zhao, D.; Liu, H.; Zhang, Y. Extraction of Citrus Trees from UAV Remote Sensing Imagery Using YOLOv5s and Coordinate Transformation. Remote Sens. 2022, 14, 4208.
- Zhang, Y.; Fang, X.; Guo, J.; Wang, L.; Tian, H.; Yan, K.; Lan, Y. CURI-YOLOv7: A Lightweight YOLOv7tiny Target Detector for Citrus Trees from UAV Remote Sensing Imagery Based on Embedded Device. Remote Sens. 2023, 15, 4647.
- Nguyen, H.D.; McHenry, B.; Nguyen, T.; Zappone, H.; Thompson, A.; Tran, C.; Segrest, A.; Tonon, L. Accurate Crop Yield Estimation of Blueberries Using Deep Learning and Smart Drones. arXiv 2025, arXiv:2501.02344.
- Gavrilović, M.; Jovanović, D.; Božović, P.; Benka, P.; Govedarica, M. Vineyard Zoning and Vine Detection Using Machine Learning in Unmanned Aerial Vehicle Imagery. Remote Sens. 2024, 16, 584.
- Jintasuttisak, T.; Edirisinghe, E.; Elbattay, A. Deep Neural Network Based Date Palm Tree Detection in Drone Imagery. Comput. Electron. Agric. 2022, 192, 106560.
- Peng, H.; Xie, H.; Li, W.; Liu, H.; Li, X. YOLOv11-Litchi: Efficient Litchi Fruit Detection Based on UAV-Captured Agricultural Imagery in Complex Orchard Environments. arXiv 2025, arXiv:2510.10141.
- Wang, Q.; Pu, Z.; Luo, L.; Wang, L.; Gao, J. A Study on Tree Species Recognition in UAV Remote Sensing Imagery Based on an Improved YOLOv11 Model. Appl. Sci. 2025, 15, 8779.
- Jemaa, H.; Bouachir, W.; Leblon, B.; LaRocque, A.; Haddadi, A.; Bouguila, N. UAV-Based Computer Vision System for Orchard Apple Tree Detection and Health Assessment. Remote Sens. 2023, 15, 3558.
- Jemaa, H.; Bouachir, W.; Leblon, B.; Bouguila, N. Computer Vision System for Detecting Orchard Trees from UAV Images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, XLIII-B4-2022, 661–668.
- Kelly, M.; Feirer, S.; Hogan, S.; Lyons, A.; Lin, F.; Jacygrad, E. Mapping Orchard Trees from UAV Imagery Through One Growing Season: A Comparison Between OBIA-Based and Three CNN-Based Object Detection Methods. Drones 2025, 9, 593.
- Feng, Y.; Chen, W.; Ma, Y.; Zhang, Z.; Gao, P.; Lv, X. Cotton Seedling Detection and Counting Based on UAV Multispectral Images and Deep Learning Methods. Remote Sens. 2023, 15, 2680.
- Wang, X.; Zhang, C.; Qiang, Z.; Liu, C.; Wei, X.; Cheng, F. A Coffee Plant Counting Method Based on Dual-Channel NMS and YOLOv9 Leveraging UAV Multispectral Imaging. Remote Sens. 2024, 16, 3810.
- El Sakka, M.; Ivanovici, M.; Chaari, L.; Mothe, J. A Review of CNN Applications in Smart Agriculture Using Multimodal Data. Sensors 2025, 25, 472.
- El Melki, M.N.; Yahyaoui, A.; Faqeih, K.Y.; Al-Khayri, J.M. Agricultural Drones: An Eye in the Sky for Smart Agriculture. In Handbook of Agricultural Technologies; Al-Khayri, J.M., Yatoo, A.M., Jain, S.M., Penna, S., Eds.; Springer Nature: Singapore, 2025; pp. 1–23.
- Gamboa-Cruzado, J.; Estrada-Gutierrez, J.; Bustos-Romero, C.; Alzamora Rivero, C.; Valenzuela, J.N.; Tavera Romero, C.A.; Gamarra-Moreno, J.; Amayo-Gamboa, F. A Review of Drones in Smart Agriculture: Issues, Models, Trends, and Challenges. Sustainability 2026, 18, 507.
- Makam, S.; Komatineni, B.K.; Meena, S.S.; Meena, U. Unmanned Aerial Vehicles (UAVs): An Adoptable Technology for Precise and Smart Farming. Discov. Internet Things 2024, 4, 12.
- Asif, M.; Rayamajhi, A.; Mahmud, M.S. Technological Progress Toward Peanut Disease Management: A Review. Sensors 2025, 25, 1255.
- Șerban, M.I.; Grad-Rusu, E.; Florian, T.; Grad, M.; Florian, V.C. Efficacy of Drone-Applied Fungicide Treatments in Control of Sunflower Diseases. Drones 2026, 10, 33.
- Furiosi, M.; Triachini, S.; Beone, G.M.; Fontanella, M.C.; Gaaied, S.; Arbi, G.; Lomadze, A.; Grella, M.; Mozzanini, E.; Dicembrini, E.; et al. Aerial Spray Application of Plant Protection Products for Grapevine Downy Mildew Control: Efficacy and Canopy Deposit Evaluation in Semi-Field Trials. Agronomy 2025, 15, 2703.
- Pansy, D.L.; Murali, M. UAV Hyperspectral Remote Sensor Images for Mango Plant Disease and Pest Identification Using MD-FCM and XCS-RBFNN. Environ. Monit. Assess. 2023, 195, 1120.
- Ochieng’, V.; Rwomushana, I.; Ong’amo, G.; Ndegwa, P.; Kamau, S.; Makale, F.; Chacha, D.; Gadhia, K.; Akiri, M. Optimum Flight Height for the Control of Desert Locusts Using Unmanned Aerial Vehicles (UAV). Drones 2023, 7, 233.
- Alsadik, B.; Ellsäßer, F.J.; Awawdeh, M.; Al-Rawabdeh, A.; Almahasneh, L.; Oude Elberink, S.; Abuhamoor, D.; Al Asmar, Y. Remote Sensing Technologies Using UAVs for Pest and Disease Monitoring: A Review Centered on Date Palm Trees. Remote Sens. 2024, 16, 4371.
- Juan, Y.; Ke, Z.; Chen, Z.; Zhong, D.; Chen, W.; Yin, L. Rapid Density Estimation of Tiny Pests from Sticky Traps Using Qpest RCNN in Conjunction with UWB-UAV-Based IoT Framework. Neural Comput. Appl. 2024, 36, 9779–9803.
- Albattah, W.; Masood, M.; Javed, A.; Nawaz, M.; Albahli, S. Custom CornerNet: A Drone-Based Improved Deep Learning Technique for Large-Scale Multiclass Pest Localization and Classification. Complex Intell. Syst. 2023, 9, 1299–1316.
- Amarasingam, N.; Powell, K.; Sandino, J.; Bratanov, D.; Ashan Salgadoe, A.S.; Gonzalez, F. Mapping of Insect Pest Infestation for Precision Agriculture: A UAV-Based Multispectral Imaging and Deep Learning Techniques. Int. J. Appl. Earth Obs. Geoinf. 2025, 137, 104413.
- Bautista, A.S.; Tarrazó-Serrano, D.; Uris, A.; Blesa, M.; Estruch-Guitart, V.; Castiñeira-Ibáñez, S.; Rubio, C. Remote Sensing Evaluation Drone Herbicide Application Effectiveness for Controlling Echinochloa Spp. in Rice Crop in Valencia (Spain). Sensors 2024, 24, 804.
- Fathimathul Rajeena P.P.; Ismail, W.N.; Ali, M.A.S. A Metaheuristic Harris Hawks Optimization Algorithm for Weed Detection Using Drone Images. Appl. Sci. 2023, 13, 7083.
- Kebede, A.S.; Muluneh, T.W.; Adege, A.B. Detection of Weeds in Teff Crops Using Deep Learning and UAV Imagery for Precision Herbicide Application. Sci. Rep. 2025, 15, 30708.
- Takekawa, J.Y.; Hagani, J.S.; Edmunds, T.J.; Collins, J.M.; Chappell, S.C.; Reynolds, W.H. The Sky Is Not the Limit: Use of a Spray Drone for the Precise Application of Herbicide and Control of an Invasive Plant in Managed Wetlands. Remote Sens. 2023, 15, 3845.
- Toscano, F.; Fiorentino, C.; Santana, L.S.; Magalhães, R.R.; Albiero, D.; Tomáš, Ř.; Klocová, M.; D’Antonio, P. Recent Developments and Future Prospects in the Integration of Machine Learning in Mechanised Systems for Autonomous Spraying: A Brief Review. AgriEngineering 2025, 7, 142.
- Miyoshi, K.; Hiraguri, T.; Shimizu, H.; Hattori, K.; Kimura, T.; Okubo, S.; Endo, K.; Shimada, T.; Shibasaki, A.; Takemura, Y. Development of Pear Pollination System Using Autonomous Drones. AgriEngineering 2025, 7, 68.
- Manthos, I.; Sotiropoulos, T.; Vagelas, I. Is the Artificial Pollination of Walnut Trees with Drones Able to Minimize the Presence of Xanthomonas Arboricola Pv. Juglandis? A Review. Appl. Sci. 2024, 14, 2732.
- Rice, C.R.; McDonald, S.T.; Shi, Y.; Gan, H.; Lee, W.S.; Chen, Y.; Wang, Z. Perception, Path Planning, and Flight Control for a Drone-Enabled Autonomous Pollination System. Robotics 2022, 11, 144.
- Hulens, D.; Van Ranst, W.; Cao, Y.; Goedemé, T. Autonomous Visual Navigation for a Flower Pollination Drone. Machines 2022, 10, 364.
- Chen, B.; Su, Q.; Li, Y.; Chen, R.; Yang, W.; Huang, C. Field Rice Growth Monitoring and Fertilization Management Based on UAV Spectral and Deep Image Feature Fusion. Agronomy 2025, 15, 886.
- Xia, X.; Zhang, R.; Ma, L.; Su, J.; Yi, T.; Zhang, L.; Chen, X. Optimization of Unmanned Aerial Vehicle Operational Parameters to Maximize Fertilizer Application Efficiency in Rice Cultivation. J. Clean. Prod. 2025, 514, 145762.
- Zhou, H.; Yao, W.; Su, D.; Guo, S.; Zheng, Z.; Yu, Z.; Gao, D.; Li, H.; Chen, C. Application of a Centrifugal Disc Fertilizer Spreading System for UAVs in Rice Fields. Heliyon 2024, 10, e29837.
- Al-Najadi, R.; Al-Mulla, Y.; Al-Abri, I.; Al-Sadi, A.M. Effectiveness of Drone-Based Thermal Sensors in Optimizing Controlled Environment Agriculture Performance under Arid Conditions. Sci. Rep. 2025, 15, 9042.
- Goswami, A.; Singh, R. A Review of Drone Based Irrigation System for Large Farms. JGEU 2025, 13, 323–338.
- Yadav, M.; Vashisht, B.B.; Vullaganti, N.; Kumar, P.; Jalota, S.K.; Kumar, A.; Kaushik, P. UAV-Enabled Approaches for Irrigation Scheduling and Water Body Characterization. Agric. Water Manag. 2024, 304, 109091.
- Sharma, H.; Sidhu, H.; Bhowmik, A. Remote Sensing Using Unmanned Aerial Vehicles for Water Stress Detection: A Review Focusing on Specialty Crops. Drones 2025, 9, 241.
- Ortega-Farias, S.; Ramírez-Cuesta, J.M.; Nieto, H. Recent Advances on Water Management Using UAV-Based Technologies. Irrig. Sci. 2025, 43, 1–3.
- Liang, Z.; Fu, Z.; Kiplagat, D.; Wang, W.; Yang, J.; Li, Z.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Rice Yield Prediction Base on UAV Multispectral Imagery Using Machine Learning Methods. Smart Agric. Technol. 2025, 12, 101549.
- Tripathi, R.; Gouda, A.K.; Jena, S.S.; Mohapatra, R.R.; Lal, M.K.; Dash, S.K.; Sahoo, R.N.; Nayak, A.K. Rice Yield Prediction Using UAV-Mounted RGB Sensors and Machine Learning Algorithms. Proc. Indian Natl. Sci. Acad. 2025.
- Kešelj, K.; Stamenković, Z.; Kostić, M.; Aćin, V.; Tekić, D.; Novaković, T.; Ivanišević, M.; Ivezić, A.; Magazin, N. Machine Learning (AutoML)-Driven Wheat Yield Prediction for European Varieties: Enhanced Accuracy Using Multispectral UAV Data. Agriculture 2025, 15, 1534.
- Dos Santos Felipetto, H.; Mercante, E.; Viana, O.; Elias, A.R.; Benin, G.; Scolari, L.; Armadori, A.; Donato, D.G. Combining Machine Learning with UAV Derived Multispectral Aerial Images for Wheat Yield Prediction, in Southern Brazil. Eur. J. Remote Sens. 2025, 58, 2464663.
- Ivošević, B.; Pajević, N.; Brdar, S.; Waqar, R.; Khan, M.; Valente, J. Comprehensive Dataset from High Resolution UAV Land Cover Mapping of Diverse Natural Environments in Serbia. Sci. Data 2025, 12, 66.
- Pinkas, J.; Toman, P.; Svoboda, J. Comparison of RGB and Multispectral Cameras for Targeted Applications in Agriculture. APP 2024, 51, 69–74.
- Zhang, S.; Wang, X.; Lin, H.; Dong, Y.; Qiang, Z. A Review of the Application of UAV Multispectral Remote Sensing Technology in Precision Agriculture. Smart Agric. Technol. 2025, 12, 101406.
- Wróblewska, W.; Pawlak, J.; Paszko, D. The Influence of Factors on the Yields of Two Raspberry Varieties (Rubus idaeus L.) and the Economic Results. ASPHC 2020, 19, 63–70.
- Wróblewska, W.; Pawlak, J.; Paszko, D. Economic Aspects in the Raspberry Production on the Example of Farms from Poland, Serbia and Ukraine. J. Hortic. Res. 2019, 27, 71–80.
- Paszko, D.; Krawiec, P.; Pawlak, J.; Wróblewska, W. Assess the Cost and Profitability of Raspberry Production under Cover in the Context of Building Competitive Advantage on Example of Selected Farm. Ann. PAAAE 2017, XIX, 218–223.
- Baranowska, A.; Skowera, B.; Węgrzyn, A. Wpływ Warunków Meteorologicznych i Zabiegów Agrotechnicznych na Wynik Produkcyjny i Ekonomiczny Uprawy Maliny Jesiennej—Studium Przypadku [The Influence of Meteorological Conditions and Agrotechnical Treatments on the Production and Economic Outcome of Autumn-Bearing Raspberry Cultivation—A Case Study]. Agron. Sci. 2025, 79, 169–182.
- Buczyński, K.; Kapłan, M.; Borkowska, A.; Kilmek, K. Wpływ Węgla Brunatnego na Wielkość i Jakość Plonu Maliny Odmiany Polana [The Effect of Lignite on the Yield Quantity and Quality of Raspberry cv. Polana]. Ann. Hortic. 2024, 32, 5–20.
- Kasetty, S.B.; K, R. Advancing Object Detection in Remote Sensing: A Rigorous Evaluation of YOLO Models. In Proceedings of the 2025 11th International Conference on Communication and Signal Processing (ICCSP); IEEE: Melmaruvathur, India, 2025; pp. 999–1004.
- Wang, A.; Chen, H.; Liu, L.; Chen, K.; Lin, Z.; Han, J.; Ding, G. YOLOv10: Real-Time End-to-End Object Detection. arXiv 2024, arXiv:2405.14458.
| Hyperparameter | Value |
|---|---|
| epochs | 200 |
| batch | 32 |
| imgsz | 640 |
| optimizer | 'SGD' |
| momentum | 0.937 |
| weight_decay | 0.0005 |
| lr0 | 0.01 |
| lrf | 0.01 |
| seed | 0 |
| augment | True |
| workers | 8 |
| patience | 0 |
| device | 'cuda' |
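The hyperparameters above map directly onto the Ultralytics training API. A minimal sketch, assuming `pip install ultralytics`, a CUDA GPU, and a dataset config file named `raspberry.yaml` (a placeholder, not a file published with the paper):

```python
# Sketch: one training run with the hyperparameters from the table above.
# The dataset config "raspberry.yaml" is a hypothetical placeholder name.
TRAIN_ARGS = dict(
    data="raspberry.yaml",   # placeholder dataset config (images + labels paths)
    epochs=200,
    batch=32,
    imgsz=640,
    optimizer="SGD",
    momentum=0.937,
    weight_decay=0.0005,
    lr0=0.01,                # initial learning rate
    lrf=0.01,                # final learning rate as a fraction of lr0
    seed=0,                  # fixed seed for reproducibility
    augment=True,
    workers=8,
    patience=0,              # in Ultralytics, 0 disables early stopping
    device="cuda",
)

def train_one(weights: str = "yolov8s.pt"):
    """Train a single model; the import is lazy so this sketch loads without the package."""
    from ultralytics import YOLO
    return YOLO(weights).train(**TRAIN_ARGS)
```

The same argument dictionary can be reused for YOLO11s and YOLO12s by swapping the `weights` checkpoint, which keeps the comparison between model generations controlled.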
| Component | Specification |
|---|---|
| Processor (CPU) | AMD Ryzen 9 9950X (Advanced Micro Devices, Inc., Sunnyvale, CA, USA) |
| Graphics Processing Unit (GPU) | ASUS GeForce RTX 5080 PRIME OC, 16 GB (ASUSTek Computer Inc., Taipei, Taiwan) |
| Motherboard | Gigabyte B850 AORUS ELITE WIFI7 AM5 (GIGA-BYTE Technology Co., Ltd., New Taipei City, Taiwan) |
| Memory (RAM) | Corsair Vengeance, DDR5 2 × 32 GB (64 GB) (Corsair Memory, Inc., Milpitas, CA, USA) |
| Storage | SSD Lexar NM790 2 TB (Lexar Co., Limited, Shatin, Hong Kong) |
| Software | Version |
|---|---|
| Microsoft Windows | 11 Pro (build 26200, 64-bit) |
| Python | 3.11.13 |
| PyTorch | 2.8.0 |
| CUDA | 12.8 |
| cuDNN | 9.1.0.2 (NVIDIA build 91002) |
| Ultralytics | 8.3.13 |
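The software stack in the table can be sanity-checked at runtime before training. A small stdlib-only sketch (the expected versions are copied from the table; the function name is illustrative):

```python
# Sketch: compare installed package versions against the versions tabulated above.
from importlib import metadata

EXPECTED = {
    "torch": "2.8.0",
    "ultralytics": "8.3.13",
}

def check_versions(expected):
    """Return {package: (installed_version_or_None, matches_expected)}."""
    report = {}
    for pkg, want in expected.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            have = None  # package not installed in this environment
        report[pkg] = (have, have == want)
    return report
```

Pinning and verifying versions this way matters for reproducibility, since detection metrics can shift slightly across PyTorch and Ultralytics releases.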
| Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|
| RGB | YOLOv8s | 0.997 | 0.996 | 0.995 | 0.975 | 0.997 |
| RGB | YOLO11s | 0.996 | 0.998 | 0.995 | 0.975 | 0.997 |
| RGB | YOLO12s | 0.996 | 0.996 | 0.995 | 0.973 | 0.996 |
| NIR | YOLOv8s | 0.997 | 0.997 | 0.995 | 0.984 | 0.997 |
| NIR | YOLO11s | 0.997 | 0.996 | 0.995 | 0.986 | 0.997 |
| NIR | YOLO12s | 0.998 | 0.998 | 0.995 | 0.986 | 0.998 |
| RE | YOLOv8s | 0.996 | 0.999 | 0.995 | 0.985 | 0.998 |
| RE | YOLO11s | 0.997 | 0.997 | 0.995 | 0.985 | 0.997 |
| RE | YOLO12s | 0.998 | 0.998 | 0.995 | 0.983 | 0.998 |
| R | YOLOv8s | 0.996 | 0.996 | 0.995 | 0.941 | 0.996 |
| R | YOLO11s | 0.995 | 0.994 | 0.995 | 0.949 | 0.995 |
| R | YOLO12s | 0.996 | 0.996 | 0.995 | 0.952 | 0.996 |
| G | YOLOv8s | 0.988 | 0.999 | 0.995 | 0.973 | 0.999 |
| G | YOLO11s | 0.999 | 0.998 | 0.995 | 0.974 | 0.999 |
| G | YOLO12s | 0.998 | 0.999 | 0.995 | 0.972 | 0.998 |
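The F1-scores tabulated above are the harmonic mean of precision and recall, F1 = 2PR/(P + R). A short sketch to recheck any row (note the tabulated values are rounded to three decimals, so recomputed F1 agrees only to within rounding):

```python
# Sketch: recompute F1 from a row's precision and recall.
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# RGB / YOLOv8s row: P=0.997, R=0.996, tabulated F1=0.997
assert abs(f1(0.997, 0.996) - 0.997) < 1e-3
```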
| Training Image Modality | Testing Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|---|
| RGB | NIR | YOLOv8s | 0.922 | 0.803 | 0.887 | 0.737 | 0.859 |
| RGB | NIR | YOLO11s | 0.236 | 0.272 | 0.285 | 0.226 | 0.253 |
| RGB | NIR | YOLO12s | 0.849 | 0.382 | 0.573 | 0.509 | 0.527 |
| RGB | RE | YOLOv8s | 0.925 | 0.822 | 0.904 | 0.773 | 0.871 |
| RGB | RE | YOLO11s | 0.875 | 0.287 | 0.472 | 0.370 | 0.432 |
| RGB | RE | YOLO12s | 0.985 | 0.951 | 0.989 | 0.905 | 0.968 |
| RGB | R | YOLOv8s | 0.757 | 0.395 | 0.483 | 0.279 | 0.519 |
| RGB | R | YOLO11s | 0.774 | 0.264 | 0.408 | 0.253 | 0.393 |
| RGB | R | YOLO12s | 0.867 | 0.677 | 0.813 | 0.512 | 0.760 |
| RGB | G | YOLOv8s | 0.897 | 0.619 | 0.743 | 0.553 | 0.733 |
| RGB | G | YOLO11s | 0.757 | 0.218 | 0.344 | 0.232 | 0.339 |
| RGB | G | YOLO12s | 0.960 | 0.909 | 0.979 | 0.784 | 0.934 |
| Training Image Modality | Testing Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|---|
| NIR | RGB | YOLOv8s | 0.172 | 0.104 | 0.069 | 0.062 | 0.130 |
| NIR | RGB | YOLO11s | 0.698 | 0.482 | 0.547 | 0.422 | 0.570 |
| NIR | RGB | YOLO12s | 0.497 | 0.424 | 0.450 | 0.337 | 0.458 |
| NIR | RE | YOLOv8s | 0.994 | 0.996 | 0.995 | 0.972 | 0.995 |
| NIR | RE | YOLO11s | 0.997 | 0.997 | 0.995 | 0.978 | 0.997 |
| NIR | RE | YOLO12s | 0.996 | 0.996 | 0.995 | 0.978 | 0.996 |
| NIR | R | YOLOv8s | 0.573 | 0.355 | 0.403 | 0.304 | 0.438 |
| NIR | R | YOLO11s | 0.854 | 0.426 | 0.560 | 0.384 | 0.568 |
| NIR | R | YOLO12s | 0.691 | 0.549 | 0.613 | 0.392 | 0.612 |
| NIR | G | YOLOv8s | 0.828 | 0.605 | 0.718 | 0.624 | 0.699 |
| NIR | G | YOLO11s | 0.937 | 0.790 | 0.907 | 0.756 | 0.857 |
| NIR | G | YOLO12s | 0.930 | 0.918 | 0.971 | 0.822 | 0.924 |
| Training Image Modality | Testing Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|---|
| RE | RGB | YOLOv8s | 0.801 | 0.358 | 0.426 | 0.380 | 0.495 |
| RE | RGB | YOLO11s | 0.757 | 0.529 | 0.599 | 0.487 | 0.623 |
| RE | RGB | YOLO12s | 0.732 | 0.644 | 0.725 | 0.606 | 0.685 |
| RE | NIR | YOLOv8s | 0.994 | 0.998 | 0.995 | 0.982 | 0.996 |
| RE | NIR | YOLO11s | 0.995 | 0.995 | 0.995 | 0.981 | 0.995 |
| RE | NIR | YOLO12s | 0.998 | 0.997 | 0.995 | 0.981 | 0.997 |
| RE | R | YOLOv8s | 0.874 | 0.492 | 0.640 | 0.514 | 0.630 |
| RE | R | YOLO11s | 0.883 | 0.464 | 0.589 | 0.444 | 0.608 |
| RE | R | YOLO12s | 0.857 | 0.727 | 0.842 | 0.632 | 0.787 |
| RE | G | YOLOv8s | 0.963 | 0.874 | 0.949 | 0.875 | 0.917 |
| RE | G | YOLO11s | 0.949 | 0.907 | 0.959 | 0.877 | 0.927 |
| RE | G | YOLO12s | 0.994 | 0.984 | 0.995 | 0.927 | 0.989 |
| Training Image Modality | Testing Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|---|
| R | RGB | YOLOv8s | 0.848 | 0.932 | 0.941 | 0.857 | 0.888 |
| R | RGB | YOLO11s | 0.951 | 0.949 | 0.985 | 0.902 | 0.950 |
| R | RGB | YOLO12s | 0.864 | 0.926 | 0.948 | 0.860 | 0.894 |
| R | NIR | YOLOv8s | 0.908 | 0.727 | 0.857 | 0.727 | 0.808 |
| R | NIR | YOLO11s | 0.932 | 0.752 | 0.856 | 0.718 | 0.833 |
| R | NIR | YOLO12s | 0.942 | 0.856 | 0.922 | 0.712 | 0.897 |
| R | RE | YOLOv8s | 0.913 | 0.885 | 0.949 | 0.841 | 0.899 |
| R | RE | YOLO11s | 0.950 | 0.911 | 0.964 | 0.880 | 0.930 |
| R | RE | YOLO12s | 0.952 | 0.860 | 0.927 | 0.808 | 0.904 |
| R | G | YOLOv8s | 0.993 | 0.994 | 0.995 | 0.951 | 0.994 |
| R | G | YOLO11s | 0.998 | 0.994 | 0.994 | 0.950 | 0.996 |
| R | G | YOLO12s | 0.997 | 0.996 | 0.995 | 0.949 | 0.996 |
| Training Image Modality | Testing Image Modality | Model | Precision | Recall | mAP50 | mAP50:95 | F1-Score |
|---|---|---|---|---|---|---|---|
| G | RGB | YOLOv8s | 0.928 | 0.876 | 0.928 | 0.851 | 0.901 |
| G | RGB | YOLO11s | 0.922 | 0.937 | 0.968 | 0.879 | 0.930 |
| G | RGB | YOLO12s | 0.888 | 0.956 | 0.973 | 0.905 | 0.921 |
| G | NIR | YOLOv8s | 0.993 | 0.991 | 0.995 | 0.958 | 0.992 |
| G | NIR | YOLO11s | 0.990 | 0.990 | 0.994 | 0.955 | 0.990 |
| G | NIR | YOLO12s | 0.994 | 0.993 | 0.995 | 0.960 | 0.994 |
| G | RE | YOLOv8s | 0.994 | 0.993 | 0.995 | 0.976 | 0.994 |
| G | RE | YOLO11s | 0.998 | 0.995 | 0.995 | 0.974 | 0.996 |
| G | RE | YOLO12s | 0.996 | 0.995 | 0.995 | 0.971 | 0.995 |
| G | R | YOLOv8s | 0.985 | 0.991 | 0.995 | 0.861 | 0.988 |
| G | R | YOLO11s | 0.984 | 0.977 | 0.993 | 0.848 | 0.981 |
| G | R | YOLO12s | 0.996 | 0.987 | 0.995 | 0.862 | 0.991 |
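The cross-domain tables above enumerate every ordered pair of distinct training and testing modalities for each model (5 training modalities, 4 test modalities each, 3 models: 60 rows). A sketch of that evaluation grid, with the Ultralytics validation call kept as a hedged placeholder (the weight filenames and dataset configs are hypothetical, not artifacts published with the paper):

```python
# Sketch: enumerate the cross-domain (train modality, test modality, model) grid.
from itertools import product

MODALITIES = ["RGB", "NIR", "RE", "R", "G"]
MODELS = ["YOLOv8s", "YOLO11s", "YOLO12s"]

def cross_domain_grid(modalities=MODALITIES, models=MODELS):
    """Yield every (train_modality, test_modality, model) cross-modality triple."""
    for train_m, test_m in product(modalities, repeat=2):
        if train_m == test_m:
            continue  # same-modality results belong to the first results table
        for model in models:
            yield train_m, test_m, model

def evaluate(train_m: str, test_m: str, model: str):
    """Placeholder: validate a trained model on another modality's test split.
    Weight files like "rgb_yolov8s.pt" and configs like "nir.yaml" are
    hypothetical names; requires `pip install ultralytics`."""
    from ultralytics import YOLO  # lazy import so the sketch loads without the package
    net = YOLO(f"{train_m.lower()}_{model.lower()}.pt")
    return net.val(data=f"{test_m.lower()}.yaml", split="test")
```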
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Kapłan, M.; Buczyński, K.; Jarosz, Z. Detection of Floricane Raspberry Shrubs from Unmanned Aerial Vehicle Imagery Using YOLO Models. Agriculture 2026, 16, 664. https://doi.org/10.3390/agriculture16060664

