A Deployment-Oriented Benchmarking of You Look Only Once (YOLO) Models for Orange Detection and Segmentation in Agricultural Robotics
Abstract
1. Introduction
2. Materials and Methods
2.1. Model Selection
2.2. Benchmarking Models
2.2.1. YOLOv5
2.2.2. GrapeDetectNet (GDN)
2.2.3. Improved YOLOv5
2.2.4. YOLOv7
2.2.5. DSW-YOLO
2.2.6. YOLOv8
2.2.7. NVW-YOLOv8
2.2.8. TCAttn-YOLOv8
2.2.9. YOLO11
2.2.10. CO-YOLO
2.3. Proposed Benchmarking
2.3.1. Dataset
2.3.2. Implementation Details
2.3.3. Experimental Setup
2.3.4. Evaluation Metrics
3. Results
3.1. Identification Accuracy
3.1.1. YOLOv5 Derivatives
3.1.2. YOLOv7 Derivatives
3.1.3. YOLOv8 Derivatives
3.1.4. YOLO11 Derivatives
3.2. Robustness
3.3. Computational Complexity
3.4. Execution Time Analysis
3.5. Energy Consumption
4. Discussion
4.1. A Practical Guide for Model Selection in Agricultural Robotics
4.2. Limitations and Future Research Directions
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. United Nations. World Population Prospects 2019: Highlights—Ten Key Findings. Available online: https://population.un.org/wpp/Publications/Files/WPP2019_10KeyFindings.pdf (accessed on 19 April 2025).
2. Ishangulyyev, R.; Kim, S.; Lee, S.H. Understanding Food Loss and Waste—Why Are We Losing and Wasting Food? Foods 2019, 8, 297.
3. Economou, F.; Chatziparaskeva, G.; Papamichael, I.; Loizia, P.; Voukkali, I.; Navarro-Pedreño, J.; Klontza, E.; Lekkas, D.F.; Naddeo, V.; Zorpas, A.A. The Concept of Food Waste and Food Loss Prevention and Measuring Tools. Waste Manag. Res. 2024, 42, 651–669.
4. WWF-UK. Driven to Waste: The Global Impact of Food Loss and Waste on Farms. Available online: https://wwfint.awsassets.panda.org/downloads/wwf_uk__driven_to_waste___the_global_impact_of_food_loss_and_waste_on_farms.pdf (accessed on 19 April 2025).
5. Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, Integration, and Field Evaluation of a Robotic Apple Harvester. J. Field Robot. 2017, 34, 1140–1159.
6. Homayouni, T.; Maharlooei, M.; Toudeshki, A.; Ferguson, L.; Ehsani, R. Simultaneous Trunk and Canopy Shaking Improves Table Olive Harvester Efficiency Versus Trunk Shaking Alone. Horticulturae 2023, 9, 640.
7. Fountas, S.; Malounas, I.; Athanasakos, L.; Avgoustakis, I.; Espejo-Garcia, B. AI-Assisted Vision for Agricultural Robots. AgriEngineering 2022, 4, 674–694.
8. Ren, G.; Wu, T.; Lin, T.; Yang, L.; Chowdhary, G.; Ting, K.C.; Ying, Y. Mobile Robotics Platform for Strawberry Sensing and Harvesting within Precision Indoor Farming Systems. J. Field Robot. 2024, 41, 2047–2065.
9. Li, M.; Liu, P. A Bionic Adaptive End-Effector with Rope-Driven Fingers for Pear Fruit Harvesting. Comput. Electron. Agric. 2023, 211, 107952.
10. Yang, Q.; Du, X.; Wang, Z.; Meng, Z.; Ma, Z.; Zhang, Q. A Review of Core Agricultural Robot Technologies for Crop Productions. Comput. Electron. Agric. 2023, 206, 107701.
11. Li, T.; Xie, F.; Zhao, Z.; Zhao, H.; Guo, X.; Feng, Q. A Multi-Arm Robot System for Efficient Apple Harvesting: Perception, Task Plan and Control. Comput. Electron. Agric. 2023, 211, 107979.
12. Yu, X.; Fan, Z.; Wang, X.; Wan, H.; Wang, P.; Zeng, X.; Jia, F. A Lab-Customized Autonomous Humanoid Apple Harvesting Robot. Comput. Electr. Eng. 2021, 96, 107459.
13. Magalhães, S.A.; Moreira, A.P.; dos Santos, F.N.; Dias, J. Active Perception Fruit Harvesting Robots—A Systematic Review. J. Intell. Robot. Syst. 2022, 105, 14.
14. Zhou, H.; Wang, X.; Au, W.; Kang, H.; Chen, C. Intelligent Robots for Fruit Harvesting: Recent Developments and Future Challenges. Precis. Agric. 2022, 23, 1856–1907.
15. Beldek, C.; Cunningham, J.; Aydin, M.; Sariyildiz, E.; Phung, S.L.; Alici, G. Sensing-Based Robustness Challenges in Agricultural Robotic Harvesting. In Proceedings of the 2025 IEEE International Conference on Mechatronics (ICM), Wollongong, Australia, 25–27 February 2025; pp. 1–6.
16. Sakai, H.; Shiigi, T.; Kondo, N.; Ogawa, Y.; Taguchi, N. Accurate Position Detecting during Asparagus Spear Harvesting Using a Laser Sensor. Eng. Agric. Environ. Food 2013, 6, 105–110.
17. Liu, M.; Jia, W.; Wang, Z.; Niu, Y.; Yang, X.; Ruan, C. An Accurate Detection and Segmentation Model of Obscured Green Fruits. Comput. Electron. Agric. 2022, 197, 106984.
18. Kim, S.; Hong, S.-J.; Ryu, J.; Kim, E.; Lee, C.-H.; Kim, G. Application of Amodal Segmentation on Cucumber Segmentation and Occlusion Recovery. Comput. Electron. Agric. 2023, 210, 107847.
19. Rathore, D.; Divyanth, L.G.; Reddy, K.L.S.; Chawla, Y.; Buragohain, M.; Soni, P.; Machavaram, R.; Hussain, S.Z.; Ray, H.; Ghosh, A. A Two-Stage Deep-Learning Model for Detection and Occlusion-Based Classification of Kashmiri Orchard Apples for Robotic Harvesting. J. Biosyst. Eng. 2023, 48, 242–256.
20. Íñiguez, R.; Palacios, F.; Barrio, I.; Hernández, I.; Gutiérrez, S.; Tardaguila, J. Impact of Leaf Occlusions on Yield Assessment by Computer Vision in Commercial Vineyards. Agronomy 2021, 11, 1003.
21. WineAmerica. Wine as Agriculture—WineAmerica. Available online: https://wineamerica.org/wine-as-agriculture (accessed on 12 June 2025).
22. Lin, G.; Tang, Y.; Zou, X.; Li, J.; Xiong, J. In-Field Citrus Detection and Localisation Based on RGB-D Image Analysis. Biosyst. Eng. 2019, 186, 34–44.
23. Fujinaga, T.; Yasukawa, S.; Ishii, K. Evaluation of Tomato Fruit Harvestability for Robotic Harvesting. In Proceedings of the 2021 IEEE/SICE International Symposium on System Integration (SII), Iwaki, Fukushima, Japan, 11–14 January 2021; pp. 35–39.
24. Li, X.; Pan, J.; Xie, F.; Zeng, J.; Li, Q.; Huang, X.; Liu, D.; Wang, X. Fast and Accurate Green Pepper Detection in Complex Backgrounds via an Improved YOLOv4-Tiny Model. Comput. Electron. Agric. 2021, 191, 106503.
25. Zu, L.; Zhao, Y.; Liu, J.; Su, F.; Zhang, Y.; Liu, P. Detection and Segmentation of Mature Green Tomatoes Based on Mask R-CNN with Automatic Image Acquisition Approach. Sensors 2021, 21, 7842.
26. Chen, W.; Lu, S.; Liu, B.; Chen, M.; Li, G.; Qian, T. CitrusYOLO: An Algorithm for Citrus Detection under Orchard Environment Based on YOLOv4. Multimed. Tools Appl. 2022, 81, 31363–31389.
27. Yang, J.; Deng, H.; Zhang, Y.; Zhou, Y.; Miao, T. Application of Amodal Segmentation for Shape Reconstruction and Occlusion Recovery in Occluded Tomatoes. Front. Plant Sci. 2024, 15, 1391963.
28. Chu, P.; Li, Z.; Zhang, K.; Chen, D.; Lammers, K.; Lu, R. O2RNet: Occluder-Occludee Relational Network for Robust Apple Detection in Clustered Orchard Environments. Smart Agric. Technol. 2023, 5, 100284.
29. Wang, Y.; Xiao, S.; Meng, X. Incoherent Region-Aware Occlusion Instance Synthesis for Grape Amodal Detection. Sensors 2025, 25, 1546.
30. Li, Y.; Liao, J.; Wang, J.; Luo, Y.; Lan, Y. Prototype Network for Predicting Occluded Picking Position Based on Lychee Phenotypic Features. Agronomy 2023, 13, 2435.
31. Yuan, Y.; Liu, H.; Yang, Z.; Zheng, J.; Li, J.; Zhao, L. A Detection Method for Occluded and Overlapped Apples under Close-Range Targets. Pattern Anal. Appl. 2024, 27, 12.
32. Kok, E.; Chen, C. Occluded Apples Orientation Estimator Based on Deep Learning Model for Robotic Harvesting. Comput. Electron. Agric. 2024, 219, 108781.
33. Gong, L.; Wang, W.; Wang, T.; Liu, C. Robotic Harvesting of the Occluded Fruits with a Precise Shape and Position Reconstruction Approach. J. Field Robot. 2021, 39, 69–84.
34. Chen, C.; Li, B.; Liu, J.; Bao, T.; Ren, N. Monocular Positioning of Sweet Peppers: An Instance Segmentation Approach for Harvest Robots. Biosyst. Eng. 2020, 196, 15–28.
35. Gené-Mola, J.; Ferrer-Ferrer, M.; Gregorio, E.; Blok, P.M.; Hemming, J.; Morros, J.-R.; Rosell-Polo, J.R.; Vilaplana, V.; Ruiz-Hidalgo, J. Looking behind Occlusions: A Study on Amodal Segmentation for Robust On-Tree Apple Fruit Size Estimation. Comput. Electron. Agric. 2023, 209, 107854.
36. Liang, J.; Huang, K.; Lei, H.; Zhong, Z.; Cai, Y.; Jiao, Z. Occlusion-Aware Fruit Segmentation in Complex Natural Environments under Shape Prior. Comput. Electron. Agric. 2024, 217, 108620.
37. Tu, S.; Deng, L.; Huang, Z.; Tang, Q.; Peng, L. Passion Fruit Detection and Counting Based on Multiple Scale Faster R-CNN Using RGB-D Images. Precis. Agric. 2020, 21, 1072–1091.
38. Chen, J.; Fu, H.; Lin, C.; Liu, X.; Wang, L.; Lin, Y. YOLOPears: A Novel Benchmark of YOLO Object Detectors for Multi-Class Pear Surface Defect Detection in Quality Grading Systems. Front. Plant Sci. 2025, 16, 1483824.
39. Mirhaji, H.; Soleymani, M.; Asakereh, A.; Abdanan Mehdizadeh, S. Fruit Detection and Load Estimation of an Orange Orchard Using the YOLO Models through Simple Approaches in Different Imaging and Illumination Conditions. Comput. Electron. Agric. 2021, 191, 106533.
40. Kamat, P.; Gite, S.; Chandekar, H.; Dlima, L.; Pradhan, B. Multi-Class Fruit Ripeness Detection Using YOLO and SSD Object Detection Models. Discov. Appl. Sci. 2025, 7, 931.
41. Page, M.; Tetzlaff, J.; Moher, D. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. Value Health 2021, 23, S312–S313.
42. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. arXiv 2015, arXiv:1506.02640.
43. Badgujar, C.M.; Poulose, A.; Gan, H. Agricultural Object Detection with You Only Look Once (YOLO) Algorithm: A Bibliometric and Systematic Literature Review. Comput. Electron. Agric. 2024, 223, 109090.
44. Deng, L.; Luo, S.; He, C.; Xiao, H.; Wu, H. Underwater Small and Occlusion Object Detection with Feature Fusion and Global Context Decoupling Head-Based YOLO. Multimed. Syst. 2024, 30, 4.
45. Soviany, P.; Ionescu, R.T. Frustratingly Easy Trade-Off Optimization between Single-Stage and Two-Stage Deep Object Detectors. In Computer Vision—ECCV 2018 Workshops; Springer: Cham, Switzerland, 2018; pp. 366–378.
46. Jocher, G. Ultralytics/Yolov5. Available online: https://github.com/ultralytics/yolov5 (accessed on 12 September 2025).
47. Miao, Z.; Yu, X.; Li, N.; Zhang, Z.; He, C.; Li, Z.; Deng, C.; Sun, T. Efficient Tomato Harvesting Robot Based on Image Processing and Deep Learning. Precis. Agric. 2023, 24, 254–287.
48. Wang, W.; Shi, Y.; Liu, W.; Che, Z. An Unstructured Orchard Grape Detection Method Utilizing YOLOv5s. Agriculture 2024, 14, 262.
49. Zhao, J.; Bao, W.; Mo, L.; Li, Z.; Liu, Y.; Du, J. Design of Tomato Picking Robot Detection and Localization System Based on Deep Learning Neural Networks Algorithm of YOLOv5. Sci. Rep. 2025, 15, 6180.
50. Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. arXiv 2022, arXiv:2207.02696.
51. Du, X.; Meng, Z.; Sapkota, R.; Ma, Z.; Cheng, H. DSW-YOLO: A Detection Method for Ground-Planted Strawberry Fruits under Different Occlusion Levels. Comput. Electron. Agric. 2023, 214, 108304.
52. Jocher, G.; Chaurasia, A.; Qiu, J. YOLOv8 by Ultralytics. 2023. Available online: https://github.com/ultralytics/ultralytics (accessed on 12 September 2025).
53. Wang, A.; Qian, W.; Li, A.; Xu, Y.; Hu, J.; Xie, Y.; Zhang, L. NVW-YOLOv8s: An Improved YOLOv8s Network for Real-Time Detection and Segmentation of Tomato Fruits at Different Ripeness Stages. Comput. Electron. Agric. 2024, 219, 108833.
54. Tian, Z.; Hao, H.; Dai, G.; Li, Y. Optimizing Tomato Detection and Counting in Smart Greenhouses: A Lightweight YOLOv8 Model Incorporating High- and Low-Frequency Feature Transformer Structures. Netw. Comput. Neural Syst. 2024, 1–37.
55. Jocher, G.; Qiu, J. Ultralytics YOLO11. 2024. Available online: https://github.com/ultralytics/ultralytics (accessed on 12 September 2025).
56. Jin, S.; Zhou, L.; Zhou, H. CO-YOLO: A Lightweight and Efficient Model for Camellia oleifera Fruit Object Detection and Posture Determination. Comput. Electron. Agric. 2025, 235, 110394.
57. Hou, C.; Zhang, X.; Tang, Y.; Zhuang, J.; Tan, Z.; Huang, H.; Chen, W.; Wei, S.; He, Y.; Luo, S. Detection and Localization of Citrus Fruit Based on Improved You Only Look Once v5s and Binocular Vision in the Orchard. Front. Plant Sci. 2022, 13, 937553.
58. Wong, K.Y. Yolov7/seg at u7. 2022. Available online: https://github.com/WongKinYiu/yolov7/tree/u7/seg (accessed on 12 June 2025).
59. Meng, Z.; Du, X.; Sapkota, R.; Ma, Z.; Cheng, H. YOLOv10-Pose and YOLOv9-Pose: Real-Time Strawberry Stalk Pose Detection Models. Comput. Ind. 2025, 165, 104231.
60. TechPowerUp. NVIDIA GeForce RTX 3060 12 GB Specs. Available online: https://www.techpowerup.com/gpu-specs/geforce-rtx-3060-12-gb.c3682 (accessed on 12 June 2025).
Technique | Low Level | Medium Level | High Level |
---|---|---|---|
Random brightness contrast (brightness, contrast, p) | 0.1, 0.1, 1.0 | 0.3, 0.3, 1.0 | 0.5, 0.5, 1.0 |
Gauss noise (var_limit, p) | (10.0, 20.0), 1.0 | (30.0, 50.0), 1.0 | (60.0, 90.0), 1.0 |
Colour jitter (H, S, p) | 0.05, 0.05, 1.0 | 0.1, 0.1, 1.0 | 0.2, 0.2, 1.0 |
Rotate (limit, p) | No | No | 10, 0.5 |
Irregular stains (count, size, alpha) | (1–3), (20–50), 0.75 | (3–6), (40–80), 0.75 | (5–10), (60–120), 0.75 |
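The severity presets above map directly onto parameterised corruption transforms. As a minimal sketch of the Gauss-noise row only (assuming Albumentations-style semantics, where `var_limit` is a range from which the noise variance is sampled and `p` is the application probability; the irregular-stains transform is custom and not shown):

```python
import numpy as np

# Severity presets mirroring the table's Gauss-noise row (assumed semantics):
GAUSS_LEVELS = {
    "low":    {"var_limit": (10.0, 20.0), "p": 1.0},
    "medium": {"var_limit": (30.0, 50.0), "p": 1.0},
    "high":   {"var_limit": (60.0, 90.0), "p": 1.0},
}

def add_gauss_noise(image: np.ndarray, level: str, rng: np.random.Generator) -> np.ndarray:
    """Add zero-mean Gaussian noise with a variance sampled from the level's range."""
    cfg = GAUSS_LEVELS[level]
    if rng.random() > cfg["p"]:          # skip with probability 1 - p
        return image
    var = rng.uniform(*cfg["var_limit"])  # sample the noise variance
    noise = rng.normal(0.0, np.sqrt(var), size=image.shape)
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
img = np.full((64, 64, 3), 128, dtype=np.uint8)   # flat grey test image
noisy = add_gauss_noise(img, "high", rng)
```

At the "high" level the sampled standard deviation lies between roughly 7.7 and 9.5 grey levels, which is visibly strong but leaves fruit boundaries recoverable.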
| Ref. | Model | P (BB) | R (BB) | mAP@50 (BB) | mAP@50:95 (BB) | P (M) | R (M) | mAP@50 (M) | mAP@50:95 (M) |
|---|---|---|---|---|---|---|---|---|---|
| [46] | Base-YOLOv5 | 0.815 | 0.847 | 0.916 | 0.739 | 0.815 | 0.844 | 0.913 | 0.653 |
| [48] | GDN | 0.837 | 0.836 | 0.915 | 0.738 | 0.838 | 0.837 | 0.914 | 0.642 |
| [49] | Improved-YOLOv5 | 0.839 | 0.849 | 0.92 | 0.742 | 0.843 | 0.842 | 0.918 | 0.649 |
| Ref. | Model | P (BB) | R (BB) | mAP@50 (BB) | mAP@50:95 (BB) | P (M) | R (M) | mAP@50 (M) | mAP@50:95 (M) |
|---|---|---|---|---|---|---|---|---|---|
| [50] | Base-YOLOv7 | 0.856 | 0.82 | 0.912 | 0.718 | 0.855 | 0.819 | 0.908 | 0.623 |
| [51] | DSW-YOLO | 0.845 | 0.834 | 0.911 | 0.717 | 0.849 | 0.828 | 0.909 | 0.631 |
| Ref. | Model | P (BB) | R (BB) | mAP@50 (BB) | mAP@50:95 (BB) | P (M) | R (M) | mAP@50 (M) | mAP@50:95 (M) |
|---|---|---|---|---|---|---|---|---|---|
| [52] | Base-YOLOv8 | 0.825 | 0.838 | 0.916 | 0.738 | 0.835 | 0.825 | 0.915 | 0.649 |
| [53] | NVW-YOLOv8 | 0.842 | 0.806 | 0.899 | 0.696 | 0.842 | 0.809 | 0.892 | 0.638 |
| [54] | TCAttn-YOLOv8 | 0.816 | 0.824 | 0.898 | 0.688 | 0.816 | 0.824 | 0.896 | 0.61 |
| Ref. | Model | P (BB) | R (BB) | mAP@50 (BB) | mAP@50:95 (BB) | P (M) | R (M) | mAP@50 (M) | mAP@50:95 (M) |
|---|---|---|---|---|---|---|---|---|---|
| [55] | Base-YOLO11 | 0.795 | 0.855 | 0.904 | 0.71 | 0.793 | 0.853 | 0.901 | 0.627 |
| [56] | CO-YOLO | 0.835 | 0.839 | 0.91 | 0.721 | 0.834 | 0.838 | 0.907 | 0.637 |
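Throughout these tables, mAP@50:95 follows the COCO convention: AP is averaged over ten IoU thresholds from 0.50 to 0.95 in steps of 0.05. A minimal sketch (the per-threshold AP values below are hypothetical, for illustration only):

```python
def map_50_95(ap_per_threshold):
    """COCO-style mAP@50:95: mean of AP over IoU thresholds 0.50, 0.55, ..., 0.95."""
    assert len(ap_per_threshold) == 10  # one AP per IoU threshold
    return sum(ap_per_threshold) / 10.0

# Hypothetical AP values at IoU = 0.50, 0.55, ..., 0.95 (typically decreasing,
# since stricter overlap requirements reject more detections):
aps = [0.92, 0.91, 0.90, 0.88, 0.85, 0.80, 0.72, 0.60, 0.42, 0.18]
value = map_50_95(aps)
```

Because the high-IoU terms drag the mean down, mAP@50:95 is always well below mAP@50, which matches the gap between the two columns in every table above.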
Rank | Model | Mean Drop Ratio | Robustness Score (0–10) |
---|---|---|---|
1 | YOLO11-based | 0.359 | 10.00 |
2 | CO-YOLO | 0.375 | 9.45 |
3 | GDN | 0.536 | 2.67 |
4 | Improved YOLOv5 | 0.539 | 2.56 |
5 | YOLOv5-base | 0.542 | 2.43 |
6 | YOLOv7-base | 0.563 | 1.70 |
7 | TCAttn-YOLOv8 | 0.587 | 0.86 |
8 | DSW-YOLO | 0.592 | 0.67 |
9 | YOLOv8-base | 0.607 | 0.16 |
10 | NVW-YOLOv8s | 0.612 | 0.00 |
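The exact scoring formula behind the last column is not reproduced here. One plausible reading (an assumption, not the paper's stated method) is a min-max normalisation of the mean drop ratio onto the score scale, which pins the best model at 10, the worst at 0, and preserves the ranking:

```python
def robustness_scores(drop_ratios: dict[str, float]) -> dict[str, float]:
    """Min-max normalise mean drop ratios onto [0, 10]: the smallest drop
    (most robust model) maps to 10.0, the largest to 0.0. Assumed formula."""
    lo, hi = min(drop_ratios.values()), max(drop_ratios.values())
    return {m: 10.0 * (hi - d) / (hi - lo) for m, d in drop_ratios.items()}

# Mean drop ratios copied from the table (best, runner-up, and worst):
drops = {"YOLO11-based": 0.359, "CO-YOLO": 0.375, "NVW-YOLOv8s": 0.612}
scores = robustness_scores(drops)
```

With these entries the endpoints match the table exactly, while intermediate scores land close to but not exactly on the reported values, so the paper's formula likely differs in detail.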
Device | FP32 GFLOPs | Compatible Models (Based on GFLOPs) |
---|---|---|
Raspberry Pi 5 | ~30 | Base-YOLO11 and CO-YOLO |
Intel HD Graphics 12EU Mobile | ~96 | Base-YOLOv5, GDN, Base-YOLOv8, and NVW-YOLOv8 |
Jetson Nano | ~236 | DSW-YOLO and Base-YOLOv7 |
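Screening like this can be automated as a threshold check of per-inference model GFLOPs against a fraction of the device's FP32 budget. The 50% headroom factor below is an assumption that reproduces the Raspberry Pi row; the paper's exact compatibility criterion is not shown here:

```python
# FP32 throughput per device (GFLOPs) and per-inference model cost (GFLOPs),
# both copied from the tables in this section.
DEVICE_GFLOPS = {"Raspberry Pi 5": 30.0,
                 "Intel HD Graphics 12EU Mobile": 96.0,
                 "Jetson Nano": 236.0}
MODEL_GFLOPS = {"Base-YOLOv5": 37.8, "GDN": 65.2, "Base-YOLOv7": 142.6,
                "DSW-YOLO": 137.3, "Base-YOLO11": 10.2, "CO-YOLO": 12.0}

def compatible(device: str, headroom: float = 0.5) -> list[str]:
    """Models whose per-inference GFLOPs fit within `headroom` x the device budget."""
    budget = DEVICE_GFLOPS[device] * headroom
    return sorted(m for m, g in MODEL_GFLOPS.items() if g <= budget)
```

For example, `compatible("Raspberry Pi 5")` returns only the two lightest models, Base-YOLO11 and CO-YOLO, matching the first table row.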
Model | GFLOPs | Parameter Size (M) | Total Time (ms) | Energy Consumption (J) |
---|---|---|---|---|
Base-YOLOv5 | 37.8 | 9.76 | 7.8 | 1.17 |
GDN | 65.2 | 15.98 | 12 | 1.8 |
Improved-YOLOv5 | 41.5 | 9.93 | 7.4 | 1.11 |
Base-YOLOv7 | 142.6 | 37.86 | 23.3 | 3.5 |
DSW-YOLO | 137.3 | 34.51 | 20.8 | 3.12 |
Base-YOLOv8 | 42.5 | 11.77 | 8.3 | 1.25 |
NVW-YOLOv8 | 42.4 | 11.70 | 8.4 | 1.26 |
TCAttn-YOLOv8 | 16 | 2.96 | 5.1 | 0.76 |
Base-YOLO11 | 10.2 | 2.83 | 5.5 | 0.82 |
CO-YOLO | 12 | 2.7 | 5.9 | 0.89 |
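Every row of the table satisfies energy ≈ power × time for a constant draw of about 150 W (plausible for an RTX 3060 under load, whose board power is rated at 170 W). That constant-power model appears to be how the energy column was derived, though this is an inference from the numbers rather than a stated method. A quick consistency check:

```python
ASSUMED_POWER_W = 150.0  # inferred: every table row gives energy / time close to 150 W

def predicted_energy_j(total_time_ms: float, power_w: float = ASSUMED_POWER_W) -> float:
    """Energy (J) = power (W) x time (s)."""
    return power_w * total_time_ms / 1000.0

# (total_time_ms, energy_J) pairs copied from the table:
rows = {"Base-YOLOv5": (7.8, 1.17), "GDN": (12.0, 1.80), "Base-YOLOv7": (23.3, 3.50),
        "TCAttn-YOLOv8": (5.1, 0.76), "Base-YOLO11": (5.5, 0.82), "CO-YOLO": (5.9, 0.89)}
worst = max(abs(predicted_energy_j(t) - e) for t, e in rows.values())
```

The largest discrepancy across these rows is 0.005 J, i.e. within the table's two-decimal rounding.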
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Beldek, C.; Sariyildiz, E.; Alici, G. A Deployment-Oriented Benchmarking of You Look Only Once (YOLO) Models for Orange Detection and Segmentation in Agricultural Robotics. Agriculture 2025, 15, 2170. https://doi.org/10.3390/agriculture15202170