A Review of Integrated Approaches in Robotic Raspberry Harvesting
Abstract
1. Introduction
2. Materials and Methods
- (i) Machine vision methods for detection, localisation, and ripeness assessment;
- (ii) Design principles of end-effectors.
3. Results
3.1. Current State of Knowledge in Robotic Raspberry Harvesting
3.1.1. Integrated Robotic Systems Designed for Raspberry Harvesting
Lab2Field Approach to Robotic Raspberry Harvesting
Fieldwork Approach to Robotic Raspberry Harvesting
3.1.2. Research Conducted for Visual Detection and Assessment of Raspberry Ripeness
3.2. Navigation and Detection Analysis
3.2.1. Evaluation Metrics
3.2.2. The References Used
3.2.3. Comparison of the Analysed Studies
Deep Learning-Based Methods—CNN-Based Classification
Deep Learning-Based Methods—Image Segmentation
Deep Learning-Based Methods—Image Detection and Segmentation
Deep Learning-Based Methods—Object Detection and Classification Using YOLO Architecture
- YOLOv3-Based Methods
- YOLOv4-Based Methods
- YOLOv5-Based Methods
- YOLOv7-Based Methods
- YOLOv8-Based Methods
- YOLOv11-Based Methods
- Suitability for Raspberries
3.3. Analysis of Solutions for Gripping Fruits and Sensor Technology
3.3.1. The References Used
3.3.2. Classification and Principles of Grasping
- Fin-Ray: fingers with an internal ribbed structure that deform on contact and passively envelop the shape of the fruit.
- Tendon-driven: a design inspired by the human hand, in which flexible rods bend like tendons, enabling sensitive and controlled force distribution.
- Enveloping: mechanisms that completely surround the fruit with soft material and release it through rotation or contraction.
- Hybrid end-effectors: devices that combine soft-grip principles with additional functions such as integrated cutting devices, stiffness-changing mechanisms, or modular, interchangeable end pieces.
- Fin-Ray end-effectors
- Tendon-driven end-effectors
- Enveloping end-effectors
- Hybrid end-effectors
3.3.3. Gripper Materials, Construction, and Variable Stiffness
3.3.4. Sensors and Grip Control
3.3.5. Comparison of Solutions for Gripping Fruits and Sensor Technology
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| 2D | Two-dimensional |
| 3D | Three-dimensional |
| ADown | Attention-based Downsampling |
| AP | Average Precision |
| ASPP | Atrous Spatial Pyramid Pooling |
| BSD | Berkeley Segmentation Dataset |
| C2f | Cross Concat Fusion |
| C2f-OREPA | Cross Concat Fusion with Online Re-parameterization |
| C2PSA-SE | Cross Concat Partial Spatial Attention with Squeeze-and-Excitation |
| C3k2-BFAM | Cross-Stage Partial module and Bidirectional Feature Alignment Module |
| C3x | Cross-Stage Partial module variant |
| CAA | Coordinate Attention Algorithm |
| CBAM | Convolutional Block Attention Module |
| CNN | Convolutional Neural Networks |
| CSP | Cross-Stage Partial |
| CSPPC | Cross-Stage Partial Spatial Pyramid Pooling |
| DAM | Dense Attention Module |
| DWR | Dilation-Wise Residual |
| DySample | Dynamic Sampling |
| ECA | Efficient Channel Attention |
| ECA-SimAM | Efficient Channel Attention and Simple Attention Module |
| EcoFlex | Silicone Elastomer Material |
| EIoU_Loss | Extended Intersection over Union Loss |
| EMA | Efficient Multi-Scale Attention |
| EPFL | École Polytechnique Fédérale de Lausanne |
| F1 | F1 score |
| FAOSTAT | Food and Agriculture Organization Corporate Statistical Database |
| Fin-Ray | Fin-Ray effect gripper |
| FN | False Negative |
| FP | False Positive |
| FPN | Feature Pyramid Network |
| fps | Frames Per Second |
| HCSA | Hybrid Channel-Spatial Attention |
| HSA | Hybrid Soft Attention |
| HSV | Colour space (Hue, Saturation, Value) |
| CHT | Circular Hough Transform |
| k | number of object classes |
| LDA | Linear Discriminant Analysis |
| LfD | Learning from Demonstration |
| LUTs | Look-Up Tables |
| mAP | mean Average Precision |
| ML | Machine Learning |
| MobileNetv3 | Mobile Neural Network Version 3 |
| MSSENet | Multi-Scale Squeeze-and-Excitation Network |
| n | number of samples |
| OAK-D | OpenCV AI Kit with Depth camera |
| PCA | Principal Component Analysis |
| PDC | Parallel Dilated Convolution |
| PDMS | Polydimethylsiloxane |
| P-Head | Prediction Head |
| Pinter | interpolated precision |
| PT | Physical twin |
| QDA | Quadratic Discriminant Analysis |
| r | recall value |
| R-CNN | Region-based Convolutional Neural Network |
| RDR | Rate of Damage Ratio |
| Res-Net | Residual Network |
| RGB | Colour space (Red, Green, Blue) |
| RT-DETR | Real-Time DEtection TRansformer |
| SAM | Spatial Attention Module |
| SCNet50 | Self-Calibrated Network (50 layers) |
| SVM | Support Vector Machines |
| ToF | Time of Flight |
| TP | True Positive |
| TPE | Thermoplastic Elastomer |
| TPU | Thermoplastic Polyurethane |
| VGG16 | Visual Geometry Group 16-layer Network |
| ViTs | Vision Transformers |
| YCbCr | Luminance–Chrominance Colour Space |
| YOLO | You Only Look Once (real-time deep learning algorithm for object detection) |
| ZED2 | StereoLabs ZED 2 Depth Camera |
References
- FAO. Food and Agriculture Organization of the United Nations. Available online: https://www.fao.org/home/en (accessed on 11 October 2025).
- Ponder, A.; Hallmann, E. Phenolics and Carotenoid Contents in the Leaves of Different Organic and Conventional Raspberry (Rubus idaeus l.) Cultivars and Their in Vitro Activity. Antioxidants 2019, 8, 458. [Google Scholar] [CrossRef]
- Popa, R.G.; Șchiopu, E.C.; Pătrașcu, A.; Bălăcescu, A.; Toader, F.A. Raspberry Production Opportunity to Develop an Agricultural Business in the Context of the Circular Economy: Case Study in South-West Romania. Agriculture 2024, 14, 1822. [Google Scholar] [CrossRef]
- Apáti, F. Farm Economic Evaluation of Raspberry Production. Int. J. Hortic. Sci. 2014, 20, 53–56. [Google Scholar] [CrossRef]
- Junge, K.; Pires, C.; Hughes, J. Lab2Field Transfer of a Robotic Raspberry Harvester Enabled by a Soft Sensorized Physical Twin. Commun. Eng. 2023, 2, 40. [Google Scholar] [CrossRef]
- Ramsay, A.M. Mechanical Harvesting of Raspberries—A Review with Particular Reference to Engineering Development in Scotland. J. Agric. Eng. Res. 1983, 28, 183–206. [Google Scholar] [CrossRef]
- Rabcewicz, J.; Białkowski, P.; Konopacki, P. Evaluation of the Possibility of Shaking off Raspberry Fruits with a Pulsating Air Stream. J. Hortic. Res. 2017, 25, 61–66. [Google Scholar] [CrossRef]
- Smith, E.A.; Ramsay, A.M. Forces during Fruit Removal by a Mechanical Raspberry Harvester. J. Agric. Eng. Res. 1983, 28, 21–32. [Google Scholar] [CrossRef]
- Fieldwork Robotics. Fieldwork Robotics—Soft, Selective & Autonomous Harvesting Robots. Available online: https://fieldworkrobotics.com/ (accessed on 12 October 2025).
- Kollewe, J. World’s First Raspberry Picking Robot Cracks the Toughest Nut: Soft Fruit. Available online: https://www.theguardian.com/business/2022/jun/01/uk-raspberry-picking-robot-soft-fruit (accessed on 12 October 2025).
- Kollewe, J. Improved Version of ‘Robocrop’ Only Picks Ripe Raspberries. Available online: https://www.theguardian.com/technology/article/2024/aug/26/improved-version-robocrop-only-picks-ripe-raspberries (accessed on 12 October 2025).
- Sauerwald, T.; Pulle, C.F.; Bodkin, T.; Whitear, D. An End-Effector. US20240284829A1, 29 August 2024. [Google Scholar]
- Strautiņa, S.; Kalniņa, I.; Kaufmane, E.; Sudars, K.; Namatēvs, I.; Nikulins, A.; Edelmers, E. RaspberrySet: Dataset of Annotated Raspberry Images for Object Detection. Data 2023, 8, 86. [Google Scholar] [CrossRef]
- Jafary, P.; Bazangeya, A.; Pham, M.; Campbell, L.G.; Saeedi, S.; Zareinia, K.; Bougherara, H. Raspberry PhenoSet: A Phenology-Based Dataset for Automated Growth Detection and Yield Estimation. arXiv 2024. [Google Scholar] [CrossRef]
- Ling, C.; Zhang, Q.; Zhang, M.; Gao, C. Research on Adaptive Object Detection via Improved HSA-YOLOv5 for Raspberry Maturity Detection. IET Image Process. 2024, 18, 4898–4912. [Google Scholar] [CrossRef]
- Luo, R.; Ding, X.; Wang, J. Red Raspberry Maturity Detection Based on Multi-Module Optimized YOLOv11n and Its Application in Field and Greenhouse Environments. Agriculture 2025, 15, 881. [Google Scholar] [CrossRef]
- Zhang, X.; Zhang, N.; Xu, X.; Wang, H.; Cao, J. Optimal Cutting Point Determination for Robotic Raspberry Harvesting Based on Computer Vision Strategy. Multimed. Tools Appl. 2025, 84, 41257–41276. [Google Scholar] [CrossRef]
- Wang, C.; Pan, W.; Zou, T.; Li, C.; Han, Q.; Wang, H.; Yang, J.; Zou, X. A Review of Perception Technologies for Berry Fruit-Picking Robots: Advantages, Disadvantages, Challenges, and Prospects. Agriculture 2024, 14, 1346. [Google Scholar] [CrossRef]
- Li, L.; He, Z.; Li, K.; Ding, X.; Li, H.; Gong, W.; Cui, Y. Object Detection and Spatial Positioning of Kiwifruits in a Wide-Field Complex Environment. Comput. Electron. Agric. 2024, 223, 109102. [Google Scholar] [CrossRef]
- Gené-Mola, J.; Gregorio, E.; Guevara, J.; Auat, F.; Sanz-Cortiella, R.; Escolà, A.; Llorens, J.; Morros, J.R.; Ruiz-Hidalgo, J.; Vilaplana, V.; et al. Fruit Detection in an Apple Orchard Using a Mobile Terrestrial Laser Scanner. Biosyst. Eng. 2019, 187, 171–184. [Google Scholar] [CrossRef]
- Neupane, C.; Koirala, A.; Wang, Z.; Walsh, K.B. Evaluation of Depth Cameras for Use in Fruit Localization and Sizing: Finding a Successor to Kinect V2. Agronomy 2021, 11, 1780. [Google Scholar] [CrossRef]
- Gaikwad, S.; Tidke, S. Multi-Spectral Imaging for Fruits and Vegetables. Int. J. Adv. Comput. Sci. Appl. 2022, 13, 743–760. [Google Scholar] [CrossRef]
- Li, H.; Gu, Z.; He, D.; Wang, X.; Huang, J.; Mo, Y.; Li, P.; Huang, Z.; Wu, F. A Lightweight Improved YOLOv5s Model and Its Deployment for Detecting Pitaya Fruits in Daytime and Nighttime Light-Supplement Environments. Comput. Electron. Agric. 2024, 220, 108914. [Google Scholar] [CrossRef]
- Kienzle, S.; Sruamsiri, P.; Carle, R.; Sirisakulwat, S.; Spreer, W.; Neidhart, S. Harvest Maturity Detection for ‘Nam Dokmai #4’ Mango Fruit (Mangifera indica L.) in Consideration of Long Supply Chains. Postharvest Biol. Technol. 2012, 72, 64–75. [Google Scholar] [CrossRef]
- Mohammadi, V.; Kheiralipour, K.; Ghasemi-Varnamkhasti, M. Detecting Maturity of Persimmon Fruit Based on Image Processing Technique. Sci. Hortic. 2015, 184, 123–128. [Google Scholar] [CrossRef]
- Zhao, J.; Chen, J. Detecting Maturity in Fresh Lycium barbarum L. Fruit Using Color Information. Horticulturae 2021, 7, 108. [Google Scholar] [CrossRef]
- Talekar, B. A Detailed Review on Decision Tree and Random Forest. Biosci. Biotechnol. Res. Commun. 2020, 13, 245–248. [Google Scholar] [CrossRef]
- Olisah, C.C.; Trewhella, B.; Li, B.; Smith, M.L.; Winstone, B.; Whitfield, E.C.; Fernández, F.F.; Duncalfe, H. Convolutional Neural Network Ensemble Learning for Hyperspectral Imaging-Based Blackberry Fruit Ripeness Detection in Uncontrolled Farm Environment. Eng. Appl. Artif. Intell. 2024, 132, 107945. [Google Scholar] [CrossRef]
- Fu, L.; Feng, Y.; Majeed, Y.; Zhang, X.; Zhang, J.; Karkee, M.; Zhang, Q. Kiwifruit Detection in Field Images Using Faster R-CNN with ZFNet. IFAC-PapersOnLine 2018, 51, 45–50. [Google Scholar] [CrossRef]
- Xiao, B.; Nguyen, M.; Yan, W.Q. Fruit Ripeness Identification Using Transformers. Appl. Intell. 2023, 53, 22488–22499. [Google Scholar] [CrossRef]
- Hu, H.M.; Kaizu, Y.; Zhang, H.D.; Xu, Y.W.; Imou, K.; Li, M.; Huang, J.J.; Dai, S. Recognition and Localization of Strawberries from 3D Binocular Cameras for a Strawberry Picking Robot Using Coupled YOLO/Mask R-CNN. Int. J. Agric. Biol. Eng. 2022, 15, 175–179. [Google Scholar] [CrossRef]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
- Xu, D.; Zhao, H.; Lawal, O.M.; Lu, X.; Ren, R.; Zhang, S. An Automatic Jujube Fruit Detection and Ripeness Inspection Method in the Natural Environment. Agronomy 2023, 13, 451. [Google Scholar] [CrossRef]
- Tituaña, L.; Gholami, A.; He, Z.; Xu, Y.; Karkee, M.; Ehsani, R. A Small Autonomous Field Robot for Strawberry Harvesting. Smart Agric. Technol. 2024, 8, 100454. [Google Scholar] [CrossRef]
- Terven, J.; Córdova-Esparza, D.M.; Romero-González, J.A. A Comprehensive Review of YOLO Architectures in Computer Vision: From YOLOv1 to YOLOv8 and YOLO-NAS. Mach. Learn. Knowl. Extr. 2023, 5, 1680–1716. [Google Scholar] [CrossRef]
- Ni, X.; Li, C.; Jiang, H.; Takeda, F. Deep Learning Image Segmentation and Extraction of Blueberry Fruit Traits Associated with Harvestability and Yield. Hortic. Res. 2020, 7, 110. [Google Scholar] [CrossRef]
- Yang, W.; Ma, X.; Hu, W.; Tang, P. Lightweight Blueberry Fruit Recognition Based on Multi-Scale and Attention Fusion NCBAM. Agronomy 2022, 12, 2354. [Google Scholar] [CrossRef]
- Haydar, Z.; Esau, T.J.; Farooque, A.A.; Zaman, Q.U.; Hennessy, P.J.; Singh, K.; Abbas, F. Deep Learning Supported Machine Vision System to Precisely Automate the Wild Blueberry Harvester Header. Sci. Rep. 2023, 13, 10198. [Google Scholar] [CrossRef] [PubMed]
- Xiao, F.; Wang, H.; Xu, Y.; Shi, Z.; Kujawa, S.; Wojciechowski, T.; Piekutowska, M.; et al. A Lightweight Detection Method for Blueberry Fruit Maturity Based on an Improved YOLOv5 Algorithm. Agriculture 2023, 14, 36. [Google Scholar] [CrossRef]
- Gai, R.; Liu, Y.; Xu, G. TL-YOLOv8: A Blueberry Fruit Detection Algorithm Based on Improved YOLOv8 and Transfer Learning. IEEE Access 2024, 12, 86378–86390. [Google Scholar] [CrossRef]
- Zhang, J.; Maleski, J.; Ashrafi, H.; Spencer, J.A.; Chu, Y. Open-Source High-Throughput Phenotyping for Blueberry Yield and Maturity Prediction Across Environments: Neural Network Model and Labeled Dataset for Breeders. Horticulturae 2024, 10, 1332. [Google Scholar] [CrossRef]
- Liu, Y.; Zheng, H.; Zhang, Y.; Zhang, Q.; Chen, H.; Xu, X.; Wang, G. “Is This Blueberry Ripe?”: A Blueberry Ripeness Detection Algorithm for Use on Picking Robots. Front. Plant Sci. 2023, 14, 1198650. [Google Scholar] [CrossRef] [PubMed]
- Li, Z.; Xu, R.; Li, C.; Munoz, P.; Takeda, F.; Leme, B. In-Field Blueberry Fruit Phenotyping with a MARS-PhenoBot and Customized BerryNet. Comput. Electron. Agric. 2025, 232, 110057. [Google Scholar] [CrossRef]
- Zhang, R.; Dong, W.; Hou, P.; Li, H.; Han, X.; Chen, Q.; Li, F.; Zhang, X. YOLOv11-BSD: Blueberry Maturity Detection under Simulated Nighttime Conditions Evaluated with Causal Analysis. Smart Agric. Technol. 2025, 12, 101314. [Google Scholar] [CrossRef]
- Pérez-Borrero, I.; Marín-Santos, D.; Gegúndez-Arias, M.E.; Cortés-Ancos, E. A Fast and Accurate Deep Learning Method for Strawberry Instance Segmentation. Comput. Electron. Agric. 2020, 178, 105736. [Google Scholar] [CrossRef]
- Yu, Y.; Zhang, K.; Liu, H.; Yang, L.; Zhang, D. Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot. IEEE Access 2020, 8, 116556–116568. [Google Scholar] [CrossRef]
- Ilyas, T.; Umraiz, M.; Khan, A.; Kim, H. DAM: Hierarchical Adaptive Feature Selection Using Convolution Encoder Decoder Network for Strawberry Segmentation. Front. Plant Sci. 2021, 12, 591333. [Google Scholar] [CrossRef]
- An, Q.; Wang, K.; Li, Z.; Song, C.; Tang, X.; Song, J. Real-Time Monitoring Method of Strawberry Fruit Growth State Based on YOLO Improved Model. IEEE Access 2022, 10, 124363–124372. [Google Scholar] [CrossRef]
- Fan, Y.; Zhang, S.; Feng, K.; Qian, K.; Wang, Y.; Qin, S. Strawberry Maturity Recognition Algorithm Combining Dark Channel Enhancement and YOLOv5. Sensors 2022, 22, 419. [Google Scholar] [CrossRef]
- Lemsalu, M.; Bloch, V.; Backman, J.; Pastell, M. Real-Time CNN-Based Computer Vision System for Open-Field Strawberry Harvesting Robot. IFAC-PapersOnLine 2022, 55, 24–29. [Google Scholar] [CrossRef]
- He, Z.; Karkee, M.; Zhang, Q. Detecting and Localizing Strawberry Centers for Robotic Harvesting in Field Environment. IFAC-PapersOnLine 2022, 55, 30–35. [Google Scholar] [CrossRef]
- Cai, C.; Tan, J.; Zhang, P.; Ye, Y.; Zhang, J. Determining Strawberries’ Varying Maturity Levels by Utilizing Image Segmentation Methods of Improved DeepLabV3+. Agronomy 2022, 12, 1875. [Google Scholar] [CrossRef]
- Tang, C.; Chen, D.; Wang, X.; Ni, X.; Liu, Y.; Liu, Y.; Mao, X.; Wang, S. A Fine Recognition Method of Strawberry Ripeness Combining Mask R-CNN and Region Segmentation. Front. Plant Sci. 2023, 14, 1211830. [Google Scholar] [CrossRef]
- Visentin, F.; Castellini, F.; Muradore, R. A Soft, Sensorized Gripper for Delicate Harvesting of Small Fruits. Comput. Electron. Agric. 2023, 213, 108202. [Google Scholar] [CrossRef]
- Ma, Z.; Dong, N.; Gu, J.; Cheng, H.; Meng, Z.; Du, X. STRAW-YOLO: A Detection Method for Strawberry Fruits Targets and Key Points. Comput. Electron. Agric. 2025, 230, 109853. [Google Scholar] [CrossRef]
- Lawal, O.M. Study on Strawberry Fruit Detection Using Lightweight Algorithm. Multimed. Tools Appl. 2024, 83, 8281–8293. [Google Scholar] [CrossRef]
- He, Z.; Karkee, M.; Zhang, Q. Enhanced Machine Vision System for Field-Based Detection of Pickable Strawberries: Integrating an Advanced Two-Step Deep Learning Model Merging Improved YOLOv8 and YOLOv5-Cls. Comput. Electron. Agric. 2025, 234, 110173. [Google Scholar] [CrossRef]
- Xie, H.; Zhang, D.; Yang, L.; Cui, T.; He, X.; Zhang, K.; Zhang, Z. Development, Integration, and Field Evaluation of a Dual-Arm Ridge Cultivation Strawberry Autonomous Harvesting Robot. J. Field Robot. 2025, 42, 1783–1798. [Google Scholar] [CrossRef]
- Zhang, X.; Thayananthan, T.; Usman, M.; Liu, W.; Chen, Y. Multi-Ripeness Level Blackberry Detection Using YOLOv7 for Soft Robotic Harvesting. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII; SPIE: Bellingham, WA, USA, 2023; p. 15. [Google Scholar] [CrossRef]
- Miraei Ashtiani, S.H.; Javanmardi, S.; Jahanbanifard, M.; Martynenko, A.; Verbeek, F.J. Detection of Mulberry Ripeness Stages Using Deep Learning Models. IEEE Access 2021, 9, 100380–100394. [Google Scholar] [CrossRef]
- Qiu, H.; Zhang, Q.; Li, J.; Rong, J.; Yang, Z. Lightweight Mulberry Fruit Detection Method Based on Improved YOLOv8n for Automated Harvesting. Agronomy 2024, 14, 2861. [Google Scholar] [CrossRef]
- Naranjo-Torres, J.; Mora, M.; Hernández-García, R.; Barrientos, R.J.; Fredes, C.; Valenzuela, A. A Review of Convolutional Neural Network Applied to Fruit Image Processing. Appl. Sci. 2020, 10, 3443. [Google Scholar] [CrossRef]
- Falih, B.S.; Gierz, Ł.; Al-Zaidi, G.A. Detecting Clustered Fruits Using a Hybrid of Convolutional Neural Networks and Machine Learning Classifiers—Case Study. Adv. Sci. Technol. Res. J. 2025, 19, 1–9. [Google Scholar] [CrossRef]
- Peng, Y.; Wang, A.; Liu, J.; Faheem, M. A Comparative Study of Semantic Segmentation Models for Identification of Grape with Different Varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
- Shi, X.; Wang, S.; Zhang, B.; Zhang, Z.; Wang, S.; Ding, X.; Wang, S.; Qi, P.; Yang, H. Advances in Berry Harvesting Robots. Horticulturae 2025, 11, 1042. [Google Scholar] [CrossRef]
- Zhang, D.; Zhang, W.; Yang, H.; Yang, H. Application of Soft Grippers in the Field of Agricultural Harvesting: A Review. Machines 2025, 13, 55. [Google Scholar] [CrossRef]
- Chauhan, A.; Brouwer, B.; Luo, L.; Nederhoff, L.; El Harchoui, N.; Shoushtari, A.L. Measuring the Response of Soft Fruits to Robotic Handling. Smart Agric. Technol. 2025, 12, 101445. [Google Scholar] [CrossRef]
- Hughes, J.; Culha, U.; Giardina, F.; Guenther, F.; Rosendo, A.; Iida, F. Soft Manipulators and Grippers: A Review. Front. Robot. AI 2016, 3, 69. [Google Scholar] [CrossRef]
- Navas, E.; Fernández, R.; Armada, M.; Gonzalez-de-Santos, P. Diaphragm-Type Pneumatic-Driven Soft Grippers for Precision Harvesting. Agronomy 2021, 11, 1727. [Google Scholar] [CrossRef]
- Elfferich, J.F.; Dodou, D.; Della Santina, C. Soft Robotic Grippers for Crop Handling or Harvesting: A Review. IEEE Access 2022, 10, 75428–75443. [Google Scholar] [CrossRef]
- Zhang, Y.; Wang, Z. Review of Robotic Grippers for High-Speed Handling of Fragile Foods. Adv. Robot. 2025, 39, 1054–1070. [Google Scholar] [CrossRef]
- Navas, E.; Fernández, R.; Sepúlveda, D.; Armada, M.; Gonzalez-De-santos, P. Soft Grippers for Automatic Crop Harvesting: A Review. Sensors 2021, 21, 2689. [Google Scholar] [CrossRef]
- He, Z.; Liu, Z.; Zhou, Z.; Karkee, M.; Zhang, Q. Improving Picking Efficiency under Occlusion: Design, Development, and Field Evaluation of an Innovative Robotic Strawberry Harvester. Comput. Electron. Agric. 2025, 237, 110684. [Google Scholar] [CrossRef]
- Chen, K.; Li, T.; Yan, T.; Xie, F.; Feng, Q.; Zhu, Q.; Zhao, C. A Soft Gripper Design for Apple Harvesting with Force Feedback and Fruit Slip Detection. Agriculture 2022, 12, 1802. [Google Scholar] [CrossRef]
- Gunderman, A.L.; Collins, J.; Myer, A.; Threlfall, R.; Chen, Y. Tendon-Driven Soft Robotic Gripper for Blackberry Harvesting. arXiv 2021. [Google Scholar] [CrossRef]
- Wang, X.; Kang, H.; Zhou, H.; Au, W.; Wang, M.Y.; Chen, C. Development and Evaluation of a Robust Soft Robotic Gripper for Apple Harvesting. Comput. Electron. Agric. 2023, 204, 107552. [Google Scholar] [CrossRef]
- Elfferich, J.F.; Shahabi, E.; Santina, C.D.; Dodou, D. BerryTwist: A Twisting-Tube Soft Robotic Gripper for Blackberry Harvesting. IEEE Robot. Autom. Lett. 2025, 10, 429–435. [Google Scholar] [CrossRef]
- Lin, J.; Hu, Q.; Xia, J.; Zhao, L.; Du, X.; Li, S.; Chen, Y.; Wang, X. Non-Destructive Fruit Firmness Evaluation Using a Soft Gripper and Vision-Based Tactile Sensing. Comput. Electron. Agric. 2023, 214, 108256. [Google Scholar] [CrossRef]
- Varghese, F.; Auat Cheein, F.; Koskinopoulou, M. Finite Element Optimization of a Flexible Fin-Ray-Based Soft Robotic Gripper for Scalable Fruit Harvesting and Manipulation. Smart Agric. Technol. 2025, 11, 100899. [Google Scholar] [CrossRef]
- Ait Ameur, M.A.; El-Sayed, A.M.; Yan, X.T.; Mehnen, J.; Maier, A.M. A Novel Opto-Tactile Sensing Approach to Enhance the Handling of Soft Fruit. Comput. Electron. Agric. 2025, 235, 110397. [Google Scholar] [CrossRef]
- Mawah, S.C.; Park, Y.-J. Tendon-Driven Variable-Stiffness Pneumatic Soft Gripper Robot. Robotics 2023, 12, 128. [Google Scholar] [CrossRef]
- Navas, E.; Shamshiri, R.R.; Dworak, V.; Weltzien, C.; Fernández, R. Soft Gripper for Small Fruits Harvesting and Pick and Place Operations. Front. Robot. AI 2023, 10, 1330496. [Google Scholar] [CrossRef]
- Navas, E.; Blanco, K.; Rodríguez-Nieto, D.; Fernández, R. A Modular Soft Gripper with Embedded Force Sensing and an Iris-Type Cutting Mechanism for Harvesting Medium-Sized Crops. Actuators 2025, 14, 432. [Google Scholar] [CrossRef]
- De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-PapersOnLine 2018, 51, 14–19. [Google Scholar] [CrossRef]
- Furia, F.; Pagliarani, N.; Junge, K.; Roels, E.; Terryn, S.; Vanderborght, B.; Brancart, J.; Hughes, J.; Cianchetti, M. Soft Pneumatic Gripper with Interchangeable Fingertips by Using Reversible Polymers: The GraspBerry, a Raspberry Picking Case Study. IEEE Robot. Autom. Mag. 2025, 2–10. [Google Scholar] [CrossRef]
- Blanco, K.; Navas, E.; Rodríguez-Nieto, D.; Emmi, L.; Fernández, R. Design and Experimental Assessment of 3D-Printed Soft Grasping Interfaces for Robotic Harvesting. Agronomy 2025, 15, 804. [Google Scholar] [CrossRef]
- Cao, M.; Sun, Y.; Zhang, J.; Ying, Z. A Novel Pneumatic Gripper Driven by Combination of Soft Fingers and Bellows Actuator for Flexible Grasping. Sens. Actuators A Phys. 2023, 355, 114335. [Google Scholar] [CrossRef]
- Li, H.; Xie, D.; Xie, Y. A Soft Pneumatic Gripper with Endoskeletons Resisting Out-of-Plane Bending. Actuators 2022, 11, 246. [Google Scholar] [CrossRef]
- Zaidi, S.; Maselli, M.; Laschi, C.; Cianchetti, M. Actuation Technologies for Soft Robot Grippers and Manipulators: A Review. Curr. Robot. Rep. 2021, 2, 355–369. [Google Scholar] [CrossRef]
- Xu, J.; Xu, B.; Zhan, H.; Xie, Z.; Tian, Z.; Lu, Y.; Wang, Z.; Yue, H.; Yang, F. A Soft Robotic System Imitating the Multimodal Sensory Mechanism of Human Fingers for Intelligent Grasping and Recognition. Nano Energy 2024, 130, 110120. [Google Scholar] [CrossRef]
- Li, S.; Sun, W.; Liang, Q.K.; Liu, C.P.; Liu, J. Assessing Fruit Hardness in Robot Hands Using Electric Gripper Actuators with Tactile Sensors. Sens. Actuators A Phys. 2024, 365, 114843. [Google Scholar] [CrossRef]
- Dimeas, F.; Sako, D.V.; Moulianitis, V.C.; Aspragathos, N.A. Design and Fuzzy Control of a Robotic Gripper for Efficient Strawberry Harvesting. Robotica 2015, 33, 1085–1098. [Google Scholar]
- Xiong, Y.; From, P.J.; Isler, V. Design and Evaluation of a Novel Cable-Driven Gripper with Perception Capabilities for Strawberry Picking Robots. Proc. IEEE Int. Conf. Robot. Autom. 2018, 7384–7391. [Google Scholar] [CrossRef]
- Yu, Y.; Xie, H.; Zhang, K.; Wang, Y.; Li, Y.; Zhou, J.; Xu, L. Design, Development, Integration, and Field Evaluation of a Ridge-Planting Strawberry Harvesting Robot. Agriculture 2024, 14, 2126. [Google Scholar] [CrossRef]
- Sobol, Z.; Kurpaska, S.; Nawara, P.; Pedryc, N.; Basista, G.; Tabor, J.; Hebda, T.; Tomasik, M. Prototype of a New Head Grabber for Robotic Strawberry Harvesting with a Vision System. Sensors 2024, 24, 6628. [Google Scholar] [CrossRef]
| Source | Model | mAP@0.5 [%] | Precision [%] | Recall [%] | F1-Score [%] | Accuracy [%] | fps |
|---|---|---|---|---|---|---|---|
| Ling et al. [15] | HSA-YOLOv5 (HSV Self-Adaption YOLOv5) | 97 | - | - | - | - | - |
| Luo et al. [16] | Improved YOLOv11n (HCSA + DWR + DySample) | 93.4 | 92.5–94.3 | - | 89 | - | - |
| Zhang et al. [17] | Improved YOLOv8n (Container + CAA, RGB-D) | 83.6 (day)/93.4 (night) | 81.0/88.4 | 80.7/90.7 | ≈85–89 | - | 153–167 |
| Source | Model | mAP@0.5 [%] | Precision [%] | Recall [%] | F1-Score [%] | Accuracy [%] | fps |
|---|---|---|---|---|---|---|---|
| Ni et al. [36] | Mask R-CNN (ResNet-101 + FPN) | 71.6 | - | - | - | 90.5 | - |
| Yang et al. [37] | YOLOv5 | 83.2 | 83.8 | 76.1 | 79.8 | - | - |
| Haydar et al. [38] | YOLOv4-tiny (DepthAI − OAK-D) | 86.5 | - | - | - | - | - |
| Xiao et al. [39] | Modified YOLOv5 (ShuffleNet + CBAM) | 91.5 | 96.3 | 92 | 94.12 | - | 67.1 |
| Gai et al. [40] | TL-YOLOv8 (Transfer Learning YOLOv8) | 94.1 | 84.6 | 91.3 | 87.8 | - | - |
| Zhang et al. [41] | YOLOv11m (High-Throughput Phenotyping Model) | - | 90.0 (mature)/81.0 (immature) | 91.0 (mature)/79.0 (immature) | 90.0/80.0 | - | - |
| Liu et al. [42] | BlueberryYOLO (YOLOv5x + MobileNetv3 + Little-CBAM + MSSENet + EIoU_Loss) | 78.3 | 79.3 | 75.9 | - | - | - |
| Li et al. [43] | BerryNet (YOLOv8 + SAM + CNN maturity classifier) | 78.7 (fruit segmentation)/52.8 (cluster detection) | 75.4 | 71.3 | 73.3 | - | - |
| Zhang et al. [44] | YOLOv11-BSD (nighttime detection, causal robustness) | 91.8 | 89 | 85.7 | 87.3 | - | 66.5 |
| Source | Model | mAP@0.5 [%] | Precision [%] | Recall [%] | F1-Score [%] | Accuracy [%] | fps |
|---|---|---|---|---|---|---|---|
| Pérez-Borrero et al. [45] | Improved Mask R-CNN | 43.85 | - | - | - | - | 10 |
| Yu et al. [46] | R-YOLO (MobileNet-V1) | 94.43 | 93.46 | 93.94 | - | - | 18 |
| Ilyas et al. [47] | Straw-Net (encoder–decoder with DAM and PDC) | 91.67 | 91.7 | 87.4 | 89.5 | 88.8 | 53 |
| An et al. [48] | Improved YOLOv5 | 94.26 | 93.15 | 90.76 | 91.91 | 93.15 | 30.5 |
| Fan et al. [49] | Lightweight YOLOv5 | >85 | >80 | >80 | 85–90 | 80–88 | - |
| Hu et al. [31] | YOLOv3 + Mask R-CNN (stereo 3D) | - | 93.9 | - | - | 93.4–94.5 | - |
| Lemsalu et al. [50] | YOLOv5 (TensorRT, edge device Jetson AGX Xavier) | 91.5 (ripe fruit)/43.6 (stalk) | 89 | 89.8 | 89.4 | - | 45 |
| He et al. [51] | YOLOv4 + YOLOv4-tiny (Dual-stage 3D localisation) | 80.68 (detection)/86.45 (localisation) | - | - | 80 | - | 55.2/4.18 |
| Cai et al. [52] | Improved DeepLabV3+ (ECA-SimAM + CBAM) | 83.05 | - | - | - | 90.9 | 7.67 |
| Tang et al. [53] | Mask R-CNN + Self-Calibrated Convolutions (SCNet50) + SVM | 97.9 | 98 | 84 | 83–98 | 86.6 | 18.2 |
| Visentin et al. [54] | YOLOv8 (detection + ripeness + force ML) | - | 98.2 (plant detection) | 92.4 (fruit detection) | - | 82 (successful picking) | - |
| Ma et al. [55] | STRAW-YOLO (YOLOv8-Pose + EMA + C2f-OREPA + DCN-C2f + Keypoints) | 96 | 91.6 | 91.7 | 91.6 | - | 62.6 |
| Lawal [56] | YOLOStrawberry (YOLOv5-light. Shuffle_Block + ResNet + SE) | 89.7 | ≈86 | ≈83 | ≈84 | - | ≈137 |
| He et al. [57] | Two-step YOLOv8 + YOLOv5-cls | 83.2 (field dataset YOLOv8) | - | - | - | 95.1 (YOLOv5-cls) | 119 |
| Xie et al. [58] | YOLOv5s (detection)/YOLOv5s-seg (segmentation) | 98.0/93.4 | 97.6/92.5 | 93.7/86.6 | 95.6/89.5 | - | - |
| Source | Model | mAP@0.5 [%] | Precision [%] | Recall [%] | F1-Score [%] | Accuracy [%] | fps |
|---|---|---|---|---|---|---|---|
| Zhang et al. [59] | YOLOv7-base | 91.4 | 89 | 90 | 86 | - | 46 |
| Olisah et al. [28] | VGG16 | 95.4 | 95.1 | 94.8 | 95.1 | - | - |
| Miraei Ashtiani et al. [60] | ResNet-18/AlexNet (CNN classification) | - | - | - | - | 98.0–98.6 | - |
| Qiu et al. [61] | Improved YOLOv8n (CSPPC + ADown + P-Head + KD) | 86.8 | 88.9 | 78.1 | 83.2 | - | 19.8 (Jetson Nano) |
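The figures in the tables above are the standard detection metrics built from true positives (TP), false positives (FP), and false negatives (FN), with mAP obtained by integrating the interpolated precision Pinter over recall r and averaging over the k object classes (see Abbreviations). The following Python sketch illustrates these formulas; the sample numbers are illustrative only and are not taken from any cited study:

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Precision, recall, and F1 score from raw detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def average_precision(recalls, precisions):
    """AP for one class: area under the interpolated precision-recall curve,
    where Pinter(r) is the maximum precision achieved at any recall >= r."""
    r = [0.0] + list(recalls) + [1.0]
    p = [0.0] + list(precisions) + [0.0]
    for i in range(len(p) - 2, -1, -1):   # enforce a non-increasing precision envelope
        p[i] = max(p[i], p[i + 1])
    return sum((r[i + 1] - r[i]) * p[i + 1] for i in range(len(r) - 1))

def mean_average_precision(ap_per_class):
    """mAP: AP averaged over all k object classes."""
    return sum(ap_per_class) / len(ap_per_class)

# Illustrative numbers (not from any study in this review):
p, r, f1 = precision_recall_f1(tp=90, fp=10, fn=10)    # each ≈ 0.9 (i.e., 90%)
ap_ripe = average_precision([0.5, 1.0], [1.0, 0.5])    # 0.75
map_05 = mean_average_precision([ap_ripe, 1.0])        # mean over two classes
```

Note that mAP@0.5 in the tables additionally requires an IoU threshold of 0.5 when deciding which detections count as TP; that matching step is omitted here for brevity.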
| Source | Crops | Gripper type/Picking Method | Picking Success Rate | Fruit Damage | Other Comparable Metrics |
|---|---|---|---|---|---|
| Design and fuzzy control of a robotic gripper for efficient strawberry harvesting [92] | Strawberries | Pressure sensor network + fuzzy force control; mechanical fruit removal (imitation of the human hand) | Not specified numerically; efficiency comparable to the human hand | Targeted minimization through force control; no quantification | Maximum permissible clamping force and tearing force measured; design and verification of fuzzy control |
| Design and Evaluation of a Novel Cable-Driven Gripper with Perception Capabilities for Strawberry Picking Robots [93] | Strawberries | Cable-driven ‘iris’ gripper with internal rotating blade, no contact with fruit (cuts the stem) | 96.77% (isolated strawberries) | Minimal; fruit is not touched by fingers, only the stem is cut | Picking time: 7.49 s (operation), 10.62 s total; tray of 7–12 fruits |
| Design, Development, Integration, and Field Evaluation of a Ridge-Planting Strawberry Harvesting Robot [94] | Strawberries (ridge-planting) | Non-destructive head: gripping the stem + quick cut (laser scanning on fingers) | 49.30% after sorting; 30.23% without sorting | Declared as non-destructive; quantification not specified | Speed: 7 s/fruit (1 arm), 4 s/fruit (2 arms) |
| Prototype of a New Head Grabber for Robotic Strawberry Harvesting with a Vision System [95] | Strawberries (gutters) | Jaw gripping of the stem + clamping + cutting; no contact with the fruit | 90% (harvesting efficiency of the robotic arm) | No mechanical damage in the laboratory; average length of stem remnant 14 mm | 95% accuracy in detecting ripe fruit; target time < 4 s/fruit |
| BerryTwist: A Twisting-Tube Soft Robotic Gripper for Blackberry Harvesting [77] | Blackberries | Soft textile ‘twisting-tube’ cuff; gripping + twisting + pulling | 82% (tearing), 95% (release from gripper) | RDR indicator evaluated; damage not quantified as a percentage | Materials tested: thick gauze, thin gauze, spandex, and a combination; best results with thick gauze |
| Lab2Field transfer of a robotic raspberry harvester enabled by a soft sensorized PT [5] | Raspberries | Parallel jaws with silicone fingers; clamping force control | 4/7 attempts successful in full-pipeline test (field) | ≈80% harvest with little or no damage (during lab → field transfer) | Alignment errors Δx, Δy evaluated; measurement of tensile and compressive forces during harvesting |
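Several of the grippers compared above regulate grasping against a measured contact force, e.g., the clamping-force control of the Lab2Field harvester [5] or the fuzzy force control in [92]. The sketch below is a generic illustration of such a force-regulated grasp loop, not a reimplementation of any cited controller; the gain, target force, and the linear spring model of the fruit are all assumed values:

```python
def grip_step(measured_force, target_force, aperture, gain=0.001,
              min_aperture=0.0, max_aperture=0.05):
    """One proportional control step: close while the measured force is below
    the target, back off on overshoot. Aperture is the finger opening in metres."""
    error = target_force - measured_force      # force error in newtons
    aperture -= gain * error                   # smaller aperture = tighter grip
    return min(max(aperture, min_aperture), max_aperture)

def simulate_grasp(target=1.0, stiffness=200.0, contact_aperture=0.03, steps=200):
    """Toy closed loop: the fruit is modelled as a linear spring (an assumption),
    producing force once the fingers close past the contact aperture."""
    aperture = 0.05                            # start fully open
    force = 0.0
    for _ in range(steps):
        force = stiffness * max(0.0, contact_aperture - aperture)
        aperture = grip_step(force, target, aperture)
    return force                               # settles near the target force
```

Real systems (tactile arrays, fuzzy rules, slip detection) are considerably more involved; the point here is only the closed loop between a force reading and the finger actuation that the compared studies share.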
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Suchopár, A.; Kuře, J.; Kuřetová, B.; Hromasová, M. A Review of Integrated Approaches in Robotic Raspberry Harvesting. Agronomy 2025, 15, 2677. https://doi.org/10.3390/agronomy15122677