Review

Advances in Berry Harvesting Robots

1 Shandong Academy of Agricultural Machinery Sciences (SAAMS), Jinan 250100, China
2 Key Laboratory of Intelligent Agricultural Equipment in Hilly and Mountainous Areas of Shandong Province, Jinan 250100, China
3 Huang Huai Hai Key Laboratory of Modern Agricultural Equipment, Ministry of Agriculture and Rural Affairs, Jinan 250100, China
4 Agricultural Robotics Innovation Team (ARIT), SAAMS, Jinan 250100, China
5 School of Automation, Qingdao University, Qingdao 266071, China
* Authors to whom correspondence should be addressed.
Horticulturae 2025, 11(9), 1042; https://doi.org/10.3390/horticulturae11091042
Submission received: 3 July 2025 / Revised: 6 August 2025 / Accepted: 25 August 2025 / Published: 2 September 2025
(This article belongs to the Special Issue A New Wave of Smart and Mechanized Techniques in Horticulture)

Abstract

Berries are popular among consumers for their benefits in improving vision, lowering blood sugar, improving circulation, and protecting cardiovascular health. They are usually small, thin-skinned, and fragile, with inconsistent ripening times. Harvesting robots can accurately determine fruit ripeness and avoid the pulp breakage and nutrient loss caused by manual squeezing. This work reviews the recent development and application of berry harvesting robots with market prospects. It then discusses the key technologies of berry picking robots, including fruit detection and localization, motion planning, and end-effectors and harvesting mechanisms. It also discusses the challenges currently facing the development of berry harvesting robots, including external factors such as unstructured working environments and internal technical difficulties such as robot design and control. To address these challenges, future berry picking robots should focus on developing weakly supervised recognition models based on deep learning, high-speed collision-free multi-arm collaborative harvesting technology, and highly fault-tolerant harvesting technology to improve picking efficiency and quality, reduce fruit damage, and promote the automation and intelligence of berry harvesting.

1. Introduction

In recent years, edible berries have gained widespread popularity among consumers around the world. Berries such as blueberries, grapes, kiwifruits, and strawberries are usually flavorful, small, colorful, juicy, soft, and rich in nutrients [1,2]. Berries are rich in anthocyanins, ascorbic acid, and many other antioxidant substances, which have strong antioxidant and free radical scavenging functions [3]. For example, blueberries contain large amounts of anthocyanins, which are among the most powerful natural antioxidants [4,5]. They remove excess free radicals in the body and reduce cellular oxidative damage, thereby delaying aging, preventing cardiovascular disease, and reducing the incidence of chronic diseases [6,7,8].
As one of the critical steps of berry production, fruit harvesting needs to be completed within a specific time window; otherwise, it affects fruit quality as well as later sales [9,10]. Currently, berry harvesting relies mainly on manual operations in many countries such as China, which is inefficient and requires large amounts of manpower and material resources [11]. With the aging of the population in berry-producing countries such as Poland and Japan, the cost of berry harvesting is increasing year by year, which seriously affects the high-quality development of the berry industry [12,13]. As a result, a number of traditional berry harvesting machines have been developed to improve harvesting efficiency and reduce labor costs. As shown in Figure 1, a blueberry vibratory harvester was developed by Brondino et al. It used a comb-shaped device that removes fruit by inserting into the blueberry bush and applying vibrations to the branches. It could harvest fruit from two blueberry plants at the same time, which reduced costs by 30% compared with manual operation [14].
Malladi et al. designed a hand-held vibratory picking device for rabbiteye and southern highbush blueberries. The experimental results showed that the vibratory picker could separate the fruit by applying vibration to the blueberry branches for 3–4 s. However, the harvesting effect varied greatly depending on plant characteristics: the device harvested only 1% immature fruit from rabbiteye blueberries but up to 23% from southern highbush blueberries, which showed that it was well suited to rabbiteye blueberry harvesting. The device could be used for small-batch blueberry harvesting but could hardly meet the needs of large-batch harvesting [15]. Wang et al. designed a low-loss harvesting machine for wolfberry fruits (Figure 2). Wolfberry fruit harvesting was achieved by workers operating a retractable vibrating harvesting mechanism. Internal and external collection modules made of cushioning material were designed to catch as many fruits as possible while minimizing damage. The machine achieved a harvesting efficiency of nearly 90% for mature fruits and 7.40% for immature fruits [16]. Sargent et al. developed an improved over-the-row mechanical blueberry harvester that caused less damage to the fruit after harvesting [17,18].
With the improvement of berry mechanical harvesting principles and technology, traditional harvesting machinery has dramatically improved efficiency compared with manual harvesting, but it still has many shortcomings, such as a high harvest rate of immature fruit, a low degree of automation, and severe fruit damage [18,19]. To solve these problems, berry harvesting robots with high intelligence and fruit picking accuracy have been developed [20,21,22,23,24,25,26]. They can accurately locate ripe berries through sensors and object recognition technology, avoiding the collection of unripe or decaying berries caused by errors in judgment during manual picking [26]. Moreover, they operate at a stable speed, greatly improving picking efficiency. Their flexible mechanical arms and bionic grippers work together to harvest gently, significantly reducing the rate of fruit damage [27].
This paper focuses on the research of berry harvesting robots. Relevant studies on berry harvesting robots, including their fruit recognition and positioning technologies, motion planning technologies, as well as fruit separation and fixation technologies, were systematically retrieved from academic databases such as Web of Science, Google Scholar, ScienceDirect, and IEEE Xplore. These articles were carefully organized and classified based on their research focuses and technical characteristics. This study further analyzes the key issues and challenges confronting berry harvesting robots and forecasts their future development trends, with the objective of offering valuable insights for subsequent research.

2. Overview of Berry Harvesting Robots

As shown in Figure 3, most berry harvesting robots consist of perception components, decision-making components, and execution components. The perception components are the “eyes” and “nerves” that allow berry harvesting robots to operate precisely. They mainly include visual sensors, force sensors, and distance sensors. Visual sensors such as CCD cameras and RGB-D cameras can obtain the color, shape, size, and position of the fruit, providing the basis for fruit localization and motion planning. Force sensors are usually integrated into the end-effector of the manipulator and can sense the contact force between the fruit and the fruit stalk in real time during picking, avoiding damage to the fruit or the berry plant caused by excessive force. Distance sensors such as infrared sensors and laser radars can detect the distance between the robot and obstacles such as branches and leaves, helping the robot avoid obstacles and adjust its movement trajectory in complex field environments. These sensors work together to enable berry harvesting robots to accurately perceive the working environment and the state of the target fruit and to carry out intelligent and flexible picking operations [28]. The decision-making components constitute the “brain” of the harvesting robot: by fusing and analyzing multi-source data, they generate and optimize decisions for fruit recognition and positioning, manipulator motion planning, and picking force control [29]. The execution components are the “hands and feet” that complete the fruit picking task and directly determine the accuracy and efficiency of the operation; they mainly include the manipulator, the end-effector, and the mobile platform [30,31].
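The interplay of these three component groups can be pictured as a perceive–decide–act loop. The following minimal sketch is purely illustrative (the class and function names are hypothetical and not taken from any cited system); it shows how perception output feeds the decision stage, which in turn drives the execution stage.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FruitObservation:
    """Output of the perception components for one candidate fruit."""
    xyz_m: tuple          # 3D position in the robot base frame (from RGB-D / stereo)
    ripeness: float       # 0..1 ripeness score from the vision model
    occluded: bool        # whether leaves/branches block the approach path

def decide_targets(observations: List[FruitObservation],
                   ripeness_threshold: float = 0.8) -> List[FruitObservation]:
    """Decision stage: keep only ripe, unobstructed fruits and sort them
    into a picking order (here simply nearest-first along the row)."""
    candidates = [o for o in observations
                  if o.ripeness >= ripeness_threshold and not o.occluded]
    return sorted(candidates, key=lambda o: o.xyz_m[0])

def execute_pick(target: FruitObservation) -> bool:
    """Execution stage placeholder: move the arm to the target, close the
    end-effector, detach and deposit the fruit.  A real robot would run
    motion planning and force control here."""
    print(f"picking fruit at {target.xyz_m}, ripeness {target.ripeness:.2f}")
    return True

# One cycle of the perceive -> decide -> act loop with mocked perception data.
observations = [
    FruitObservation(xyz_m=(0.42, 0.10, 0.55), ripeness=0.91, occluded=False),
    FruitObservation(xyz_m=(0.38, -0.05, 0.60), ripeness=0.55, occluded=False),
    FruitObservation(xyz_m=(0.50, 0.20, 0.52), ripeness=0.88, occluded=True),
]
for fruit in decide_targets(observations):
    execute_pick(fruit)
```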
Fieldwork Robotics in Cambridge, UK, developed a raspberry harvesting robot that can harvest more than 25,000 raspberries in one day on an open-field farm, far more than the average worker’s output of 15,000. It had multiple built-in sensors and 3D cameras, used machine learning to detect raspberries, picked the fruits and placed them in trays, and sorted them according to ripeness [32]. As shown in Figure 4, a multi-row harvesting robot for open-field strawberry farms was developed by Harvest CROO Robotics in Florida, US, which could harvest more than a dozen rows of strawberries at the same time [33,34]. During harvesting, dual cameras on the robot not only scanned the fruits to determine whether they were ripe for picking but also collected information about plant health and made yield predictions. Each fruit was accurately recognized and assigned coordinates so that the robot could pick it precisely.
The SW 6010 strawberry harvesting robot was developed by Agrobot Inc. in Huelva, Spain, for open-field farms. It used AGvision optical sensors, which recognized and located harvestable fruit according to size, external quality, and ripeness. Multiple simultaneous robotic arms and end-effectors were used to improve operational efficiency. After picking, the fruits were placed on a conveyor belt and sent to the packing area for immediate sorting and packing [35,36]. Shibuya Seiki Inc. in Manassas, US, developed a strawberry harvesting robot that takes about 8 s to pick a single strawberry in a greenhouse. It used a three-dimensional camera system that captured the color of the strawberries to determine whether the fruit was ripe, then cut the ripe fruit and placed it into a basket [24,37]. Octinion in Leuven, Belgium, developed the Rubion harvesting robot for greenhouse strawberries, which picked, graded, and packed the strawberries in a single operation (Figure 5). It picked ripe strawberries from below upward, which shortens the time needed to locate the correct picking point and therefore the decision time. Its picking speed, picking quality, and sorting quality were comparable to those of a skilled human picker. It could theoretically harvest 180–360 kg of strawberries in 16 h, far exceeding the 50 kg that farmers can harvest by hand [38]. Parsa et al. proposed a modular and configurable harvesting system that is highly adaptable to different strawberry varieties and growing conditions in a glasshouse. It integrated independent grasping and obstruction removal mechanisms to selectively pick target strawberries while minimizing fruit damage and waste. Field experiments at commercial farms showed a 95% success rate in detecting ripe fruits and an 87% harvesting success rate [39].

3. Fruit Detection and Localization Technology

One of the advantages of harvesting robots is that they can utilize multiple sensors to reliably determine whether the target fruit meets the harvesting criteria. They can also adjust the criteria by setting different harvesting thresholds. During berry harvesting, robots need to obtain information about the size, shape, color, texture, and hardness of the target fruit through various sensors to perform picking assessment, object recognition, and spatial localization. Most current berry harvesting robots use visual sensors to obtain information about fruit morphological features and picking scenes [40,41]. Commonly used visual sensors include color cameras, binocular cameras, RGB-D cameras, LiDAR, near-infrared cameras, and even multispectral cameras [40,42,43,44,45,46].
Fruit detection algorithms for berry harvesting robots currently fall into three main groups: traditional image processing-based methods, machine learning (ML)-based methods, and deep learning (DL)-based methods [47]. Traditional recognition methods segment fruits by features such as color and shape but adapt poorly to complex backgrounds or occluded scenes [48]. ML-based methods such as SVM and KNN combine features such as texture and edges to improve classification ability but rely on manual feature extraction and have limited generalization [49,50]. In recent years, convolutional neural networks (CNNs) have made great progress in multi-category object detection [51,52]. They can stack deeper networks of perceptron and hidden layers, use supervised or semi-supervised feature learning and hierarchical feature extraction, and thus learn feature representations at different levels directly from large amounts of data or images, creating better detection models [53,54]. As a result, DL-based fruit detection algorithms such as Faster R-CNN, Mask R-CNN, SSD, and YOLO are widely used in the latest berry harvesting robots. DL-based fruit detection algorithms used in berry harvesting robots are summarized in Table 1.
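As a concrete illustration of how such a detector is typically used on a harvesting robot, the minimal sketch below runs a generic YOLO-family model on one image and filters detections by confidence. It assumes the ultralytics Python API and a hypothetical fine-tuned weights file (berry_yolov8n.pt) and image name; neither corresponds to any specific model reviewed here.

```python
# A minimal inference sketch, assuming the ultralytics YOLO API and a
# hypothetical berry detector fine-tuned on local data ("berry_yolov8n.pt").
from ultralytics import YOLO

model = YOLO("berry_yolov8n.pt")                  # weights file is an assumption
results = model("strawberry_row.jpg", conf=0.5)   # keep detections above 50% confidence

for box in results[0].boxes:
    cls_name = model.names[int(box.cls)]          # e.g. "ripe" / "unripe" if trained that way
    x1, y1, x2, y2 = box.xyxy[0].tolist()         # pixel bounding box
    print(f"{cls_name}: score={float(box.conf):.2f}, "
          f"box=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```

The confidence threshold plays the role of the adjustable harvesting criterion mentioned above: raising it makes the robot more conservative about which fruits it attempts to pick.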
To address the low speed and accuracy of waxberry recognition in complex orchards, Yang et al. proposed a lightweight detection algorithm based on the YOLOv5 model. It used C3-Faster1 and C3-Faster2 modules to improve detection accuracy and speed. To enhance the ability to detect small fruits, the algorithm also introduced coordinate attention and a dynamic detection head. The mean average precision (mAP) of this algorithm was 91.9% [66]. To address the difficulty of recognizing and localizing fruits and picking points caused by leaf occlusion and overlapping fruits, Ma et al. proposed a novel model for detecting fruits and key points based on the YOLOv8-pose model. By introducing the C2F-OREPA module, the DCN-C2f module, and the EMA multi-scale attention mechanism, the detection accuracy and speed for strawberry fruits and plucking positions were effectively improved. The success rates of two-dimensional and three-dimensional localization of strawberry harvesting points by this model were 92.6% and 83.7%, respectively [55]. Sozzi et al. collected images of grapes under different lighting, backgrounds, and growth stages and investigated the detection performance of different YOLO versions on white grape bunches. As shown in Figure 6, the YOLOv5x algorithm was significantly better than the other algorithms in detection accuracy and speed. For occluded fruits, its average detection error was below 14% [62].
DL-based detection algorithms generally require collecting and labeling large amounts of data at high time cost. To address this problem, Ciarfuglia et al. proposed a weakly supervised fruit detection algorithm based on a small amount of labeled grape data. The method was validated in a vineyard in southern Lazio, Italy, using a pseudo-label generation strategy that combined YOLOv5 detection with a Mask R-CNN segmentation network. Experiments showed that it improved the detection mAP to 90.5% and the segmentation mAP to 78.30% with only 300 labeled images. This significantly reduced the dependence of agricultural robots on high-cost labeled data, making the approach well suited for small and medium-sized agribusinesses [63].
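The core idea behind such pseudo-labeling pipelines can be sketched independently of any particular network: a detector trained on the small labeled set is run over unlabeled images, and only its high-confidence predictions are kept as additional training labels. The sketch below is a generic illustration under that assumption; the detector argument stands in for any trained detection model, not the specific networks used in [63].

```python
from typing import Callable, Dict, List, Tuple

# A detection is (x1, y1, x2, y2, confidence, class_id) in pixel coordinates.
Detection = Tuple[float, float, float, float, float, int]

def generate_pseudo_labels(unlabeled_images: List[str],
                           detector: Callable[[str], List[Detection]],
                           conf_threshold: float = 0.8) -> Dict[str, List[Detection]]:
    """Keep only confident predictions so that label noise stays low.
    The resulting dictionary can be merged with the small hand-labeled set
    and used to retrain (or fine-tune) the detector."""
    pseudo_labels = {}
    for path in unlabeled_images:
        confident = [d for d in detector(path) if d[4] >= conf_threshold]
        if confident:                      # skip images with no reliable detections
            pseudo_labels[path] = confident
    return pseudo_labels

# Toy usage with a mocked detector standing in for a model trained on ~300 labeled images.
def mock_detector(path: str) -> List[Detection]:
    return [(10, 20, 60, 80, 0.93, 0), (100, 40, 150, 110, 0.42, 0)]

print(generate_pseudo_labels(["row1_0001.jpg"], mock_detector))
```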
To achieve accurate and rapid localization of bayberry fruits, Lei et al. developed a high-precision, high-speed segmentation algorithm based on machine learning. In complex orchard scenarios, the bayberry segmentation accuracy reached 97.4%, with a single frame processing time of only 0.136 s, meeting the requirements of harvesting robots [67]. Akiva et al. designed a DL algorithm based on U-Net for segmenting and counting cranberries. Compared to the original algorithm, it improved overall segmentation performance by more than 6.74% and counting results by 22.91% [68]. Ni et al. proposed a blueberry fruit recognition and localization framework based on 3D photogrammetry and the Mask R-CNN instance segmentation algorithm. The accuracy of determining the number of fruits in a cluster was 97.3% [69].
Onishi et al. reported an improved algorithm based on the SSD model to detect and localize target fruits and used a stereo vision system to obtain three-dimensional coordinates. The robot then planned the motion of the robotic arm joints through inverse kinematics and harvested the fruit by twisting the end-effector, with a recognition success rate of more than 90% [70]. Ge et al. realized strawberry localization by density-based clustering and position approximation methods and proposed a safe-region classification algorithm based on the Hough transform and 3D point cloud plane fitting to prevent the robotic arm from colliding with table boards and straps. The optimized localization method achieved a picking accuracy of 74.1% in the modified scenario [71]. Disturbances such as natural wind and mechanical collisions cause berries to oscillate, which makes fruit localization difficult and can lead to picking failures. Mehta et al. therefore proposed a particle-filter-based algorithm to locate the fruits. The algorithm operated stably in the presence of fruit detection errors and wind-induced fruit movement. It relied only on vision and multiple inexpensive cameras, which greatly reduced the hardware cost of the picking robot [72].
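The particle-filter idea in [72] can be reduced to a short sketch: the fruit position is represented by a cloud of weighted particles that is propagated with a random-walk motion model (standing in for wind-induced sway) and re-weighted against each new, noisy camera measurement. The sketch below is a one-dimensional toy version under those assumptions, not the multi-camera formulation of the original work.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=0.01, meas_std=0.02):
    """One predict/update/resample cycle for a 1-D fruit coordinate (metres)."""
    # Predict: random-walk motion model approximating wind-induced sway.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: re-weight particles by the likelihood of the camera measurement.
    weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_std) ** 2)
    weights = weights / np.sum(weights)
    # Resample: draw particles proportionally to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

true_position = 0.50                                  # metres along one axis
particles = rng.uniform(0.3, 0.7, size=500)           # initial guess spread
weights = np.full(500, 1.0 / 500)

for _ in range(20):
    noisy_measurement = true_position + rng.normal(0.0, 0.02)
    particles, weights = particle_filter_step(particles, weights, noisy_measurement)

print(f"estimated fruit position: {particles.mean():.3f} m (true {true_position} m)")
```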

4. Motion Planning Technology

Motion planning is an important technology for berry harvesting robots: it must produce stable, feasible motion paths from the initial position to the target position despite unstructured agricultural environments and robot constraints [73,74]. To control the harvesting motion effectively, robots first reconstruct a 3D map of the harvesting environment from stereo visual perception information. They then plan the joint trajectory of the robotic arm through inverse kinematics and, combined with the chosen fruit harvesting method, realize servo control of the end-effector [75].
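A minimal version of the first step in this pipeline, recovering a 3D picking target from stereo or RGB-D perception data, is sketched below: a detected fruit’s pixel coordinates and depth are deprojected through the pinhole camera model and transformed into the robot base frame. The intrinsic values and hand-eye transform are placeholders, not parameters of any reviewed system.

```python
import numpy as np

def pixel_to_camera(u, v, depth_m, fx, fy, cx, cy):
    """Deproject a pixel (u, v) with metric depth into camera-frame XYZ
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m, 1.0])      # homogeneous coordinates

# Placeholder intrinsics (typical of a 640x480 RGB-D sensor) and a placeholder
# hand-eye calibration: camera mounted 0.10 m above and 0.05 m ahead of the base.
fx = fy = 600.0
cx, cy = 320.0, 240.0
T_base_camera = np.array([[1, 0, 0, 0.05],
                          [0, 1, 0, 0.00],
                          [0, 0, 1, 0.10],
                          [0, 0, 0, 1.00]])

fruit_px = (350, 260)          # detected fruit centre in the image
fruit_depth = 0.45             # metres, from the depth channel

p_cam = pixel_to_camera(*fruit_px, fruit_depth, fx, fy, cx, cy)
p_base = T_base_camera @ p_cam
print("picking target in base frame (m):", p_base[:3])
```

The resulting base-frame target is what inverse kinematics and the trajectory planner subsequently operate on.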
Conventional motion planning algorithms such as RRT cannot meet the efficiency and quality requirements of harvesting robots in complex greenhouse environments. Chen et al. proposed a workspace-decomposition-based motion planning algorithm for tomato harvesting robots. The algorithm made full use of the spatial location data of obstacles such as branches, leaves, and fruits and considered all feasible motion trajectories, so that it planned the shortest motion trajectory. The success rate of robotic tomato harvesting using the algorithm was 80%, with a picking time of about 10 s per tomato [76]. Williams et al. proposed a cooperative motion planning algorithm for dynamic kiwifruit harvesting with the goal of minimizing harvesting time. It first clustered the fruit locations identified by the vision system. The clustered fruits were then dynamically assigned to four harvesting arms depending on whether the arms would interfere with one another. The algorithm shortened the average harvesting time of a single kiwifruit to 5.5 s [23].
To address the low harvest efficiency and high fruit damage of robots in dynamic, uncertain vineyard environments, Luo et al. proposed a novel collision-free autonomous motion planning algorithm for grape picking robots. The algorithm, based on energy optimization and an artificial potential field, effectively improved the obstacle avoidance ability and operational efficiency of serial robots in complex environments. Its average computation time was less than 300 ms, and its success rate was 90% [77]. Jiang et al. designed a new dual-arm cooperative grape picking robot to address the low efficiency and high damage rate of horizontal trellis grape harvesting. As shown in Figure 7, a dual-arm cooperative harvesting strategy was proposed based on symmetric space segmentation and hazardous-area collision avoidance. In the symmetric spatial division, the left arm was responsible for one side of the area and the right arm for the other, with the camera’s central axis as the boundary, so that the two arms could work in parallel without interference. The center of the scaffolding was defined as the danger zone, where the robot adopted a master–slave asynchronous strategy (the priority arm moves first while the other arm waits), greatly reducing the risk of collision [78].
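The attraction/repulsion idea behind such potential-field planners can be shown in a compact 2-D sketch: the end-effector is pulled toward the target fruit by an attractive term and pushed away from nearby obstacles (e.g., a branch or trellis wire) by a repulsive term that activates only inside an influence radius. The gains and geometry below are illustrative assumptions, not those of [77].

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=0.02,
             influence=0.15, step=0.02):
    """One gradient-descent step on the combined potential field (2-D, metres)."""
    force = k_att * (goal - pos)                       # attractive force toward the fruit
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:                       # repulsion only near the obstacle
            force += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

pos = np.array([0.0, 0.0])
goal = np.array([0.6, 0.4])                            # target fruit position
obstacles = [np.array([0.3, 0.2])]                     # e.g. a branch between arm and fruit

for _ in range(200):
    pos = apf_step(pos, goal, obstacles)
    if np.linalg.norm(goal - pos) < 0.01:
        break
print("reached:", pos)
```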
Xiong et al. proposed a dynamic obstacle-separation motion planning strategy based on an artificial shaking harvesting method to address the picking of clustered strawberries. Because it adopted a cavity clamping method to separate strawberries, it determined the region of interest around the target from the 3D point cloud and then distinguished the target fruit from obstacle objects. As shown in Figure 8, it made use of a zig-zag motion trajectory to push through obstacles and pick the target fruit [79].
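The zig-zag push-through motion can be approximated as a simple waypoint pattern: the end-effector advances toward the target while alternating lateral offsets so that neighbouring fruits and leaves are nudged aside before the final grasp. The sketch below only generates such waypoints under those assumptions; the amplitude and number of sweeps are arbitrary example values, not those of [79].

```python
import numpy as np

def zigzag_waypoints(start, target, n_sweeps=4, amplitude=0.03):
    """Generate waypoints from start to target (3-D, metres) that alternate
    lateral offsets perpendicular to the approach direction."""
    start, target = np.asarray(start, float), np.asarray(target, float)
    approach = target - start
    # A horizontal direction perpendicular to the approach vector.
    lateral = np.cross(approach, [0.0, 0.0, 1.0])
    lateral = lateral / (np.linalg.norm(lateral) + 1e-9)
    waypoints = []
    for i in range(1, n_sweeps + 1):
        frac = i / (n_sweeps + 1)
        side = amplitude * (1 if i % 2 else -1)        # alternate left/right
        waypoints.append(start + frac * approach + side * lateral)
    waypoints.append(target)                           # finish at the fruit itself
    return waypoints

for wp in zigzag_waypoints([0.0, 0.0, 0.5], [0.3, 0.1, 0.5]):
    print(np.round(wp, 3))
```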

5. Fruit Fixation and Separation Techniques

The harvesting of berry fruits by end-effectors is usually divided into two steps, fruit fixation and fruit separation, which in principle can be performed separately or in an integrated manner. There are two ways to fix fruits: restraining the fruit itself and restraining the fruit stalk (stem) [78]. The former is often used for picking kiwifruit and single-fruit tomatoes, while the latter is used for harvesting berries that grow in clusters, such as grapes. Methods of restraining the fruit can be categorized into four types: soft gripping, negative pressure suction, cavity clamping, and no fixation. Fruit separation is the core step of fruit harvesting and can be roughly divided into twisting off the fruit stalk and cutting off the fruit stalk [80,81]. Since the base of the stalk has a non-lignified part with lower mechanical strength, twisting it at the right angle separates the fruit from the stalk. This method is more conducive to fruit storage and transportation and extends the shelf life of the product. There are many ways to twist off the stalk (stem): pulling, twisting around the stalk, or twisting vertically, as well as combinations of these methods [20,26,82,83,84,85,86,87,88,89,90]. The relevant details of end-effectors in berry harvesting robots are shown in Table 2.
Xiong et al. reported an end-effector for greenhouse strawberry harvesting. As shown in Figure 9, the end-effector consisted of multiple active fingers, passive fingers, and a cutting device. It used a swallowing method to pick strawberry fruits from below, achieving high tolerance to positional error and a high picking speed [20].
Traditional mechanical grippers lack adaptability and flexibility, making it difficult to conform to the shape of berries and leading to low picking success and a high probability of fruit damage. Flexible grippers, through flexible materials and adaptive structures, reduce the reliance on high-precision visual localization and force control algorithms and thus reduce the complexity and cost of robotic systems [86,95]. Visentin et al. presented a flexible end-effector with a built-in force sensor for the precision picking of small berries such as strawberries. With the help of a deep learning algorithm, the sensor could estimate the force on the berry from changes in the position and shape of marking points and then guide the flexible end-effector to harvest the fruit with low loss [89].
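A simple way to picture how such force feedback is used is a threshold-based closing loop: the gripper closes in small increments until the estimated contact force enters a window that is high enough to hold the berry but below its damage limit. The sketch below is a generic illustration with made-up force values; it does not reproduce the learned force estimator of [89].

```python
def close_gripper_with_force_feedback(read_force_n, command_step,
                                      hold_min_n=0.3, damage_max_n=1.0,
                                      max_steps=50):
    """Close in small steps until force lies inside [hold_min_n, damage_max_n] newtons."""
    for _ in range(max_steps):
        force = read_force_n()
        if force >= damage_max_n:       # abort before the berry is crushed
            return False
        if force >= hold_min_n:         # firm enough to hold the fruit
            return True
        command_step()                  # close a little more
    return False

# Toy usage with a mocked sensor whose reading rises as the gripper closes.
state = {"closure": 0.0}
def mock_read_force():
    return max(0.0, 2.0 * (state["closure"] - 0.2))   # contact starts at 20% closure
def mock_step():
    state["closure"] += 0.02

print("grasp ok:", close_gripper_with_force_feedback(mock_read_force, mock_step))
```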
As shown in Figure 10, Elfferich et al. presented a twisting-tube device for blackberry harvesting with a harvesting success rate of 82%. It applied even pressure on the blackberry fruits so that they were held in place while damage was minimized [90].
To improve the efficiency and accuracy of picking ripe blackberries, Qiu et al. proposed a visual-servo soft picking gripper based on tendon-driven actuation. As shown in Figure 11, it consisted of a small camera, a near-infrared sensor, multiple fingers, and their driving mechanism. It relied on the near-infrared sensor to determine fruit ripeness and successfully harvested more than 100 berries under outdoor conditions [26].
Graham et al. developed an end-effector for kiwifruit harvesting. It used electric shearing to separate the fruit from the stem and was equipped with a force sensor to prevent the fruit from dropping prematurely or the kiwifruit plant from being damaged. It had a high picking speed and could complete the cut in less than 0.1 s [96]. Rong et al. proposed an electric end-effector for tomatoes. As shown in Figure 12, it mainly consisted of a fruit-stem fixing mechanism, a shearing mechanism, and their driving devices. It ingeniously used an eccentric wheel to fix and cut the tomato fruit stalks: a servo motor directly drove the eccentric wheel in reciprocating motion, thereby controlling the opening and closing of the entire end-effector [80].

6. Discussion

In unstructured planting environments, random fruit distribution, fruit clustering, mutual occlusion between targets and non-targets, and continuously changing background information and illumination lead to fruit misrecognition and omission. This requires further optimization of DL-based detection algorithms by enhancing the dataset, improving the feature extraction module, adding attention mechanism modules, and simplifying the model to improve fruit detection accuracy or speed [55,56]. Currently, most berry fruit detection algorithms are trained with supervised learning [62,66]. High-quality datasets are a prerequisite for strongly supervised fruit detection and localization models to perform well. However, factors such as the complexity of the background and the diversity of fruit targets in agricultural scenarios make image labeling very time-consuming and laborious [97]. Annotation noise may also be introduced unintentionally. Weakly supervised algorithms only need image-level labels for object detection, greatly reducing the annotation cost, which makes them one of the future development directions for scenarios with limited image data [63,98].
Sensor vibration, propagation path errors, and perception errors also bring many problems and challenges to fruit detection and localization [99]. They reduce the robustness and localization accuracy of the machine, which in turn affects harvesting speed and picking success [100]. Multi-sensor information fusion can effectively eliminate the noise and errors caused by a single sensor under lighting changes, occlusion, or view-angle limitations by fusing the color, shape, and depth information of berry images from visual sensors, the three-dimensional spatial coordinates and distance data provided by LiDAR, and the attitude information from the inertial measurement unit [101]. Through spatial and temporal alignment and feature complementation, it can achieve accurate perception of berry location, ripeness, and attitude, significantly improve the recognition and localization accuracy of picking robots in complex orchard environments, and reduce the rates of missed picking and fruit damage [102,103].
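One common way to combine such complementary measurements is inverse-variance weighting: each sensor’s position estimate is weighted by how uncertain it is, so the fused estimate leans toward whichever source is more reliable under the current conditions. The sketch below is a minimal stand-in for the full fusion pipelines cited above; the example positions and variances are assumed values.

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of independent 3-D position estimates."""
    estimates = np.asarray(estimates, float)           # shape (n_sensors, 3)
    weights = 1.0 / np.asarray(variances, float)       # one variance per sensor
    weights = weights / weights.sum()
    return (weights[:, None] * estimates).sum(axis=0)

# Assumed example: RGB-D camera (noisier in depth under occlusion) vs. LiDAR.
camera_xyz = [0.52, 0.11, 0.46]
lidar_xyz = [0.50, 0.10, 0.44]
fused = fuse_estimates([camera_xyz, lidar_xyz], variances=[4e-4, 1e-4])
print("fused fruit position (m):", np.round(fused, 3))
```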
Berries are susceptible to damage during mechanized harvesting, which decreases their nutritional value and quality [104]; robotic harvesting must therefore be safe and lossless throughout the entire process. In some markets where fresh berries are the main product, mechanical harvesting is often not accepted by consumers, who worry that it will damage the fruit and thus reduce its quality and edibility. The amount of force applied by the end-effector to the fruit is therefore important during picking: if the force is too low, the fruit cannot be picked, and if it is too high, the fruit is damaged. In addition, friction between the fruit surface and the end-effector, or dropping the fruit during harvesting, can also cause damage [105]. How to precisely control the force exerted by end-effectors on berry fruits during picking is thus a future research direction [81].
Traditional harvesting robots rely on precise hand–eye localization of the fruit and fruit stalk. The integrated clamping-and-shearing harvesting method has been widely used for grapes, strawberries, bunch tomatoes, and other crops, and its success rate depends heavily on the visual localization of the fruits and the positioning of the end-effector. Researchers have therefore studied the recognition of fruit stems and the precise positioning of picking points [49,55,106,107]. However, the stems of some berries are slender and easily obscured, which makes their precise localization very challenging. In addition, to avoid damaging berry fruits, the arm and end-effector sometimes must cut the stalk and harvest the fruit at a specific position and angle. This requires considerable time for motion trajectory planning, leading to low picking efficiency among leafy branches [108]. Future berry harvesting robots should therefore be developed with comprehensive, strong tolerance capabilities so as to achieve highly reliable and efficient picking in complex agricultural environments [109]. For example, end-effectors can incorporate sensors that perceive the positioning errors of the vision system and apply compensation mechanisms to eliminate them. In this way, the end-effector can tolerate larger positional deviations, reducing the complexity and cost of the 3D localization system. Research on single-arm berry harvesting technology is already relatively mature [60,105], but its picking speed cannot yet compete with manual operation [11]. Multi-arm berry harvesting robots can complete multi-target operations simultaneously or cooperatively, significantly improving picking efficiency. Multi-arm picking robots can realize high-speed target sensing, picking, and recovery and are becoming the mainstream of picking robot technology and industrial development [78]. Combined with coordinated picking and transportation between robots, they offer an effective solution to the efficiency problem of robotic berry harvesting [110]. Future berry harvesting robots will require robotic-arm motion planning methods based on composite-configuration visual servoing, multi-arm cooperative high-speed collision-free operation planning and decision-making models, and continuous optimization of the fruit picking order and the multi-arm cooperative operation strategy [111,112,113], as illustrated by the toy sketch below.
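The following sketch divides detected fruits between a left and a right arm by a symmetric spatial boundary (as in the dual-arm strategy discussed earlier) and then orders each arm’s fruits with a greedy nearest-neighbour rule to shorten travel. It is a planning heuristic for illustration only, with assumed home positions and coordinates, not the decision models proposed in the cited works.

```python
import numpy as np

def assign_and_order(fruits_xyz, boundary_y=0.0, left_home=(0.0, 0.4, 0.3),
                     right_home=(0.0, -0.4, 0.3)):
    """Split fruits by the y-coordinate boundary (left/right arm workspaces),
    then order each arm's fruits greedily by nearest-neighbour distance."""
    fruits = [np.asarray(f, float) for f in fruits_xyz]
    tasks = {"left": [f for f in fruits if f[1] >= boundary_y],
             "right": [f for f in fruits if f[1] < boundary_y]}
    homes = {"left": np.asarray(left_home), "right": np.asarray(right_home)}
    plans = {}
    for arm, targets in tasks.items():
        pos, order = homes[arm], []
        remaining = list(targets)
        while remaining:
            nearest = min(remaining, key=lambda f: np.linalg.norm(f - pos))
            remaining = [f for f in remaining if f is not nearest]
            order.append(nearest)
            pos = nearest
        plans[arm] = order
    return plans

fruits = [(0.3, 0.2, 0.5), (0.4, -0.1, 0.6), (0.35, 0.05, 0.55), (0.5, -0.3, 0.5)]
for arm, order in assign_and_order(fruits).items():
    print(arm, [tuple(np.round(f, 2)) for f in order])
```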

7. Conclusions

Harvesting robots have become an important direction in the development of international berry harvesting technology. This study provides a systematic summary and analysis of berry harvesting robots in recent years, reviewing the development and application of berry robotic harvesting technologies with marketable prospects. It focuses on the development status of harvesting object detection and localization technology, motion planning technology, and end-effectors and harvesting mechanisms. In highly unstructured environments, and for fruits with significant individual differences in size, shape, color, softness, hardness, position, and occlusion, the reliance of roughly eighty percent of berry harvesting robots on obstacle-avoidance path planning and high positioning accuracy not only results in poor adaptability to the working environment and low success rates but also severely limits operational efficiency. Therefore, berry harvesting robots need to move toward high fault tolerance and high stability. Optimizing operational efficiency is key to the practical application of berry harvesting robots. Designing lightweight deep learning algorithms, optimizing the picking and recovery trajectories of robotic arms, and developing efficient flexible end-effectors can all effectively improve operational efficiency. In addition, developing multi-arm collaborative robots capable of high-speed collision-free operation is another future direction. We hope that this study can serve as a reference for subsequent research on berry harvesting robots and accelerate innovative research and development of their key technologies, thereby reducing the labor intensity of berry farmers and promoting the intelligent development of agriculture.

Author Contributions

Writing—review and editing, X.S.; writing—original draft preparation, S.W. (Shaowei Wang), B.Z., X.D., P.Q., Z.Z., S.W. (Shucheng Wang), S.W. (Shubo Wang), and H.Y.; formal analysis, S.W. (Shaowei Wang) and B.Z.; resources, S.W. (Shucheng Wang); supervision, B.Z. and X.D.; data curation, X.D.; software, S.W. (Shubo Wang) and P.Q.; methodology, X.S.; investigation, S.W. (Shucheng Wang); visualization, Z.Z.; funding acquisition, X.S. and H.Y.; project administration, H.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the Natural Science Foundation Project in Shandong Province of China (No. ZR2024QE350), Shandong Province Key Research and Development Plan Project (Major Scientific and Technological Innovation Project) (No. 2022CXGC020701, 2023CXGC010702), National Key Research and Development Plan Project of China (No. 2023YFD2001100), Shandong Province Key Research and Development Plan Project (Rural Revitalization Technological Innovation Boosting Action Plan Project) (No. 2023TZXD061), Shandong Province Science and Technology-based Small and Medium-sized Enterprises Innovation Capacity Enhancement Project (2022TSGC2253, 2023TSGC0049) and Shandong Academy of Agricultural Sciences Agricultural Science and Technology Innovation Project (No. CXGC2025H07, CXGC2025H03, CXGC2024D17).

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Barkaoui, S.; Madureira, J.; Boudhrioua, N.; Cabo Verde, S. Berries: Effects on Health, Preservation Methods, and Uses in Functional Foods: A Review. Eur. Food Res. Technol. 2023, 249, 1689–1715. [Google Scholar] [CrossRef]
  2. Golovinskaia, O.; Wang, C.-K. Review of Functional and Pharmacological Activities of Berries. Molecules 2021, 26, 3904. [Google Scholar] [CrossRef] [PubMed]
  3. Petruskevicius, A.; Viskelis, J.; Urbonaviciene, D.; Viskelis, P. Anthocyanin Accumulation in Berry Fruits and Their Antimicrobial and Antiviral Properties: An Overview. Horticulturae 2023, 9, 288. [Google Scholar] [CrossRef]
  4. Gui, H.; Dai, J.; Tian, J.; Jiang, Q.; Zhang, Y.; Ren, G.; Song, B.; Wang, M.; Saiwaidoula, M.; Dong, W.; et al. The Isolation of Anthocyanin Monomers from Blueberry Pomace and Their Radical-Scavenging Mechanisms in DFT Study. Food Chem. 2023, 418, 135872. [Google Scholar] [CrossRef] [PubMed]
  5. Ma, Y.; Feng, Y.; Diao, T.; Zeng, W.; Zuo, Y. Experimental and Theoretical Study on Antioxidant Activity of the Four Anthocyanins. J. Mol. Struct. 2020, 1204, 127509. [Google Scholar] [CrossRef]
  6. Mottaghipisheh, J.; Doustimotlagh, A.H.; Irajie, C.; Tanideh, N.; Barzegar, A.; Iraji, A. The Promising Therapeutic and Preventive Properties of Anthocyanidins/Anthocyanins on Prostate Cancer. Cells 2022, 11, 1070. [Google Scholar] [CrossRef]
  7. Reis, J.F.; Monteiro, V.V.S.; de Souza Gomes, R.; do Carmo, M.M.; da Costa, G.V.; Ribera, P.C.; Monteiro, M.C. Action Mechanism and Cardiovascular Effect of Anthocyanins: A Systematic Review of Animal and Human Studies. J. Transl. Med. 2016, 14, 315. [Google Scholar] [CrossRef]
  8. Wallace, T.; Slavin, M.; Frankenfeld, C. Systematic Review of Anthocyanins and Markers of Cardiovascular Disease. Nutrients 2016, 8, 32. [Google Scholar] [CrossRef]
  9. Mikulic-Petkovsek, M.; Veberic, R.; Hudina, M.; Zorenc, Z.; Koron, D. Fruit Quality Characteristics and Biochemical Composition of Fully Ripe Blackberries Harvested at Different Times. Foods 2021, 10, 1581. [Google Scholar] [CrossRef]
  10. Çolak, A.M.; Okatan, V.; Polat, M.; Güçlü, S.F. Different Harvest Times Affect Market Quality of Lyciumbarbarum L. Berries. Turk. J. Agric. For. 2019, 43, 326–333. [Google Scholar] [CrossRef]
  11. Guo, J.; Yang, Z.; Karkee, M.; Jiang, Q.; Feng, X.; He, Y. Technology Progress in Mechanical Harvest of Fresh Market Strawberries. Comput. Electron. Agric. 2024, 226, 109468. [Google Scholar] [CrossRef]
  12. Suresh Kumar, M.; Mohan, S. Selective Fruit Harvesting: Research, Trends and Developments towards Fruit Detection and Localization—A Review. Proc. Inst. Mech. Eng. C J. Mech. Eng. Sci. 2023, 237, 1405–1444. [Google Scholar] [CrossRef]
  13. Pu, Y.; Wang, S.; Yang, F.; Ehsani, R.; Zhao, L.; Li, C.; Xie, S.; Yang, M. Recent Progress and Future Prospects for Mechanized Harvesting of Fruit Crops with Shaking Systems. Int. J. Agric. Biol. Eng. 2023, 16, 1–13. [Google Scholar] [CrossRef]
  14. Brondino, L.; Borra, D.; Giuggioli, N.R.; Massaglia, S. Mechanized Blueberry Harvesting: Preliminary Results in the Italian Context. Agriculture 2021, 11, 1197. [Google Scholar] [CrossRef]
  15. Malladi, A.; Vashisth, T.; NeSmith, S. Development and Evaluation of a Portable, Handheld Mechanical Shaker to Study Fruit Detachment in Blueberry. Hortscience 2013, 48, 394–397. [Google Scholar] [CrossRef]
  16. Wang, Y.; Yang, C.; Gao, Y.; Lei, Y.; Ma, L.; Qu, A. Design and Testing of an Integrated Lycium barbarum L. Harvester. Agriculture 2024, 14, 1370. [Google Scholar] [CrossRef]
  17. Sargent, S.A.; Takeda, F.; Williamson, J.G.; Berry, A.D. Harvest of Southern Highbush Blueberry with a Modified, over-the-Row Mechanical Harvester: Use of Soft-Catch Surfaces to Minimize Impact Bruising. Agronomy 2021, 11, 1412. [Google Scholar] [CrossRef]
  18. Sargent, S.A.; Takeda, F.; Williamson, J.G.; Berry, A.D. Harvest of Southern Highbush Blueberry with a Modified, over-the-Row Mechanical Harvester: Use of Handheld Shakers and Soft Catch Surfaces. Agriculture 2020, 10, 4. [Google Scholar] [CrossRef]
  19. Yu, P.; Li, C.; Rains, G.; Hamrita, T. Development of the Berry Impact Recording Device Sensing System: Software. Comput. Electron. Agric. 2011, 77, 195–203. [Google Scholar] [CrossRef]
  20. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  21. Liu, X.; Li, P.; Hu, B.; Yin, H.; Wang, Z.; Li, W.; Xu, Y.; Li, B. The Identification, Separation, and Clamp Function of an Intelligent Flexible Blueberry Picking Robot. Processes 2024, 12, 2591. [Google Scholar] [CrossRef]
  22. Johan From, P.; Grimstad, L.; Hanheide, M.; Pearson, S.; Cielniak, G. RASberry—Robotic and Autonomous Systems for Berry Production. Mech. Eng. 2018, 140, S14–S18. [Google Scholar] [CrossRef]
  23. Williams, H.A.M.; Jones, M.H.; Nejati, M.; Seabright, M.J.; Bell, J.; Penhall, N.D.; Barnett, J.J.; Duke, M.D.; Scarfe, A.J.; Ahn, H.S.; et al. Robotic Kiwifruit Harvesting Using Machine Vision, Convolutional Neural Networks, and Robotic Arms. Biosyst. Eng. 2019, 181, 140–156. [Google Scholar] [CrossRef]
  24. Hayashi, S.; Yamamoto, S.; Tsubota, S.; Ochiai, Y.; Kobayashi, K.; Kamata, J.; Kurita, M.; Inazumi, H.; Peter, R. Automation Technologies for Strawberry Harvesting and Packing Operations in Japan. J. Berry Res. 2014, 4, 19–27. [Google Scholar] [CrossRef]
  25. Vrochidou, E.; Tziridis, K.; Nikolaou, A.; Kalampokas, T.; Papakostas, G.A.; Pachidis, T.P.; Mamalis, S.; Koundouras, S.; Kaburlasos, V.G. An Autonomous Grape-Harvester Robot: Integrated System Architecture. Electronics 2021, 10, 1056. [Google Scholar] [CrossRef]
  26. Qiu, A.; Young, C.; Gunderman, A.L.; Azizkhani, M.; Chen, Y.; Hu, A.-P. Tendon-Driven Soft Robotic Gripper with Integrated Ripeness Sensing for Blackberry Harvesting. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 11831–11837. [Google Scholar]
  27. Wang, C.; Pan, W.; Zou, T.; Li, C.; Han, Q.; Wang, H.; Yang, J.; Zou, X. A Review of Perception Technologies for Berry Fruit-Picking Robots: Advantages, Disadvantages, Challenges, and Prospects. Agriculture 2024, 14, 1346. [Google Scholar] [CrossRef]
  28. Maharshi, V.; Sharma, S.; Prajesh, R.; Das, S.; Agarwal, A.; Mitra, B. A Novel Sensor for Fruit Ripeness Estimation Using Lithography Free Approach. IEEE Sensors J. 2022, 22, 22192–22199. [Google Scholar] [CrossRef]
  29. Krishnan, A.; Swarna, S.; Balasubramany, H.S. Robotics, IoT, and AI in the Automation of Agricultural Industry: A Review. In Proceedings of the 2020 IEEE Bangalore Humanitarian Technology Conference (B-HTC), Vijayapur, India, 8–10 October 2020; pp. 1–6. [Google Scholar]
  30. Lytridis, C.; Bazinas, C.; Kalathas, I.; Siavalas, G.; Tsakmakis, C.; Spirantis, T.; Badeka, E.; Pachidis, T.; Kaburlasos, V.G. Cooperative Grape Harvesting Using Heterogeneous Autonomous Robots. Robotics 2023, 12, 147. [Google Scholar] [CrossRef]
  31. Buayai, P.; Tan, Y.S.; Kamarudzaman, M.F.B.; Makino, K.; Nishizaki, H.; Mao, X. Automating Grape Thinning: Predicting Robotic Arm End-Effector Positions Using Depth Sensing Technology and Neural Networks. In Proceedings of the 2023 IEEE International Workshop on Metrology for Agriculture and Forestry (Metroagrifor), Pisa, Italy, 6–8 November 2023; pp. 76–80. [Google Scholar]
  32. Verbiest, R.; Ruysen, K.; Vanwalleghem, T.; Demeester, E.; Kellens, K. Automation and Robotics in the Cultivation of Pome Fruit: Where Do We Stand Today? J. Field Rob. 2021, 38, 513–531. [Google Scholar] [CrossRef]
  33. Bogue, R. Fruit Picking Robots: Has Their Time Come? Ind. Robot Int. J. Robot. Res. Appl. 2020, 47, 141–145. [Google Scholar] [CrossRef]
  34. De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-PapersOnLine 2018, 51, 14–19. [Google Scholar] [CrossRef]
  35. Hambling, D. WE: ROBOT: The Robots That Already Rule Our World; Aurum: Gunsan, Republic of Korea, 2018; ISBN 1-78131-805-0. [Google Scholar]
  36. Cao, L.; Chen, Y.; Jin, Q. Lightweight Strawberry Instance Segmentation on Low-Power Devices for Picking Robots. Electronics 2023, 12, 3145. [Google Scholar] [CrossRef]
  37. Shigehiko, H.; Satoshi, Y.; Masago, T.; Kobayashi, L.; Junzo, K.; Rajendra, P.; Kazuhiro, Y. Development and practical application of a stationary strawberry harvesting robot integrated with mobile cultivation system. J. Agric. Food Eng. 2017, 79, 415–425. [Google Scholar] [CrossRef]
  38. Kolhalkar, N.R.; Pandit, A.A.; Kedar, S.A.; Yedukondalu, G. Artificial Intelligence Algorithms for Robotic Harvesting of Agricultural Produce. In Proceedings of the 2025 1st International Conference on AIML-Applications for Engineering & Technology (ICAET), Pune, India, 16 January 2025; pp. 1–6. [Google Scholar]
  39. Parsa, S.; Debnath, B.; Khan, M.A.; Ghalamzan, E.A. Modular Autonomous Strawberry Picking Robotic System. J. Field Rob. 2024, 41, 2226–2246. [Google Scholar] [CrossRef]
  40. Pal, A.; Leite, A.C.; From, P.J. A Novel End-to-End Vision-Based Architecture for Agricultural Human–Robot Collaboration in Fruit Picking Operations. Robot. Auton. Syst. 2024, 172, 104567. [Google Scholar] [CrossRef]
  41. Zeeshan, S.; Aized, T.; Riaz, F. In-Depth Evaluation of Automated Fruit Harvesting in Unstructured Environment for Improved Robot Design. Machines 2024, 12, 151. [Google Scholar] [CrossRef]
  42. Yang, L.; Noguchi, T.; Hoshino, Y. Development of a Pumpkin Fruits Pick-and-Place Robot Using an RGB-D Camera and a YOLO Based Object Detection AI Model. Comput. Electron. Agric. 2024, 227, 109625. [Google Scholar] [CrossRef]
  43. Mack, J.; Schindler, F.; Rist, F.; Herzog, K.; Töpfer, R.; Steinhage, V. Semantic Labeling and Reconstruction of Grape Bunches from 3D Range Data Using a New RGB-D Feature Descriptor. Comput. Electron. Agric. 2018, 155, 96–102. [Google Scholar] [CrossRef]
  44. Kurtser, P.; Ringdahl, O.; Rotstein, N.; Berenstein, R.; Edan, Y. In-Field Grape Cluster Size Assessment for Vine Yield Estimation Using a Mobile Robot and a Consumer Level RGB-D Camera. IEEE Robot. Autom. Lett. 2020, 5, 2031–2038. [Google Scholar] [CrossRef]
  45. Yoshida, T.; Kawahara, T.; Fukao, T. Fruit Recognition Method for a Harvesting Robot with RGB-D Cameras. ROBOMECH J. 2022, 9, 15. [Google Scholar] [CrossRef]
  46. Petter Wold, J.; Vejle Andersen, P.; Aaby, K.; Fagertun Remberg, S.; Hansen, A.; O’Farrell, M.; Tschudi, J. Inter Seasonal Validation of Non-Contact NIR Spectroscopy for Measurement of Total Soluble Solids in High Tunnel Strawberries. Spectrochim. Acta Part A 2024, 309, 123853. [Google Scholar] [CrossRef]
  47. Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and Systems for Fruit Detection and Localization: A Review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
  48. Pothen, Z.S.; Nuske, S. Texture-Based Fruit Detection via Images Using the Smooth Patterns on the Fruit. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; Okamura, A., Menciassi, A., Ude, A., Burschka, D., Lee, D., Arrichiello, F., Liu, H., Moon, H., Neira, J., Sycara, K., et al., Eds.; IEEE: Piscataway, NJ, USA; 2016; pp. 5171–5176. [Google Scholar]
  49. Bai, Y.; Mao, S.; Zhou, J.; Zhang, B. Clustered Tomato Detection and Picking Point Location Using Machine Learning-Aided Image Analysis for Automatic Robotic Harvesting. Precis. Agric. 2023, 24, 727–743. [Google Scholar] [CrossRef]
  50. Kavitha, M.; Nirmala, P. Analysis and Comparison of SVM-RBF Algorithms for Colorectal Cancer Detection over Convolutional Neural Networks with Improved Accuracy. J. Pharm. Negat. Results 2022, 13, 94–103. [Google Scholar] [CrossRef]
  51. Zabawa, L.; Kicherer, A.; Klingbeil, L.; Töpfer, R.; Roscher, R.; Kuhlmann, H. Image-Based Analysis of Yield Parameters in Viticulture. Biosyst. Eng. 2022, 218, 94–109. [Google Scholar] [CrossRef]
  52. Kasinathan, T.; Uyyala, S.R. Detection of Fall Armyworm (Spodoptera Frugiperda) in Field Crops Based on Mask R-CNN. Signal Image Video Process. 2023, 17, 2689–2695. [Google Scholar] [CrossRef]
  53. Mekhalfi, M.L.; Nicolò, C.; Bazi, Y.; Al Rahhal, M.M.; Al Maghayreh, E. Detecting Crop Circles in Google Earth Images with Mask R-CNN and YOLOv3. Appl. Sci. 2021, 11, 2238. [Google Scholar] [CrossRef]
  54. Altaheri, H.; Alsulaiman, M.; Muhammad, G. Date Fruit Classification for Robotic Harvesting in a Natural Environment Using Deep Learning. IEEE Access 2019, 7, 117115–117133. [Google Scholar] [CrossRef]
  55. Ma, Z.; Dong, N.; Gu, J.; Cheng, H.; Meng, Z.; Du, X. STRAW-YOLO: A Detection Method for Strawberry Fruits Targets and Key Points. Comput. Electron. Agric. 2025, 230, 109853. [Google Scholar] [CrossRef]
  56. Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit Detection for Strawberry Harvesting Robot in Non-Structural Environment Based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
  57. Xie, H.; Zhang, D.; Yang, L.; Cui, T.; He, X.; Zhang, K.; Zhang, Z. Development, Integration, and Field Evaluation of a Dual-arm Ridge Cultivation Strawberry Autonomous Harvesting Robot. J. Field Rob. 2024, 42, 1783–1798. [Google Scholar] [CrossRef]
  58. Lawal, O.M. Study on Strawberry Fruit Detection Using Lightweight Algorithm. Multimed. Tools Appl. 2024, 83, 8281–8293. [Google Scholar] [CrossRef]
  59. Zhang, W.; Liu, Y.; Chen, K.; Li, H.; Duan, Y.; Wu, W.; Shi, Y.; Guo, W. Lightweight Fruit-Detection Algorithm for Edge Computing Applications. Front. Plant Sci. 2021, 12, 740936. [Google Scholar] [CrossRef] [PubMed]
  60. Seo, D.; Cho, B.-H.; Kim, K.-C. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
  61. Magalhães, S.A.; Castro, L.; Moreira, G.; dos Santos, F.N.; Cunha, M.; Dias, J.; Moreira, A.P. Evaluating the Single-Shot MultiBox Detector and YOLO Deep Learning Models for the Detection of Tomatoes in a Greenhouse. Sensors 2021, 21, 3569. [Google Scholar] [CrossRef] [PubMed]
  62. Sozzi, M.; Cantalamessa, S.; Cogato, A.; Kayad, A.; Marinello, F. Automatic Bunch Detection in White Grape Varieties Using YOLOv3, YOLOv4, and YOLOv5 Deep Learning Algorithms. Agronomy 2022, 12, 319. [Google Scholar] [CrossRef]
  63. Ciarfuglia, T.A.; Motoi, I.M.; Saraceni, L.; Fawakherji, M.; Sanfeliu, A.; Nardi, D. Weakly and Semi-Supervised Detection, Segmentation and Tracking of Table Grapes with Limited and Noisy Data. Comput. Electron. Agric. 2023, 205, 107624. [Google Scholar] [CrossRef]
  64. Zhou, X.; Zou, X.; Tang, W.; Yan, Z.; Meng, H.; Luo, X. Unstructured Road Extraction and Roadside Fruit Recognition in Grape Orchards Based on a Synchronous Detection Algorithm. Front. Plant Sci. 2023, 14, 1103276. [Google Scholar] [CrossRef]
  65. Chen, X.; Hu, D.; Cheng, Y.; Chen, S.; Xiang, J. EDT-YOLOv8n-Based Lightweight Detection of Kiwifruit in Complex Environments. Electronics 2025, 14, 147. [Google Scholar] [CrossRef]
  66. Yang, C.; Liu, J.; He, J. A Lightweight Waxberry Fruit Detection Model Based on YOLOv5. IET Image Proc. 2024, 18, 1796–1808. [Google Scholar] [CrossRef]
  67. Lei, H.; Li, C.; Tang, Y.; Zhong, Z.; Jiao, Z. Accurate and Rapid Image Segmentation Method for Bayberry Automatic Picking via Machine Learning. Int. J. Agric. Biol. Eng. 2023, 16, 246–254. [Google Scholar] [CrossRef]
  68. Akiva, P.; Dana, K.; Oudemans, P.; Mars, M. Finding Berries: Segmentation and Counting of Cranberries Using Point Supervision and Shape Priors. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Seattle, WA, USA, 14–19 June 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 219–228. [Google Scholar]
  69. Ni, X.; Li, C.; Jiang, H.; Takeda, F. Three-Dimensional Photogrammetry with Deep Learning Instance Segmentation to Extract Berry Fruit Harvestability Traits. ISPRS J. Photogramm. Remote Sens. 2021, 171, 297–309. [Google Scholar] [CrossRef]
  70. Onishi, Y.; Yoshida, T.; Kurita, H.; Fukao, T.; Arihara, H.; Iwai, A. An Automated Fruit Harvesting Robot by Using Deep Learning. ROBOMECH J. 2019, 6, 13. [Google Scholar] [CrossRef]
  71. Ge, Y.; Xiong, Y.; Tenorio, G.L.; From, P.J. Fruit Localization and Environment Perception for Strawberry Harvesting Robots. IEEE Access 2019, 7, 147642–147652. [Google Scholar] [CrossRef]
  72. Mehta, S.S.; Ton, C.; Asundi, S.; Burks, T.F. Multiple Camera Fruit Localization Using a Particle Filter. Comput. Electron. Agric. 2017, 142, 139–154. [Google Scholar] [CrossRef]
  73. Oikawa, Y.; Ohya, A.; Yorozu, A. Pre-Planning of Trajectory and Orbiting around Greenhouses to Shorten Operation Time for Fruit-Set-Reagent Spraying Robot. In Proceedings of the 2024 10th International Conference on Mechatronics and Robotics Engineering (ICMRE), Milan, Italy, 27 February 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 105–109. [Google Scholar]
  74. Sarabu, H.; Ahlin, K.; Hu, A.-P. Graph-Based Cooperative Robot Path Planning in Agricultural Environments. In Proceedings of the 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China, 8–12 July 2019; pp. 519–525. [Google Scholar]
  75. Zeeshan, S.; Aized, T. Performance Analysis of Path Planning Algorithms for Fruit Harvesting Robot. J. Biosyst. Eng. 2023, 48, 178–197. [Google Scholar] [CrossRef]
  76. Chen, B.; Gong, L.; Yu, C.; Du, X.; Chen, J.; Xie, S.; Le, X.; Li, Y.; Liu, C. Workspace Decomposition Based Path Planning for Fruit-Picking Robot in Complex Greenhouse Environment. Comput. Electron. Agric. 2023, 215, 108353. [Google Scholar] [CrossRef]
  77. Luo, L.; Wen, H.; Lu, Q.; Huang, H.; Chen, W.; Zou, X.; Wang, C. Collision-free Path-planning for six-DOF Serial Harvesting Robot Based on Energy Optimal and Artificial Potential Field. Complexity 2018, 2018, 3563846. [Google Scholar] [CrossRef]
  78. Jiang, Y.; Liu, J.; Wang, J.; Li, W.; Peng, Y.; Shan, H. Development of a Dual-Arm Rapid Grape-Harvesting Robot for Horizontal Trellis Cultivation. Front. Plant Sci. 2022, 13, 881904. [Google Scholar] [CrossRef]
  79. Xiong, Y.; Ge, Y.; From, P.J. An Obstacle Separation Method for Robotic Picking of Fruits in Clusters. Comput. Electron. Agric. 2020, 175, 105397. [Google Scholar] [CrossRef]
  80. Rong, J.; Hu, L.; Zhou, H.; Dai, G.; Yuan, T.; Wang, P. A Selective Harvesting Robot for Cherry Tomatoes: Design, Development, Field Evaluation Analysis. J. Field Rob. 2024, 41, 2564–2582. [Google Scholar] [CrossRef]
  81. Vrochidou, E.; Tsakalidou, V.N.; Kalathas, I.; Gkrimpizis, T.; Pachidis, T.; Kaburlasos, V.G. An Overview of End Effectors in Agricultural Robotic Harvesting Systems. Agriculture 2022, 12, 1240. [Google Scholar] [CrossRef]
  82. Wang, X.; Hao, W.; Zhang, J.; He, Z.; Ding, X.; Cui, Y. Development and Evaluation of a Soft End Effector for Kiwifruit Harvesting. N. Z. J. Crop Hortic. Sci. 2025, 1–29. [Google Scholar] [CrossRef]
  83. Fang, W.; Wu, Z.; Li, W.; Sun, X.; Mao, W.; Li, R.; Majeed, Y.; Fu, L. Fruit Detachment Force of Multiple Varieties Kiwifruit with Different Fruit-Stem Angles for Designing Universal Robotic Picking End-Effector. Comput. Electron. Agric. 2023, 213, 108225. [Google Scholar] [CrossRef]
  84. Ochoa, E.; Mo, C. Design and Field Evaluation of an End Effector for Robotic Strawberry Harvesting. Actuators 2025, 14, 42. [Google Scholar] [CrossRef]
  85. Navas, E.; Shamshiri, R.R.; Dworak, V.; Weltzien, C.; Fernández, R. Soft Gripper for Small Fruits Harvesting and Pick and Place Operations. Front. Rob. AI 2024, 10, 1330496. [Google Scholar] [CrossRef]
  86. Navas, E.; Fernández, R.; Sepúlveda, D.; Armada, M.; Gonzalez-de-Santos, P. Soft Grippers for Automatic Crop Harvesting: A Review. Sensors 2021, 21, 2689. [Google Scholar] [CrossRef]
  87. Gunderman, A.L.; Collins, J.A.; Myers, A.L.; Threlfall, R.T.; Chen, Y. Tendon-Driven Soft Robotic Gripper for Blackberry Harvesting. IEEE Robot. Autom. Lett. 2022, 7, 2652–2659. [Google Scholar] [CrossRef]
  88. Chiu, Y.-C.; Yang, P.-Y.; Chen, S. Development of the End-Effector of a Picking Robot for Greenhouse-Grown Tomatoes. Appl. Eng. Agric. 2013, 29, 1001–1009. [Google Scholar] [CrossRef]
  89. Visentin, F.; Castellini, F.; Muradore, R. A Soft, Sensorized Gripper for Delicate Harvesting of Small Fruits. Comput. Electron. Agric. 2023, 213, 108202. [Google Scholar] [CrossRef]
  90. Elfferich, J.F.; Shahabi, E.; Santina, C.D.; Dodou, D. BerryTwist: A Twisting-Tube Soft Robotic Gripper for Blackberry Harvesting. IEEE Robot. Autom. Lett. 2025, 10, 429–435. [Google Scholar] [CrossRef]
  91. Xu, Z.; Liu, J.; Wang, J.; Cai, L.; Jin, Y.; Zhao, S.; Xie, B. Realtime Picking Point Decision Algorithm of Trellis Grape for High-Speed Robotic Cut-and-Catch Harvesting. Agronomy 2023, 13, 1618. [Google Scholar] [CrossRef]
  92. Mu, L.; Cui, G.; Liu, Y.; Cui, Y.; Fu, L.; Gejima, Y. Design and Simulation of an Integrated End-Effector for Picking Kiwifruit by Robot. Inf. Process. Agric. 2020, 7, 58–71. [Google Scholar] [CrossRef]
  93. Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
  94. Milojević, A.; Linß, S.; Ćojbašić, Ž.; Handroos, H. A Novel Simple, Adaptive, and Versatile Soft-Robotic Compliant Two-Finger Gripper with an Inherently Gentle Touch. J. Mech. Rob. 2021, 13, 11015. [Google Scholar] [CrossRef]
  95. Dzedzickis, A.; Petronienė, J.J.; Petkevičius, S.; Bučinskas, V. Soft Grippers in Robotics: Progress of Last 10 Years. Machines 2024, 12, 887. [Google Scholar] [CrossRef]
  96. Graham, S.S.; Zong, W.; Feng, J.; Tang, S. Design and Testing of a Kiwifruit Harvester End-Effector. Trans. ASABE 2018, 61, 45–51. [Google Scholar] [CrossRef]
  97. Bhattarai, U.; Karkee, M. A Weakly-Supervised Approach for Flower/Fruit Counting in Apple Orchards. Comput. Ind. 2022, 138, 103635. [Google Scholar] [CrossRef]
  98. Denarda, A.R.; Crocetti, F.; Costante, G.; Valigi, P.; Fravolini, M.L. MangoDetNet: A Novel Label-Efficient Weakly Supervised Fruit Detection Framework. Precis. Agric. 2024, 25, 3167–3188. [Google Scholar] [CrossRef]
  99. Bac, C.W.; Van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead. J. Field Rob. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  100. Kapach, K.; Barnea, E.; Mairon, R.; Edan, Y.; Shahar, O.B. Computer Vision for Fruit Harvesting Robots: State of the Art and Challenges Ahead. Int. J. Comput. Vis. Robot. 2012, 3, 4. [Google Scholar] [CrossRef]
  101. Singh, R.; Nisha, R.; Naik, R.; Upendar, K.; Nickhil, C.; Deka, S.C. Sensor Fusion Techniques in Deep Learning for Multimodal Fruit and Vegetable Quality Assessment: A Comprehensive Review. J. Food Meas. Charact. 2024, 18, 8088–8109. [Google Scholar] [CrossRef]
  102. Barreto-Cubero, A.J.; Gómez-Espinosa, A.; Escobedo Cabello, J.A.; Cuan-Urquizo, E.; Cruz-Ramírez, S.R. Sensor Data Fusion for a Mobile Robot Using Neural Networks. Sensors 2022, 22, 305. [Google Scholar] [CrossRef]
  103. Wu, J.; Zhang, B.; Zhou, J.; Xiong, Y.; Gu, B.; Yang, X. Automatic Recognition of Ripening Tomatoes by Combining Multi-Feature Fusion with a Bi-Layer Classification Strategy for Harvesting Robots. Sensors 2019, 19, 612. [Google Scholar] [CrossRef] [PubMed]
  104. Ma, L.; He, Z.; Zhu, Y.; Jia, L.; Wang, Y.; Ding, X.; Cui, Y. A Method of Grasping Detection for Kiwifruit Harvesting Robot Based on Deep Learning. Agronomy 2022, 12, 3096. [Google Scholar] [CrossRef]
  105. Uppalapati, N.K.; Walt, B.; Havens, A.J.; Mahdian, A.; Chowdhary, G.; Krishnan, G. A Berry Picking Robot with A Hybrid Soft-Rigid Arm: Design and Task Space Control. In Proceedings of the Robotics: Science and Systems, Corvallis, OR, USA, 12–16 July 2020; Robotics: Science and Systems Foundation: College Station, TX, USA, 2020; p. 95. [Google Scholar]
  106. Du, X.; Meng, Z.; Ma, Z.; Lu, W.; Cheng, H. Tomato 3D Pose Detection Algorithm Based on Keypoint Detection and Point Cloud Processing. Comput. Electron. Agric. 2023, 212, 108056. [Google Scholar] [CrossRef]
  107. Tafuro, A.; Adewumi, A.; Parsa, S.; Amir, G.E.; Debnath, B. Strawberry Picking Point Localization Ripeness and Weight Estimation. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 2295–2302. [Google Scholar]
  108. Cheng, C.; Fu, J.; Su, H.; Ren, L. Recent Advancements in Agriculture Robots: Benefits and Challenges. Machines 2023, 11, 48. [Google Scholar] [CrossRef]
  109. Kaleem, A.; Hussain, S.; Aqib, M.; Cheema, M.J.M.; Saleem, S.R.; Farooq, U. Development Challenges of Fruit-Harvesting Robotic Arms: A Critical Review. Agriengineering 2023, 5, 2216–2237. [Google Scholar] [CrossRef]
  110. Li, T.; Xie, F.; Qiu, Q.; Feng, Q. Multi-Arm Robot Task Planning for Fruit Harvesting Using Multi-Agent Reinforcement Learning. In Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, USA, 1–5 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 4176–4183. [Google Scholar]
  111. Shome, R.; Bekris, K.E. Anytime Multi-Arm Task and Motion Planning for Pick-and-Place of Individual Objects via Handoffs. In Proceedings of the 2019 International Symposium on Multi-robot and Multi-agent Systems (MRS), New Brunswick, NJ, USA, 22–23 August 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 37–43. [Google Scholar]
  112. Mateu-Gomez, D.; Martínez-Peral, F.J.; Perez-Vidal, C. Multi-Arm Trajectory Planning for Optimal Collision-Free Pick-and-Place Operations. Technologies 2024, 12, 12. [Google Scholar] [CrossRef]
  113. Chu, X.; Hu, Q.; Zhang, J. Path Planning and Collision Avoidance for a Multi-Arm Space Maneuverable Robot. IEEE Trans. Aerosp. Electron. Syst. 2018, 54, 217–232. [Google Scholar] [CrossRef]
Figure 1. Berry harvesting machinery [14]. Reproduced with permission from Brondino, L.; Borra, D.; Giuggioli, N.R.; Massaglia, S., Agriculture; published by MDPI, 2021.
Figure 2. Wolfberry harvester: 1—guide rod, 2—external collection module, 3—damping module, 4—internal collection module, 5—harvesting mechanism, 6—operator seat, 7—canopy shading panel, 8—wind selector, 9—harvest accumulation and storage module, 10—mobile chassis frame, 11—longitudinal conveyor system, 12—transverse conveyor system, 13—receiving plate, 14—driven wheel [16]. Reproduced with permission from Wang, Y.; Yang, C.; Gao, Y.; Lei, Y.; Ma, L.; Qu, A., Agriculture; published by MDPI, 2024.
Figure 3. Architecture of berry harvesting robots.
Figure 4. Multi-row strawberry harvesting robot.
Figure 5. Octinion strawberry harvesting robot.
Figure 6. Detection performance of different versions of YOLO on white grape bunches [62]. Reproduced with permission from Sozzi, M.; Cantalamessa, S.; Cogato, A.; Kayad, A., Agronomy; published by MDPI, 2022.
Figure 7. Dual robotic arm cooperative harvesting strategy [78]: (A) initial working state, (B–G) cooperative harvesting state, (H) harvesting completion state. Reproduced with permission from Jiang, Y.; Liu, J.; Wang, J.; Li, W.; Peng, Y.; Shan, H., Front. Plant Sci.; published by Frontiers, 2022.
Figure 8. Zig-zag motion trajectory for strawberry fruit picking: while moving upward, the gripper moves left and right to push obstacles aside on both sides: (a) initial working state, (b,c) obstacle avoidance and picking state, (d) picking completion state [79]. Reproduced with permission from Xiong, Y.; Ge, Y.; From, P.J., Comput. Electron. Agric.; published by Elsevier, 2020.
Figure 9. Cable-driven gripper for strawberry harvesting [20]. Reproduced with permission from Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V., Comput. Electron. Agric.; published by Elsevier, 2019.
Figure 10. Twisting-tube end-effector for blackberry harvesting [90]. Reproduced with permission from Elfferich, J.F.; Shahabi, E.; Santina, C.D.; Dodou, D., IEEE Robot. Autom. Lett.; published by IEEE, 2025.
Figure 11. Visually servoed soft gripper for blackberry harvesting [26]. Reproduced with permission from Qiu, A.; Young, C.; Gunderman, A.L.; Azizkhani, M.; Chen, Y.; Hu, A.-P., IEEE International Conference on Robotics and Automation (ICRA); published by IEEE, 2023.
Figure 12. Electric end-effector for tomato clusters [80]. Reproduced with permission from Rong, J.; Hu, L.; Zhou, H.; Dai, G.; Yuan, T.; Wang, P., J. Field Rob.; published by Wiley, 2024.
Table 1. Fruit detection algorithms based on DL.

Berry Applied | Basic Model | Detection Rate (%) | Inference Speed (ms/Image) | Reference
Strawberry | YOLOv8 | 96.0 | 92@640 × 480 px | [55]
Strawberry | Mask R-CNN | 98.41 | - | [56]
Strawberry | YOLOv5 | 98.0 | 83@640 × 480 px | [57]
Strawberry | YOLOv5 | 89.7 | 7.30@512 × 512 px | [58]
Tomato | CSPNet | 84.7 | 40.32@416 × 416 px | [59]
Tomato | Faster R-CNN | 88.6 | 180@416 × 416 px | [60]
Tomato | SSD | 85.3 | 124.75@300 × 300 px | [61]
Grape | YOLO | 79.6 | 31.25@416 × 416 px | [62]
Grape | YOLOv5 | 90.5 | 10.1@1280 × 720 px | [63]
Grape | YOLOv7 | 93.4 | 1.596@1024 × 473 px | [64]
Kiwifruit | YOLOv8 | 91.5 | - | [65]
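For readers who want to reproduce the kind of detection step summarized in Table 1, the following minimal sketch shows a generic YOLO-style inference loop in Python. It is illustrative only: the open-source ultralytics package, the weights file strawberry_yolov8n.pt, and the image path are assumptions for demonstration and are not taken from the studies cited in the table.

```python
# Illustrative sketch only (not from the cited studies): a generic YOLO-style
# inference loop for berry detection. Assumes the open-source "ultralytics"
# package and a hypothetical fine-tuned weights file "strawberry_yolov8n.pt".
from ultralytics import YOLO
import cv2

model = YOLO("strawberry_yolov8n.pt")                # hypothetical strawberry-tuned weights
frame = cv2.imread("greenhouse_row.jpg")             # hypothetical 640 x 480 RGB frame

results = model.predict(frame, imgsz=640, conf=0.5)  # single forward pass
for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()            # bounding box in pixel coordinates
    score = float(box.conf[0])                       # detection confidence
    print(f"fruit at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), confidence {score:.2f}")
```

The inference speeds reported in Table 1 depend strongly on input resolution and hardware, which is why the table lists the resolution alongside each timing figure.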
Table 2. End-effectors for berry harvesting robots.

Type of Berry | End-Effector Type | Harvest Speed | Harvest Success Rate | Reference
Strawberry | Soft gripper | - | 82% | [89]
Strawberry | Cable-driven gripper | 7.5 s/fruit | 59.0% | [20]
Strawberry | Pneumatic gripper | 2.8 s/fruit | 94.74% | [84]
Grape | Cut-clip end-effector | 8.45 s/cluster | 83% | [78]
Grape | Disc knife cutting end-effector | 6.18 s/cluster | 92.78% | [91]
Blackberry | Twisting-tube soft gripper | - | 82% | [90]
Blackberry | Tendon-driven soft gripper | - | 88% | [26]
Blackberry | Soft gripper | 4.8 s/fruit | 95.24% | [87]
Kiwifruit | Cavity clamping end-effector | 5 s/fruit | 94.2% | [92]
Kiwifruit | Soft end-effector | 6.7 s/fruit | 86.36% | [82]
Tomato | Soft gripper with negative pressure suction | 74.6 s/fruit | 95.3% | [88]
Tomato | Cavity clamping end-effector | 6.4 s/fruit | 69.4% | [93]
Blueberry | Two-finger soft gripper | 7.5 s/fruit | - | [94]
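As a simple way to compare the entries in Table 2, the sketch below converts cycle time and success rate into an approximate effective throughput (successfully picked fruits per hour), assuming continuous operation with the end-effector cycle as the only bottleneck. The numbers are taken from the table, but the comparison itself is illustrative and not part of the cited studies.

```python
# Illustrative sketch only: effective throughput estimated from a few Table 2 entries,
# assuming continuous operation with the end-effector cycle as the only bottleneck.
end_effectors = {
    "Pneumatic gripper (strawberry) [84]":    {"cycle_s": 2.8, "success": 0.9474},
    "Cable-driven gripper (strawberry) [20]": {"cycle_s": 7.5, "success": 0.590},
    "Soft gripper (blackberry) [87]":         {"cycle_s": 4.8, "success": 0.9524},
    "Cavity clamping (kiwifruit) [92]":       {"cycle_s": 5.0, "success": 0.942},
}

for name, spec in end_effectors.items():
    attempts_per_hour = 3600.0 / spec["cycle_s"]     # picking attempts per hour
    effective = attempts_per_hour * spec["success"]  # successful picks per hour
    print(f"{name}: ~{effective:.0f} fruits/h")
```

Under these assumptions the pneumatic strawberry gripper [84] reaches roughly 1200 successful picks per hour, while the cable-driven gripper [20] manages only about 280, which illustrates why cycle time and success rate need to be read together.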
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
