Article

Branch Interference Sensing and Handling by Tactile Enabled Robotic Apple Harvesting

1 Laboratory of Motion Generation and Analysis, Faculty of Engineering, Monash University, Clayton, VIC 3800, Australia
2 Robotics and Autonomous Systems Group, Data61, CSIRO, Brisbane, QLD 4069, Australia
* Author to whom correspondence should be addressed.
Agronomy 2023, 13(2), 503; https://doi.org/10.3390/agronomy13020503
Submission received: 31 December 2022 / Revised: 8 February 2023 / Accepted: 8 February 2023 / Published: 9 February 2023
(This article belongs to the Special Issue AI, Sensors and Robotics for Smart Agriculture)

Abstract

In the dynamic and unstructured environment where horticultural crops grow, obstacles and interference frequently occur but are rarely addressed, which poses significant challenges for robotic harvesting. This work proposes a tactile-enabled robotic grasping method that combines deep learning, tactile sensing, and soft robotics. By integrating fin-ray fingers with embedded tactile sensing arrays and customized perception algorithms, the robot gains the ability to sense and handle branch interference during the harvesting process, thus reducing potential mechanical fruit damage. Experimental validation demonstrated an overall grasping status detection success rate of 83.3–87.0% and a promising interference handling method. The proposed grasping method can also be extended to broader robotic grasping applications wherever undesirable foreign object intrusion needs to be addressed.

1. Introduction

Researchers around the world have been applying robots to selective horticultural crop harvesting tasks for over three decades. However, the complexity of the dynamic and unstructured growing environment makes robotic harvesting very challenging [1]. Studies have focused on the optimization of grasping pose [2], manipulation strategies [3,4], and end-effector design [5,6]. A major issue brought about by this environmental complexity is the uncertain fruit damage rate [7]. Various types of fruit damage during robotic harvesting and automated post-harvest handling [8] have been reported in the literature: Silwal's apple harvester recorded a 6.3% skin damage rate [9]; Baeten's suction cup end-effector reported an overall 30% apple damage rate in field tests [10]; and Lehnert [11] and Williams [12] disclosed damage rates of 26% and 25% on sweet pepper and kiwifruit, respectively. Branches, twigs, trellis wires, and sprinkler lines tend to be the major obstacles leading to fruit damage [7] when a robot implements pinching, clamping, or grasping actions combined with pulling, bending, and twisting motions [13]. Such damage, including mechanical abrasion, bruising, and puncture, accelerates fruit decay and results in a much shorter shelf life [14]. Avoiding fruit damage during robotic harvesting is challenging. Some researchers use visual perception [15,16,17,18] to segment fruits and branches and implement specific strategies to detach the fruit; however, the accuracy and robustness of visual perception are limited in such complex environments. Additional perception approaches, especially tactile sensing, could therefore be a promising supplement to improve the quality of robotic harvesting automation [7]. Properly designed tactile sensing technologies could also be applied to the broader robotic automation field and thus benefit various industries, including agriculture, food processing, and logistics.

2. Literature Review

To enable robots to harvest fruits, various end-effectors have been proposed. Among these solutions, three modules are commonly applied: the suction cup, the cutter, and the robot finger. Lehnert [11], Shiigi [19], Feng [20], and Hayashi [21] applied a suction cup to hold the fruit before cutting the stem, while Baeten et al. [22] used a large suction cup to pull apples off the tree. Kondo [23] and Feng [24] developed mechanical stem cutters for tomatoes, while Bachche et al. [25] applied a thermal cutter to detach sweet peppers. In terms of grippers with fingers, Williams [12] and the Traptic team [26] adopted robotic claws to harvest kiwifruit and strawberries, respectively. The Virgo tomato harvester team developed three delicate fingers capable of gently collecting cherry tomatoes [27], while Davidson et al. [28] designed three adaptive tendon-driven fingers to perform a spherical power grasp with a pull-twist motion on apples. The aforementioned solutions have proven capable of detaching fruits from plants; however, they lack the sensing ability to adapt and react to environmental variation.
Many studies have embedded tactile sensing into robotic grippers to enable tactile perception [29]. Donlon et al. [30] integrated a high-resolution tactile sensor inspired by GelSight [31] into a robot finger that can sense the shape and texture of the target object. Wang et al. [32] developed a 3 × 3 flexible tactile sensing array that can extract three-axis contact forces during object grasping tasks. Zhu et al. [33] presented a highly stretchable tactile pressure sensor, and Yang et al. [34] proposed a tactile-enabled robot finger that can perceive normal force and torque to assist grasping manipulation. Besides approaches relying purely on the tactile sense, researchers have also conducted pilot studies integrating vision and tactile sensing [35]: visual data can be used to build the five-dimensional grasp representation, while tactile feedback can be used to assess the grasping quality.
Only a few works have applied tactile sensing to fruit handling. Zhang et al. [36] integrated their two-finger end-effector with 4 × 6 tactile sensing arrays that detect fruit hardness by perceiving the contact force between finger and fruit. Cortés et al. [37] demonstrated a non-destructive approach to detecting mango ripeness by utilizing both tactile sensing data and near-infrared reflectance spectroscopy. Zhou et al. [38] utilized tactile information collected by piezoresistive sensors to detect fruit slip during the harvesting process. However, Zhang's tactile solution requires a rigid finger base, which limits the contact area with the fruit and may thus cause fruit damage through force concentration; Cortés' work aims at detecting fruit ripeness after harvest, which is not applicable to robotic harvesting; and Zhou's learning-based slip detection presents an example of tactile sensing in fruit harvesting, but its algorithm is limited to slip detection only.
In this research, the authors present a novel tactile-enabled multi-DoF end-effector to tackle the challenge of obstacle (mainly branch) interference in robotic fruit harvesting. This end-effector is an upgraded version of the authors' 2-DoF apple harvesting gripper [39], which went through two years of field tests and achieved over a 70% harvest success rate across 764 apples (70.7% on 603 Pink Lady apples and 71.4% on 161 Smitten apples) in an apple orchard in Victoria. The proposed gripper adds independent finger actuation and tactile sensing on top of the previous version. By combining the upgraded hardware with a customized tactile perception framework, the gripper is designed to classify different grasping statuses and react accordingly. Firstly, to classify the grasping statuses, a Deep-touch convolutional network with three subnets is proposed to perceive the stress distribution on the soft fingers. A conventional perception algorithm based on moving variance is then applied to localize the interference to a specific finger. The authors then comprehensively evaluate the performance of the presented end-effector and its perception framework under various grasping conditions, including a simulated harvesting experiment using a fruit harvesting robot.
The rest of this paper proceeds as follows: Section 3 illustrates the design requirements and fabrication procedures of the proposed end-effector, including the sensing hardware, together with the definitions of the grasping statuses and the development of the sensing algorithms; Section 4 presents the experiment setup and the results of the proposed algorithms; finally, conclusions are presented in Section 5.

3. Methods and Materials

3.1. End-Effector Design

The end-effector design is expected to reduce fruit damage under frequently occurring branch interference. To control potential mechanical damage during the harvesting process, the authors identified three major sources of fruit damage from the literature: impact, vibration (or friction) forces, and compression [8,40]. For robotic harvesting, this implies two design requirements. On the one hand, all end-effector components that may directly contact the fruit need to be cushioned to reduce impact, friction, and compression between the gripper and the fruit. On the other hand, obstacles such as branches, twigs, trellis frames, and wires can introduce all three damage sources: an impact such as a fruit-obstacle collision can lead to both bruising and puncture; immoderate friction is expected when a gripper twists with a branch grasped in hand together with the fruit; and excessive compression arises from force concentration when a branch or twig is pressed onto the fruit by the gripper's grasping force. It is therefore of significant importance to endow the robotic gripper with obstacle-handling ability.
With these design requirements, a multi-DoF hybrid gripper with four flexible fingers independently actuated is proposed, as presented in Figure 1.
To reduce potential mechanical damage, the authors focused on the following two aspects:
Minimizing potential force concentration: the fingers are flexible, performing a gentle grasp while adapting to various fruit shapes. Specifically, a fin-ray-effect 3D-printed finger skeleton, printed in thermoplastic polyurethane (TPU), provides the essential shape adaptability. A layer of silicone skin made of ELASTOSIL RTV3428 is then cast onto the finger surface; it improves grasp stability by increasing friction and helps prevent the force concentration that causes mechanical damage. A two-bellow suction cup provides the necessary compliance and damping while guiding an off-centre target back to the gripper centre.
Increasing dexterity against obstacle interference: four independent actuators and multiple tactile sensors are integrated so that the four fingers can sense obstacles and act independently to deal with interference. Specifically, after the target fruit has been grasped, the tactile data are perceived to determine the grasping status. If obstacle interference is sensed and localized to one specific finger, that finger is released independently and the picking process proceeds with the remaining fingers. If interference is still detected, the robot's central control adjusts the approach angle of the end-effector and repeats the harvest attempt.
The combination of tactile sensors and independent fingers also provides the robot with a self-protection function and the potential to judge the grasping status, as illustrated in the following sections.
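Putting the two aspects together, the reaction logic reads as a simple loop. The sketch below is illustrative only; every function argument is a hypothetical handle into the gripper/robot control stack, not the authors' actual API:

```python
def attempt_pick(classify_grasp, locate_interference, release_finger,
                 release_all, adjust_approach_angle, continue_picking,
                 max_attempts=3):
    """Hedged sketch of the sensing-triggered reaction policy described above."""
    for _ in range(max_attempts):
        status = classify_grasp()            # 'good' | 'null' | 'obstructed' | 'branch'
        if status == "good":
            continue_picking()               # detach and carry to the collection point
            return "picked"
        if status == "null":
            release_all()                    # nothing in hand: skip the drop-off trip
            return "no-fruit"
        if status == "branch":
            release_finger(locate_interference())  # free only the interfered finger
            if classify_grasp() != "branch":
                continue_picking()           # pick with the remaining fingers
                return "picked"
        release_all()                        # obstructed or still interfered:
        adjust_approach_angle()              # protect the fingers, re-approach, retry
    return "gave-up"
```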

Tactile Design

Twenty-four piezoresistive tactile sensors (RX-M0404S) were utilized in this work for their low power consumption, cost efficiency, and reliability [41]. Each finger carries six tactile arrays embedded beneath a layer of silicone skin, as shown in Figure 2. Each tactile array is 14 mm × 14 mm in size with 4 × 4 taxel elements. An external force between 0.2 N and 20 N triggers a resistance change in the taxel units, which is measured by a data processing circuit.
To prepare the data for perception, a data processing circuit was implemented. Since measuring the 16 × 6 = 96 taxels on each finger simultaneously could introduce significant crosstalk among the 384 taxels in total, a signal isolation circuit [42] was adopted. An electrical-grounding-based readout architecture reads the sensing units one after another: each taxel can be regarded as a variable resistor, and the 384 taxels on the four fingers form a 24 × 16 variable-resistor array. One side of each taxel is fed a constant voltage, while the other side is connected to a shift register (which can selectively output either that voltage or 0 V); the shift register grounds the taxels one after another, so the potential crosstalk between adjacent sensing units is minimized. A Cypress PSoC controller collects all the measured data via selected multiplexer channels.
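The scan itself can be summarized in a few lines. Below is a minimal sketch, in which select_grounded_line and read_channel are hypothetical stand-ins for the PSoC shift-register and multiplexer drivers rather than real firmware calls:

```python
def scan_frame(select_grounded_line, read_channel, n_lines=24, n_channels=16):
    """One frame of the 24 x 16 taxel array: ground one shift-register line
    at a time and read the 16 multiplexer channels on that line, so only
    the grounded taxels carry current and crosstalk is suppressed."""
    frame = []
    for line in range(n_lines):
        select_grounded_line(line)            # shift register grounds this line only
        frame.append([read_channel(c) for c in range(n_channels)])
    return frame                              # 24 x 16 list of voltage readings
```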

3.2. Tactile Perception

3.2.1. Grasp Status Definition

To achieve branch interference detection, different grasp statuses need to be differentiated. In this work, the authors define the four grasping statuses most common in the field: good grasp, null grasp, finger-obstructed grasp, and branch-interfered grasp, as shown in Figure 3. Specifically: a good grasp is one where all fingers of the gripper are in stable contact with the target fruit and no foreign obstacle is grasped; a null grasp is one where no fruit is obtained; a finger-obstructed grasp is one where one or more fingers are obstructed by stiff branches or other rigid obstacles such as trellis wires or support beams; and a branch-interfered grasp is one with a branch caught between a finger and the target fruit. Although more than one branch may sit next to the target apple in the orchard, the scenario where a single branch intrudes into the gripper's workspace is observed far more frequently, and overcrowded branches tend to be pruned before the harvest season [43]; the authors therefore focus on single-branch interference in this work.
When conducting robotic harvesting tasks in the orchard, failure to identify a null grasp increases the average cycle time, as the robot moves the empty gripper to the fruit collection point; failure to detect a finger-obstructed grasp may damage the robotic finger or even the gripper; and failure to sense the frequently occurring branch-interfered grasp risks fruit damage, as the grasping force applied through the branch concentrates on the apple skin and can result in bruising or puncture.

3.2.2. Data Processing

A 24 × 16 × t matrix is created to collect the data provided by the 384 taxels, where t is the number of monitored time frames, sampled at 60 ms intervals. This matrix is normalized and reshaped into a 384 × t matrix ($M_0$), which is further split into four 96 × t matrices so that the tactile output of every taxel on each finger can be monitored as a pressure-value time series, as shown in Figure 4.
The authors then smooth the data by computing the moving average ($v_{i,k}$) over every four frames of the normalized data; the filtered matrix $M_1$ is fed into the proposed neural network, while a moving-variance matrix $M_2$ is computed element-wise for interference localization, under the assumption that the rate of pressure change can distinguish a pressure increase introduced by a fruit from one introduced by a branch. Specifically:
$$M_0 = \begin{bmatrix} p_{1,1} & \cdots & p_{1,t} \\ \vdots & \ddots & \vdots \\ p_{384,1} & \cdots & p_{384,t} \end{bmatrix}$$

where $p_{i,j}$ is the voltage value measured at the $i$th taxel and the $j$th time frame. The smoothed matrix is

$$M_1 = \begin{bmatrix} v_{1,1} & \cdots & v_{1,k} \\ \vdots & \ddots & \vdots \\ v_{384,1} & \cdots & v_{384,k} \end{bmatrix}$$

where $v_{i,k}$ is the moving average of the voltage values $p_{i,j}$ over every four frames:

$$v_{i,k} = \frac{1}{4} \sum_{j=t-3}^{t} p_{i,j}, \quad t \geq 4$$

The moving variance matrix is

$$M_2 = \begin{bmatrix} \sigma_{1,1}^{2} & \cdots & \sigma_{1,k}^{2} \\ \vdots & \ddots & \vdots \\ \sigma_{384,1}^{2} & \cdots & \sigma_{384,k}^{2} \end{bmatrix}$$

where

$$\sigma_{i,k}^{2} = \frac{1}{4} \sum_{j=t-3}^{t} \left( p_{i,j} - v_{i,k} \right)^{2}, \quad t \geq 4$$
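For concreteness, the following numpy sketch implements the windowed smoothing and variance defined above. It is a minimal sketch: the function name is ours, while the window of four frames and the 384 × t layout follow the paper.

```python
import numpy as np

def smooth_and_variance(m0, window=4):
    """Compute the moving-average matrix M1 and moving-variance matrix M2
    from the normalized 384 x t tactile matrix M0, over every `window`
    consecutive frames (window = 4 in the paper)."""
    n_taxels, t = m0.shape
    k = t - window + 1                       # 24 raw frames -> 21 smoothed frames
    m1 = np.empty((n_taxels, k))
    m2 = np.empty((n_taxels, k))
    for i in range(k):
        win = m0[:, i:i + window]            # the four most recent frames
        m1[:, i] = win.mean(axis=1)          # v_{i,k}
        m2[:, i] = win.var(axis=1)           # sigma^2_{i,k} (1/4 normalization)
    return m1, m2
```

Applied to the 24 raw frames collected per grasp (Section 4.2), this yields the 21 smoothed frames used as network input.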

3.2.3. Grasping Status Classification

Three assumptions were applied to differentiate the four grasping statuses. Firstly, the pressure values on the fingers during a null grasp should be considerably smaller than in the other three statuses. Secondly, a finger that gets obstructed before contacting the target fruit tends to feed back a major value change earlier than the fingers that encounter no obstacle. Thirdly, in a branch-interfered grasp, the rate of pressure change in the branch-contacting area tends to be sharper than in the fruit-contacting area, since the branch reduces the contact area between the finger and the fruit and thus makes force concentration more likely.
Based on this analysis, the different grasping statuses can be classified by extracting features from the pressure distribution matrix of the tactile sensing arrays. This work utilizes a CNN model named Deep-touch to perform the grasping status prediction, as shown in Figure 5. Deep-touch consists of three networks. Firstly, a local finger network handles the data collected from each finger separately; this network applies a ResNet-18 model with the pooling-layer kernel changed to 2 × 1, and each local finger network generates a 1 × 4 vector. Secondly, a global network, also based on ResNet-18, perceives the data from all fingers as a single image: its input is the pressure map of all fingers, and its output is a 1 × 256 feature vector. The local outputs are then concatenated with the global output to form a 1 × 512 feature vector. This vector is fed into a fully-connected network, which fuses the features extracted from the local and global networks and provides the final prediction of the grasping status.
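To make the layout concrete, a PyTorch sketch follows. It is a hedged reconstruction, not the authors' released code: the stated sizes (a 1 × 4 vector per local network, a 1 × 256 global vector, a 1 × 512 fused vector) do not add up exactly, so this sketch gives each local branch a 64-dimensional output (4 × 64 + 256 = 512); it also assumes each finger occupies six of the 24 rows of the pressure map and omits the 2 × 1 pooling-kernel change.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

def tactile_resnet(out_dim):
    """ResNet-18 backbone adapted to single-channel pressure maps."""
    net = resnet18(weights=None)
    net.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    net.fc = nn.Linear(net.fc.in_features, out_dim)  # feature head, not a classifier
    return net

class DeepTouch(nn.Module):
    """Sketch of Deep-touch: four local finger networks, one global
    network, and a fully-connected fusion head (sizes partly assumed)."""
    def __init__(self, num_classes=4, local_dim=64, global_dim=256):
        super().__init__()
        self.local_nets = nn.ModuleList([tactile_resnet(local_dim) for _ in range(4)])
        self.global_net = tactile_resnet(global_dim)
        self.head = nn.Sequential(
            nn.Linear(4 * local_dim + global_dim, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),  # good / null / obstructed / branch
        )

    def forward(self, pressure):                   # pressure: (B, 1, 24, 16)
        fingers = torch.chunk(pressure, 4, dim=2)  # four (B, 1, 6, 16) finger maps
        feats = [net(f) for net, f in zip(self.local_nets, fingers)]
        feats.append(self.global_net(pressure))    # whole-hand pressure features
        return self.head(torch.cat(feats, dim=1))  # fused vector -> 4 classes
```

At inference, each of the 21 smoothed frames of a grasp would be classified this way, with the final status decided by majority vote (Section 4.2).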

3.2.4. Interference Localization

Since branch interference during grasping has the highest chance of causing fruit damage among the four scenarios, it is important to locate the interference.
To determine which finger is interfered with by the branch, the moving variance is used to distinguish the interfered area, on the assumption that finger-branch contact triggers a much faster pressure change than finger-apple contact during grasping. Specifically, the taxel with the largest moving variance on each finger is compared with the selected taxels on the other fingers. As mentioned above, the 384 × t matrix $M_2$ is divided into four 96 × t matrices, each representing the pressure-value time series of one finger; by comparing the moving variances of the 96 taxels per finger, the taxel with the maximum moving variance can be located, as shown in Figure 6. In this example, taxel 322 outputs the maximum pressure variance among the 384 values, so the algorithm estimates that branch interference occurred at the middle part of finger 4.
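Given $M_2$, the localization step reduces to an argmax. A minimal numpy sketch under those definitions (the indexing convention is ours):

```python
import numpy as np

def localize_branch(m2, taxels_per_finger=96):
    """Find the finger whose taxel shows the largest moving variance.
    m2: (384, k) moving-variance matrix from the data processing step."""
    peak = m2.max(axis=1)                    # strongest variance each taxel saw
    taxel = int(np.argmax(peak))             # 0-based taxel index
    finger = taxel // taxels_per_finger + 1  # fingers numbered 1..4
    return finger, taxel + 1                 # 1-based taxel number: 322 -> finger 4
```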

4. Experiments and Results

4.1. Experiment Setup

To validate the proposed method, experiments were conducted at both the gripper subsystem level and the harvesting robot system level. For the gripper subsystem experiments, as shown in Figure 7, grasp tests were conducted by fixing the proposed gripper on a desk in the lab and manually simulating the four target scenarios. For the null grasp, artificial branch-and-leaf sets (with no fruit attached) were fed to the gripper. For the good grasp, 12 apples of different varieties, numbered ① to ⑫ as shown in Figure 7a, were used as target objects. The size and weight of each apple are presented in the table of Figure 8. (The authors recognize the wide variation in physical properties among apples and selected seven varieties popular in Australia as a start: two Jazz, two Golden Delicious, two Royal Gala, one Granny Smith, two Red Delicious, two Pink Lady, and one Fuji. The shapes and sizes of these apples were intentionally chosen, as shown in Figure 7a, to cover common apple shapes.) Each grasp was performed with two typical apple orientations: stem pointing up and stem horizontal. For the branch interfered grasp, branches were placed between the fruit and the finger in three typical orientations (−45°, 0°, and 45°, as shown in Figure 7b) before grasping. For the finger obstructed grasp, a 3D printed finger stopper was placed to constrain the bending of one finger during grasping, as seen in Figure 7c. Considering that the shape (size) and weight of apples ②, ④, ⑥, and ⑪ are similar to those of apples ①, ⑩, ⑦, and ⑩, apples ②, ④, ⑥, and ⑪ were not included in this test.
It is important to note that, for all grasp tests, to prevent the grasping force from damaging the fruit, the finger-apple contact pressure was tuned to be under 0.31 MPa [38] by limiting the maximum actuation air pressure to between 0.6 MPa and 0.9 MPa.

4.2. Experiment on Status Detection

Two hundred grasp tests were conducted to examine the performance of the proposed Deep-touch CNN. To collect data for training and validation, a further 800 grasps were performed; the ratio of the training, validation, and test sets was 60%:20%:20%. For every 200 grasps, 96 were grasps with branch interference, 48 were good grasps, 30 were null grasps, and 26 were finger-obstructed grasps. The interval between grasps was set to 5 s to allow sufficient time for apple placement. Each grasp lasted 1440 ms, from the grasp command being sent to the finger-open command being triggered: the first 480 ms was assigned as waiting time for the sensor values to stabilize, while the latter 960 ms was regarded as the stable period. With a 60 ms sampling interval, 24 frames of tactile data were collected per grasp and processed into 21 smoothed frames (via the data processing method above), which served as the input to the Deep-touch network. In summary, there were 12,600, 4200, and 4200 frames of data for training, validation, and test, respectively.
For each of the 200 test grasps, 21 frames of data were fed into the neural network, yielding 21 classification results; the final grasping status was determined by a majority vote over these 21 outputs.
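A minimal sketch of that vote (implementation ours):

```python
from collections import Counter

def vote(frame_labels):
    """Majority vote over the 21 per-frame classifications of one grasp."""
    return Counter(frame_labels).most_common(1)[0][0]

# e.g. vote(["branch"] * 12 + ["good"] * 9) -> "branch"
```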
The experiment results are presented in the table of Figure 8 and summarized in Table 1. Besides a 96.6% classification accuracy on null grasps, the proposed algorithm achieved 92.3% detection accuracy on finger-obstructed grasps and 87.0% overall. Most importantly, it correctly classified 80 of the 96 branch interference cases, laying a promising foundation for branch interference handling. Figure 8 also shows that: (1) the algorithm is significantly more accurate at detecting branches perpendicular to the fingers (44/48, or 91.7%) than branches angled at −45° or 45° (36/48, or 75.0%); (2) most failures occurred on apples ⑤ (3 failures) and ⑧ (6 failures), the smallest and the most irregularly shaped of the 12 tested apples. In summary, the proposed method presents promising grasping status recognition and branch interference detection abilities, but it encounters difficulties in identifying branch interference on small or irregularly shaped apples.
After the 96 branch interfered grasp tests, the 12 apples were monitored for 5 days to check for noticeable damage. The authors spotted no bruise damage but two skin punctures (both on apple ①, as shown in Figure 8). To clarify, a bruise refers to discoloration of the fruit flesh without skin damage, while a puncture is a small hole in the fruit skin made by a sharp point. The two punctures on apple ① are believed to relate to the branch setting: as shown in Figure 7a, a branch was placed with a sharp spur facing the skin of apple ① (bottom right), and when the fingers pressed the branch, the spur was forced into the apple skin and caused the punctures. Not all grasp tests on apple ① ended in punctures, because the branch was attached to the apple via a patch of Blu Tack (an adhesive putty), which allowed the branch to rotate under external force and thus change the orientation of the sharp spur. To assess bruise damage, the authors repeated the grasp tests without the sensing and sensing-triggered reactions. For each grasp, a branch was placed between one finger and the apple, with one end of the branch fixed by clamps to simulate a branch attached to the trunk. After grasping, the gripper was rotated along its longitudinal axis to simulate the orchard case where the gripper twists the target apple with a branch in the grasp. The results are recorded as "Bruise in controlled tests" in Figure 8: 8 bruises were observed among the 48 grasps in the controlled test (grasps with a 0° branch angle were not conducted, as they are less common than angled branches in practice). Figure 9 presents typical bruise damage from the controlled test. These results indicate that the proposed method has the potential to prevent, or at least reduce, bruising damage in robotic harvesting applications.

4.3. Experiment on Interference Localization

Ninety-six grasps were conducted to test the interference localization function. The experiment setup is presented in Figure 10: the branch was placed between the target apple and different parts (distal, middle, and proximal) of each finger to validate whether the proposed algorithm could localize the interference and trigger the desired reaction. Note that in this experiment the branch was always placed perpendicular to the finger; there were no −45° or 45° settings as in the grasp status experiment.
The experiment results are presented in Table 2: the algorithm demonstrated 22/24, 21/24, 18/24, and 19/24 localization accuracy on fingers 1 to 4, respectively, for an overall recognition rate of 83.3%. The algorithm performed best at detecting branches at the middle part of the fingers, because the finger applies higher pressure there than at the distal and proximal parts (due to the mechanics of the fin-ray-effect fingers). Higher pressure triggers a sharper resistance change in the taxels, which makes it easier for the algorithm to localize the interference.
Once the interference was localized to a specific finger, that finger was released to free the branch. The gripper control system performed robustly, reacting correctly every time interference was detected.

4.4. Experiment on the Robotic Harvesting System

To examine whether the proposed gripper design and sensing algorithms function accurately in a robotic harvesting system, the authors integrated the gripper and its tactile sensing algorithms into an apple harvesting system. The layout of this system is presented in Figure 11. The proposed gripper was mounted at the end joint of a Universal Robots UR5 arm, with data-transmitting cables (Flat Flexible Cables, or FFCs) and actuation tubes attached along the arm. The robot arm was fixed on a mobile base together with the control box and the data processing box. An Intel RealSense D435 RGB-D camera was set on top of the gripper to provide machine vision for the robot system. The color images collected by the D435 camera are processed for fruit recognition and segmentation by a geometry-aware neural network [39]; the resulting apple positions are then sent to the robot control system to plan the approach-and-pick motion.
The system-level experiment was conducted in the lab based on two main considerations: (1) as proof-of-concept research, the number of variables needed to be properly controlled, and the excessive variables of the orchard environment might introduce undesirable noise; for example, leaves might be grasped by one or more fingers together with the target fruit and the branch, which could significantly affect sensing performance; (2) the dynamic, unstructured orchard environment might pose risks to the newly developed sensing hardware, especially the data-transmitting FFC cables (shown in Figure 11), which might tangle in branches and damage the sensors. It was safer and more practical to validate the concept in a controlled indoor environment first and to use the data and experience gained to optimize the system before bringing it to the orchard.
To set up the test environment, dozens of off-the-shelf artificial apple branches and leaves were manually attached to a tree-shaped wooden frame. The branches and leaves were set in various orientations to simulate an orchard apple tree canopy. Six apples (numbered ① to ⑥) were hung on six different branches, as shown in Figure 12. Specifically, the angles between the trunk and the six branches range from 45° to 90°, while the angles between each branch and the tree base plane (the plane perpendicular to the robot-to-trunk line) were set within the range of 0° (branch ⑥) to 45° (branch ④). The branches are thicker (8 mm diameter) on the trunk side and thinner (≤5 mm) at the far end.
To check the gripper's performance under different scenarios, three apples were hung in isolation (with no branch within 50 mm of the apple surface), and three apples were set next to three different branches (on the trunk side). During the test, one of the three isolated apples was manually moved outside the gripper workspace to check the gripper's reaction to a null grasp (simulating a strong wind disturbance blowing the target apple out of the gripper's workspace). The detailed apple position settings for the two test rounds are given in Table 3.
During the test, both the output classification and the gripper reaction were monitored. The results are presented in Table 3, where a detection matching the label indicates a successful classification and a correct gripper reaction, and a mismatch (marked *) indicates an incorrect detection. The results show that: (1) the proposed algorithm output accurate recognition of all null grasps, good grasps, and branch-interfered grasps; (2) the algorithm mistook the finger-obstructed grasp for a branch-interfered grasp in two of the three cases, likely because the distance between the apple surface and the branch was too short for the algorithm to resolve; (3) the gripper reliably implemented correct reactions for the two null grasps, four good grasps, and three branch-interfered grasps: it held steady after identifying a good grasp, released all four actuators after confirming a null grasp, and released the interfered finger when branch interference was recognized. The motion sequence of grasping an apple under branch interference is illustrated in Figure 13.

5. Conclusions

This work developed a novel tactile-enabled fruit harvesting method capable of distinguishing various grasping scenarios and handling obstacle interference during the harvesting process. As the first of its kind, the proposed method provides robotic fruit harvesting grippers with both obstacle-sensing ability and the dexterity essential to handle various picking scenarios. The proposed deep learning algorithms were validated through lab experiments at both the gripper subsystem level and the harvesting system level, achieving overall grasping status detection accuracies of 87.0% and 83.3%, respectively. The proposed soft gripper demonstrated stable and accurate reactions to the different sensing outcomes, and the independently controlled fingers proved an effective approach to dealing with branch interference, showing promising potential for obstacle handling and thus for reducing mechanical fruit damage in robotic harvesting applications.
Despite the promising overall grasping status detection accuracy, the proposed method did not perform well in identifying branch interference on small or irregularly shaped apples. It also had difficulty differentiating a finger-obstructed grasp from a branch-interfered grasp when the branch-to-apple distance was short. In addition, the proposed interference localization algorithm tends to be less effective when branch interference occurs at the distal or proximal parts of the fingers. Thorough investigations are expected in future work to address these issues.
In addition, although little fruit damage was witnessed in the 96 branch interfered grasp tests, the experiment was conducted on only 12 apples, which increases the likelihood of a Type II error skewing the results. The sample size should be significantly increased in future studies so that the relation between branch interference and fruit damage can be further revealed.
It is also worth mentioning that the proposed method still needs to prove its capability in the real world. The orchard environment tends to be far more complex than the lab settings: more than one branch may intrude into the grasp space, leaves may affect sensing accuracy, and clustered fruits may introduce additional noise. Moreover, the limits of the current hardware should be addressed. One example is how the proposed method could handle an apple grasped by the fingertips (tripod/quadrupod grasp) or pinched between two fingers on the side (adduction grip), given that no sensor is placed on the fingertips or finger sides in the current layout. Potential engineering issues should also be resolved before entering field trials; in particular, the exposed data-transmitting FFC cables need to be covered to prevent dragging or even snapping by branches and twigs. Extensive field tests in the orchard environment are planned for future work.
Above all, this work may provide guidance for the design of apple harvesting grippers. The method can also be extended to broader robotic grasping fields wherever undesirable foreign object intrusion needs to be handled.

Author Contributions

H.Z. contributed to the conceptualization, methodology, investigation, and draft preparation. H.K. contributed to the software, algorithm development, and draft preparation. X.W. contributed to the formal analysis and validation. W.A. and M.Y.W. contributed to Writing—Reviewing and Editing. C.C. provided significant contributions to this development as the lead. All authors have read and agreed to the published version of the manuscript.

Funding

This study has received funding from the Australian Research Council (ARC ITRH IH150100006).

Data Availability Statement

Not applicable.

Acknowledgments

This study has received funding from the Australian Research Council (ARC ITRH IH150100006). The authors would like to thank Cooper Gerwing, Charles Troeung, Shao Liu, and Godfrey Keung of the Laboratory of Motion Generation and Analysis at Monash University for their expertise and assistance with this work.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DoF     Degree of freedom
TPU     Thermoplastic Polyurethane
PM      Pressure Map
CNN     Convolutional neural network
ResNet  Residual neural network
FFC     Flat Flexible Cable

References

1. Zhao, Y.; Gong, L.; Huang, Y.; Liu, C. A review of key techniques of vision-based control for harvesting robot. Comput. Electron. Agric. 2016, 127, 311–323.
2. Kang, H.; Zhou, H.; Wang, X.; Chen, C. Real-time fruit recognition and grasping estimation for robotic apple harvesting. Sensors 2020, 20, 5670.
3. Lin, G.; Zhu, L.; Li, J.; Zou, X.; Tang, Y. Collision-free path planning for a guava-harvesting robot based on recurrent deep reinforcement learning. Comput. Electron. Agric. 2021, 188, 106350.
4. Tang, Y.; Chen, M.; Wang, C.; Luo, L.; Li, J.; Lian, G.; Zou, X. Recognition and localization methods for vision-based fruit picking robots: A review. Front. Plant Sci. 2020, 11, 510.
5. Wang, X.; Khara, A.; Chen, C. A soft pneumatic bistable reinforced actuator bioinspired by Venus Flytrap with enhanced grasping capability. Bioinspir. Biomim. 2020, 15, 056017.
6. Wang, X.; Zhou, H.; Kang, H.; Au, W.; Chen, C. Bio-inspired soft bistable actuator with dual actuations. Smart Mater. Struct. 2021, 30, 125001.
7. Zhou, H.; Wang, X.; Au, W.; Kang, H.; Chen, C. Intelligent robots for fruit harvesting: Recent developments and future challenges. Precis. Agric. 2022, 23, 1856–1907.
8. Hussein, Z.; Fawole, O.; Opara, U. Harvest and postharvest factors affecting bruise damage of fresh fruits. Hortic. Plant J. 2020, 6, 1–13.
9. Silwal, A.; Davidson, J.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, integration, and field evaluation of a robotic apple harvester. J. Field Robot. 2017, 34, 1140–1159.
10. Baeten, J.; Donné, K.; Boedrij, S.; Beckers, W.; Claesen, E. Autonomous fruit picking machine: A robotic apple harvester. In Proceedings of the 6th International Conference on Field and Service Robotics—FSR 2007, Chamonix, France, 9–12 July 2007.
11. Lehnert, C.; English, A.; McCool, C.; Tow, A.; Perez, T. Autonomous sweet pepper harvesting for protected cropping systems. IEEE Robot. Autom. Lett. 2017, 2, 872–879.
12. Williams, H.; Jones, M.; Nejati, M.; Seabright, M.; Bell, J.; Penhall, N.; Barnett, J.; Duke, M.; Scarfe, A.; Ahn, H.; et al. Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and robotic arms. Biosyst. Eng. 2019, 181, 140–156.
13. Bu, L.; Hu, G.; Chen, C.; Sugirbay, A.; Chen, J. Experimental and simulation analysis of optimum picking patterns for robotic apple harvesting. Sci. Hortic. 2020, 261, 108937.
14. Polat, R.; Aktas, T.; Ikinci, A. Selected mechanical properties and bruise susceptibility of nectarine fruit. Int. J. Food Prop. 2012, 15, 1369–1380.
15. Kang, H.; Chen, C. Fast implementation of real-time fruit detection in apple orchards using deep learning. Comput. Electron. Agric. 2020, 168, 105108.
16. Font, D.; Pallejà, T.; Tresanchez, M.; Runcan, D.; Moreno, J.; Martínez, D.; Teixidó, M.; Palacín, J. A proposal for automatic fruit harvesting by combining a low cost stereovision camera and a robotic arm. Sensors 2014, 14, 11557–11579.
17. Zhou, Y.; Tang, Y.; Zou, X.; Wu, M.; Tang, W.; Meng, F.; Zhang, Y.; Kang, H. Adaptive Active Positioning of Camellia oleifera Fruit Picking Points: Classical Image Processing and YOLOv7 Fusion Algorithm. Appl. Sci. 2022, 12, 12959.
18. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573.
19. Shiigi, T.; Kurita, M.; Kondo, N.; Ninomiya, K.; Rajendra, P.; Kamata, J.; Hayashi, S.; Kobayashi, K.; Shigematsu, K.; Kohno, Y. Strawberry harvesting robot for fruits grown on table top culture. In Proceedings of the American Society of Agricultural and Biological Engineers Annual International Meeting 2008, ASABE 2008, Providence, RI, USA, 29 June–2 July 2008; pp. 3139–3148.
20. Feng, Q.; Wang, X.; Zheng, W.; Qiu, Q.; Jiang, K. New strawberry harvesting robot for elevated-trough culture. Int. J. Agric. Biol. Eng. 2012, 5, 1–8.
21. Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 2010, 105, 160–171.
22. Baeten, J.; Donné, K.; Boedrij, S.; Beckers, W.; Claesen, E. Autonomous fruit picking machine: A robotic apple harvester. Field Serv. Robot. 2008, 42, 531–539.
23. Kondo, N.; Yata, K.; Iida, M.; Shiigi, T.; Monta, M.; Kurita, M.; Omori, H. Development of an end-effector for a tomato cluster harvesting robot. Eng. Agric. Environ. Food 2010, 3, 20–24.
24. Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100.
25. Bachche, S.; Oka, K. Performance testing of thermal cutting systems for sweet pepper harvesting robot in greenhouse horticulture. J. Syst. Des. Dyn. 2013, 7, 36–51.
26. Bogue, R. Robots poised to transform agriculture. Ind. Robot. Int. J. Robot. Res. Appl. 2021, 48, 637–642.
27. Anandan, T. Cultivating robotics and AI: Smarter machines help relieve aging agricultural workforce and fewer workers. Control Eng. 2020, 67, M1.
28. Davidson, J.; Silwal, A.; Hohimer, C.; Karkee, M.; Mo, C.; Zhang, Q. Proof-of-concept of a robotic apple harvester. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 634–639.
29. Jin, T.; Sun, Z.; Li, L.; Zhang, Q.; Zhu, M.; Zhang, Z.; Yuan, G.; Chen, T.; Tian, Y.; Hou, X.; et al. Triboelectric nanogenerator sensors for soft robotics aiming at digital twin applications. Nat. Commun. 2020, 11, 5381.
30. Donlon, E.; Dong, S.; Liu, M.; Li, J.; Adelson, E.; Rodriguez, A. GelSlim: A high-resolution, compact, robust, and calibrated tactile-sensing finger. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1927–1934.
31. Yuan, W.; Dong, S.; Adelson, E. GelSight: High-resolution robot tactile sensors for estimating geometry and force. Sensors 2017, 17, 2762.
32. Wang, Y.; Chen, J.; Mei, D. Flexible tactile sensor array for slippage and grooved surface recognition in sliding movement. Micromachines 2019, 10, 579.
33. Zhu, L.; Wang, Y.; Mei, D.; Jiang, C. Development of fully flexible tactile pressure sensor with bilayer interlaced bumps for robotic grasping applications. Micromachines 2020, 11, 770.
34. Yang, L.; Han, X.; Guo, W.; Wan, F.; Pan, J.; Song, C. Learning-based optoelectronically innervated tactile finger for rigid-soft interactive grasping. IEEE Robot. Autom. Lett. 2021, 6, 3817–3824.
35. Guo, D.; Sun, F.; Fang, B.; Yang, C.; Xi, N. Robotic grasping using visual and tactile sensing. Inf. Sci. 2017, 417, 274–286.
36. Zhang, Z.; Zhou, J.; Yan, Z.; Wang, K.; Mao, J.; Jiang, Z. Hardness recognition of fruits and vegetables based on tactile array information of manipulator. Comput. Electron. Agric. 2021, 181, 105959.
37. Cortés, V.; Blanes, C.; Blasco, J.; Ortiz, C.; Aleixos, N.; Mellado, M.; Cubero, S.; Talens, P. Integration of simultaneous tactile sensing and visible and near-infrared reflectance spectroscopy in a robot gripper for mango quality assessment. Biosyst. Eng. 2017, 162, 112–123.
38. Zhou, H.; Xiao, J.; Kang, H.; Wang, X.; Au, W.; Chen, C. Learning-based slip detection for robotic fruit grasping and manipulation under leaf interference. Sensors 2022, 22, 5483.
39. Wang, X.; Kang, H.; Zhou, H.; Au, W.; Chen, C. Geometry-aware fruit grasping estimation for robotic harvesting in apple orchards. Comput. Electron. Agric. 2022, 193, 106716.
40. Gonzalez, K. Bruising Profile of Fresh 'Golden Delicious' Apples. Ph.D. Thesis, Washington State University, Pullman, WA, USA, 2009.
41. Xu, F.; Li, X.; Shi, Y.; Li, L.; Wang, W.; He, L.; Liu, R. Recent developments for flexible pressure sensors: A review. Micromachines 2018, 9, 580.
42. Romano, J.; Hsiao, K.; Niemeyer, G.; Chitta, S.; Kuchenbecker, K. Human-inspired robotic grasp control with tactile sensing. IEEE Trans. Robot. 2011, 27, 1067–1079.
43. Lauri, P. Developing a new paradigm for apple training. Compact. Fruit Tree 2009, 42, 17–19.
Figure 1. Mechanical design.
Figure 2. Layout of the tactile sensing array.
Figure 3. Four grasping statuses.
Figure 4. Data-set reshaping.
Figure 5. Proposed Deep-touch CNN network.
Figure 6. Interference localization.
Figure 7. Experiment setting of the grasping status detection: (a) apple and branch orientation settings; (b) branch interfered grasp; (c) finger obstructed grasp.
Figure 8. Grasping status experiment record.
Figure 9. Bruise damage: grasp with sensing turned off.
Figure 10. Three different branch interference areas.
Figure 11. The integrated apple harvesting system.
Figure 12. Artificial apple tree setting.
Figure 13. Motion sequence: (a) gripper reaches target; (b) grasp action; (c) tactile perception; (d) interference detected; (e) finger released; (f) gripper retraction; (g) gripper reaches dropping point; (h) fruit released.
Table 1. Grasp status detection experiment result.

Conditions    Null Grasp    Finger Obstructed Grasp    Good Grasp    Branch Interfered Grasp    Overall
Accuracy      29/30         24/26                      41/48         80/96                      174/200
Percentage    96.6%         92.3%                      85.4%         83.3%                      87.0%
Table 2. Interference localization experiment result.

Interference Localisation Tests    Distal    Middle    Proximal    Overall Localisation Success
96 grasps    Finger 1              7/8       8/8       7/8         83.3%
             Finger 2              7/8       8/8       6/8
             Finger 3              5/8       8/8       5/8
             Finger 4              5/8       8/8       6/8
Table 3. Experiment result of grasping status detection on the robotic harvesting system.

Apple No.                                ①       ②       ③       ④            ⑤            ⑥
Distance to the trunk (mm)               161     84      238     151          116          63

Test round 1
  Distance to the nearest branch (mm)    51      52      57      12           0            0
  Label                                  Null    Good    Good    Obstructed   Branch       Branch
  Detection                              Null    Good    Good    Branch*      Branch       Branch

Test round 2
  Distance to the nearest branch (mm)    50      55      53      18           9            0
  Label                                  Good    Good    Null    Obstructed   Obstructed   Branch
  Detection                              Good    Good    Null    Obstructed   Branch*      Branch

* Incorrect detection.