Article

Fuzzy Comprehensive Evaluation for Grasping Prioritization of Stacked Fruit Clusters Based on Relative Hierarchy Factor Set

1 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
2 School of Electrical & Information Engineering, Jiangsu University, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Agronomy 2022, 12(3), 663; https://doi.org/10.3390/agronomy12030663
Submission received: 24 January 2022 / Revised: 18 February 2022 / Accepted: 7 March 2022 / Published: 9 March 2022
(This article belongs to the Section Agricultural Biosystem and Biological Engineering)

Abstract:
To solve the problems of unclear boundaries and inconsistent influence weights among prioritization evaluation factors for grasping stacked fruit clusters by parallel robots, a fuzzy comprehensive evaluation method for the grasping prioritization of stacked fruit clusters based on a relative hierarchy factor set is proposed. According to the morphological features of stacked fruit clusters and motion features of parallel robots, a hierarchical tree model without a cross based on a subtree structure is constructed to analyze the multiple factors with unclear boundaries. A relative factor set with positive and negative effects is constructed, and a mathematical expectation is used to construct an average random consistency index and consistency satisfaction value for improving the consistency of influence weights and precision of consistency verification for a comparison matrix. The weight vector is constructed from the top to the bottom of the model, and the membership matrix of the multi-layer factors and grasping prioritization are calculated from bottom to top. The results showed that the average precision of grasping prioritization of stacked fruit clusters based on the proposed method increased by 27.77% compared with the existing fuzzy comprehensive evaluation method. The proposed method can effectively improve prioritization precision for grasping randomly stacked fruit clusters affected by multiple factors and can further realize accurate automatic sorting.

1. Introduction

The automatic sorting of fruits by robots is of great significance in the automated and intelligent development of agricultural production and agricultural product processing [1,2]. Compared with serial robots, parallel robots with closed kinematic chains possess inherent advantages such as compact structure, high precision, high stiffness and enhanced dynamics, which make them especially suitable for automatic fruit sorting, where the requirements for grasping stability are high [3]. During the automatic sorting of fruits, accurate and reliable grasping detection is a precondition for accurate, fast and nondestructive robotic manipulation [4]. Current research on grasping detection based on machine vision mainly includes fruit recognition and detection, grasping point detection, grasping pose calculation and the prioritization of robotic grasping. Among these, evaluating the priority order for grasping fruits is a challenging problem in robotic manipulation [5]. Compared with fruits that grow singly [6], such as pears and pineapples, fruit clusters such as those of grape and lychee are placed randomly, so the cluster morphology is diverse and the shape and position of the stalk are unconstrained [7], which makes it more difficult to evaluate grasping prioritization [8]. In addition, because of the occlusion and overlap of stacked fruit clusters, improper grasping prioritization may cause fruit damage and berry abscission and may reduce the robotic grasping success rate [9]. Moreover, because six degrees of freedom are not needed for the automatic sorting of fruits, using a 6-DOF (degree of freedom) robot incurs unnecessary cost [10]. Compared with a 6-DOF robot, a low-DOF parallel robot with fewer than six degrees of freedom is more suitable for automatic fruit sorting because of its low cost, high accuracy and fast speed, but these features also place greater demands on grasping prioritization [11].
Therefore, the cluster features, stalk features, grasping position constraints and change difficulties of the robot pose are considered as evaluation factors for the grasping prioritization of stacked fruit clusters. However, because the boundaries among the evaluation factors are unclear and their influence weights on grasping prioritization differ, it is difficult to make a precise comprehensive evaluation of grasping prioritization across multiple evaluation factors.
Currently, the main methods for extracting and calculating the factors affecting grasping prioritization include infrared image analysis, spectrum analysis, laser inspection and machine vision [12,13]. Compared with the other methods, machine vision, with its advantages of non-contact measurement, strong adaptability and high cost-effectiveness, is more suitable for the grasping prioritization evaluation of stacked fruit clusters. The main evaluation methods for grasping prioritization based on machine vision include fixed prioritization, logical decision, index evaluation, fuzzy comprehensive evaluation (FCE) and machine learning. The fixed prioritization method determines the priority order manually according to experience; for example, the grasping order is determined from left to right and from top to bottom based on the centroids of the grasping objects. It is simple and effective, but its application field is limited [14]. Compared with fixed prioritization based on experience, the logical decision and index evaluation methods evaluate prioritization by quantitative analysis of the factors affecting it. For example, the grasping order can be evaluated by logical analysis of target occlusion and of contact between the target and other objects [15]. For multi-target grasping, some scholars selected objects without occlusion and entanglement for priority grasping by using the entanglement between targets as a judgment basis [16]. The distance between the centroid of each object and the average centroid is defined as dispersity, which is used as an index to evaluate prioritization [17]. Other scholars constructed evaluation functions based on object features, such as size, weight and shape, which were combined with an adjacency matrix to evaluate prioritization [18]. However, the grasping prioritization of stacked fruit clusters is affected by multiple factors with unclear boundaries, and a quantitative comprehensive analysis of these factors is difficult.
Compared with the quantitative analysis methods, the FCE and machine learning methods evaluate prioritization by qualitative analysis. The machine learning method learns optimal grasping strategies with neural networks through human teaching or off-line training. For example, some scholars constructed network models for multi-object scenes in which people freely chose the grasping order for robot learning and training, so that the robot learned to grasp [19]. Other scholars defined the grasping order evaluation of stacked multi-objects as grasping-relationship recognition and constructed visual manipulation relationship networks based on object pairing pooling layers [20]. The machine learning method based on a neural network has the advantages of high precision, good robustness and strong adaptability. However, it requires long training on a large dataset, and the difference between a simulated and the actual environment can easily make the trained model inaccurate. The FCE method applies fuzzy relation synthesis to analyze the factors involved in decision making and makes a comprehensive evaluation of object membership according to certain factors. For example, the complexity of joint operation from the current pose to the target pose, the distance between the gripper and the target, the difficulty of obstacle avoidance and the visibility of accessible parts were taken as evaluation factors of grasping prioritization, and a weighted fuzzy evaluation matrix was used to evaluate it [21]. Therefore, compared with the other methods, the FCE method is more suitable for prioritizing the robotic grasping of stacked fruit clusters, which is affected by multiple factors with unclear boundaries.
However, the existing FCE method still has the following problems when used to evaluate the grasping prioritization of stacked fruit clusters. First, the existing method analyzes the factors with models that have no hierarchy or a cross hierarchy, which works well when the factor relationships are simple, but it is difficult to construct such a model and calculate membership for multiple evaluation factors of grasping prioritization with unclear boundaries. Second, the absolute factor values for the same grasping priority in different images are inconsistent, and different evaluation factors have different influence weights on grasping prioritization; both make it difficult for the existing FCE method, which is based on absolute factor values, to evaluate grasping prioritization accurately. Finally, the existing method verifies the consistency of a comparison matrix using experiential data and a consistency index obtained from a large number of random reciprocal matrices of the same order; this lack of a theoretical basis makes it difficult to construct and verify an accurate comparison matrix for the grasping prioritization of stacked fruit clusters [22]. Therefore, this paper improves the existing FCE and proposes a fuzzy comprehensive evaluation method for the grasping prioritization of stacked fruit clusters based on a relative hierarchy factor set, to improve the precision of grasping prioritization over multiple evaluation factors with unclear boundaries and inconsistent influence weights.

2. Materials and Methods

2.1. Existing Fuzzy Comprehensive Evaluation Method

The evaluation vector and value are calculated based on a factor set, membership matrix and weight vector in the existing method [23]. The factor set and evaluation set are $E = \{e_1, e_2, e_3, \ldots, e_m\}$ and $D = \{d_1, d_2, d_3, \ldots, d_n\}$, respectively. The fuzzy mapping from $E$ to $F(D)$ is established as $f: E \to F(D)$, $\forall e_i \in E$, $e_i \mapsto f(e_i) = r_{i1}/d_1 + r_{i2}/d_2 + \cdots + r_{in}/d_n$ $(0 \le r_{ij} \le 1,\ 1 \le i \le m,\ 1 \le j \le n)$. The fuzzy relation induced by $f$ gives the membership matrix $R = (r_{ij})_{m \times n}$ of the factors, in which the $i$th row represents the mapping result of the fuzzy comprehensive evaluation on the $i$th factor. Different factors carry different weights in the evaluation result; thus, corresponding weight vectors must be constructed for all factors. The main methods for constructing weight vectors include expert estimation, the Delphi method, weighted averaging, frequency distribution and the analytic hierarchy process (AHP). Compared with the other methods, the AHP makes decisions through decomposition, comparative judgment and synthesis, which suits decision-making problems with difficult-to-quantify priorities over multiple objects and criteria. The existing AHP divides the factors into three layers: target, criteria and scheme, as shown in Figure 1.
The weight of the target layer is set to 1 and is assigned to the factors of the criteria layer. Each factor weight in the scheme layer comes from the assignment of a factor weight in the criteria layer, and the weights of each layer sum to 1. All factors in the criteria layer are compared in pairs to construct a comparison matrix $A_c = (a_{ij})_{r \times r}$. The consistency of $A_c$ is verified with the consistency ratio $CR = \mu / RI$. If $CR < 0.1$, $A_c$ is considered to have satisfactory consistency; otherwise, $A_c$ must be modified. $RI$ is the average random consistency index, i.e., the average index of a large number of random reciprocal matrices of the same order [24]. $\mu$ is the consistency index of $A_c$:

$$\mu = \frac{\lambda_{\max} - r}{r - 1} = \frac{1}{r(r-1)} \sum_{1 \le i < j \le r} (\varepsilon_{ij} + \varepsilon_{ji} - 2) = \frac{1}{r(r-1)} \sum_{\substack{i, j = 1 \\ i \ne j}}^{r} (\varepsilon_{ij} - 1) \tag{1}$$
$\lambda_{\max}$ is the largest eigenvalue of $A_c$, $w_c = (w_1, w_2, \ldots, w_r)^T$ is the eigenvector of $A_c$ corresponding to $\lambda_{\max}$, $W_c = (w_{ij})_{r \times r} = (w_i / w_j)_{r \times r}$ is the consistency matrix, and $\varepsilon_{ij} = a_{ij} / w_{ij}$, with $\varepsilon_{ij} > 0$ and $\varepsilon_{ij} = 1 / \varepsilon_{ji}$; if $\varepsilon_{ij} = 1$, there is no error in the comparison matrix $A_c$. $w_c = (w_1, w_2, \ldots, w_r)^T$ is calculated from a comparison matrix meeting the consistency requirement. The normalized $w_{ci}$, $i = 1, 2, \ldots, r$ are used as the weights of the criteria layer factors. To calculate the weights of the scheme layer factors, the comparison matrix $A_{pi}$ corresponding to each $C_i$, $i = 1, 2, \ldots, r$ must be calculated. Then, the weight $w_{pi}$ of factor $P_i$ is obtained by weight assignment from each $C_i$ according to Equation (2):

$$w_{pi} = \sum_{j=1}^{r} w_{pij} \tag{2}$$

where $w_{pij}$, $j = 1, 2, \ldots, r$ is the weight component of $P_i$ from each $C_j$. The weight values of all factors form the weight vector $w = (w_1, w_2, \ldots, w_m)$. According to Equation (3), the final fuzzy comprehensive evaluation vector $S$ is calculated to evaluate the object on multiple factors:

$$S = w \cdot R \tag{3}$$
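The core FCE step of Equation (3) can be sketched in a few lines. This is a minimal illustration using the weighted-average operator; the function name and all numbers are illustrative, not from the paper.

```python
import numpy as np

# Minimal sketch of the FCE step of Equation (3), S = w . R, with the
# weighted-average operator. Names and numbers are illustrative.
def fce_evaluate(w, R):
    """w: weight vector over m factors; R: m x n membership matrix.
    Returns the comprehensive evaluation vector over the n grades."""
    w, R = np.asarray(w, float), np.asarray(R, float)
    assert np.isclose(w.sum(), 1.0), "factor weights must sum to 1"
    return w @ R  # entry j: overall membership in evaluation grade d_j

# Two factors and three evaluation grades (e.g. high / medium / low priority).
R = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3]])
S = fce_evaluate([0.7, 0.3], R)
```

Because each row of the membership matrix sums to 1 and the weights sum to 1, the resulting evaluation vector is itself a distribution over the grades.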

2.2. Improved Fuzzy Comprehensive Evaluation Method Based on Relative Hierarchy Factor Set

The FCE is improved in four respects: constructing a hierarchical tree model without a cross, constructing a relative factor set with positive and negative effects, forward construction of the weight vector, and inverse calculation of the membership matrix and comprehensive evaluation value.

2.2.1. Hierarchical Tree Model without a Cross

A hierarchical tree model without a cross is constructed by analyzing the multiple factors affecting the evaluation results, as shown in Figure 2. The evaluation vector $J$ is determined by the factors in the first layer, and the evaluation result of each factor is determined by the factors in the next layer. Each factor can only affect the evaluation result of one factor in the previous layer; the hierarchical tree model without a cross is then formed by analogy. $m$ is the number of layers; $n_i$, $i = 1, 2, \ldots, m$ is the number of factors in layer $i$, with $n_1 = 1$; and $n_{ij}$ is the number of factors in the sub-factor layer of the $j$th subtree structure in the $i$th layer of the model, where $j = 1, 2, \ldots, k_i$, $k_i$ is the number of subtree structures in the $i$th layer, and $k_i = n_{i-1}$. The model is composed of subtree structures: the first layer of a subtree structure is the main factor layer and the second layer is the sub-factor layer. Each pair of adjacent layers of the model can contain more than one subtree structure, and there is no cross between subtree structures.

2.2.2. Relative Factor Set with Positive and Negative Effects

For the model with $m$ layers, the absolute factor set of the $g$th evaluation object is represented by the vector $E_g = (e_{21g}, e_{22g}, \ldots, e_{2n_2 g}, e_{31g}, e_{32g}, \ldots, e_{3n_3 g}, \ldots, e_{m n_m g})$, $g = 1, 2, \ldots, b$, where $b$ is the number of objects to be evaluated in an image. According to Equation (4), the same factors of different objects are normalized to calculate the relative factor vector $E'_g = (e'_{21g}, e'_{22g}, \ldots, e'_{2n_2 g}, e'_{31g}, e'_{32g}, \ldots, e'_{3n_3 g}, \ldots, e'_{m n_m g})$ of the multiple objects in an image. $s = \max(n_i)$, $i = 1, 2, \ldots, m$, and $\max_g(e_{ijg})$ and $\min_g(e_{ijg})$ are the maximum and minimum values, respectively, over $g = 1, 2, \ldots, b$:

$$e'_{ijg} = \frac{e_{ijg} - \min_g(e_{ijg})}{\max_g(e_{ijg}) - \min_g(e_{ijg})}, \quad i = 2, 3, \ldots, m; \; j = 1, 2, \ldots, s \tag{4}$$

The higher the factor value, the lower the priority order, which is called a negative effect; otherwise, the effect is positive. According to the positive or negative effect of each factor on prioritization, the effect vector $PN = (p_1, p_2, p_3, \ldots, p_a)$, $a = \sum_{i=2}^{m} n_i$ is constructed, with $p_i \in \{-1, 1\}$: positive and negative effects correspond to $p_i = 1$ and $p_i = -1$, respectively. Taking the Hadamard product of the relative factor vector $E'_g$ and the effect vector $PN$, the relative factor set with positive and negative effects is represented by the vector $E''_g$ and calculated according to Equation (5):

$$E''_g = PN \circ E'_g \tag{5}$$
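Equations (4) and (5) amount to a per-factor min-max normalization across the objects of one image followed by a sign flip for negative-effect factors. A small NumPy sketch, with illustrative names and data:

```python
import numpy as np

# Sketch of Equations (4)-(5): min-max normalization of each factor across
# the b objects of an image, then a Hadamard product with the +/-1 effect
# vector PN. Function name and numbers are illustrative assumptions.
def relative_factor_set(E_abs, effects):
    """E_abs: b x a matrix (b objects, a factors, absolute values).
    effects: length-a vector of +1 (positive effect) or -1 (negative)."""
    E_abs = np.asarray(E_abs, dtype=float)
    lo, hi = E_abs.min(axis=0), E_abs.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)        # avoid division by zero
    E_rel = (E_abs - lo) / span                   # Equation (4)
    return E_rel * np.asarray(effects, float)     # Equation (5), element-wise

# Three objects, two factors; the second factor has a negative effect.
E = relative_factor_set([[2.0, 10.0],
                         [4.0, 30.0],
                         [3.0, 20.0]], effects=[1, -1])
```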

2.2.3. Forward Construction of Weight Vector

According to the hierarchical tree model without a cross, the weight vector is constructed using a forward transfer method from top to bottom. The weights in each layer of the model sum to 1, and the weight of each sub-factor comes from the weight assignment of the factor in the previous layer. If the weight of the main factor layer of the $j$th subtree structure in the $i$th layer of the model is $v_{(i-1)j}$, the sum of the sub-factor weights $v_{ijk}$ belonging to this main factor satisfies $v_{(i-1)j} = \sum_{k=1}^{n_{ij}} v_{ijk}$.
(1) Construction of a comparison matrix
The comparison matrix $A_{ij} = (a_{ij})_{n_{ij} \times n_{ij}}$ of the $j$th subtree structure in the $i$th layer of the model is constructed by comparing factors in pairs. If the importance of $e_i$ compared with $e_j$ is $a_{ij}$, then the importance of $e_j$ compared with $e_i$ is $a_{ji} = 1 / a_{ij}$.
(2) Consistency verification of the comparison matrix
The ideal comparison matrix $A_{ij}$ should be a consistency matrix satisfying $a_{ij} a_{jk} = a_{ik}$ $(1 \le i, j, k \le n_{ij})$. However, because construction of the comparison matrix is affected by subjective decision making, the consistency of the actual comparison matrix may not meet the requirement. Therefore, it is necessary to verify the consistency of $A_{ij}$ and adjust it based on the verification results. In this paper, the existing consistency verification method is improved by using the mathematical expectation $E(\mu)$ of $\mu$ as the average random consistency index of $A_{ij}$. Based on the normal distribution of $\mu$, the upper critical value of the consistency index of $A_{ij}$ is calculated and used as the consistency satisfaction value, improving the verification accuracy. $\varepsilon_{ij}$ is uniquely determined by the comparison matrix $A_{n \times n}$ and the consistency matrix $W$ of $A$.

Within a certain precision range, the same eigenvector $w = (w_1, w_2, \ldots, w_n)^T$ and consistency matrix $W$ correspond to infinitely many positive reciprocal matrices $A_i$, $i = 1, 2, 3, \ldots$. Therefore, according to Equation (6), $A_i$ can be generated by random perturbation of the same consistency matrix $W$:

$$A_i = W \circ F \tag{6}$$

$F = (\varepsilon_{ij})_{n \times n}$ is the random perturbation matrix. When $\varepsilon_{ij} \ge 1$, its probability density function $f(x)$ obeys a kind of half-normal distribution; when $0 < \varepsilon_{ij} \le 1$, $f(x)$ is obtained from the reciprocal of the random variable; and when $\varepsilon_{ij} \le 0$, $f(x)$ is 0. Then:

$$f(x) = \begin{cases} 0, & x \le 0 \\[4pt] \dfrac{1}{x^2 \sqrt{2\pi}\,\sigma}\, e^{-\frac{(1/x - 1)^2}{2\sigma^2}}, & 0 < x \le 1 \\[4pt] \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x - 1)^2}{2\sigma^2}}, & x \ge 1 \end{cases} \tag{7}$$
Thus, the relationship that $\varepsilon_{ij}$ and $\varepsilon_{ji}$ are reciprocal and equiprobable is reflected in the probability distribution. The $\varepsilon_{ij}$ are regarded as independent, identically distributed random variables, so $\mu$ approximately obeys a normal distribution. Then, the mathematical expectation $E(\mu)$ of $\mu$ can be calculated according to Equation (8):

$$E(\mu) = \frac{1}{n(n-1)} \sum_{\substack{i, j = 1 \\ i \ne j}}^{n} E(\varepsilon_{ij} - 1) = \frac{1}{n(n-1)} \sum_{\substack{i, j = 1 \\ i \ne j}}^{n} \left[ \int_{0}^{1} (x - 1) \frac{1}{x^2 \sqrt{2\pi}\,\sigma} e^{-\frac{(1/x - 1)^2}{2\sigma^2}} dx + \int_{1}^{+\infty} (x - 1) \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x - 1)^2}{2\sigma^2}} dx \right] = \frac{1}{n(n-1)} \sum_{\substack{i, j = 1 \\ i \ne j}}^{n} \int_{1}^{+\infty} \left( x + \frac{1}{x} - 2 \right) \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x - 1)^2}{2\sigma^2}} dx = \int_{1}^{+\infty} \left( x + \frac{1}{x} - 2 \right) \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x - 1)^2}{2\sigma^2}} dx \tag{8}$$

$E(\mu)$ can be regarded as the average random consistency index of $A$. According to the normal distribution, $E(\mu) + 1.29\sqrt{D(\mu)}$ is calculated from $E(\mu)$ and the variance $D(\mu)$ and is regarded as the upper critical value of the consistency index of $A$, replacing the original consistency satisfaction value $0.1\,RI$. When $\mu \le E(\mu) + 1.29\sqrt{D(\mu)}$, the comparison matrix is considered to have satisfactory consistency.
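The improved consistency test can be checked numerically: estimate the expectation and variance of the consistency index by sampling perturbed matrices per Equation (6), then accept a matrix when its index falls below the upper critical value. This is a Monte Carlo sketch; the perturbation scale `sigma`, the trial count and the seed are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Monte Carlo sketch of the improved test: sample A = W * F (Hadamard product,
# Equation (6)) with the reciprocal half-normal law of Equation (7), estimate
# E(mu) and D(mu), and accept when mu <= E(mu) + 1.29*sqrt(D(mu)).
rng = np.random.default_rng(0)

def mu_index(A):
    """Consistency index mu = (lambda_max - n) / (n - 1), as in Equation (1)."""
    n = A.shape[0]
    lam = np.max(np.linalg.eigvals(A).real)
    return (lam - n) / (n - 1)

def sample_epsilon(sigma):
    """One draw of eps: 1 + z when z >= 0, else the reciprocal 1 / (1 - z)."""
    z = rng.normal(0.0, sigma)
    return 1.0 + z if z >= 0.0 else 1.0 / (1.0 - z)

def consistency_threshold(n, sigma=0.3, trials=2000):
    mus = np.empty(trials)
    for t in range(trials):
        w = rng.uniform(0.5, 2.0, n)
        W = np.outer(w, 1.0 / w)            # consistency matrix (w_i / w_j)
        F = np.ones((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                F[i, j] = sample_epsilon(sigma)
                F[j, i] = 1.0 / F[i, j]     # eps_ji = 1 / eps_ij
        mus[t] = mu_index(W * F)            # element-wise product
    return mus.mean() + 1.29 * mus.std()    # upper critical value of mu

A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])            # perfectly consistent matrix
ok = mu_index(A) <= consistency_threshold(3)
```

A perfectly consistent matrix has index zero and is accepted for any positive threshold; larger subjective errors in the pairwise comparisons push the index above the simulated critical value.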
(3) Weight calculation based on a comparison matrix
If the comparison matrix $A$ has the largest eigenvalue $\lambda_{\max}$, and $w = (w_1, w_2, \ldots, w_n)^T$ is the eigenvector of $A$ corresponding to $\lambda_{\max}$, then:

$$w_i = \sqrt[n]{\prod_{j=1}^{n} a_{ij}}, \quad i = 1, 2, \ldots, n \tag{9}$$
The eigenvector is normalized and used as the weight coefficients of the sub-factor layer of the subtree structure:

$$w'_i = \frac{w_i}{\sum_{i=1}^{n} w_i} \tag{10}$$
The weight of the sub-factor layer of the subtree structure comes from the assignment of the weight in the main factor layer, with $v_{(i-1)j} = \sum_{k=1}^{n_{ij}} v_{ijk}$. Therefore, according to the weight coefficient vector $w' = (w'_1, w'_2, \ldots, w'_n)^T$, the corresponding weight is:

$$v_{ijk} = v_{(i-1)j}\, w'_k, \quad k = 1, 2, \ldots, n \tag{11}$$
$v_{11} = 1$, and $n = n_{ij}$ is the number of factors in the sub-factor layer of the $j$th subtree structure in the $i$th layer of the model ($i = 2, 3, \ldots, m$) and the order of the corresponding weight vector. Then, the weight vector of the sub-factor layer of the $j$th subtree structure in the $i$th layer of the model is:

$$V_{ij} = (v_{ij1}, v_{ij2}, \ldots, v_{ij n_{ij}}) \tag{12}$$
Starting from the first layer, the weight vectors of all subtree structures in the model are calculated from top to bottom:

$$V = (V_{ij})_{sum \times 1}, \quad sum = \sum_{i=1}^{m} \sum_{j=1}^{k_i} \sum_{p=1}^{n_{ij}} 1 \tag{13}$$
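The weight construction for one subtree, as described above, can be sketched directly: take the row-wise geometric mean of the comparison matrix, normalize, and scale by the main factor's weight so the subtree's weights sum to it. The function name and numbers are illustrative.

```python
import numpy as np

# Sketch of the per-subtree weight step: geometric-mean weights from a
# comparison matrix, normalized, then scaled by the parent (main factor)
# weight so that the subtree's weights sum to the parent weight.
def subtree_weights(A, parent_weight=1.0):
    A = np.asarray(A, dtype=float)
    w = np.prod(A, axis=1) ** (1.0 / A.shape[0])  # row geometric means
    w = w / w.sum()                               # normalized coefficients
    return parent_weight * w                      # assignment from the parent

# Illustrative 2-factor subtree whose main factor carries weight 0.6;
# the first sub-factor is judged 3 times as important as the second.
V = subtree_weights([[1.0, 3.0],
                     [1.0 / 3.0, 1.0]], parent_weight=0.6)
```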

2.2.4. Inverse Calculation of the Membership Matrix and Comprehensive Evaluation Value

In contrast to the existing method, which calculates the evaluation vector directly from all factors, in this paper the evaluation vector of the main factor of each subtree structure is calculated from bottom to top, starting from the membership matrix of the last layer, based on the constructed hierarchical tree model without a cross. Then, based on the evaluation vectors of the main factors, the membership matrix of each subtree structure in the layer above is calculated. The evaluation vector is set to $D = (d_1, d_2, d_3, \ldots, d_N)$, and the membership matrix of all factors in the last ($m$th) layer of the model is:

$$R = \begin{pmatrix} R_{m1} \\ R_{m2} \\ \vdots \\ R_{m k_m} \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} & \cdots & r_{1N} \\ r_{21} & r_{22} & r_{23} & \cdots & r_{2N} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ r_{n_m 1} & r_{n_m 2} & r_{n_m 3} & \cdots & r_{n_m N} \end{pmatrix} \tag{14}$$
$R$ is composed of the membership matrices $R_{mj}$, $j = 1, 2, \ldots, k_m$ of the $k_m$ subtree structures in the $m$th layer. The membership matrix of the $j$th subtree structure in the $m$th layer of the model is $R_{mj} = (r_{ij})_{n_{mj} \times N}$, with $\sum_{j=1}^{N} r_{ij} = 1$. Each row of the membership matrix gives the probabilities that the object belongs to the different values of the evaluation vector under factor $e_{ij}$. The comprehensive evaluation vector $J_{(m-1)q}$ corresponding to the $q$th factor of the $(m-1)$th layer of the model is calculated from $R_{mj}$:

$$J_{(m-1)q} = V_{mj} \cdot R_{mj}, \quad q = 1, 2, \ldots, n_{m-1} \tag{15}$$

$n_{m-1} = k_m$. If the sub-factors of the $j$th subtree structure in the $(m-1)$th layer of the model correspond to the $a$th to $b$th factors of the $(m-1)$th layer, the membership matrix $R_{(m-1)j}$ of the $j$th subtree structure in the $(m-1)$th layer is calculated from $J_{(m-1)q}$ according to Equation (16):

$$R_{(m-1)j} = \left( J_{(m-1)a},\; J_{(m-1)(a+1)},\; J_{(m-1)(a+2)},\; \ldots,\; J_{(m-1)b} \right)^T \tag{16}$$
Then, according to Equations (15) and (16), the comprehensive evaluation vector $J$ of the top layer is calculated by iterating from the bottom layer to the top layer:

$$J = V_{21} \cdot R_{21} \tag{17}$$

In the fuzzy comprehensive evaluation of grasping prioritization, each value in $J$ is the probability that the object belongs to the corresponding grasping priority. Therefore, the priority corresponding to the maximum value in $J$ is selected as the priority of the grasping object. The final evaluation value is $d = d_{ma}$, where $ma$ is the index of the maximum value in $J$, i.e., $J_{ma} = \max(J_i)$, $i = 1, 2, \ldots, N$.
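The bottom-up pass described by Equations (15) and (16) can be walked through on a tiny 3-layer tree with two evaluation grades. All numbers below are illustrative, not from the paper's experiments.

```python
import numpy as np

# Sketch of the inverse (bottom-up) pass: each subtree's weight vector
# collapses its sub-factor membership matrix into one evaluation vector
# (Equation (15)), and the vectors of a layer stack into the membership
# matrix of the layer above (Equation (16)).
def collapse(V, R):
    """J = V . R for one subtree."""
    return np.asarray(V, float) @ np.asarray(R, float)

# Layer 3: two subtrees, each with two sub-factors; N = 2 grades.
R31 = np.array([[0.8, 0.2], [0.4, 0.6]])   # memberships, subtree 1
R32 = np.array([[0.5, 0.5], [0.9, 0.1]])   # memberships, subtree 2
J21 = collapse([0.5, 0.5], R31)            # evaluation of main factor 1
J22 = collapse([0.25, 0.75], R32)          # evaluation of main factor 2

# Layer 2: stack the vectors into R_21 and collapse to the top vector J.
R21 = np.vstack([J21, J22])
J = collapse([0.6, 0.4], R21)
priority = int(np.argmax(J))               # index ma of the final grade d_ma
```

The final vector remains a distribution over the grades, and the arg-max picks the grasping priority exactly as described in the paragraph above.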

2.3. Prioritizing Robotic Grasping for Stacked Fruit Clusters Based on Improved FCE

The process of prioritizing robotic grasping of stacked fruit clusters based on the improved FCE is shown in Figure 3. The factors affecting grasping prioritization are calculated from the features of the stacked fruit clusters in the image, such as the fitted line segments of the stalks, stalk depths, stalk pixels, cluster areas, cluster centroids, berry centroids and the number of berries, and from the motion features of the low-DOF parallel robot. The improved FCE is then used to evaluate grasping prioritization accurately.

2.3.1. Image Acquisition and Processing

To obtain the depth and color features of the fruit clusters, two cameras are used to acquire RGB images of the stacked fruit clusters, and a camera calibration method based on a single calibration plate is used [25,26]. Under the same lighting conditions, the protruding and smooth region of a berry in the optical axis direction is more likely than other regions to appear white in the image because of overexposure. Because a color can be regarded as a mixture of a spectral color and white, the S image of the HSV (hue, saturation, value) color model represents how close a color is to the spectral color or to white. Therefore, to increase the contrast between the berries and the background and to retain more detailed features of the stacked fruit clusters, the S image is used to extract the contours of the fruit clusters. According to the texture features of the fruit clusters, coherence-enhancing diffusion [27] and median filtering are used to preprocess the S image, sharpening the edges and reducing the effect of noise [28]. The binary image of the fruit cluster contours is obtained by local threshold segmentation and Canny edge detection. In addition, compared with the berry regions, the color of the stalk regions changes greatly between fruit clusters of different freshness and is less affected by overexposure and specular reflection, which makes it more suitable for analyzing fruit cluster freshness. As shown in Figure 4a, the color of the stalk changes from green to brown over time. Unlike the channels of the RGB color model, each of which contains only one basic color and cannot describe this color change, the Q image of the quadrature-phase (YIQ) color model represents the change of color from purple to yellow-green. It is therefore more suitable for extracting the stalk color feature representing fruit cluster freshness, as shown in Figure 4b.
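The two color transforms used above follow the standard RGB-to-HSV saturation and RGB-to-YIQ formulas. A pure-NumPy sketch (not the authors' pipeline; calibration, diffusion filtering and segmentation are omitted):

```python
import numpy as np

# Sketch of the S channel (berry/background contrast) and the YIQ Q channel
# (green-to-brown stalk color change), from the standard definitions.
def s_channel(rgb):
    """HSV saturation: S = (max - min) / max per pixel, 0 for black pixels."""
    rgb = np.asarray(rgb, dtype=float)
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    return np.where(mx > 0, (mx - mn) / np.where(mx > 0, mx, 1.0), 0.0)

def q_channel(rgb):
    """Q component of the NTSC YIQ model (purple-to-yellow-green axis)."""
    rgb = np.asarray(rgb, dtype=float)
    return 0.211 * rgb[..., 0] - 0.523 * rgb[..., 1] + 0.312 * rgb[..., 2]

img = np.array([[[0.2, 0.6, 0.2],    # greenish pixel (fresh stalk)
                 [0.5, 0.3, 0.1]]])  # brownish pixel (aged stalk)
S = s_channel(img)
Q = q_channel(img)
```

On this toy image the Q value rises from the green pixel to the brown one, which is why the Q image separates fresh from aged stalks better than any single RGB channel.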

2.3.2. Feature Extraction and Calculation of Stacked Fruit Clusters in Images

The positions of the stalk contours are detected according to the gradient direction changes of the cluster contour. The direction at the stalk endpoint changes more sharply than in other regions and lies in a trough of the gradient direction change curve, as shown in Figure 5. Therefore, the position of the stalk endpoint $p_a$ can be detected from the curve extrema under a certain threshold. The stalk category is determined from the gradient direction changes of the points near $p_a$: if there are two regions with small direction changes to the left and right of $p_a$, the stalk has a node; otherwise, it has no node. The contour intersections of the stalk and cluster are located at the second and third peaks near the trough $p_a$ for stalks with and without a node, respectively. The midpoint $p_b$ of the intersections $p_{b1}$ and $p_{b2}$ is the other endpoint of the stalk. The line segment of the stalk is fitted based on $p_a$ and $p_b$, and the stalk length, the stalk direction, and the distances between the fitted line segment of the stalk and other regions are calculated. The ROI (region of interest) of the stalk is extracted based on $p_a$, $p_{b1}$ and $p_{b2}$, and the stalk region is extracted by local threshold segmentation. Then, the depth and color features of the stalk are extracted from the RGB images obtained by the left and right cameras and from the Q image obtained by image transformation, respectively.
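The trough search on the gradient direction change curve can be sketched as a thresholded local-minimum scan. The curve values and threshold below are synthetic, for illustration only:

```python
import numpy as np

# Sketch of locating the stalk endpoint p_a as a deep trough of the gradient
# direction change curve: local minima below a threshold, pure NumPy.
def find_troughs(curve, threshold):
    """Indices of local minima of `curve` whose value is below `threshold`."""
    c = np.asarray(curve, dtype=float)
    interior = np.arange(1, len(c) - 1)
    is_min = (c[interior] < c[interior - 1]) & (c[interior] < c[interior + 1])
    return interior[is_min & (c[interior] < threshold)]

# Synthetic direction-change curve with one deep trough (the stalk endpoint);
# the shallow local minima are ordinary berry-contour transitions.
curve = [0.5, 0.4, 0.6, 0.2, -0.9, 0.1, 0.5, 0.45, 0.6]
troughs = find_troughs(curve, threshold=-0.5)
```

The threshold rejects the shallow minima produced by berry-to-berry transitions and keeps only the sharp direction reversal at the stalk tip.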
The stalk contours are removed based on $p_{b1}$ and $p_{b2}$. The constraints for detecting the segmentation points of the cluster regions are constructed according to the global direction changes of the contours, the fitted ellipses of the local contours, and the gradient directions, as shown in Figure 6. The extrema of the global direction change curve of a fruit cluster contour mainly correspond to the contour intersections of the stalk and cluster and to the bottom contour of the cluster. Along the contour, the global directions of different clusters differ, and the direction changes at the intersections of the cluster contours are obvious. Therefore, candidate positions of the cluster segmentation points can be extracted from the extrema of the global direction change curves of the cluster contours. The candidate positions include the intersections of different cluster contours, the bottom contours of clusters, and the berry contours on the cluster edges. Unlike at the intersections of different cluster contours, the centers of circles fitted to the berry contours on the cluster edges lie inside the cluster contours. Because a berry is not a standard circle and the cluster contour is composed of berry contours, the center of a circle fitted to a berry contour on the cluster contour deviates from the actual center. Therefore, the centers of the arcs corresponding to the berry contours on the cluster contour are calculated by ellipse fitting. The relationship between the fitted ellipse center of the arc at each candidate position and the cluster contour is judged to purify the candidate positions: if the ellipse center is inside the cluster contour, the corresponding candidate position is removed; otherwise, it is retained.
At the purified candidate positions, the gradient direction changes of the contour points are detected, and the extrema of the change curves, filtered by a small convolution kernel, are taken as the segmentation points. The matching of segmentation points for cluster segmentation has two constraints: (1) the Euclidean distance between the two points is the smallest; (2) the line segment formed by the two points does not intersect the cluster contour. Constraint (2) is judged first; if it is satisfied, constraint (1) is evaluated on the distance matrix, and the closest two points are taken as the matching point pair. The line segment formed by a matching point pair is the segmentation line of the cluster region. The actual and visible areas of the clusters and the centroids of the clusters are calculated from the segmented cluster regions. In the edge binary image of a single cluster region, the berry contours are fitted to ellipses to calculate the centroids and the number of berries.
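The two matching constraints can be sketched as follows, assuming 2-D points and a polyline contour. The orientation-based segment intersection test and the exhaustive search over the distance matrix are illustrative choices, not the authors' implementation:

```python
import numpy as np

# Sketch of segmentation-point matching: keep only pairs whose joining
# segment does not cross the contour (constraint 2), then pick the pair
# with the smallest Euclidean distance (constraint 1).
def _orient(u, v, w):
    """Sign of the 2-D cross product (v - u) x (w - u)."""
    return np.sign((v[0] - u[0]) * (w[1] - u[1]) -
                   (v[1] - u[1]) * (w[0] - u[0]))

def _crosses(p, q, a, b):
    """True for a proper crossing; collinear touching is ignored here."""
    return (_orient(p, q, a) != _orient(p, q, b) and
            _orient(a, b, p) != _orient(a, b, q))

def match_pair(points, contour):
    """Closest pair of points whose segment avoids the contour polyline."""
    best, best_d = None, np.inf
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            p, q = points[i], points[j]
            if any(_crosses(p, q, contour[k], contour[k + 1])
                   for k in range(len(contour) - 1)):
                continue                              # constraint (2) violated
            d = np.hypot(p[0] - q[0], p[1] - q[1])    # constraint (1)
            if d < best_d:
                best, best_d = (i, j), d
    return best

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 5.0)]
contour = [(0.4, -1.0), (0.4, 1.0)]   # a contour "wall" between pts 0 and 1
pair = match_pair(pts, contour)
```

Here the nearest pair (0, 1) is rejected because its segment crosses the contour, so the next-closest admissible pair is matched instead.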

2.3.3. Evaluation for Grasping Prioritization of Stacked Fruit Clusters

Based on the factors affecting grasping prioritization of stacked fruit clusters, the hierarchical tree model with three layers is constructed, as shown in Figure 7.
The factors include four categories: grasping position constraints e 11 , cluster features e 12 , stalk features e 13 and change difficulties of robot pose e 14 , and they are all relative factors of multiple objects in an image. The grasping position constraints include the constraints on the X-axis, Y-axis and Z-axis denoted as e 21 , e 22 and e 23 , correspondingly. According to the minimum distances d m x and d m y between the fitting line segment of stalk and other regions in the directions perpendicular and parallel to the fitting line segment, the finger width g and length l of the clamping mechanism, the grasping position constraints on the X-axis and Y-axis are defined. As shown in Figure 8a, if there are other regions in the calculation area, the absolute factor values of the grasping position constraints are:
$e_{21} = 2 d_{mx} - 2g - w$ (18)
$e_{22} = d_{my} - l$ (19)
$w$ is the opening width of the clamping mechanism. The size of the calculation area is calculated from Equations (20) and (21), where $l_p$ is the length of the stalk fitting line segment. If there is no other region within the calculation area, the area is extended to the image boundary.
$g_1 = 2g + w, \quad l_1 = 2l$ (20)
$g_2 = 4g + w, \quad l_2 = \begin{cases} l, & l_p \le l \\ l_p, & l_p > l \end{cases}$ (21)
According to Equation (22), the grasping position constraint on the Z-axis is calculated from the depths $d_a$ and $d_b$ of the stalk endpoints below the detection plane. So that the midpoint of the stalk fitting line segment coincides with the finger midpoint of the clamping mechanism, the finger height $h$ of the clamping mechanism enters the calculation.
$e_{23} = \dfrac{d_a + d_b}{2} - \dfrac{h}{2}$ (22)
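As a minimal sketch (assuming the reconstructed forms of the three constraint equations above; the function name is hypothetical), the grasping position constraints can be computed as:

```python
def grasp_position_constraints(d_mx, d_my, d_a, d_b, g, l, w, h):
    """Absolute factor values of the grasping position constraints.
    d_mx, d_my: minimum clearances perpendicular/parallel to the stalk fit line;
    g, l, w, h: finger width, finger length, opening width and finger height
    of the clamping mechanism; d_a, d_b: stalk endpoint depths."""
    e21 = 2 * d_mx - 2 * g - w       # X-axis clearance for both fingers
    e22 = d_my - l                   # Y-axis clearance along the stalk
    e23 = (d_a + d_b) / 2 - h / 2    # Z-axis: stalk midpoint depth vs finger height
    return e21, e22, e23
```

Negative values (as seen in Tables 1 to 3) simply indicate that the clearance is smaller than the gripper footprint.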
The cluster features include four sub-factors: occlusion, compactness, weight and freshness, denoted $e_{24}$, $e_{25}$, $e_{26}$ and $e_{27}$, respectively. The occlusion is calculated from the cluster area, as shown in Figure 8b; the red line in Figure 8b is the segmentation line of the cluster region. The cluster region is divided into left and right parts by the stalk fitting line, and the larger of the two areas is taken as half of the actual cluster area (yellow area in Figure 8b). The ratio of the visible area $a_1 + a_2$ to the actual area $2a_1$ is then taken as the occlusion:
$e_{24} = \dfrac{a_1 + a_2}{2 a_1}$ (23)
As shown in Figure 8c, the compactness is calculated according to Equation (24) from the distances $d_{oi}$ between the centroids of the berries and of the cluster in the segmented cluster region, where $n$ is the number of berries. The centers of the fitted ellipses are used as the berry centroids; the ellipses are fitted by the least-squares method in the edge binary image of a single cluster region [29,30]. The centroid coordinate $(x_o, y_o)$ of the cluster is calculated according to Equation (25), where $(x, y)$ is a contour point coordinate of the cluster and $f(x, y)$ is its gray value.
$e_{25} = \dfrac{\sum_{i=1}^{n} d_{oi}}{n^2}$ (24)
$x_o = \sum x f(x, y) / \sum f(x, y), \quad y_o = \sum y f(x, y) / \sum f(x, y)$ (25)
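A small Python sketch of the compactness and centroid calculations above (the data structures and function names are illustrative, not the authors' implementation):

```python
import math

def cluster_centroid(gray):
    """Gray-value-weighted centroid of the cluster contour (Eq. 25).
    `gray` maps a contour point (x, y) to its gray value f(x, y)."""
    s = sum(gray.values())
    xo = sum(x * f for (x, _), f in gray.items()) / s
    yo = sum(y * f for (_, y), f in gray.items()) / s
    return xo, yo

def compactness(berry_centroids, cluster_xy):
    """e25 (Eq. 24): sum of berry-to-cluster-centroid distances over n^2,
    where n is the number of berries."""
    n = len(berry_centroids)
    return sum(math.dist(cluster_xy, c) for c in berry_centroids) / n ** 2
```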
The average berry weight $k_a$, obtained from a large number of experiments, is used to estimate the weight of a fruit cluster in the image:
$e_{26} = n k_a$ (26)
The fruit cluster freshness is defined by the stalk color and is calculated as the average gray value of the stalk region in the Q image according to Equation (27), where $n_R$ is the number of points in the stalk region and $f(x, y)$ is the gray value of a point.
$e_{27} = \dfrac{\sum f(x, y)}{n_R}$ (27)
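The remaining cluster features reduce to one-liners; the sketch below (hypothetical helper names) follows the occlusion, weight and freshness definitions above:

```python
def occlusion(a1, a2):
    """e24: ratio of the visible area a1 + a2 to the actual area 2*a1."""
    return (a1 + a2) / (2 * a1)

def cluster_weight(n_berries, k_a):
    """e26 (Eq. 26): berry count times the average berry weight k_a."""
    return n_berries * k_a

def stalk_freshness(q_gray_values):
    """e27 (Eq. 27): average gray value of the stalk region in the Q image."""
    return sum(q_gray_values) / len(q_gray_values)
```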
The stalk features include three sub-factors: category, length and direction, denoted $e_{28}$, $e_{29}$ and $e_{210}$, respectively. As shown in Figure 9, stalks fall into two categories: with a node and without a node. The skeleton of a stalk with a node can be fitted to herringbone-shaped line segments, and the skeleton of a stalk without a node to a single line segment. According to the relationship between the skeleton line segments and $l$, the stalk can be further divided into six categories: short AB stalk without node, long AB stalk without node, short AB stalk with node, long AB stalk with node, short EB stalk with node and long EB stalk with node. The absolute factor value of the stalk category is then:
$e_{28} = \begin{cases} 1, & l_{AB} \ge l \text{ and without node} \\ 2, & l_{AB} < l \text{ and without node} \\ 3, & l_{AB} \ge l \text{ and with node} \\ 4, & l/2 \le l_{AB} < l \text{ and with node} \\ 5, & l_{AB} < l/2,\ l_{EB} \ge l \text{ and with node} \\ 6, & l_{AB} < l/2,\ l_{EB} < l \text{ and with node} \end{cases}$ (28)
$AB$, $BD$ and $BC$ are the line segments of the stalk skeleton. $r_2$ is the angular bisector of ∠DBC, and $r_1$ is the tangent at the outermost point of the cluster contour. $E$ is the intersection point of $r_1$ and $r_2$. $l_{AB}$ and $l_{EB}$ are the projected lengths of the line segments $AB$ and $EB$ on the $X_w Y_w$ plane of the world coordinate system $O_w X_w Y_w Z_w$, respectively.
The length and direction of the stalk are calculated from the skeleton line segments. If the endpoint coordinates of the skeleton line segments in $O_w X_w Y_w Z_w$ are $A = (x_A, y_A, z_A)$, $B = (x_B, y_B, z_B)$ and $E = (x_E, y_E, z_E)$, the stalk length is:
$e_{29} = \begin{cases} \sqrt{(x_A - x_B)^2 + (y_A - y_B)^2 + (z_A - z_B)^2}, & e_{28} = 1, 2, 3, 4 \\ \sqrt{(x_B - x_E)^2 + (y_B - y_E)^2 + (z_B - z_E)^2}, & e_{28} = 5, 6 \end{cases}$ (29)
The stalk direction is represented by the vector $e_{210} = (x_{210}, y_{210}, z_{210})$; thus, the absolute factor values of the stalk direction are:
$x_{210} = \begin{cases} (x_A - x_B)/e_{29}, & e_{28} = 1, 2, 3, 4 \\ (x_B - x_E)/e_{29}, & e_{28} = 5, 6 \end{cases} \quad y_{210} = \begin{cases} (y_A - y_B)/e_{29}, & e_{28} = 1, 2, 3, 4 \\ (y_B - y_E)/e_{29}, & e_{28} = 5, 6 \end{cases} \quad z_{210} = \begin{cases} (z_A - z_B)/e_{29}, & e_{28} = 1, 2, 3, 4 \\ (z_B - z_E)/e_{29}, & e_{28} = 5, 6 \end{cases}$ (30)
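The six-way stalk classification and the derived length and direction can be condensed into one sketch (the function signature is an assumption for illustration):

```python
import math

def stalk_factors(A, B, E, l, has_node, l_AB, l_EB):
    """Absolute stalk factors: category e28, length e29 and direction e210.
    A, B, E are 3-D skeleton endpoints in the world frame; l is the finger
    length; l_AB, l_EB are the projections of AB and EB on the X_w Y_w plane."""
    if not has_node:
        e28 = 1 if l_AB >= l else 2
    elif l_AB >= l:
        e28 = 3
    elif l_AB >= l / 2:
        e28 = 4
    else:
        e28 = 5 if l_EB >= l else 6
    # length and direction use AB for categories 1-4, and BE for 5-6
    p, q = (A, B) if e28 <= 4 else (B, E)
    e29 = math.dist(p, q)
    e210 = tuple((pi - qi) / e29 for pi, qi in zip(p, q))
    return e28, e29, e210
```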
The absolute factor value for the change difficulty of the robot pose, denoted $e_{211}$, is calculated from the pose change $e_{211} = (x_{211}, y_{211}, z_{211}, \alpha_{211}, \beta_{211}, \theta_{211})$ of the clamping mechanism from the current pose $g$ to the grasping pose $p$ [31]. Taking the 4-R(2-SS) parallel robot as an example, owing to the motion constraints of the robot, the pose of the clamping mechanism comprises the position $(x, y, z)$ in $O_w X_w Y_w Z_w$ and the rotation angle $\theta$ around the Z-axis. The absolute factor value can therefore be simplified to $e_{211} = (x_{211}, y_{211}, z_{211}, \theta_{211})$, where $x_{211} = x_g - x_p$, $y_{211} = y_g - y_p$, $z_{211} = z_g - z_p$ and $\theta_{211} = \theta_g - \theta_p$.
All absolute factor values are used to construct the absolute factor vector $E = (E_{11}, E_{12}, E_{13}, E_{14}) = (e_1, e_2, \ldots, e_{16})$ of the third layer in the prioritization model, where $E_{11} = (e_{21}, e_{22}, e_{23})$, $E_{12} = (e_{24}, e_{25}, e_{26}, e_{27})$, $E_{13} = (e_{28}, e_{29}, x_{210}, y_{210}, z_{210})$ and $E_{14} = (x_{211}, y_{211}, z_{211}, \theta_{211})$. The absolute factor values are normalized according to Equation (4). Based on the positive or negative effect of each factor on the grasping prioritization, the effect vector $PN = (1, 1, 1, 1, -1, -1, -1, -1, 1, -1, -1, -1, -1, -1, -1, -1)$ is constructed (signs as reflected in the relative values of Tables 1, 2, 3 and 4). According to Equation (5), the relative factor set with positive and negative effects is then constructed. The corresponding weight vector is built from the top to the bottom of the model, and the membership matrix and comprehensive evaluation values of the multi-layer factors are calculated from the bottom to the top of the model to evaluate the grasping prioritization of stacked fruit clusters.
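A sketch of how the relative factor set might be built. The min-max form of the normalization in Equation (4) is an inference from the relative values reported in Tables 1, 2, 3 and 4 (positive-effect factors span [0, 1] and negative-effect factors span [−1, 0]); the function name is illustrative:

```python
def relative_factor_set(columns, effects):
    """Min-max normalize each absolute-factor column over the clusters in one
    image (assumed form of Eq. 4), then sign it by its effect (+1/-1, Eq. 5)."""
    rel = []
    for col, sign in zip(columns, effects):
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0  # guard against a constant column
        rel.append([sign * (v - lo) / span for v in col])
    return rel
```

For example, the compactness column (2.01, 1.99, 2.72, 2.34, 1.93) of Table 1 with effect −1 reproduces the listed relative values (−0.101, −0.076, −1.000, −0.519, 0.000) up to rounding.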

3. Results

3.1. Experimental Platform

The experiments were performed on the self-developed automatic sorting system for stacked fruit clusters based on a 4-R(2-SS) parallel robot and machine vision, as shown in Figure 10. The images of the White Rosa grape clusters were acquired by the left and right cameras. The experimental hardware included a calibration plate with 7 × 7 circles of 3.5 mm diameter and 7 mm spacing, an Intel Core i7-3770 CPU with 8 GB memory, and an NVIDIA GeForce GTX 1060 6 GB graphics card. A flat light source with a color temperature of 6600–7000 K illuminated the fruit clusters.

3.2. Experimental Method

To verify the effectiveness and advantages of the proposed method, four groups of experiments were carried out, one for each factor category in the second layer of the model. The White Rosa grape clusters were stacked in different ways so that, across the four groups, the main differences in grasping prioritization among the images arose from the grasping position constraints, cluster features, stalk features and change difficulties of the robot pose, respectively. The improved FCE was compared with the existing FCE to verify the prioritization precision for grasping stacked fruit clusters. The grasping prioritizations obtained by evaluation were compared with those obtained by repeated human inspection to calculate the prioritization precision.

3.3. Experimental Results and Discussion

As shown in Figure 11, the original images for the experiments were divided into four groups whose main differences lay in the grasping position constraints, fruit cluster features, stalk features and change difficulties of the robot pose. In each group, 20 images of stacked grape clusters were used for analysis. The factor values obtained in the four groups of experiments are shown in Table 1, Table 2, Table 3 and Table 4; the data in the tables are the absolute and relative factor values for one randomly chosen image of each group.
According to Equations (31)–(33), the difference ratio $R_{Pi}$ and the average difference ratios $\overline{R_{P210}}$ and $\overline{R_{P211}}$ of each factor in each group were calculated, and the differences among the four groups of experiments were analyzed, as shown in Table 5. The longitudinal comparison in Table 5 shows that the differences in the grasping position constraints were most pronounced in the first group, where the difference ratios of $e_{21}$, $e_{22}$ and $e_{23}$ reached 1.344, 1.071 and 1.180, respectively. The differences in the fruit cluster features were most pronounced in the second group, where the difference ratios of $e_{24}$, $e_{25}$, $e_{26}$ and $e_{27}$ reached 0.399, 0.504, 0.538 and 0.308, respectively. The differences in the stalk features were most pronounced in the third group, where the difference ratios of $e_{28}$ and $e_{29}$ reached 0.833 and 0.601, respectively. The differences in the change difficulties of the robot pose were most pronounced in the fourth group, where the average difference ratio of $e_{211}$ reached 2.421.
$R_{Pi} = (e_{i\max} - e_{i\min}) / e_{i\max}, \quad i = 21, 22, 23, \ldots, 210$ (31)
$\overline{R_{P210}} = (R_{Px210} + R_{Py210} + R_{Pz210}) / 3$ (32)
$\overline{R_{P211}} = (R_{Px211} + R_{Py211} + R_{Pz211} + R_{P\theta 211}) / 4$ (33)
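The difference ratios of Equations (31)–(33) reduce to a few lines (illustrative names):

```python
def difference_ratio(values):
    """R_Pi (Eq. 31): (e_i_max - e_i_min) / e_i_max for one factor in a group."""
    return (max(values) - min(values)) / max(values)

def mean_difference_ratio(component_ratios):
    """Average difference ratio of a vector factor's components (Eqs. 32-33),
    e.g. the x, y, z components of e210 or the x, y, z, theta of e211."""
    return sum(component_ratios) / len(component_ratios)
```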
Table 1, Table 2, Table 3 and Table 4 show that the absolute factor values for the same grasping prioritization differed across images. For example, in the four groups of experiments, the maximum absolute factor values of the grasping position constraint on the X-axis were 123.60, 67.71, 99.58 and 71.04, respectively. In addition, because the effect, unit and value range of the absolute factors differed, their values were not directly comparable. For example, the grasping position constraint on the Z-axis had a positive effect on grasping prioritization, was measured in mm and ranged over (−10.01, 82.68), whereas the cluster compactness had a negative effect, was measured in pixels and ranged over (1.83, 3.77). In contrast, after normalization the relative factor values for the same grasping prioritization were consistent across images, and the values in the relative factor set became comparable once the positive and negative effects were added.
Based on the obtained relative factor set with positive and negative effects, the improved and existing FCEs were used to evaluate the grasping prioritization of the stacked fruit clusters in the images. For each fruit cluster in each image, the prioritization obtained by repeated human inspection was used as the ground truth and compared with the evaluated grasping prioritization: a cluster's prioritization is correct if the evaluated value matches the ground truth, and incorrect otherwise. As shown in Table 6, the average precisions $AP_i$ and the mean average precision $\overline{AP}$ in the four groups of experiments were calculated by
$AP_i = \dfrac{CO_i}{ER_i + CO_i}, \quad i = 1, 2, 3, 4$ (34)
$\overline{AP} = \dfrac{1}{4} \sum_{i=1}^{4} AP_i$ (35)
$CO_i$ is the number of fruit clusters with correct prioritization, and $ER_i$ is the number of fruit clusters with incorrect prioritization in the $i$th group.
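The precision measures of Equations (34) and (35) in code (function names are illustrative):

```python
def average_precision(co, er):
    """AP_i (Eq. 34): correct prioritizations CO_i over all clusters CO_i + ER_i."""
    return co / (er + co)

def mean_average_precision(aps):
    """Mean AP over the four experiment groups (Eq. 35)."""
    return sum(aps) / len(aps)
```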
The grasping prioritization evaluation is mainly used for the automatic sorting of fruits by robots. To facilitate grasping by the robot, a stepping motor is adopted, and the conveyor belt is halted when the stacked fruit clusters reach the evaluation position. After evaluation and sorting are completed, the conveyor belt moves on to the next process. During evaluation and sorting, the robot performs fast grasp-and-place operations [32,33], arranged as shown in Figure 12. To reduce the vibration and berry dropping that may occur in high-speed robotic sorting [34], the peak acceleration of the clamping mechanism was set to 6 m/s². According to the length of the fruit clusters and the size of the workspace, the distances $S_{AB}$, $S_{BC}$ and $S_{CD}$ were set to 0.4, 1 and 0.4 m, respectively. According to Equation (36), based on the 3–4–5 polynomial motion profile, the total time for the clamping mechanism to bring a fruit cluster from A to D was $T_{total} = 2.22$ s.
$T = \sqrt{5.7735 S / a_{\max}}$ (36)
where $a_{\max}$ is the peak acceleration of the clamping mechanism, and $S$ is the distance of the segment.
The system can start the grasping prioritization evaluation when the clamping mechanism grasps the last fruit cluster and arrives at B, and must finish before the clamping mechanism places the last fruit cluster at D and returns to C. The evaluation time must therefore satisfy Equation (37).
$T_i \le T_{BC} + T_{CD} + T_d + T_{DC} = T_{total} + T_d$ (37)
where $T_i$ is the grasping prioritization evaluation time; $T_{BC}$, $T_{CD}$ and $T_{DC}$ are the times for the clamping mechanism to traverse BC, CD and DC; and $T_d$ is the time for the clamping mechanism to place a fruit cluster. Therefore, the accuracy of the grasping prioritization evaluation matters more than its speed.
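A numeric check of the timing budget; assuming the square-root form of Equation (36) (peak acceleration of a 3–4–5 polynomial profile is $5.7735 S / T^2$), the stated 2.22 s total is reproduced:

```python
import math

def segment_time(S, a_max=6.0):
    """Segment travel time under a 3-4-5 polynomial profile: its peak
    acceleration is a_max = 5.7735*S/T**2, so T = sqrt(5.7735*S/a_max).
    a_max defaults to the 6 m/s^2 set in the experiments."""
    return math.sqrt(5.7735 * S / a_max)

# Total A->D time over S_AB = 0.4 m, S_BC = 1 m, S_CD = 0.4 m:
T_total = segment_time(0.4) + segment_time(1.0) + segment_time(0.4)
```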
Table 6 shows that although the evaluation time $T_{ii}$ of the improved FCE is longer than the time $T_{ei}$ of the existing FCE, both are less than $T_{total} + T_d$ and thus meet the time requirements of on-line sorting on the conveyor belt. The existing FCE was more sensitive to stalk features than to the other factors and was least sensitive to the change difficulties of the robot pose, which made it difficult to improve the precision of grasping prioritization with multiple evaluation factors. The improved FCE achieved high average precision in all four groups of experiments: compared with the existing FCE, its average precisions $AP_i$ increased by 26.07%, 29.09%, 21.50% and 34.52%, respectively, and the mean average precision $\overline{AP}$ increased by 27.79%. The improved FCE thus effectively solves the problem that multiple factors with unclear boundaries are difficult to analyze quantitatively and comprehensively, and it improves the grasping prioritization precision for stacked fruit clusters.

4. Discussion and Conclusions

Multiple factors affect the evaluation of grasping prioritization, such as the cluster and stalk features of the stacked fruit clusters and the grasping position constraints and pose change difficulties of a low-DOF parallel robot with motion constraints. Because the boundaries among these evaluation factors are unclear and their weights on grasping prioritization differ, a precise comprehensive evaluation of the grasping prioritization of stacked fruit clusters is difficult. This paper therefore proposes a fuzzy comprehensive evaluation method for the grasping prioritization of stacked fruit clusters based on a relative hierarchy factor set, with the aim of further improving the grasping success of low-DOF parallel robots. The following conclusions can be drawn:
(1)
According to the multiple factors affecting the evaluation results, a hierarchical tree model without a cross based on a subtree structure is constructed. This resolves the difficulty, in the existing FCE, of constructing a model and calculating memberships for multiple evaluation factors with unclear boundaries.
(2)
According to the relationships between the factors in each image and the positive or negative effect of each factor on the evaluation results, a relative factor set with positive and negative effects is constructed. This resolves two problems: the absolute factor values for the same grasping prioritization are inconsistent across images, and different evaluation factors carry different influence weights on grasping prioritization.
(3)
The average random consistency index and consistency satisfaction value are constructed based on mathematical expectations. This resolves the problem that the experiential data used in the existing FCE make it difficult to construct and verify an accurate and effective comparison matrix for grasping prioritization. In addition, based on the analytic hierarchy process and the subtree structure, the weight vector of the evaluation factors is constructed from the top to the bottom of the model, and the membership matrix and comprehensive evaluation values of the multi-layer factors are calculated from the bottom to the top of the model.
(4)
The improved FCE was applied to evaluate the grasping prioritization of stacked fruit clusters in a self-developed parallel robot sorting system. The experimental results demonstrate that, compared with the existing FCE, the average precision of the grasping prioritization of stacked fruit clusters based on the proposed method increased by 27.77%. The improved FCE based on the relative hierarchy factor set can effectively improve the prioritization precision for grasping randomly stacked fruit clusters affected by multiple factors and can further realize accurate automatic sorting.
Based on factors affecting grasping prioritization obtained from RGB images, this paper constructs a three-layer hierarchical tree model to evaluate the grasping prioritization of stacked fruit clusters. In future work, the proposed method can be extended to factors obtained from infrared images, multispectral images and laser data, laying a foundation for the fuzzy comprehensive evaluation of grasping prioritization based on multiple sensors.

Author Contributions

Conceptualization, Q.Z.; methodology, Q.Z.; software, Q.Z.; validation, Q.Z.; formal analysis, Q.Z.; investigation, Q.Z.; resources, Q.Z.; data curation, Q.Z.; writing—original draft preparation, Q.Z.; writing—review and editing, Q.Z., Z.Z. and G.G.; visualization, Q.Z.; supervision, Q.Z.; project administration, Q.Z.; funding acquisition, Q.Z. and Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by a Project Funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions, grant number PAPD-2018-87; Innovation and Entrepreneurship Program of Jiangsu Province, grant number JSSCBS20210974; and Scientific Research Start-up Fund for Senior Talents of Jiangsu University, grant number 4111140011.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shi, H.; Wang, Q.; Gu, W.; Wang, X.; Gao, S. Non-destructive firmness detection and grading of bunches of red globe grapes based on machine vision. Food Sci. 2021, 42, 232–239. [Google Scholar] [CrossRef]
  2. Schouterden, G.; Verbiest, R.; Demeester, E.; Kellens, K. Robotic cultivation of pome fruit: A benchmark study of manipulation tools-from research to industrial standards. Agronomy 2021, 11, 1922. [Google Scholar] [CrossRef]
  3. Li, Y.; Shang, D.; Fan, X.; Liu, Y. Motion reliability analysis of the delta parallel robot considering mechanism errors. Math. Probl. Eng. 2019, 2019, 3501921. [Google Scholar] [CrossRef]
  4. Sun, F.; Liu, C.; Huang, W.; Zhang, J. Object classification and grasp planning using visual and tactile sensing. IEEE Trans. Syst. Man Cybern. Syst. 2017, 46, 969–979. [Google Scholar] [CrossRef]
  5. Luo, L.; Tang, Y.; Zou, X.; Ye, M.; Feng, W.; Li, G. Vision-based extraction of spatial information in grape clusters for harvesting robots. Biosyst. Eng. 2016, 151, 90–104. [Google Scholar] [CrossRef]
  6. Liu, X.; Jia, W.; Ruan, C.; Zhao, D.; Gu, Y.; Chen, W. The recognition of apple fruits in plastic bags based on block classification. Precis. Agric. 2018, 19, 735–749. [Google Scholar] [CrossRef]
  7. Li, J.; Tang, Y.; Zou, X.; Lin, G.; Wang, H. Detection of fruit-bearing branches and localization of litchi clusters for vision-based harvesting robots. IEEE Access 2020, 8, 117746–117758. [Google Scholar] [CrossRef]
  8. Zhang, Q.; Gao, G. Grasping point detection of randomly placed fruit cluster using adaptive morphology segmentation and principal component classification of multiple features. IEEE Access 2019, 7, 158035–158050. [Google Scholar] [CrossRef]
  9. Zhang, Q.; Gao, G. Prioritizing robotic grasping of stacked fruit clusters based on stalk location in RGB-D images. Comput. Electron. Agric. 2020, 172, 105359. [Google Scholar] [CrossRef]
  10. Li, C.; Ma, Y.; Wang, S.; Qiao, F. Novel industrial robot sorting technology based on machine vision. In Proceedings of the 9th International Conference on Modelling, Identification and Control (ICMIC), Kunming, China, 10–12 July 2017. [Google Scholar] [CrossRef]
  11. Xiang, S.; Gao, H.; Liu, Z.; Gosselin, C. Dynamic transition trajectory planning of three-DOF cable-suspended parallel robots via linear time-varying MPC. Mech. Mach. Theory 2020, 146, 103715. [Google Scholar] [CrossRef]
  12. Zhang, C.; Zhao, C.; Huang, W.; Wang, Q.; Liu, S.; Li, J.; Guo, Z. Automatic detection of defective apples using nir coded structured light and fast lightness correction. J. Food Eng. 2017, 203, 69–82. [Google Scholar] [CrossRef]
  13. Wang, H.; Hu, R.; Zhang, M.; Zhai, Z.; Zhang, R. Identification of tomatoes with early decay using visible and near infrared hyperspectral imaging and image-spectrum merging technique. J. Food Process Eng. 2021, 44, e13654. [Google Scholar] [CrossRef]
  14. Kitaaki, Y.; Haraguchi, R.; Shiratsuchi, K.; Domae, Y.; Okuda, H.; Noda, A.; Sumi, K.; Fukuda, T.; Kaneko, S.I.; Matsuno, T. A robotic assembly system capable of handling flexible cables with connector. In Proceedings of the 2011 IEEE International Conference on Mechatronics and Automation, Beijing, China, 7–10 August 2011. [Google Scholar] [CrossRef]
  15. Jabalameli, A.; Behal, A. From single 2D depth image to gripper 6D pose estimation: A fast and robust algorithm for grabbing objects in cluttered scenes. Robotics 2019, 8, 63. [Google Scholar] [CrossRef] [Green Version]
  16. Domae, Y.; Okuda, H.; Taguchi, Y.; Sumi, K.; Hirai, T. Fast graspability evaluation on single depth maps for bin picking with general grippers. In Proceedings of the IEEE International Conference on Robotics & Automation, Hong Kong, China, 31 May–7 June 2014. [Google Scholar] [CrossRef] [Green Version]
  17. Maehara, K.; Miyagawa, H. Object Grasping Control Method and Apparatus. US Patent US8788095B2, 22 July 2014. [Google Scholar]
  18. Wang, W.; Li, R.; Diekel, Z.M.; Jia, Y. Robot action planning by online optimization in human–robot collaborative tasks. Int. J. Intell. Robot. Appl. 2018, 2, 161–179. [Google Scholar] [CrossRef]
  19. Zhang, H.; Liang, H.; Ni, T.; Huang, L.; Yang, J. Research on multi-object sorting system based on deep learning. Sensors 2021, 21, 6238. [Google Scholar] [CrossRef]
  20. Zhang, H.; Lan, X.; Zhou, X.; Tian, Z.; Zheng, Y.; Zheng, N. Visual manipulation relationship network for autonomous robotics. In Proceedings of the IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), Beijing, China, 6–9 November 2018. [Google Scholar] [CrossRef]
  21. Kong, C.; Wang, S.; Wang, Y.; Gong, C.; Lu, H. Application of AHP-FCA modeling in visual guided manipulator. In Proceedings of the 2017 2nd International Conference on Robotics and Automation Engineering (ICRAE), Shanghai, China, 29–31 December 2017. [Google Scholar] [CrossRef]
  22. Zhou, R.; Song, H.; Li, J. Research on settlement particle recognition based on fuzzy comprehensive evaluation method. EURASIP J. Image Video Process. 2018, 108, 1–15. [Google Scholar] [CrossRef]
  23. Xu, X.; Nie, C.; Jin, X.; Li, Z.; Zhu, H.; Xu, H.; Wang, J.; Zhao, Y.; Feng, H. A comprehensive yield evaluation indicator based on an improved fuzzy comprehensive evaluation method and hyperspectral data. Field Crops Res. 2021, 270, 108204. [Google Scholar] [CrossRef]
  24. Sen, P.; Roy, M.; Pal, P. Evaluation of environmentally conscious manufacturing programs using a three-hybrid multi-criteria decision analysis method. Ecol. Indic. 2017, 73, 264–273. [Google Scholar] [CrossRef]
  25. Gao, G.Q.; Zhang, Q.; Zhang, S. Pose detection of parallel robot based on improved RANSAC algorithm. Meas. Control. 2019, 52, 855–868. [Google Scholar] [CrossRef] [Green Version]
  26. Tang, Y.; Li, L.; Wang, C.; Chen, M.; Feng, W.; Zou, X.; Huang, K. Real-time detection of surface deformation and strain in recycled aggregate concrete-filled steel tubular columns via four-ocular vision. Robot. Comput.-Integr. Manuf. 2019, 59, 34–46. [Google Scholar] [CrossRef]
  27. Yan, X.; Jin, R.; Weng, G. Active contours driven by order-statistic filtering and coherence-enhancing diffusion filter for fast image segmentation. J. Electron. Imaging 2020, 29, 023012. [Google Scholar] [CrossRef]
  28. Ruan, C.; Zhao, D.; Jia, W.; Chen, Y. A New Image Denoising Method by Combining WT with ICA. Math. Probl. Eng. 2015, 2015, 582640. [Google Scholar] [CrossRef]
  29. Rabatel, G.; Guizard, C. Grape berry calibration by computer vision using elliptical model fitting. In Proceedings of the 6th European Conference on Precision Agriculture, Skiathos, Greece, 6 June 2007; Available online: https://hal.archives-ouvertes.fr/hal-00468536 (accessed on 20 January 2022).
  30. Chen, J.; Li, R.; Mo, P.; Zhou, G.; Cai, S.; Chen, D. A modified method for morphology quantification and generation of 2D granular particles. Granul. Matter 2021, 24, 16. [Google Scholar] [CrossRef]
  31. Zhang, Q.; Gao, G. Grasping model and pose calculation of parallel robot for fruit cluster. Trans. Chin. Soc. Agric. Eng. 2019, 35, 37–47. [Google Scholar] [CrossRef]
  32. Blanes, C.; Mellado, M.; Ortiz, C.; Valera, A. Review. Technologies for robot grippers in pick and place operations for fresh fruits and vegetables. Span. J. Agric. Res. 2011, 9, 1130–1141. [Google Scholar] [CrossRef]
  33. Mo, J.; Shao, Z.; Guan, L.; Xie, F.; Tang, X. Dynamic performance analysis of the x4 high-speed pick-and-place parallel robot. Robot. Comput.-Integr. Manuf. 2017, 46, 48–57. [Google Scholar] [CrossRef]
  34. Liu, J.; Tang, S.; Shan, S.; Ju, J.; Zhu, X. Simulation and test of grapefruit cluster vibration for robotic harvesting. Trans. Chin. Soc. Agric. Mach. 2016, 47, 1–8. [Google Scholar] [CrossRef]
Figure 1. Existing AHP.
Figure 2. Hierarchical tree model without a cross.
Figure 3. Prioritizing robotic grasping for stacked fruit clusters based on improved FCE. e i j represents the j th factor of the i th layer in the hierarchical tree model.
Figure 4. Fruit cluster freshness. (a) Color images; (b) Q images.
Figure 5. Gradient directions of fruit cluster contours. (a) Stalk with node; (b) stalk without node.
Figure 6. Segmentation point constraints of the cluster region.
Figure 7. Prioritization model for grasping stacked fruit clusters.
Figure 8. Factor calculation. (a) Grasping position constraints; (b) cluster occlusion; (c) cluster compactness.
Figure 9. Stalks. (a) Stalk with node; (b) stalk without node.
Figure 10. Prototype of automatic sorting system for stacked fruit clusters based on a 4−R(2−SS) parallel robot and machine vision. (1) Display screen; (2) control cabinet; (3) detection cabinet; (4) parallel robot; (5) light source; (6) fruit cluster; (7) left camera; (8) right camera.
Figure 11. Original images for experiments in the four groups.
Figure 12. Path planning.
Table 1. Experimental data with main differences of grasping position constraints.
(Each row gives the absolute value and the relative value of each factor for one of the five clusters in the image; the 16 factors are split into three column blocks over the same five clusters.)

| e21 (mm) | e21 (rel) | e22 (mm) | e22 (rel) | e23 (mm) | e23 (rel) | e24 (%) | e24 (rel) | e25 (pixel) | e25 (rel) | e26 (g) |
|---|---|---|---|---|---|---|---|---|---|---|
| −21.13 | 0.129 | 105.70 | 1.000 | 14.02 | 0.366 | 54.36 | 0.000 | 2.01 | −0.101 | 635.55 |
| −42.47 | 0.000 | −7.47 | 0.000 | 31.34 | 0.630 | 63.55 | 0.591 | 1.99 | −0.076 | 735.90 |
| 64.18 | 0.642 | 55.36 | 0.555 | 55.59 | 1.000 | 69.90 | 1.000 | 2.72 | −1.000 | 479.45 |
| 75.96 | 0.713 | 78.94 | 0.764 | 41.28 | 0.782 | 61.18 | 0.439 | 2.34 | −0.519 | 568.65 |
| 123.60 | 1.000 | 23.89 | 0.277 | −10.01 | 0.000 | 56.37 | 0.129 | 1.93 | 0.000 | 769.35 |

| e26 (rel) | e27 (pixel) | e27 (rel) | e28 | e28 (rel) | e29 (mm) | e29 (rel) | x210 (mm) | x210 (rel) | y210 (mm) | y210 (rel) |
|---|---|---|---|---|---|---|---|---|---|---|
| −0.538 | 89.15 | 0.000 | 1.00 | 0.000 | 68.70 | 0.789 | −0.97 | 0.000 | −0.24 | −0.243 |
| −0.885 | 93.33 | −0.413 | 1.00 | 0.000 | 69.59 | 0.812 | −0.89 | −0.042 | 0.40 | −0.673 |
| 0.000 | 90.84 | −0.167 | 2.00 | −0.500 | 38.52 | 0.000 | 0.92 | −1.000 | −0.36 | −0.163 |
| −0.308 | 99.26 | −1.000 | 3.00 | −1.000 | 76.77 | 1.000 | 0.79 | −0.932 | −0.60 | 0.000 |
| −1.000 | 89.50 | −0.035 | 1.00 | 0.000 | 61.24 | 0.594 | −0.36 | −0.321 | 0.89 | −1.000 |

| z210 (mm) | z210 (rel) | x211 (mm) | x211 (rel) | y211 (mm) | y211 (rel) | z211 (mm) | z211 (rel) | θ211 (°) | θ211 (rel) |
|---|---|---|---|---|---|---|---|---|---|
| −0.08 | −1.000 | −357.25 | 0.000 | 24.65 | −0.786 | −125.87 | 0.000 | 89.56 | −0.781 |
| −0.22 | −0.272 | 235.64 | −1.000 | 118.13 | −1.000 | 489.53 | −1.000 | 155.78 | −1.000 |
| −0.18 | −0.460 | 145.20 | −0.847 | −318.90 | 0.000 | 337.20 | −0.752 | −99.45 | −0.156 |
| −0.15 | −0.650 | −65.73 | −0.492 | −44.82 | −0.627 | 246.81 | −0.606 | −146.80 | 0.000 |
| −0.27 | 0.000 | −212.61 | −0.244 | 57.88 | −0.862 | −33.76 | −0.150 | 66.84 | −0.706 |
Table 2. Experimental data with main differences of fruit cluster features.
| e21 (mm) | e21 (rel) | e22 (mm) | e22 (rel) | e23 (mm) | e23 (rel) | e24 (%) | e24 (rel) | e25 (pixel) | e25 (rel) | e26 (g) |
|---|---|---|---|---|---|---|---|---|---|---|
| 31.99 | 0.381 | 21.15 | 0.073 | 21.82 | 0.355 | 77.28 | 0.722 | 2.58 | −0.374 | 401.40 |
| 43.13 | 0.574 | 15.83 | 0.000 | 16.18 | 0.153 | 86.94 | 1.000 | 3.77 | −1.000 | 869.70 |
| 67.71 | 1.000 | 47.73 | 0.440 | 29.01 | 0.614 | 52.24 | 0.000 | 2.21 | −0.179 | 713.60 |
| 10.04 | 0.000 | 88.40 | 1.000 | 39.77 | 1.000 | 69.15 | 0.487 | 1.87 | 0.000 | 535.20 |
| 37.70 | 0.480 | 26.40 | 0.146 | 11.93 | 0.000 | 72.30 | 0.578 | 3.02 | −0.605 | 613.25 |

| e26 (rel) | e27 (pixel) | e27 (rel) | e28 | e28 (rel) | e29 (mm) | e29 (rel) | x210 (mm) | x210 (rel) | y210 (mm) | y210 (rel) |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.000 | 89.81 | 0.000 | 3.00 | −1.000 | 84.53 | 1.000 | 0.71 | −0.939 | −0.70 | −0.008 |
| −1.000 | 108.23 | −0.460 | 2.00 | −0.500 | 38.67 | 0.000 | 0.57 | −0.850 | 0.79 | −1.000 |
| −0.667 | 129.86 | −1.000 | 2.00 | −0.500 | 46.87 | 0.179 | −0.69 | −0.037 | −0.71 | 0.000 |
| −0.286 | 100.02 | −0.255 | 1.00 | 0.000 | 79.54 | 0.891 | −0.75 | 0.000 | −0.64 | −0.045 |
| −0.452 | 115.50 | −0.641 | 1.00 | 0.000 | 67.77 | 0.635 | 0.80 | −1.000 | 0.58 | −0.860 |

| z210 (mm) | z210 (rel) | x211 (mm) | x211 (rel) | y211 (mm) | y211 (rel) | z211 (mm) | z211 (rel) | θ211 (°) | θ211 (rel) |
|---|---|---|---|---|---|---|---|---|---|
| −0.07 | −1.000 | 15.64 | −0.411 | 73.60 | −0.702 | 108.95 | −1.000 | 105.67 | −1.000 |
| −0.23 | 0.000 | −112.97 | −0.182 | 268.94 | −1.000 | −145.78 | 0.000 | 15.31 | −0.671 |
| −0.13 | −0.663 | −215.40 | 0.000 | −146.32 | −0.366 | −78.25 | −0.265 | −89.76 | −0.288 |
| −0.16 | −0.426 | 347.38 | −1.000 | −386.10 | 0.000 | 14.55 | −0.629 | −168.90 | 0.000 |
| −0.14 | −0.569 | −46.80 | −0.300 | 33.65 | −0.641 | −88.45 | −0.225 | 57.25 | −0.824 |
Table 3. Experimental data with main differences of stalk features.

Grasping position constraints on the X-, Y-, and Z-axes (e21–e23), cluster occlusion (e24), cluster compactness (e25), and cluster weight (e26):

| e21 (mm) | e21 (norm.) | e22 (mm) | e22 (norm.) | e23 (mm) | e23 (norm.) | e24 (%) | e24 (norm.) | e25 (pixel) | e25 (norm.) | e26 (g) |
|---|---|---|---|---|---|---|---|---|---|---|
| 26.84 | 0.132 | 53.81 | 0.632 | −3.23 | 0.000 | 54.85 | 0.000 | 2.23 | −0.333 | 691.30 |
| 15.81 | 0.000 | 43.21 | 0.524 | 12.83 | 0.187 | 59.57 | 0.364 | 2.75 | −0.778 | 657.85 |
| 99.58 | 1.000 | −8.17 | 0.000 | 43.65 | 0.546 | 63.52 | 0.668 | 1.84 | 0.000 | 524.05 |
| 88.46 | 0.867 | 89.90 | 1.000 | 82.68 | 1.000 | 54.91 | 0.005 | 3.01 | −1.000 | 747.05 |
| 68.85 | 0.633 | 42.30 | 0.515 | 36.29 | 0.460 | 67.83 | 1.000 | 1.91 | −0.060 | 635.55 |

Cluster weight (e26, cont.), cluster freshness (e27), stalk category (e28), stalk length (e29), and stalk direction (x210, y210):

| e26 (norm.) | e27 (pixel) | e27 (norm.) | e28 | e28 (norm.) | e29 (mm) | e29 (norm.) | x210 (mm) | x210 (norm.) | y210 (mm) | y210 (norm.) |
|---|---|---|---|---|---|---|---|---|---|---|
| −0.750 | 101.50 | −0.752 | 3.00 | −0.400 | 87.41 | 1.000 | 0.64 | −0.466 | 0.76 | −1.000 |
| −0.600 | 94.29 | −0.350 | 6.00 | −1.000 | 47.07 | 0.232 | 0.43 | 0.000 | −0.85 | 0.000 |
| 0.000 | 88.01 | 0.000 | 2.00 | −0.200 | 39.26 | 0.084 | 0.70 | −0.594 | 0.65 | −0.932 |
| −1.000 | 91.36 | −0.187 | 1.00 | 0.000 | 56.72 | 0.416 | 0.88 | −1.000 | −0.46 | −0.242 |
| −0.500 | 105.94 | −1.000 | 4.00 | −0.600 | 34.86 | 0.000 | 0.88 | −1.000 | −0.44 | −0.256 |

Stalk direction (z210, cont.) and change difficulties of robot pose (x211, y211, z211, θ211):

| z210 (mm) | z210 (norm.) | x211 (mm) | x211 (norm.) | y211 (mm) | y211 (norm.) | z211 (mm) | z211 (norm.) | θ211 (°) | θ211 (norm.) |
|---|---|---|---|---|---|---|---|---|---|
| −0.09 | −1.000 | −99.65 | −0.070 | 278.42 | −1.000 | −138.50 | 0.000 | −78.24 | 0.000 |
| −0.31 | 0.000 | −10.31 | −0.240 | 55.84 | −0.585 | 98.75 | −0.459 | −55.33 | −0.124 |
| −0.30 | −0.048 | 55.80 | −0.366 | −15.76 | −0.451 | 378.91 | −1.000 | −26.70 | −0.280 |
| −0.15 | −0.724 | −136.20 | 0.000 | 44.85 | −0.564 | 307.34 | −0.862 | 94.36 | −0.938 |
| −0.19 | −0.530 | 388.50 | −1.000 | −257.90 | 0.000 | 255.90 | −0.762 | 105.80 | −1.000 |
Table 4. Experimental data with main differences of change difficulties of robot pose.

Grasping position constraints on the X-, Y-, and Z-axes (e21–e23), cluster occlusion (e24), cluster compactness (e25), and cluster weight (e26):

| e21 (mm) | e21 (norm.) | e22 (mm) | e22 (norm.) | e23 (mm) | e23 (norm.) | e24 (%) | e24 (norm.) | e25 (pixel) | e25 (norm.) | e26 (g) |
|---|---|---|---|---|---|---|---|---|---|---|
| 22.48 | 0.000 | 26.78 | 0.379 | −7.42 | 0.000 | 61.22 | 0.369 | 1.83 | 0.000 | 535.20 |
| 63.56 | 0.846 | 18.49 | 0.269 | 3.19 | 0.154 | 70.85 | 1.000 | 1.93 | −0.105 | 713.60 |
| 52.16 | 0.611 | 57.89 | 0.793 | 61.27 | 1.000 | 64.03 | 0.553 | 2.78 | −1.000 | 680.15 |
| 71.04 | 1.000 | −1.73 | 0.000 | 37.76 | 0.658 | 58.30 | 0.178 | 2.04 | −0.221 | 602.10 |
| 45.17 | 0.467 | 73.50 | 1.000 | 30.11 | 0.546 | 55.59 | 0.000 | 2.61 | −0.821 | 434.85 |

Cluster weight (e26, cont.), cluster freshness (e27), stalk category (e28), stalk length (e29), and stalk direction (x210, y210):

| e26 (norm.) | e27 (pixel) | e27 (norm.) | e28 | e28 (norm.) | e29 (mm) | e29 (norm.) | x210 (mm) | x210 (norm.) | y210 (mm) | y210 (norm.) |
|---|---|---|---|---|---|---|---|---|---|---|
| −0.360 | 100.27 | −0.782 | 1.00 | 0.000 | 65.83 | 0.601 | 0.57 | −0.912 | −0.81 | 0.000 |
| −1.000 | 99.02 | −0.716 | 2.00 | −0.500 | 39.95 | 0.000 | −0.68 | −0.126 | −0.63 | −0.136 |
| −0.880 | 104.37 | −1.000 | 2.00 | −0.500 | 43.77 | 0.089 | −0.88 | 0.000 | −0.25 | −0.424 |
| −0.600 | 85.54 | 0.000 | 1.00 | 0.000 | 71.40 | 0.731 | 0.71 | −1.000 | −0.69 | −0.089 |
| 0.000 | 92.33 | −0.361 | 3.00 | −1.000 | 82.99 | 1.000 | −0.86 | −0.014 | 0.49 | −1.000 |

Stalk direction (z210, cont.) and change difficulties of robot pose (x211, y211, z211, θ211):

| z210 (mm) | z210 (norm.) | x211 (mm) | x211 (norm.) | y211 (mm) | y211 (norm.) | z211 (mm) | z211 (norm.) | θ211 (°) | θ211 (norm.) |
|---|---|---|---|---|---|---|---|---|---|
| −0.15 | −0.886 | 367.25 | −1.000 | −157.98 | −0.423 | 507.45 | −1.000 | −33.76 | −0.386 |
| −0.38 | −0.078 | −57.94 | −0.436 | −355.60 | 0.000 | −167.89 | 0.000 | 64.91 | −0.794 |
| −0.40 | 0.000 | −387.30 | 0.000 | 98.67 | −0.973 | −33.68 | −0.199 | 114.60 | −1.000 |
| −0.12 | −1.000 | 218.60 | −0.803 | 102.74 | −0.981 | 336.30 | −0.747 | −126.87 | 0.000 |
| −0.14 | −0.935 | 14.70 | −0.533 | 111.45 | −1.000 | 77.89 | −0.364 | −88.35 | −0.160 |
Table 5. Differences of the four group experiments.

| Groups | RP21 | RP22 | RP23 | RP24 | RP25 | RP26 | RP27 | RP28 | RP29 |
|---|---|---|---|---|---|---|---|---|---|
| Group 1 | 1.344 | 1.071 | 1.180 | 0.222 | 0.290 | 0.377 | 0.102 | 0.667 | 0.498 |
| Group 2 | 0.852 | 0.821 | 0.700 | 0.399 | 0.504 | 0.538 | 0.308 | 0.667 | 0.543 |
| Group 3 | 0.841 | 0.821 | 1.039 | 0.191 | 0.389 | 0.299 | 0.169 | 0.833 | 0.601 |
| Group 4 | 0.684 | 1.024 | 1.121 | 0.215 | 0.342 | 0.391 | 0.180 | 0.667 | 0.519 |

RP210 components (x, y, z) with their mean, and RP211 components (x, y, z, θ) with their mean:

| Groups | RPx210 | RPy210 | RPz210 | mean RP210 | RPx211 | RPy211 | RPz211 | RPθ211 | mean RP211 |
|---|---|---|---|---|---|---|---|---|---|
| Group 1 | 2.057 | 1.672 | 2.511 | 2.080 | 2.516 | 3.700 | 1.257 | 1.942 | 2.354 |
| Group 2 | 1.928 | 1.905 | 2.135 | 1.990 | 1.620 | 2.436 | 2.338 | 2.598 | 2.248 |
| Group 3 | 0.509 | 2.112 | 2.406 | 1.676 | 1.351 | 1.926 | 1.366 | 1.740 | 1.595 |
| Group 4 | 2.234 | 2.631 | 2.382 | 2.416 | 2.055 | 4.191 | 1.331 | 2.107 | 2.421 |
Table 6. Experimental results of grasping prioritization.

| Methods | Statistic | Group 1 | Group 2 | Group 3 | Group 4 | mean AP (%) |
|---|---|---|---|---|---|---|
| -- | Total number of fruit clusters | 652 | 385 | 600 | 420 | -- |
| -- | Average number of fruit clusters in a single image | 8.15 | 4.81 | 7.50 | 5.25 | -- |
| Existing FCE | COi | 458 | 256 | 452 | 258 | -- |
| | ERi | 194 | 129 | 148 | 162 | -- |
| | APi (%) | 70.25 | 66.49 | 75.33 | 61.43 | 68.38 |
| | Tei | 0.869 | 0.552 | 0.809 | 0.601 | -- |
| Improved FCE | COi | 628 | 368 | 581 | 403 | -- |
| | ERi | 24 | 17 | 19 | 17 | -- |
| | APi (%) | 96.32 | 95.58 | 96.83 | 95.95 | 96.17 |
| | Tii | 1.352 | 0.815 | 1.311 | 0.906 | -- |
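The AP rows in Table 6 are consistent with per-group precision APi = COi / (COi + ERi), where COi and ERi are the correct and erroneous prioritizations, and the final column being the mean over the four groups. A small check under that assumption (the function name is illustrative):

```python
def precision_pct(correct, errors):
    """Grasping-prioritization precision for one group, as a percentage:
    correct evaluations over all evaluated fruit clusters."""
    return 100.0 * correct / (correct + errors)

# (CO_i, ER_i) pairs for the improved FCE, read from Table 6.
improved = [(628, 24), (368, 17), (581, 19), (403, 17)]
ap = [precision_pct(co, er) for co, er in improved]  # 96.32, 95.58, 96.83, 95.95
mean_ap = sum(ap) / len(ap)                          # 96.17, as tabulated
```

Running the same computation on the existing-FCE rows reproduces the tabulated 68.38% mean, so the reported 27.77-point gap follows directly from the counts.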

Zhang, Q.; Zhao, Z.; Gao, G. Fuzzy Comprehensive Evaluation for Grasping Prioritization of Stacked Fruit Clusters Based on Relative Hierarchy Factor Set. Agronomy 2022, 12, 663. https://doi.org/10.3390/agronomy12030663
