Design of a Virtual Multi-Interaction Operation System for Hand–Eye Coordination of Grape Harvesting Robots
Abstract
1. Introduction
2. Material and Method
2.1. Overall Scheme
- (1) Recognition and positioning of targets by vision in a virtual environment: the scene must faithfully reproduce the targets rather than being an abstract virtual creation. It must reflect the geometric size and spatial distribution of the fruit, describe how the trellis covers them, and allow the color and distribution of the fruit to change with the seasons. Target recognition algorithms run autonomously to discover targets and report their coordinates in the virtual environment.
- (2) Hand–eye coordination requires visual interaction between the robot and individual targets. Based on visual feedback of the target information, the robot completes motion planning and is driven to harvest the fruit one by one. Motion planning should be autonomous, rather than manually feeding camera recognition results into the simulation environment and then initiating the visual servo.
- (3) Conducting such simulations in a cross-platform virtual simulation system enables measurements during the run. Data such as robot collisions, arm motion poses, visual recognition results, harvesting effect, and system time consumption are collected and analyzed for optimization.
- (1) A virtual simulation platform is built around V-REP. It contains the harvesting robot model and the virtual vineyard scene, which recreate the environment in which harvesting is simulated, together with vision sensors that acquire images. In V-REP, the robot arm accepts control information from MATLAB for motion simulation and collision detection [11] (Section 2.2).
- (2) To realize communication between V-REP, Python, and MATLAB, data interfaces are created between the modules. OpenCV is accessed through the V-REP remote API, while V-REP and MATLAB exchange sensor data and robot control information over a bidirectional communication interface (Section 2.3).
- (3) OpenCV serves as the image processing system, receiving images from the vision sensors in the simulation platform. The OpenCV computer vision library and Python are used to identify and locate targets in the images. The picking-point information is extracted, and the resulting 3D coordinates are sent to the MATLAB simulation module via UDP (a sketch of this exchange follows the list) (Section 2.4).
- (4) The D-H method is used for inverse kinematics analysis. The kinematic model of the robotic arm is built in MATLAB, and the simulation control module is built in Simulink. The generated data are sent to V-REP so that the robotic arm model responds in real time; in this way, coordinated control of grape recognition and robotic-arm grasping is realized [19] (Section 2.4).
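To make the UDP hand-off in item (3) concrete, here is a minimal sketch, not the authors' code: one picking point is packed as three little-endian float64 values and sent to the MATLAB side. The host, port, and packet layout are illustrative assumptions.

```python
import socket
import struct

# Hypothetical endpoint of the MATLAB/Simulink UDP receiver; the actual
# host and port used by the authors are not specified in this section.
MATLAB_ADDR = ("127.0.0.1", 25000)

def send_picking_point(x: float, y: float, z: float) -> None:
    """Send one 3D picking point as three little-endian float64 values."""
    payload = struct.pack("<3d", x, y, z)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, MATLAB_ADDR)

# Example: a picking point at (0.42, -0.10, 1.35) m in the robot frame.
send_picking_point(0.42, -0.10, 1.35)
```

On the MATLAB side, a "UDP Receive" Simulink block (or the udpport interface in a script) configured for double data would decode the same three values.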
2.2. Construction of Virtual Simulation Platform
2.2.1. Design and Modeling of Picking Environment
2.2.2. Modeling of Harvesting Robots
- (1) Setting the properties of the robot arm model. The key geometric features of the robotic arm model are extracted and set as convex hulls, and motion joints are added. By setting the dynamic properties of each joint and adjusting the inheritance relationships, each joint is constrained to its parent joint [21]; in this way, the hierarchical tree of the robot arm joint model is constructed. Finally, drives, masses, and moments of inertia are added to the joints, and the rotation speeds are set according to the actual prototype (a sketch of commanding these joints follows the list).
- (2) Setting the end-effector. The end-effector and the UR5 robot arm are fixed together by a “force sensor”.
- (3) Setting up the robot chassis. The crawler-type chassis is taken from the model library that ships with V-REP and dragged into the scene. By adding a mechanical “force sensor”, the arm and the chassis, both of which have dynamic properties, are fixed and connected as a whole.
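As referenced in item (1), joints configured with drives and speed limits can be commanded from outside the simulator. The following is a minimal sketch, assuming the legacy V-REP remote API Python bindings (the sim.py module shipped with V-REP) and joint names that match the scene hierarchy; both names are assumptions.

```python
import math
import sim  # legacy V-REP remote API bindings (sim.py + remoteApi library)

client_id = sim.simxStart("127.0.0.1", 19999, True, True, 5000, 5)
assert client_id != -1, "could not connect to V-REP"

# Hypothetical joint names; they must match the names in the V-REP scene.
handles = []
for name in [f"UR5_joint{i}" for i in range(1, 7)]:
    _, h = sim.simxGetObjectHandle(client_id, name, sim.simx_opmode_blocking)
    handles.append(h)

# Command each joint toward a target angle (rad); the maximum velocity
# set during modeling limits how fast the joint actually moves.
targets = [0.0, -math.pi / 4, math.pi / 3, 0.0, math.pi / 2, 0.0]
for h, angle in zip(handles, targets):
    sim.simxSetJointTargetPosition(client_id, h, angle, sim.simx_opmode_oneshot)

sim.simxFinish(client_id)
```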
2.2.3. Configuration of Vision Sensors
2.2.4. Control Scripting
2.3. Communication Interface Design
- (1) The model’s script was modified and the line “simRemoteApi.start(19999)” was added to establish the connection between V-REP and Python. After acquiring the camera handle, the images sent by the vision sensors in V-REP can be read [22] (see the sketch after this list).
- (2) Communication between OpenCV and the control system is achieved by sending the 3D coordinates of the recognized target to MATLAB using UDP and custom functions.
- (3) The “Synchronization Trigger” and “Start Connection” modules were added to Simulink, and the simulation step was set to 0.05 s, the same as in V-REP, to establish a synchronous simulation environment. The robot system model in V-REP is connected to the Simulink control program so that it continuously receives data from MATLAB and responds to control.
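The sketch promised in item (1): after the child script calls simRemoteApi.start(19999), a Python client can connect, fetch the camera handle, and stream the vision sensor image into an OpenCV-compatible array. The sensor name "Vision_sensor" is an assumption and must match the scene.

```python
import numpy as np
import cv2
import sim  # legacy V-REP remote API bindings

client_id = sim.simxStart("127.0.0.1", 19999, True, True, 5000, 5)
assert client_id != -1, "could not connect to V-REP"

_, cam = sim.simxGetObjectHandle(
    client_id, "Vision_sensor", sim.simx_opmode_blocking)

# The first call starts the stream; later calls read the buffered frame.
sim.simxGetVisionSensorImage(client_id, cam, 0, sim.simx_opmode_streaming)
while True:
    ret, res, image = sim.simxGetVisionSensorImage(
        client_id, cam, 0, sim.simx_opmode_buffer)
    if ret == sim.simx_return_ok:
        # V-REP delivers a flat, bottom-up RGB list with possibly signed
        # bytes; wrap to uint8, reshape, flip, and convert to BGR for OpenCV.
        frame = np.asarray(image, dtype=np.int16).astype(np.uint8)
        frame = frame.reshape(res[1], res[0], 3)
        frame = cv2.cvtColor(cv2.flip(frame, 0), cv2.COLOR_RGB2BGR)
        cv2.imshow("vision sensor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

sim.simxFinish(client_id)
```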
2.4. Visual Servo Simulation in Virtual Scene
2.4.1. Visual Recognition and Positioning
2.4.2. Robotic Arm Modeling and Kinematic Analysis
- (1) First, we observe the difference between the initial posture of the robot arm in the Robotics Toolbox and in V-REP. The initial pose of the UR5 robot in V-REP is modified until it matches the initial position in the MATLAB toolbox.
- (2) In V-REP, we label each joint position and run the simulation again with the new D-H parameters to check whether the model in MATLAB is consistent with that in V-REP.
- (3) Multiple random points are sampled to verify that the two positions are essentially the same; if they are, the D-H model established by the above method is accurate (a numeric check is sketched after this list). The model of the robot arm is shown in Figure 15.
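A minimal numeric sketch of the check in step (3), not the authors' code: forward kinematics assembled from standard D-H transforms using the values in the D-H parameter table (d and a in mm, α in rad, column assignment as reconstructed there). The position computed for a random joint vector can then be compared against the end-effector pose V-REP reports.

```python
import numpy as np

# D-H parameters from the table: (d/mm, a/mm, alpha/rad) per link.
DH = [(89.2, 0.0, 0.0), (0.0, 425.0, np.pi / 2), (0.0, 392.0, 0.0),
      (109.3, 0.0, 0.0), (94.75, 0.0, -np.pi / 2), (82.5, 0.0, np.pi / 2)]

def link_transform(theta, d, a, alpha):
    """Standard D-H link transform: Rz(theta) * Tz(d) * Tx(a) * Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(q):
    """Base-to-end-effector pose for six joint angles q (rad)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, DH):
        T = T @ link_transform(theta, d, a, alpha)
    return T

# Sample a random configuration and print the tip position (mm); this
# value is compared with the pose read out of the V-REP model.
q = np.random.uniform(-np.pi, np.pi, 6)
print(forward_kinematics(q)[:3, 3])
```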
2.4.3. Construction of Simulink Control System
- (1) The V-REP and MATLAB communication module was introduced in Section 2.3. Its purpose is to realize communication with MATLAB and establish the simulation environment.
- (2) The kinematic analysis of the robotic arm is shown in Figure 17, and the inverse kinematics equation of the robot arm is as follows.
- (3) Construction of the simulation control system in Simulink. Since the posture and position of the robotic arm end-effector at point P are known, the angles “θ” through which the six joints should rotate are calculated with the “ikcon” inverse kinematics program and used as the motion commands of the robot arm in the system. Finally, each joint is controlled to rotate to its target position so that the end-effector performs the grasping action, as shown in Figure 18 (a numerical sketch of this inverse kinematics step follows the list).
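The toolbox routine "ikcon" solves inverse kinematics as an optimization over the joint angles subject to joint limits. As a rough stand-in, not the toolbox routine itself, the sketch below minimizes a pose error of the same D-H forward kinematics (repeated here for self-containment) under assumed joint limits.

```python
import numpy as np
from scipy.optimize import minimize

# D-H parameters as in Section 2.4.2: (d/mm, a/mm, alpha/rad) per link.
DH = [(89.2, 0.0, 0.0), (0.0, 425.0, np.pi / 2), (0.0, 392.0, 0.0),
      (109.3, 0.0, 0.0), (94.75, 0.0, -np.pi / 2), (82.5, 0.0, np.pi / 2)]

def fk(q):
    """Forward kinematics via standard D-H transforms."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, DH):
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        T = T @ np.array([[ct, -st * ca,  st * sa, a * ct],
                          [st,  ct * ca, -ct * sa, a * st],
                          [0.0,      sa,       ca,      d],
                          [0.0,     0.0,      0.0,    1.0]])
    return T

def ik_solve(T_goal, q0=None):
    """Joint-limit-constrained IK in the spirit of 'ikcon': minimize a
    weighted position + orientation error under bound constraints."""
    def pose_error(q):
        T = fk(q)
        pos = np.linalg.norm(T[:3, 3] - T_goal[:3, 3])    # position, mm
        rot = np.linalg.norm(T[:3, :3] - T_goal[:3, :3])  # orientation
        return pos + 100.0 * rot  # weight orientation against millimeters
    q0 = np.zeros(6) if q0 is None else q0
    bounds = [(-np.pi, np.pi)] * 6  # assumed joint limits
    return minimize(pose_error, q0, method="L-BFGS-B", bounds=bounds).x

# Example: recover joint angles that reproduce a known reachable pose P.
T_goal = fk(np.array([0.3, -0.8, 1.1, 0.2, 0.9, -0.4]))
print(np.round(ik_solve(T_goal), 3))
```

As with ikcon, the solution depends on the initial guess q0; seeding it with the previous joint configuration keeps the arm's motion continuous between picking points.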
3. Test and Evaluation
3.1. Simulation Environment and Conditions
3.2. Results and Discussion
3.2.1. Results
3.2.2. Discussion
4. Conclusions and Future Works
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near combined positioning of picking-point based on depth data features for horizontal-trellis cultivated grape. Comput. Electron. Agric. 2022, 194, 106791.
- Fu, L.; Gao, F.; Wu, J.; Li, R.; Karkee, M.; Zhang, Q. Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review. Comput. Electron. Agric. 2020, 177, 105687.
- Jin, Y.; Yu, C.; Yin, J.; Yang, S.X. Detection method for table grape ears and stems based on a far-close-range combined vision system and hand-eye-coordinated picking test. Comput. Electron. Agric. 2022, 202, 107364.
- Lufeng, L.; Xiangjun, Z.; Qinghua, L.; Zishang, Y.; Po, Z.; Juntao, X. Virtual Simulation and Prototype Test for Behavior of Robot in Picking Process. Trans. Chin. Soc. Agric. Mach. 2018, 49, 34–42.
- He, Z.; Ma, L.; Wang, Y.; Wei, Y.; Ding, X.; Li, K.; Cui, Y. Double-Arm Cooperation and Implementing for Harvesting Kiwifruit. Agriculture 2022, 12, 1763.
- Cheng, K.; Wang, Q.; Yang, D.; Dai, Q.; Wang, M. Digital-twins-driven semi-physical simulation for testing and evaluation of industrial software in a smart manufacturing system. Machines 2022, 10, 388.
- Qiang, L.; Jun, W.; Jun, Z.; Po, Z.; En, L.; Peng, Z.; Jie, Y. Obstacle avoidance path planning and simulation of mobile picking robot based on DPPO. J. Syst. Simul. 2022, 1–12.
- Michel, O. Cyberbotics Ltd. Webots™: Professional mobile robot simulation. Int. J. Adv. Robot. Syst. 2004, 1, 5.
- Koenig, N.; Howard, A. Design and use paradigms for Gazebo, an open-source multi-robot simulator. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No. 04CH37566), Sendai, Japan, 28 September–2 October 2004; pp. 2149–2154.
- Rohmer, E.; Singh, S.P.; Freese, M. V-REP: A versatile and scalable robot simulation framework. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1321–1326.
- Shamshiri, R.R.; Hameed, I.A.; Pitonakova, L.; Weltzien, C.; Balasundram, S.K.; Yule, I.J.; Grift, T.E.; Chowdhary, G. Simulation software and virtual environments for acceleration of agricultural robotics: Features highlights and performance comparison. Int. J. Agric. Biol. Eng. 2018, 11, 15–31.
- Cao, X.; Zou, X.; Jia, C.; Chen, M.; Zeng, Z. RRT-based path planning for an intelligent litchi-picking manipulator. Comput. Electron. Agric. 2019, 156, 105–118.
- Luo, L.; Zou, X.; Cheng, T.; Yang, Z.; Zhang, C.; Mo, Y. Design of virtual test system based on hardware-in-loop for picking robot vision localization and behavior control. Trans. Chin. Soc. Agric. Eng. 2017, 33, 39–46.
- Shamshiri, R.R.; Hameed, I.A.; Karkee, M.; Weltzien, C. Robotic harvesting of fruiting vegetables: A simulation approach in V-REP, ROS and MATLAB. Proc. Autom. Agric. Secur. Food Supplies Future Gener. 2018, 126, 81–105.
- Iqbal, J.; Xu, R.; Sun, S.; Li, C. Simulation of an autonomous mobile robot for LiDAR-based in-field phenotyping and navigation. Robotics 2020, 9, 46.
- Chun, W.; Yuan, H.; Peng, G. Research on Deburring Path Generation based on Vision Sensor. J. Dalian Jiaotong Univ. 2022, 43, 64–67.
- Liu, J. Research on Robot Grasping Simulation Training Technology Based on Deep Learning; Harbin Institute of Technology: Harbin, China, 2018.
- De Melo, M.S.P.; da Silva Neto, J.G.; da Silva, P.J.L.; Teixeira, J.M.X.N.; Teichrieb, V. Analysis and Comparison of Robotics 3D Simulators. In Proceedings of the 2019 21st Symposium on Virtual and Augmented Reality (SVR), Rio de Janeiro, Brazil, 28–31 October 2019; pp. 242–251.
- Jin, Y.; Gao, Y.; Liu, J.; Hu, C.; Yao, Z.; Li, P. Hand-eye Coordination Planning with Deep Visual Servo for Harvesting Robot. Trans. Chin. Soc. Agric. Mach. 2021, 52, 18–25+42.
- Jiang, Y.; Liu, J.; Wang, J.; Li, W.; Peng, Y.; Shan, H. Development of a dual-arm rapid grape-harvesting robot for horizontal trellis cultivation. Front. Plant Sci. 2022, 13.
- Nozali, T. Numerical simulation and visualization experiment of solid particle motion affected by parameters of flow in tapered drum rotating type separator. In Proceedings of the Asia Simulation Conference 2008/the 7th International Conference on System Simulation and Scientific Computing (ICSC 2008), Beijing, China, 10–12 October 2008.
- Hong, Z.; Cheng, W. Virtual Simulation Platform for Remote Control Based on Web and V-REP. Comput. Technol. Autom. 2021, 40, 16–20.
- Zhen, L. The Kinematic Simulation and Design of Kiwifruit Picking Manipulator; Northwest A&F University: Xianyang, China, 2015.
- Wang, Z.; Lu, H.; Geng, W.; Sun, Z. OpenCV-based target detection and localization system for fruit picking robots. Electron. Technol. Softw. Eng. 2022, 220, 137–140.
- Lei, L.; Yang, W.; Dong, Q.X.; Zhang, X.; Zhang, L. Visual Positioning Method for Handle of Aircraft Door. Sci. Technol. Vis. 2017, 31–34.
- Chao, Z.; Bing, T.; Wei, H. Application of Robot Simulation based on V-REP and MATLAB. Shipboard Electron. Countermeas. 2020, 43, 111–114+128.
- Li, M. The Research on Mechanical System Design and Key Technology of Chinese Prickly Ash Picking Robot; Lanzhou University of Technology: Lanzhou, China, 2019.
- Qi, R.; Zhou, W.; Wang, T. An obstacle avoidance trajectory planning scheme for space manipulators based on genetic algorithm. Robot 2014, 36, 263–270.
Specification | Length/mm |
---|---|
Length of the robot arm | 1000 |
Distance between shoulders | 760 |
Shoulder height | 1750 |
Overall height | 1900 |
Maximum width of double-arm operation | 2600 |
Maximum working height | 2000 |
Link i | d_i/mm | a_i/mm | α_i/rad |
---|---|---|---|
1 | 89.2 | 0 | 0 |
2 | 0 | 425 | π/2 |
3 | 0 | 392 | 0 |
4 | 109.3 | 0 | 0 |
5 | 94.75 | 0 | −π/2 |
6 | 82.5 | 0 | π/2 |
Platform | Incoming | Outgoing |
---|---|---|
V-REP | Robot model properties | Collision relationships |
V-REP | Trellis environmental properties | Robot interaction with trellis |
V-REP | Built-in scripts | Robot operation status |
V-REP | Vision sensor properties | Video streaming |
V-REP | “Light is enabled” | Light information |
Image processing | Image | Location information of the picking target |
Image processing | Different fruit recognition algorithms | Image recognition effect |
MATLAB | Robotic arm kinematic parameters | Robotic arm model |
MATLAB | 3D coordinates | Robot arm joint angles |
Overall system | – | Effectiveness of harvesting operations |
Number | Time/ms | Accuracy Rate/% (OpenCV) | Accuracy Rate/% (Yolo v4) |
---|---|---|---|
1 | 0.65 | 88.9 | 100 |
2 | 0.74 | 83.3 | 88.9 |
3 | 0.44 | 100 | 100 |
4 | 0.57 | 87.5 | 94 |
5 | 0.38 | 100 | 100 |