Article

Path Planning and Control System Design of an Unmanned Weeding Robot

Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
* Author to whom correspondence should be addressed.
Agriculture 2023, 13(10), 2001; https://doi.org/10.3390/agriculture13102001
Submission received: 13 September 2023 / Revised: 11 October 2023 / Accepted: 13 October 2023 / Published: 15 October 2023
(This article belongs to the Section Agricultural Technology)

Abstract

To meet the demand of unmanned farms for unmanned operation throughout the entire field management process, an unmanned plant protection robot was developed on the platform of a traditional high-clearance boom sprayer by integrating unmanned driving, image recognition, intelligent control, and precision operation technologies. According to the agricultural machinery operation mode, path planning, linear path tracking, and header path tracking algorithms were developed. Based on the overall structure and working principle of the chassis, the robot control, steering control, and operation control systems were designed. A crop–weed recognition model was built on the YOLOv5 image recognition algorithm; after 6000 rounds of training, its precision, recall, and mean average precision were 87.7%, 84.5%, and 79.3%, respectively. Finally, a field experiment was carried out with the robot equipped with the complete system. Results show that when the robot works in a straight line, the average lateral error is 0.036 m, the maximum lateral error 0.2 m, the average root mean square (RMS) lateral error 0.053 m, the average velocity error 0.034 m/s, and the average RMS velocity error 0.045 m/s. In weeding operations, the area ratio of weedy zones to the whole field is 25%, so targeted spraying saves 75% of the herbicide dispensed in full spraying mode. The unmanned plant protection robot designed in this study effectively achieves autonomous machinery operation, providing valuable insights for research on unmanned farming and autonomous agricultural machinery.

1. Introduction

With the development of driverless technology, artificial intelligence, and intelligent control, intelligent perception, decision-making, and control technologies have begun to be applied to agricultural production. These new technologies not only improve the efficiency of agricultural production but also greatly reduce the labor intensity of practitioners, opening up a new direction for research on agricultural production tools [1,2,3,4].
As a typical field management tool, plant protection machinery is mainly used for weeding, fertilization, pesticide spraying, crop growth monitoring, crop pest identification, and farmland information collection. To obtain an unmanned plant protection robot, a driverless system and an intelligent operation system must be added to traditional plant protection machinery so that the entire plant protection operation can be carried out unmanned.
An unmanned system comprises path planning and path tracking, where path planning includes global path planning [5,6] and local path planning [7,8]. The global path-planning method used in this paper is the AB line operation algorithm, which is simpler and more versatile than other approaches. Commonly used path-tracking methods include tracking based on kinematics [9,10], pure path tracking based on proportional-integral-derivative (PID) control [11,12], improved pure path tracking [9,13,14,15,16], and Model Predictive Control (MPC) [17,18,19], among others.
Because methods based on a kinematic model linearize the model at a fixed speed, the stability and robustness of the control system degrade when the speed changes. The pure tracking method is the most commonly used and mature path-tracking algorithm, but under different speed and curvature conditions, the choice of look-ahead distance directly determines the adjustment accuracy of the control system. MPC and other control methods based on a dynamic model rely heavily on the dynamic model of the controlled object, and establishing an accurate dynamic model has always been the main difficulty in achieving intelligent control of agricultural machinery, so such algorithms are still at the exploratory stage.
Intelligent operation systems cover weed identification [20,21,22], pest identification [23,24], targeted spraying [25,26], and robotic weeding [27,28,29]. With the development of convolutional neural networks, target detection and semantic segmentation have been widely used in agricultural production. Target detection is fast but cannot obtain an accurate contour of the object; semantic segmentation can obtain a pixel-level contour but requires high computational power and is difficult to deploy on an embedded system.
The above-mentioned research on unmanned plant protection technology has laid the foundation for the development of intelligent agricultural machinery. In the present work, we designed the navigation control, robot control, and operation control systems of an unmanned plant protection robot based on the existing navigation, control, and intelligent operation technologies of domestic unmanned agricultural machinery. Our goals were to test the feasibility of an unmanned plant protection robot through field experiments and to provide a reference for research on unmanned agricultural machinery operation systems.

2. Materials and Methods

2.1. Composition of Weeding Robot

An unmanned weeding robot mainly comprises four parts: a navigation system, a motion control system, intelligent working parts, and a data transmission unit (DTU). The navigation system is responsible for path planning and path-tracking control and sends motion instructions to the motion control system. The motion control system converts motion commands into robot actions through the electro-hydraulic control system, including speed regulation and clutch control. The intelligent working parts are the core of the robot's autonomous operation, covering weed identification, spraying control, and operation monitoring. The DTU transmits the robot's operation information to the remote agricultural machinery control platform. The robot is shown in Figure 1, and its main performance parameters are listed in Table 1.
The chassis of the weeding robot was adapted from the 3WP-500G high-clearance sprayer manufactured by Shandong Huasheng Zhongtian Machinery Group Co., Ltd. (Linyi, China). It is equipped with fully hydraulic power steering, a hydrostatic transmission drive system, a cable-type continuously variable transmission, a throttle regulator, and a braking device. The original steering wheel was replaced with the CES-T servo steering wheel produced by Shanghai Huace Navigation Technology Co., Ltd. (Shanghai, China). The servo steering wheel consists of two parts: a fixed part, the cylindrical motor stator connected to the robot's chassis, and a rotating part comprising the motor rotor, steering wheel, and spline sleeve. The upper part of the spline sleeve is secured to the rotor with screws, while the lower end engages the robot's steering column through gear meshing.
Figure 2 illustrates the controllers used on the weeding robot: the T506 edge computing box and the ZQWL IO expansion module. The T506 edge computing box, manufactured by Shenzhen Tuwei Information Technology Co., Ltd. (Shenzhen, China), runs the Ubuntu 18.04 operating system, supports 4G/5G connectivity, uses a fanless design with passive heat dissipation, offers a variety of IO interfaces, and includes a bottom bracket for easy on-site installation. The ZQWL IO expansion module, provided by Shenzhen Zhiqian Internet of Things Electronic Technology Co., Ltd. (Shenzhen, China), is an 8-channel serial-port relay controller capable of 8-channel switching value acquisition and 8-channel relay output. The control board provides RS232, RS485, and CAN communication interfaces, enabling switch acquisition and relay output control through custom protocols.

2.2. Path Planning

Path planning generates the target trajectory of the robot and guides it to complete tasks independently according to the farmland information and the operational requirements of the plant protection robot. The path contains three elements: point P, line L, and curve C, represented mathematically as:
$P = [x \;\; y \;\; h]^{T}$, (1)
where P is a point in the plane, x the easting in UTM coordinates, y the northing in UTM coordinates, and h the heading of the robot body at that point.
$L = [P_1 \;\; P_2]$, (2)
where L is the line, P1 the starting point of the line, and P2 the end point of the line.
$C = [P_1 \;\; P_2 \;\cdots\; P_n]$, (3)
where curve C is composed of a series of points, $P_i$ denotes the i-th point on the curve, and 1, 2, …, n is the sequence of points along the curve.
Path planning is divided into linear path planning and header path planning. The linear path is a series of parallel lines. The calculation method is:
$L = L_0 + [\cos(\alpha - \pi/2) \;\; \sin(\alpha - \pi/2) \;\; 0]^{T} \, [w \;\; w]$, (4)
where L is the new line, L0 the reference line, w the working width, and α the angle between the line and the x axis.
As shown in Figure 3, the reference line A0B0 is determined before operation. According to the width of the field and the working width of the machine, the corresponding parallel operation lines A1B1, A2B2, …, AnBn are generated. During operation, the robot travels from point A0 to point B0 along straight line A0B0 with the operating component open and then closes the component at B0. The target path is switched to the multi-segment curve B0B1 for tracking. After reaching point B1, the operating component is opened again and the robot drives to point A1 along straight line B1A1. These steps are repeated until the robot reaches point Bn and the operation is complete.
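To make Equation (4) concrete, the following Python sketch generates the parallel operation lines from the reference line A0B0; the function and variable names are illustrative, not those of the robot's actual software.

import numpy as np

def parallel_lines(A0, B0, width, n_lines):
    # Offset the reference line A0B0 by k * width along the perpendicular
    # direction (Equation (4)) to obtain the operation lines AkBk.
    A0, B0 = np.asarray(A0, float), np.asarray(B0, float)
    alpha = np.arctan2(B0[1] - A0[1], B0[0] - A0[0])   # heading of A0B0
    normal = np.array([np.cos(alpha - np.pi / 2),      # unit vector perpendicular
                       np.sin(alpha - np.pi / 2)])     # to the reference line
    return [(A0 + k * width * normal, B0 + k * width * normal)
            for k in range(1, n_lines + 1)]

# Example: a 60 m reference line and a 2 m working width give lines 2 m apart.
lines = parallel_lines((0.0, 0.0), (60.0, 0.0), width=2.0, n_lines=3)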
Header path planning uses the Dubins path to plan the turning path, as shown in Figure 4. Depending on the sequence of maneuvers, steering can be divided into six modes: RSR, LSL, RSL, LSR, RLR, and LRL (where R denotes a right turn, S a straight segment, and L a left turn) [30]. Because agricultural robots operate along parallel lines, the heading difference before and after turning is 180°, so RSL and LSR do not meet the operating conditions. In this paper, we therefore solved only the four steering modes RSR, LSL, RLR, and LRL and selected the scheme with the minimum path length as the target path.
To facilitate the path transition, the entry and exit points of the solved path are extended by 2 m along the tangent direction to enhance the smoothness of the path connection, so the four Dubins paths become SRSRS, SLSLS, SRLRS, and SLRLS. Each path consists of five sections, each of which is an arc or a straight line. Such a path is a multi-segment continuous function; for the processor, the arcs and lines must be sampled discretely so that the turnaround path can be represented as a set of points. Because the path is composed of straight lines and circular arcs, the two are sampled separately.
The linear sampling method is expressed as:
$P_i = [x_A + id\cos\alpha \;\;\; y_A + id\sin\alpha \;\;\; \alpha]^{T}$, (5)
where (xA, yA) is the starting point of the line, d the sampling distance, α the angle between the line and the x axis, and i the sampling sequence.
The curve discretization method is expressed as:
$P_i = [x_C + R\cos(\sigma - i\Delta\sigma) \;\;\; y_C + R\sin(\sigma - i\Delta\sigma) \;\;\; \sigma - \pi/2 - i\Delta\sigma]^{T}$, (6)
where (xC, yC) is the arc center, R the arc radius, σ the angle between the line from the arc center to the arc starting point and the x axis, Δσ the sampling angle, and i the sampling sequence.
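Both sampling rules are straightforward to implement. A minimal Python sketch follows (names are illustrative, and a clockwise arc is assumed as in Equation (6)); each sample is a point (x, y, heading) as defined in Equation (1):

import numpy as np

def sample_line(A, alpha, length, d):
    # Equation (5): points every d metres from start point A along heading alpha.
    n = int(length / d)
    return [(A[0] + i * d * np.cos(alpha),
             A[1] + i * d * np.sin(alpha),
             alpha) for i in range(n + 1)]

def sample_arc(C, R, sigma, sweep, d):
    # Equation (6): clockwise arc with centre C, radius R, start angle sigma,
    # sampled so that adjacent points are roughly d metres apart.
    d_sigma = d / R                      # sampling angle for ~d-metre spacing
    n = int(sweep / d_sigma)
    return [(C[0] + R * np.cos(sigma - i * d_sigma),
             C[1] + R * np.sin(sigma - i * d_sigma),
             sigma - np.pi / 2 - i * d_sigma) for i in range(n + 1)]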
As shown in Figure 5, the header path is divided into three parts: straight-line segment A0B0, circular-arc segment B0B1, and straight-line segment B1A1. When the robot moves along straight line A0B0 and its front end crosses straight line L3, which is 2 m from straight line L1, the header path B0B1 is calculated; once the calculation is complete, the target path is switched to it. A 2 m overlap exists between the header path and the linear path, ensuring a smooth path transition while the robot is driving. When the rear of the robot crosses line L1, the line through B0 and B1, from inside the field, the working parts are closed. When the rear of the robot crosses line L2, 1 m outside line L1, from outside the field, the operating component is opened.

2.3. Path Tracing

The path-tracking algorithm is designed based on the pure path-tracking (pure pursuit) model. As shown in Figure 6, O is the origin of the world coordinate system, C the vehicle steering center, and E the foot of the perpendicular from C to the target straight path AB. In the world coordinate system:
$\overrightarrow{OE} = \frac{\overrightarrow{AB} \cdot \overrightarrow{AC}}{|\overrightarrow{AB}|^2}\,\overrightarrow{AB} + \overrightarrow{OA}$, (7)
In the equation below, G is the preview point, v is the ground speed of the robot, and t is the preview time. It can be seen that:
$\overrightarrow{OG} = vt\,\frac{\overrightarrow{AB}}{|\overrightarrow{AB}|} + \overrightarrow{OE}$, (8)
In the equation below, α is the angle between the direction to the preview point G and the body axis CH in the body coordinate system. According to the pure path-tracking model, by the law of sines:
$\frac{|CG|}{\sin(2\alpha)} = \frac{R}{\sin(\pi/2 - \alpha)}$, (9)
from which the turning radius R is obtained:
$R = \frac{|CG|^2}{2\,|HG|}$, (10)
where H is the foot of the perpendicular from the preview point G to the body axis CH, so that |HG| is the lateral offset of the preview point.
According to Ackermann steering geometry, the robot front wheel steering angle δ is:
$\delta = \arctan\left(\frac{L}{2R}\right)$, (11)
where L is the wheelbase; the factor of 2 arises because the front and rear wheels steer in opposite directions (see Section 2.5), so each axle provides half of the steering geometry.
The lateral error Elat is:
$E_{lat} = |CE| = \frac{\left|\overrightarrow{AB} \times \overrightarrow{AC}\right|}{|\overrightarrow{AB}|}$, (12)
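A minimal Python sketch of the linear tracking law defined by Equations (7)–(12) follows. It assumes the vehicle heading is available in radians and uses a body frame with x pointing forward and y to the left; all names are illustrative, and the lateral error is returned signed:

import numpy as np

def pure_pursuit_line(A, B, C, heading, v, t, L):
    # A, B: endpoints of the target line; C: vehicle steering centre (UTM);
    # heading: vehicle heading; v: ground speed; t: preview time; L: wheelbase.
    A, B, C = (np.asarray(p, float) for p in (A, B, C))
    AB, AC = B - A, C - A
    E = A + (AB @ AC) / (AB @ AB) * AB             # Eq. (7): foot of perpendicular
    G = E + v * t * AB / np.linalg.norm(AB)        # Eq. (8): preview point
    e_lat = (AB[0] * AC[1] - AB[1] * AC[0]) / np.linalg.norm(AB)  # Eq. (12), signed
    dx, dy = G - C                                 # preview point in the body frame
    xg = dx * np.cos(heading) + dy * np.sin(heading)     # |CH|
    yg = -dx * np.sin(heading) + dy * np.cos(heading)    # |HG| (signed)
    if abs(yg) < 1e-6:
        return 0.0, e_lat                          # already aligned: steer straight
    R = (xg ** 2 + yg ** 2) / (2 * yg)             # Eq. (10): signed turning radius
    delta = np.arctan(L / (2 * R))                 # Eq. (11): front wheel angle
    return delta, e_lat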
The principle of header path tracking is shown in Figure 7. A pure path-tracking model based on preview-point search is adopted; its control principle is similar to that of linear tracking. The preview point G is obtained by searching the curve path, and the front wheel steering angle δ is then calculated. The process is as follows.
(1) Curve path AB is searched for the point Pe closest to the vehicle steering center C in Euclidean distance:
$e = \arg\min_{i \in [0,\,n]} \left|\overrightarrow{CP_i}\right|$, (13)
(2) The preview distance is calculated from the vehicle speed v and preview time t. The curve from Pe to B is then searched for the point Pg whose distance from Pe is closest to vt:
$g = \arg\min_{i \in [e,\,n]} \left|\,\left|\overrightarrow{P_e P_i}\right| - vt\,\right|$, (14)
(3) Finally, the front wheel steering angle δ is calculated using Formulas (9)–(11).
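Under the same assumptions, the preview-point search of Equations (13) and (14) over a discretized curve can be sketched as follows (illustrative names):

import numpy as np

def preview_point(path, C, v, t):
    # path: list of (x, y, heading) samples; C: vehicle steering centre.
    C = np.asarray(C, float)
    pts = np.asarray([p[:2] for p in path])
    e = int(np.argmin(np.linalg.norm(pts - C, axis=1)))   # Eq. (13): nearest point Pe
    dist = np.linalg.norm(pts[e:] - pts[e], axis=1)       # distances from Pe onward
    g = e + int(np.argmin(np.abs(dist - v * t)))          # Eq. (14): preview point Pg
    return path[g]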

2.4. Speed Control System

The robot transmission route is shown in Figure 8. The engine output shaft is connected by a belt to the input shaft of the hydrostatic stepless transmission (HSET), and the HSET output shaft is connected to the gearbox through a coupling. The two output shafts of the gearbox distribute power to the front and rear axles of the vehicle body through transmission shafts. The front axle distributes power to the left and right front wheels through a differential; the structure of the rear axle is similar.
Speed control is realized by a cascade PID controller. As shown in Figure 9, the outer loop is a proportional-integral (PI) speed controller, and the inner loop is a proportional-derivative (PD) controller that regulates the opening of the HSET valve. The PI controller is expressed as follows:
$\tilde{Q}(k) = \tilde{Q}(k-1) + K_{pv}\left(E_v(k) - E_v(k-1)\right) + K_{iv}\,E_v(k)$, (15)
where Q̃ is the target opening of the HSET valve, Kpv and Kiv are the proportional and integral parameters of the PI controller, respectively, and Ev is the speed error, i.e., the difference between the target speed ṽ and the actual speed v:
$E_v(k) = \tilde{v}(k) - v(k)$, (16)
The PD controller for the HSET valve is expressed as:
$M(k) = K_{pq}\,E_q(k) + K_{dq}\left(E_q(k) - E_q(k-1)\right)$, (17)
$E_q(k) = \tilde{Q}(k) - Q(k)$, (18)
where M is the motor voltage, Kpq and Kdq are the proportional and derivative parameters of the PD controller, Q is the actual position of the HSET valve, and Eq is the position error of the HSET valve.
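Equations (15)–(18) amount to the following cascade controller, shown here as an illustrative Python sketch; the gains are placeholders, not the tuned values used on the robot:

class CascadeSpeedController:
    def __init__(self, kpv, kiv, kpq, kdq):
        self.kpv, self.kiv = kpv, kiv    # outer-loop PI gains (speed)
        self.kpq, self.kdq = kpq, kdq    # inner-loop PD gains (valve)
        self.q_target = 0.0              # HSET valve target opening
        self.ev_prev = 0.0               # previous speed error
        self.eq_prev = 0.0               # previous valve position error

    def step(self, v_target, v, q_actual):
        ev = v_target - v                                                # Eq. (16)
        self.q_target += self.kpv * (ev - self.ev_prev) + self.kiv * ev  # Eq. (15)
        self.ev_prev = ev
        eq = self.q_target - q_actual                                    # Eq. (18)
        m = self.kpq * eq + self.kdq * (eq - self.eq_prev)               # Eq. (17)
        self.eq_prev = eq
        return m                         # motor voltage driving the HSET valve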

2.5. Steering Control System

The robot steering system, shown in Figure 10, mainly includes an electric steering wheel, a hydraulic steering valve, an oil tank, a hydraulic pump, a cooler, and front and rear steering hydraulic cylinders. The hydraulic pump is connected to the engine to power the steering system. When the steering wheel turns right, the hydraulic oil output from the R port of the steering valve drives the front steering hydraulic cylinder to move left, and the front wheels turn right. The rear steering hydraulic cylinder is connected in series with the front cylinder and drives the rear wheels to turn left. Turning left works on the same principle, with all directions reversed.
Wheel steering control is realized with a PID algorithm whose input is the target wheel angle, output is the motor position, and feedback is the actual wheel angle. To control the wheel angle more accurately, the front wheel steering angle is estimated by fusing a kinematic estimate with a transfer model of the steering system:
$\hat{\delta}(k) = G\,\delta_a(k) + (1 - G)\,\delta_b(k)$, (19)
where G is the weighting parameter of the front wheel steering angle estimate, δa is the front wheel steering angle given by the transfer model of the hydraulic steering system, and δb is the front wheel steering angle obtained from kinematic estimation. According to previous research [31], the transfer-model estimate integrates the steering wheel angular velocity:
$\delta_a(k) = \hat{\delta}(k-1) + \left(G_h\,\omega_s(k) + \mu\right)dt$, (20)
where Gh is the transfer coefficient of the hydraulic steering system, ωs the angular rate of the steering wheel motor, and μ the angular rate noise caused by hydraulic leakage.
The kinematic estimate is:
$\delta_b(k) = \arctan\left(\frac{\dot{\phi}(k)\,L}{2\,v(k)}\right)$, (21)
where ϕ̇ is the yaw rate of the robot body, v the speed of the reference point of the robot body, and L the wheelbase.
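Equations (19)–(21) form a complementary filter; a Python sketch follows, in which the weight G, transfer coefficient Gh, noise term mu, and wheelbase L are placeholder values, not the calibrated ones:

import numpy as np

def fuse_steering_angle(delta_prev, omega_s, phi_dot, v, dt,
                        G=0.7, Gh=0.05, mu=0.0, L=2.0):
    # delta_prev: previous fused estimate; omega_s: steering motor angular rate;
    # phi_dot: body yaw rate; v: body speed; dt: sampling interval.
    delta_a = delta_prev + (Gh * omega_s + mu) * dt      # Eq. (20): transfer model
    delta_b = np.arctan(phi_dot * L / (2.0 * v))         # Eq. (21): kinematics
    return G * delta_a + (1.0 - G) * delta_b             # Eq. (19): weighted fusion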

2.6. Dataset Production

The dataset was collected by a vehicle camera installed at the bottom of the high-clearance sprayer, 80 cm above the ground. The data were collected on 27 August 2020 at the Baima Experimental Base of the Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, China. The onboard camera was a Ruierwei USBFHD01M with a maximum resolution of 1920 × 1080. A total of 436 images were obtained via frame sampling and clipping.
The rectangular box-labeling tool in Labelme (v4.5.10) was used to label the crop objects in the images, with corn labeled as 1 and weeds as 0. The labeled data were divided into a training set and a validation set in a ratio of 8:2. Owing to the small size of the dataset, the data were duplicated. In addition, the images were augmented with random contrast adjustment, random brightness adjustment, random left–right flipping, and random noise addition, each applied with a probability of 50%. The original images were resized to 640 × 320 pixels to facilitate training.
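A sketch of this augmentation pipeline is given below; the contrast, brightness, and noise ranges are illustrative, and the bounding-box labels would also have to be flipped along with the image (omitted here):

import random
import numpy as np

def augment(image):
    # image: H x W x 3 uint8 array; each augmentation fires with probability 0.5.
    img = image.astype(np.float32)
    if random.random() < 0.5:                            # random contrast
        img = (img - img.mean()) * random.uniform(0.8, 1.2) + img.mean()
    if random.random() < 0.5:                            # random brightness
        img += random.uniform(-30, 30)
    if random.random() < 0.5:                            # random left-right flip
        img = img[:, ::-1, :]
    if random.random() < 0.5:                            # random Gaussian noise
        img += np.random.normal(0, 5, img.shape)
    return np.clip(img, 0, 255).astype(np.uint8)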

2.7. Model Building and Training

YOLOv5 was used in the present work for weed identification. The network includes three modules: feature extraction, feature aggregation, and target prediction layers [32]. The feature extraction layer is mainly used to extract feature information at different levels from an image. The feature aggregation layer is used to enhance the detection ability of the network for objects of different scales. The target prediction layer is used to associate the feature information of the image with the shape and position information of the object.
The model training platform comprised an Intel Core i7-10700 CPU with a clock speed of 4.8 GHz and 16 GB of memory and an Nvidia RTX 2060 GPU with 11 GB of memory. The environment was the Ubuntu 20.04 operating system, and the programming language was Python 3.9.0. The stochastic gradient descent method was used to train the model, with the batch size set to 8 and the momentum factor to 0.9; the other parameters used the default values of YOLOv5. The confidence threshold was set to 0.5, and the evaluation indexes were precision, recall, and mean average precision (mAP):
$Precision = \frac{TP}{TP + FP}$, (22)
$Recall = \frac{TP}{TP + FN}$, (23)
$mAP = \frac{AP_1 + AP_2}{2}$, (24)
where TP, FP, and FN are the numbers of true positives, false positives, and false negatives, and AP1 and AP2 are the average precision of the corn and weed classes, respectively.
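For the two classes here (corn and weeds), the metrics reduce to a few lines; the counts in the usage example are illustrative, chosen only to reproduce the reported precision and recall:

def precision_recall(tp, fp, fn):
    # Equations (22) and (23) from true positive, false positive,
    # and false negative counts.
    return tp / (tp + fp), tp / (tp + fn)

def mean_average_precision(ap_corn, ap_weed):
    # Equation (24): two-class mAP as the mean of the per-class average precision.
    return (ap_corn + ap_weed) / 2

# e.g. precision_recall(877, 123, 161) -> approximately (0.877, 0.845)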
The training results of the model are shown in Figure 11. In the first 3000 rounds, each index fluctuates considerably; the training parameters stabilize in the range of 3000–6000 rounds. After 6000 rounds, the precision of the model increases while the recall and mAP decrease. Therefore, the model output after 6000 rounds of training was selected as the weed identification model in the present work, with a precision, recall, and mAP of 87.7%, 84.5%, and 79.3%, respectively.
Figure 12 displays the identification results for corn and weeds based on the YOLOv5 convolutional neural network model. The model accurately identifies both corn and weed targets in complex field images, and the detection results closely align with the actual values.

2.8. Weeding Decision Method

The spraying control flowchart is shown in Figure 13. The camera installed at the bottom of the machine automatically acquires an image of the working area, 2 m long and 1 m wide, obtains the contour and position information of crops and weeds through the crop–weed identification network, and then determines whether there are weeds in the spraying area of each nozzle. If there are weeds in the area, the nozzle is opened for a time tk; if not, no action is taken. The solenoid valve opening time tk is calculated from the solenoid valve dead time tdead, the distance len from the camera to the solenoid valve, and the vehicle speed v; here, tdead = 2 s and len = 1 m:
$t_k = t_{dead} + \frac{len}{v}$ (25)
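As a worked example, at the test speed of 3.6 km/h (1 m/s) with tdead = 2 s and len = 1 m, each triggered nozzle opens for tk = 2 + 1/1 = 3 s. The decision rule reduces to a few lines of Python (illustrative names):

def nozzle_open_time(weeds_in_zone, v, t_dead=2.0, length=1.0):
    # weeds_in_zone: True if the identification network found weeds in this
    # nozzle's spraying area; v: vehicle speed (m/s); length: camera-to-valve
    # distance (m). Returns the opening time t_k of Equation (25), or 0.
    if not weeds_in_zone:
        return 0.0                      # no weeds detected: nozzle stays closed
    return t_dead + length / v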

2.9. Spray System

The spray system of the plant protection robot is shown in Figure 14. The spraying module includes the tank, pump, valve group, distributor, electronically controlled nozzles, and other components, which can adjust the pressure and flow of the spray and open and close each nozzle independently. The electrical system is composed of a distributed input/output (IO) module and a wiring harness. The navigation, robot control, and working systems are connected through the controller area network (CAN) bus, and the working state of the working parts is adjusted according to the control instructions.

3. Results and Discussion

To test the feasibility of the proposed unmanned operation system, a field experiment was carried out at the Baima Experimental Base of the Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, China in August 2021. The row spacing of the tested maize was 50 cm, the plant spacing 25 cm, and the plant height 15–25 cm. The main component of the herbicide was mesotrione, which is safe for corn at the 3–5-leaf stage. The experiment was carried out twice: the first area was 60 m long and 6 m wide with three working paths, and the second was 50 m long and 8 m wide with four working paths. During operation, the robot traveled and worked along the planned path with a straight-line speed set to 3.6 km/h, while the lateral deviation of the robot and the opening and closing of the solenoid valves were recorded with a GCAN-401 data recorder.
Figure 15 shows the statistics of the lateral error along the robot's operating paths. When the robot works in a straight line, the average lateral error is 0.036 m, the maximum lateral error 0.2 m, the average root mean square (RMS) lateral error 0.053 m, the average velocity error 0.034 m/s, and the average RMS velocity error 0.045 m/s.
Figure 16 shows the results of the weeding operation, in which green represents the weedy zones of the test field and red the clean zones. The area of the test field is 720 m2 and the area of the weedy zones is 180 m2, so the area ratio of weedy zones to field is 25%, which saves 75% of the herbicide compared to that dispensed in full spraying mode. The weedy zones inside the blue frame account for 62.3% of all weedy zones in the field, and the ratio of weedy-zone area to land area inside the blue frame is 51.7%, approximately 2.05 times that outside the frame. The operational effectiveness diagram thus indicates that weeds tend to cluster in the field environment; in other words, weed distribution is uneven, with certain areas having a markedly higher ratio of weed area to land area than others. This suggests that guiding spraying by weed identification can substantially reduce pesticide usage.

4. Conclusions

In this paper, we designed a weeding robot by combining intelligent control technology and deep learning. To achieve unmanned weeding, we developed automatic path-planning and path-tracking algorithms. To enable autonomous driving, we implemented speed and steering control systems based on PID algorithms. For intelligent operation, we designed a weed identification system based on YOLOv5, and for precise weeding, we created an electronically controlled spraying system.
According to the field experiment results, when the robot works in a straight line, the average lateral error is 0.036 m, the maximum lateral error 0.2 m, the average RMS lateral error 0.053 m, the average velocity error 0.034 m/s, and the average RMS velocity error 0.045 m/s. In the weeding operations, the area ratio of weedy zones to field is 25%, which saves 75% of the herbicide compared to that dispensed in full spraying mode.
In the future, we plan to study the vehicle body dynamics model of the robot to enhance path-tracking accuracy, build a more comprehensive weed dataset to improve weed identification under various working conditions, and incorporate more sensors, such as LiDAR, ultrasonic sensors, and gyroscopes, to further enhance the robot's positioning and control accuracy.

Author Contributions

Conceptualization, T.Y. and C.J.; methodology, T.Y. and Y.N.; software, T.Y. and M.C.; validation, Z.L., T.Y. and C.J.; formal analysis, T.Y. and Y.N.; investigation, T.Y. and C.J.; resources, Z.L. and T.Y.; data curation, T.Y.; writing—original draft preparation, T.Y.; writing—review and editing, C.J., M.C. and Y.N.; visualization, T.Y.; supervision, C.J., M.C. and Z.L.; project administration, C.J.; funding acquisition, T.Y. and C.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Key Research and Development Program of China [Grant No. 2021YFD2000503], the National Natural Science Foundation of China [Grant Nos. 32171911, 32272004], and the Special Fund Project for the Construction of the Modern Agricultural Industrial Technology System [Grant No. CARS-04-PS29].

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the first author ([email protected]).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Yin, X.; An, J.; Wang, Y.; Wang, Y.; Jin, C. Development and experiments of the autonomous driving system for high-clearance spraying machines. Trans. Chin. Soc. Agric. Eng. 2021, 37, 22–30.
2. Liu, Z.; Zhang, Z.; Luo, X.; Wang, H.; Huang, P.; Zhang, J. Design of automatic navigation operation system for Lovol ZP9500 high clearance boom sprayer based on GNSS. Trans. Chin. Soc. Agric. Eng. 2018, 34, 15–21.
3. Zhang, M.; Ji, Y.H.; Li, S.C.; Cao, R.; Xu, H.; Zhang, Z. Research Progress of Agricultural Machinery Navigation Technology. Trans. Chin. Soc. Agric. Mach. 2020, 51, 1–18.
4. Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A review of autonomous agricultural vehicles (The experience of Hokkaido University). J. Terramech. 2020, 91, 155–183.
5. Ljungqvist, O.; Evestedt, N.; Axehill, D.; Cirillo, M.; Pettersson, H. A Path Planning and Path-following Control Framework for a General 2-trailer with a Car-like Tractor. J. Field Robot. 2019, 36, 1345–1377.
6. Chen, K.; Jie, Y.S.; Li, Y.M.; Liu, C.; Mo, J. Full Coverage Path Planning Method of Agricultural Machinery. Trans. Chin. Soc. Agric. Mach. 2022, 53, 17–26.
7. Yang, Y.; Wen, X.; Ma, Q.L.; Zhang, G.; Cheng, S.; Qi, J.; Chen, Z.; Chen, L. Real time planning of the obstacle avoidance path of agricultural machinery in dynamic recognition areas based on Bezier curve. Trans. Chin. Soc. Agric. Eng. 2022, 38, 34–43.
8. Zhou, J.; He, Y. Research Progress on Navigation Path Planning of Agricultural Machinery. Trans. Chin. Soc. Agric. Mach. 2021, 52, 1–14.
9. Wang, H.; Wang, G.M.; Luo, X.W.; Zhang, Z.G.; Gao, Y.; He, J.; Yue, B. Path tracking control method of agricultural machine navigation based on aiming pursuit model. Trans. Chin. Soc. Agric. Eng. 2019, 35, 11–19.
10. Murillo, M.; Sanchez, G.; Deniz, N.; Genzelis, L. Improving path-tracking performance of an articulated tractor-trailer system using a non-linear kinematic model. Comput. Electron. Agric. 2022, 196, 106826.
11. Khalaji, A.K. PID-based target tracking control of a tractor-trailer mobile robot. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2019, 233, 4776–4787.
12. Yang, Y.; Li, Y.; Wen, X.; Zhang, G.; Ma, Q.; Cheng, S.; Qi, J.; Xu, L.; Chen, L. An optimal goal point determination algorithm for automatic navigation of agricultural machinery: Improving the tracking accuracy of the Pure Pursuit algorithm. Comput. Electron. Agric. 2022, 194, 106760.
13. Soylu, S.; Çarman, K. Fuzzy logic based automatic slip control system for agricultural tractors. J. Terramech. 2021, 95, 25–32.
14. He, J.; Hu, L.; Wang, P.; Liu, Y.; Man, Z.; Tu, T.; Yang, L.; Li, Y.; Yi, Y.; Li, W.; et al. Path tracking control method and performance test based on agricultural machinery pose correction. Comput. Electron. Agric. 2022, 200, 107185.
15. Delavarpour, N.; Eshkabilov, S.; Bon, T.; Nowatzki, J.; Bajwa, S. The tractor-cart system controller with fuzzy logic rules. Appl. Sci. 2020, 10, 5223.
16. Meng, Q.K.; Chou, R.C.; Zhang, M.; Liu, G.; Zhang, Z.; Xiang, M. Navigation System of Agricultural Vehicle Based on Fuzzy Logic Controller with Improved Particle Swarm Optimization Algorithm. Trans. Chin. Soc. Agric. Eng. 2015, 46, 29–36.
17. Manikandan, S.; Kaliyaperumal, G.; Hakak, S.; Gadekallu, T.R. Curve-Aware Model Predictive Control (C-MPC) Trajectory Tracking for Automated Guided Vehicle (AGV) over On-Road, In-Door, and Agricultural-Land. Sustainability 2022, 14, 12021.
18. Vatavuk, I.; Vasiljević, G.; Kovačić, Z. Task Space Model Predictive Control for Vineyard Spraying with a Mobile Manipulator. Agriculture 2022, 12, 381.
19. Li, Y.; He, D.; Ma, F.; Liu, P.; Liu, Y. MPC-based trajectory tracking control of unmanned underwater tracked bulldozer considering track slipping and motion smoothing. Ocean Eng. 2023, 279, 114449.
20. Sportelli, M.; Apolo-Apolo, O.E.; Fontanelli, M.; Frasconi, C.; Raffaelli, M.; Peruzzi, A.; Perez-Ruiz, M. Evaluation of YOLO Object Detectors for Weed Detection in Different Turfgrass Scenarios. Appl. Sci. 2023, 13, 8502.
21. Ajayi, O.G.; Ashi, J.; Guda, B. Performance evaluation of YOLO v5 model for automatic crop and weed classification on UAV images. Smart Agric. Technol. 2023, 5, 100231.
22. Wang, C.; Wu, X.; Zhang, Y.; Wang, W. Recognition and segmentation of maize seedlings in field based on dual attention semantic segmentation network. Trans. Chin. Soc. Agric. Eng. 2021, 37, 211–221.
23. Selvaraj, M.G.; Vergara, A.; Ruiz, H.; Safari, N.; Elayabalan, S.; Ocimati, W.; Blomme, G. AI-powered banana diseases and pest detection. Plant Methods 2019, 15, 92.
24. Domingues, T.; Brandão, T.; Ferreira, J.C. Machine learning for detection and prediction of crop diseases and pests: A comprehensive survey. Agriculture 2022, 12, 1350.
25. Meshram, A.T.; Vanalkar, A.V.; Kalambe, K.B.; Badar, A.M. Pesticide spraying robot for precision agriculture: A categorical literature review and future trends. J. Field Robot. 2022, 39, 153–171.
26. Özlüoymak, Ö.B. Development and assessment of a novel camera-integrated spraying needle nozzle design for targeted micro-dose spraying in precision weed control. Comput. Electron. Agric. 2022, 199, 107134.
27. Blasco, J.; Aleixos, N.; Roger, J.M.; Rabatel, G.; Moltó, E. AE—Automation and emerging technologies: Robotic weed control using machine vision. Biosyst. Eng. 2002, 83, 149–157.
28. Bakker, T.; Bontsema, J.; Müller, J. Systematic design of an autonomous platform for robotic weeding. J. Terramech. 2010, 47, 63–73.
29. Raja, R.; Nguyen, T.T.; Vuong, V.L.; Slaughter, D.C.; Fennimore, S.A. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst. Eng. 2020, 195, 152–171.
30. Manyam, S.G.; Rathinam, S. On tightly bounding the Dubins traveling salesman's optimum. J. Dyn. Syst. Meas. Control 2018, 140, 71013.
31. Chen, Y.; He, Y. Development of agricultural machinery steering wheel angle measuring system based on GNSS attitude and motor encoder. Trans. Chin. Soc. Agric. Eng. 2021, 37, 10–17.
32. Wang, D.; He, D. Channel pruned YOLO V5s-based deep learning approach for rapid and accurate apple fruitlet detection before fruit thinning. Biosyst. Eng. 2021, 210, 271–281.
Figure 1. Structure of unmanned plant protection robot: 1, control system; 2, servo steering wheel; 3, human–machine interaction; 4, positioning antenna; 5, spraying valve group.
Figure 2. Image of the controllers used on weeding robot: (a) T506 edge computing box. (b) ZQWL IO extension module.
Figure 3. Schematic of path planning.
Figure 4. Dubins path planning diagram.
Figure 5. Schematic of U-turn path.
Figure 6. Schematic of straight-line path tracing.
Figure 7. Curve path tracing diagram.
Figure 8. Schematic of transmission system.
Figure 9. Schematic of PID controller.
Figure 10. Schematic of steering control system.
Figure 11. Model training results.
Figure 12. Images of corn and weed identification results.
Figure 13. Solenoid valve control principle.
Figure 14. Spray system.
Figure 15. Lateral and velocity error of the path: (a) Data of Path1; (b) Data of Path2; (c) Data of Path3; (d) Data of Path4; (e) Data of Path5; (f) Data of Path6.
Figure 16. Map of weeding: “Weedy zone” refers to an area in which the weed identification system detected weeds under a confidence threshold of 0.5. “Clean zone” indicates that, under the same threshold, the system did not detect any weeds within that area.
Table 1. Main technical parameters of robot.

Technical Indicators          | Value
Engine power (kW)             | 15.4
Length × width × height (m)   | 4.1 × 2.2 × 2.54
Spraying width (m)            | 2
Ground clearance (m)          | 1.2
Turning radius (m)            | 3.5
Tank volume (L)               | 500
Speed (km·h−1)                | 0–5.6
Autopilot accuracy (cm)       | ±2.5
Weed identification rate (%)  | >80
