Article

UAV-Based Coverage Path Planning for Unmanned Agricultural Vehicles

1 School of Agricultural Engineering and Food Science, Shandong University of Technology, Zibo 255000, China
2 Shandong Provincial Key Laboratory of Smart Agricultural Technology and Intelligent Agricultural Machinery Equipment for Field Crops, Zibo 255000, China
* Author to whom correspondence should be addressed.
Sensors 2026, 26(3), 927; https://doi.org/10.3390/s26030927
Submission received: 26 December 2025 / Revised: 23 January 2026 / Accepted: 29 January 2026 / Published: 1 February 2026
(This article belongs to the Section Sensors and Robotics)

Abstract

Accurate path planning is a prerequisite for autonomous navigation of agricultural vehicles. In this research, an Unmanned Aerial Vehicle (UAV)-based coverage path planning method was developed to automate the guidance of agricultural vehicles and reduce operator intervention in the creation of navigation maps. High-resolution orthophoto maps of the field were constructed using low-altitude UAV photogrammetry to obtain spatial information. Travel paths and working paths were automatically generated from anchor points selected by the operator in the image coordinate domain. The navigation path for unmanned agricultural vehicles was generated through a Mercator projection-based conversion of the anchor pixel coordinates into latitude and longitude geographic coordinates. A Graphical User Interface (GUI) was developed for path generation, visualization, and performance evaluation, through which the proposed path planning method was implemented for autonomous agricultural vehicle navigation. Calculation accuracy tests demonstrated a mean planar coordinate error of 2.23 cm and a maximum error of 3.37 cm for path planning. Field tests showed that lateral navigation errors remained within ±5.5 cm for the unmanned high-clearance sprayer, indicating that the developed UAV-based coverage path planning method is feasible and highly accurate. It provides an effective solution for achieving fully autonomous agricultural vehicle operations.

1. Introduction

With the advancement of modern science and technology, including sensor and measurement-control technologies, communication technologies, intelligent control systems, and the application of machine vision and Global Navigation Satellite System (GNSS) navigation in autonomous driving, modern agricultural production has been rapidly progressing toward greater intelligence, informatization, large-scale operation, and precision [1,2,3]. Autonomous navigation technology for agricultural vehicles is recognized as a core component of the precision agriculture technology system and plays a critical role in the realization of unmanned farming operations [4,5,6].
Although intelligent agricultural machinery technology is advancing rapidly, path planning for autonomous navigation of agricultural vehicles remains an area in need of improvement, especially under the complex and highly variable conditions of real-world farming environments. Extensive research has been conducted on coverage path planning and navigation for agricultural vehicles [7,8,9,10]. An improved ant colony algorithm was employed by Tu et al. to reduce path planning costs, providing important references for collaborative scheduling problems among agricultural machinery during field operations [11]. Numerous scholars have also studied path tracking and control. To address the sideslip problem of rice farm machinery in paddy field environments, a model predictive control (MPC) path-tracking method based on attitude correction was proposed, and field experiments demonstrated an average root mean square error of 0.043 m for three-line straight-path tracking [12]. A path-tracking algorithm based on optimal target points was proposed, which simulated the driver's look-ahead behavior to identify the optimal target point within the look-ahead region according to an evaluation function, reducing tracking error by over 20% compared to the pure pursuit algorithm [13]. A dynamic turning path planning method for four-wheel vehicles based on an asymmetric switching steering strategy was proposed; when agricultural vehicles experienced field slippage, this algorithm could dynamically replan the path according to the real-time position of the vehicle [14]. To address the low trajectory planning efficiency and susceptibility to local optima of unmanned agricultural machinery in complex, narrow, and unstructured environments, an improved bidirectional A* algorithm combined with an optimal control method was proposed [15].
Another study proposed a single-obstacle avoidance algorithm based on agricultural machinery motion rules, as well as dual- and multiple-obstacle avoidance algorithms determined by the dimensions of the safe driving area [16].
Numerous scholars have investigated classical algorithms, theoretical foundations, coverage path planning, navigation, and path tracking and control for agricultural vehicles. These efforts have been complemented by the growing application of UAVs in modern agricultural production, which is accelerating the development of unmanned farms [17,18,19,20]. One review analyzed the latest advances in UAV-ground vehicle collaboration, including collaborative navigation, perception fusion, and task allocation in agricultural applications [21]. A UAV-ground vehicle collaborative precision spraying system for honey pomelo orchards was developed in which UAVs were responsible for canopy-top spraying while ground vehicles handled trunk and lower-canopy application, thereby improving pesticide utilization efficiency [22]. Researchers proposed an air-ground collaborative 3D mapping framework in which UAVs provide aerial perspectives and ground vehicles capture close-range details, generating high-precision 3D models of fields [23]. Another study explored the collaborative applications of UAVs and ground vehicles in vineyard environments, encompassing agricultural operations such as monitoring, pruning, and harvesting; a decentralized multi-phase approach was proposed as an alternative to more common cooperative schemes, and a simplified geometrical crop model was found advantageous for perennial crops. Preliminary results highlighted the benefits achievable by exploiting tailored technologies selected for each of the analyzed mission phases [24].
Existing path planning methods for agricultural vehicles often require extensive manual intervention, which limits the applicability to fully autonomous operation. To address the above issues, this research proposed a UAV-based coverage path planning method for unmanned agricultural vehicles. High-resolution orthophoto maps obtained from low-altitude UAV photogrammetry were used to generate a travel path and working path, while a Mercator projection-based coordinate transformation was designed to convert pixel-based anchor points into geographic coordinates for the navigation of unmanned agricultural vehicles. Field tests with an unmanned high-clearance sprayer were conducted to validate the feasibility of the proposed method.

2. Materials and Methods

As illustrated in Figure 1, a photogrammetric UAV was used to obtain aerial images of the target field, which were stitched to generate a high-resolution orthophoto map. Path anchor points were calibrated to plan travel paths and working paths according to the generated orthophoto map. A coordinate transformation algorithm was developed to convert anchor pixel coordinates into geographic coordinates for the generation of navigation paths for unmanned agricultural vehicles.
A high-clearance sprayer with an autonomous navigation system [25] was used as the test platform, as shown in Figure 2. Its main parameters are listed in Table 1. The system comprises data acquisition, motion execution, and navigation control. Data acquisition was realized using a positioning module, an angle sensor, and an Inertial Measurement Unit (IMU), while motion execution was achieved through automatic steering, an automatic throttle, and a continuously variable transmission. These modules are coordinated by the navigation control system to enable unmanned operation. The navigation system employed a dual-antenna positioning and orientation receiver based on a Trimble BD982 board with Real-Time Kinematic (RTK) differential service, together with an IMU.
A Phantom 4 RTK UAV was used as the low-altitude photogrammetric platform for aerial imaging of the operational areas. The UAV supported control-point-free aerial surveying and provided centimeter-level positioning accuracy while maintaining high-resolution imaging performance. It was equipped with an RTK positioning module and a Time Sync system, which enabled microsecond-level synchronization among the flight controller, camera, and RTK module, thereby reducing temporal errors between image acquisition and positioning data. The key performance parameters of the UAV are shown in Table 2.

2.1. Acquisition of Field Orthophoto Map

To enable accurate planning of travel paths and working paths for unmanned agricultural vehicles, low-altitude aerial photogrammetry was conducted using a Phantom 4 RTK UAV (SZ DJI Technology Co., Ltd. in Shenzhen, China) to generate a high-precision orthophoto map of the target field. A set of overlapping aerial images was acquired and subsequently processed through photogrammetric reconstruction to produce a geometrically corrected orthophoto map with unified scale and spatial reference, which served as the fundamental spatial data source for subsequent path planning and coordinate transformation. The field orthophoto map was constructed through the following procedures:
(1) Definition of the target field and takeoff–landing site selection:
The target field was first defined to include the garage. Within the target field, a location with an unobstructed view and minimal electromagnetic interference was selected as the UAV takeoff and landing site. In addition, an arbitrary point within the target field was designated as a reference point, and its geographic coordinates (latitude and longitude) were recorded for subsequent coordinate transformation and accuracy evaluation.
(2) UAV flight planning and image acquisition:
The UAV flight paths were planned in accordance with the Low-Altitude Digital Aerial Photogrammetry Fieldwork Specifications (CH/T 3005-2021) [26]. Appropriate flight altitude, forward overlap, and side overlap ratios were configured to ensure sufficient image redundancy. The UAV then operated autonomous flight missions to acquire high-resolution aerial images covering the entire area. The UAV flight paths and the relationship between the camera sensor parameters and ground distance are shown in Figure 3.
(3) Image processing and orthophoto map generation:
The corresponding latitude, longitude, and altitude information were extracted from the Exchangeable Image File Format (EXIF) metadata for each aerial image. Feature points were then detected and matched between adjacent images to generate point cloud data for the entire surveyed area. By integrating the Position and Orientation System (POS) data with the feature matching results, the exterior orientation parameters of each image and the three-dimensional coordinates of ground points were calculated through bundle block adjustment, thereby establishing an accurate geometric relationship between the images and the ground. Based on the reconstructed point cloud, a Digital Surface Model (DSM) of the field was generated, followed by orthorectification and image mosaicking to produce the high-resolution orthophoto map of the target field. The mosaicking process involved automatic feature extraction and matching across overlapping images, followed by bundle adjustment constrained by the onboard Real-Time Kinematic (RTK) positioning data. Geometric correction was achieved through orthorectification using the generated DSM. The accuracy of the resulting orthophoto was indirectly validated by the coordinate transformation accuracy tests.
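The relationship between flight altitude and map resolution sketched in Figure 3 reduces to the standard ground sampling distance (GSD) formula. The snippet below is an illustrative sketch; the sensor parameters are nominal 1-inch-sensor values assumed for the Phantom 4 RTK camera, not figures quoted from this paper:

```python
def ground_sampling_distance(altitude_m, sensor_width_mm,
                             focal_length_mm, image_width_px):
    """Ground footprint of one pixel (m/px) for a nadir-looking camera,
    from similar triangles between the sensor plane and the ground."""
    return (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)

# Assumed nominal camera parameters (13.2 mm sensor width, 8.8 mm focal
# length, 5472 px image width) at the 60 m flight altitude used later:
gsd = ground_sampling_distance(60.0, 13.2, 8.8, 5472)
print(f"GSD: {gsd * 100:.2f} cm/px")  # about 1.64 cm per pixel
```

Lower altitudes or longer focal lengths shrink the GSD and thus refine the pixel-to-metre factor (the ground resolution gRes) used throughout the path planning below.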

2.2. Planning of Travel Paths and Working Paths

Travel paths and working paths for the agricultural vehicle were planned according to the pixel coordinate system of the orthophoto map. Travel paths included forward paths from the garage to the starting position of the working field and return paths from the field exit back to the garage. According to the operational characteristics of the high-clearance sprayer used in this research, the working paths were generated as rectangular strip patterns by delineating the field plots.
For the planning of forward and return paths, the locations of the garage, the starting point, and the ending point of the working field were first determined. Path anchor points were then calibrated along the roads connecting the garage to the field entrance and from the field exit back to the garage on the orthophoto map. The spacing between adjacent anchor points was selected according to practical requirements. Smaller spacing resulted in a larger number of anchor points, which improved tracking performance of the agricultural vehicle, whereas larger spacing reduced data volume and facilitated subsequent processing. The spacing was determined based on road width, with narrower roads requiring denser anchor points.
For the generation of working paths, the target field was identified by selecting its boundary points on the orthophoto map to obtain pixel coordinate information. Based on the geometric dimensions of the field, the vehicle working width, minimum turning radius, and operation direction were used as input parameters to generate coverage paths.
Using the high-clearance sprayer as the test platform, an automatic path generation algorithm was developed for working path planning. The sprayer had a minimum turning radius of 3.5 m and a working width of 12 m. By default, the primary working direction was aligned with the longer side of the working field to minimize the number of turns. After determining the primary working direction and the lateral pixel dimension of the field, the required number of working swaths was calculated based on the relationship among the field width, working width, and minimum turning radius. Given the significant difference between the working width and the minimum turning radius, the number of working swaths was determined using the following formula:
$$S_w = \left\lceil \frac{cAixs \times gRes}{W} \right\rceil \tag{1}$$
where $S_w$ represents the number of working swaths required to ensure complete coverage; $cAixs$ is the lateral pixel dimension of the working field; $gRes$ is the ground resolution; $W$ is the working width. The ceiling function was applied to ensure no coverage omissions during spraying by the high-clearance sprayer.
Consequently, the corresponding working swath spacing in the pixel coordinate system was determined, ensuring that working swaths were evenly distributed along the lateral direction of the working field, thereby achieving uniform coverage of the working area. The formula is expressed as follows:
$$a_W = \frac{cAixs}{S_w} \tag{2}$$
where $a_W$ represents the swath spacing in pixels. This formula evenly divides the lateral pixel dimension, ensuring equal pixel spacing between adjacent working swaths. The corresponding physical spacing is $a_W \times gRes$, which ensures that the actual swath spacing neither exceeds the working width nor violates the minimum turning radius requirement.
The starting and ending points for each working swath were calculated according to the determined operation starting point and swath spacing. Assuming the coordinates of the operation starting point were (xmin, ymin), the starting and ending point’s coordinate for each working swath were derived as follows:
$$S_i = \left( x_{\min} + i \times a_W,\; y_{\min} \right), \qquad E_i = \left( x_{\min} + i \times a_W,\; y_{\max} \right) \tag{3}$$
where $y_{\max}$ represents the maximum pixel coordinate along the longer side of the working field; $S_i$ is the starting point of the $i$-th working swath; $E_i$ is the ending point of the $i$-th working swath.
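The swath-count, swath-spacing, and endpoint formulas above can be combined into a single routine. The following Python sketch is illustrative only (the paper's tool was built in Visual Studio); variable names follow the text, and the zero-based swath index is an assumption:

```python
import math

def plan_swaths(c_aixs, g_res, working_width, x_min, y_min, y_max):
    """Swath count, pixel spacing, and per-swath endpoints.

    c_aixs        -- lateral pixel dimension of the working field
    g_res         -- ground resolution (m/px)
    working_width -- implement working width (m)
    """
    # Ceiling guarantees complete coverage with no omitted strips.
    s_w = math.ceil(c_aixs * g_res / working_width)
    # Even pixel spacing between adjacent working swaths.
    a_w = c_aixs / s_w
    # Start point S_i and end point E_i of each swath (zero-based i assumed).
    swaths = [((x_min + i * a_w, y_min), (x_min + i * a_w, y_max))
              for i in range(s_w)]
    return s_w, a_w, swaths

# Example: 7000 px wide field at an assumed 0.016 m/px (112 m), 12 m width.
s_w, a_w, swaths = plan_swaths(7000, 0.016, 12.0, x_min=0, y_min=0, y_max=10000)
print(s_w, a_w)  # 10 swaths, 700 px apart (11.2 m of physical spacing)
```

Note how the ceiling pushes the physical spacing (11.2 m) slightly below the working width (12 m), giving a small overlap rather than a coverage gap.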
The headland turning model of the high-clearance sprayer is shown in Figure 4. When the high-clearance sprayer reached the headland after traveling along a working swath, a rectangular turning pattern was executed to enter the next working swath. All coordinate calculations during this process were performed within the pixel coordinate system.
When the high-clearance sprayer reached point A, the pixel coordinates of point B were calculated from those of point A, as shown in Formula (4):
$$L_{AB} = \frac{\sqrt{2}\, R}{gRes}, \quad \Psi_{AB} = \Psi_{A_1 A} + \frac{\pi}{4}, \quad x_B = x_A + L_{AB} \sin \Psi_{AB}, \quad y_B = y_A + L_{AB} \cos \Psi_{AB} \tag{4}$$
where $L_{AB}$ represents the pixel length of segment $AB$; $R$ denotes the turning radius of the high-clearance sprayer; $\Psi_{A_1 A}$ is the angle between line $A_1 A$ and the y-axis of the pixel coordinate system; $\Psi_{AB}$ is the angle between line $AB$ and the y-axis of the image plane coordinate system; $(x_A, y_A)$ and $(x_B, y_B)$ are the pixel coordinates of points A and B.
The pixel distance from point B to point C is $\left( a_W - \frac{2R}{gRes} \right)$, and thus the pixel coordinate of point C can be expressed as follows:
$$x_C = x_B, \qquad y_C = y_B + a_W - \frac{2R}{gRes} \tag{5}$$
where $(x_C, y_C)$ is the pixel coordinate of point C.
The process of calculating the pixel coordinates of point D from the pixel coordinates of point C was derived as follows:
$$L_{CD} = \frac{\sqrt{2}\, R}{gRes}, \quad \Psi_{CD} = \Psi_{BC} - \frac{\pi}{4}, \quad x_D = x_C + L_{CD} \sin \Psi_{CD}, \quad y_D = y_C + L_{CD} \cos \Psi_{CD} \tag{6}$$
where $L_{CD}$ represents the pixel length of segment $CD$; $\Psi_{BC}$ is the angle between line $BC$ and the y-axis of the image plane coordinate system; $\Psi_{CD}$ is the angle between line $CD$ and the y-axis of the image plane coordinate system; $(x_C, y_C)$ and $(x_D, y_D)$ are the pixel coordinates of points C and D.
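The three point-to-point calculations of the headland turn can be collected into one routine. This hypothetical Python sketch follows the paper's conventions (angles measured from the image y-axis, turning radius converted to pixels through the ground resolution); the function and argument names are ours, not the paper's:

```python
import math

def headland_turn_points(x_a, y_a, psi_a1a, psi_bc, R, g_res, a_w):
    """Pixel coordinates of points B, C, D of the rectangular headland turn.

    psi_a1a -- angle of line A1-A to the image y-axis (rad)
    psi_bc  -- angle of line B-C to the image y-axis (rad)
    R       -- turning radius (m); g_res -- ground resolution (m/px)
    a_w     -- swath spacing (px)
    """
    r_px = R / g_res                      # turning radius in pixels
    l_ab = math.sqrt(2) * r_px            # Formula (4): entry diagonal
    psi_ab = psi_a1a + math.pi / 4
    x_b = x_a + l_ab * math.sin(psi_ab)
    y_b = y_a + l_ab * math.cos(psi_ab)

    x_c = x_b                             # Formula (5): straight headland segment
    y_c = y_b + (a_w - 2 * r_px)

    l_cd = math.sqrt(2) * r_px            # Formula (6): symmetric exit diagonal
    psi_cd = psi_bc - math.pi / 4
    x_d = x_c + l_cd * math.sin(psi_cd)
    y_d = y_c + l_cd * math.cos(psi_cd)
    return (x_b, y_b), (x_c, y_c), (x_d, y_d)

# Sprayer parameter from the paper: R = 3.5 m; g_res = 0.016 m/px is assumed.
b, c, d = headland_turn_points(0.0, 0.0, 0.0, math.pi / 2, 3.5, 0.016, 750.0)
```

With these example headings, the turn enters along one image axis and exits along the other, as in the rectangular pattern of Figure 4.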
Through the aforementioned procedures, the pixel coordinate sets of anchor points for the forward travel path, the working paths, and the return travel path were obtained. These three sets of pixel coordinates were sequentially stored in three separate files.

2.3. Generation of Navigation Paths for Agricultural Vehicles

The operational process of agricultural vehicles from initiation to completion could be summarized in three phases. First, the vehicle traveled from the garage to the operation starting point. Subsequently, the vehicle performed the work based on provided parameters such as working width, number of passes, operation direction, operating speed, and headland turning patterns, proceeding until the operation termination point was reached. Finally, the vehicle returned from the operation termination point back to the garage.
The anchor point information of the forward travel paths, working paths, and return travel paths created on the orthophoto map was stored in three separate files, all in the form of pixel coordinates. Since the navigation system of the vehicle could not directly utilize pixel coordinates, it was necessary to convert the pixel coordinates along the travel paths and working paths into corresponding geographic coordinates (latitude and longitude).
The conversion algorithm was designed based on the Mercator projection under a local mapping assumption, as illustrated in Figure 5. The equator was defined as the standard parallel, and the prime meridian was designated as the central meridian. The intersection of these two lines served as the coordinate origin point, with east and north directions assigned as positive, and west and south directions as negative.
The transformation formulas of the projected x-coordinate and y-coordinate are expressed as follows:
$$x_t = R_e \lambda, \qquad y_t = R_e \int_0^{\Psi} \sec \Psi \, d\Psi = R_e \ln \tan\!\left( \frac{\pi}{4} + \frac{\Psi}{2} \right) \tag{7}$$
where $\lambda$ denotes the longitude of the target point on the Earth's surface; $R_e$ represents the equivalent Earth radius used for local projection; $\Psi$ is the latitude of the target point.
During algorithm development, a structure named Sourse was first constructed to store the pixel coordinates $(x, y)$ and the latitude and longitude $(lat, lon)$ of the target points, together with the required ground resolution $gRes$.
The pixel coordinate differences $delX$ and $delY$ between the target point and the reference point were first calculated by Formula (8):
$$delX = x - kPix.x, \qquad delY = kPix.y - y \tag{8}$$
where $kPix.x$ represents the horizontal pixel coordinate of the reference point, and $kPix.y$ represents the vertical pixel coordinate of the reference point. The sign of $delY$ is reversed because the image y-axis points downward while north is taken as positive.
Since longitude in the Mercator projection is linear in the projected x-coordinate, the longitude increment between the target point and the reference point could be established directly:
$$delLon = \frac{delX \times gRes}{R_e \cos\!\left( kPLat \times \frac{\pi}{180} \right)} \tag{9}$$
where $delLon$ represents the longitude increment in radians; $R_e$ denotes the equivalent Earth radius; $kPLat$ is the latitude of the reference point in degrees.
The longitude value of the target point was then calculated as follows:
$$lon = kPLon + delLon \times \frac{180}{\pi} \tag{10}$$
where $kPLon$ represents the longitude of the reference point.
Since latitude varies nonlinearly on the Earth's surface, the projected value corresponding to the latitude of the reference point was calculated first:
$$y_{t0} = R_e \ln \tan\!\left( \frac{\pi}{4} + kPLat \times \frac{\pi}{360} \right) \tag{11}$$
Subsequently, the projected value corresponding to the latitude of the target point was calculated as follows:
$$y_t = y_{t0} + delY \times gRes \tag{12}$$
Finally, the latitude value of the target point was obtained as follows:
$$lat = 2 \tan^{-1}\!\left( \exp\!\left( \frac{y_t}{R_e} \right) \right) \times \frac{180}{\pi} - 90 \tag{13}$$
Therefore, navigation paths for unmanned agricultural vehicles were generated by converting the anchor pixel coordinates into latitude and longitude geographic coordinates, which contained the forward path, working path and return path.
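The steps of this section chain into a single pixel-to-geographic transformation. The following Python sketch is illustrative; the WGS-84 semi-major axis is assumed for the equivalent Earth radius, which the paper does not specify numerically:

```python
import math

R_E = 6378137.0  # assumed equivalent Earth radius (WGS-84 semi-major axis, m)

def pixel_to_geo(x, y, k_pix_x, k_pix_y, k_lat_deg, k_lon_deg, g_res):
    """Convert a pixel coordinate to (lat, lon) in degrees relative to a
    reference point with known pixel and geographic coordinates."""
    del_x = x - k_pix_x          # east offset in pixels
    del_y = k_pix_y - y          # north offset (image y grows downward)

    k_lat = math.radians(k_lat_deg)
    # Longitude is linear in the projected x-coordinate.
    del_lon = del_x * g_res / (R_E * math.cos(k_lat))
    lon = k_lon_deg + math.degrees(del_lon)

    # Latitude through the Mercator y-projection of the reference point.
    y_t0 = R_E * math.log(math.tan(math.pi / 4 + k_lat / 2))
    y_t = y_t0 + del_y * g_res
    lat = math.degrees(2 * math.atan(math.exp(y_t / R_E))) - 90.0
    return lat, lon

# A point 100 px east of the reference lands slightly east, same latitude.
lat, lon = pixel_to_geo(100, 0, 0, 0, 36.8, 118.0, 0.016)
```

Feeding the reference pixel itself through the chain returns the reference latitude and longitude, which is a convenient self-check when implementing the transformation.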

3. Results and Discussion

A GUI was developed using Visual Studio 2022 to implement the UAV-based coverage path planning method for unmanned agricultural vehicle navigation, as shown in Figure 6. The GUI provided extraction of geographic coordinate points, conversion between the pixel coordinate system and the geographic coordinate system, automatic generation of working paths, and creation of path anchor points along with visualization of travel paths. The GUI generated the navigation paths from the orthophoto map of the field obtained by the UAV.
To evaluate the accuracy and stability of the UAV-based path planning method proposed in this research, calculation accuracy tests of the path planning and field tests of the high-clearance sprayer’s navigation were conducted at Shandong University of Technology, Zibo, China.

3.1. Accuracy Tests of the Path Planning

The flight altitude of the UAV was set at 60 m, with both forward and side overlap ratios configured at 70%. A total of 136 aerial images were captured, and image stitching was performed to generate the orthophoto map of the target field, as illustrated in Figure 7. The orthophoto map was loaded into the GUI, and the paths planned by the GUI using the UAV-based path planning method are shown in Figure 7b. Twenty-five validation points were selected evenly along the planned path in the GUI. The geographic coordinates of these validation points were obtained through field measurements using the BeiDou high-precision positioning system and used as reference values. The corresponding coordinates of the 25 validation points were calculated using the proposed transformation algorithm. The accuracy of the UAV-based path planning was evaluated according to the position errors between the measured geographic coordinates and the transformed coordinates of the validation points.
The absolute position errors between the two coordinate groups were analyzed to quantify the transformation accuracy, as illustrated in Figure 8. Latitude errors between the reference group and the test group had a maximum of 2.49 cm with a mean of 1.55 cm, while longitude errors had a maximum of 2.45 cm with a mean of 1.5 cm. The planar error had a maximum of 3.37 cm and a mean of 2.23 cm. These errors were attributable to the combined effects of the positioning system accuracy and the limitations of the coordinate transformation. The results indicated that the proposed coordinate transformation method has high accuracy and meets the requirements for agricultural vehicle navigation.
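The planar error between a measured and a transformed coordinate pair can be evaluated with a local flat-Earth approximation, which is more than adequate at the centimetre scales reported here. The sketch below is hypothetical; the paper does not state its exact error computation:

```python
import math

def planar_error_cm(lat_ref, lon_ref, lat_test, lon_test,
                    earth_radius=6378137.0):
    """Planar position error (cm) between two nearby (lat, lon) points,
    using a local equirectangular approximation."""
    mean_lat = math.radians((lat_ref + lat_test) / 2)
    # North and east error components in metres.
    dn = math.radians(lat_test - lat_ref) * earth_radius
    de = math.radians(lon_test - lon_ref) * earth_radius * math.cos(mean_lat)
    return math.hypot(dn, de) * 100.0

# A test point displaced 3 cm due north of the reference:
err = planar_error_cm(36.8, 118.0,
                      36.8 + math.degrees(0.03 / 6378137.0), 118.0)
print(f"{err:.2f} cm")  # 3.00 cm
```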

3.2. Field Tests of the High-Clearance Sprayer’s Navigation

To evaluate the stability and working performance of the UAV-based path planning method proposed in this research, field tests with an unmanned high-clearance sprayer were conducted, as shown in Figure 9.
A section of the target field was designated as the work area, and a position was selected as the hypothetical garage. The orthophoto map of the target field was loaded into the GUI, and the forward path for the high-clearance sprayer from the garage to the starting point of the work area was planned. The boundary of the working field was delineated by selecting its boundary points, and working paths for the sprayer were then designed using the GUI's path planning function. Finally, the return path from the ending point of the work area back to the garage was planned to complete the navigation path for the high-clearance sprayer. The automatically generated rectangular working paths consisted of 10 swaths. The navigation system of the sprayer recorded the lateral errors and heading errors in real time during field operations. Figure 10 shows the high-clearance sprayer during field operation and illustrates the sprayer's travel path and working paths.
In straight-line path tracking, the errors displayed a sawtooth-like pattern due to field surface undulations and steering adjustments, as shown in Figure 11. The lateral error fluctuated around 0, ranging from −5.5 cm to +5.5 cm, and the heading error fluctuated around 0 with a range from −2.5° to +2.5°.
The mean value of absolute tracking errors in a straight line was statistically analyzed using the average value, maximum value, and RMS (Root Mean Square) error as metrics to evaluate the accuracy and stability of straight-line path tracking. The statistical results of the straight-line path tracking errors are presented in Table 3.
The maximum average lateral and heading errors were 3.69 cm and 1.15°, respectively; the maximum lateral and heading errors were 5.11 cm and 1.75°, respectively; and the maximum RMS lateral and heading errors were 2.61 cm and 1.12°, respectively. From Path 1 to Path 10, the lateral and heading errors of each path were similar, with no obvious variation pattern. The field test results demonstrated that the developed UAV-based coverage path planning method was feasible and highly accurate, providing an effective solution for achieving fully autonomous agricultural vehicle operations.
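The three statistics used above (mean absolute, maximum absolute, and RMS error) can be reproduced in a few lines. The sketch below is illustrative, run on a synthetic error trace rather than the paper's data:

```python
import math

def tracking_error_stats(errors):
    """Mean absolute, maximum absolute, and RMS of a tracking-error trace."""
    abs_err = [abs(e) for e in errors]
    mean_abs = sum(abs_err) / len(abs_err)
    max_abs = max(abs_err)
    rms = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mean_abs, max_abs, rms

# Synthetic lateral-error trace in cm (illustrative only):
mean_abs, max_abs, rms = tracking_error_stats([1.0, -2.0, 3.0, -4.0])
print(mean_abs, max_abs, round(rms, 3))  # 2.5 4.0 2.739
```

Note that the RMS is computed on the signed errors; since squaring removes the sign, it equals the RMS of the absolute errors and penalizes large excursions more than the mean does.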

4. Conclusions

This research proposed a UAV-based coverage path planning method for unmanned agricultural vehicles to reduce manual intervention. High-resolution orthophoto maps were generated using low-altitude UAV photogrammetry. Travel paths and working paths were automatically generated according to anchor points selected by the operator in the image coordinate domain. A Mercator projection-based coordinate transformation algorithm was designed to convert pixel-based path anchor points into geographic coordinates for the generation of navigation paths for agricultural vehicles. A GUI was developed to implement the proposed path planning method in the navigation of unmanned agricultural vehicles. Field tests showed that lateral navigation errors remained within ±5.5 cm for the unmanned high-clearance sprayer, indicating that the developed UAV-based coverage path planning method is feasible and highly accurate.
Future work will focus on extending the proposed method to more complex agricultural environments, such as irregular fields and dynamic obstacles. In addition, the integration of real-time perception data and adaptive path replanning strategies will be investigated to further improve the robustness and autonomy of agricultural vehicle navigation.

Author Contributions

Conceptualization, G.X. and E.Z.; methodology, G.X., E.Z. and G.A.; software, X.Y.; validation, J.D. and P.Z.; formal analysis, J.D. and X.Z.; investigation, J.D. and P.Z.; resources, G.X. and X.Y.; data curation, E.Z.; writing—original draft preparation, G.X., E.Z. and G.A.; writing—review and editing, X.Y., J.D., G.A. and G.X.; visualization, E.Z.; supervision, X.Y.; project administration, X.Y.; funding acquisition, X.Y. and J.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Key R&D Program of Shandong Province, China (Grant No. 2022SFGC0201); National Natural Science Foundation of China (Grant No. 32171910); National Key Research and Development Program of China (Grant No. 2021YFD2000502); Agricultural Engineering Foundation of SDUT of China (Grant No. NZY-2025-07).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. UAV-based path planning for full coverage of the target field.
Figure 2. Main components of the unmanned high-clearance sprayer.
Figure 3. UAV flight paths and sensor–ground mapping geometry: (a) UAV flight paths over the target field and (b) relationship between sensor parameters and ground distance.
Figure 4. Headland turning of the unmanned high-clearance sprayer.
Figure 5. Mercator projection-based conversion from (a) the cylindrical reference surface to (b) the plane coordinate system.
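The pixel-to-geographic conversion illustrated in Figure 5 rests on the Mercator projection and its inverse. A minimal spherical-Mercator sketch in Python follows, assuming the WGS-84 equatorial radius and illustrative function names; the paper's exact implementation and datum handling are not reproduced here:

```python
import math

R = 6378137.0  # WGS-84 equatorial radius in metres (assumed)

def mercator_forward(lon_deg, lat_deg):
    """Geographic (lon, lat) in degrees -> Mercator plane (x, y) in metres."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

def mercator_inverse(x, y):
    """Mercator plane (x, y) in metres -> geographic (lon, lat) in degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat
```

In a workflow like the one described in the abstract, anchor pixel coordinates would first be scaled into plane offsets from a georeferenced origin and then mapped back through the inverse projection to latitude and longitude.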
Figure 6. The GUI for operation.
Figure 7. Location of the target field in (a) the satellite imagery and (b) its orthophoto map.
Figure 8. Position errors in (a) latitude and (b) longitude between two groups of validation points.
Figure 9. Field tests with the high-clearance sprayer in Zibo, China.
Figure 10. Planned path and actual navigation trajectory of the unmanned high-clearance sprayer.
Figure 11. Error variations for Path 2 during field tests: (a) lateral error and (b) heading error.
Table 1. Parameters of the high-clearance sprayer.

Parameter | Value
Motor power (kW) | 20
Wheelbase × tread (m × m) | 1.5 × 1.5
Spraying width (m) | 12
Traveling speed (km/h) | 0–10
Tank volume (L) | 500
Minimum turning radius (m) | 3.5
Table 2. Photogrammetric remote sensing parameters of the Phantom 4 RTK UAV.

Technical Parameter | Value
Camera gimbal pitch range (°) | −90 to +30
Camera focal length (mm) | 8.8
Image resolution | 4864 × 3648 (4:3)
Image sensor | 1-inch CMOS
Table 3. Statistics of straight-line path tracking errors.

Path | Lateral Error (cm): Average | Maximum | RMS | Heading Error (°): Average | Maximum | RMS
Path 1 | 2.74 | 4.53 | 2.31 | 0.39 | 1.33 | 0.65
Path 2 | 2.71 | 4.77 | 2.09 | 1.12 | 1.59 | 1.06
Path 3 | 3.69 | 4.25 | 2.18 | 0.72 | 1.67 | 1.11
Path 4 | 2.19 | 4.83 | 2.28 | 0.66 | 1.75 | 0.82
Path 5 | 2.12 | 4.72 | 2.37 | 1.13 | 1.69 | 0.68
Path 6 | 2.31 | 5.11 | 2.15 | 1.10 | 1.25 | 0.69
Path 7 | 2.28 | 4.32 | 2.61 | 0.52 | 1.41 | 1.05
Path 8 | 2.91 | 4.95 | 1.92 | 1.08 | 1.52 | 0.92
Path 9 | 2.79 | 4.31 | 1.38 | 0.91 | 1.61 | 0.85
Path 10 | 1.58 | 4.29 | 1.86 | 1.15 | 1.24 | 1.12
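The average, maximum, and RMS figures reported in Table 3 can be reproduced from raw per-sample tracking errors. The short Python helper below is an illustrative sketch rather than the authors' evaluation code; in particular, it assumes average and maximum are taken over absolute errors, which the paper does not state explicitly:

```python
import math

def error_stats(samples):
    """Return (average, maximum, RMS) for a sequence of signed error samples.

    Average and maximum are computed on absolute values (assumed convention);
    RMS is the root mean square of the signed samples.
    """
    abs_errors = [abs(e) for e in samples]
    average = sum(abs_errors) / len(abs_errors)
    maximum = max(abs_errors)
    rms = math.sqrt(sum(e * e for e in samples) / len(samples))
    return average, maximum, rms
```

Applied to the logged lateral errors (in cm) or heading errors (in °) of one path, this yields one row of Table 3.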
